WorldWideScience

Sample records for weibull distribution analysis

  1. Analysis of the upper-truncated Weibull distribution for wind speed

    International Nuclear Information System (INIS)

    Kantar, Yeliz Mert; Usta, Ilhan

    2015-01-01

    Highlights: • Upper-truncated Weibull distribution is proposed to model wind speed. • Upper-truncated Weibull distribution nests the Weibull distribution as a special case. • Maximum likelihood is the best method for the upper-truncated Weibull distribution. • Fitting accuracy of the upper-truncated Weibull is analyzed on wind speed data. - Abstract: Accurately modeling wind speed is critical in estimating the wind energy potential of a region. Several statistical distributions have been studied for modeling wind speed data smoothly. A truncated distribution is the conditional distribution that results from restricting the domain of a base distribution, and it contains the base distribution as a limiting case. This paper proposes, for the first time, the use of the upper-truncated Weibull distribution for modeling wind speed data and for estimating wind power density. In addition, a comparison is made between the upper-truncated Weibull distribution and the well-known Weibull distribution using wind speed data measured in various regions of Turkey. The results indicate that the upper-truncated Weibull distribution performs better than the Weibull distribution in estimating the wind speed distribution and wind power. Therefore, the upper-truncated Weibull distribution can be an alternative for use in the assessment of wind energy potential.
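
    The abstract does not reproduce the estimation details; the Python sketch below shows one way an upper-truncated Weibull fit could look, assuming the truncated density f_T(x) = f(x)/F(T) on (0, T] and a maximum likelihood fit with SciPy. The sample data, the truncation point T and all variable names are illustrative, not the paper's.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

# Hypothetical hourly wind speeds (m/s); real data would come from a measurement station.
rng = np.random.default_rng(0)
speeds = 8.0 * rng.weibull(2.0, size=1000)
T = speeds.max() * 1.05  # assumed upper truncation point

def neg_log_lik(params):
    k, c = params
    if k <= 0 or c <= 0:
        return np.inf
    # Upper-truncated Weibull density: f_T(x) = f(x) / F(T) for 0 < x <= T
    log_f = weibull_min.logpdf(speeds, c=k, scale=c)
    log_F_T = weibull_min.logcdf(T, c=k, scale=c)
    return -(log_f - log_F_T).sum()

res = minimize(neg_log_lik, x0=[2.0, speeds.mean()], method="Nelder-Mead")
k_hat, c_hat = res.x
print(f"shape k = {k_hat:.3f}, scale c = {c_hat:.3f}")
```

    When T is large relative to the data, the fit collapses to the ordinary Weibull, which is the nesting property the highlights refer to.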

  2. Modifications of the Weibull distribution: A review

    International Nuclear Information System (INIS)

    Almalki, Saad J.; Nadarajah, Saralees

    2014-01-01

    It is well known that the Weibull distribution is the most popular and most widely used distribution in reliability and in the analysis of lifetime data. Unfortunately, its hazard function cannot exhibit non-monotonic shapes such as the bathtub shape or the unimodal shape. Since 1958, the Weibull distribution has been modified by many researchers to allow for non-monotonic hazard functions. This paper gives an extensive review of discrete and continuous versions of the modifications of the Weibull distribution. - Highlights: • A comprehensive review of known discrete modifications and generalizations of the Weibull distribution. • A comprehensive review of known continuous modifications and generalizations of the Weibull distribution. • Over 110 references on modifications/generalizations of the Weibull distribution. • More than 55% of the cited references appeared in the last 5 years.

  3. Weibull distribution in reliability data analysis in nuclear power plant

    International Nuclear Information System (INIS)

    Ma Yingfei; Zhang Zhijian; Zhang Min; Zheng Gangyang

    2015-01-01

    Reliability is an important issue affecting each stage of the life cycle, from the birth to the death of a product or system. Reliability engineering includes equipment failure data processing, quantitative assessment of system reliability, maintenance, etc. Reliability data refers to the variety of data that describe the reliability of a system or component during its operation; these data may be in the form of numbers, graphics, symbols, texts and curves. Quantitative reliability assessment is the task of reliability data analysis, which provides the information needed to prevent, detect, and correct defects in the reliability design. Reliability data analysis proceeds along with the various stages of the product life cycle and reliability activities. Reliability data of Systems, Structures and Components (SSCs) in nuclear power plants is a key input to probabilistic safety assessment (PSA), reliability-centered maintenance and life cycle management. The Weibull distribution is widely used in reliability engineering, failure analysis and industrial engineering, for example to represent manufacturing and delivery times; it is commonly used to model time to failure, time to repair and material strength. In this paper, an improved Weibull distribution is introduced to analyze the reliability data of SSCs in nuclear power plants, and an example is given to present the result of the new method. The Weibull distribution fits reliability data for mechanical equipment in nuclear power plants very well and is a widely used mathematical model for reliability analysis. The methods in common use are the two-parameter and three-parameter Weibull distributions. Through comparison and analysis, the three-parameter Weibull distribution fits the data better; it reflects the reliability characteristics of the equipment and is closer to the actual situation. (author)

  4. The McDonald Modified Weibull Distribution: Properties and Applications

    OpenAIRE

    Merovci, Faton; Elbatal, Ibrahim

    2013-01-01

    A six-parameter distribution, the so-called McDonald modified Weibull distribution, is defined and studied. The new distribution contains, as special submodels, several important distributions discussed in the literature, such as the beta modified Weibull, Kumaraswamy modified Weibull, McDonald Weibull and modified Weibull distributions, among others. The new distribution can be used effectively in the analysis of survival data since it accommodates monotone, unimodal and bathtub-shaped hazard functions.

  5. Statistical Analysis Of Failure Strength Of Material Using Weibull Distribution

    International Nuclear Information System (INIS)

    Entin Hartini; Mike Susmikanti; Antonius Sitompul

    2008-01-01

    In the evaluation of ceramic and glass material strength, a statistical approach is necessary. The strength of ceramic and glass depends on the size and size distribution of the flaws in these materials. The distribution of strength for a ductile material is narrow and close to a Gaussian distribution, while the strength of brittle materials such as ceramic and glass follows a Weibull distribution. The Weibull distribution is an indicator of the failure of material strength resulting from a distribution of flaw sizes. In this paper, the cumulative probability of failure as a function of material strength, the cumulative probability of failure versus fracture stress, and the cumulative probability of reliability of the material were calculated. Statistical criteria supporting the strength analysis of silicon nitride material were computed using MATLAB. (author)
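
    The abstract describes the standard Weibull strength model without giving its parameters; the minimal Python sketch below reproduces the kind of failure-probability and reliability curves it refers to, assuming the common two-parameter form P_f(σ) = 1 − exp[−(σ/σ0)^m] with an illustrative Weibull modulus m and characteristic strength σ0 (the paper itself used MATLAB, and its silicon nitride parameters are not reproduced here).

```python
import numpy as np

# Illustrative two-parameter Weibull strength model; the silicon nitride
# parameters used in the paper are not given in the abstract.
m, sigma0 = 10.0, 600.0  # assumed Weibull modulus and characteristic strength (MPa)

stress = np.linspace(100.0, 900.0, 9)            # fracture stress levels (MPa)
p_fail = 1.0 - np.exp(-(stress / sigma0) ** m)   # cumulative probability of failure
reliability = 1.0 - p_fail                       # cumulative probability of survival

for s, pf, r in zip(stress, p_fail, reliability):
    print(f"sigma = {s:6.1f} MPa   P_fail = {pf:.4f}   R = {r:.4f}")
```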

  6. Transmuted Complementary Weibull Geometric Distribution

    Directory of Open Access Journals (Sweden)

    Ahmed Z. Afify

    2014-12-01

    Full Text Available This paper provides a new generalization of the complementary Weibull geometric distribution introduced by Tojeiro et al. (2014), using the quadratic rank transmutation map studied by Shaw and Buckley (2007). The new distribution is referred to as the transmuted complementary Weibull geometric (TCWG) distribution. The TCWG distribution includes as special cases the complementary Weibull geometric distribution (CWGD), the complementary exponential geometric distribution (CEGD), the Weibull distribution (WD) and the exponential distribution (ED). Various structural properties of the new distribution, including moments, quantiles, the moment generating function and the Rényi entropy, are derived. We propose the method of maximum likelihood for estimating the model parameters and obtain the observed information matrix. A real data set is used to compare the flexibility of the transmuted version versus the complementary Weibull geometric distribution.
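
    For reference, the quadratic rank transmutation map of Shaw and Buckley referred to here takes a baseline cdf G(x) to

\[
F(x) = (1 + \lambda)\,G(x) - \lambda\,G(x)^{2}, \qquad |\lambda| \le 1,
\]

    so that λ = 0 recovers the baseline complementary Weibull geometric distribution; this is the construction shared by the other transmuted families listed in these records.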

  7. Transmuted Generalized Inverse Weibull Distribution

    OpenAIRE

    Merovci, Faton; Elbatal, Ibrahim; Ahmed, Alaa

    2013-01-01

    A generalization of the generalized inverse Weibull distribution, the so-called transmuted generalized inverse Weibull distribution, is proposed and studied. We use the quadratic rank transmutation map (QRTM) in order to generate a flexible family of probability distributions, taking the generalized inverse Weibull distribution as the base distribution and introducing a new parameter that offers more distributional flexibility. Various structural properties, including explicit expressions for the moments, quantiles and moment generating function, are derived.

  8. Reliability Estimation of Aero-engine Based on Mixed Weibull Distribution Model

    Science.gov (United States)

    Yuan, Zhongda; Deng, Junxiang; Wang, Dawei

    2018-02-01

    An aero-engine is a complex mechanical and electronic system, and in the reliability analysis of such systems the Weibull distribution model has an irreplaceable role. Until now, only the two-parameter and three-parameter Weibull distribution models have been widely used. Due to the diversity of engine failure modes, a single Weibull distribution model carries a large error. By contrast, a variety of engine failure modes can be taken into account with a mixed Weibull distribution model, so it is a good statistical analysis model. In addition to the concept of a dynamic weight coefficient, a three-parameter correlation coefficient optimization method is applied to enhance the Weibull distribution model so that the reliability estimation is more accurate, and the precision of the mixed-distribution reliability model is thereby improved greatly. All of this is advantageous for popularizing the Weibull distribution model in engineering applications.

  9. The Weibull distribution a handbook

    CERN Document Server

    Rinne, Horst

    2008-01-01

    The most comprehensive book on the subject. Chronicles the development of the Weibull distribution in statistical theory and applied statistics. Exploring one of the most important distributions in statistics, The Weibull Distribution: A Handbook focuses on its origin, statistical properties, and related distributions. The book also presents various approaches to estimate the parameters of the Weibull distribution under all possible situations of sampling data as well as approaches to parameter and goodness-of-fit testing. Describes the Statistical Methods, Concepts, Theories, and Applications of T

  10. The Transmuted Generalized Inverse Weibull Distribution

    Directory of Open Access Journals (Sweden)

    Faton Merovci

    2014-05-01

    Full Text Available A generalization of the generalized inverse Weibull distribution, the so-called transmuted generalized inverse Weibull distribution, is proposed and studied. We use the quadratic rank transmutation map (QRTM) in order to generate a flexible family of probability distributions, taking the generalized inverse Weibull distribution as the base distribution and introducing a new parameter that offers more distributional flexibility. Various structural properties, including explicit expressions for the moments, quantiles, and moment generating function of the new distribution, are derived. We propose the method of maximum likelihood for estimating the model parameters and obtain the observed information matrix. A real data set is used to compare the flexibility of the transmuted version versus the generalized inverse Weibull distribution.

  11. The Beta Transmuted Weibull Distribution

    Directory of Open Access Journals (Sweden)

    Manisha Pal

    2014-06-01

    Full Text Available The paper introduces a beta transmuted Weibull distribution, which contains a number of distributions as special cases. The properties of the distribution are discussed and explicit expressions are derived for the mean deviations, Bonferroni and Lorenz curves, and reliability. The distribution and moments of order statistics are also studied. Estimation of the model parameters by the method of maximum likelihood is discussed. The log beta transmuted Weibull model is introduced to analyze censored data. Finally, the usefulness of the new distribution in analyzing positive data is illustrated.

  12. Probabilistic analysis of glass elements with three-parameter Weibull distribution

    International Nuclear Information System (INIS)

    Ramos, A.; Muniz-Calvente, M.; Fernandez, P.; Fernandez Cantel, A.; Lamela, M. J.

    2015-01-01

    Glass and ceramics exhibit brittle behaviour, so a large scatter in test results is obtained. This dispersion is mainly due to the inevitable presence of micro-cracks on the surface, edge defects or internal defects, which must be taken into account using an appropriate failure criterion that is probabilistic rather than deterministic. Among the existing probability distributions, the two- or three-parameter Weibull distribution is generally used to fit material resistance results, although the way it is used is not always correct. Firstly, in this work, a large experimental programme using annealed glass specimens of different dimensions, based on four-point bending and coaxial double ring tests, was performed. Then, the finite element models made for each type of test, the adjustment of the parameters of the three-parameter Weibull cumulative distribution function (cdf) (λ: location, β: shape, δ: scale) for a certain failure criterion, and the calculation of the effective areas from the cumulative distribution function are presented. In summary, this work aims to generalize the use of the three-parameter Weibull function to structural glass elements with stress distributions that are not analytically described, allowing the proposed probabilistic model to be applied to general loading distributions. (Author)

  13. Single versus mixture Weibull distributions for nonparametric satellite reliability

    International Nuclear Information System (INIS)

    Castet, Jean-Francois; Saleh, Joseph H.

    2010-01-01

    Long recognized as a critical design attribute for space systems, satellite reliability has not yet received proper attention, as only limited on-orbit failure data and statistical analyses can be found in the technical literature. To fill this gap, we recently conducted a nonparametric analysis of satellite reliability for 1584 Earth-orbiting satellites launched between January 1990 and October 2008. In this paper, we provide an advanced parametric fit, based on a mixture of Weibull distributions, and compare it with the single Weibull distribution model obtained with the Maximum Likelihood Estimation (MLE) method. We demonstrate that both parametric fits are good approximations of the nonparametric satellite reliability, but that the mixture Weibull distribution is significantly more accurate in capturing all the trends in the failure data, as evidenced by the analysis of the residuals and their quasi-normal dispersion.
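
    The fitted parameter values are not given in the abstract; the short Python sketch below only illustrates the structural difference between the two models being compared, a single Weibull reliability curve versus a finite mixture R(t) = Σ a_j exp[−(t/θ_j)^{β_j}], with made-up parameters and weights.

```python
import numpy as np

def weibull_reliability(t, shape, scale):
    """Single-Weibull reliability R(t) = exp(-(t/scale)**shape)."""
    return np.exp(-(t / scale) ** shape)

# Illustrative parameters and weights only; the fitted values are not in the abstract.
t = np.linspace(0.0, 15.0, 7)  # years on orbit
single = weibull_reliability(t, 0.5, 900.0)
# Two-component mixture: R(t) = a1*R1(t) + a2*R2(t) with a1 + a2 = 1
mixture = 0.95 * weibull_reliability(t, 0.4, 5000.0) \
        + 0.05 * weibull_reliability(t, 1.5, 10.0)

for ti, rs, rm in zip(t, single, mixture):
    print(f"t = {ti:4.1f} yr   single = {rs:.4f}   mixture = {rm:.4f}")
```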

  14. Using the Weibull distribution reliability, modeling and inference

    CERN Document Server

    McCool, John I

    2012-01-01

    Understand and utilize the latest developments in Weibull inferential methods. While the Weibull distribution is widely used in science and engineering, most engineers do not have the necessary statistical training to implement the methodology effectively. Using the Weibull Distribution: Reliability, Modeling, and Inference fills a gap in the current literature on the topic, introducing a self-contained presentation of the probabilistic basis for the methodology while providing powerful techniques for extracting information from data. The author explains the use of the Weibull distribution

  15. A CLASS OF WEIGHTED WEIBULL DISTRIBUTION

    Directory of Open Access Journals (Sweden)

    Saman Shahbaz

    2010-07-01

    Full Text Available The weighted Weibull model is proposed following the method of Azzalini (1985). Basic properties of the distribution, including moments, generating function, hazard rate function and estimation of parameters, have been studied.

  16. Using Weibull Distribution Analysis to Evaluate ALARA Performance

    International Nuclear Information System (INIS)

    Frome, E.L.; Watkins, J.P.; Hagemeyer, D.A.

    2009-01-01

    As Low as Reasonably Achievable (ALARA) is the underlying principle for protecting nuclear workers from potential health outcomes related to occupational radiation exposure. Radiation protection performance is currently evaluated by measures such as collective dose and average measurable dose, which do not indicate ALARA performance. The purpose of this work is to show how statistical modeling of individual doses using the Weibull distribution can provide objective supplemental performance indicators for comparing ALARA implementation among sites and for insights into ALARA practices within a site. Maximum likelihood methods were employed to estimate the Weibull shape and scale parameters used for performance indicators. The shape parameter reflects the effectiveness of maximizing the number of workers receiving lower doses and is represented as the slope of the fitted line on a Weibull probability plot. Additional performance indicators derived from the model parameters include the 99th percentile and the exceedance fraction. When grouping sites by collective total effective dose equivalent (TEDE) and ranking by 99th percentile with confidence intervals, differences in performance among sites can be readily identified. Applying this methodology will enable more efficient and complete evaluation of the effectiveness of ALARA implementation.
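
    As a rough illustration of the indicators described, the Python sketch below fits a two-parameter Weibull to a hypothetical set of individual doses by maximum likelihood and computes the 99th percentile and an exceedance fraction; the dose values, the dose level of interest and the variable names are assumptions, not the occupational dose data used in the paper.

```python
import numpy as np
from scipy.stats import weibull_min

# Hypothetical annual measurable doses (mSv); real data would come from dose-of-record files.
rng = np.random.default_rng(1)
doses = 1.5 * rng.weibull(0.8, size=500)

# Maximum likelihood fit of the two-parameter Weibull (location fixed at 0).
shape, loc, scale = weibull_min.fit(doses, floc=0)

p99 = weibull_min.ppf(0.99, c=shape, scale=scale)          # 99th percentile dose
level = 5.0                                                # assumed dose level of interest (mSv)
exceedance = weibull_min.sf(level, c=shape, scale=scale)   # fraction expected above the level

print(f"shape = {shape:.3f}, scale = {scale:.3f} mSv")
print(f"99th percentile = {p99:.2f} mSv, exceedance fraction above {level} mSv = {exceedance:.4f}")
```

    A larger shape parameter (steeper slope on the Weibull probability plot) corresponds to a workforce concentrated at lower doses, which is the interpretation the abstract gives.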

  17. Bayesian Estimation of the Kumaraswamy InverseWeibull Distribution

    Directory of Open Access Journals (Sweden)

    Felipe R.S. de Gusmao

    2017-05-01

    Full Text Available The Kumaraswamy inverse Weibull distribution has the ability to model failure rates that have unimodal shapes, which are quite common in reliability and biological studies. The three-parameter Kumaraswamy inverse Weibull distribution with decreasing and unimodal failure rate is introduced. We provide a comprehensive treatment of the mathematical properties of the Kumaraswamy inverse Weibull distribution and derive expressions for its moment generating function and its r-th generalized moment. Some properties of the model, with graphs of the density and hazard function, are discussed. We also discuss a Bayesian approach for this distribution, and an application to a real data set is presented.

  18. A Weibull distribution accrual failure detector for cloud computing.

    Science.gov (United States)

    Liu, Jiaxi; Wu, Zhibo; Wu, Jin; Dong, Jian; Zhao, Yao; Wen, Dongxin

    2017-01-01

    Failure detectors are a fundamental component for building high-availability distributed systems. To meet the requirements of complicated large-scale distributed systems, accrual failure detectors that can adapt to multiple applications have been studied extensively. However, several implementations of accrual failure detectors do not adapt well to the cloud service environment. To solve this problem, a new accrual failure detector based on the Weibull distribution, called the Weibull Distribution Failure Detector, is proposed specifically for cloud computing. It can adapt to the dynamic and unexpected network conditions in cloud computing. The performance of the Weibull Distribution Failure Detector is evaluated and compared using public classical experiment data and cloud computing experiment data. The results show that the Weibull Distribution Failure Detector has better performance in terms of speed and accuracy in unstable scenarios, especially in cloud computing.

  19. A MULTIVARIATE WEIBULL DISTRIBUTION

    Directory of Open Access Journals (Sweden)

    Cheng Lee

    2010-07-01

    Full Text Available A multivariate survival function of the Weibull distribution is developed by expanding the theorem by Lu and Bhattacharyya. From the survival function, the probability density function, the cumulative probability function, the determinant of the Jacobian matrix, and the general moment are derived.

  20. Weibull and lognormal Taguchi analysis using multiple linear regression

    International Nuclear Information System (INIS)

    Piña-Monarrez, Manuel R.; Ortiz-Yañez, Jesús F.

    2015-01-01

    The paper provides reliability practitioners with a method (1) to estimate the robust Weibull family when the Taguchi method (TM) is applied, (2) to estimate the normal operational Weibull family in an accelerated life testing (ALT) analysis to give confidence to the extrapolation and (3) to perform the ANOVA analysis on both the robust and the normal operational Weibull families. On the other hand, because the Weibull distribution neither has the normal additive property nor has a direct relationship with the normal parameters (µ, σ), the issues of estimating a Weibull family by using a design of experiments (DOE) are first addressed by using an L9(3^4) orthogonal array (OA) in both the TM and the Weibull proportional hazard model (WPHM) approach. Then, by using the Weibull/Gumbel and the lognormal/normal relationships and multiple linear regression, the direct relationships between the Weibull and the lifetime parameters are derived and used to formulate the proposed method. Moreover, since the derived direct relationships always hold, the method is generalized to the lognormal and ALT analysis. Finally, the method's efficiency is shown through its application to the used OA and to a set of ALT data. - Highlights: • It gives the statistical relations and steps to use the Taguchi method (TM) to analyze Weibull data. • It gives the steps to determine the unknown Weibull family for both the robust TM setting and the normal ALT level. • It gives a method to determine the expected lifetimes and to perform their ANOVA analysis in TM and ALT analysis. • It gives a method to give confidence to the extrapolation in an ALT analysis by using the Weibull family of the normal level.
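
    The paper's full derivation is not reproduced in the abstract; the sketch below only illustrates the standard Weibull–Gumbel link it builds on (if T is Weibull with shape β and scale η, then ln T follows a smallest-extreme-value distribution with location ln η and scale 1/β), together with a simple median-rank linear regression fit, using made-up lifetimes in place of the orthogonal-array responses.

```python
import numpy as np

# Hypothetical lifetimes from one experimental run; the orthogonal-array responses
# used in the paper are not reproduced in the abstract.
life = np.sort(np.array([42.0, 55.0, 61.0, 74.0, 88.0, 103.0, 120.0, 151.0]))
n = life.size

# Median ranks (Benard's approximation) for the empirical cdf.
F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)

# Linearized Weibull cdf: ln(-ln(1 - F)) = beta*ln(t) - beta*ln(eta)
x = np.log(life)
y = np.log(-np.log(1.0 - F))
beta, intercept = np.polyfit(x, y, 1)
eta = np.exp(-intercept / beta)

# Equivalent smallest-extreme-value (Gumbel) parameters for Y = ln T.
mu, sigma = np.log(eta), 1.0 / beta
print(f"beta = {beta:.3f}, eta = {eta:.2f}, Gumbel location = {mu:.3f}, scale = {sigma:.3f}")
```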

  1. Statistical analysis of wind speed using two-parameter Weibull distribution in Alaçatı region

    International Nuclear Information System (INIS)

    Ozay, Can; Celiktas, Melih Soner

    2016-01-01

    Highlights: • Wind speed and direction data from September 2008 to March 2014 have been analyzed. • The mean wind speed for the whole data set has been found to be 8.11 m/s. • The highest wind speed is observed in July, with a monthly mean value of 9.10 m/s. • The wind speed carrying the most energy has been calculated as 12.77 m/s. • The observed data have been fitted to a Weibull distribution and the k and c parameters have been calculated as 2.05 and 9.16. - Abstract: The Weibull statistical distribution is a common method for analyzing wind speed measurements and determining wind energy potential. The Weibull probability density function can be used to forecast wind speed, wind density and wind energy potential. In this study, a two-parameter Weibull statistical distribution is used to analyze the wind characteristics of the Alaçatı region, located in Çeşme, İzmir. The data used in the density function are acquired from a wind measurement station in Alaçatı. Measurements were gathered at three different heights (70, 50 and 30 m) at 10 min intervals for five and a half years. As a result of this study, the wind speed frequency distribution, wind direction trends, mean wind speed, and the shape and scale (k and c) Weibull parameters have been calculated for the region. The mean wind speed for the entire data set is found to be 8.11 m/s, and the k and c parameters are found to be 2.05 and 9.16, respectively. A wind direction analysis along with a wind rose graph for the region is also provided. The analysis suggests that higher wind speeds, ranging from 6 to 12 m/s, are prevalent in the sectors between 340 and 360°, while lower wind speeds, from 3 to 6 m/s, occur in the sectors between 10 and 29°. The results of this study contribute to the general knowledge of the region's wind energy potential and can be used as a source for investors and academics.
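
    Using the reported parameters k = 2.05 and c = 9.16 m/s, the basic Weibull quantities mentioned in the abstract can be reproduced as in the Python sketch below; the air density and the speed grid are assumptions, and the computed mean (about 8.1 m/s) is consistent with the value reported for Alaçatı.

```python
import numpy as np
from scipy.special import gamma

k, c = 2.05, 9.16  # Weibull shape and scale reported for Alaçatı (m/s)
rho = 1.225        # assumed air density (kg/m^3)

mean_speed = c * gamma(1.0 + 1.0 / k)                      # Weibull mean wind speed (m/s)
power_density = 0.5 * rho * c ** 3 * gamma(1.0 + 3.0 / k)  # mean wind power density (W/m^2)

# Weibull probability density over a range of wind speeds.
v = np.linspace(0.5, 25.0, 50)
pdf = (k / c) * (v / c) ** (k - 1.0) * np.exp(-(v / c) ** k)

print(f"mean speed = {mean_speed:.2f} m/s, power density = {power_density:.1f} W/m^2")
```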

  2. Comparison of estimation methods for fitting weibull distribution to ...

    African Journals Online (AJOL)

    Comparison of estimation methods for fitting weibull distribution to the natural stand of Oluwa Forest Reserve, Ondo State, Nigeria. ... Journal of Research in Forestry, Wildlife and Environment ... The result revealed that maximum likelihood method was more accurate in fitting the Weibull distribution to the natural stand.

  3. Probabilistic analysis of glass elements with three-parameter Weibull distribution; Analisis probabilistico de elementos de vidrio recocido mediante una distribucion triparametrica Weibull

    Energy Technology Data Exchange (ETDEWEB)

    Ramos, A.; Muniz-Calvente, M.; Fernandez, P.; Fernandez Cantel, A.; Lamela, M. J.

    2015-10-01

    Glass and ceramics exhibit brittle behaviour, so a large scatter in test results is obtained. This dispersion is mainly due to the inevitable presence of micro-cracks on the surface, edge defects or internal defects, which must be taken into account using an appropriate failure criterion that is probabilistic rather than deterministic. Among the existing probability distributions, the two- or three-parameter Weibull distribution is generally used to fit material resistance results, although the way it is used is not always correct. Firstly, in this work, a large experimental programme using annealed glass specimens of different dimensions, based on four-point bending and coaxial double ring tests, was performed. Then, the finite element models made for each type of test, the adjustment of the parameters of the three-parameter Weibull cumulative distribution function (cdf) (λ: location, β: shape, δ: scale) for a certain failure criterion, and the calculation of the effective areas from the cumulative distribution function are presented. In summary, this work aims to generalize the use of the three-parameter Weibull function to structural glass elements with stress distributions that are not analytically described, allowing the proposed probabilistic model to be applied to general loading distributions. (Author)

  4. Bayesian estimation of Weibull distribution parameters

    International Nuclear Information System (INIS)

    Bacha, M.; Celeux, G.; Idee, E.; Lannoy, A.; Vasseur, D.

    1994-11-01

    In this paper, we present the SEM (Stochastic Expectation Maximization) and WLB-SIR (Weighted Likelihood Bootstrap - Sampling Importance Re-sampling) methods, which are used to estimate Weibull distribution parameters when data are heavily censored. The second method is based on Bayesian inference and allows available prior information on the parameters to be taken into account. An application of this method to real data from nuclear power plant operating feedback analysis has been carried out. (authors). 8 refs., 2 figs., 2 tabs

  5. On alternative q-Weibull and q-extreme value distributions: Properties and applications

    Science.gov (United States)

    Zhang, Fode; Ng, Hon Keung Tony; Shi, Yimin

    2018-01-01

    Tsallis statistics and Tsallis distributions have been attracting a significant amount of research work in recent years, and Tsallis statistics and q-distributions have been applied in different disciplines. Yet, a relationship between existing q-Weibull distributions and q-extreme value distributions that parallels the well-established relationship between the conventional Weibull and extreme value distributions through a logarithmic transformation has not been established. In this paper, we propose an alternative q-Weibull distribution that leads to a q-extreme value distribution via the q-logarithm transformation. Some important properties of the proposed q-Weibull and q-extreme value distributions are studied. Maximum likelihood and least squares estimation methods are used to estimate the parameters of the q-Weibull distribution, and their performances are investigated through a Monte Carlo simulation study. The methodologies and the usefulness of the proposed distributions are illustrated by fitting the 2014 traffic fatalities data from the National Highway Traffic Safety Administration.
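
    The abstract refers to the q-logarithm transformation; for reference, the Tsallis q-deformed functions underlying it are

\[
\ln_q(x) = \frac{x^{\,1-q} - 1}{1-q}, \qquad
e_q(x) = \bigl[\,1 + (1-q)\,x\,\bigr]_{+}^{1/(1-q)}, \qquad q \neq 1,
\]

    both reducing to the ordinary logarithm and exponential as q → 1. The paper's claim is that applying ln_q to its alternative q-Weibull variable yields a q-extreme value variable, mirroring the classical fact that the logarithm of a Weibull variable follows an extreme value (Gumbel) distribution; the exact parametrization is the authors' and is not reproduced here.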

  6. The McDonald’s Inverse Weibull Distribution

    Directory of Open Access Journals (Sweden)

    Saman Hanif Shahbaz

    2016-12-01

    Full Text Available We propose a new inverse Weibull distribution by using the generalized beta distribution of McDonald (1984). Basic properties of the proposed distribution have been studied. Parameter estimation has been discussed alongside an illustrative example.

  7. Evaluation of burst probability for tubes by Weibull distributions

    International Nuclear Information System (INIS)

    Kao, S.

    1975-10-01

    The investigation of candidate distributions that best describe the burst pressure failure probability characteristics of nuclear power steam generator tubes has been continued. To date it has been found that the Weibull distribution provides an acceptable fit for the available data from both the statistical and physical viewpoints. The reasons for the acceptability of the Weibull distribution are stated together with the results of tests for the suitability of fit. In exploring the acceptability of the Weibull distribution for the fitting, a graphical method called the ''density-gram'' is employed instead of the usual histogram. With this method, a more sensible graphical observation of the empirical density may be made for cases where the available data are very limited. Based on these methods, estimates of failure pressure are made for the left-tail probabilities.

  8. Transmuted New Generalized Inverse Weibull Distribution

    Directory of Open Access Journals (Sweden)

    Muhammad Shuaib Khan

    2017-06-01

    Full Text Available This paper introduces the transmuted new generalized inverse Weibull distribution by using the quadratic rank transmutation map (QRTM) scheme studied by Shaw et al. (2007). The proposed model contains twenty-three lifetime distributions as special sub-models. Some mathematical properties of the new distribution are derived, such as the quantile function, Rényi entropy, mean deviations, moments, moment generating function and order statistics. The method of maximum likelihood is used for estimating the model parameters. We illustrate the flexibility and potential usefulness of the new distribution using reliability data.

  9. (AJST) MULTIPLE DEFECT DISTRIBUTIONS ON WEIBULL ...

    African Journals Online (AJOL)

    such as ceramics, which cannot correctly be statistically described by single Weibull distribution models. (Equations (1) and (2)) [6]. .... bottom filling through a filter at an initial runner velocity of less than 0.5 ms−1 beyond the filter, producing turbulence free conditions. The as-cast test bars were subjected to identical T6 heat.

  10. Analysis of wind speed distributions: Wind distribution function derived from minimum cross entropy principles as better alternative to Weibull function

    International Nuclear Information System (INIS)

    Kantar, Yeliz Mert; Usta, Ilhan

    2008-01-01

    In this study, the minimum cross entropy (MinxEnt) principle is applied for the first time to the wind energy field. This principle allows the inclusion of prior information on a wind speed distribution and covers the maximum entropy (MaxEnt) principle, which is also discussed by Li and Li and by Ramirez as special cases in their wind power studies. The MinxEnt probability density functions (pdfs) derived from the MinxEnt principle are used to determine the diurnal, monthly, seasonal and annual wind speed distributions. A comparison between MinxEnt pdfs defined on the basis of the MinxEnt principle and the Weibull pdf is conducted on wind speed data taken from different sources and measured in various regions. The wind power densities of the considered regions obtained from the Weibull and MinxEnt pdfs are also compared. The results indicate that the pdfs derived from the MinxEnt principle fit a variety of measured wind speed data better than the conventionally applied empirical Weibull pdf. Therefore, it is shown that the MinxEnt principle can be used as an alternative method to estimate both the wind distribution and the wind power accurately.

  11. On changing points of mean residual life and failure rate function for some generalized Weibull distributions

    International Nuclear Information System (INIS)

    Xie, M.; Goh, T.N.; Tang, Y.

    2004-01-01

    The failure rate function and the mean residual life function are two important characteristics in reliability analysis. Although many papers have studied distributions with bathtub-shaped failure rates and their properties, few have focused on the underlying associations between the mean residual life and the failure rate function of these distributions, especially with respect to their change points. It is known that the change point for the mean residual life can be much earlier than that of the failure rate function. In fact, the failure rate function should be flat for a long period of time for a distribution to be useful in practice. When the difference between the change points is large, the flat portion tends to be longer. This paper investigates the change points and focuses on the difference between them. The exponentiated Weibull, a modified Weibull, and an extended Weibull distribution, all with bathtub-shaped failure rate functions, are used. Some other issues related to the flatness of the bathtub curve are discussed.

  12. Weibull statistic analysis of bending strength in the cemented carbide coatings

    International Nuclear Information System (INIS)

    Yi Yong; Shen Baoluo; Qiu Shaoyu; Li Cong

    2003-01-01

    The theoretical basis for using Weibull statistics to analyze the strength of coatings has been established: the Weibull distribution is the asymptotic distribution of coating strength as the coating volume increases, provided that the local strengths of the coating are statistically independent. This has been confirmed by the following tests of the bending strength of two cemented carbide coatings. The results show that Weibull statistics can be used effectively to analyze the strength of the two coatings. (authors)

  13. A spatial scan statistic for survival data based on Weibull distribution.

    Science.gov (United States)

    Bhatt, Vijaya; Tiwari, Neeraj

    2014-05-20

    The spatial scan statistic has been developed as a geographical cluster detection analysis tool for different types of data sets such as Bernoulli, Poisson, ordinal, normal and exponential. We propose a scan statistic for survival data based on the Weibull distribution. It may also be used for other survival distributions, such as the exponential, gamma, and log-normal. The proposed method is applied to the survival data of tuberculosis patients for the years 2004-2005 in Nainital district of Uttarakhand, India. Simulation studies reveal that the proposed method performs well for different survival distribution functions. Copyright © 2013 John Wiley & Sons, Ltd.

  14. Statistics for Ratios of Rayleigh, Rician, Nakagami-m, and Weibull Distributed Random Variables

    Directory of Open Access Journals (Sweden)

    Dragana Č. Pavlović

    2013-01-01

    Full Text Available The distributions of ratios of random variables are of interest in many areas of the sciences. In this brief paper, we present the joint probability density function (PDF) and the PDF of the maximum of the ratios μ1 = R1/r1 and μ2 = R2/r2 for the cases where R1, R2, r1, and r2 are Rayleigh, Rician, Nakagami-m, and Weibull distributed random variables. The random variables R1 and R2, as well as r1 and r2, are correlated. Given the suitability of the Weibull distribution for describing fading in both indoor and outdoor environments, special attention is dedicated to the case of Weibull random variables. For this case, analytical expressions for the joint PDF, the PDF of the maximum, the PDF of the minimum, and the product moments of an arbitrary number of ratios μi = Ri/ri, i = 1,…,L are obtained. The random variables in the numerator, Ri, as well as those in the denominator, ri, are exponentially correlated. To the best of the authors' knowledge, the analytical expressions for the PDF of the minimum and the product moments of μ1,…,μL are novel in the open technical literature. The proposed mathematical analysis is complemented by various numerical results. An application of the presented theoretical results is illustrated with respect to the performance assessment of wireless systems.

  15. Statistical analysis of the Vickers micro hardness of precipitates in a Cu-10% wt. Ni-3% wt. Al alloy using the Weibull distribution function

    International Nuclear Information System (INIS)

    Diaz, Gerardo; Donoso, Eduardo; Varschavsky, Ari

    2004-01-01

    A statistical analysis was carried out of the distribution of Vickers microhardness values of nickel and aluminum precipitates formed from a Cu-Ni-Al solid solution. Non-isothermal calorimetric curves confirmed the formation of two types of precipitates: NiAl from 45 K to 600 K, and Ni3Al from 650 K to 800 K. The microhardness measurements were done at room temperature on the previously quenched material submitted to isothermal and isochronal annealing treatments. The NiAl precipitate showed a lower dispersion of the Vickers microhardness values over its entire formation temperature range, with a lower average microhardness than the Ni3Al precipitate. The Weibull moduli were estimated from the respective Weibull diagrams, and the lower dispersion was confirmed by the higher values of the Weibull moduli. The maximum average microhardness attained by the NiAl phase was 148, with a Weibull modulus of 26, for an annealing temperature of 553 K maintained for 40 minutes. The Ni3Al phase reached a maximum average microhardness of 248, with a Weibull modulus of 10, for an annealing temperature of 793 K maintained for 40 minutes. (CW)

  16. Weibull Distribution for Estimating the Parameters and Application of Hilbert Transform in case of a Low Wind Speed at Kolaghat

    Directory of Open Access Journals (Sweden)

    P Bhattacharya

    2016-09-01

    Full Text Available The wind resource varies with the time of day and the season of the year, and even to some extent from year to year. Wind energy has inherent variability and hence has been described by distribution functions. In this paper, we present some methods for estimating the Weibull parameters in the case of a low wind speed characterization, namely the shape parameter (k) and the scale parameter (c), and characterize the discrete wind data sample by the discrete Hilbert transform. The Weibull distribution is an important distribution, especially for reliability and maintainability analysis. Suitable values for both the shape and scale parameters of the Weibull distribution are important for selecting locations for installing wind turbine generators, and the scale parameter is also important for determining whether a wind farm is good or not. The use of the discrete Hilbert transform (DHT) for wind speed characterization opens a new avenue for using the DHT besides its application in digital signal processing. In this paper, the discrete Hilbert transform is applied to characterize the wind sample data measured at the College of Engineering and Management, Kolaghat, East Midnapore, India in January 2011.

  17. Reliability Implications in Wood Systems of a Bivariate Gaussian-Weibull Distribution and the Associated Univariate Pseudo-truncated Weibull

    Science.gov (United States)

    Steve P. Verrill; James W. Evans; David E. Kretschmann; Cherilyn A. Hatfield

    2014-01-01

    Two important wood properties are the modulus of elasticity (MOE) and the modulus of rupture (MOR). In the past, the statistical distribution of the MOE has often been modeled as Gaussian, and that of the MOR as lognormal or as a two- or three-parameter Weibull distribution. It is well known that MOE and MOR are positively correlated. To model the simultaneous behavior...

  18. Weibull statistical analysis of Krouse type bending fatigue of nuclear materials

    Energy Technology Data Exchange (ETDEWEB)

    Haidyrah, Ahmed S., E-mail: ashdz2@mst.edu [Nuclear Engineering, Missouri University of Science & Technology, 301 W. 14th, Rolla, MO 65409 (United States); Nuclear Science Research Institute, King Abdulaziz City for Science and Technology (KACST), P.O. Box 6086, Riyadh 11442 (Saudi Arabia); Newkirk, Joseph W. [Materials Science & Engineering, Missouri University of Science & Technology, 1440 N. Bishop Ave, Rolla, MO 65409 (United States); Castaño, Carlos H. [Nuclear Engineering, Missouri University of Science & Technology, 301 W. 14th, Rolla, MO 65409 (United States)

    2016-03-15

    A bending fatigue mini-specimen (Krouse-type) was used to study the fatigue properties of nuclear materials. The objective of this paper is to study fatigue of Grade 91 ferritic-martensitic steel using a mini-specimen (Krouse-type) suitable for reactor irradiation studies. These mini-specimens are similar in design (but smaller) to those described in the ASTM B593 standard. The mini-specimens were machined by waterjet and tested as-received. The bending fatigue machine was modified to test the mini-specimen with a specially designed adapter. The cyclic bending fatigue behavior of Grade 91 was studied under constant deflection. The S–N curve was created and the mean fatigue life was analyzed. In this study, the Weibull function was used to describe the probability of failure at stress levels from high to low (563, 310 and 265 MPa). The commercial software Minitab 17 was used to calculate the distribution of fatigue life under the different stress levels. Two- and three-parameter Weibull analyses were used to describe the probability of failure, and the plots indicated that the three-parameter Weibull distribution fits the data well.

  19. Weibull statistical analysis of Krouse type bending fatigue of nuclear materials

    International Nuclear Information System (INIS)

    Haidyrah, Ahmed S.; Newkirk, Joseph W.; Castaño, Carlos H.

    2016-01-01

    A bending fatigue mini-specimen (Krouse-type) was used to study the fatigue properties of nuclear materials. The objective of this paper is to study fatigue of Grade 91 ferritic-martensitic steel using a mini-specimen (Krouse-type) suitable for reactor irradiation studies. These mini-specimens are similar in design (but smaller) to those described in the ASTM B593 standard. The mini-specimens were machined by waterjet and tested as-received. The bending fatigue machine was modified to test the mini-specimen with a specially designed adapter. The cyclic bending fatigue behavior of Grade 91 was studied under constant deflection. The S–N curve was created and the mean fatigue life was analyzed. In this study, the Weibull function was used to describe the probability of failure at stress levels from high to low (563, 310 and 265 MPa). The commercial software Minitab 17 was used to calculate the distribution of fatigue life under the different stress levels. Two- and three-parameter Weibull analyses were used to describe the probability of failure, and the plots indicated that the three-parameter Weibull distribution fits the data well.

  20. A Monte Carlo study of Weibull reliability analysis for space shuttle main engine components

    Science.gov (United States)

    Abernethy, K.

    1986-01-01

    The incorporation of a number of additional capabilities into an existing Weibull analysis computer program and the results of a Monte Carlo computer simulation study to evaluate the usefulness of the Weibull methods using samples with a very small number of failures and extensive censoring are discussed. Since the censoring mechanism inherent in the Space Shuttle Main Engine (SSME) data is hard to analyze, it was decided to use a random censoring model, generating censoring times from a uniform probability distribution. Some of the statistical techniques and computer programs that are used in the SSME Weibull analysis are described. The documented methods were supplemented by adding computer calculations of approximate (iteratively computed) confidence intervals for several parameters of interest. These calculations are based on a likelihood ratio statistic which is asymptotically a chi-squared statistic with one degree of freedom. The assumptions built into the computer simulations are described, as are the simulation program and the techniques used in it. Simulation results are tabulated for various combinations of Weibull shape parameters and numbers of failures in the samples.

  1. A robust approach based on Weibull distribution for clustering gene expression data

    Directory of Open Access Journals (Sweden)

    Gong Binsheng

    2011-05-01

    Full Text Available Abstract Background Clustering is a widely used technique for the analysis of gene expression data. Most clustering methods group genes based on distances, while few methods group genes according to the similarities of the distributions of the gene expression levels. Furthermore, as biological annotation resources have accumulated, an increasing number of genes have been annotated into functional categories. As a result, evaluating the performance of clustering methods in terms of the functional consistency of the resulting clusters is of great interest. Results In this paper, we propose the WDCM (Weibull Distribution-based Clustering Method), a robust approach for clustering gene expression data, in which the expression levels of individual genes are considered as random variables following unique Weibull distributions. The WDCM is based on the concept that genes with similar expression profiles have similar distribution parameters, and thus the genes are clustered via the Weibull distribution parameters. We used the WDCM to cluster three cancer gene expression data sets from lung cancer, B-cell follicular lymphoma and bladder carcinoma and obtained well-clustered results. We compared the performance of the WDCM with k-means and Self Organizing Map (SOM) using the functional annotation information given by the Gene Ontology (GO). The results showed that the functional annotation ratios of the WDCM are higher than those of the other methods. We also utilized the external measure Adjusted Rand Index to validate the performance of the WDCM. The comparative results demonstrate that the WDCM provides better clustering performance than the k-means and SOM algorithms. The merit of the proposed WDCM is that it can be applied to cluster incomplete gene expression data without imputing the missing values. Moreover, the robustness of the WDCM is also evaluated on the incomplete data sets. Conclusions The results demonstrate that our WDCM produces clusters

  2. Calculation of life distributions, in particular Weibull distributions, from operational observations

    International Nuclear Information System (INIS)

    Rauhut, J.

    1982-01-01

    Established methods are presented by which the life distributions of machine elements can be determined on the basis of laboratory experiments and operational observations. Practical observations are given special attention, as the results estimated on the basis of conventional methods have not been accurate enough. As an introduction, the stochastic life concept, the general method of determining life distributions, various sampling methods, and the Weibull distribution are explained. Further, possible life testing schedules and maximum-likelihood estimates are discussed for the complete-sample case and for censored sampling without replacement in laboratory experiments. Finally, censored sampling with replacement in laboratory experiments is discussed; it is shown how suitable parameter estimates can be obtained for given life distributions by means of the maximum-likelihood method. (orig./RW) [de]

  3. Anomalous diffusion and q-Weibull velocity distributions in epithelial cell migration.

    Directory of Open Access Journals (Sweden)

    Tatiane Souza Vilela Podestá

    Full Text Available In multicellular organisms, cell motility is central to all morphogenetic processes, tissue maintenance, wound healing and immune surveillance. Hence, the control of cell motion is a major demand in the creation of artificial tissues and organs. Here, cell migration assays on plastic 2D surfaces involving normal (MDCK) and tumoral (B16F10) epithelial cell lines were performed, varying the initial density of plated cells. Through time-lapse microscopy, quantities such as speed distributions, velocity autocorrelations and spatial correlations, as well as the scaling of mean-squared displacements, were determined. We find that these cells exhibit anomalous diffusion with q-Weibull speed distributions that evolve non-monotonically to a Maxwellian distribution as the initial density of plated cells increases. Although short-ranged spatial velocity correlations mark the formation of small cell clusters, the emergence of collective motion was not observed. Finally, simulational results from a correlated random walk and the Vicsek model of collective dynamics show that fluctuations in cell velocity orientations are sufficient to produce the q-Weibull speed distributions seen in our migration assays.

  4. Inference on the reliability of Weibull distribution with multiply Type-I censored data

    International Nuclear Information System (INIS)

    Jia, Xiang; Wang, Dong; Jiang, Ping; Guo, Bo

    2016-01-01

    In this paper, we focus on the reliability of the Weibull distribution under multiply Type-I censoring, which is a general form of Type-I censoring. In the multiply Type-I censoring considered in this study, all units in the life testing experiment are terminated at different times. Reliability estimation based on the maximum likelihood estimates of the Weibull parameters is conducted. Using the delta method and the Fisher information, we propose a confidence interval for reliability and compare it with the bias-corrected and accelerated bootstrap confidence interval. Furthermore, a scenario involving a few expert judgments of reliability is considered. A method is developed to generate extended estimations of reliability from the original judgments and transform them into estimations of the Weibull parameters. With Bayes theory and the Markov Chain Monte Carlo method, a posterior sample is obtained to compute the Bayes estimate and credible interval for reliability. Monte Carlo simulation demonstrates that the proposed confidence interval outperforms the bootstrap one. The Bayes estimate and credible interval for reliability are both satisfactory. Finally, a real example is analyzed to illustrate the application of the proposed methods. - Highlights: • We focus on the reliability of the Weibull distribution under multiply Type-I censoring. • The proposed confidence interval for the reliability is superior after comparison. • The Bayes estimates with a few expert judgments on reliability are satisfactory. • We specify the cases where the MLEs do not exist and present methods to remedy them. • The distribution of the estimate of reliability should be used for an accurate estimate.
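
    As a rough sketch of the first part of the abstract (maximum likelihood under multiply Type-I censoring plus a delta-method confidence interval for reliability), the Python fragment below uses made-up failure/censoring times and approximates the parameter covariance by the optimizer's inverse Hessian of the negative log-likelihood; none of the numbers or names come from the paper.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

# Hypothetical multiply Type-I censored sample: unit i is observed until time t[i];
# delta[i] = 1 if it failed at t[i], 0 if it was still running (censored) at t[i].
t = np.array([12.0, 35.0, 48.0, 60.0, 60.0, 72.0, 80.0, 80.0, 95.0, 100.0])
delta = np.array([1, 1, 1, 0, 1, 0, 1, 0, 1, 0])

def neg_log_lik(theta):
    k, c = theta
    ll = delta * weibull_min.logpdf(t, c=k, scale=c) \
       + (1 - delta) * weibull_min.logsf(t, c=k, scale=c)
    return -ll.sum()

res = minimize(neg_log_lik, x0=[1.0, t.mean()], method="L-BFGS-B",
               bounds=[(1e-6, None), (1e-6, None)])
k_hat, c_hat = res.x

t0 = 50.0  # mission time of interest
R_hat = weibull_min.sf(t0, c=k_hat, scale=c_hat)

# Delta method: var(R) ~ grad(R)' * cov(theta_hat) * grad(R), with the covariance
# roughly approximated by the optimizer's inverse Hessian of the negative log-likelihood.
eps = 1e-5
grad = np.array([
    (weibull_min.sf(t0, c=k_hat + eps, scale=c_hat) - R_hat) / eps,
    (weibull_min.sf(t0, c=k_hat, scale=c_hat + eps) - R_hat) / eps,
])
cov = res.hess_inv.todense()
var_R = grad @ cov @ grad
ci = (R_hat - 1.96 * np.sqrt(var_R), R_hat + 1.96 * np.sqrt(var_R))
print(f"R({t0}) = {R_hat:.3f}, approximate 95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")
```

    The paper itself uses the exact Fisher information rather than an optimizer by-product, so the interval above should be read only as an illustration of the delta-method mechanics.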

  5. An Approach to Determine the Weibull Parameters and Wind Power Analysis of Saint Martin’s Island, Bangladesh

    Directory of Open Access Journals (Sweden)

    Islam Khandaker Dahirul

    2016-01-01

    Full Text Available This paper explores the wind speed distribution using the Weibull probability distribution and Rayleigh distribution methods, which are proven to provide accurate and efficient estimation of the energy output of wind energy conversion systems. The two Weibull parameters (the shape and scale parameters k and c, respectively) and the scale parameter of the Rayleigh distribution have been determined based on hourly time-series wind speed data recorded from October 2014 to October 2015 at Saint Martin’s island, Bangladesh. This research examines three numerical methods, namely the Graphical Method (GM), the Empirical Method (EM) and the Energy Pattern Factor method (EPF), to estimate the Weibull parameters; the Rayleigh distribution method has also been analyzed throughout the study. The results revealed that the Graphical method, followed by the Empirical method and the Energy Pattern Factor method, was the most accurate and efficient way to determine the values of k and c for approximating the wind speed distribution in terms of the power estimation error, while the Rayleigh distribution gives the largest power error. The potential for wind energy development in Saint Martin’s island, Bangladesh, as found from the data analysis, is also explained in this paper.
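
    The abstract names the Empirical and Energy Pattern Factor methods without reproducing their formulas; the Python sketch below applies the commonly cited forms, k = (σ/v̄)^(−1.086) and c = v̄/Γ(1 + 1/k) for the empirical method, and E_pf = mean(v³)/v̄³ with k = 1 + 3.69/E_pf² for the energy pattern factor method, to a hypothetical wind speed sample; the measured Saint Martin’s series is not reproduced here.

```python
import numpy as np
from scipy.special import gamma

# Hypothetical hourly wind speeds (m/s); the measured island series is not reproduced here.
rng = np.random.default_rng(2)
v = 6.5 * rng.weibull(1.9, size=2000)

v_mean, v_std = v.mean(), v.std(ddof=1)

# Empirical method (moment-based approximation).
k_emp = (v_std / v_mean) ** -1.086
c_emp = v_mean / gamma(1.0 + 1.0 / k_emp)

# Energy pattern factor method.
epf = np.mean(v ** 3) / v_mean ** 3
k_epf = 1.0 + 3.69 / epf ** 2
c_epf = v_mean / gamma(1.0 + 1.0 / k_epf)

print(f"empirical:             k = {k_emp:.2f}, c = {c_emp:.2f} m/s")
print(f"energy pattern factor: k = {k_epf:.2f}, c = {c_epf:.2f} m/s")
```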

  6. Comparing performances of clements, box-cox, Johnson methods with weibull distributions for assessing process capability

    Energy Technology Data Exchange (ETDEWEB)

    Senvar, O.; Sennaroglu, B.

    2016-07-01

    This study examines the Clements’ Approach (CA), Box-Cox transformation (BCT), and Johnson transformation (JT) methods for process capability assessments with Weibull-distributed data with different parameters, in order to figure out the effects of tail behaviour on process capability and to compare their estimation performances in terms of accuracy and precision. Design/methodology/approach: The process performance index (PPI) Ppu is used for process capability analysis (PCA) because the comparisons are performed on generated Weibull data without subgroups. Box plots, descriptive statistics, the root-mean-square deviation (RMSD), which is used as a measure of error, and a radar chart are utilized together for evaluating the performances of the methods. In addition, the bias of the estimated values is important, as is the efficiency measured by the mean square error. In this regard, the Relative Bias (RB) and the Relative Root Mean Square Error (RRMSE) are also considered. Findings: The results reveal that the performance of a method depends on its capability to fit the tail behavior of the Weibull distribution and on the targeted values of the PPIs. It is observed that the effect of tail behavior is more significant when the process is more capable. Research limitations/implications: Some other methods, such as the Weighted Variance method, which also give good results, were also examined. However, we later realized that it would be confusing in terms of comparison issues between the methods for consistent interpretations... (Author)

  7. Comparing performances of clements, box-cox, Johnson methods with weibull distributions for assessing process capability

    Directory of Open Access Journals (Sweden)

    Ozlem Senvar

    2016-08-01

    Full Text Available Purpose: This study examines the Clements’ Approach (CA), Box-Cox transformation (BCT), and Johnson transformation (JT) methods for process capability assessments with Weibull-distributed data with different parameters, in order to figure out the effects of tail behaviour on process capability and to compare their estimation performances in terms of accuracy and precision. Design/methodology/approach: The process performance index (PPI) Ppu is used for process capability analysis (PCA) because the comparisons are performed on generated Weibull data without subgroups. Box plots, descriptive statistics, the root-mean-square deviation (RMSD), which is used as a measure of error, and a radar chart are utilized together for evaluating the performances of the methods. In addition, the bias of the estimated values is important, as is the efficiency measured by the mean square error. In this regard, the Relative Bias (RB) and the Relative Root Mean Square Error (RRMSE) are also considered. Findings: The results reveal that the performance of a method depends on its capability to fit the tail behavior of the Weibull distribution and on the targeted values of the PPIs. It is observed that the effect of tail behavior is more significant when the process is more capable. Research limitations/implications: Some other methods, such as the Weighted Variance method, which also give good results, were also examined. However, we later realized that it would be confusing in terms of comparison issues between the methods for consistent interpretations. Practical implications: The Weibull distribution covers a wide class of non-normal processes due to its capability to yield a variety of distinct curves based on its parameters. Weibull distributions are known to have significantly different tail behaviors, which greatly affect the process capability. In quality and reliability applications, they are widely used for the analysis of failure data in order to understand how
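
    As a minimal illustration of the Box-Cox route described (the Clements and Johnson variants are not shown), the Python sketch below transforms hypothetical Weibull process data and an assumed upper specification limit and computes Ppu on the transformed scale; the distribution parameters and the limit are not the paper's simulation settings.

```python
import numpy as np
from scipy.stats import boxcox

# Hypothetical Weibull process data and an assumed upper specification limit;
# the simulation settings of the paper are not reproduced in the abstract.
rng = np.random.default_rng(3)
x = 2.0 * rng.weibull(1.5, size=1000)
usl = 6.0

# Box-Cox transformation toward normality, applied to the data and to the USL.
x_t, lam = boxcox(x)
usl_t = (usl ** lam - 1.0) / lam if lam != 0 else np.log(usl)

# Process performance index on the transformed (approximately normal) scale.
ppu = (usl_t - x_t.mean()) / (3.0 * x_t.std(ddof=1))
print(f"Box-Cox lambda = {lam:.3f}, Ppu = {ppu:.3f}")
```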

  8. The Transmuted Geometric-Weibull distribution: Properties, Characterizations and Regression Models

    Directory of Open Access Journals (Sweden)

    Zohdy M Nofal

    2017-06-01

    Full Text Available We propose a new lifetime model called the transmuted geometric-Weibull distribution. Some of its structural properties, including ordinary and incomplete moments, quantile and generating functions, probability weighted moments, Rényi and q-entropies, and order statistics, are derived. The maximum likelihood method is used to estimate the model parameters, and its performance is assessed by means of a Monte Carlo simulation study. A new location-scale regression model is introduced based on the proposed distribution. The new distribution is applied to two real data sets to illustrate its flexibility. Empirical results indicate that the proposed distribution can serve as an alternative to other lifetime models available in the literature for modeling real data in many areas.

  9. Evaluating failure rate of fault-tolerant multistage interconnection networks using Weibull life distribution

    International Nuclear Information System (INIS)

    Bistouni, Fathollah; Jahanshahi, Mohsen

    2015-01-01

    Fault-tolerant multistage interconnection networks (MINs) play a vital role in the performance of multiprocessor systems, where reliability evaluation becomes one of the main concerns in analyzing these networks properly. In many cases, the primary objective in system reliability analysis is to compute the failure distribution of the entire system from those of its components. However, since the problem is known to be NP-hard, none of the previous efforts has performed a precise evaluation of the system failure rate. Therefore, our goal is to investigate this parameter for different fault-tolerant MINs using the Weibull life distribution, which is one of the most commonly used distributions in reliability. In this paper, four important groups of fault-tolerant MINs are examined to find the best fault-tolerance techniques in terms of failure rate: (1) extra-stage MINs, (2) parallel MINs, (3) rearrangeable non-blocking MINs, and (4) replicated MINs. The paper comprehensively analyzes all perspectives of reliability (terminal, broadcast, and network reliability). Moreover, all reliability equations are calculated for different network sizes. - Highlights: • The failure rate of different MINs is analyzed by using the Weibull life distribution. • This article tries to find the best fault-tolerance technique in the field of MINs. • Complex series-parallel RBDs are used to determine the reliability of the MINs. • All aspects of the reliability (i.e. terminal, broadcast, and network) are analyzed. • All reliability equations are calculated for different network sizes N×N.
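
    As background for the failure-rate comparison described above, a small Python sketch of the Weibull hazard (failure-rate) and reliability functions and of how component reliabilities combine through a simple series-parallel reliability block diagram; the toy network structure and parameter values below are illustrative assumptions, not the MINs analyzed in the paper.

        import numpy as np

        def weibull_reliability(t, beta, eta):
            # R(t) = exp(-(t/eta)**beta) for a two-parameter Weibull component.
            return np.exp(-(t / eta) ** beta)

        def weibull_hazard(t, beta, eta):
            # h(t) = (beta/eta) * (t/eta)**(beta-1); increasing for beta > 1.
            return (beta / eta) * (t / eta) ** (beta - 1)

        t = np.linspace(0.1, 10.0, 100)
        r = weibull_reliability(t, beta=1.8, eta=5.0)    # one switching element

        # Toy series-parallel block diagram: two redundant paths in parallel,
        # each path a series chain of three identical elements.
        r_path = r ** 3                       # series: product of reliabilities
        r_system = 1.0 - (1.0 - r_path) ** 2  # parallel: 1 - product of unreliabilities
        print(r_system[:5])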

  10. A flexible Weibull extension

    International Nuclear Information System (INIS)

    Bebbington, Mark; Lai, C.-D.; Zitikis, Ricardas

    2007-01-01

    We propose a new two-parameter ageing distribution which is a generalization of the Weibull and study its properties. It has a simple failure rate (hazard rate) function. With appropriate choice of parameter values, it is able to model various ageing classes of life distributions including IFR, IFRA and modified bathtub (MBT). The ranges of the two parameters are clearly demarcated to separate these classes. It thus provides an alternative to many existing life distributions. Details of parameter estimation are provided through a Weibull-type probability plot and maximum likelihood. We also derive explicit formulas for the turning points of the failure rate function in terms of its parameters. This, combined with the parameter estimation procedures, will allow empirical estimation of the turning points for real data sets, which provides useful information for reliability policies
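
    The abstract above does not reproduce the formulas; one commonly quoted form of this two-parameter extension has cumulative hazard H(t) = exp(a*t - b/t). The Python sketch below assumes that parametrization, so treat it as an illustration rather than as the paper's exact definition.

        import numpy as np

        def flexible_weibull_cdf(t, a, b):
            # Assumed form: F(t) = 1 - exp(-exp(a*t - b/t)), for t > 0.
            return 1.0 - np.exp(-np.exp(a * t - b / t))

        def flexible_weibull_hazard(t, a, b):
            # h(t) = (a + b/t**2) * exp(a*t - b/t); depending on (a, b) this is
            # increasing or takes a modified-bathtub shape.
            return (a + b / t**2) * np.exp(a * t - b / t)

        t = np.linspace(0.05, 5.0, 200)
        h = flexible_weibull_hazard(t, a=0.5, b=0.3)
        print(h.min(), h.max())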

  11. Based on Weibull Information Fusion Analysis Semiconductors Quality the Key Technology of Manufacturing Execution Systems Reliability

    Science.gov (United States)

    Huang, Zhi-Hui; Tang, Ying-Chun; Dai, Kai

    2016-05-01

    The qualified rate of semiconductor materials and products is directly related to manufacturing costs and to the survival of the enterprise. A dynamic reliability growth analysis method is applied to study the reliability growth of a manufacturing execution system in order to improve product quality. Referring to the classical Duane model assumptions and to the TGP tracking-growth forecasting model, a Weibull distribution model is established from the failure data. Combining the median rank with the average rank method, Weibull information fusion reliability growth curves are fitted by linear regression and least squares estimation. This model overcomes a weakness of the Duane model, namely the low accuracy of its MTBF point estimate, and analysis of the failure data shows that the method is essentially consistent with the test and evaluation modeling process. The median rank is a statistical device for determining the distribution function of a random variable, and it is well suited to problems, such as complex systems, where the sample size is limited. The method therefore has considerable engineering application value.

  12. A log-Weibull spatial scan statistic for time to event data.

    Science.gov (United States)

    Usman, Iram; Rosychuk, Rhonda J

    2018-06-13

    Spatial scan statistics have been used for the identification of geographic clusters of elevated numbers of cases of a condition such as disease outbreaks. These statistics, accompanied by the appropriate distribution, can also identify geographic areas with either longer or shorter times to events. Other authors have proposed spatial scan statistics based on the exponential and Weibull distributions. We propose the log-Weibull as an alternative distribution for the spatial scan statistic for time-to-event data and compare and contrast the log-Weibull and Weibull distributions through simulation studies. The effects of type I differential censoring and the power of the test have been investigated through simulated data. The methods are also illustrated on time-to-specialist-visit data for discharged patients presenting to emergency departments for atrial fibrillation and flutter in Alberta during 2010-2011. We found that northern regions of Alberta had longer times to specialist visit than other areas. We propose the spatial scan statistic for the log-Weibull distribution as a new approach for detecting spatial clusters in time-to-event data. The simulation studies suggest that the test performs well for log-Weibull data.

  13. MEP family of wind speed distribution function and comparison with the empirical Weibull distribution. Paper no. IGEC-1-156

    International Nuclear Information System (INIS)

    Li, M.; Li, X.

    2005-01-01

    The probabilistic distribution of wind speed is one of the important wind characteristics for the assessment of wind energy potential and for the performance of wind energy conversion systems, as well as for structural and environmental design and analysis. In this study, an exponential family of distribution functions has been developed for the description of the probabilistic distribution of wind speed, and a comparison with wind speed data taken from different sources and measured at different geographical locations in the world has been made. This family of distributions is developed by introducing a pre-exponential term into the theoretical distribution derived from the Maximum Entropy Principle (MEP). A statistical parameter based on the wind power density is used to judge the suitability of the distribution functions. It is shown that the MEP-type distributions not only agree better with a variety of the measured wind speed data than the conventionally used empirical Weibull distribution, but can also represent the wind power density much more accurately. Therefore, the MEP-type distributions are more suitable for the assessment of the wind energy potential and the performance of wind energy conversion systems. (author)
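
    For context, a short Python sketch of the quantity used above as the suitability criterion: the mean wind power density implied by a Weibull fit, compared with the value computed directly from measured speeds. The parameter values are hypothetical.

        import numpy as np
        from scipy.special import gamma

        def weibull_power_density(k, c, rho=1.225):
            # P = 0.5 * rho * E[v**3] = 0.5 * rho * c**3 * Gamma(1 + 3/k)  [W/m^2]
            return 0.5 * rho * c**3 * gamma(1.0 + 3.0 / k)

        def empirical_power_density(speeds, rho=1.225):
            # Same quantity taken straight from measured speeds, for comparison.
            return 0.5 * rho * np.mean(np.asarray(speeds, dtype=float) ** 3)

        print(weibull_power_density(k=2.0, c=6.5))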

  14. arXiv Describing dynamical fluctuations and genuine correlations by Weibull regularity

    CERN Document Server

    Nayak, Ranjit K.; Sarkisyan-Grinbaum, Edward K.; Tasevsky, Marek

    The Weibull parametrization of the multiplicity distribution is used to describe the multidimensional local fluctuations and genuine multiparticle correlations measured by OPAL in the large-statistics $e^{+}e^{-} \to Z^{0} \to hadrons$ sample. The data are found to be well reproduced by the Weibull model up to higher orders. The Weibull predictions are compared to those of two other models, namely the negative binomial and modified negative binomial distributions, which mostly fail to fit the data. The Weibull regularity, which is found to reproduce the multiplicity distributions along with the genuine correlations, appears to be the optimal model for describing the multiparticle production process.

  15. An EOQ Model with Two-Parameter Weibull Distribution Deterioration and Price-Dependent Demand

    Science.gov (United States)

    Mukhopadhyay, Sushanta; Mukherjee, R. N.; Chaudhuri, K. S.

    2005-01-01

    An inventory replenishment policy is developed for a deteriorating item and price-dependent demand. The rate of deterioration is taken to be time-proportional and the time to deterioration is assumed to follow a two-parameter Weibull distribution. A power law form of the price dependence of demand is considered. The model is solved analytically…

  16. PERFORMANCE ANALYSIS OF METHODS FOR ESTIMATING WEIBULL PARAMETERS FOR WIND SPEED DISTRIBUTION IN THE DISTRICT OF MAROUA

    Directory of Open Access Journals (Sweden)

    D. Kidmo Kaoga

    2015-07-01

    Full Text Available In this study, five numerical Weibull distribution methods, namely, the maximum likelihood method, the modified maximum likelihood method (MLM), the energy pattern factor method (EPF), the graphical method (GM), and the empirical method (EM), were explored using hourly synoptic data collected from 1985 to 2013 in the district of Maroua in Cameroon. The performance analysis revealed that the MLM was the most accurate model, followed by the EPF and the GM. Furthermore, the comparison between the wind speed standard deviation predicted by the proposed models and the measured data showed that the MLM has a smaller relative error of -3.33% on average, compared to -11.67% on average for the EPF and -8.86% on average for the GM. As a result, the MLM was recommended for estimating the scale and shape parameters for an accurate and efficient wind energy potential evaluation.
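
    As a concrete illustration of two of the estimation methods compared above, the Python sketch below implements the standard maximum likelihood fixed-point iteration and the empirical (standard-deviation-based) method for the Weibull shape k and scale c; the starting value and iteration count are arbitrary choices, not those of the study.

        import numpy as np
        from scipy.special import gamma

        def weibull_mle(v, k0=2.0, n_iter=50):
            # Maximum likelihood fit via the usual fixed-point iteration on k:
            # k = [sum(v**k * ln v)/sum(v**k) - mean(ln v)]**-1, then
            # c = (mean(v**k))**(1/k).
            v = np.asarray(v, dtype=float)
            v = v[v > 0]                      # MLE needs strictly positive speeds
            k = k0
            for _ in range(n_iter):
                vk = v ** k
                k = 1.0 / (np.sum(vk * np.log(v)) / np.sum(vk) - np.mean(np.log(v)))
            c = np.mean(v ** k) ** (1.0 / k)
            return k, c

        def weibull_empirical(v):
            # Empirical method: k from the coefficient of variation,
            # c from the mean and the gamma function.
            v = np.asarray(v, dtype=float)
            k = (np.std(v, ddof=1) / np.mean(v)) ** (-1.086)
            c = np.mean(v) / gamma(1.0 + 1.0 / k)
            return k, c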

  17. PERFORMANCE ANALYSIS OF METHODS FOR ESTIMATING WEIBULL PARAMETERS FOR WIND SPEED DISTRIBUTION IN THE DISTRICT OF MAROUA

    Directory of Open Access Journals (Sweden)

    D. Kidmo Kaoga

    2014-12-01

    Full Text Available In this study, five numerical Weibull distribution methods, namely, the maximum likelihood method, the modified maximum likelihood method (MLM), the energy pattern factor method (EPF), the graphical method (GM), and the empirical method (EM), were explored using hourly synoptic data collected from 1985 to 2013 in the district of Maroua in Cameroon. The performance analysis revealed that the MLM was the most accurate model, followed by the EPF and the GM. Furthermore, the comparison between the wind speed standard deviation predicted by the proposed models and the measured data showed that the MLM has a smaller relative error of -3.33% on average, compared to -11.67% on average for the EPF and -8.86% on average for the GM. As a result, the MLM was recommended for estimating the scale and shape parameters for an accurate and efficient wind energy potential evaluation.

  18. On the q-Weibull distribution for reliability applications: An adaptive hybrid artificial bee colony algorithm for parameter estimation

    International Nuclear Information System (INIS)

    Xu, Meng; Droguett, Enrique López; Lins, Isis Didier; Chagas Moura, Márcio das

    2017-01-01

    The q-Weibull model is based on the Tsallis non-extensive entropy and is able to model various behaviors of the hazard rate function, including bathtub curves, by using a single set of parameters. Despite its flexibility, the q-Weibull has not been widely used in reliability applications, partly because of its complicated parameter estimation. In this work, the parameters of the q-Weibull are estimated by the maximum likelihood (ML) method. Due to the intricate system of nonlinear equations, derivative-based optimization methods may fail to converge. Thus, the heuristic optimization method of the artificial bee colony (ABC) is used instead. To deal with the slow convergence of ABC, an adaptive hybrid ABC (AHABC) algorithm is proposed that dynamically combines the Nelder-Mead simplex search method with ABC for the ML estimation of the q-Weibull parameters. Interval estimates for the q-Weibull parameters, including confidence intervals based on ML asymptotic theory and on bootstrap methods, are also developed. The AHABC is validated via numerical experiments involving the q-Weibull ML for reliability applications, and the results show that it produces faster and more accurate convergence when compared to ABC and similar approaches. The estimation procedure is applied to real reliability failure data characterized by a bathtub-shaped hazard rate. - Highlights: • Development of an adaptive hybrid ABC (AHABC) algorithm for the q-Weibull distribution. • AHABC combines the local Nelder-Mead simplex method with ABC to enhance local search. • AHABC efficiently finds the optimal solution for the q-Weibull ML problem. • AHABC outperforms ABC and self-adaptive hybrid ABC in accuracy and convergence speed. • Useful model for reliability data with non-monotonic hazard rate.

  19. Improvement for Amelioration Inventory Model with Weibull Distribution

    Directory of Open Access Journals (Sweden)

    Han-Wen Tuan

    2017-01-01

    Full Text Available Most inventory models deal with deteriorating items. On the contrary, only a few papers have considered inventory systems in an amelioration environment. We study an amelioration inventory model with a Weibull distribution. However, there are some questionable results in the earlier amelioration paper. We first point out those questionable results in the previous paper, which did not derive the optimal solution, and then provide some improvements. We provide a rigorous analytical treatment of the different cases, depending on the size of the shape parameter. We present a detailed numerical example for different ranges of the shape parameter to illustrate that our solution method attains the optimal solution. We develop a new amelioration model and then provide a detailed analytical procedure for finding the optimal solution. Our findings will help researchers develop their new inventory models.

  20. A Study on The Mixture of Exponentiated-Weibull Distribution

    Directory of Open Access Journals (Sweden)

    Adel Tawfik Elshahat

    2016-12-01

    Full Text Available Mixtures of measures or distributions occur frequently in the theory and applications of probability and statistics. In the simplest case it may, for example, be reasonable to assume that one is dealing with a mixture, in given proportions, of a finite number of normal populations with different means or variances. The mixture parameter may also be denumerably infinite, as in the theory of sums of a random number of random variables, or continuous, as in the compound Poisson distribution. The use of finite mixture distributions, to control for unobserved heterogeneity, has become increasingly popular among those estimating dynamic discrete choice models. One of the barriers to using mixture models is that parameters that could previously be estimated in stages must now be estimated jointly: using mixture distributions destroys any additive separability of the log likelihood function. In this thesis, the maximum likelihood estimators have been obtained for the parameters of the mixture of exponentiated Weibull distributions when the sample is available from a censoring scheme. The maximum likelihood estimators of the parameters and the asymptotic variance-covariance matrix have also been obtained. A numerical illustration of these new results is given.

  1. Influence of the Determination Methods of K and C Parameters on the Ability of Weibull Distribution to Suitably Estimate Wind Potential and Electric Energy

    Directory of Open Access Journals (Sweden)

    Ruben M. Mouangue

    2014-05-01

    Full Text Available The modeling of the wind speed distribution is of great importance for the assessment of wind energy potential and the performance of wind energy conversion systems. In this paper, the choice between two methods for determining the Weibull parameters shows their influence on the performance of the Weibull distribution. Because of the high incidence of calm winds at the site of Ngaoundere airport, we characterize the wind potential using the Weibull distribution with parameters determined by the modified maximum likelihood method. This approach is compared to the Weibull distribution with parameters determined by the maximum likelihood method and to the hybrid distribution, which is recommended for wind potential assessment of sites having a nonzero probability of calm. Using data provided by the ASECNA Weather Service (Agency for the Safety of Air Navigation in Africa and Madagascar), we evaluate the goodness of fit of the various fitted distributions to the wind speed data using Q-Q plots, Pearson's coefficient of correlation, the mean wind speed, the mean square error, the energy density and its relative error. It appears from the results that the accuracy of the Weibull distribution with parameters determined by the modified maximum likelihood method is higher than that of the others. This approach is then used to estimate the monthly and annual energy production of the site of the Ngaoundere airport. The largest energy contribution is made in March, with 255.7 MWh. It also appears from the results that a wind turbine generator installed on this particular site would not operate for at least half of the time because of the high frequency of calms. For this kind of site, the modified maximum likelihood method proposed by Seguro and Lambert in 2000 is one of the best methods that can be used to determine the Weibull parameters.

  2. On the Performance Analysis of Digital Communications over Weibull-Gamma Channels

    KAUST Repository

    Ansari, Imran Shafique; Alouini, Mohamed-Slim

    2015-01-01

    In this work, the performance analysis of digital communications over a composite Weibull-Gamma (WG) multipath-fading and shadowing channel is presented wherein WG distribution is appropriate for modeling fading environments when multipath is superimposed on shadowing. More specifically, in this work, exact closed-form expressions are derived for the probability density function, the cumulative distribution function, the moment generating function, and the moments of a composite WG channel. Capitalizing on these results, new exact closed-form expressions are offered for the outage probability, the higher-order amount of fading, the average error rate for binary and M-ary modulation schemes, and the ergodic capacity under various types of transmission policies, mostly in terms of Meijer's G functions. These new analytical results were also verified via computer-based Monte-Carlo simulation results. © 2015 IEEE.

  3. Competing risk models in reliability systems, a Weibull distribution model with Bayesian analysis approach

    International Nuclear Information System (INIS)

    Iskandar, Ismed; Gondokaryono, Yudi Satria

    2016-01-01

    In reliability theory, the most important problem is to determine the reliability of a complex system from the reliability of its components. The weakness of most reliability theories is that the systems are described and explained as simply functioning or failed. In many real situations, the failures may arise from many causes, depending upon the age and the environment of the system and its components. Another problem in reliability theory is that of estimating the parameters of the assumed failure models. The estimation may be based on data collected over censored or uncensored life tests. In many reliability problems, the failure data are simply quantitatively inadequate, especially in engineering design and maintenance systems. Bayesian analyses are more beneficial than classical ones in such cases. Bayesian estimation allows us to combine past knowledge or experience, in the form of an a priori distribution, with life test data to make inferences about the parameter of interest. In this paper, we have investigated the application of Bayesian estimation to competing risk systems. The cases are limited to models with independent causes of failure, using the Weibull distribution as our model. A simulation is conducted for this distribution with the objectives of verifying the models and the estimators and of investigating the performance of the estimators for varying sample sizes. The simulation data are analyzed using Bayesian and maximum likelihood analyses. The simulation results show that a change in the true value of one parameter relative to another changes the standard deviation in the opposite direction. Given perfect information on the prior distribution, the estimation methods of the Bayesian analyses are better than those of the maximum likelihood. The sensitivity analyses show some amount of sensitivity to shifts of the prior locations. They also show the robustness of the Bayesian analysis within the range

  4. Probabilistic Analysis for Comparing Fatigue Data Based on Johnson-Weibull Parameters

    Science.gov (United States)

    Vlcek, Brian L.; Hendricks, Robert C.; Zaretsky, Erwin V.

    2013-01-01

    Leonard Johnson published a methodology for establishing the confidence that two populations of data are different. Johnson's methodology is dependent on limited combinations of test parameters (Weibull slope, mean life ratio, and degrees of freedom) and a set of complex mathematical equations. In this report, a simplified algebraic equation for confidence numbers is derived based on the original work of Johnson. The confidence numbers calculated with this equation are compared to those obtained graphically by Johnson. Using the ratios of mean life, the resultant values of confidence numbers at the 99 percent level deviate less than 1 percent from those of Johnson. At a 90 percent confidence level, the calculated values differ between +2 and 4 percent. The simplified equation is used to rank the experimental lives of three aluminum alloys (AL 2024, AL 6061, and AL 7075), each tested at three stress levels in rotating beam fatigue, analyzed using the Johnson-Weibull method, and compared to the ASTM Standard (E739-91) method of comparison. The ASTM Standard did not statistically distinguish between AL 6061 and AL 7075. However, it is possible to rank the fatigue lives of different materials with a reasonable degree of statistical certainty based on combined confidence numbers using the Johnson-Weibull analysis. AL 2024 was found to have the longest fatigue life, followed by AL 7075, and then AL 6061. The ASTM Standard and the Johnson-Weibull analysis result in the same stress-life exponent p for each of the three aluminum alloys at the median, or L50, lives.

  5. On the Performance Analysis of Digital Communications over Weibull-Gamma Channels

    KAUST Repository

    Ansari, Imran Shafique

    2015-05-01

    In this work, the performance analysis of digital communications over a composite Weibull-Gamma (WG) multipath-fading and shadowing channel is presented wherein WG distribution is appropriate for modeling fading environments when multipath is superimposed on shadowing. More specifically, in this work, exact closed-form expressions are derived for the probability density function, the cumulative distribution function, the moment generating function, and the moments of a composite WG channel. Capitalizing on these results, new exact closed-form expressions are offered for the outage probability, the higher-order amount of fading, the average error rate for binary and M-ary modulation schemes, and the ergodic capacity under various types of transmission policies, mostly in terms of Meijer's G functions. These new analytical results were also verified via computer-based Monte-Carlo simulation results. © 2015 IEEE.

  6. Estimation of the inverse Weibull distribution based on progressively censored data: Comparative study

    International Nuclear Information System (INIS)

    Musleh, Rola M.; Helu, Amal

    2014-01-01

    In this article we consider statistical inferences about the unknown parameters of the inverse Weibull distribution based on progressively type-II censoring, using classical and Bayesian procedures. For the classical procedures we propose using the maximum likelihood, the least squares, and the approximate maximum likelihood estimators. The Bayes estimators are obtained based on both symmetric and asymmetric (Linex, general entropy and precautionary) loss functions. There are no explicit forms for the Bayes estimators; therefore, we propose Lindley's approximation method to compute them. A comparison between these estimators is provided by using extensive simulation and three criteria, namely, bias, mean squared error and Pitman nearness (PN) probability. It is concluded that the approximate Bayes estimators outperform the classical estimators most of the time. A real-life data example is provided to illustrate our proposed estimators. - Highlights: • We consider progressively type-II censored data from the inverse Weibull (IW) distribution. • We derive MLEs, approximate MLEs, LS and Bayes estimates of the scale and shape parameters of the IW. • The Bayes estimator of the shape parameter cannot be expressed in closed form. • We suggest using Lindley's approximation. • We conclude that the Bayes estimates outperform the classical methods

  7. Weibull Model Allowing Nearly Instantaneous Failures

    Directory of Open Access Journals (Sweden)

    C. D. Lai

    2007-01-01

    The proposed model is expressed as a mixture of the uniform distribution and the Weibull distribution. Properties of the resulting distribution are derived; in particular, the probability density function, survival function, and hazard rate function are obtained. Some selected plots of these functions are also presented. An R script was written to fit the model parameters. An application of the modified model is illustrated.
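
    A minimal Python sketch of the mixture idea described above: a small probability mass of (nearly) instantaneous failures, represented by a uniform component on a short interval, mixed with an ordinary Weibull life. The support (0, a) of the early-failure component and all parameter values are assumptions for illustration, not the paper's exact specification.

        import numpy as np
        from scipy import stats

        def mixture_pdf(t, p, a, beta, eta):
            # f(t) = p * Uniform(0, a) pdf + (1 - p) * Weibull(beta, eta) pdf
            t = np.asarray(t, dtype=float)
            early = stats.uniform.pdf(t, loc=0.0, scale=a)
            main = stats.weibull_min.pdf(t, c=beta, scale=eta)
            return p * early + (1.0 - p) * main

        def mixture_survival(t, p, a, beta, eta):
            early = stats.uniform.sf(t, loc=0.0, scale=a)
            main = stats.weibull_min.sf(t, c=beta, scale=eta)
            return p * early + (1.0 - p) * main

        t = np.linspace(0.0, 10.0, 200)
        hazard = mixture_pdf(t, 0.05, 0.1, 1.5, 5.0) / mixture_survival(t, 0.05, 0.1, 1.5, 5.0)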

  8. Bayesian Estimation of Two-Parameter Weibull Distribution Using Extension of Jeffreys' Prior Information with Three Loss Functions

    Directory of Open Access Journals (Sweden)

    Chris Bambey Guure

    2012-01-01

    Full Text Available The Weibull distribution has been observed to be one of the most useful distributions for modelling and analysing lifetime data in engineering, biology, and other fields. Studies have been carried out vigorously in the literature to determine the best method of estimating its parameters. Recently, much attention has been given to the Bayesian estimation approach, which is in contention with other estimation methods. In this paper, we examine the performance of the maximum likelihood estimator and of Bayesian estimators using an extension of Jeffreys' prior information with three loss functions, namely, the linear exponential loss, the general entropy loss, and the squared error loss function, for estimating the two-parameter Weibull failure time distribution. These methods are compared using the mean squared error through a simulation study with varying sample sizes. The results show that the Bayesian estimator using the extension of Jeffreys' prior under the linear exponential loss function in most cases gives the smallest mean squared error and absolute bias for both the scale parameter α and the shape parameter β, for the given values of the extension of Jeffreys' prior.
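
    A sketch of the Monte Carlo frame used for this kind of comparison, in Python: repeated samples are drawn from a Weibull with known parameters and an estimator's mean squared error is accumulated. Only the maximum likelihood estimator is shown; a Bayes estimator (for example one computed via Lindley's approximation) would be evaluated on the same samples and compared on the same criterion. Sample size, parameter values and replication count are illustrative.

        import numpy as np
        from scipy import stats

        def simulate_mse(alpha=2.0, beta=1.5, n=30, reps=500, seed=0):
            rng = np.random.default_rng(seed)
            err_scale, err_shape = [], []
            for _ in range(reps):
                x = stats.weibull_min.rvs(c=beta, scale=alpha, size=n, random_state=rng)
                shape_hat, _, scale_hat = stats.weibull_min.fit(x, floc=0.0)
                err_scale.append((scale_hat - alpha) ** 2)
                err_shape.append((shape_hat - beta) ** 2)
            return np.mean(err_scale), np.mean(err_shape)

        print(simulate_mse())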

  9. Weibull-Based Design Methodology for Rotating Structures in Aircraft Engines

    Directory of Open Access Journals (Sweden)

    Erwin V. Zaretsky

    2003-01-01

    Full Text Available The NASA Energy-Efficient Engine (E3-Engine) is used as the basis of a Weibull-based life and reliability analysis. Each component's life, and thus the engine's life, is defined by high-cycle fatigue or low-cycle fatigue. Knowing the cumulative life distribution of each of the components making up the engine as represented by a Weibull slope is a prerequisite to predicting the life and reliability of the entire engine. As the engine's Weibull slope increases, the predicted life decreases. The predicted engine lives L5 (95% probability of survival) of approximately 17,000 and 32,000 hr do correlate with current engine-maintenance practices without and with refurbishment, respectively. The individual high-pressure turbine (HPT) blade lives necessary to obtain a blade system life L0.1 (99.9% probability of survival) of 9000 hr for Weibull slopes of 3, 6, and 9 are 47,391; 20,652; and 15,658 hr, respectively. For a design life of the HPT disks having probable points of failure equal to or greater than 36,000 hr at a probability of survival of 99.9%, the predicted disk system life L0.1 can vary from 9408 to 24,911 hr.

  10. Optimal pricing and lot-sizing decisions under Weibull distribution deterioration and trade credit policy

    Directory of Open Access Journals (Sweden)

    Manna S.K.

    2008-01-01

    Full Text Available In this paper, we consider the problem of simultaneous determination of retail price and lot-size (RPLS) under the assumption that the supplier offers a fixed credit period to the retailer. It is assumed that the item in stock deteriorates over time at a rate that follows a two-parameter Weibull distribution and that the price-dependent demand is represented by a constant-price-elasticity function of retail price. The RPLS decision model is developed and solved analytically. Results are illustrated with the help of a base example. Computational results show that the supplier earns more profits when the credit period is greater than the replenishment cycle length. Sensitivity analysis of the solution to changes in the value of input parameters of the base example is also discussed.

  11. Bayesian Estimation of the Scale Parameter of Inverse Weibull Distribution under the Asymmetric Loss Functions

    Directory of Open Access Journals (Sweden)

    Farhad Yahgmaei

    2013-01-01

    Full Text Available This paper proposes different methods of estimating the scale parameter of the inverse Weibull distribution (IWD). Specifically, the maximum likelihood estimator of the scale parameter of the IWD is introduced. We then derive the Bayes estimators for the scale parameter of the IWD by considering quasi, gamma, and uniform prior distributions under the squared error, entropy, and precautionary loss functions. Finally, the different proposed estimators are compared by extensive simulation studies in terms of the mean squared errors and the evolution of the risk functions.

  12. A drawback and an improvement of the classical Weibull probability plot

    International Nuclear Information System (INIS)

    Jiang, R.

    2014-01-01

    The classical Weibull Probability Paper (WPP) plot has been widely used to identify a model for fitting a given dataset. It is based on a match in shape between the WPP plots of the model and of the data. This paper carries out an analysis of the Weibull transformations that create the WPP plot and shows that the shape of the WPP plot of data randomly generated from a distribution model can be significantly different from the shape of the WPP plot of the model itself, due to the high non-linearity of the Weibull transformations. As such, choosing a model based on the shape of the WPP plot of the data can be unreliable. A cdf-based weighted least squares method is proposed to improve the parameter estimation accuracy, and an improved WPP plot is suggested to avoid the drawback of the classical WPP plot. The appropriateness and usefulness of the proposed estimation method and probability plot are illustrated by simulation and real-world examples
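
    For reference, a Python sketch of the classical WPP construction discussed above: sort the data, assign median-rank plotting positions, transform to (ln t, ln(-ln(1 - F))) coordinates, and fit a straight line whose slope estimates the shape and whose intercept gives the scale. The plotting positions and the unweighted least-squares fit are the conventional choices, not the paper's improved (cdf-weighted) variant.

        import numpy as np

        def weibull_probability_plot(data):
            t = np.sort(np.asarray(data, dtype=float))
            n = t.size
            f_hat = (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # median ranks
            x = np.log(t)
            y = np.log(-np.log(1.0 - f_hat))
            slope, intercept = np.polyfit(x, y, 1)            # unweighted LS line
            beta = slope                                      # shape estimate
            eta = np.exp(-intercept / slope)                  # scale estimate
            return x, y, beta, eta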

  13. Comparing Johnson’s SBB, Weibull and Logit-Logistic bivariate distributions for modeling tree diameters and heights using copulas

    Energy Technology Data Exchange (ETDEWEB)

    Cardil Forradellas, A.; Molina Terrén, D.M.; Oliveres, J.; Castellnou, M.

    2016-07-01

    Aim of study: In this study we compare the accuracy of three bivariate distributions: Johnson’s SBB, Weibull-2P and LL-2P functions for characterizing the joint distribution of tree diameters and heights. Area of study: North-West of Spain. Material and methods: Diameter and height measurements of 128 plots of pure and even-aged Tasmanian blue gum (Eucalyptus globulus Labill.) stands located in the North-west of Spain were considered in the present study. The SBB bivariate distribution was obtained from SB marginal distributions using a Normal Copula based on a four-parameter logistic transformation. The Plackett Copula was used to obtain the bivariate models from the Weibull and Logit-logistic univariate marginal distributions. The negative logarithm of the maximum likelihood function was used to compare the results and the Wilcoxon signed-rank test was used to compare the related samples of these logarithms calculated for each sample plot and each distribution. Main results: The best results were obtained by using the Plackett copula and the best marginal distribution was the Logit-logistic. Research highlights: The copulas used in this study have shown a good performance for modeling the joint distribution of tree diameters and heights. They could be easily extended for modelling multivariate distributions involving other tree variables, such as tree volume or biomass. (Author)

  14. Power Loss Analysis for Wind Power Grid Integration Based on Weibull Distribution

    Directory of Open Access Journals (Sweden)

    Ahmed Al Ameri

    2017-04-01

    Full Text Available The growth of electrical demand increases the need for renewable energy sources, such as wind energy, to meet that demand. Electrical power losses are an important factor when the location and size of a wind farm are selected. The capitalized cost of constant power losses over the life of a wind farm can reach high levels. During the operation period, a method to determine whether the losses meet the requirements of the design is greatly needed. This article presents a Simulink simulation of wind farm integration into the grid; the aim is to achieve a better understanding of the impact of wind variation on grid losses. The real power losses are set as a function of the annual variation, considering a Weibull distribution. An analytical method has been used to select the size and placement of a wind farm, taking into account active power loss reduction. It proposes a fast linear model estimation to find the optimal capacity of a wind farm based on DC power flow and graph theory. The results show that the analytical approach is capable of predicting the optimal size and location of wind turbines. Furthermore, it revealed that the annual variation of wind speed can have a strong effect on real power loss calculations. In addition to helping to improve utility efficiency, the proposed method can develop specific designs to speed up the integration of wind farms into grids.

  15. Characterization of the wind behavior in Botucatu-SP region (Brazil) by Weibull distribution; Caracterizacao do comportamento eolico da regiao de Botucatu-SP atraves da distribuicao de Weibull

    Energy Technology Data Exchange (ETDEWEB)

    Gabriel Filho, Luis Roberto Almeida [Universidade Estadual Paulista (CE/UNESP), Tupa, SP (Brazil). Coordenacao de Estagio; Cremasco, Camila Pires [Faculdade de Tecnologia de Presidente Prudente, SP (Brazil); Seraphim, Odivaldo Jose [Universidade Estadual Paulista (FCA/UNESP), Botucatu, SP (Brazil). Fac. de Ciencias Agronomicas; Cagnon, Jose Angelo [Universidade Estadual Paulista (FEB/UNESP), Bauru, SP (Brazil). Faculdade de Engenharia

    2008-07-01

    The wind behavior of a region can be described by a frequency distribution that provides the information and characteristics needed for a possible deployment of wind energy harvesting in the region. These characteristics, such as the annual average speed, the variance and standard deviation of the recorded speeds, and the hourly average wind power density, can be obtained from the frequency of occurrence of a given speed, which in turn must be studied through analytical expressions. The analytical function best suited to wind distributions is the Weibull density function, which can be determined by numerical methods and linear regression. Once this function has been determined, all of the wind characteristics mentioned above may be determined accurately. The objective of this work is to characterize the wind behavior in the region of Botucatu-SP and to determine the energy potential for the installation of wind turbines. For the development of the present research, a Young Wind Monitor anemometer from the Campbell company, installed at a height of 10 meters, was used. The experiment was carried out in the Nucleus of Alternative and Renewable Energies (NEAR) of the Laboratory of Agricultural Energy of the Department of Agricultural Engineering of UNESP, Agronomy Sciences Faculty, Lageado Experimental Farm, located in the city of Botucatu - SP. The geographic location is defined by the coordinates 22 deg 51' South latitude (S) and 48 deg 26' West longitude (W) and an average altitude of 786 meters above sea level. The analysis was carried out using wind speed records from September 2004 to September 2005. After the frequency distribution of the hourly average wind speed was determined, the associated Weibull function was fitted, making it possible to determine the annual average wind speed (2.77 m/s), the standard deviation of the recorded speeds (0.55 m/s), and the

  16. Statistical distribution of the estimator of Weibull modulus

    OpenAIRE

    Barbero, Enrique; Fernández-Sáez, José; Navarro Ugena, Carlos

    2001-01-01

    3 pages, 3 figures. The Weibull statistic has been widely used to study the inherent scatter existing in the strength properties of many advanced materials, as well as in the fracture toughness of steels in the ductile-brittle transition region. The authors are indebted to the Fundación Ramón Areces (Área de Materiales, IX Concurso Nacional) for its financial support of this research.

  17. Statistical Diagnosis of the Best Weibull Methods for Wind Power Assessment for Agricultural Applications

    Directory of Open Access Journals (Sweden)

    Abul Kalam Azad

    2014-05-01

    Full Text Available The best Weibull distribution methods for the assessment of wind energy potential at different altitudes in desired locations are statistically diagnosed in this study. Seven different methods, namely the graphical method (GM), method of moments (MOM), standard deviation method (STDM), maximum likelihood method (MLM), power density method (PDM), modified maximum likelihood method (MMLM) and equivalent energy method (EEM), were used to estimate the Weibull parameters, and six statistical tools, namely relative percentage of error, root mean square error (RMSE), mean percentage of error, mean absolute percentage of error, chi-square error and analysis of variance, were used to precisely rank the methods. The statistical fittings of the measured and calculated wind speed data are assessed to justify the performance of the methods. The capacity factor and total energy generated by a small model wind turbine are calculated by numerical integration using trapezoidal sums and Simpson's rule. The results show that MOM and MLM are the most efficient methods for determining the values of k and c to fit Weibull distribution curves.

  18. Prediction of Strength for Inhomogeneous - Defective Glass Elements Based on the Sequential Partitioning of the Data and Weibull Statistical Distribution

    NARCIS (Netherlands)

    Shabetia, Alexander; Rodichev, Yurii; Veer, F.A.; Soroka, Elena; Louter, Christian; Bos, Freek; Belis, Jan; Veer, Fred; Nijsse, Rob

    An analytical approach based on the sequential partitioning of the data and the Weibull statistical distribution for inhomogeneous - defective materials is proposed. It allows assessing the guaranteed strength of glass structures for a low probability of fracture with a higher degree of

  19. Redundancy allocation problem of a system with increasing failure rates of components based on Weibull distribution: A simulation-based optimization approach

    International Nuclear Information System (INIS)

    Guilani, Pedram Pourkarim; Azimi, Parham; Niaki, S.T.A.; Niaki, Seyed Armin Akhavan

    2016-01-01

    The redundancy allocation problem (RAP) is a useful method to enhance system reliability. In most works involving the RAP, the failure rates of the system components are assumed to follow either exponential or k-Erlang distributions. In real-world problems, however, many systems have components with increasing failure rates. This indicates that, as time passes, the failure rates of the system components increase in comparison to their initial failure rates. In this paper, the redundancy allocation problem of a series–parallel system with components having an increasing failure rate based on the Weibull distribution is investigated. An optimization method via simulation is proposed for modeling, and a genetic algorithm is developed to solve the problem. - Highlights: • The redundancy allocation problem of a series–parallel system is addressed. • Components possess an increasing failure rate based on the Weibull distribution. • An optimization method via simulation is proposed for modeling. • A genetic algorithm is developed to solve the problem.

  20. Weibull analysis of fracture test data on bovine cortical bone: influence of orientation.

    Science.gov (United States)

    Khandaker, Morshed; Ekwaro-Osire, Stephen

    2013-01-01

    The fracture toughness, K_IC, of cortical bone has been experimentally determined by several researchers. The variation in K_IC values arises from the variation of specimen orientation, shape, and size during the experiment. The fracture toughness of a cortical bone is governed by the severest flaw and, hence, may be analyzed using Weibull statistics. To the best of the authors' knowledge, however, no studies of this aspect have been published. The motivation of the study is the evaluation of Weibull parameters in the circumferential-longitudinal (CL) and longitudinal-circumferential (LC) directions. We hypothesized that the Weibull parameters vary depending on the bone microstructure. In the present work, a two-parameter Weibull statistical model was applied to calculate the plane-strain fracture toughness of bovine femoral cortical bone obtained using specimens extracted in the CL and LC directions of the bone. It was found that the Weibull modulus of fracture toughness was larger for CL specimens than for LC specimens, but the opposite trend was seen for the characteristic fracture toughness. The reason for these trends is the difference in microstructure and extrinsic toughening mechanisms between the CL and LC directions of the bone. The Weibull parameters found in this study can be applied to develop a damage-mechanics model for bone.

  1. Bayesian Analysis of the Survival Function and Failure Rate of Weibull Distribution with Censored Data

    Directory of Open Access Journals (Sweden)

    Chris Bambey Guure

    2012-01-01

    Full Text Available The survival function of the Weibull distribution determines the probability that a unit or an individual will survive beyond a certain specified time, while the failure rate is the rate at which a randomly selected individual known to be alive at time t will die in the interval (t, t + dt). The classical approach for estimating the survival function and the failure rate is the maximum likelihood method. In this study, we strive to determine the best method by comparing the classical maximum likelihood against the Bayesian estimators using an informative prior and a proposed data-dependent prior known as the generalised noninformative prior. The Bayesian estimation is considered under three loss functions. Due to the complexity in dealing with the integrals using the Bayesian estimator, Lindley's approximation procedure is employed to reduce the ratio of the integrals. For the purpose of comparison, the mean squared error (MSE) and the absolute bias are obtained. This study is conducted via simulation by utilising different sample sizes. We observed from the study that the generalised prior we assumed performed better than the others under the linear exponential loss function with respect to MSE and under the general entropy loss function with respect to absolute bias.
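
    A minimal sketch of the likelihood that underlies both the classical and the Bayesian treatments of censored Weibull data: failures contribute the log hazard plus the log survival term, while censored units contribute only the log survival term. The data, starting values and use of a general-purpose optimizer below are illustrative assumptions, not the estimators derived in the paper.

        import numpy as np
        from scipy.optimize import minimize

        def neg_log_like(params, t, delta):
            # delta = 1 for observed failures, 0 for right-censored units.
            # S(t) = exp(-(t/eta)**beta), h(t) = (beta/eta)*(t/eta)**(beta-1)
            beta, eta = params
            if beta <= 0 or eta <= 0:
                return np.inf
            z = t / eta
            log_h = np.log(beta / eta) + (beta - 1.0) * np.log(z)
            log_s = -(z ** beta)
            return -(np.sum(delta * log_h) + np.sum(log_s))

        t = np.array([12.0, 30.0, 45.0, 51.0, 60.0, 60.0])   # hypothetical times
        delta = np.array([1, 1, 1, 1, 0, 0])                 # last two censored
        res = minimize(neg_log_like, x0=[1.0, 50.0], args=(t, delta), method="Nelder-Mead")
        beta_hat, eta_hat = res.x
        survival_at_40 = np.exp(-(40.0 / eta_hat) ** beta_hat)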

  2. A comparison of Weibull and βIc analyses of transition range data

    International Nuclear Information System (INIS)

    McCabe, D.E.

    1991-01-01

    Specimen size effects on K_Jc data scatter in the transition range of fracture toughness have been explained by extremal (weakest-link) statistics. In this investigation, compact specimens of A 533 grade B steel were tested in sizes ranging from 1/2TC(T) to 4TC(T), with sufficient replication to obtain good three-parameter Weibull characterization of the data distributions. The optimum fitting parameters for an assumed Weibull slope of 4 were calculated. Extremal statistics analysis was applied to the 1/2TC(T) data to predict median K_Jc values for 1TC(T), 2TC(T), and 4TC(T) specimens. The distributions from experimentally developed 1TC(T), 2TC(T), and 4TC(T) data tended to confirm the predictions. However, the extremal prediction model does not work well at lower-shelf toughness. At -150 degrees C the extremal model predicts a specimen size effect where in reality there is no size effect

  3. Generalized renewal process for repairable systems based on finite Weibull mixture

    International Nuclear Information System (INIS)

    Veber, B.; Nagode, M.; Fajdiga, M.

    2008-01-01

    Repairable systems can be brought to one of several possible states following a repair. These states are: 'as good as new', 'as bad as old' and 'better than old but worse than new'. The probabilistic models traditionally used to estimate the expected number of failures account for the first two states, but they do not properly apply to the last one, which is more realistic in practice. In this paper, a probabilistic model that is applicable to all three after-repair states, called the generalized renewal process (GRP), is applied. Simplistically, GRP addresses the repair assumption by introducing the concept of virtual age into stochastic point processes to enable them to represent the full spectrum of repair assumptions. The shape of measured or design life distributions of systems can vary considerably and therefore frequently cannot be approximated by simple distribution functions. The scope of the paper is to prove that a finite Weibull mixture, with positive component weights only, can be used as the underlying distribution of the time to first failure (TTFF) of the GRP model, on condition that the unknown parameters can be estimated. To support the main idea, three examples are presented. In order to estimate the unknown parameters of the GRP model with an m-fold Weibull mixture, the EM algorithm is applied. The GRP model with an m-component mixture distribution is compared to the standard GRP model based on the two-parameter Weibull distribution by calculating the expected number of failures. It can be concluded that the suggested GRP model with a Weibull mixture with an arbitrary but finite number of components is suitable for predicting failures based on the past performance of the system

  4. Parameter Estimations and Optimal Design of Simple Step-Stress Model for Gamma Dual Weibull Distribution

    Directory of Open Access Journals (Sweden)

    Hamdy Mohamed Salem

    2018-03-01

    Full Text Available This paper considers life-testing experiments and how they are affected by stress factors, namely temperature, electrical load, cycling rate and pressure. A major type of accelerated life test is the step-stress model, which allows the experimenter to increase the stress levels beyond normal use during the experiment in order to observe failures of the test items. The test items are assumed to follow the Gamma Dual Weibull distribution. Different methods for estimating the parameters are discussed. These include maximum likelihood estimation and confidence interval estimation based on asymptotic normality, which generates narrow intervals for the unknown distribution parameters with high probability. The MathCAD (2001) program is used to illustrate the optimal time procedure through numerical examples.

  5. A Weibull statistics-based lignocellulose saccharification model and a built-in parameter accurately predict lignocellulose hydrolysis performance.

    Science.gov (United States)

    Wang, Mingyu; Han, Lijuan; Liu, Shasha; Zhao, Xuebing; Yang, Jinghua; Loh, Soh Kheang; Sun, Xiaomin; Zhang, Chenxi; Fang, Xu

    2015-09-01

    Renewable energy from lignocellulosic biomass has been deemed an alternative to depleting fossil fuels. In order to improve this technology, we aim to develop robust mathematical models for the enzymatic lignocellulose degradation process. By analyzing 96 groups of previously published and newly obtained lignocellulose saccharification results and fitting them to the Weibull distribution, we discovered that Weibull statistics can accurately predict lignocellulose saccharification data, regardless of the type of substrates, enzymes and saccharification conditions. A mathematical model for enzymatic lignocellulose degradation was subsequently constructed based on Weibull statistics. Further analysis of the mathematical structure of the model and of experimental saccharification data showed the significance of the two parameters in this model. In particular, the λ value, defined as the characteristic time, represents the overall performance of the saccharification system. This suggestion was further supported by statistical analysis of experimental saccharification data and by analysis of the glucose production levels when the λ and n values change. In conclusion, the constructed Weibull statistics-based model can accurately predict lignocellulose hydrolysis behavior, and the λ parameter can be used to assess the overall performance of enzymatic lignocellulose degradation. Advantages and potential applications of the model and of the λ value in saccharification performance assessment are discussed. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
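
    A Python sketch of fitting a Weibull-type time course of the kind described above, assuming the common parametrization Y(t) = Y_max * (1 - exp(-(t/lambda)**n)), in which lambda is the characteristic time (the time at which 63.2% of Y_max is reached). The time points and glucose values are hypothetical, not data from the study.

        import numpy as np
        from scipy.optimize import curve_fit

        def weibull_saccharification(t, y_max, lam, n):
            return y_max * (1.0 - np.exp(-(t / lam) ** n))

        t_h = np.array([2.0, 6.0, 12.0, 24.0, 48.0, 72.0])     # hours (illustrative)
        glc = np.array([3.1, 7.8, 12.5, 17.2, 20.4, 21.1])     # g/L (illustrative)
        (y_max, lam, n), _ = curve_fit(weibull_saccharification, t_h, glc,
                                       p0=[22.0, 15.0, 1.0])
        # lam summarizes the overall performance of the saccharification system.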

  6. On Weibull's Spectrum of Nonrelativistic Energetic Particles at IP Shocks: Observations and Theoretical Interpretation

    Energy Technology Data Exchange (ETDEWEB)

    Pallocchia, G.; Laurenza, M.; Consolini, G. [INAF—Istituto di Astrofisica e Planetologia Spaziali, Via Fosso del Cavaliere 100, I-00133 Roma (Italy)

    2017-03-10

    Some interplanetary shocks are associated with short-term and sharp particle flux enhancements near the shock front. Such intensity enhancements, known as shock-spike events (SSEs), represent a class of relatively energetic phenomena as they may extend to energies of some tens of MeV or even beyond. Here we present an SSE case study in order to shed light on the nature of the particle acceleration involved in this kind of event. Our observations refer to an SSE registered on 2011 October 3 at 22:23 UT, by STEREO B instrumentation when, at a heliocentric distance of 1.08 au, the spacecraft was swept by a perpendicular shock moving away from the Sun. The main finding from the data analysis is that a Weibull distribution represents a good fitting function to the measured particle spectrum over the energy range from 0.1 to 30 MeV. To interpret such an observational result, we provide a theoretical derivation of the Weibull spectrum in the framework of the acceleration by “killed” stochastic processes exhibiting power-law growth in time of the velocity expectation, such as the classical Fermi process. We find an overall coherence between the experimental values of the Weibull spectrum parameters and their physical meaning within the above scenario. Hence, our approach based on the Weibull distribution proves to be useful for understanding SSEs. With regard to the present event, we also provide an alternative explanation of the Weibull spectrum in terms of shock-surfing acceleration.

  7. Bayesian Approach for Constant-Stress Accelerated Life Testing for Kumaraswamy Weibull Distribution with Censoring

    Directory of Open Access Journals (Sweden)

    Abeer Abd-Alla EL-Helbawy

    2016-09-01

    Full Text Available Accelerated life tests provide quick information on lifetime distributions by testing materials or products at higher than normal-use levels of stress, such as pressure, high temperature, vibration, voltage or load, to induce failures. In this paper, the acceleration model assumed is a log-linear model. Constant-stress tests are discussed based on Type I and Type II censoring. The Kumaraswamy Weibull distribution is used. The estimators of the parameters, reliability, hazard rate functions and p-th percentile at the normal condition, low stress, and high stress are obtained. In addition, credible intervals for the parameters of the models are constructed. An optimum test plan is designed. Some numerical methods, such as Laplace and Markov Chain Monte Carlo methods, are used to solve the complicated integrals.

  8. Bayesian Approach for Constant-Stress Accelerated Life Testing for Kumaraswamy Weibull Distribution with Censoring

    Directory of Open Access Journals (Sweden)

    Abeer Abd-Alla EL-Helbawy

    2016-12-01

    Full Text Available Accelerated life tests provide quick information on lifetime distributions by testing materials or products at higher than normal-use levels of stress, such as pressure, high temperature, vibration, voltage or load, to induce failures. In this paper, the acceleration model assumed is a log-linear model. Constant-stress tests are discussed based on Type I and Type II censoring. The Kumaraswamy Weibull distribution is used. The estimators of the parameters, reliability, hazard rate functions and p-th percentile at the normal condition, low stress, and high stress are obtained. In addition, credible intervals for the parameters of the models are constructed. An optimum test plan is designed. Some numerical methods, such as Laplace and Markov Chain Monte Carlo methods, are used to solve the complicated integrals.

  9. THE COMPARATIVE ANALYSIS OF TWO DIFFERENT STATISTICAL DISTRIBUTIONS USED TO ESTIMATE THE WIND ENERGY POTENTIAL

    Directory of Open Access Journals (Sweden)

    Mehmet KURBAN

    2007-01-01

    Full Text Available In this paper, the wind energy potential of the region is analyzed with the Weibull and Rayleigh statistical distribution functions, using the wind speed data measured every 15 seconds in July, August, September, and October of 2005 at a height of 10 m on the 30-m observation pole of the wind observation station constructed within the scope of the scientific research project titled "The Construction of a Hybrid (Wind-Solar) Power Plant Model by Determining the Wind and Solar Potential in the Iki Eylul Campus of A.U.", supported by Anadolu University. The maximum likelihood method is used for finding the parameters of these distributions. The conclusion of the analysis for the months considered is that the Weibull distribution models the wind speeds better than the Rayleigh distribution. Furthermore, the error rate in the monthly values of power density computed by using the Weibull distribution is smaller than that obtained with the Rayleigh distribution.

  10. Kinetic Analysis of Isothermal Decomposition Process of Sodium Bicarbonate Using the Weibull Probability Function—Estimation of Density Distribution Functions of the Apparent Activation Energies

    Science.gov (United States)

    Janković, Bojan

    2009-10-01

    The decomposition process of sodium bicarbonate (NaHCO3) has been studied by thermogravimetry under isothermal conditions at four different operating temperatures (380 K, 400 K, 420 K, and 440 K). It was found that the experimental integral and differential conversion curves at the different operating temperatures can be successfully described by the isothermal Weibull distribution function with a unique value of the shape parameter (β = 1.07). It was also established that the Weibull distribution parameters (β and η) are independent of the operating temperature. Using the integral and differential (Friedman) isoconversional methods, in the conversion (α) range of 0.20 ≤ α ≤ 0.80, the apparent activation energy (E_a) value was approximately constant (E_a,int = 95.2 kJ mol^-1 and E_a,diff = 96.6 kJ mol^-1, respectively). The values of E_a calculated by both isoconversional methods are in good agreement with the value of E_a evaluated from the Arrhenius equation (94.3 kJ mol^-1), which was expressed through the scale distribution parameter (η). The Málek isothermal procedure was used to estimate the kinetic model for the investigated decomposition process. It was found that the two-parameter Šesták-Berggren (SB) autocatalytic model best describes the NaHCO3 decomposition process, with the conversion function f(α) = α^0.18 (1-α)^1.19. It was also concluded that the calculated density distribution functions of the apparent activation energies (ddf of E_a) do not depend on the operating temperature and exhibit highly symmetrical behavior (shape factor = 1.00). The obtained isothermal decomposition results were compared with the corresponding results for the nonisothermal decomposition process of NaHCO3.

  11. Weibull modeling of particle cracking in metal matrix composites

    International Nuclear Information System (INIS)

    Lewis, C.A.; Withers, P.J.

    1995-01-01

    An investigation into the occurrence of reinforcement cracking within a particulate ZrO2/2618 Al alloy metal matrix composite under tensile plastic straining has been carried out, special attention being paid to the dependence of fracture on particle size and shape. The probability of particle cracking has been modeled using a Weibull approach, giving good agreement with the experimental data. Values for the Weibull modulus and the stress required to crack the particles were found to be within the range expected for the cracking of ceramic particles. Additional information regarding the fracture behavior of the particles was provided by in-situ neutron diffraction monitoring of the internal strains, measurement of the variation in the composite Young's modulus with straining and by direct observation of the cracked particles. The values of the particle stress required for the initiation of particle cracking deduced from these supplementary experiments were found to be in good agreement with each other and with the results from the Weibull analysis. Further, it is shown that while both the current experiments, as well as the previous work of others, can be well described by the Weibull approach, the exact values of the Weibull parameters deduced are very sensitive to the approximations and the assumptions made in constructing the model.

  12. Determining the theoretical reliability function of thermal power system using simple and complex Weibull distribution

    Directory of Open Access Journals (Sweden)

    Kalaba Dragan V.

    2014-01-01

    Full Text Available The main subject of this paper is the presentation of a probabilistic technique for thermal power system reliability assessment. Operational reliability research on the fossil fuel power plant system has defined the function, or probabilistic law, according to which the random variable (the occurrence of a complete unplanned standstill) behaves. Based on these data, and by applying reliability theory to this particular system using simple and complex Weibull distributions, the hypothesis has been confirmed that the chosen distribution of the observed random variable fully describes the behaviour of such a system in terms of reliability. The comprehensive insight established into the probabilistic power system reliability assessment technique could serve as an input for further research and development in the area of power system planning and operation.

  13. Estimation for a Weibull accelerated life testing model

    International Nuclear Information System (INIS)

    Glaser, R.E.

    1984-01-01

    It is sometimes reasonable to assume that the lifetime distribution of an item belongs to a certain parametric family, and that actual parameter values depend upon the testing environment of the item. In the two-parameter Weibull family setting, suppose both the shape and scale parameters are expressible as functions of the testing environment. For various models of functional dependency on environment, maximum likelihood methods are used to estimate characteristics of interest at specified environmental levels. The methodology presented handles exact, censored, and grouped data. A detailed accelerated life testing analysis of stress-rupture data for Kevlar/epoxy composites is given. 10 references, 1 figure, 2 tables

  14. Failure-censored accelerated life test sampling plans for Weibull distribution under expected test time constraint

    International Nuclear Information System (INIS)

    Bai, D.S.; Chun, Y.R.; Kim, J.G.

    1995-01-01

    This paper considers the design of life-test sampling plans based on failure-censored accelerated life tests. The lifetime distribution of products is assumed to be Weibull with a scale parameter that is a log linear function of a (possibly transformed) stress. Two levels of stress higher than the use condition stress, high and low, are used. Sampling plans with equal expected test times at high and low test stresses which satisfy the producer's and consumer's risk requirements and minimize the asymptotic variance of the test statistic used to decide lot acceptability are obtained. The properties of the proposed life-test sampling plans are investigated

  15. Probabilistic physics-of-failure models for component reliabilities using Monte Carlo simulation and Weibull analysis: a parametric study

    International Nuclear Information System (INIS)

    Hall, P.L.; Strutt, J.E.

    2003-01-01

    In reliability engineering, component failures are generally classified in one of three ways: (1) early life failures; (2) failures having random onset times; and (3) late life or 'wear out' failures. When the time-distribution of failures of a population of components is analysed in terms of a Weibull distribution, these failure types may be associated with shape parameters β having values less than 1, approximately equal to 1, and greater than 1, respectively. Early life failures are frequently attributed to poor design (e.g. poor materials selection) or problems associated with manufacturing or assembly processes. We describe a methodology for the implementation of physics-of-failure models of component lifetimes in the presence of parameter and model uncertainties. This treats uncertain parameters as random variables described by some appropriate statistical distribution, which may be sampled using Monte Carlo methods. The number of simulations required depends upon the desired accuracy of the predicted lifetime. Provided that the number of sampled variables is relatively small, an accuracy of 1-2% can be obtained using typically 1000 simulations. The resulting collection of times-to-failure is then sorted into ascending order and fitted to a Weibull distribution to obtain a shape factor β and a characteristic lifetime η. Examples are given of the results obtained using three different models: (1) the Eyring-Peck (EP) model for corrosion of printed circuit boards; (2) a power-law corrosion growth (PCG) model which represents the progressive deterioration of oil and gas pipelines; and (3) a random shock-loading model of mechanical failure. It is shown that for any specific model the values of the Weibull shape parameters obtained may be strongly dependent on the degree of uncertainty of the underlying input parameters. Both the EP and PCG models can yield a wide range of values of β, from β>1, characteristic of wear-out behaviour, to β<1, characteristic of early-life failure, depending on the degree of uncertainty in the underlying input parameters.
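
    A minimal sketch of the general workflow described above, using a hypothetical linear defect-growth model in place of the EP, PCG or shock-loading models from the paper: uncertain inputs are sampled by Monte Carlo, each sample yields a time-to-failure, and the sorted lifetimes are fitted to a two-parameter Weibull. All input distributions and parameter values are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_sims = 1000  # roughly the number of simulations cited above for 1-2% accuracy

# Hypothetical linear defect-growth model: failure occurs when the depth
# a0 + r * t reaches the wall thickness w; a0 and r are uncertain inputs
# drawn from assumed distributions
w = 10.0                                        # wall thickness, mm
a0 = rng.normal(1.0, 0.2, n_sims)               # initial defect depth, mm
r = rng.lognormal(np.log(0.15), 0.35, n_sims)   # growth rate, mm/year

ttf = np.sort((w - a0) / r)                     # times to failure, years

# Fit the sorted times-to-failure to a two-parameter Weibull distribution
beta, _, eta = stats.weibull_min.fit(ttf, floc=0)
print(f"shape beta = {beta:.2f}, characteristic life eta = {eta:.1f} years")
```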

  16. SEMI-COMPETING RISKS ON A TRIVARIATE WEIBULL SURVIVAL MODEL

    Directory of Open Access Journals (Sweden)

    Jenq-Daw Lee

    2008-07-01

    Full Text Available A setting of a trivariate survival function using the semi-competing risks concept is proposed, in which a terminal event can only occur after the other events. The Stanford Heart Transplant data are reanalyzed using a trivariate Weibull distribution model with the proposed survival function.

  17. An EOQ model for weibull distribution deterioration with time-dependent cubic demand and backlogging

    Science.gov (United States)

    Santhi, G.; Karthikeyan, K.

    2017-11-01

    In this article we introduce an economic order quantity model with Weibull deterioration and a time-dependent cubic demand rate, where the holding cost is a linear function of time. Shortages are allowed in the inventory system and are partially or fully backlogged. The objective of this model is to minimize the total inventory cost by choosing the optimal order quantity and cycle length. The proposed model is illustrated by numerical examples, and a sensitivity analysis is performed to study the effect of changes in the parameters on the optimum solutions.

  18. Assessing different parameters estimation methods of Weibull distribution to compute wind power density

    International Nuclear Information System (INIS)

    Mohammadi, Kasra; Alavi, Omid; Mostafaeipour, Ali; Goudarzi, Navid; Jalilvand, Mahdi

    2016-01-01

    Highlights: • Effectiveness of six numerical methods is evaluated to determine wind power density. • The more appropriate method for computing the daily wind power density is identified. • Four windy stations located in the southern part of Alberta, Canada are investigated. • The more appropriate parameter estimation method was not identical among all examined stations. - Abstract: In this study, the effectiveness of six numerical methods is evaluated to determine the shape (k) and scale (c) parameters of the Weibull distribution function for the purpose of calculating the wind power density. The selected methods are the graphical method (GP), the empirical method of Justus (EMJ), the empirical method of Lysen (EML), the energy pattern factor method (EPF), the maximum likelihood method (ML) and the modified maximum likelihood method (MML). The purpose of this study is to identify the more appropriate method for computing the wind power density at four stations distributed in the Alberta province of Canada, namely Edmonton City Center Awos, Grande Prairie A, Lethbridge A and Waterton Park Gate. To provide a complete analysis, the evaluations are performed on both daily and monthly scales. The results indicate that the precision of the computed wind power density values changes when different parameter estimation methods are used to determine the k and c parameters. The four methods EMJ, EML, EPF and ML show very favorable performance, while the GP method shows weak ability for all stations. However, the most effective method is not the same among stations owing to differences in the wind characteristics.
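
    As one concrete example of the six methods compared above, the empirical method of Justus (EMJ) estimates k from the coefficient of variation and c from the sample mean, after which the wind power density follows from the Weibull moments. The sketch below uses a short illustrative series of daily mean wind speeds and a standard air density of 1.225 kg/m^3; it is not the station data from the study.

```python
import numpy as np
from scipy.special import gamma

def weibull_emj(v):
    """Empirical method of Justus: k from the coefficient of variation, c from the mean."""
    v = np.asarray(v, dtype=float)
    k = (v.std(ddof=1) / v.mean()) ** (-1.086)
    c = v.mean() / gamma(1.0 + 1.0 / k)
    return k, c

def wind_power_density(k, c, rho=1.225):
    """Wind power density (W/m^2) implied by a Weibull(k, c) wind speed model."""
    return 0.5 * rho * c ** 3 * gamma(1.0 + 3.0 / k)

# Illustrative daily mean wind speeds (m/s), not station data from the study
v_daily = np.array([4.2, 5.1, 6.3, 3.8, 7.0, 5.5, 4.9])
k, c = weibull_emj(v_daily)
print(f"k = {k:.2f}, c = {c:.2f} m/s, P = {wind_power_density(k, c):.1f} W/m^2")
```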

  19. A study of two estimation approaches for parameters of Weibull distribution based on WPP

    International Nuclear Information System (INIS)

    Zhang, L.F.; Xie, M.; Tang, L.C.

    2007-01-01

    Least-squares estimation (LSE) based on the Weibull probability plot (WPP) is the most basic method for estimating the Weibull parameters. The common procedure of this method is to use the least-squares regression of Y on X, i.e. minimizing the sum of squares of the vertical residuals, to fit a straight line to the data points on the WPP and then to calculate the LS estimators. This method is known to be biased. In the existing literature the least-squares regression of X on Y, i.e. minimizing the sum of squares of the horizontal residuals, has also been used by Weibull researchers. This motivated us to carry out a comparison between the estimators of the two LS regression methods using intensive Monte Carlo simulations. Both complete and censored data are examined. Surprisingly, the result shows that LS Y on X performs better for small, complete samples, while LS X on Y performs better in the other cases in terms of the bias of the estimators. The two methods are also compared in terms of other model statistics. In general, when the shape parameter is less than one, LS Y on X provides a better model; otherwise, LS X on Y tends to be better.
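
    A minimal sketch of the two regressions compared above: the ordered sample is placed on the Weibull probability plot with Bernard's median-rank plotting positions, and the shape and scale are recovered from the slope and intercept of the Y-on-X and X-on-Y least-squares lines. The sample values are illustrative.

```python
import numpy as np

def wpp_estimates(t):
    """LS estimates of Weibull (shape, scale) from the probability plot,
    using both the Y-on-X and the X-on-Y regressions."""
    t = np.sort(np.asarray(t, dtype=float))
    n = t.size
    F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # Bernard's median ranks
    x = np.log(t)
    y = np.log(-np.log(1.0 - F))

    # Y on X: minimise vertical residuals, y = beta*x - beta*ln(eta)
    b_yx, a_yx = np.polyfit(x, y, 1)
    est_yx = (b_yx, np.exp(-a_yx / b_yx))

    # X on Y: minimise horizontal residuals, x = ln(eta) + y/beta
    b_xy, a_xy = np.polyfit(y, x, 1)
    est_xy = (1.0 / b_xy, np.exp(a_xy))

    return est_yx, est_xy

# Illustrative complete sample of lifetimes
sample = [12.5, 24.4, 58.2, 68.0, 89.1, 134.9, 223.8]
print(wpp_estimates(sample))
```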

  20. The discrete additive Weibull distribution: A bathtub-shaped hazard for discontinuous failure data

    International Nuclear Information System (INIS)

    Bebbington, Mark; Lai, Chin-Diew; Wellington, Morgan; Zitikis, Ričardas

    2012-01-01

    Although failure data are usually treated as being continuous, they may have been collected in a discrete manner, or in fact be discrete in nature. Reliability models with bathtub-shaped hazard rate are fundamental to the concepts of burn-in and maintenance, but how well do they incorporate discrete data? We explore discrete versions of the additive Weibull distribution, which has the twin virtues of mathematical tractability and the ability to produce bathtub-shaped hazard rate functions. We derive conditions on the parameters for the hazard rate function to be increasing, decreasing, or bathtub shaped. While discrete versions may have the same shaped hazard rate for the same parameter values, we find that when fitted to data the fitted hazard rate shapes can vary between versions. Our results are illustrated using several real-life data sets, and the implications of using continuous models for discrete data discussed.
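
    One commonly used discrete construction (assumed here for illustration; the paper examines several versions) multiplies two type-I discrete Weibull survival functions, S(x) = q1^(x^b1) · q2^(x^b2), and the discrete hazard follows as h(x) = 1 − S(x+1)/S(x). With one component having b < 1 and the other b > 1, the hazard can take a bathtub-like shape; the parameter values below are illustrative only.

```python
import numpy as np

def survival(x, q1, b1, q2, b2):
    """S(x) = P(X >= x) for an additive combination of two type-I discrete Weibulls."""
    x = np.asarray(x, dtype=float)
    return q1 ** (x ** b1) * q2 ** (x ** b2)

def hazard(x, q1, b1, q2, b2):
    """Discrete hazard h(x) = P(X = x | X >= x) = 1 - S(x+1)/S(x)."""
    return 1.0 - survival(x + 1, q1, b1, q2, b2) / survival(x, q1, b1, q2, b2)

# One component with b < 1 (early failures), one with b > 1 (wear-out);
# the parameter values are illustrative, not fitted to any data set
x = np.arange(0, 30)
print(hazard(x, q1=0.90, b1=0.5, q2=0.999, b2=3.0))
```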

  1. Distribution of crushing strength of tablets

    DEFF Research Database (Denmark)

    Sonnergaard, Jørn

    2002-01-01

    The distribution of a given set of data is important since most parametric statistical tests are based on the assumption that the studied data are normally distributed. In fracture mechanics analysis the Weibull distribution is widely used and the derived Weibull modulus is interpreted as a mate...... data from nine model tablet formulations and four commercial tablets are shown to follow the normal distribution. The importance of proper cleaning of the crushing strength apparatus is demonstrated....

  2. An inventory model for generalized weibull deteriorating items with price dependent demand and permissible delay in payments under inflation

    Directory of Open Access Journals (Sweden)

    S.P.Singh

    2015-09-01

    Full Text Available This paper develops an inventory model for items that deteriorate at a generalized Weibull-distributed rate when demand for the items depends on the selling price. Shortages are not allowed and price inflation is taken into consideration over a finite planning horizon. A brief theoretical analysis of the costs involved is carried out.

  3. Mixture distributions of wind speed in the UAE

    Science.gov (United States)

    Shin, J.; Ouarda, T.; Lee, T. S.

    2013-12-01

    Wind speed probability distributions are commonly used to estimate potential wind energy. The 2-parameter Weibull distribution has been most widely used to characterize the distribution of wind speed. However, it is unable to properly model wind speed regimes when the wind speed distribution presents bimodal or kurtotic shapes. Several studies have concluded that the Weibull distribution should not be used for frequency analysis of wind speed without first investigating the wind speed distribution. Due to these mixture distributional characteristics of wind speed data, the application of mixture distributions should be further investigated in the frequency analysis of wind speed. A number of studies have investigated the potential wind energy in different parts of the Arabian Peninsula, and mixture distributional characteristics of wind speed were detected in some of them. Nevertheless, mixture distributions have not been employed for wind speed modeling in the Arabian Peninsula. In order to improve our understanding of the wind energy potential of the Arabian Peninsula, mixture distributions should be tested for the frequency analysis of wind speed. The aim of the current study is to assess the suitability of mixture distributions for the frequency analysis of wind speed in the UAE. Hourly mean wind speed data at 10-m height from 7 stations were used in the current study. The Weibull and Kappa distributions were employed as representatives of the conventional non-mixture distributions. Ten mixture distributions were constructed by mixing four probability distributions: Normal, Gamma, Weibull and Extreme Value type-one (EV-1). Three parameter estimation methods, the Expectation Maximization algorithm, the Least Squares method and the Meta-Heuristic Maximum Likelihood (MHML) method, were employed to estimate the parameters of the mixture distributions. In order to compare the goodness-of-fit of tested distributions and parameter estimation methods for
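
    The study fits mixtures with EM, least squares and MHML; as a simple stand-in, the sketch below fits a two-component Weibull mixture by direct numerical maximum likelihood on a synthetic bimodal wind speed sample. Starting values, bounds and the generated data are all assumptions for illustration.

```python
import numpy as np
from scipy import stats
from scipy.optimize import minimize

def neg_log_lik(theta, v):
    """Negative log-likelihood of a two-component Weibull mixture."""
    w, k1, c1, k2, c2 = theta
    pdf = (w * stats.weibull_min.pdf(v, k1, scale=c1)
           + (1.0 - w) * stats.weibull_min.pdf(v, k2, scale=c2))
    return -np.sum(np.log(pdf + 1e-300))

# Synthetic bimodal wind speeds standing in for the hourly station data
rng = np.random.default_rng(1)
v = np.concatenate([
    stats.weibull_min.rvs(2.5, scale=3.0, size=3000, random_state=rng),
    stats.weibull_min.rvs(3.5, scale=8.0, size=2000, random_state=rng),
])

x0 = [0.5, 2.0, 2.0, 2.0, 7.0]   # starting values: weight, k1, c1, k2, c2
bounds = [(0.01, 0.99), (0.5, 10), (0.1, 20), (0.5, 10), (0.1, 20)]
res = minimize(neg_log_lik, x0, args=(v,), bounds=bounds, method="L-BFGS-B")
print(res.x)
```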

  4. Weibull aging models for the single protective channel unavailability analysis by the device of stages

    International Nuclear Information System (INIS)

    Nunes, M.E.C.; Noriega, H.C.; Melo, P.F.F.

    1997-01-01

    Among the features to take into account in the unavailability analysis of protective channels, one plays a dominant role: equipment aging. In this case the exponential failure model is not adequate, since some transition rates are no longer constant, and as a consequence Markovian models can no longer be used. As an alternative, one may use the device of stages, which allows a non-Markovian model to be transformed into an equivalent Markovian one by inserting a set of fictitious states, called stages. For a given time-dependent transition rate, the corresponding failure density is analysed to find the best combination of exponential distributions, and the moments of the original distribution and those of the combination are then matched to estimate the necessary parameters. In this paper, the aging of the protective channel is assumed to follow Weibull distributions. Typical means and variances for the times to failure are considered and combinations of stages are checked. Features of the initial conditions are discussed in connection with the fictitious states and used to check the validity of the developed models. Alternative solutions based on discretization of the failure rates are also generated. The results obtained agree quite well. (author). 7 refs., 6 figs

  5. Comparison of Weibull and Probit Analysis in Toxicity Testing of ...

    African Journals Online (AJOL)

    HP

    Keywords: Hunteria umbellata, Weibull model, Acute toxicity, Median lethal dose (LD50).

  6. The Weibull probabilities analysis on the single kenaf fiber

    Science.gov (United States)

    Ibrahim, I.; Sarip, S.; Bani, N. A.; Ibrahim, M. H.; Hassan, M. Z.

    2018-05-01

    Kenaf fiber has great potential as a replacement for synthetic fibers in composites due to advantages such as being environmentally friendly and offering outstanding performance. However, the main issue with using this natural fiber in structural composites is the inconsistency of its mechanical properties. Here, the influence of the gage length on the mechanical properties of single kenaf fibers was evaluated. The fibers were tested using a universal testing machine at a loading rate of 1 mm per min following the ASTM D3822 standard. In this study, treated fibers with gage lengths of 20, 30 and 40 mm were tested. Weibull probability analysis was then used to characterize the tensile strength and Young's modulus of the kenaf fiber. The average tensile strength predicted by this approach is in good agreement with the experimental results.
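
    A minimal sketch of the strength characterization step: fit a two-parameter Weibull to single-fibre strength data (here by maximum likelihood rather than whichever plotting method the paper used) and compare the Weibull-predicted mean strength, σ0·Γ(1 + 1/m), with the sample mean. The strength values are invented placeholders for a single gage length.

```python
import numpy as np
from scipy import stats
from scipy.special import gamma

# Illustrative single-fibre tensile strengths (MPa) for one gage length
strength = np.array([312., 405., 488., 365., 520., 290., 450., 398., 470., 355.])

# Two-parameter Weibull fit by maximum likelihood (location fixed at zero)
m, _, sigma0 = stats.weibull_min.fit(strength, floc=0)

# Mean strength predicted by the fitted Weibull vs. the sample mean
predicted_mean = sigma0 * gamma(1.0 + 1.0 / m)
print(f"Weibull modulus m = {m:.2f}, scale = {sigma0:.0f} MPa")
print(f"predicted mean = {predicted_mean:.0f} MPa, sample mean = {strength.mean():.0f} MPa")
```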

  7. pT spectra in pp and AA collisions at RHIC and LHC energies using the Tsallis-Weibull approach

    Science.gov (United States)

    Dash, Sadhana; Mahapatra, D. P.

    2018-04-01

    The Tsallis q-statistics have been incorporated into the Weibull model of particle production, in the form of the q-Weibull distribution, to describe the transverse momentum (pT) distribution of charged hadrons at mid-rapidity measured at RHIC and LHC energies. The q-Weibull distribution is found to describe the observed pT distributions over the entire measured pT range. Below 2.2 GeV/c, while going from peripheral to central collisions, the parameter q is found to decrease systematically towards unity, indicating an evolution from a non-equilibrated system in peripheral collisions towards a more thermalized system in central collisions. However, the trend is reversed in the all-inclusive pT regime. This can be attributed to an increase in the relative contribution of hard pQCD processes in central collisions. The λ parameter is found to be associated with the mean pT or the collective expansion velocity of the produced hadrons, which shows the expected increase with the centrality of collisions. The k parameter is observed to increase with the onset of hard QCD scatterings, initial fluctuations, and other processes leading to non-equilibrium conditions.
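
    For reference, a common parameterization of the q-Weibull density (assumed here; it reduces to the ordinary Weibull as q → 1) replaces the exponential factor with the Tsallis q-exponential. The parameter values in the example are illustrative, not fitted values from the spectra.

```python
import numpy as np

def q_exp(x, q):
    """Tsallis q-exponential; reduces to exp(x) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return np.exp(x)
    base = np.maximum(1.0 + (1.0 - q) * x, 0.0)
    return base ** (1.0 / (1.0 - q))

def q_weibull_pdf(x, q, k, lam):
    """q-Weibull density (2-q)(k/lam)(x/lam)^(k-1) * e_q(-(x/lam)^k), for q < 2."""
    x = np.asarray(x, dtype=float)
    return (2.0 - q) * (k / lam) * (x / lam) ** (k - 1) * q_exp(-(x / lam) ** k, q)

# Shape of a pT spectrum under illustrative (non-fitted) parameter values
pt = np.linspace(0.1, 10.0, 200)
print(q_weibull_pdf(pt, q=1.1, k=1.3, lam=0.8)[:5])
```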

  8. Weibull Analysis of Electrical Breakdown Strength as an Effective Means of Evaluating Elastomer Thin Film Quality

    DEFF Research Database (Denmark)

    Silau, Harald; Stabell, Nicolai Bogø; Petersen, Frederik Riddersholm

    2018-01-01

    To realize the commercial potential of dielectric elastomers, reliable, large-scale film production is required. Ensuring proper mixing and subsequently avoiding demixing after, for example, pumping and coating of elastomer premix in an online process is not facile. Weibull analysis...... of the electrical breakdown strength of dielectric elastomer films is shown to be an effective means of evaluating the film quality. The analysis is shown to be capable of distinguishing between proper and improper mixing schemes where similar analysis of ultimate mechanical properties fails to distinguish....

  9. Analysis of Statistical Distributions Used for Modeling Reliability and Failure Rate of Temperature Alarm Circuit

    International Nuclear Information System (INIS)

    EI-Shanshoury, G.I.

    2011-01-01

    Several statistical distributions are used to model various reliability and maintainability parameters. The applied distribution depends on the nature of the data being analyzed. The present paper deals with the analysis of several statistical distributions used in reliability in order to identify the best-fitting distribution. The calculations rely on circuit quantity parameters obtained using the Relex 2009 computer program. The statistical analysis of ten different distributions indicated that the Weibull distribution gives the best fit for modeling the reliability of the data set of the Temperature Alarm Circuit (TAC). However, the exponential distribution is found to be the best fit for modeling the failure rate.

  10. A Weibull Approach for Enabling Safety-Oriented Decision-Making for Electronic Railway Signaling Systems

    Directory of Open Access Journals (Sweden)

    Emanuele Pascale

    2018-04-01

    Full Text Available This paper presents the advantages of using Weibull distributions, within the context of railway signaling systems, for enabling safety-oriented decision-making. Failure rates are used to statistically model the basic event of fault-tree analysis, and their value sizes the maximum allowable latency of failures to fulfill the safety target for which the system has been designed. Relying on field-return failure data, Weibull parameters have been calculated for an existing electronic signaling system and a comparison with existing predictive reliability data, based on exponential distribution, is provided. Results are discussed in order to drive considerations on the respect of quantitative targets and on the impact that a wrong hypothesis might have on the choice of a given architecture. Despite the huge amount of information gathered through the after-sales logbook used to build reliability distribution, several key elements for reliable estimation of failure rate values are still missing. This might affect the uncertainty of reliability parameters and the effort required to collect all the information. We then present how to intervene when operational failure rates present higher values compared to the theoretical approach: increasing the redundancies of the system or performing preventive maintenance tasks. Possible consequences of unjustified adoption of constant failure rate are presented. Some recommendations are also shared in order to build reliability-oriented logbooks and avoid data censoring phenomena by enhancing the functions of the electronic boards composing the system.

  11. Determining the parameters of Weibull function to estimate the wind power potential in conditions of limited source meteorological data

    Science.gov (United States)

    Fetisova, Yu. A.; Ermolenko, B. V.; Ermolenko, G. V.; Kiseleva, S. V.

    2017-04-01

    We studied the information basis for the assessment of wind power potential on the territory of Russia. We describe the methodology for determining the parameters of the Weibull function, which represents the probability density of wind speeds at a defined basic height above the surface of the earth, using the available data on the average speed at this height and its frequency by speed gradations. The application of the least squares method for determining these parameters, unlike the use of graphical methods, allows a statistical assessment of how well the Weibull formula approximates the empirical histograms. On the basis of computer-aided analysis of the statistical data, it was shown that, at a fixed point where the wind speed changes with height, the range of variation of the Weibull distribution parameters is relatively small, the sensitivity of the function to parameter changes is quite low, and the influence of such changes on the shape of the speed distribution curves is negligible. Taking this into consideration, we proposed and mathematically verified a methodology for determining the parameters of the Weibull function at other heights from the parameters computed at a basic height, which is known or defined by the average wind speed or by the roughness coefficient of the underlying surface. We give examples of practical application of the suggested methodology in the development of the Atlas of Renewable Energy Resources in Russia under conditions of limited source meteorological data. The proposed methodology, to some extent, may solve the problem related to the lack of information on the vertical profile of wind speed frequencies, given the wide assortment of wind turbines on the global market with different ranges of wind-wheel axis heights and various performance characteristics; as a result, this methodology can become a powerful tool for
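
    The least-squares step described above can be sketched as a curve fit of the Weibull density to the empirical relative frequencies of wind speed by 1 m/s gradations; the covariance returned by the fit supports the statistical assessment mentioned in the abstract. The histogram values below are invented for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_pdf(v, k, c):
    """Weibull density of wind speed with shape k and scale c."""
    return (k / c) * (v / c) ** (k - 1) * np.exp(-(v / c) ** k)

# Relative frequencies of wind speed by 1 m/s gradations (illustrative values)
v_mid = np.arange(0.5, 15.5, 1.0)
freq = np.array([0.04, 0.09, 0.13, 0.15, 0.14, 0.12, 0.10, 0.08,
                 0.06, 0.04, 0.02, 0.015, 0.01, 0.005, 0.0])
freq = freq / freq.sum()     # normalise to a probability per 1 m/s bin

# Least-squares fit of the Weibull density to the empirical histogram
(k, c), cov = curve_fit(weibull_pdf, v_mid, freq, p0=[2.0, 6.0])
print(f"k = {k:.2f}, c = {c:.2f} m/s; parameter std. errors = {np.sqrt(np.diag(cov))}")
```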

  12. The distribution of first-passage times and durations in FOREX and future markets

    Science.gov (United States)

    Sazuka, Naoya; Inoue, Jun-ichi; Scalas, Enrico

    2009-07-01

    Possible distributions are discussed for intertrade durations and first-passage processes in financial markets. The view-point of renewal theory is assumed. In order to represent market data with relatively long durations, two types of distributions are used, namely a distribution derived from the Mittag-Leffler survival function and the Weibull distribution. For the Mittag-Leffler type distribution, the average waiting time (residual life time) is strongly dependent on the choice of a cut-off parameter tmax, whereas the results based on the Weibull distribution do not depend on such a cut-off. Therefore, a Weibull distribution is more convenient than a Mittag-Leffler type if one wishes to evaluate relevant statistics such as average waiting time in financial markets with long durations. On the other hand, we find that the Gini index is rather independent of the cut-off parameter. Based on the above considerations, we propose a good candidate for describing the distribution of first-passage time in a market: The Weibull distribution with a power-law tail. This distribution compensates the gap between theoretical and empirical results more efficiently than a simple Weibull distribution. It should be stressed that a Weibull distribution with a power-law tail is more flexible than the Mittag-Leffler distribution, which itself can be approximated by a Weibull distribution and a power-law. Indeed, the key point is that in the former case there is freedom of choice for the exponent of the power-law attached to the Weibull distribution, which can exceed 1 in order to reproduce decays faster than possible with a Mittag-Leffler distribution. We also give a useful formula to determine an optimal crossover point minimizing the difference between the empirical average waiting time and the one predicted from renewal theory. Moreover, we discuss the limitation of our distributions by applying our distribution to the analysis of the BTP future and calculating the average waiting

  13. Use of finite mixture distribution models in the analysis of wind energy in the Canarian Archipelago

    International Nuclear Information System (INIS)

    Carta, Jose Antonio; Ramirez, Penelope

    2007-01-01

    The statistical characteristics of hourly mean wind speed data recorded at 16 weather stations located in the Canarian Archipelago are analyzed in this paper. As a result of this analysis we see that the typical two-parameter Weibull wind speed distribution (W-pdf) does not accurately represent all wind regimes observed in that region. However, a Singly Truncated from below Normal Weibull mixture distribution (TNW-pdf) and a two-component mixture Weibull distribution (WW-pdf) developed here do provide very good fits for both unimodal and bimodal wind speed frequency distributions observed in that region and offer smaller relative errors in determining the annual mean wind power density. The parameters of the distributions are estimated using the least squares method, which is solved here using the Levenberg-Marquardt algorithm. The suitability of the distributions is judged from the probability plot correlation coefficient R², adjusted for degrees of freedom. Based on the results obtained, we conclude that the two mixture distributions proposed here provide very flexible models for wind speed studies and can be applied in a widespread manner to represent the wind regimes in the Canarian archipelago and in other regions with similar characteristics. The TNW-pdf takes into account the frequency of null winds, whereas the WW-pdf and W-pdf do not. It can, therefore, better represent wind regimes with high percentages of null wind speeds. However, calculation of the TNW-pdf is markedly slower.

  14. Bayesian and Classical Estimation of Stress-Strength Reliability for Inverse Weibull Lifetime Models

    Directory of Open Access Journals (Sweden)

    Qixuan Bi

    2017-06-01

    Full Text Available In this paper, we consider the problem of estimating stress-strength reliability for inverse Weibull lifetime models having the same shape parameters but different scale parameters. We obtain the maximum likelihood estimator and its asymptotic distribution. Since the classical estimator does not have an explicit form, we propose an approximate maximum likelihood estimator. The asymptotic confidence interval and two bootstrap intervals are obtained. Using the Gibbs sampling technique, the Bayesian estimator and the corresponding credible interval are obtained. The Metropolis-Hastings algorithm is used to generate random variates. Monte Carlo simulations are conducted to compare the proposed methods. Analysis of a real dataset is performed.

  15. Dependence of Weibull distribution parameters on the CNR threshold in wind lidar data

    DEFF Research Database (Denmark)

    Gryning, Sven-Erik; Batchvarova, Ekaterina; Floors, Rogier Ralph

    2015-01-01

    in the boundary layer. Observations from tall towers in combination with observations from a lidar of wind speed up to 600 m are used to study the long-term variability of the wind profile over sub-urban, rural, coastal and marine areas. The variability is expressed in terms of the shape parameter in the Weibull...... over land, both terms are about equally important in the coastal area where the height of the reversal height is low and in the marine conditions, the second term dominates....

  16. Comparison of Weibull strength parameters from flexure and spin tests of brittle materials

    Science.gov (United States)

    Holland, Frederic A., Jr.; Zaretsky, Erwin V.

    1991-01-01

    Fracture data from five series of four-point bend tests of beams and spin tests of flat annular disks were reanalyzed. Silicon nitride and graphite were the test materials. The experimental fracture strengths of the disks were compared with the predicted strengths based on both volume flaw and surface flaw analyses of the four-point bend data. Volume flaw analysis resulted in a better correlation between disks and beams in three of the five test series than did surface flaw analysis. The Weibull moduli and characteristic gage strengths for the disks and beams were also compared. Differences in the experimental Weibull slopes were not statistically significant. It was shown that results from the beam tests can predict the fracture strength of rotating disks.

  17. Foam-forming properties of Ilex paraguariensis (mate) saponin: foamability and foam lifetime analysis by Weibull equation

    Directory of Open Access Journals (Sweden)

    Janine Treter

    2010-01-01

    Full Text Available Saponins are natural soap-like foam-forming compounds widely used in foods and in cosmetic and pharmaceutical preparations. In this work the foamability and foam lifetime of foams obtained from Ilex paraguariensis unripe fruits were analyzed. Polysorbate 80 and sodium dodecyl sulfate were used as reference surfactants. Aiming at a better understanding of the data, a linearized 4-parameter Weibull function was proposed. The mate hydroethanolic extract (ME) and a mate saponin-enriched fraction (MSF) afforded foamability and foam lifetime comparable to the synthetic surfactants. The linearization of the Weibull equation allowed the statistical comparison of foam decay curves, improving on former mathematical approaches.

  18. Spatial and temporal patterns of global onshore wind speed distribution

    International Nuclear Information System (INIS)

    Zhou, Yuyu; Smith, Steven J

    2013-01-01

    Wind power, a renewable energy source, can play an important role in electrical energy generation. Information regarding wind energy potential is important both for energy related modeling and for decision-making in the policy community. While wind speed datasets with high spatial and temporal resolution are often ultimately used for detailed planning, simpler assumptions are often used in analysis work. An accurate representation of the wind speed frequency distribution is needed in order to properly characterize wind energy potential. Using a power density method, this study estimated global variation in wind parameters as fitted to a Weibull density function using NCEP/climate forecast system reanalysis (CFSR) data over land areas. The Weibull distribution performs well in fitting the time series wind speed data at most locations according to R 2 , root mean square error, and power density error. The wind speed frequency distribution, as represented by the Weibull k parameter, exhibits a large amount of spatial variation, a regionally varying amount of seasonal variation, and relatively low decadal variation. We also analyzed the potential error in wind power estimation when a commonly assumed Rayleigh distribution (Weibull k = 2) is used. We find that the assumption of the same Weibull parameter across large regions can result in non-negligible errors. While large-scale wind speed data are often presented in the form of mean wind speeds, these results highlight the need to also provide information on the wind speed frequency distribution. (letter)
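
    The error from assuming a Rayleigh shape can be sketched with the energy pattern factor E[v^3]/E[v]^3 of a Weibull distribution, which depends only on k: holding the mean wind speed fixed, the relative error in power density is the ratio of the factors at k = 2 and at the true k. The k values below are illustrative.

```python
import numpy as np
from scipy.special import gamma

def energy_pattern_factor(k):
    """E[v^3] / (E[v])^3 for a Weibull distribution with shape k (scale cancels)."""
    return gamma(1.0 + 3.0 / k) / gamma(1.0 + 1.0 / k) ** 3

# Relative error in wind power density if a Rayleigh shape (k = 2) is assumed
# while the true shape differs, with the mean wind speed held fixed
for k_true in [1.6, 1.8, 2.0, 2.4, 3.0]:
    err = energy_pattern_factor(2.0) / energy_pattern_factor(k_true) - 1.0
    print(f"true k = {k_true:.1f}: power density error = {err:+.1%}")
```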

  19. Time-dependent fiber bundles with local load sharing. II. General Weibull fibers.

    Science.gov (United States)

    Phoenix, S Leigh; Newman, William I

    2009-12-01

    Fiber bundle models (FBMs) are useful tools in understanding failure processes in a variety of material systems. While the fibers and load sharing assumptions are easily described, FBM analysis is typically difficult. Monte Carlo methods are also hampered by the severe computational demands of large bundle sizes, which overwhelm just as behavior relevant to real materials starts to emerge. For large size scales, interest continues in idealized FBMs that assume either equal load sharing (ELS) or local load sharing (LLS) among fibers, rules that reflect features of real load redistribution in elastic lattices. The present work focuses on a one-dimensional bundle of N fibers under LLS where life consumption in a fiber follows a power law in its load, with exponent rho, and integrated over time. This life consumption function is further embodied in a functional form resulting in a Weibull distribution for lifetime under constant fiber stress and with Weibull exponent, beta. Thus the failure rate of a fiber depends on its past load history, except for beta = 1. We develop asymptotic results validated by Monte Carlo simulation using a computational algorithm developed in our previous work [Phys. Rev. E 63, 021507 (2001)] that greatly increases the size, N, of treatable bundles (e.g., 10^6 fibers in 10^3 realizations). In particular, our algorithm is O(N ln N) in contrast with former algorithms which were O(N^2), making this investigation possible. Regimes are found for (beta, rho) pairs that yield contrasting behavior for large N. For rho>1 and large N, brittle weakest volume behavior emerges in terms of characteristic elements (groupings of fibers) derived from critical cluster formation, and the lifetime eventually goes to zero as N → ∞, unlike ELS, which yields a finite limiting mean. For 1/21 but with 0Weibull exponent for fiber strength.

  20. Quality-Related Monitoring and Grading of Granulated Products by Weibull-Distribution Modeling of Visual Images with Semi-Supervised Learning.

    Science.gov (United States)

    Liu, Jinping; Tang, Zhaohui; Xu, Pengfei; Liu, Wenzhong; Zhang, Jin; Zhu, Jianyong

    2016-06-29

    The topic of online product quality inspection (OPQI) with smart visual sensors is attracting increasing interest in both the academic and industrial communities on account of the natural connection between the visual appearance of products with their underlying qualities. Visual images captured from granulated products (GPs), e.g., cereal products, fabric textiles, are comprised of a large number of independent particles or stochastically stacking locally homogeneous fragments, whose analysis and understanding remains challenging. A method of image statistical modeling-based OPQI for GP quality grading and monitoring by a Weibull distribution(WD) model with a semi-supervised learning classifier is presented. WD-model parameters (WD-MPs) of GP images' spatial structures, obtained with omnidirectional Gaussian derivative filtering (OGDF), which were demonstrated theoretically to obey a specific WD model of integral form, were extracted as the visual features. Then, a co-training-style semi-supervised classifier algorithm, named COSC-Boosting, was exploited for semi-supervised GP quality grading, by integrating two independent classifiers with complementary nature in the face of scarce labeled samples. Effectiveness of the proposed OPQI method was verified and compared in the field of automated rice quality grading with commonly-used methods and showed superior performance, which lays a foundation for the quality control of GP on assembly lines.

  1. Determination of Reliability Index and Weibull Modulus as a Measure of Hypereutectic Silumins Survival

    Directory of Open Access Journals (Sweden)

    J. Szymszal

    2007-07-01

    Full Text Available The first part of the study describes the methods used to determine the Weibull modulus and the related reliability index of hypereutectic silumins containing about 17% Si, intended for the manufacture of high-duty castings used in automotive and aviation applications. The second part of the study discusses the importance of chemical composition, including the additions of 3% Cu, 1,5% Ni and 1,5% Mg, while the third part focuses on the effect of process history, including mould type (sand or metal) as well as the inoculation process and heat treatment (solutioning and ageing) applied to the cast AlSi17Cu3Mg1,5Ni1,5 alloy, on the shape of the Weibull distribution function and the reliability index calculated for the tensile strength Rm of the investigated alloys.

  2. A comparison of estimation methods for fitting Weibull, Johnson's SB and beta functions to Pinus pinaster, Pinus radiata and Pinus sylvestris stands in northwest Spain

    Energy Technology Data Exchange (ETDEWEB)

    Gorgoseo, J. J.; Rojo, A.; Camara-Obregon, A.; Dieguez-Aranda, U.

    2012-07-01

    The purpose of this study was to compare the accuracy of the Weibull, Johnson's SB and beta distributions, fitted with some of the most usual methods and with different fixed values for the location parameters, for describing diameter distributions in even-aged stands of Pinus pinaster, Pinus radiata and Pinus sylvestris in northwest Spain. A total of 155 permanent plots in Pinus sylvestris stands throughout Galicia, 183 plots in Pinus pinaster stands throughout Galicia and Asturias and 325 plots in Pinus radiata stands in both regions were measured to describe the diameter distributions. Parameters of the Weibull function were estimated by Moments and Maximum Likelihood approaches, those of Johnson's SB function by Conditional Maximum Likelihood and by Knoebel and Burkhart's method, and those of the beta function with the method based on the moments of the distribution. The beta and Johnson's SB functions were slightly superior to the Weibull function for Pinus pinaster stands; the Johnson's SB and beta functions were more accurate in the best fits for Pinus radiata stands, and the best results of the Weibull and Johnson's SB functions were slightly superior to the beta function for Pinus sylvestris stands. However, all three functions are suitable for these stands given an appropriate value of the location parameter and a suitable parameter estimation method. (Author) 44 refs.

  3. The distribution choice for the threshold of solid state relay

    International Nuclear Information System (INIS)

    Sun Beiyun; Zhou Hui; Cheng Xiangyue; Mao Congguang

    2009-01-01

    Either the normal distribution or the Weibull distribution can be accepted as the sample distribution of the threshold of a solid state relay. Based on the goodness-of-fit method, the bootstrap method and the Bayesian method, the Weibull distribution is ultimately chosen. (authors)

  4. Testing homogeneity in Weibull-regression models.

    Science.gov (United States)

    Bolfarine, Heleno; Valença, Dione M

    2005-10-01

    In survival studies with families or geographical units it may be of interest to test whether such groups are homogeneous for given explanatory variables. In this paper we consider score-type tests for group homogeneity based on a mixing model in which the group effect is modelled as a random variable. As opposed to hazard-based frailty models, this model yields survival times that, conditioned on the random effect, have an accelerated failure time representation. The test statistic requires only estimation of the conventional regression model without the random effect and does not require specifying the distribution of the random effect. The tests are derived for a Weibull regression model and, in the uncensored situation, a closed form is obtained for the test statistic. A simulation study is used for comparing the power of the tests. The proposed tests are applied to real data sets with censored data.

  5. Small Sample Properties of Asymptotically Efficient Estimators of the Parameters of a Bivariate Gaussian–Weibull Distribution

    Science.gov (United States)

    Steve P. Verrill; James W. Evans; David E. Kretschmann; Cherilyn A. Hatfield

    2012-01-01

    Two important wood properties are stiffness (modulus of elasticity or MOE) and bending strength (modulus of rupture or MOR). In the past, MOE has often been modeled as a Gaussian and MOR as a lognormal or a two or three parameter Weibull. It is well known that MOE and MOR are positively correlated. To model the simultaneous behavior of MOE and MOR for the purposes of...

  6. Fissure formation in coke. 3: Coke size distribution and statistical analysis

    Energy Technology Data Exchange (ETDEWEB)

    D.R. Jenkins; D.E. Shaw; M.R. Mahoney [CSIRO, North Ryde, NSW (Australia). Mathematical and Information Sciences

    2010-07-15

    A model of coke stabilization, based on a fundamental model of fissuring during carbonisation is used to demonstrate the applicability of the fissuring model to actual coke size distributions. The results indicate that the degree of stabilization is important in determining the size distribution. A modified form of the Weibull distribution is shown to provide a better representation of the whole coke size distribution compared to the Rosin-Rammler distribution, which is generally only fitted to the lump coke. A statistical analysis of a large number of experiments in a pilot scale coke oven shows reasonably good prediction of the coke mean size, based on parameters related to blend rank, amount of low rank coal, fluidity and ash. However, the prediction of measures of the spread of the size distribution is more problematic. The fissuring model, the size distribution representation and the statistical analysis together provide a comprehensive capability for understanding and predicting the mean size and distribution of coke lumps produced during carbonisation. 12 refs., 16 figs., 4 tabs.

  7. On the Distribution of Earthquake Interevent Times and the Impact of Spatial Scale

    Science.gov (United States)

    Hristopulos, Dionissios

    2013-04-01

    The distribution of earthquake interevent times is a subject that has attracted much attention in the statistical physics literature [1-3]. A recent paper proposes that the distribution of earthquake interevent times follows from the interplay of the crustal strength distribution and the loading function (stress versus time) of the Earth's crust locally [4]. It was also shown that the Weibull distribution describes earthquake interevent times provided that the crustal strength also follows the Weibull distribution and that the loading function follows a power-law during the loading cycle. I will discuss the implications of this work and will present supporting evidence based on the analysis of data from seismic catalogs. I will also discuss the theoretical evidence in support of the Weibull distribution based on models of statistical physics [5]. Since other-than-Weibull interevent times distributions are not excluded in [4], I will illustrate the use of the Kolmogorov-Smirnov test in order to determine which probability distributions are not rejected by the data. Finally, we propose a modification of the Weibull distribution if the size of the system under investigation (i.e., the area over which the earthquake activity occurs) is finite with respect to a critical link size. Keywords: hypothesis testing, modified Weibull, hazard rate, finite size References [1] Corral, A., 2004. Long-term clustering, scaling, and universality in the temporal occurrence of earthquakes, Phys. Rev. Lett., 92(10), art. no. 108501. [2] Saichev, A., Sornette, D. 2007. Theory of earthquake recurrence times, J. Geophys. Res., Ser. B 112, B04313/1-26. [3] Touati, S., Naylor, M., Main, I.G., 2009. Origin and nonuniversality of the earthquake interevent time distribution Phys. Rev. Lett., 102 (16), art. no. 168501. [4] Hristopulos, D.T., 2003. Spartan Gibbs random field models for geostatistical applications, SIAM Jour. Sci. Comput., 24, 2125-2162. [5] I. Eliazar and J. Klafter, 2006
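
    A minimal sketch of the Kolmogorov-Smirnov screening mentioned above: fit candidate distributions to an interevent-time sample and compute the KS statistic for each. The sample here is synthetic, and because the parameters are estimated from the same data the reported p-values are only indicative (optimistic).

```python
import numpy as np
from scipy import stats

# Synthetic interevent times (days) standing in for a seismic catalogue sample
rng = np.random.default_rng(7)
dt = stats.weibull_min.rvs(0.8, scale=12.0, size=400, random_state=rng)

# Fit candidate distributions and apply the Kolmogorov-Smirnov test to each
for name, dist in [("weibull_min", stats.weibull_min),
                   ("expon", stats.expon),
                   ("lognorm", stats.lognorm)]:
    params = dist.fit(dt, floc=0)
    ks = stats.kstest(dt, name, args=params)
    # Note: p-values are optimistic because parameters were fitted to the data
    print(f"{name:12s} D = {ks.statistic:.3f}, p = {ks.pvalue:.3f}")
```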

  8. Statistical distributions as applied to environmental surveillance data

    International Nuclear Information System (INIS)

    Speer, D.R.; Waite, D.A.

    1976-01-01

    Application of the normal, lognormal, and Weibull distributions to radiological environmental surveillance data was investigated for approximately 300 nuclide-medium-year-location combinations. The fit of data to distributions was compared through probability plotting (special graph paper provides a visual check) and W test calculations. Results show that 25% of the data fit the normal distribution, 50% fit the lognormal, and 90% fit the Weibull. Demonstration of how to plot each distribution shows that the normal and lognormal distributions are comparatively easy to use, while the Weibull distribution is complicated and difficult to use. Although current practice is to use normal distribution statistics, the normal distribution fit the smallest number of data groups considered in this study.

  9. A comparative study of mixed exponential and Weibull distributions in a stochastic model replicating a tropical rainfall process

    Science.gov (United States)

    Abas, Norzaida; Daud, Zalina M.; Yusof, Fadhilah

    2014-11-01

    A stochastic rainfall model is presented for the generation of hourly rainfall data in an urban area in Malaysia. In view of the high temporal and spatial variability of rainfall within the tropical rain belt, the Spatial-Temporal Neyman-Scott Rectangular Pulse model was used. The model, which is governed by the Neyman-Scott process, employs a reasonable number of parameters to represent the physical attributes of rainfall. A common approach is to attach each attribute to a mathematical distribution. With respect to rain cell intensity, this study proposes the use of a mixed exponential distribution. The performance of the proposed model was compared to a model that employs the Weibull distribution. Hourly and daily rainfall data from four stations in the Damansara River basin in Malaysia were used as input to the models, and simulations of hourly series were performed for an independent site within the basin. The performance of the models was assessed based on how closely the statistical characteristics of the simulated series resembled the statistics of the observed series. The findings obtained based on graphical representation revealed that the statistical characteristics of the simulated series for both models compared reasonably well with the observed series. However, a further assessment using the AIC, BIC and RMSE showed that the proposed model yields better results. The results of this study indicate that for tropical climates, the proposed model, using a mixed exponential distribution, is the best choice for generation of synthetic data for ungauged sites or for sites with insufficient data within the limit of the fitted region.

  10. Maximum likelihood estimation of the parameters of a bivariate Gaussian-Weibull distribution from machine stress-rated data

    Science.gov (United States)

    Steve P. Verrill; David E. Kretschmann; James W. Evans

    2016-01-01

    Two important wood properties are stiffness (modulus of elasticity, MOE) and bending strength (modulus of rupture, MOR). In the past, MOE has often been modeled as a Gaussian and MOR as a lognormal or a two- or threeparameter Weibull. It is well known that MOE and MOR are positively correlated. To model the simultaneous behavior of MOE and MOR for the purposes of wood...

  11. The effect of wall thickness distribution on mechanical reliability and strength in unidirectional porous ceramics

    Science.gov (United States)

    Seuba, Jordi; Deville, Sylvain; Guizard, Christian; Stevenson, Adam J.

    2016-01-01

    Macroporous ceramics exhibit an intrinsic strength variability caused by the random distribution of defects in their structure. However, the precise role of microstructural features, other than pore volume, on reliability is still unknown. Here, we analyze the applicability of the Weibull analysis to unidirectional macroporous yttria-stabilized zirconia (YSZ) prepared by ice-templating. First, we performed crush tests on samples with controlled microstructural features, with the loading direction parallel to the porosity. The compressive strength data were fitted using two different fitting techniques, ordinary least squares and Bayesian Markov Chain Monte Carlo, to evaluate whether Weibull statistics are an adequate descriptor of the strength distribution. The statistical descriptors indicated that the strength data are well described by the Weibull statistical approach, for both fitting methods used. Furthermore, we assess the effect of different microstructural features (volume, size, densification of the walls, and morphology) on the Weibull modulus and strength. We found that the key microstructural parameter controlling reliability is wall thickness. In contrast, pore volume is the main parameter controlling the strength. The highest Weibull modulus (?) and mean strength (198.2 MPa) were obtained for the samples with the smallest and narrowest wall thickness distribution (3.1 μm) and lowest pore volume (54.5%).

  12. Calculation of the ceramics Weibull parameters

    Czech Academy of Sciences Publication Activity Database

    Fuis, Vladimír; Návrat, Tomáš

    2011-01-01

    Roč. 58, - (2011), s. 642-647 ISSN 2010-376X. [International Conference on Bioinformatics and Biomedicine 2011. Bali, 26.10.2011-28.10.2011] Institutional research plan: CEZ:AV0Z20760514 Keywords : biomaterial parameters * Weibull statistics * ceramics Subject RIV: BO - Biophysics http://www.waset.org/journals/waset/v58/v58-132.pdf

  13. The effect of core material, veneering porcelain, and fabrication technique on the biaxial flexural strength and weibull analysis of selected dental ceramics.

    Science.gov (United States)

    Lin, Wei-Shao; Ercoli, Carlo; Feng, Changyong; Morton, Dean

    2012-07-01

    The objective of this study was to compare the effect of veneering porcelain (monolithic or bilayer specimens) and core fabrication technique (heat-pressed or CAD/CAM) on the biaxial flexural strength and Weibull modulus of leucite-reinforced and lithium-disilicate glass ceramics. In addition, the effect of veneering technique (heat-pressed or powder/liquid layering) for zirconia ceramics on the biaxial flexural strength and Weibull modulus was studied. Five ceramic core materials (IPS Empress Esthetic, IPS Empress CAD, IPS e.max Press, IPS e.max CAD, IPS e.max ZirCAD) and three corresponding veneering porcelains (IPS Empress Esthetic Veneer, IPS e.max Ceram, IPS e.max ZirPress) were selected for this study. Each core material group contained three subgroups based on the core material thickness and the presence of corresponding veneering porcelain as follows: 1.5 mm core material only (subgroup 1.5C), 0.8 mm core material only (subgroup 0.8C), and 1.5 mm core/veneer group: 0.8 mm core with 0.7 mm corresponding veneering porcelain with a powder/liquid layering technique (subgroup 0.8C-0.7VL). The ZirCAD group had one additional 1.5 mm core/veneer subgroup with 0.7 mm heat-pressed veneering porcelain (subgroup 0.8C-0.7VP). The biaxial flexural strengths were compared for each subgroup (n = 10) according to ISO standard 6872:2008 with ANOVA and Tukey's post hoc multiple comparison test (p≤ 0.05). The reliability of strength was analyzed with the Weibull distribution. For all core materials, the 1.5 mm core/veneer subgroups (0.8C-0.7VL, 0.8C-0.7VP) had significantly lower mean biaxial flexural strengths (p Empress and e.max groups, regardless of core thickness and fabrication techniques. Comparing fabrication techniques, Empress Esthetic/CAD, e.max Press/CAD had similar biaxial flexural strength (p= 0.28 for Empress pair; p= 0.87 for e.max pair); however, e.max CAD/Press groups had significantly higher flexural strength (p Empress Esthetic/CAD groups. Monolithic core

  14. The effect of mis-specification on mean and selection between the Weibull and lognormal models

    Science.gov (United States)

    Jia, Xiang; Nadarajah, Saralees; Guo, Bo

    2018-02-01

    The lognormal and Weibull models are commonly used to analyse data. Although selection procedures have been extensively studied, it is possible that the lognormal model could be selected when the true model is Weibull, or vice versa. As the mean is important in applications, we focus on the effect of mis-specification on the mean. The effect on the lognormal mean is first considered when a lognormal sample is wrongly fitted by a Weibull model. The maximum likelihood estimate (MLE) and quasi-MLE (QMLE) of the lognormal mean are obtained based on the lognormal and Weibull models. Then, the impact is evaluated by computing the ratio of biases and the ratio of mean squared errors (MSEs) between the MLE and QMLE. For completeness, the theoretical results are demonstrated by simulation studies. Next, the effect of the reverse mis-specification on the Weibull mean is discussed. It is found that the ratio of biases and the ratio of MSEs are independent of the location and scale parameters of the lognormal and Weibull models. The influence could be ignored if some special conditions hold. Finally, a model selection method is proposed by comparing ratios concerning the biases and MSEs. We also present published data to illustrate the study.

  15. Combining Generalized Renewal Processes with Non-Extensive Entropy-Based q-Distributions for Reliability Applications

    Directory of Open Access Journals (Sweden)

    Isis Didier Lins

    2018-03-01

    Full Text Available The Generalized Renewal Process (GRP is a probabilistic model for repairable systems that can represent the usual states of a system after a repair: as new, as old, or in a condition between new and old. It is often coupled with the Weibull distribution, widely used in the reliability context. In this paper, we develop novel GRP models based on probability distributions that stem from the Tsallis’ non-extensive entropy, namely the q-Exponential and the q-Weibull distributions. The q-Exponential and Weibull distributions can model decreasing, constant or increasing failure intensity functions. However, the power law behavior of the q-Exponential probability density function for specific parameter values is an advantage over the Weibull distribution when adjusting data containing extreme values. The q-Weibull probability distribution, in turn, can also fit data with bathtub-shaped or unimodal failure intensities in addition to the behaviors already mentioned. Therefore, the q-Exponential-GRP is an alternative for the Weibull-GRP model and the q-Weibull-GRP generalizes both. The method of maximum likelihood is used for their parameters’ estimation by means of a particle swarm optimization algorithm, and Monte Carlo simulations are performed for the sake of validation. The proposed models and algorithms are applied to examples involving reliability-related data of complex systems and the obtained results suggest GRP plus q-distributions are promising techniques for the analyses of repairable systems.

  16. An exponential distribution

    International Nuclear Information System (INIS)

    Anon

    2009-01-01

    In this presentation the author deals with the probabilistic evaluation of product life using the example of the exponential distribution. The exponential distribution is a special one-parameter case of the Weibull distribution.

  17. A general Bayes Weibull inference model for accelerated life testing

    International Nuclear Information System (INIS)

    Dorp, J. Rene van; Mazzuchi, Thomas A.

    2005-01-01

    This article presents the development of a general Bayes inference model for accelerated life testing. The failure times at a constant stress level are assumed to belong to a Weibull distribution, but the specification of strict adherence to a parametric time-transformation function is not required. Rather, prior information is used to indirectly define a multivariate prior distribution for the scale parameters at the various stress levels and the common shape parameter. Using the approach, Bayes point estimates as well as probability statements for use-stress (and accelerated) life parameters may be inferred from a host of testing scenarios. The inference procedure accommodates both the interval data sampling strategy and type I censored sampling strategy for the collection of ALT test data. The inference procedure uses the well-known MCMC (Markov Chain Monte Carlo) methods to derive posterior approximations. The approach is illustrated with an example

  18. Accurate bearing remaining useful life prediction based on Weibull distribution and artificial neural network

    Science.gov (United States)

    Ben Ali, Jaouher; Chebel-Morello, Brigitte; Saidi, Lotfi; Malinowski, Simon; Fnaiech, Farhat

    2015-05-01

    Accurate remaining useful life (RUL) prediction of critical assets is an important challenge in condition based maintenance to improve reliability and decrease machine breakdowns and maintenance costs. Bearings are among the most important components in industry that need to be monitored, and the user should predict their RUL. The challenge of this study is to propose an original feature able to evaluate the health state of bearings and to estimate their RUL by Prognostics and Health Management (PHM) techniques. In this paper, the proposed method is based on the data-driven prognostic approach. The combination of the Simplified Fuzzy Adaptive Resonance Theory Map (SFAM) neural network and the Weibull distribution (WD) is explored. WD is used only in the training phase to fit the measurements and to avoid areas of fluctuation in the time domain. The SFAM training process is based on fitted measurements at the present and previous inspection time points as input, whereas the SFAM testing process is based on real measurements at the present and previous inspections. Thanks to the fuzzy learning process, SFAM has a strong ability and good performance in learning nonlinear time series. As output, seven classes are defined: healthy bearing and six states of bearing degradation. In order to find the optimal RUL prediction, a smoothing phase is proposed in this paper. Experimental results show that the proposed method can reliably predict the RUL of rolling element bearings (REBs) based on vibration signals. The proposed prediction approach can be applied to the prognostics of various other mechanical assets.

  19. Homogeneity and scale testing of generalized gamma distribution

    International Nuclear Information System (INIS)

    Stehlik, Milan

    2008-01-01

    The aim of this paper is to derive the exact distributions of the likelihood ratio tests of the homogeneity and scale hypotheses when the observations are generalized gamma distributed. The special cases of exponential, Rayleigh, Weibull or gamma distributed observations are discussed exclusively. The photoemulsion experiment analysis and a scale test with missing time-to-failure observations are presented to illustrate the applications of the methods discussed

  20. A deterministic inventory model for deteriorating items with selling price dependent demand and three-parameter Weibull distributed deterioration

    Directory of Open Access Journals (Sweden)

    Asoke Kumar Bhunia

    2014-06-01

    Full Text Available In this paper, an attempt is made to develop two inventory models for deteriorating items with variable demand dependent on the selling price and the frequency of advertisement of items. In the first model, shortages are not allowed, whereas in the second they are allowed and partially backlogged with a variable rate dependent on the duration of waiting time up to the arrival of the next lot. In both models, the deterioration rate follows a three-parameter Weibull distribution and the transportation cost is considered explicitly for replenishing the order quantity. This cost depends on the lot-size as well as the distance from the source to the destination. The corresponding models have been formulated and solved. Two numerical examples are considered to illustrate the results, and the significant features of the results are discussed. Finally, based on these examples, the effects of different parameters on the initial stock level, shortage level (in the case of the second model only) and cycle length, along with the optimal profit, have been studied by sensitivity analyses taking one parameter at a time and keeping the other parameters the same.

  1. Adhesively bonded joints composed of pultruded adherends: Considerations at the upper tail of the material strength statistical distribution

    International Nuclear Information System (INIS)

    Vallee, T.; Keller, Th.; Fourestey, G.; Fournier, B.; Correia, J.R.

    2009-01-01

    The Weibull distribution, used to describe the scaling of strength of materials, has been verified on a wide range of materials and geometries: however, the quality of the fitting tended to be less good towards the upper tail. Based on a previously developed probabilistic strength prediction method for adhesively bonded joints composed of pultruded glass fiber-reinforced polymer (GFRP) adherends, where it was verified that a two-parameter Weibull probabilistic distribution was not able to model accurately the upper tail of a material strength distribution, different improved probabilistic distributions were compared to enhance the quality of strength predictions. The following probabilistic distributions were examined: a two-parameter Weibull (as a reference), m-fold Weibull, a Grafted Distribution, a Birnbaum-Saunders Distribution and a Generalized Lambda Distribution. The Generalized Lambda Distribution turned out to be the best analytical approximation for the strength data, providing a good fit to the experimental data, and leading to more accurate joint strength predictions than the original two-parameter Weibull distribution. It was found that a proper modeling of the upper tail leads to a noticeable increase of the quality of the predictions. (authors)

  2. Adhesively bonded joints composed of pultruded adherends: Considerations at the upper tail of the material strength statistical distribution

    Energy Technology Data Exchange (ETDEWEB)

    Vallee, T.; Keller, Th. [Ecole Polytech Fed Lausanne, CCLab, CH-1015 Lausanne, (Switzerland); Fourestey, G. [Ecole Polytech Fed Lausanne, IACS, Chair Modeling and Sci Comp, CH-1015 Lausanne, (Switzerland); Fournier, B. [CEA SACLAY ENSMP, DEN, DANS, DMN, SRMA, LC2M, F-91191 Gif Sur Yvette, (France); Correia, J.R. [Univ Tecn Lisbon, Inst Super Tecn, Civil Engn and Architecture Dept, P-1049001 Lisbon, (Portugal)

    2009-07-01

    The Weibull distribution, used to describe the scaling of strength of materials, has been verified on a wide range of materials and geometries: however, the quality of the fitting tended to be less good towards the upper tail. Based on a previously developed probabilistic strength prediction method for adhesively bonded joints composed of pultruded glass fiber-reinforced polymer (GFRP) adherends, where it was verified that a two-parameter Weibull probabilistic distribution was not able to model accurately the upper tail of a material strength distribution, different improved probabilistic distributions were compared to enhance the quality of strength predictions. The following probabilistic distributions were examined: a two-parameter Weibull (as a reference), m-fold Weibull, a Grafted Distribution, a Birnbaum-Saunders Distribution and a Generalized Lambda Distribution. The Generalized Lambda Distribution turned out to be the best analytical approximation for the strength data, providing a good fit to the experimental data, and leading to more accurate joint strength predictions than the original two-parameter Weibull distribution. It was found that a proper modeling of the upper tail leads to a noticeable increase of the quality of the predictions. (authors)

  3. Evaluation of the Weibull and log normal distribution functions as survival models of Escherichia coli under isothermal and non isothermal conditions.

    Science.gov (United States)

    Aragao, Glaucia M F; Corradini, Maria G; Normand, Mark D; Peleg, Micha

    2007-11-01

    Published survival curves of Escherichia coli in two growth media, with and without the presence of salt, at various temperatures and in a Greek eggplant salad having various levels of essential oil, all had a characteristic downward concavity when plotted on semi logarithmic coordinates. Some also exhibited what appeared as a 'shoulder' of considerable length. Regardless of whether a shoulder was noticed, the survival pattern could be considered as a manifestation of an underlying unimodal distribution of the cells' death times. Mathematically, the data could be described equally well by the Weibull and log normal distribution functions, which had similar modes, means, standard deviations and coefficients of skewness. When plotted in their probability density function (PDF) form, the curves also appeared very similar visually. This enabled us to quantify and compare the effect of temperature or essential oil concentration on the organism's survival in terms of these temporal distributions' characteristics. Increased lethality was generally expressed in a shorter mean and mode, a smaller standard deviation and increased overall symmetry as judged by the distributions' degree of skewness. The 'shoulder', as expected, simply indicated that the distribution's standard deviation was much smaller than its mode. Rate models based on the two distribution functions could be used to predict non isothermal survival patterns. They were derived on the assumption that the momentary inactivation rate is the isothermal rate at the momentary temperature at a time that corresponds to the momentary survival ratio. In this application, however, the Weibullian model with a fixed power was not only simpler and more convenient mathematically than the one based on the log normal distribution, but it also provided more accurate estimates of the dynamic inactivation patterns.
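    As a minimal sketch of the Weibullian survival form discussed above (one common parameterization, with parameter values that are illustrative and not taken from the paper), the survival ratio and the statistics of the underlying death-time distribution can be computed as follows:

      # Sketch: Weibullian survival curve S(t) = exp(-(t/b)**n) and the mode, mean and
      # standard deviation of the corresponding death-time distribution (values assumed).
      import numpy as np
      from math import gamma

      b, n = 4.0, 1.8                        # scale (time units) and shape, illustrative
      t = np.linspace(0.0, 12.0, 7)

      survival = np.exp(-(t / b) ** n)       # fraction of survivors at time t
      mode = b * ((n - 1) / n) ** (1 / n) if n > 1 else 0.0
      mean = b * gamma(1 + 1 / n)
      sd = b * np.sqrt(gamma(1 + 2 / n) - gamma(1 + 1 / n) ** 2)

      print("survival ratios:", np.round(survival, 3))
      print(f"death-time distribution: mode={mode:.2f}, mean={mean:.2f}, sd={sd:.2f}")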

  4. Statistical Analysis of Data for Timber Strengths

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    2003-01-01

    Statistical analyses are performed for material strength parameters from a large number of specimens of structural timber. Non-parametric statistical analysis and fits have been investigated for the following distribution types: Normal, Lognormal, 2 parameter Weibull and 3-parameter Weibull...... fits to the data available, especially if tail fits are used whereas the Log Normal distribution generally gives a poor fit and larger coefficients of variation, especially if tail fits are used. The implications on the reliability level of typical structural elements and on partial safety factors...... for timber are investigated....

  5. Maximum Likelihood Estimates of Parameters in Various Types of Distribution Fitted to Important Data Cases.

    OpenAIRE

    HIROSE,Hideo

    1998-01-01

    TYPES OF THE DISTRIBUTION: Normal distribution (2-parameter); Uniform distribution (2-parameter); Exponential distribution (2-parameter); Weibull distribution (2-parameter); Gumbel distribution (2-parameter); Weibull/Frechet distribution (3-parameter); Generalized extreme-value distribution (3-parameter); Gamma distribution (3-parameter); Extended Gamma distribution (3-parameter); Log-normal distribution (3-parameter); Extended Log-normal distribution (3-parameter); Generalized ...

  6. Calculation of Wind Speeds for Return Period Using Weibull Parameter: A Case Study of Hanbit NPP Area

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jongk Uk; Lee, Kwan Hee; Kim, Sung Il; Yook, Dae Sik; Ahn, Sang Myeon [KINS, Daejeon (Korea, Republic of)

    2016-05-15

    Evaluation of the meteorological characteristics at the nuclear power plant and in the surrounding area should be performed in determining the site suitability for safe operation of the nuclear power plant. Under unexpected emergency conditions, knowledge of meteorological information on the site area is important to provide the basis for estimating environmental impacts resulting from radioactive materials released in gaseous effluents during the accident condition. Among the meteorological information, wind speed and direction are the important meteorological factors for examination of the safety analysis in the nuclear power plant area. Wind characteristics were analyzed for the Hanbit NPP area. It was found that the Weibull parameters k and c vary from 2.56 to 4.77 and from 4.53 to 6.79, respectively, for the directional wind speed distributions. The maximum wind frequency was from the NE and the minimum from the NNW.
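    A minimal sketch of how such directional Weibull parameters k and c can be estimated from wind-speed records is given below; the data here are synthetic placeholders, not the Hanbit measurements:

      # Sketch: two-parameter Weibull fit (location fixed at zero) to wind-speed data.
      import numpy as np
      from scipy.stats import weibull_min

      rng = np.random.default_rng(1)
      wind = rng.weibull(3.0, 8760) * 5.5          # stand-in for one year of hourly data

      k, loc, c = weibull_min.fit(wind, floc=0)    # k = shape, c = scale
      print(f"shape k = {k:.2f}, scale c = {c:.2f} m/s")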

  7. A Weibull-based compositional approach for hierarchical dynamic fault trees

    International Nuclear Information System (INIS)

    Chiacchio, F.; Cacioppo, M.; D'Urso, D.; Manno, G.; Trapani, N.; Compagno, L.

    2013-01-01

    The solution of a dynamic fault tree (DFT) for reliability assessment can be achieved using a wide variety of techniques. These techniques have a strong theoretical foundation, as both the analytical and the simulation methods have been extensively developed. Nevertheless, they all present the same limits, which appear as the size of the fault tree increases (i.e., state space explosion, time-consuming simulations), compromising the resolution. We have tested the feasibility of a composition algorithm based on a Weibull distribution, addressed to the resolution of a general class of dynamic fault trees characterized by non-repairable basic events and generally distributed failure times. The proposed composition algorithm is used to generalize the traditional hierarchical technique that, as previous literature has extensively confirmed, is able to reduce the computational effort of a large DFT through the modularization of independent parts of the tree. The results of this study are achieved both through simulation and analytical techniques, thus confirming the capability to solve a quite general class of dynamic fault trees and overcome the limits of traditional techniques.

  8. Distributions of Autocorrelated First-Order Kinetic Outcomes: Illness Severity.

    Directory of Open Access Journals (Sweden)

    James D Englehardt

    Full Text Available Many complex systems produce outcomes having recurring, power law-like distributions over wide ranges. However, the form necessarily breaks down at extremes, whereas the Weibull distribution has been demonstrated over the full observed range. Here the Weibull distribution is derived as the asymptotic distribution of generalized first-order kinetic processes, with convergence driven by autocorrelation, and entropy maximization subject to finite positive mean, of the incremental compounding rates. Process increments represent multiplicative causes. In particular, illness severities are modeled as such, occurring in proportion to products of, e.g., chronic toxicant fractions passed by organs along a pathway, or rates of interacting oncogenic mutations. The Weibull form is also argued theoretically and by simulation to be robust to the onset of saturation kinetics. The Weibull exponential parameter is shown to indicate the number and widths of the first-order compounding increments, the extent of rate autocorrelation, and the degree to which process increments are distributed exponential. In contrast with the Gaussian result in linear independent systems, the form is driven not by independence and multiplicity of process increments, but by increment autocorrelation and entropy. In some physical systems the form may be attracting, due to multiplicative evolution of outcome magnitudes towards extreme values potentially much larger and smaller than control mechanisms can contain. The Weibull distribution is demonstrated in preference to the lognormal and Pareto I for illness severities versus (a) toxicokinetic models, (b) biologically-based network models, (c) scholastic and psychological test score data for children with prenatal mercury exposure, and (d) time-to-tumor data of the ED01 study.

  9. Recurrent frequency-size distribution of characteristic events

    Directory of Open Access Journals (Sweden)

    S. G. Abaimov

    2009-04-01

    Full Text Available Statistical frequency-size (frequency-magnitude) properties of earthquake occurrence play an important role in seismic hazard assessments. The behavior of earthquakes is represented by two different statistics: interoccurrent behavior in a region and recurrent behavior at a given point on a fault (or at a given fault). The interoccurrent frequency-size behavior has been investigated by many authors and generally obeys the power-law Gutenberg-Richter distribution to a good approximation. It is expected that the recurrent frequency-size behavior should obey different statistics. However, this problem has received little attention because historic earthquake sequences do not contain enough events to reconstruct the necessary statistics. To overcome this lack of data, this paper investigates the recurrent frequency-size behavior for several problems. First, the sequences of creep events on a creeping section of the San Andreas fault are investigated. The applicability of the Brownian passage-time, lognormal, and Weibull distributions to the recurrent frequency-size statistics of slip events is tested and the Weibull distribution is found to be the best-fit distribution. To verify this result the behaviors of numerical slider-block and sand-pile models are investigated and the Weibull distribution is confirmed as the applicable distribution for these models as well. Exponents β of the best-fit Weibull distributions for the observed creep event sequences and for the slider-block model are found to have similar values ranging from 1.6 to 2.2 with the corresponding aperiodicities CV of the applied distribution ranging from 0.47 to 0.64. We also note similarities between recurrent time-interval statistics and recurrent frequency-size statistics.
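    The quoted link between the Weibull exponent β and the aperiodicity CV follows from the Weibull moment formulas; the short check below (not from the paper) approximately reproduces the 0.47-0.64 range for β between 1.6 and 2.2:

      # Quick check: coefficient of variation of a Weibull distribution as a function of
      # its shape exponent beta, using CV = sqrt(G(1+2/b) - G(1+1/b)^2) / G(1+1/b).
      from math import gamma, sqrt

      def weibull_cv(beta):
          m1 = gamma(1 + 1 / beta)
          m2 = gamma(1 + 2 / beta)
          return sqrt(m2 - m1 ** 2) / m1

      for beta in (1.6, 2.2):
          print(f"beta = {beta}: CV = {weibull_cv(beta):.2f}")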

  10. Comparison between the normal and Weibull distributions for analysis of the compressive strength of concrete (doi:10.5216/reec.v9i3.28814)

    Directory of Open Access Journals (Sweden)

    Paulo Eduardo Teodoro

    2014-11-01

    Full Text Available The design of concrete structures involves mathematical modeling of a rather subjective nature. Therefore, the purpose of this study was to verify whether the Normal and Weibull distributions can be applied to the compressive strength data of commercially batched ready-mixed concrete. The study was conducted during the year 2011 in the city of Campo Grande/MS. The compressive strength was evaluated in tests of 189 samples at 28 days, taken from different reinforced concrete constructions in the city. The tests were carried out as prescribed by NBR 5739 (ABNT, 2007). Three goodness-of-fit tests were used to quantify how well the Normal and Weibull distributions fit the experimental data: chi-square, Anderson-Darling and Kolmogorov-Smirnov. Based on this study, the Weibull distribution can be applied to compressive strength data for concrete. This suggests that, despite the complex processes involved in the compressive failure of a quasi-brittle composite material such as concrete, a statistical strength model is effective. Moreover, when comparing the goodness-of-fit tests, there is a large practical difference between the Normal and Weibull distributions. This information is an important experimental addition to the scientific literature regarding the failure of "semi-brittle" materials.

  11. Theoretical derivation of wind power probability distribution function and applications

    International Nuclear Information System (INIS)

    Altunkaynak, Abdüsselam; Erdik, Tarkan; Dabanlı, İsmail; Şen, Zekai

    2012-01-01

    Highlights: ► The wind power stochastic characteristics derived are the standard deviation and the dimensionless skewness. ► Perturbation expressions are obtained for the wind power statistics from the Weibull probability distribution function (PDF). ► Comparisons are made with the corresponding characteristics of the wind speed PDF, which abides by the Weibull PDF. ► The wind power abides by the Weibull PDF. -- Abstract: The instantaneous wind power contained in the air current is directly proportional to the cube of the wind speed. In practice, there is a record of wind speeds in the form of a time series. It is, therefore, necessary to develop a formulation that takes into consideration the statistical parameters of such a time series. The purpose of this paper is to derive the general wind power formulation in terms of the statistical parameters by using the perturbation theory, which leads to a general formulation of the wind power expectation and other statistical parameter expressions such as the standard deviation and the coefficient of variation. The formulation is very general and can be applied specifically for any wind speed probability distribution function. Its application to the two-parameter Weibull probability distribution of wind speeds is presented in full detail. It is concluded that, provided wind speed is distributed according to a Weibull distribution, the wind power can be derived based on wind speed data. It is possible to determine wind power at any desired risk level; however, in practical studies most often 5% or 10% risk levels are preferred, and the necessary simple procedure is presented for this purpose in this paper.
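    For the two-parameter Weibull case described above, the expectation of the cubed wind speed has the closed form E[v^3] = c^3 Γ(1 + 3/k), so the mean power density follows directly; the sketch below uses illustrative k, c and air density values:

      # Sketch: mean wind power density from Weibull wind-speed statistics,
      # using E[v^n] = c**n * Gamma(1 + n/k); parameter values are illustrative.
      from math import gamma

      rho = 1.225          # air density, kg/m^3
      k, c = 2.0, 6.5      # Weibull shape and scale (m/s), assumed

      ev3 = c**3 * gamma(1 + 3 / k)            # E[v^3]
      power_density = 0.5 * rho * ev3          # W/m^2
      print(f"E[v^3] = {ev3:.1f} m^3/s^3, mean power density = {power_density:.1f} W/m^2")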

  12. Modeling the reliability and maintenance costs of wind turbines using Weibull analysis

    Energy Technology Data Exchange (ETDEWEB)

    Vachon, W.A. [W.A. Vachon & Associates, Inc., Manchester, MA (United States)

    1996-12-31

    A general description is provided of the basic mathematics and use of Weibull statistical models for modeling component failures and maintenance costs as a function of time. The applicability of the model to wind turbine components and subsystems is discussed with illustrative examples of typical component reliabilities drawn from actual field experiences. Example results indicate the dominant role of key subsystems based on a combination of their failure frequency and repair/replacement costs. The value of the model is discussed as a means of defining (1) maintenance practices, (2) areas in which to focus product improvements, (3) spare parts inventory, and (4) long-term trends in maintenance costs as an important element in project cash flow projections used by developers, investors, and lenders. 6 refs., 8 figs., 3 tabs.
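    A minimal sketch of the kind of Weibull component model described is given below: reliability, hazard rate and a long-run replacement cost rate for a single subsystem, with the shape, characteristic life and repair cost values assumed purely for illustration:

      # Sketch: Weibull reliability R(t), hazard h(t) and a rough long-run cost rate
      # (repair cost divided by MTTF) for one component; all numbers are placeholders.
      import numpy as np
      from math import gamma

      beta, eta = 1.8, 12.0          # shape and characteristic life in years (assumed)
      repair_cost = 50_000.0         # placeholder cost per failure

      t = np.array([1.0, 5.0, 10.0])
      reliability = np.exp(-(t / eta) ** beta)
      hazard = (beta / eta) * (t / eta) ** (beta - 1)    # failures per year

      mttf = eta * gamma(1 + 1 / beta)
      print("R(t):", np.round(reliability, 3))
      print("h(t):", np.round(hazard, 4))
      print(f"MTTF = {mttf:.1f} yr, long-run cost rate = {repair_cost / mttf:,.0f} per yr")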

  13. Influence of the Testing Gage Length on the Strength, Young's Modulus and Weibull Modulus of Carbon Fibres and Glass Fibres

    Directory of Open Access Journals (Sweden)

    Luiz Claudio Pardini

    2002-10-01

    Full Text Available Carbon fibres and glass fibres are reinforcements for advanced composites, and the fibre strength is the most influential factor on the strength of the composites. They are essentially brittle and fail with very little reduction in cross section. Composites made with these fibres are characterized by a high strength/density ratio, and their properties are intrinsically related to their microstructure, i.e., the amount and orientation of the fibres, surface treatment, among other factors. Processing parameters have an important role in the fibre mechanical behaviour (strength and modulus). Cracks, voids and impurities in the case of glass fibres, and fibrillar misalignments in the case of carbon fibres, are created during processing. Such inhomogeneities give rise to an appreciable scatter in properties. The most widely used statistical tool that deals with this characteristic variability in properties is the Weibull distribution. The present work investigates the influence of the testing gage length on the strength, Young's modulus and Weibull modulus of carbon fibres and glass fibres. The Young's modulus is calculated by two methods: (i) ASTM D 3379M, and (ii) the interaction between the testing equipment and the specimen. The first method resulted in a Young's modulus of 183 GPa for carbon fibre and 76 GPa for glass fibre. The second method gave a Young's modulus of 250 GPa for carbon fibre and 50 GPa for glass fibre. These differences revealed how the specimen/testing machine interaction can affect the Young's modulus calculations. The Weibull modulus can be a tool to evaluate the fibres' homogeneity in terms of properties, and it is a good quality control parameter during processing. In the range of specimen gage lengths tested, the Weibull modulus for carbon fibre is ~ 3.30 and for glass fibre is ~ 5.65, which indicates that, for the batch of fibres tested, the glass fibre is more uniform in properties.
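    The Weibull modulus quoted above is typically obtained by linear regression of the transformed failure probabilities against log strength; the sketch below illustrates the procedure on synthetic single-fibre strengths (not the measured data):

      # Sketch: Weibull modulus by median-rank regression, ln(-ln(1-F)) vs ln(strength).
      import numpy as np

      strengths = np.sort(np.random.default_rng(2).weibull(5.0, 30) * 3.2)  # GPa, synthetic
      n = strengths.size
      F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)        # Bernard's median ranks

      x = np.log(strengths)
      y = np.log(-np.log(1.0 - F))
      m, intercept = np.polyfit(x, y, 1)                 # slope = Weibull modulus

      sigma0 = np.exp(-intercept / m)                    # characteristic strength
      print(f"Weibull modulus m = {m:.2f}, characteristic strength = {sigma0:.2f} GPa")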

  14. Stress-strength reliability for general bivariate distributions

    Directory of Open Access Journals (Sweden)

    Alaa H. Abdel-Hamid

    2016-10-01

    Full Text Available An expression for the stress-strength reliability R = P(X1 < X2) is derived when the underlying model is a general bivariate compound distribution. Such a distribution includes the bivariate compound Weibull, bivariate compound Gompertz and bivariate compound Pareto, among others. In the parametric case, the maximum likelihood estimates of the parameters and the reliability function R are obtained. In the non-parametric case, point and interval estimates of R are developed using Govindarajulu's asymptotic distribution-free method when X1 and X2 are dependent. An example is given when the population distribution is bivariate compound Weibull. Simulation is performed, based on different sample sizes, to study the performance of the estimates.
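    For intuition, the reliability R = P(X1 < X2) can also be approximated by simple Monte Carlo sampling; the sketch below uses independent Weibull stress and strength variables with illustrative parameters, a simplification of the dependent bivariate compound case treated in the paper:

      # Sketch: Monte Carlo estimate of R = P(X1 < X2) for independent Weibull variables.
      import numpy as np

      rng = np.random.default_rng(3)
      stress = rng.weibull(2.0, 100_000) * 4.0       # X1: shape 2.0, scale 4.0 (assumed)
      strength = rng.weibull(3.0, 100_000) * 6.0     # X2: shape 3.0, scale 6.0 (assumed)

      R = np.mean(stress < strength)
      print(f"estimated R = P(X1 < X2) = {R:.3f}")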

  15. Choosing an optimal model for failure data analysis by graphical approach

    International Nuclear Information System (INIS)

    Zhang, Tieling; Dwight, Richard

    2013-01-01

    Many models involving combinations of multiple Weibull distributions, modifications of the Weibull distribution or extensions of its modified versions, etc., have been developed to model a given set of failure data. The application of these models to a given data set can be based on plotting the data on Weibull probability paper (WPP). Of them, two or more models may be appropriate for modeling one typical shape of the fitting plot, whereas a specific model may be fit for analyzing different shapes of the plots. Hence, a problem arises, that is, how to choose an optimal model for a given data set and how to model the data. The motivation of this paper is to address this issue. This paper summarizes the characteristics of Weibull-related models with more than three parameters, including sectional models involving two or three Weibull distributions, the competing risk model and the mixed Weibull model. The models discussed in this paper are appropriate for modeling data for which the shapes of the plots on WPP can be concave, convex, S-shaped or inversely S-shaped. Then, a method for model selection is proposed, which is based on the shapes of the fitting plots. The main procedure for parameter estimation of the models is described accordingly. In addition, the range of the data plots on WPP is clearly highlighted from the practical point of view. This is important to note, as mathematical analysis of a model that neglects the applicable range of the model plot will incur discrepancies or large errors in model selection and parameter estimates

  16. Statistical analysis of the distribution of critical current and the correlation of n value to the critical current of bent Bi2223 composite tape

    International Nuclear Information System (INIS)

    Ochiai, S; Matsubayashi, H; Okuda, H; Osamura, K; Otto, A; Malozemoff, A

    2009-01-01

    Distributions of local and overall critical currents and correlation of n value to the critical current of bent Bi2223 composite tape were studied from the statistical viewpoint. The data of the local and overall transport critical currents and n values of the Bi2223 composite tape specimens were collected experimentally for a wide range of bending strain (0-1.1%) by using the specimens, designed so as to characterize the local and overall critical currents and n values. The measured local and overall critical currents were analyzed with various types of Weibull distribution function. Which of the Weibull distribution functions is suitable for the description of the distribution of local and overall critical currents at each bending strain, and also how much the Weibull parameter values characterizing the distribution vary with bending strain, were revealed. Then we attempted to reproduce the overall critical current distribution and correlation of the overall n value to the overall critical current from the distribution of local critical currents and the correlation of the local n value to the local critical current by a Monte Carlo simulation. The measured average values of critical current and n value at each bending strain and the correlation of n value to critical current were reproduced well by the present simulation, while the distribution of critical current values was reproduced fairly well but not fully. The reason for this is discussed.

  17. Scaling in the distribution of intertrade durations of Chinese stocks

    Science.gov (United States)

    Jiang, Zhi-Qiang; Chen, Wei; Zhou, Wei-Xing

    2008-10-01

    The distribution of intertrade durations, defined as the waiting times between two consecutive transactions, is investigated based upon the limit order book data of 23 liquid Chinese stocks listed on the Shenzhen Stock Exchange in the whole year 2003. A scaling pattern is observed in the distributions of intertrade durations, where the empirical density functions of the normalized intertrade durations of all 23 stocks collapse onto a single curve. The scaling pattern is also observed in the intertrade duration distributions for filled and partially filled trades and in the conditional distributions. The ensemble distributions for all stocks are modeled by the Weibull and the Tsallis q-exponential distributions. Maximum likelihood estimation shows that the Weibull distribution outperforms the q-exponential for not-too-large intertrade durations which account for more than 98.5% of the data. Alternatively, nonlinear least-squares estimation selects the q-exponential as a better model, in which the optimization is conducted on the distance between empirical and theoretical values of the logarithmic probability densities. The distribution of intertrade durations is Weibull followed by a power-law tail with an asymptotic tail exponent close to 3.

  18. Expectation Maximization Algorithm for Box-Cox Transformation Cure Rate Model and Assessment of Model Misspecification Under Weibull Lifetimes.

    Science.gov (United States)

    Pal, Suvra; Balakrishnan, Narayanaswamy

    2018-05-01

    In this paper, we develop likelihood inference based on the expectation maximization algorithm for the Box-Cox transformation cure rate model, assuming the lifetimes to follow a Weibull distribution. A simulation study is carried out to demonstrate the performance of the proposed estimation method. Through Monte Carlo simulations, we also study the effect of model misspecification on the estimate of the cure rate. Finally, we analyze a well-known data set on melanoma with the model and the inferential method developed here.

  19. Stochastic Analysis of Wind Energy for Wind Pump Irrigation in Coastal Andhra Pradesh, India

    Science.gov (United States)

    Raju, M. M.; Kumar, A.; Bisht, D.; Rao, D. B.

    2014-09-01

    The rapid escalation in the prices of oil and gas, as well as the increasing demand for energy, has attracted the attention of scientists and researchers to explore the possibility of generating and utilizing the alternative and renewable sources of wind energy in the long coastal belt of India with considerable wind energy resources. A detailed analysis of wind potential is a prerequisite to harvest the wind energy resources efficiently. Keeping this in view, the present study was undertaken to analyze the wind energy potential to assess the feasibility of a wind-pump operated irrigation system in the coastal region of Andhra Pradesh, India, where high ground water table conditions are available. The wind speed data were analyzed stochastically to fit a probability distribution that describes the wind energy potential in the region. The normal and Weibull probability distributions were tested, and on the basis of the chi-square test, the Weibull distribution gave better results. Hence, it was concluded that the Weibull probability distribution may be used to stochastically describe the annual wind speed data of coastal Andhra Pradesh with better accuracy. The size of the complete irrigation system was determined with mass curve analysis to satisfy various daily irrigation demands at different risk levels.

  20. Bayesian analysis of general failure data from an ageing distribution: advances in numerical methods

    International Nuclear Information System (INIS)

    Procaccia, H.; Villain, B.; Clarotti, C.A.

    1996-01-01

    EDF and ENEA carried out a joint research program for developing the numerical methods and computer codes needed for Bayesian analysis of component lives in the case of ageing. Early results of this study were presented at ESREL'94. Since then, the following further steps have been taken: input data have been generalized to the case where observed lives are censored both on the right and on the left; allowable life distributions are Weibull and gamma, whose parameters are both unknown and can be statistically dependent; allowable priors are histograms relative to different parametrizations of the life distribution of concern; first- and second-order moments of the posterior distributions can be computed. In particular, the covariance gives some important information about the degree of statistical dependence between the parameters of interest. An application of the code to the appearance of stress corrosion cracking in a tube of the PWR steam generator system is presented. (authors)

  1. Comparision of the different probability distributions for earthquake hazard assessment in the North Anatolian Fault Zone

    Energy Technology Data Exchange (ETDEWEB)

    Yilmaz, Şeyda, E-mail: seydayilmaz@ktu.edu.tr; Bayrak, Erdem, E-mail: erdmbyrk@gmail.com [Karadeniz Technical University, Trabzon (Turkey); Bayrak, Yusuf, E-mail: bayrak@ktu.edu.tr [Ağrı İbrahim Çeçen University, Ağrı (Turkey)

    2016-04-18

    In this study we examined and compared three different probability distributions for determining the most suitable model in the probabilistic assessment of earthquake hazards. We analyzed a reliable homogeneous earthquake catalogue covering the period 1900-2015 for magnitude M ≥ 6.0 and estimated the probabilistic seismic hazard in the North Anatolian Fault zone (39°-41° N, 30°-40° E) using three distributions, namely the Weibull distribution, the Frechet distribution and the three-parameter Weibull distribution. The suitability of the distribution parameters was evaluated with the Kolmogorov-Smirnov (K-S) goodness-of-fit test. We also compared the estimated cumulative probability and the conditional probabilities of occurrence of earthquakes for different elapsed times using these three distributions. We used Easyfit and Matlab software to calculate the distribution parameters and to plot the conditional probability curves. We concluded that the Weibull distribution was more suitable than the other distributions in this region.
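    The conditional probabilities of occurrence mentioned above follow from the fitted distribution's CDF; a small sketch for the two-parameter Weibull case, with purely illustrative shape and scale values, is shown below:

      # Sketch: P(next event within dt years | t years elapsed) for a Weibull
      # recurrence model, using F(t) = 1 - exp(-(t/c)**k); k and c are assumed values.
      import numpy as np

      def weibull_cdf(t, k, c):
          return 1.0 - np.exp(-(t / c) ** k)

      k, c = 1.8, 60.0                      # shape and scale (years), illustrative
      dt = 30.0
      for t_elapsed in (10.0, 40.0, 80.0):
          num = weibull_cdf(t_elapsed + dt, k, c) - weibull_cdf(t_elapsed, k, c)
          den = 1.0 - weibull_cdf(t_elapsed, k, c)
          print(f"elapsed {t_elapsed:>4.0f} yr: P(event within {dt:.0f} yr) = {num / den:.2f}")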

  2. Utilization of Weibull equation to obtain soil-water diffusivity in horizontal infiltration

    International Nuclear Information System (INIS)

    Guerrini, I.A.

    1982-06-01

    Water movement was studied in horizontal infiltration experiments using laboratory columns of air-dry and homogeneous soil to obtain a simple and suitable equation for soil-water diffusivity. Many water content profiles for each of the ten soil columns utilized were obtained through the gamma-ray attenuation technique using a 137Cs source. During the measurement of a particular water content profile, the soil column was held in the same position in order to measure changes in time and so reduce the errors in water content determination. The Weibull equation utilized was excellent in fitting the experimental water content profile data. The use of an analytical function for ν, the Boltzmann variable, according to the Weibull model, allowed a simple equation for soil-water diffusivity to be obtained. Comparisons were made between the diffusivity equation obtained here and other solutions found in the literature, and the unsuitability of a simple exponential variation of diffusivity with water content over the full range of the latter was shown. The necessity of admitting the time dependency of diffusivity was confirmed, and the possibility of fixing that dependency at a well-known value, extended to generalized soil-water infiltration studies, was also found. Finally, it was shown that the soil-water diffusivity function given by the equation proposed here can be obtained just by analysis of the wetting front advance as a function of time. (Author)

  3. The Distribution of Minimum of Ratios of Two Random Variables and Its Application in Analysis of Multi-hop Systems

    Directory of Open Access Journals (Sweden)

    A. Stankovic

    2012-12-01

    Full Text Available The distributions of random variables are of interest in many areas of science. In this paper, in view of the importance of multi-hop transmission in contemporary wireless communication systems operating over fading channels in the presence of cochannel interference, the probability density functions (PDFs) of the minimum of an arbitrary number of ratios of Rayleigh, Rician, Nakagami-m, Weibull and α-µ random variables are derived. These expressions can be used to study the outage probability as an important multi-hop system performance measure. Various numerical results complement the proposed mathematical analysis.

  4. Efficient Weibull channel model for salinity induced turbulent underwater wireless optical communications

    KAUST Repository

    Oubei, Hassan M.; Zedini, Emna; Elafandy, Rami T.; Kammoun, Abla; Ng, Tien Khee; Alouini, Mohamed-Slim; Ooi, Boon S.

    2017-01-01

    Recent advances in underwater wireless optical communications necessitate a better understanding of the underwater channel. We propose the Weibull model to characterize the fading of salinity induced turbulent underwater wireless optical channels

  5. Test Population Selection from Weibull-Based, Monte Carlo Simulations of Fatigue Life

    Science.gov (United States)

    Vlcek, Brian L.; Zaretsky, Erwin V.; Hendricks, Robert C.

    2012-01-01

    Fatigue life is probabilistic and not deterministic. Experimentally establishing the fatigue life of materials, components, and systems is both time consuming and costly. As a result, conclusions regarding fatigue life are often inferred from a statistically insufficient number of physical tests. A proposed methodology for comparing life results as a function of variability due to Weibull parameters, variability between successive trials, and variability due to the size of the experimental population is presented. Using Monte Carlo simulation of randomly selected lives from a large Weibull distribution, the variation in the L10 fatigue life of aluminum alloy AL6061 rotating rod fatigue tests was determined as a function of population size. These results were compared to the L10 fatigue lives of small (10 each) populations from AL2024, AL7075 and AL6061. For aluminum alloy AL6061, a simple algebraic relationship was established for the upper and lower L10 fatigue life limits as a function of the number of specimens failed. For most engineering applications where less than 30 percent variability can be tolerated in the maximum and minimum values, at least 30 to 35 test samples are necessary. The variability of test results based on small sample sizes can be greater than the actual differences, if any, that exist between materials and can result in erroneous conclusions. The fatigue life of AL2024 is statistically longer than that of AL6061 and AL7075. However, there is no statistical difference between the fatigue lives of AL6061 and AL7075, even though AL7075 had a fatigue life 30 percent greater than AL6061.
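    The core of the Monte Carlo idea described above can be sketched in a few lines: draw repeated samples of fatigue lives from a parent Weibull distribution and observe how the estimated L10 life spreads as the test population shrinks (the parent parameters below are illustrative, not the AL6061 values):

      # Sketch: spread of the estimated L10 (10th percentile) life versus sample size,
      # for lives drawn from an assumed parent Weibull distribution.
      import numpy as np

      rng = np.random.default_rng(4)
      beta, eta = 2.0, 1.0e6            # parent shape and characteristic life (cycles)

      for n in (10, 35, 100):
          l10 = [np.percentile(rng.weibull(beta, n) * eta, 10) for _ in range(2000)]
          lo, hi = np.percentile(l10, [5, 95])
          print(f"n = {n:3d}: 90% of estimated L10 values fall in [{lo:,.0f}, {hi:,.0f}] cycles")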

  6. comparison of estimation methods for fitting weibull distribution

    African Journals Online (AJOL)

    Tersor

    Tree diameter characterisation using probability distribution functions is essential for determining the structure of forest stands. This has been an intrinsic part of forest management planning, decision-making and research in recent times. The distribution of species and tree size in a forest area gives the structure of the stand.

  7. Prediction and reconstruction of future and missing unobservable modified Weibull lifetime based on generalized order statistics

    Directory of Open Access Journals (Sweden)

    Amany E. Aly

    2016-04-01

    Full Text Available When a system consists of independent components of the same type, appropriate actions may be taken as soon as a portion of them have failed. It is, therefore, important to be able to predict later failure times from earlier ones. One of the well-known failure distributions commonly used to model component life is the modified Weibull distribution (MWD). In this paper, two pivotal quantities are proposed to construct prediction intervals for future unobservable lifetimes based on generalized order statistics (gos) from the MWD. Moreover, a pivotal quantity is developed to reconstruct missing observations at the beginning of the experiment. Furthermore, Monte Carlo simulation studies are conducted and numerical computations are carried out to investigate the efficiency of the presented results. Finally, two illustrative examples for real data sets are analyzed.

  8. Statistical wind analysis for near-space applications

    Science.gov (United States)

    Roney, Jason A.

    2007-09-01

    Statistical wind models were developed based on the existing observational wind data for near-space altitudes between 60 000 and 100 000 ft (18-30 km) above ground level (AGL) at two locations, Akron, OH, USA, and White Sands, NM, USA. These two sites are envisioned as playing a crucial role in the first flights of high-altitude airships. The analysis shown in this paper has not been previously applied to this region of the stratosphere for such an application. Standard statistics were compiled for these data, such as mean, median, maximum wind speed, and standard deviation, and the data were modeled with Weibull distributions. These statistics indicated that, on a yearly average, there is a lull or a “knee” in the wind between 65 000 and 72 000 ft AGL (20-22 km). From the standard statistics, trends at both locations indicated substantial seasonal variation in the mean wind speed at these heights. The yearly and monthly statistical modeling indicated that Weibull distributions were a reasonable model for the data. Forecasts and hindcasts were done by using a Weibull model based on 2004 data and comparing the model with the 2003 and 2005 data. The 2004 distribution was also a reasonable model for these years. Lastly, the Weibull distribution and cumulative function were used to predict the 50%, 95%, and 99% winds, which are directly related to the expected power requirements of a near-space station-keeping airship. These values indicated that using only the standard deviation of the mean may underestimate the operational conditions.
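    The 50%, 95% and 99% winds referred to above come directly from the Weibull quantile function v_p = c(-ln(1 - p))^(1/k); a short sketch with illustrative shape and scale values:

      # Sketch: wind-speed quantiles from a fitted Weibull model; k and c are placeholders.
      from math import log

      k, c = 1.6, 12.0          # shape and scale (m/s), illustrative
      for p in (0.50, 0.95, 0.99):
          v = c * (-log(1.0 - p)) ** (1.0 / k)
          print(f"{int(p * 100)}% wind: {v:.1f} m/s")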

  9. The q-Weibull model in reliability, dynamic fault trees and maintenance implementation

    OpenAIRE

    Assis, Edilson Machado de

    2013-01-01

    The q-Weibull distribution was applied to reliability analysis. It is a four-parameter generalization of a distribution widely used in reliability, the Weibull distribution, which has three parameters. The Weibull distribution is based on the exponential function of the negative of a power. The q-Weibull distribution uses a generalization of the exponential function, called the q-exponential, which exhibits asymptotic power-law behavior and recovers ...

  10. The Level of Calculation Errors in the Case of Using the Weibull Distribution for Estimating the Eolian Potential According to the Real Potential Based on Effective Measurements of the Weather Characteristics

    Directory of Open Access Journals (Sweden)

    Ioan Ion

    2011-09-01

    Full Text Available The management of an investment program in wind power generators must consider a proper evaluation of the possibilities offered by the location where the park will be sited and will operate: the available electric networks, access roads, the shape of the relief, climate, extreme weather phenomena, the average wind speed, etc. Among the items listed above, the most important is the wind potential of the area, quantified mainly by the multi-annual average wind speed. Evaluation without special measurements can be done based on general information, such as measurements obtained from meteorological stations, using the Weibull distribution for wind speed. When the results of measurements of the weather characteristics are used, the evaluation is closer to the real multi-annual potential.

  11. Modelling Wind for Wind Farm Layout Optimization Using Joint Distribution of Wind Speed and Wind Direction

    OpenAIRE

    Ju Feng; Wen Zhong Shen

    2015-01-01

    Reliable wind modelling is of crucial importance for wind farm development. The common practice of using sector-wise Weibull distributions has been found inappropriate for wind farm layout optimization. In this study, we propose a simple and easily implementable method to construct joint distributions of wind speed and wind direction, which is based on the parameters of sector-wise Weibull distributions and interpolations between direction sectors. It is applied to the wind measurement data a...

  12. Impact of Blending on Strength Distribution of Ambient Cured Metakaolin and Palm Oil Fuel Ash Based Geopolymer Mortar

    Directory of Open Access Journals (Sweden)

    Taliat Ola Yusuf

    2014-01-01

    Full Text Available This paper investigates the influence of blending of metakaolin with silica-rich palm oil fuel ash (POFA) on the strength distribution of geopolymer mortar. The broadness of the strength distribution of quasi-brittle to brittle materials depends strongly on the existence of flaws such as voids, microcracks, and impurities in the material. Blending of materials containing alumina and silica with the objective of improving the performance of geopolymer makes comprehensive characterization necessary. The Weibull distribution is used to study the strength distribution and the reliability of geopolymer mortar specimens prepared from 100% metakaolin and from 50% and 70% palm oil fuel ash replacement, and cured under ambient conditions. Mortar prisms and cubes were used to test the materials in flexure and compression, respectively, at 28 days, and the results were analyzed using the Weibull distribution. In flexure, the Weibull modulus increased with POFA replacement, indicating a reduced broadness of the strength distribution from an increased homogeneity of the material. The modulus, however, decreased with increasing replacement of POFA in the specimens tested under compression. It is concluded that the Weibull distribution is suitable for analyses of the blended geopolymer system. While the porous microstructure is mainly responsible for flexural failure, the heterogeneity of reaction relics is responsible for the compression failure.

  13. Reliability demonstration test for load-sharing systems with exponential and Weibull components.

    Directory of Open Access Journals (Sweden)

    Jianyu Xu

    Full Text Available Conducting a Reliability Demonstration Test (RDT is a crucial step in production. Products are tested under certain schemes to demonstrate whether their reliability indices reach pre-specified thresholds. Test schemes for RDT have been studied in different situations, e.g., lifetime testing, degradation testing and accelerated testing. Systems designed with several structures are also investigated in many RDT plans. Despite the availability of a range of test plans for different systems, RDT planning for load-sharing systems hasn't yet received the attention it deserves. In this paper, we propose a demonstration method for two specific types of load-sharing systems with components subject to two distributions: exponential and Weibull. Based on the assumptions and interpretations made in several previous works on such load-sharing systems, we set the mean time to failure (MTTF of the total system as the demonstration target. We represent the MTTF as a summation of mean time between successive component failures. Next, we introduce generalized test statistics for both the underlying distributions. Finally, RDT plans for the two types of systems are established on the basis of these test statistics.

  14. Compositional Analyses and Shelf-Life Modeling of Njangsa (Ricinodendron heudelotii) Seed Oil Using the Weibull Hazard Analysis.

    Science.gov (United States)

    Abaidoo-Ayin, Harold K; Boakye, Prince G; Jones, Kerby C; Wyatt, Victor T; Besong, Samuel A; Lumor, Stephen E

    2017-08-01

    This study investigated the compositional characteristics and shelf-life of Njangsa seed oil (NSO). Oil from Njangsa had a high polyunsaturated fatty acid (PUFA) content of which alpha eleostearic acid (α-ESA), an unusual conjugated linoleic acid was the most prevalent (about 52%). Linoleic acid was also present in appreciable amounts (approximately 34%). Our investigations also indicated that the acid-catalyzed transesterification of NSO resulted in lower yields of α-ESA methyl esters, due to isomerization, a phenomenon which was not observed under basic conditions. The triacylglycerol (TAG) profile analysis showed the presence of at least 1 α-ESA fatty acid chain in more than 95% of the oil's TAGs. Shelf-life was determined by the Weibull Hazard Sensory Method, where the end of shelf-life was defined as the time at which 50% of panelists found the flavor of NSO to be unacceptable. This was determined as 21 wk. Our findings therefore support the potential commercial viability of NSO as an important source of physiologically beneficial PUFAs. © 2017 Institute of Food Technologists®.

  15. Modified Weibull theory and stress-concentration factors of polycrystalline graphite

    International Nuclear Information System (INIS)

    Ho, F.H.

    1980-12-01

    Stress concentration factors (SCF) due to geometric discontinuities in graphite specimens are observed to be much less than the theoretical SCF in an elastic material. In fact, the experimental SCF is always less than two and sometimes even less than one. A four parameter Weibull theory which recognizes the grain size effect is found to give an adequate explanation of the above observed discrepancies

  16. Efficient Weibull channel model for salinity induced turbulent underwater wireless optical communications

    KAUST Repository

    Oubei, Hassan M.

    2017-12-13

    Recent advances in underwater wireless optical communications necessitate a better understanding of the underwater channel. We propose the Weibull model to characterize the fading of salinity induced turbulent underwater wireless optical channels. The model shows an excellent agreement with the measured data under all channel conditions.

  17. Statistical Analysis of Data for Timber Strengths

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Hoffmeyer, P.

    Statistical analyses are performed for material strength parameters from approximately 6700 specimens of structural timber. Non-parametric statistical analyses and fits to the following distributions types have been investigated: Normal, Lognormal, 2 parameter Weibull and 3-parameter Weibull...

  18. Likelihood inference for COM-Poisson cure rate model with interval-censored data and Weibull lifetimes.

    Science.gov (United States)

    Pal, Suvra; Balakrishnan, N

    2017-10-01

    In this paper, we consider a competing cause scenario and assume the number of competing causes to follow a Conway-Maxwell Poisson distribution which can capture both over and under dispersion that is usually encountered in discrete data. Assuming the population of interest having a component cure and the form of the data to be interval censored, as opposed to the usually considered right-censored data, the main contribution is in developing the steps of the expectation maximization algorithm for the determination of the maximum likelihood estimates of the model parameters of the flexible Conway-Maxwell Poisson cure rate model with Weibull lifetimes. An extensive Monte Carlo simulation study is carried out to demonstrate the performance of the proposed estimation method. Model discrimination within the Conway-Maxwell Poisson distribution is addressed using the likelihood ratio test and information-based criteria to select a suitable competing cause distribution that provides the best fit to the data. A simulation study is also carried out to demonstrate the loss in efficiency when selecting an improper competing cause distribution which justifies the use of a flexible family of distributions for the number of competing causes. Finally, the proposed methodology and the flexibility of the Conway-Maxwell Poisson distribution are illustrated with two known data sets from the literature: smoking cessation data and breast cosmesis data.

  19. Development of a Weibull model of cleavage fracture toughness for shallow flaws in reactor pressure vessel material

    Energy Technology Data Exchange (ETDEWEB)

    Bass, B.R.; Williams, P.T.; McAfee, W.J.; Pugh, C.E. [Oak Ridge National Lab., Heavy-Section Steel Technology Program, Oak Ridge, TN (United States)

    2001-07-01

    A primary objective of the United States Nuclear Regulatory Commission (USNRC)-sponsored Heavy-Section Steel Technology (HSST) Program is to develop and validate technology applicable to quantitative assessments of fracture prevention margins in nuclear reactor pressure vessels (RPVs) containing flaws and subjected to service-induced material toughness degradation. This paper describes an experimental/analytical program for the development of a Weibull statistical model of cleavage fracture toughness for applications to shallow surface-breaking and embedded flaws in RPV materials subjected to multi-axial loading conditions. The experimental part includes both material characterization testing and larger fracture toughness experiments conducted using a special-purpose cruciform beam specimen developed by Oak Ridge National Laboratory for applying biaxial loads to shallow cracks. Test materials (pressure vessel steels) included plate product forms (conforming to ASTM A533 Grade B Class 1 specifications) and shell segments procured from a pressurized-water reactor vessel intended for a nuclear power plant. Results from tests performed on cruciform specimens demonstrated that biaxial loading can have a pronounced effect on shallow-flaw fracture toughness in the lower-transition temperature region. A local approach methodology based on a three-parameter Weibull model was developed to correlate these experimentally-observed biaxial effects on fracture toughness. The Weibull model, combined with a new hydrostatic stress criterion in place of the more commonly used maximum principal stress in the kernel of the Weibull stress integral definition, is shown to provide a scaling mechanism between uniaxial and biaxial loading states for 2-dimensional flaws located in the A533-B plate material. The Weibull stress density was introduced as a metric for identifying regions along a semi-elliptical flaw front that have a higher probability of cleavage initiation. Cumulative
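
    The functional form referred to here is, in outline, a three-parameter Weibull cumulative failure probability. The sketch below evaluates it for a few toughness values; the threshold, scale and modulus values are purely illustrative, not the calibrated HSST parameters, and the Weibull stress formulation itself is omitted for simplicity.

```python
# Sketch: cumulative cleavage failure probability under a three-parameter
# Weibull model, P_f = 1 - exp(-((K_Jc - K_min) / (K_0 - K_min))**m).
# Parameter values are illustrative only, not the calibrated values from the paper.
import numpy as np

def cleavage_failure_probability(k_jc, k_min=20.0, k_0=110.0, m=4.0):
    """Failure probability for toughness values k_jc (MPa*sqrt(m))."""
    k_jc = np.asarray(k_jc, dtype=float)
    x = np.clip(k_jc - k_min, 0.0, None) / (k_0 - k_min)
    return 1.0 - np.exp(-x**m)

for k in (40.0, 80.0, 120.0, 160.0):
    print(f"K_Jc = {k:5.1f}  ->  P_f = {cleavage_failure_probability(k):.3f}")
```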

  20. Development of a Weibull model of cleavage fracture toughness for shallow flaws in reactor pressure vessel material

    International Nuclear Information System (INIS)

    Bass, B.R.; Williams, P.T.; McAfee, W.J.; Pugh, C.E.

    2001-01-01

    A primary objective of the United States Nuclear Regulatory Commission (USNRC)-sponsored Heavy-Section Steel Technology (HSST) Program is to develop and validate technology applicable to quantitative assessments of fracture prevention margins in nuclear reactor pressure vessels (RPVs) containing flaws and subjected to service-induced material toughness degradation. This paper describes an experimental/analytical program for the development of a Weibull statistical model of cleavage fracture toughness for applications to shallow surface-breaking and embedded flaws in RPV materials subjected to multi-axial loading conditions. The experimental part includes both material characterization testing and larger fracture toughness experiments conducted using a special-purpose cruciform beam specimen developed by Oak Ridge National Laboratory for applying biaxial loads to shallow cracks. Test materials (pressure vessel steels) included plate product forms (conforming to ASTM A533 Grade B Class 1 specifications) and shell segments procured from a pressurized-water reactor vessel intended for a nuclear power plant. Results from tests performed on cruciform specimens demonstrated that biaxial loading can have a pronounced effect on shallow-flaw fracture toughness in the lower-transition temperature region. A local approach methodology based on a three-parameter Weibull model was developed to correlate these experimentally-observed biaxial effects on fracture toughness. The Weibull model, combined with a new hydrostatic stress criterion in place of the more commonly used maximum principal stress in the kernel of the Weibull stress integral definition, is shown to provide a scaling mechanism between uniaxial and biaxial loading states for 2-dimensional flaws located in the A533-B plate material. The Weibull stress density was introduced as a metric for identifying regions along a semi-elliptical flaw front that have a higher probability of cleavage initiation. Cumulative

  1. A study of optimization problem for amplify-and-forward relaying over weibull fading channels

    KAUST Repository

    Ikki, Salama Said; Aissa, Sonia

    2010-01-01

    This paper addresses the power allocation and relay positioning problems in amplify-and-forward cooperative networks operating in Weibull fading environments. We study adaptive power allocation (PA) with fixed relay location, optimal relay location

  2. Evaluation of wind power production prospective and Weibull parameter estimation methods for Babaurband, Sindh Pakistan

    International Nuclear Information System (INIS)

    Khahro, Shahnawaz Farhan; Tabbassum, Kavita; Soomro, Amir Mahmood; Dong, Lei; Liao, Xiaozhong

    2014-01-01

    Highlights: • Weibull scale and shape parameters are calculated using 5 numerical methods. • Yearly mean wind speed is 6.712 m/s at 80 m height, with the highest in May at 9.595 m/s. • Yearly mean WPD is 310 W/m² and available energy density is 2716 kWh/m² at 80 m height. • The probability of higher wind speeds is greater in spring and summer than in autumn and winter. • The estimated cost per kWh of electricity from wind is calculated as 0.0263 US$/kWh. - Abstract: Pakistan is currently experiencing an acute shortage of energy and urgently needs new sources of affordable energy that could alleviate the misery of the energy-starved masses. At present the government is not only expanding conventional energy sources like hydel and thermal but also focusing on the immense potential of renewable energy sources like solar, wind, biogas and waste-to-energy. The recent worldwide economic crisis, global warming and climate change have also emphasized the need for utilizing economically feasible energy sources with the lowest carbon emissions. Wind energy, with its sustainability and low environmental impact, is highly prominent. The aim of this paper is to explore the wind power production prospects of one of the sites in the southern region of Pakistan. It is worth mentioning here that this type of detailed analysis has rarely been done for any location in Pakistan. Wind power densities and frequency distributions of wind speed at four different altitudes, along with the estimated wind power expected to be generated through commercial wind turbines, are calculated. An analysis and comparison of 5 numerical methods is presented in this paper to determine the Weibull scale and shape parameters for the available wind data. The yearly mean wind speed of the considered site is 6.712 m/s with a power density of 310 W/m² at 80 m height, with high power density during April to August (highest in May with wind speed 9.595 m/s and power density 732 W/m²). Economic evaluation, to exemplify feasibility
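
    As one illustration of the kind of calculation involved, the sketch below estimates the Weibull shape and scale by the empirical (standard deviation) method and converts them to a mean wind power density; the wind speed sample is synthetic and the method shown is only one of the several numerical methods compared in the paper.

```python
# Sketch: estimating Weibull shape (k) and scale (c) from a wind speed sample
# and computing the mean wind power density. The wind speeds are synthetic.
import numpy as np
from scipy.special import gamma

rng = np.random.default_rng(1)
v = 7.5 * rng.weibull(2.0, size=8760)          # synthetic hourly wind speeds (m/s)

# Empirical (standard deviation) method:
k = (v.std() / v.mean()) ** (-1.086)           # shape parameter
c = v.mean() / gamma(1.0 + 1.0 / k)            # scale parameter (m/s)

rho = 1.225                                     # air density (kg/m^3)
wpd = 0.5 * rho * c**3 * gamma(1.0 + 3.0 / k)   # mean wind power density (W/m^2)
print(f"k = {k:.2f}, c = {c:.2f} m/s, power density = {wpd:.0f} W/m^2")
```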

  3. Assessing a Tornado Climatology from Global Tornado Intensity Distributions

    OpenAIRE

    Feuerstein, B.; Dotzek, N.; Grieser, J.

    2005-01-01

    Recent work demonstrated that the shape of tornado intensity distributions from various regions worldwide is well described by Weibull functions. This statistical modeling revealed a strong correlation between the fit parameters c for shape and b for scale regardless of the data source. In the present work it is shown that the quality of the Weibull fits is optimized if only tornado reports of F1 and higher intensity are used and that the c–b correlation does indeed reflect a universal featur...

  4. Two sample Bayesian prediction intervals for order statistics based on the inverse exponential-type distributions using right censored sample

    Directory of Open Access Journals (Sweden)

    M.M. Mohie El-Din

    2011-10-01

    Full Text Available In this paper, two sample Bayesian prediction intervals for order statistics (OS) are obtained. This prediction is based on a certain class of the inverse exponential-type distributions using a right censored sample. A general class of prior density functions is used and the predictive cumulative function is obtained in the two samples case. The class of the inverse exponential-type distributions includes several important distributions such as the inverse Weibull distribution, the inverse Burr distribution, the loglogistic distribution, the inverse Pareto distribution and the inverse paralogistic distribution. Special cases of the inverse Weibull model such as the inverse exponential model and the inverse Rayleigh model are considered.

  5. On Selection of the Probability Distribution for Representing the Maximum Annual Wind Speed in East Cairo, Egypt

    International Nuclear Information System (INIS)

    El-Shanshoury, Gh. I.; El-Hemamy, S.T.

    2013-01-01

    The main objective of this paper is to identify an appropriate probability model and the best plotting position formula to represent the maximum annual wind speed in east Cairo. This model can be used to estimate the extreme wind speed and return period at a particular site, as well as to determine the radioactive release distribution in case of an accident at a nuclear power plant. Wind speed probabilities can be estimated by using probability distributions. An accurate determination of the probability distribution for maximum wind speed data is very important for estimating the extreme value. The probability plots of the maximum annual wind speed (MAWS) in east Cairo are fitted to six major statistical distributions, namely Gumbel, Weibull, Normal, Log-Normal, Logistic and Log-Logistic, while eight plotting positions of Hosking and Wallis, Hazen, Gringorten, Cunnane, Blom, Filliben, Benard and Weibull are used for determining their exceedance probabilities. A proper probability distribution for representing the MAWS is selected by the statistical test criteria in frequency analysis. Therefore, the best plotting position formula which can be used to select the appropriate probability model representing the MAWS data must be determined. The statistical test criteria, represented by the probability plot correlation coefficient (PPCC), the root mean square error (RMSE), the relative root mean square error (RRMSE) and the maximum absolute error (MAE), are used to select the appropriate plotting position and distribution. The data obtained show that the maximum annual wind speed in east Cairo varies from 44.3 km/h to 96.1 km/h over a period of 39 years. The Weibull plotting position combined with the Normal distribution gave the best fit and the most reliable and accurate predictions of wind speed in the study area, having the highest value of PPCC and the lowest values of RMSE, RRMSE and MAE.
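
    A minimal sketch of the plotting-position/PPCC machinery described here is given below; it scores the Weibull plotting position against a Normal candidate distribution on a synthetic sample of annual maxima, and is not the full eight-position, six-distribution comparison carried out in the paper.

```python
# Sketch: scoring one plotting-position formula (Weibull, p_i = i/(n+1)) against a
# candidate Normal distribution with the probability plot correlation coefficient
# (PPCC) and an RMSE of fitted quantiles. Annual-maximum wind speeds are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
maxima = np.sort(rng.normal(70.0, 12.0, size=39))   # synthetic annual maxima (km/h)

n = maxima.size
i = np.arange(1, n + 1)
p_weibull = i / (n + 1.0)                 # Weibull plotting position
quantiles = stats.norm.ppf(p_weibull)     # standard Normal quantiles

ppcc = np.corrcoef(quantiles, maxima)[0, 1]
fitted = stats.norm.ppf(p_weibull, *stats.norm.fit(maxima))
rmse = np.sqrt(np.mean((fitted - maxima) ** 2))
print(f"PPCC = {ppcc:.4f}, RMSE = {rmse:.2f} km/h")
```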

  6. Observations in the statistical analysis of NBG-18 nuclear graphite strength tests

    International Nuclear Information System (INIS)

    Hindley, Michael P.; Mitchell, Mark N.; Blaine, Deborah C.; Groenwold, Albert A.

    2012-01-01

    Highlights: ► Statistical analysis of NBG-18 nuclear graphite strength tests. ► A Weibull distribution and a normal distribution are tested for all data. ► A bimodal distribution in the CS data is confirmed. ► The CS data set has the lowest variance. ► A combined data set is formed and has a Weibull distribution. - Abstract: The purpose of this paper is to report on the selection of a statistical distribution chosen to represent the experimental material strength of NBG-18 nuclear graphite. Three large sets of samples were tested during the material characterisation of the Pebble Bed Modular Reactor and Core Structure Ceramics materials. These sets of samples are tensile strength, flexural strength and compressive strength (CS) measurements. A relevant statistical fit is determined and the goodness of fit is also evaluated for each data set. The data sets are also normalised for ease of comparison, and combined into one representative data set. The validity of this approach is demonstrated. A second failure mode distribution is found in the CS test data. Identifying this failure mode supports similar observations made in the past. The success of fitting the Weibull distribution through the normalised data sets allows us to improve the basis for the estimates of the variability. This could also imply that the variability in graphite strength for the different strength measures is based on the same flaw distribution and is thus a property of the material.

  7. Análise de distribuição de chuva para Santa Maria, RS Analysis of rainfall distribution for Santa Maria, RS, Brazil

    Directory of Open Access Journals (Sweden)

    Joel C. da Silva

    2007-02-01

    Full Text Available The objectives of this study were to analyze the distribution of total daily rainfall and of the number of rainy days, and to determine the variation of the probability of daily precipitation during the months of the year in Santa Maria, Rio Grande do Sul State, Brazil. The rainfall data used were obtained from 36 years of observation at the Climatological Station of the 8th District of Meteorology, located at the Federal University of Santa Maria (29º 43' 23" S, 53º 43' 15" W, altitude 95 m). The following probability distribution functions were tested: gamma, Weibull, normal, lognormal and exponential. The functions that best described the frequency distributions were the gamma and the Weibull. The largest number of rainy days occurred during the winter months, but the rainfall amount on those days was smaller, resulting in similar monthly totals for all months of the year.

  8. Probability distribution of machining center failures

    International Nuclear Information System (INIS)

    Jia Yazhou; Wang Molin; Jia Zhixin

    1995-01-01

    Through field tracing research on 24 Chinese cutter-changeable CNC machine tools (machining centers) over a period of one year, a database of operation and maintenance for machining centers was built. The failure data were fitted to the Weibull distribution and the exponential distribution, the goodness of these fits was tested, and the failure distribution pattern of machining centers was found. Finally, reliability characterizations for machining centers are proposed.

  9. Weibull Parameters Estimation Based on Physics of Failure Model

    DEFF Research Database (Denmark)

    Kostandyan, Erik; Sørensen, John Dalsgaard

    2012-01-01

    Reliability estimation procedures are discussed for the example of fatigue development in solder joints using a physics of failure model. The accumulated damage is estimated based on a physics of failure model, the Rainflow counting algorithm and the Miner’s rule. A threshold model is used...... for degradation modeling and failure criteria determination. The time dependent accumulated damage is assumed linearly proportional to the time dependent degradation level. It is observed that the deterministic accumulated damage at the level of unity closely estimates the characteristic fatigue life of Weibull...

  10. A New Generalization of the Lomax Distribution with Increasing, Decreasing, and Constant Failure Rate

    Directory of Open Access Journals (Sweden)

    Pelumi E. Oguntunde

    2017-01-01

    Full Text Available Developing new compound distributions which are more flexible than the existing distributions has become the new trend in distribution theory. In the present study, the Lomax distribution was extended using the Gompertz family of distributions, its resulting densities and statistical properties were carefully derived, and the method of maximum likelihood estimation was proposed for estimating the model parameters. A simulation study to assess the performance of the parameters of the Gompertz Lomax distribution was provided, and an application to real-life data was provided to assess the potential of the newly derived distribution. The analysis indicates that the Gompertz Lomax distribution performed better than the Beta Lomax distribution, Weibull Lomax distribution, and Kumaraswamy Lomax distribution.

  11. A reappraisal of drug release laws using Monte Carlo simulations: the prevalence of the Weibull function.

    Science.gov (United States)

    Kosmidis, Kosmas; Argyrakis, Panos; Macheras, Panos

    2003-07-01

    The aim was to verify the Higuchi law and to study drug release from cylindrical and spherical matrices by means of Monte Carlo computer simulation. A one-dimensional matrix, based on the theoretical assumptions of the derivation of the Higuchi law, was simulated and its time evolution was monitored. Cylindrical and spherical three-dimensional lattices were simulated with sites at the boundary of the lattice denoted as leak sites. Particles were allowed to move inside them using the random walk model. Excluded volume interactions between the particles were assumed. We monitored the system time evolution for different lattice sizes and different initial particle concentrations. The Higuchi law was verified using the Monte Carlo technique in a one-dimensional lattice. It was found that Fickian drug release from cylindrical matrices can be approximated nicely with the Weibull function. A simple linear relation between the Weibull function parameters and the specific surface of the system was found. Drug release from a matrix, as a result of a diffusion process assuming excluded volume interactions between the drug molecules, can be described using a Weibull function. This model, although approximate and semiempirical, has the benefit of providing a simple physical connection between the model parameters and the system geometry, which was something missing from other semiempirical models.
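
    The Weibull release function mentioned here can be fitted directly to a fractional-release curve. The sketch below does so with scipy's curve_fit on synthetic release data; the data points and starting values are assumptions for illustration, not results of the Monte Carlo simulations.

```python
# Sketch: fitting the Weibull release function F(t) = 1 - exp(-a * t**b)
# to a fractional-release profile. The release data are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def weibull_release(t, a, b):
    return 1.0 - np.exp(-a * t**b)

t = np.array([0.5, 1, 2, 4, 6, 8, 12, 24])                          # time (h)
frac = np.array([0.10, 0.18, 0.32, 0.52, 0.65, 0.74, 0.86, 0.97])   # fraction released

(a, b), _ = curve_fit(weibull_release, t, frac, p0=(0.1, 1.0))
print(f"a = {a:.3f}, b = {b:.3f}")   # b is the shape exponent linked to system geometry
```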

  12. An appraisal of wind speed distribution prediction by soft computing methodologies: A comparative study

    International Nuclear Information System (INIS)

    Petković, Dalibor; Shamshirband, Shahaboddin; Anuar, Nor Badrul; Saboohi, Hadi; Abdul Wahab, Ainuddin Wahid; Protić, Milan; Zalnezhad, Erfan; Mirhashemi, Seyed Mohammad Amin

    2014-01-01

    Highlights: • Probabilistic distribution functions of wind speed. • Two parameter Weibull probability distribution. • To build an effective prediction model of distribution of wind speed. • Support vector regression application as probability function for wind speed. - Abstract: The probabilistic distribution of wind speed is among the more significant wind characteristics in examining wind energy potential and the performance of wind energy conversion systems. When the wind speed probability distribution is known, the wind energy distribution can be easily obtained. Therefore, the probability distribution of wind speed is a very important piece of information required in assessing wind energy potential. For this reason, a large number of studies have been established concerning the use of a variety of probability density functions to describe wind speed frequency distributions. Although the two-parameter Weibull distribution comprises a widely used and accepted method, solving the function is very challenging. In this study, the polynomial and radial basis functions (RBF) are applied as the kernel function of support vector regression (SVR) to estimate two parameters of the Weibull distribution function according to previously established analytical methods. Rather than minimizing the observed training error, SVR_poly and SVR_rbf attempt to minimize the generalization error bound, so as to achieve generalized performance. According to the experimental results, enhanced predictive accuracy and capability of generalization can be achieved using the SVR approach compared to other soft computing methodologies.

  13. Lifetime modelling with a Weibull law: comparison of three Bayesian Methods

    International Nuclear Information System (INIS)

    Billy, F.; Remy, E.; Bousquet, N.; Celeux, G.

    2006-01-01

    For a nuclear power plant, being able to estimate the lifetime of important components is strategic. But data is usually insufficient to do so. Thus, it is relevant to use expertise, together with data, in order to assess the value of lifetime on the grounds of both sources. The Bayesian framework and the choice of a Weibull law to model the random time to replacement are relevant. They have been chosen for this article. Two indicators are computed: the mean lifetime of any component and the mean residual lifetime of a given component after it has been inspected. Three different Bayesian methods are compared on three sets of data. The article shows that the three methods lead to coherent results and that uncertainties are strongly reduced. The method developed around PMC has two main advantages: it models a conditional dependence of the two parameters of the Weibull law, which enables more coherent results on the prior; and it has a parameter that weights the strength of the expertise. This last point is very important for lifetime assessments, because expertise is then used not so much to augment too-small samples as to perform a real extrapolation, far beyond what the data themselves say. (authors)
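
    For illustration of the general idea only (not of any of the three specific methods compared in the article), the sketch below computes a grid-approximation posterior for the Weibull shape and scale under weakly informative priors and reports the posterior mean lifetime; the failure times and priors are invented.

```python
# Sketch: grid-approximation Bayesian fit of a Weibull lifetime model, giving the
# posterior mean lifetime. Failure times and priors are illustrative only.
import numpy as np
from scipy import stats
from scipy.special import gamma as gamma_fn

times = np.array([8.1, 11.3, 14.7, 16.0, 19.5, 22.8, 26.4])   # years, illustrative

shapes = np.linspace(0.5, 6.0, 200)
scales = np.linspace(5.0, 60.0, 200)
S, C = np.meshgrid(shapes, scales, indexing="ij")

# Log-likelihood of the data on the (shape, scale) grid.
loglik = sum(stats.weibull_min.logpdf(t, S, scale=C) for t in times)

# Weakly informative independent log-normal priors (an assumption).
logprior = stats.lognorm.logpdf(S, 0.8, scale=2.0) + stats.lognorm.logpdf(C, 1.0, scale=25.0)

post = np.exp(loglik + logprior - (loglik + logprior).max())
post /= post.sum()

mean_life = np.sum(post * C * gamma_fn(1.0 + 1.0 / S))   # E[T] = c * Gamma(1 + 1/k)
print(f"posterior mean lifetime ~ {mean_life:.1f} years")
```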

  14. Introducing a system of wind speed distributions for modeling properties of wind speed regimes around the world

    International Nuclear Information System (INIS)

    Jung, Christopher; Schindler, Dirk; Laible, Jessica; Buchholz, Alexander

    2017-01-01

    Highlights: • Evaluation of statistical properties of 10,016 empirical wind speed distributions. • Analysis of the shape of empirical wind speed distributions by L-moment ratios. • Introduction of a new system of wind speed distributions (Swd). • Random forests classification of the most appropriate distribution. • Comprehensive goodness of Swd fit evaluation on a global scale. - Abstract: Accurate modeling of empirical wind speed distributions is a crucial step in the estimation of average wind turbine power output. For this purpose, the Weibull distribution has often been fitted to empirical wind speed distributions. However, the Weibull distribution has been found to be insufficient to reproduce many wind speed regimes existing around the world. Results from previous studies demonstrate that numerous one-component distributions as well as mixture distributions provide a better goodness-of-fit to empirical wind speed distributions than the Weibull distribution. Moreover, there is considerable interest to apply a single system of distributions that can be utilized to reproduce the large majority of near-surface wind speed regimes existing around the world. Therefore, a system of wind speed distributions was developed that is capable of reproducing the main characteristics of existing wind speed regimes. The proposed system consists of two one-component distributions (Kappa and Wakeby) and one mixture distribution (Burr-Generalized Extreme Value). A random forests classifier was trained in order to select the most appropriate of these three distributions for each of 10,016 globally distributed empirical wind speed distributions. The shape of the empirical wind speed distributions was described by L-moment ratios. The L-moment ratios were used as predictor variables for the random forests classifier. The goodness-of-fit of the system of wind speed distributions was evaluated according to eleven goodness-of-fit metrics, which were merged into one
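
    The L-moment ratios used here as predictor variables can be computed directly from a sorted sample via probability-weighted moments, as in the sketch below; the wind speed sample is synthetic and the estimator shown is the standard unbiased sample version, which may differ in detail from the authors' implementation.

```python
# Sketch: sample L-moment ratios (L-CV, L-skewness, L-kurtosis) of a wind speed
# sample, computed from probability-weighted moments. The wind speeds are synthetic.
import numpy as np

def l_moment_ratios(x):
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    i = np.arange(1, n + 1)
    # Unbiased probability-weighted moments b0..b3.
    b0 = x.mean()
    b1 = np.sum((i - 1) * x) / (n * (n - 1))
    b2 = np.sum((i - 1) * (i - 2) * x) / (n * (n - 1) * (n - 2))
    b3 = np.sum((i - 1) * (i - 2) * (i - 3) * x) / (n * (n - 1) * (n - 2) * (n - 3))
    l1, l2 = b0, 2 * b1 - b0
    l3 = 6 * b2 - 6 * b1 + b0
    l4 = 20 * b3 - 30 * b2 + 12 * b1 - b0
    return l2 / l1, l3 / l2, l4 / l2   # L-CV, L-skewness, L-kurtosis

rng = np.random.default_rng(3)
v = 8.0 * rng.weibull(2.2, size=5000)
print("L-CV, L-skew, L-kurt:", [round(r, 3) for r in l_moment_ratios(v)])
```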

  15. Determining the distribution of fitness effects using a generalized Beta-Burr distribution.

    Science.gov (United States)

    Joyce, Paul; Abdo, Zaid

    2017-07-12

    In Beisel et al. (2007), a likelihood framework, based on extreme value theory (EVT), was developed for determining the distribution of fitness effects for adaptive mutations. In this paper we extend this framework beyond the extreme distributions and develop a likelihood framework for testing whether or not extreme value theory applies. By making two simple adjustments to the Generalized Pareto Distribution (GPD) we introduce a new simple five-parameter probability density function that incorporates nearly every common (continuous) probability model ever used. This means that all of the common models are nested. This has important implications in model selection beyond determining the distribution of fitness effects. However, we demonstrate the use of this distribution utilizing likelihood ratio testing to evaluate alternative distributions to the Gumbel and Weibull domains of attraction of fitness effects. We use a bootstrap strategy, utilizing importance sampling, to determine where in the parameter space the test will be most powerful in detecting deviations from these domains and at what sample size, with focus on small sample sizes (n<20). Our results indicate that the likelihood ratio test is most powerful in detecting deviation from the Gumbel domain when the shape parameters of the model are small, while the test is more powerful in detecting deviations from the Weibull domain when these parameters are large. As expected, an increase in sample size improves the power of the test. This improvement is observed to occur quickly with sample size n≥10 in tests related to the Gumbel domain and n≥15 in the case of the Weibull domain. This manuscript is in tribute to the contributions of Dr. Paul Joyce to the areas of Population Genetics, Probability Theory and Mathematical Statistics. A Tribute section is provided at the end that includes Paul's original writing in the first iterations of this manuscript. The Introduction and Alternatives to the GPD sections

  16. Statistical distributions as applied to environmental surveillance data

    International Nuclear Information System (INIS)

    Speer, D.R.; Waite, D.A.

    1975-09-01

    Application of normal, log normal, and Weibull distributions to environmental surveillance data was investigated for approximately 300 nuclide-medium-year-location combinations. Corresponding W test calculations were made to determine the probability of a particular data set falling within the distribution of interest. Conclusions are drawn as to the fit of any data group to the various distributions. The significance of fitting statistical distributions to the data is discussed

  17. Reliability analysis of nuclear component cooling water system using semi-Markov process model

    International Nuclear Information System (INIS)

    Veeramany, Arun; Pandey, Mahesh D.

    2011-01-01

    Research highlights: → A semi-Markov process (SMP) model is used to evaluate the system failure probability of the nuclear component cooling water (NCCW) system. → SMP is used because it can solve a reliability block diagram with a mixture of redundant repairable and non-repairable components. → The primary objective is to demonstrate that SMP can consider a Weibull failure time distribution for components, while a Markov model cannot. → Result: the variability in component failure time is directly proportional to the NCCW system failure probability. → The result can be utilized as an initiating event probability in probabilistic safety assessment projects. - Abstract: A reliability analysis of the nuclear component cooling water (NCCW) system is carried out. A semi-Markov process model is used in the analysis because it has the potential to solve a reliability block diagram with a mixture of repairable and non-repairable components. With Markov models it is only possible to assume an exponential profile for component failure times. An advantage of the proposed model is the ability to assume a Weibull distribution for the failure time of components. In an attempt to reduce the number of states in the model, it is shown that the use of the poly-Weibull distribution arises. The objective of the paper is to determine the system failure probability under these assumptions. Monte Carlo simulation is used to validate the model result. This result can be utilized as an initiating event probability in probabilistic safety assessment projects.
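
    The Monte Carlo validation mentioned here can be pictured with a much simpler block diagram than the actual NCCW system. The sketch below simulates Weibull failure times for a hypothetical 2-out-of-3 pump group in series with a heat exchanger; the structure and all parameters are invented for illustration.

```python
# Sketch: Monte Carlo estimate of mission failure probability for a simple
# redundancy structure with Weibull-distributed component failure times.
# The block diagram and all parameters are illustrative, not the NCCW system model.
import numpy as np

rng = np.random.default_rng(42)
mission_time = 8760.0            # hours
n_trials = 200_000

def weibull_failure_times(shape, scale, size):
    return scale * rng.weibull(shape, size)

pumps = weibull_failure_times(1.5, 40_000.0, (n_trials, 3))
hx = weibull_failure_times(2.0, 120_000.0, n_trials)

# System survives the mission if at least 2 of 3 pumps AND the heat exchanger survive.
pumps_ok = (pumps > mission_time).sum(axis=1) >= 2
system_fails = ~(pumps_ok & (hx > mission_time))
print(f"estimated system failure probability: {system_fails.mean():.2e}")
```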

  18. Ceramics Analysis and Reliability Evaluation of Structures (CARES). Users and programmers manual

    Science.gov (United States)

    Nemeth, Noel N.; Manderscheid, Jane M.; Gyekenyesi, John P.

    1990-01-01

    This manual describes how to use the Ceramics Analysis and Reliability Evaluation of Structures (CARES) computer program. The primary function of the code is to calculate the fast fracture reliability or failure probability of macroscopically isotropic ceramic components. These components may be subjected to complex thermomechanical loadings, such as those found in heat engine applications. The program uses results from MSC/NASTRAN or ANSYS finite element analysis programs to evaluate component reliability due to inherent surface and/or volume type flaws. CARES utilizes the Batdorf model and the two-parameter Weibull cumulative distribution function to describe the effect of multiaxial stress states on material strength. The principle of independent action (PIA) and the Weibull normal stress averaging models are also included. Weibull material strength parameters, the Batdorf crack density coefficient, and other related statistical quantities are estimated from four-point bend bar or uniform uniaxial tensile specimen fracture strength data. Parameter estimation can be performed for single or multiple failure modes by using the least-squares analysis or the maximum likelihood method. Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit tests, ninety percent confidence intervals on the Weibull parameters, and Kanofsky-Srinivasan ninety percent confidence band values are also provided. The probabilistic fast-fracture theories used in CARES, along with the input and output for CARES, are described. Example problems to demonstrate various features of the program are also included. This manual describes the MSC/NASTRAN version of the CARES program.

  19. A mixture of exponentials distribution for a simple and precise assessment of the volcanic hazard

    Directory of Open Access Journals (Sweden)

    A. T. Mendoza-Rosas

    2009-03-01

    Full Text Available The assessment of volcanic hazard is the first step for disaster mitigation. The distribution of repose periods between eruptions provides important information about the probability of new eruptions occurring within given time intervals. The quality of the probability estimate, i.e., of the hazard assessment, depends on the capacity of the chosen statistical model to describe the actual distribution of the repose times. In this work, we use a mixture of exponentials distribution, namely the sum of exponential distributions characterized by the different eruption occurrence rates that may be recognized by inspecting the cumulative number of eruptions with time in specific VEI (Volcanic Explosivity Index) categories. The most striking property of an exponential mixture density is that the shape of the density function is flexible in a way similar to the frequently used Weibull distribution, matching long-tailed distributions and allowing clustering and time dependence of the eruption sequence, with distribution parameters that can be readily obtained from the observed occurrence rates. Thus, the mixture of exponentials turns out to be more precise and much easier to apply than the Weibull distribution. We recommend the use of a mixture of exponentials distribution when regimes with well-defined eruption rates can be identified in the cumulative series of events. As an example, we apply the mixture of exponential distributions to the repose-time sequences between explosive eruptions of the Colima and Popocatépetl volcanoes, México, and compare the results obtained with the Weibull and other distributions.
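
    A compact way to see the comparison is to fit a two-component exponential mixture by EM and compare its log-likelihood with a Weibull fit, as in the sketch below; the repose times are synthetic, not the Colima or Popocatépetl catalogues, and the two-component form is an assumption.

```python
# Sketch: fitting a two-component mixture of exponentials to repose times with a
# small EM loop and comparing its log-likelihood with a Weibull fit. Data are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
repose = np.concatenate([rng.exponential(0.5, 60), rng.exponential(5.0, 20)])  # years

# EM for p * Exp(rate1) + (1 - p) * Exp(rate2)
p, r1, r2 = 0.5, 2.0 / repose.mean(), 0.5 / repose.mean()
for _ in range(200):
    w1 = p * r1 * np.exp(-r1 * repose)
    w2 = (1 - p) * r2 * np.exp(-r2 * repose)
    g = w1 / (w1 + w2)                         # responsibilities for component 1
    p = g.mean()
    r1 = g.sum() / (g * repose).sum()
    r2 = (1 - g).sum() / ((1 - g) * repose).sum()

ll_mix = np.log(p * r1 * np.exp(-r1 * repose) + (1 - p) * r2 * np.exp(-r2 * repose)).sum()
k, _, c = stats.weibull_min.fit(repose, floc=0)
ll_wbl = stats.weibull_min.logpdf(repose, k, scale=c).sum()
print(f"log-likelihood: mixture = {ll_mix:.1f}, Weibull = {ll_wbl:.1f}")
```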

  20. Probability Distribution Function of the Upper Equatorial Pacific Current Speeds

    National Research Council Canada - National Science Library

    Chu, Peter C

    2005-01-01

    ...), constructed from hourly ADCP data (1990-2007) at six stations for the Tropical Atmosphere Ocean project satisfies the two-parameter Weibull distribution reasonably well with different characteristics between El Nino and La Nina events...

  1. Idealized models of the joint probability distribution of wind speeds

    Science.gov (United States)

    Monahan, Adam H.

    2018-05-01

    The joint probability distribution of wind speeds at two separate locations in space or points in time completely characterizes the statistical dependence of these two quantities, providing more information than linear measures such as correlation. In this study, we consider two models of the joint distribution of wind speeds obtained from idealized models of the dependence structure of the horizontal wind velocity components. The bivariate Rice distribution follows from assuming that the wind components have Gaussian and isotropic fluctuations. The bivariate Weibull distribution arises from power law transformations of wind speeds corresponding to vector components with Gaussian, isotropic, mean-zero variability. Maximum likelihood estimates of these distributions are compared using wind speed data from the mid-troposphere, from different altitudes at the Cabauw tower in the Netherlands, and from scatterometer observations over the sea surface. While the bivariate Rice distribution is more flexible and can represent a broader class of dependence structures, the bivariate Weibull distribution is mathematically simpler and may be more convenient in many applications. The complexity of the mathematical expressions obtained for the joint distributions suggests that the development of explicit functional forms for multivariate speed distributions from distributions of the components will not be practical for more complicated dependence structure or more than two speed variables.

  2. The comparative kinetic analysis of Acetocell and Lignoboost® lignin pyrolysis: the estimation of the distributed reactivity models.

    Science.gov (United States)

    Janković, Bojan

    2011-10-01

    The non-isothermal pyrolysis kinetics of Acetocell (organosolv) and Lignoboost® (kraft) lignins in an inert atmosphere have been studied by thermogravimetric analysis. Using isoconversional analysis, it was concluded that the apparent activation energy for all lignins strongly depends on conversion, showing that the pyrolysis of lignins is not a single chemical process. It was identified that the pyrolysis process of Acetocell and Lignoboost® lignin takes place over three reaction steps, which was confirmed by the appearance of the corresponding isokinetic relationships (IKR). It was found that the major pyrolysis stage of both lignins is characterized by stilbene pyrolysis reactions, which were subsequently followed by decomposition reactions of products derived from the stilbene pyrolytic process. It was concluded that the non-isothermal pyrolysis of Acetocell and Lignoboost® lignins can be best described by n-th (n>1) reaction order kinetics, using the Weibull mixture model (as a distributed reactivity model) with alternating shape parameters. Copyright © 2011 Elsevier Ltd. All rights reserved.

  3. Survival Analysis of Patients with End Stage Renal Disease

    Science.gov (United States)

    Urrutia, J. D.; Gayo, W. S.; Bautista, L. A.; Baccay, E. B.

    2015-06-01

    This paper provides a survival analysis of End Stage Renal Disease (ESRD) using Kaplan-Meier estimates and the Weibull distribution. The data were obtained from the records of V. L. Makabali Memorial Hospital with respect to time t (patient's age), covariates such as developed secondary disease (Pulmonary Congestion and Cardiovascular Disease), gender, and the event of interest: the death of ESRD patients. Survival and hazard rates were estimated using NCSS for the Weibull distribution and SPSS for the Kaplan-Meier estimates. These lead to the same conclusion: the hazard rate increases and the survival rate decreases with time for ESRD patients diagnosed with Pulmonary Congestion, Cardiovascular Disease, or both diseases. It also shows that female patients have a greater risk of death compared to males. The risk probability was given by the equation R = 1 - e^(-H(t)), where e^(-H(t)) is the survival function and H(t) is the cumulative hazard function, which was obtained using Cox regression.
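
    The quantities in the final sentence follow directly from a parametric lifetime model. The sketch below evaluates the survival function, hazard rate and risk R = 1 - exp(-H(t)) under a Weibull model with illustrative shape and scale values; these are not estimates from the hospital records.

```python
# Sketch: survival, hazard and risk R = 1 - exp(-H(t)) under a Weibull lifetime model.
# The shape/scale values are illustrative, not fitted to the ESRD patient data.
import numpy as np

def weibull_hazard(t, shape, scale):
    return (shape / scale) * (t / scale) ** (shape - 1.0)

def cumulative_hazard(t, shape, scale):
    return (t / scale) ** shape

shape, scale = 1.6, 70.0      # illustrative, with age in years as the time scale
for age in (40.0, 55.0, 70.0):
    H = cumulative_hazard(age, shape, scale)
    print(f"age {age:.0f}: S(t) = {np.exp(-H):.3f}, "
          f"h(t) = {weibull_hazard(age, shape, scale):.4f}, R = {1 - np.exp(-H):.3f}")
```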

  4. SMALL-SCALE AND GLOBAL DYNAMOS AND THE AREA AND FLUX DISTRIBUTIONS OF ACTIVE REGIONS, SUNSPOT GROUPS, AND SUNSPOTS: A MULTI-DATABASE STUDY

    Energy Technology Data Exchange (ETDEWEB)

    Muñoz-Jaramillo, Andrés; Windmueller, John C.; Amouzou, Ernest C.; Longcope, Dana W. [Department of Physics, Montana State University, Bozeman, MT 59717 (United States); Senkpeil, Ryan R. [Department of Physics, Purdue University, West Lafayette, IN 47907 (United States); Tlatov, Andrey G. [Kislovodsk Mountain Astronomical Station of the Pulkovo Observatory, Kislovodsk 357700 (Russian Federation); Nagovitsyn, Yury A. [Pulkovo Astronomical Observatory, Russian Academy of Sciences, St. Petersburg 196140 (Russian Federation); Pevtsov, Alexei A. [National Solar Observatory, Sunspot, NM 88349 (United States); Chapman, Gary A.; Cookson, Angela M. [San Fernando Observatory, Department of Physics and Astronomy, California State University Northridge, Northridge, CA 91330 (United States); Yeates, Anthony R. [Department of Mathematical Sciences, Durham University, South Road, Durham DH1 3LE (United Kingdom); Watson, Fraser T. [National Solar Observatory, Tucson, AZ 85719 (United States); Balmaceda, Laura A. [Institute for Astronomical, Terrestrial and Space Sciences (ICATE-CONICET), San Juan (Argentina); DeLuca, Edward E. [Harvard-Smithsonian Center for Astrophysics, Cambridge, MA 02138 (United States); Martens, Petrus C. H., E-mail: munoz@solar.physics.montana.edu [Department of Physics and Astronomy, Georgia State University, Atlanta, GA 30303 (United States)

    2015-02-10

    In this work, we take advantage of 11 different sunspot group, sunspot, and active region databases to characterize the area and flux distributions of photospheric magnetic structures. We find that, when taken separately, different databases are better fitted by different distributions (as has been reported previously in the literature). However, we find that all our databases can be reconciled by the simple application of a proportionality constant, and that, in reality, different databases are sampling different parts of a composite distribution. This composite distribution is made up of a linear combination of Weibull and log-normal distributions, where a pure Weibull (log-normal) characterizes the distribution of structures with fluxes below (above) 10^21 Mx (10^22 Mx). Additionally, we demonstrate that the Weibull distribution shows the expected linear behavior of a power-law distribution (when extended to smaller fluxes), making our results compatible with the results of Parnell et al. We propose that this is evidence of two separate mechanisms giving rise to visible structures on the photosphere: one directly connected to the global component of the dynamo (and the generation of bipolar active regions), and the other with the small-scale component of the dynamo (and the fragmentation of magnetic structures due to their interaction with turbulent convection).

  5. Global sensitivity analysis in wind energy assessment

    Science.gov (United States)

    Tsvetkova, O.; Ouarda, T. B.

    2012-12-01

    research show that the brute force method is best for wind assessment purpose, SBSS outperforms other sampling strategies in the majority of cases. The results indicate that the Weibull scale parameter, turbine lifetime and Weibull shape parameter are the three most influential variables in the case study setting. The following conclusions can be drawn from these results: 1) SBSS should be recommended for use in Monte Carlo experiments, 2) The brute force method should be recommended for conducting sensitivity analysis in wind resource assessment, and 3) Little variation in the Weibull scale causes significant variation in energy production. The presence of the two distribution parameters in the top three influential variables (the Weibull shape and scale) emphasizes the importance of accuracy of (a) choosing the distribution to model wind regime at a site and (b) estimating probability distribution parameters. This can be labeled as the most important conclusion of this research because it opens a field for further research, which the authors see could change the wind energy field tremendously.

  6. Reliability Engineering

    International Nuclear Information System (INIS)

    Lee, Sang Yong

    1992-07-01

    This book is about reliability engineering. It covers the definition and importance of reliability; the development of reliability engineering; the failure rate and the types of failure probability density function; CFR and the exponential (index) distribution; IFR and the normal and Weibull distributions; maintainability and availability; reliability testing and reliability estimation for the exponential, normal and Weibull distribution types; reliability sampling tests; system reliability; reliability design; and functional failure analysis by FTA.

  7. Distribution and Mobility of Wealth of Nations

    NARCIS (Netherlands)

    R. Paap (Richard); H.K. van Dijk (Herman)

    2009-01-01

    We estimate the empirical bimodal cross-section distribution of real Gross Domestic Product per capita of 120 countries over the period 1960–1989 by a mixture of a Weibull and a truncated normal density. The components of the mixture represent a group of poor and a group of rich

  8. Optimal power flow for distribution networks with distributed generation

    Directory of Open Access Journals (Sweden)

    Radosavljević Jordan

    2015-01-01

    Full Text Available This paper presents a genetic algorithm (GA) based approach for the solution of the optimal power flow (OPF) in distribution networks with distributed generation (DG) units, including fuel cells, micro turbines, diesel generators, photovoltaic systems and wind turbines. The OPF is formulated as a nonlinear multi-objective optimization problem with equality and inequality constraints. Due to the stochastic nature of the energy produced from renewable sources, i.e. wind turbines and photovoltaic systems, as well as load uncertainties, a probabilistic algorithm is introduced in the OPF analysis. The Weibull and normal distributions are employed to model the input random variables, namely the wind speed, solar irradiance and load power. The 2m+1 point estimate method and the Gram-Charlier expansion theory are used to obtain the statistical moments and the probability density functions (PDFs) of the OPF results. The proposed approach is examined and tested on a modified IEEE 34 node test feeder with five different integrated DG units. The obtained results prove the efficiency of the proposed approach to solve both deterministic and probabilistic OPF problems for different forms of the multi-objective function. As such, it can serve as a useful decision-making supporting tool for distribution network operators. [Project of the Ministry of Science of the Republic of Serbia, no. TR33046]

  9. Inference for exponentiated general class of distributions based on record values

    Directory of Open Access Journals (Sweden)

    Samah N. Sindi

    2017-09-01

    Full Text Available The main objective of this paper is to suggest and study a new exponentiated general class (EGC of distributions. Maximum likelihood, Bayesian and empirical Bayesian estimators of the parameter of the EGC of distributions based on lower record values are obtained. Furthermore, Bayesian prediction of future records is considered. Based on lower record values, the exponentiated Weibull distribution, its special cases of distributions and exponentiated Gompertz distribution are applied to the EGC of distributions.  

  10. The stochastic distribution of available coefficient of friction on quarry tiles for human locomotion.

    Science.gov (United States)

    Chang, Wen-Ruey; Matz, Simon; Chang, Chien-Chi

    2012-01-01

    The available coefficient of friction (ACOF) for human locomotion is the maximum coefficient of friction that can be supported without a slip at the shoe and floor interface. A statistical model was introduced to estimate the probability of slip by comparing the ACOF with the required coefficient of friction, assuming that both coefficients have stochastic distributions. This paper presents an investigation of the stochastic distributions of the ACOF of quarry tiles under dry, water and glycerol conditions. One hundred friction measurements were performed on a walkway under the surface conditions of dry, water and 45% glycerol concentration. The Kolmogorov-Smirnov goodness-of-fit test was used to determine if the distribution of the ACOF was a good fit with the normal, log-normal and Weibull distributions. The results indicated that the ACOF appears to fit the normal and log-normal distributions better than the Weibull distribution for the water and glycerol conditions. However, no match was found between the distribution of ACOF under the dry condition and any of the three continuous distributions evaluated. Based on limited data, a normal distribution might be more appropriate due to its simplicity, practicality and familiarity among the three distributions evaluated.
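
    A minimal version of the goodness-of-fit step described here is sketched below: fit normal, log-normal and Weibull models to a friction sample and run the Kolmogorov-Smirnov test against each; the ACOF values are synthetic, not the 100 quarry-tile measurements from the study.

```python
# Sketch: Kolmogorov-Smirnov goodness-of-fit checks of normal, log-normal and
# Weibull models against a synthetic sample of ACOF measurements.
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
acof = rng.normal(0.45, 0.05, size=100).clip(min=0.01)   # synthetic friction values

fits = {
    "normal":    (stats.norm,        stats.norm.fit(acof)),
    "lognormal": (stats.lognorm,     stats.lognorm.fit(acof, floc=0)),
    "weibull":   (stats.weibull_min, stats.weibull_min.fit(acof, floc=0)),
}
for name, (dist, params) in fits.items():
    d, p = stats.kstest(acof, dist.cdf, args=params)
    print(f"{name:9s}: D = {d:.3f}, p = {p:.3f}")
```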

  11. Statistical analysis of the ASME KIc database

    International Nuclear Information System (INIS)

    Sokolov, M.A.

    1998-01-01

    The American Society of Mechanical Engineers (ASME) K_Ic curve is a function of test temperature (T) normalized to a reference nil-ductility temperature, RT_NDT, namely, T-RT_NDT. It was constructed as the lower boundary to the available K_Ic database. Being a lower bound to the unique but limited database, the ASME K_Ic curve concept does not discuss probability matters. However, a continuing evolution of fracture mechanics advances has led to employment of the Weibull distribution function to model the scatter of fracture toughness values in the transition range. The Weibull statistic/master curve approach was applied to analyze the current ASME K_Ic database. It is shown that the Weibull distribution function models the scatter in K_Ic data from different materials very well, while the temperature dependence is described by the master curve. Probabilistic-based tolerance-bound curves are suggested to describe lower-bound K_Ic values

  12. Fitting Statistical Distributions Functions on Ozone Concentration Data at Coastal Areas

    International Nuclear Information System (INIS)

    Muhammad Yazid Nasir; Nurul Adyani Ghazali; Muhammad Izwan Zariq Mokhtar; Norhazlina Suhaimi

    2016-01-01

    Ozone is known as one of the pollutants that contribute to the air pollution problem. Therefore, it is important to carry out studies on ozone. The objective of this study is to find the best statistical distribution for ozone concentration. Three distributions, namely the Inverse Gaussian, Weibull and Lognormal, were chosen to fit one year of hourly average ozone concentration data recorded in 2010 at Port Dickson and Port Klang. The maximum likelihood estimation (MLE) method was used to estimate the parameters to develop the probability density function (PDF) graph and the cumulative density function (CDF) graph. Three performance indicators (PI), namely the normalized absolute error (NAE), prediction accuracy (PA), and coefficient of determination (R²), were used to determine the goodness-of-fit of the distributions. Results show that the Weibull distribution is the best distribution, with the smallest error measure values (NAE) of 0.08 and 0.31 at Port Klang and Port Dickson, respectively. It also gave the best score for the adequacy measure (PA: 0.99), with R² values of 0.98 (Port Klang) and 0.99 (Port Dickson). These results provide useful information to local authorities for prediction purposes. (author)

  13. A simple linear regression method for quantitative trait loci linkage analysis with censored observations.

    Science.gov (United States)

    Anderson, Carl A; McRae, Allan F; Visscher, Peter M

    2006-07-01

    Standard quantitative trait loci (QTL) mapping techniques commonly assume that the trait is both fully observed and normally distributed. When considering survival or age-at-onset traits these assumptions are often incorrect. Methods have been developed to map QTL for survival traits; however, they are both computationally intensive and not available in standard genome analysis software packages. We propose a grouped linear regression method for the analysis of continuous survival data. Using simulation we compare this method to both the Cox and Weibull proportional hazards models and a standard linear regression method that ignores censoring. The grouped linear regression method is of equivalent power to both the Cox and Weibull proportional hazards methods and is significantly better than the standard linear regression method when censored observations are present. The method is also robust to the proportion of censored individuals and the underlying distribution of the trait. On the basis of linear regression methodology, the grouped linear regression model is computationally simple and fast and can be implemented readily in freely available statistical software.

  14. Fitting diameter distribution models to data from forest inventories with concentric plot design

    Directory of Open Access Journals (Sweden)

    Nikos Nanos

    2017-10-01

    Research highlights: We designed a new method to fit the Weibull distribution to forest inventory data from concentric plots that achieves high accuracy and precision in parameter estimates regardless of the within-plot spatial tree pattern.

  15. Bias correction for the least squares estimator of Weibull shape parameter with complete and censored data

    International Nuclear Information System (INIS)

    Zhang, L.F.; Xie, M.; Tang, L.C.

    2006-01-01

    Estimation of the Weibull shape parameter is important in reliability engineering. However, commonly used methods such as maximum likelihood estimation (MLE) and least squares estimation (LSE) are known to be biased. Bias correction methods for MLE have been studied in the literature. This paper investigates methods for bias correction when model parameters are estimated with LSE based on the probability plot. The Weibull probability plot is very simple and commonly used by practitioners, and hence such a study is useful. The bias of the LS shape parameter estimator for multiply censored data is also examined. It is found that the bias can be modeled as a function of the sample size and the censoring level, and is mainly dependent on the latter. A simple bias function is introduced and bias correcting formulas are proposed for both complete and censored data. Simulation results are also presented. The bias correction methods proposed are very easy to use and they can typically reduce the bias of the LSE of the shape parameter to less than half a percent.
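
    The probability-plot least-squares estimator in question, and a quick simulation of its bias for a small complete sample, can be sketched as below; the plotting position, sample size and true shape value are illustrative choices, and the paper's actual bias-correcting formulas are not reproduced.

```python
# Sketch: least-squares estimation of the Weibull shape parameter from a probability
# plot, plus a small simulation of its bias for complete samples of size 10.
import numpy as np

def lse_shape(x):
    x = np.sort(x)
    n = x.size
    f = (np.arange(1, n + 1) - 0.3) / (n + 0.4)    # median-rank plotting position
    y = np.log(-np.log(1.0 - f))                    # Weibull probability plot ordinate
    slope, _ = np.polyfit(np.log(x), y, 1)
    return slope                                     # the slope estimates the shape

rng = np.random.default_rng(5)
true_shape, n = 2.0, 10
est = np.array([lse_shape(rng.weibull(true_shape, n)) for _ in range(5000)])
print(f"mean LSE shape = {est.mean():.3f} (true {true_shape}), "
      f"relative bias {est.mean() / true_shape - 1:+.1%}")
```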

  16. Robust D-optimal designs under correlated error, applicable invariantly for some lifetime distributions

    International Nuclear Information System (INIS)

    Das, Rabindra Nath; Kim, Jinseog; Park, Jeong-Soo

    2015-01-01

    In quality engineering, the most commonly used lifetime distributions are log-normal, exponential, gamma and Weibull. Experimental designs are useful for predicting the optimal operating conditions of the process in lifetime improvement experiments. In the present article, invariant robust first-order D-optimal designs are derived for correlated lifetime responses having the above four distributions. Robust designs are developed for some correlated error structures. It is shown that robust first-order D-optimal designs for these lifetime distributions are always robust rotatable but the converse is not true. Moreover, it is observed that these designs depend on the respective error covariance structure but are invariant to the above four lifetime distributions. This article generalizes the results of Das and Lin [7] for the above four lifetime distributions with general (intra-class, inter-class, compound symmetry, and tri-diagonal) correlated error structures. - Highlights: • This paper presents invariant robust first-order D-optimal designs under correlated lifetime responses. • The results of Das and Lin [7] are extended for the four lifetime (log-normal, exponential, gamma and Weibull) distributions. • This paper also generalizes the results of Das and Lin [7] to more general correlated error structures

  17. Evaluación poscosecha y estimación de vida útil de guayaba fresca utilizando el modelo de Weibull Postharvest evaluation and estimate of shelf-life of fresh guava using the Weibull model

    Directory of Open Access Journals (Sweden)

    Carlos García Mogollón

    2010-07-01

    Full Text Available Guava (Psidium guajava L.) is a tropical fruit susceptible to undesirable alterations that affect its shelf-life due to inadequate storage and handling conditions. In this work the shelf-life of fresh guava was estimated using the probabilistic Weibull model, and the quality of the fruit was evaluated during storage under different temperature and packaging conditions. The postharvest evaluation was carried out for 15 days with guavas of the red regional variety. A completely randomized design with a factorial arrangement of three factors was used: storage time with six levels (0, 3, 6, 9, 12 and 15 days); storage temperature with two levels, ambient (37 °C and relative humidity (RH) between 85 and 90%) and refrigerated (9±2 °C and RH of 85-90%); and two types of packaging, polystyrene tray with PVC plastic film and aluminium foil. A structured three-point degree-of-satisfaction scale was used for the sensory evaluation during the storage period. The Weibull model proved to be adequate for predicting the shelf-life of fresh guava based on the fitting criteria and the acceptance and failure confidence limits. During the storage period it was observed that storage time, temperature and type of packaging have a statistically significant effect (P

  18. Fast Reliability Assessing Method for Distribution Network with Distributed Renewable Energy Generation

    Science.gov (United States)

    Chen, Fan; Huang, Shaoxiong; Ding, Jinjin; Ding, Jinjin; Gao, Bo; Xie, Yuguang; Wang, Xiaoming

    2018-01-01

    This paper proposes a fast reliability assessment method for distribution grids with distributed renewable energy generation. First, the Weibull distribution and the Beta distribution are used to describe the probability distribution characteristics of wind speed and solar irradiance respectively, and models of the wind farm, solar park and local load are built for reliability assessment. Then, based on power system production cost simulation, probability discretization and linearized power flow, an optimal power flow problem with the objective of minimizing the cost of conventional power generation is solved. Thus a reliability assessment for the distribution grid is implemented quickly and accurately. The Loss Of Load Probability (LOLP) and Expected Energy Not Supplied (EENS) are selected as the reliability indices, and a simulation of the IEEE RBTS BUS6 system in MATLAB indicates that the fast reliability assessment method calculates the reliability indices much faster than the Monte Carlo method while maintaining accuracy.
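
    As a sketch of the input-modeling step only, the snippet below draws wind speed from a Weibull model and normalized solar irradiance from a Beta model, converts them to a combined output with toy turbine and PV curves, and discretizes it into a few probability states; all parameters are invented, and the production cost simulation and optimal power flow stages are omitted.

```python
# Sketch: Weibull wind speed and Beta irradiance samples converted to a combined
# renewable output and discretized into probability states. All parameters are invented.
import numpy as np

rng = np.random.default_rng(2024)
n = 100_000

wind = 8.0 * rng.weibull(2.0, n)                 # m/s, Weibull(k=2, c=8)
irr = rng.beta(2.0, 3.0, n)                      # normalized irradiance, Beta(2, 3)

# Very simplified turbine and PV output models (rated powers in MW).
wt = np.clip((wind - 3.0) / (12.0 - 3.0), 0.0, 1.0) * 2.0
wt[wind > 25.0] = 0.0                            # cut-out wind speed
pv = irr * 1.0

total = wt + pv
states, counts = np.unique(np.digitize(total, [0.5, 1.5, 2.5]), return_counts=True)
print(dict(zip(states.tolist(), (counts / n).round(3))))   # state -> probability
```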

  19. Probability distributions of bed load particle velocities, accelerations, hop distances, and travel times informed by Jaynes's principle of maximum entropy

    Science.gov (United States)

    Furbish, David; Schmeeckle, Mark; Schumer, Rina; Fathel, Siobhan

    2016-01-01

    We describe the most likely forms of the probability distributions of bed load particle velocities, accelerations, hop distances, and travel times, in a manner that formally appeals to inferential statistics while honoring mechanical and kinematic constraints imposed by equilibrium transport conditions. The analysis is based on E. Jaynes's elaboration of the implications of the similarity between the Gibbs entropy in statistical mechanics and the Shannon entropy in information theory. By maximizing the information entropy of a distribution subject to known constraints on its moments, our choice of the form of the distribution is unbiased. The analysis suggests that particle velocities and travel times are exponentially distributed and that particle accelerations follow a Laplace distribution with zero mean. Particle hop distances, viewed alone, ought to be distributed exponentially. However, the covariance between hop distances and travel times precludes this result. Instead, the covariance structure suggests that hop distances follow a Weibull distribution. These distributions are consistent with high-resolution measurements obtained from high-speed imaging of bed load particle motions. The analysis brings us closer to choosing distributions based on our mechanical insight.
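    The underlying argument can be summarized by a standard result of maximum-entropy inference (a textbook relation, not a formula quoted from this paper): for a non-negative quantity constrained only by its mean \mu,

        \max_f \; -\int_0^\infty f(x)\,\ln f(x)\,dx \quad \text{s.t.} \quad \int_0^\infty f(x)\,dx = 1, \;\; \int_0^\infty x\,f(x)\,dx = \mu \;\; \Longrightarrow \;\; f(x) = \frac{1}{\mu}\,e^{-x/\mu},

    which is why particle velocities and travel times come out exponential; the additional covariance constraint coupling hop distances to travel times is what pushes hop distances toward the Weibull form described above.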

  20. Determination of Reliability Index and Weibull Modulus as a Measure of Hypereutectic Silumins Survival

    OpenAIRE

    J. Szymszal; J. Piątkowski; J. Przondziono

    2007-01-01

    The first part of the study describes the methods used to determine Weibull modulus and the related reliability index of hypereutectic silumins containing about 17% Si, assigned for manufacture of high-duty castings to be used in automotive applications and aviation. The second part of the study discusses the importance of chemical composition, including the additions of 3% Cu, 1.5% Ni and 1.5% Mg, while in the third part attention was focussed on the effect of process history, including moul...

  1. Stability of the laws for the distribution of the cumulative failures in railway transport

    OpenAIRE

    Kirill VOYNOV

    2008-01-01

    There are very many different laws of distribution (for example, the bell-shaped (Gaussian) distribution, the lognormal, the Weibull, the exponential, the uniform, Poisson's and Student's distributions, and so on) which help to describe the real picture of failures of elements in various mechanical systems, in locomotives and carriages, too. To reduce the possibility of gross errors in the output of mathematical data treatment, a new method is demonstrated in this article. The task is solved both for discrete and for continuous distributions.

  2. Evaluating the suitability of wind speed probability distribution models: A case of study of east and southeast parts of Iran

    International Nuclear Information System (INIS)

    Alavi, Omid; Mohammadi, Kasra; Mostafaeipour, Ali

    2016-01-01

    Highlights: • Suitability of different wind speed probability functions is assessed. • 5 stations distributed in the east and southeast of Iran are considered as case studies. • The Nakagami distribution is tested for the first time and compared with 7 other functions. • Owing to differences in wind features, the best function is not the same for all stations. - Abstract: Precise information on the wind speed probability distribution is truly significant for many wind energy applications. The objective of this study is to evaluate the suitability of different probability functions for estimating the wind speed distribution at five stations distributed in the east and southeast of Iran. The Nakagami distribution function is utilized for the first time to estimate the distribution of wind speed. The performance of the Nakagami function is compared with seven typically used distribution functions. The results reveal that the most effective function is not the same for all stations. Wind speed characteristics and the quantity and quality of the recorded wind speed data can be considered influential parameters on the performance of the distribution functions. Also, the skewness of the recorded wind speed data may influence the accuracy of the Nakagami distribution. For the Chabahar and Khaf stations the Nakagami distribution shows the highest performance, while for the Lutak, Rafsanjan and Zabol stations the Gamma, Generalized Extreme Value and Inverse-Gaussian distributions offer the best fits, respectively. Based on the analysis, the Nakagami distribution can generally be considered an effective distribution since it provides the best fits at 2 stations and ranks 3rd to 5th at the remaining stations; however, owing to the close performance of the Nakagami and Weibull distributions and the widely proven flexibility of the Weibull function, more assessments of the performance of the Nakagami distribution are required.
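    A sketch of this kind of comparison, assuming a plain-text file of recorded wind speeds (the file name and data are placeholders, not the study's data): each candidate distribution is fitted by maximum likelihood and the fits are ranked by the Kolmogorov-Smirnov statistic.

        import numpy as np
        from scipy import stats

        wind = np.loadtxt("wind_speed_m_s.txt")     # hypothetical file of measured wind speeds

        candidates = {
            "Weibull":   stats.weibull_min,
            "Gamma":     stats.gamma,
            "Lognormal": stats.lognorm,
            "Nakagami":  stats.nakagami,
        }

        results = []
        for name, dist in candidates.items():
            params = dist.fit(wind, floc=0)          # location fixed at zero for speed data
            ks = stats.kstest(wind, dist.cdf, args=params).statistic
            results.append((ks, name))

        for ks, name in sorted(results):             # smaller KS statistic = better fit
            print(f"{name:10s} KS = {ks:.4f}")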

  3. Reliability analysis of mining equipment: A case study of a crushing plant at Jajarm Bauxite Mine in Iran

    International Nuclear Information System (INIS)

    Barabady, Javad; Kumar, Uday

    2008-01-01

    The performance of mining machines depends on the reliability of the equipment used, the operating environment, the maintenance efficiency, the operation process, the technical expertise of the miners, etc. As the size and complexity of mining equipment continue to increase, the implications of equipment failure become ever more critical. Therefore, reliability analysis is required to identify the bottlenecks in the system and to find the components or subsystems with low reliability for a given designed performance. It is important to select a suitable method for data collection as well as for reliability analysis. This paper presents a case study describing the reliability and availability analysis of crushing plant number 3 at the Jajarm Bauxite Mine in Iran. In this study, crushing plant number 3 is divided into six subsystems. The parameters of several probability distributions, such as the Weibull, exponential, and lognormal distributions, have been estimated using ReliaSoft's Weibull++ 6 software. The results of the analysis show that the conveyer subsystem and the secondary screen subsystem are critical from a reliability point of view, and the secondary crusher subsystem and the conveyer subsystem are critical from an availability point of view. The study also shows that reliability analysis is very useful for deciding maintenance intervals.
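    A minimal sketch of the per-subsystem calculation this kind of study relies on, using illustrative times between failures rather than the mine's data: fit a two-parameter Weibull distribution, then evaluate the reliability over a candidate maintenance interval and the mean time between failures.

        import numpy as np
        from scipy.stats import weibull_min
        from scipy.special import gamma as gamma_fn

        tbf_hours = np.array([112.0, 84.5, 201.0, 95.3, 150.2, 63.7, 178.9, 130.4])  # illustrative

        shape, loc, scale = weibull_min.fit(tbf_hours, floc=0)   # two-parameter Weibull (loc = 0)

        t_maint = 72.0                                           # candidate maintenance interval [h]
        reliability = np.exp(-(t_maint / scale) ** shape)        # R(t) = exp(-(t/eta)^beta)
        mtbf = scale * gamma_fn(1.0 + 1.0 / shape)               # MTBF = eta * Gamma(1 + 1/beta)

        print(f"beta = {shape:.2f}, eta = {scale:.1f} h, R({t_maint:.0f} h) = {reliability:.3f}, MTBF = {mtbf:.1f} h")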

  4. statistical analysis of wind speed for electrical power generation

    African Journals Online (AJOL)

    HOD

    In order to predict and model the potential of any site, ... gamma, and Rayleigh distributions for 8 locations in Nigeria. ... probability density function is used to model the average power in ... mathematical expression of the Weibull distribution is.

  5. Weibull statistics effective area and volume in the ball-on-ring testing method

    DEFF Research Database (Denmark)

    Frandsen, Henrik Lund

    2014-01-01

    The ball-on-ring method is together with other biaxial bending methods often used for measuring the strength of plates of brittle materials, because machining defects are remote from the high stresses causing the failure of the specimens. In order to scale the measured Weibull strength...... to geometries relevant for the application of the material, the effective area or volume for the test specimen must be evaluated. In this work analytical expressions for the effective area and volume of the ball-on-ring test specimen is derived. In the derivation the multiaxial stress field has been accounted...

  6. Stand diameter distribution modelling and prediction based on Richards function.

    Directory of Open Access Journals (Sweden)

    Ai-guo Duan

    Full Text Available The objective of this study was to introduce the application of the Richards equation to modelling and prediction of stand diameter distributions. The long-term repeated measurement data sets, consisting of 309 diameter frequency distributions from Chinese fir (Cunninghamia lanceolata) plantations in southern China, were used. Of these, 150 stands were used as fitting data and the other 159 stands were used for testing. The nonlinear regression method (NRM) or the maximum likelihood estimation method (MLEM) was applied to estimate the parameters of the models, and the parameter prediction method (PPM) and parameter recovery method (PRM) were used to predict the diameter distributions of unknown stands. Four main conclusions were obtained: (1) the R distribution presented a more accurate simulation than the three-parameter Weibull function; (2) the parameters p, q and r of the R distribution proved to be its scale, location and shape parameters, and have a deep relationship with stand characteristics, which means the parameters of the R distribution have a good theoretical interpretation; (3) the ordinate of the inflection point of the R distribution is significantly related to its skewness and kurtosis, and the fitted main distribution range for the cumulative diameter distribution of Chinese fir plantations was 0.4∼0.6; (4) the goodness-of-fit test showed that the diameter distributions of unknown stands can be well estimated by applying the R distribution based on PRM, or on the combination of PPM and PRM, under the condition that only the quadratic mean DBH, or the quadratic mean DBH plus stand age, is known; the non-rejection rates were near 80%, which is higher than the 72.33% non-rejection rate of the three-parameter Weibull function based on the combination of PPM and PRM.

  7. Progressive failure site generation in AlGaN/GaN high electron mobility transistors under OFF-state stress: Weibull statistics and temperature dependence

    International Nuclear Information System (INIS)

    Sun, Huarui; Bajo, Miguel Montes; Uren, Michael J.; Kuball, Martin

    2015-01-01

    Gate leakage degradation of AlGaN/GaN high electron mobility transistors under OFF-state stress is investigated using a combination of electrical, optical, and surface morphology characterizations. The generation of leakage “hot spots” at the edge of the gate is found to be strongly temperature accelerated. The time for the formation of each failure site follows a Weibull distribution with a shape parameter in the range of 0.7–0.9 from room temperature up to 120 °C. The average leakage per failure site is only weakly temperature dependent. The stress-induced structural degradation at the leakage sites exhibits a temperature dependence in the surface morphology, which is consistent with a surface defect generation process involving temperature-associated changes in the breakdown sites

  8. Adjustments of probability distribution functions to global solar radiation in Rio Grande do Sul State

    Directory of Open Access Journals (Sweden)

    Alberto Cargnelutti Filho

    2004-12-01

    Full Text Available The objective of this work was to verify the fit of data series of ten-day average global solar radiation, from 22 municipalities of Rio Grande do Sul State, Brazil, to the normal, log-normal, gamma, Gumbel and Weibull probability distribution functions. The Kolmogorov-Smirnov goodness-of-fit test was applied to the 792 data series (22 localities x 36 ten-day periods) of average global solar radiation to verify the fit of the data to the normal, log-normal, gamma, Gumbel and Weibull distributions, totalling 3,960 tests. The ten-day average global solar radiation data fit the normal, log-normal, gamma, Gumbel and Weibull probability distribution functions, and show the best fit to the normal probability distribution function.

  9. Stability of the laws for the distribution of the cumulative failures in railway transport

    Directory of Open Access Journals (Sweden)

    Kirill VOYNOV

    2008-01-01

    Full Text Available There are very many different laws of distribution (for example, the bell-shaped (Gaussian) distribution, the lognormal, the Weibull, the exponential, the uniform, Poisson's and Student's distributions, and so on) which help to describe the real picture of failures of elements in various mechanical systems, in locomotives and carriages, too. To reduce the possibility of gross errors in the output of mathematical data treatment, a new method is demonstrated in this article. The task is solved both for discrete and for continuous distributions.

  10. Statistical analysis of absorptive laser damage in dielectric thin films

    International Nuclear Information System (INIS)

    Budgor, A.B.; Luria-Budgor, K.F.

    1978-01-01

    The Weibull distribution arises as an example of the theory of extreme events. It is commonly used to fit statistical data arising in the failure analysis of electrical components and in the DC breakdown of materials. This distribution is employed to analyze time-to-damage and intensity-to-damage statistics obtained when irradiating thin-film-coated samples of SiO2, ZrO2, and Al2O3 with tightly focused laser beams. The data used were furnished by Milam. The fit to the data is excellent, and least-squares correlation coefficients greater than 0.9 are often obtained.
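    The least-squares fit referred to here is typically performed on the linearized Weibull CDF; a small sketch with hypothetical time-to-damage values (the Milam data are not reproduced in this record):

        import numpy as np

        t = np.sort(np.array([0.8, 1.1, 1.6, 2.3, 2.9, 3.8, 5.1, 6.6]))   # hypothetical times to damage
        n = len(t)
        F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)      # median-rank plotting positions

        x = np.log(t)
        y = np.log(-np.log(1.0 - F))                      # ln(-ln(1-F)) = beta*ln(t) - beta*ln(eta)
        beta, intercept = np.polyfit(x, y, 1)             # slope is the Weibull shape parameter
        eta = np.exp(-intercept / beta)                   # scale parameter
        r = np.corrcoef(x, y)[0, 1]                       # correlation coefficient of the fit

        print(f"shape beta = {beta:.2f}, scale eta = {eta:.2f}, r^2 = {r**2:.3f}")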

  11. Statistical Distribution of Fatigue Life for Cast TiAl Alloy

    Directory of Open Access Journals (Sweden)

    WAN Wenjuan

    2016-08-01

    Full Text Available The statistical distribution of fatigue life data and its controlling factors for cast Ti-47.5Al-2.5V-1.0Cr-0.2Zr (atom fraction, %) alloy were investigated. Fatigue tests were performed by means of load-controlled rotating bending fatigue tests (R=-1) at a frequency of 100 Hz at 750 °C in air. The fracture mechanism was analyzed by observing the fracture surface morphologies with a scanning electron microscope, and the fatigue life data obtained were analyzed by Weibull statistics. The results show that the fatigue life data present a remarkable scatter, ranging from 10^3 to 10^6 cycles, and are distributed mainly in the short- and long-life regimes. The reason for this phenomenon is that the fatigue crack initiators differ between specimens. The crack initiators for short-life specimens are caused by shrinkage porosity, and for long-life ones by bridged porosity interfaces and soft-oriented lamellar interfaces. Based on the observation of the fracture surfaces, a two-parameter Weibull distribution model for the fatigue life data can be used for the prediction of fatigue life at a given failure probability. It is also shown that shrinkage porosity has the most detrimental effect on fatigue life.

  12. A statistical investigation of wind characteristics and wind energy potential based on the Weibull and Rayleigh models in Rwanda

    Energy Technology Data Exchange (ETDEWEB)

    Safari, Bonfils; Gasore, Jimmy [Department of Physics, National University of Rwanda, P.O. Box 117, Huye, South Province (Rwanda)

    2010-12-15

    A wind energy system converts the kinetic energy of the wind into mechanical or electrical energy that can be harnessed for practical uses and can transform the economy of rural areas where access to water and electricity is very restricted and industry is almost nonexistent, as in most developing countries like Rwanda. Assessing the wind power potential of a location is an imperative requirement before deciding on the installation of windmills or a wind electric generator and evaluating plans for related projects. The aim of the present study was to evaluate the potential of the wind resource in Rwanda and to constitute a database for users of wind power. A time series of hourly measured wind speed and wind direction for the period between 1974 and 1993 at five main Rwandan meteorological stations was provided by the National Meteorology Department. Statistical methods applying the Weibull and Rayleigh distributions were used to evaluate the wind speed characteristics and the wind power potential at a height of 10 m above ground level using hourly monthly average data. These characteristics were extrapolated to higher levels in altitude. The results give a global picture of the distribution of the wind potential at different locations in Rwanda. (author)
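    For reference, the standard relations between the fitted Weibull shape k and scale c and the quantities reported in assessments of this kind (textbook expressions, not formulas quoted from the paper) are

        \bar{v} = c\,\Gamma(1 + 1/k), \qquad \bar{P}/A = \tfrac{1}{2}\,\rho\,c^{3}\,\Gamma(1 + 3/k),

    with air density \rho \approx 1.225 kg/m^3 near sea level; extrapolation of the 10 m parameters to greater heights is commonly done with a power-law or logarithmic wind profile.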

  13. Modeling of speed distribution for mixed bicycle traffic flow

    Directory of Open Access Journals (Sweden)

    Cheng Xu

    2015-11-01

    Full Text Available Speed is a fundamental measure of traffic performance for highway systems. There were lots of results for the speed characteristics of motorized vehicles. In this article, we studied the speed distribution for mixed bicycle traffic which was ignored in the past. Field speed data were collected from Hangzhou, China, under different survey sites, traffic conditions, and percentages of electric bicycle. The statistics results of field data show that the total mean speed of electric bicycles is 17.09 km/h, 3.63 km/h faster and 27.0% higher than that of regular bicycles. Normal, log-normal, gamma, and Weibull distribution models were used for testing speed data. The results of goodness-of-fit hypothesis tests imply that the log-normal and Weibull model can fit the field data very well. Then, the relationships between mean speed and electric bicycle proportions were proposed using linear regression models, and the mean speed for purely electric bicycles or regular bicycles can be obtained. The findings of this article will provide effective help for the safety and traffic management of mixed bicycle traffic.

  14. Approximation of the breast height diameter distribution of two-cohort stands by mixture models III Kernel density estimators vs mixture models

    Science.gov (United States)

    Rafal Podlaski; Francis A. Roesch

    2014-01-01

    Two-component mixtures of either the Weibull distribution or the gamma distribution and the kernel density estimator were used for describing the diameter at breast height (dbh) empirical distributions of two-cohort stands. The data consisted of study plots from the Świętokrzyski National Park (central Poland) and areas close to and including the North Carolina section...

  15. An eoq model for weibull deteriorating item with ramp type demand and salvage value under trade credit system

    Directory of Open Access Journals (Sweden)

    Lalit Mohan Pradhan

    2014-03-01

    Full Text Available Background: In the present competitive business scenario researchers have developed various inventory models for deteriorating items considering various practical situations for better inventory control. Permissible delay in payments with various demands and deteriorations is considerably a new concept introduced in developing various inventory models. These models are very useful for both the consumers and the manufacturer. Methods: In the present work an inventory model has been developed for a three parameter Weibull deteriorating item with ramp type demand and salvage value under trade credit system. Here we have considered a single item for developing the model. Results and conclusion: Optimal order quantity, optimal cycle time and total variable cost during a cycle have been derived for the proposed inventory model. The results obtained in this paper have been illustrated with the help of numerical examples and sensitivity analysis.   

  16. Modelling Wind for Wind Farm Layout Optimization Using Joint Distribution of Wind Speed and Wind Direction

    DEFF Research Database (Denmark)

    Feng, Ju; Shen, Wen Zhong

    2015-01-01

    Reliable wind modelling is of crucial importance for wind farm development. The common practice of using sector-wise Weibull distributions has been found inappropriate for wind farm layout optimization. In this study, we propose a simple and easily implementable method to construct joint distribu...

  17. Statistical Evidence for the Preference of Frailty Distributions with Regularly-Varying-at-Zero Densities

    DEFF Research Database (Denmark)

    Missov, Trifon I.; Schöley, Jonas

    to this criterion admissible distributions are, for example, the gamma, the beta, the truncated normal, the log-logistic and the Weibull, while distributions like the log-normal and the inverse Gaussian do not satisfy this condition. In this article we show that models with admissible frailty distributions...... and a Gompertz baseline provide a better fit to adult human mortality data than the corresponding models with non-admissible frailty distributions. We implement estimation procedures for mixture models with a Gompertz baseline and frailty that follows a gamma, truncated normal, log-normal, or inverse Gaussian...

  18. Fitting diameter distribution models to data from forest inventories with concentric plot design

    Energy Technology Data Exchange (ETDEWEB)

    Nanos, N.; Sjöstedt de Luna, S.

    2017-11-01

    Aim: Several national forest inventories use a complex plot design based on multiple concentric subplots where smaller diameter trees are inventoried when lying in the smaller-radius subplots and ignored otherwise. Data from these plots are truncated with threshold (truncation) diameters varying according to the distance from the plot centre. In this paper we designed a maximum likelihood method to fit the Weibull diameter distribution to data from concentric plots. Material and methods: Our method (M1) was based on multiple truncated probability density functions to build the likelihood. In addition, we used an alternative method (M2) presented recently. We used methods M1 and M2 as well as two other reference methods to estimate the Weibull parameters in 40000 simulated plots. The spatial tree pattern of the simulated plots was generated using four models of spatial point patterns. Two error indices were used to assess the relative performance of M1 and M2 in estimating relevant stand-level variables. In addition, we estimated the Quadratic Mean plot Diameter (QMD) using Expansion Factors (EFs). Main results: Methods M1 and M2 produced comparable estimation errors in random and cluster tree spatial patterns. Method M2 produced biased parameter estimates in plots with inhomogeneous Poisson patterns. Estimation of QMD using EFs produced biased results in plots within inhomogeneous intensity Poisson patterns. Research highlights:We designed a new method to fit the Weibull distribution to forest inventory data from concentric plots that achieves high accuracy and precision in parameter estimates regardless of the within-plot spatial tree pattern.
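    A sketch of the truncated-likelihood idea behind a method like M1, with hypothetical dbh values and subplot thresholds (the inventories' actual data and truncation diameters are not reproduced here): each observed tree contributes f(x)/(1-F(τ)) to the likelihood, where τ is the truncation diameter of the subplot in which it was recorded.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import weibull_min

        # Hypothetical data: dbh (cm) and the truncation threshold of the subplot in which
        # each tree was recorded (trees below their threshold are never observed).
        dbh = np.array([8.2, 11.5, 14.3, 18.0, 22.7, 25.1, 31.4, 36.8])
        thresholds = np.array([7.5, 7.5, 12.5, 12.5, 12.5, 22.5, 22.5, 22.5])

        def neg_log_lik(params):
            k, lam = params
            if k <= 0.0 or lam <= 0.0:
                return np.inf
            log_f = weibull_min.logpdf(dbh, k, scale=lam)
            log_surv = weibull_min.logsf(thresholds, k, scale=lam)   # log(1 - F(threshold))
            return -np.sum(log_f - log_surv)

        res = minimize(neg_log_lik, x0=[1.5, 20.0], method="Nelder-Mead")
        print("Weibull shape and scale:", res.x)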

  19. Development of Testing Methodologies for the Mechanical Properties of MEMS

    Science.gov (United States)

    Ekwaro-Osire, Stephen

    2003-01-01

    This effort is to investigate and design testing strategies to determine the mechanical properties of MicroElectroMechanical Systems (MEMS) as well as investigate the development of a MEMS Probabilistic Design Methodology (PDM). One item of potential interest is the design of a test for the Weibull size effect in pressure membranes. The Weibull size effect is a consequence of a stochastic strength response predicted from the Weibull distribution. Confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS - similar to the GRC developed CARES/Life program for bulk ceramics. However, the primary area of investigation will most likely be analysis and modeling of material interfaces for strength as well as developing a strategy to handle stress singularities at sharp corners, filets, and material interfaces. This will be a continuation of the previous years work. The ultimate objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads.
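    The Weibull size effect mentioned here is the weakest-link scaling implied by the distribution (a standard relation, not a result stated in this record): at equal failure probability, the strengths of two specimens with effective stressed volumes V_1 and V_2 satisfy

        \sigma_1/\sigma_2 = \left(V_2/V_1\right)^{1/m},

    where m is the Weibull modulus, so larger pressure membranes should be measurably weaker if MEMS strength is indeed Weibull-controlled; confirming or refuting that scaling is the purpose of the size-effect test discussed above.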

  20. Distributed Analysis in CMS

    CERN Document Server

    Fanfani, Alessandra; Sanches, Jose Afonso; Andreeva, Julia; Bagliesi, Giusepppe; Bauerdick, Lothar; Belforte, Stefano; Bittencourt Sampaio, Patricia; Bloom, Ken; Blumenfeld, Barry; Bonacorsi, Daniele; Brew, Chris; Calloni, Marco; Cesini, Daniele; Cinquilli, Mattia; Codispoti, Giuseppe; D'Hondt, Jorgen; Dong, Liang; Dongiovanni, Danilo; Donvito, Giacinto; Dykstra, David; Edelmann, Erik; Egeland, Ricky; Elmer, Peter; Eulisse, Giulio; Evans, Dave; Fanzago, Federica; Farina, Fabio; Feichtinger, Derek; Fisk, Ian; Flix, Josep; Grandi, Claudio; Guo, Yuyi; Happonen, Kalle; Hernandez, Jose M; Huang, Chih-Hao; Kang, Kejing; Karavakis, Edward; Kasemann, Matthias; Kavka, Carlos; Khan, Akram; Kim, Bockjoo; Klem, Jukka; Koivumaki, Jesper; Kress, Thomas; Kreuzer, Peter; Kurca, Tibor; Kuznetsov, Valentin; Lacaprara, Stefano; Lassila-Perini, Kati; Letts, James; Linden, Tomas; Lueking, Lee; Maes, Joris; Magini, Nicolo; Maier, Gerhild; McBride, Patricia; Metson, Simon; Miccio, Vincenzo; Padhi, Sanjay; Pi, Haifeng; Riahi, Hassen; Riley, Daniel; Rossman, Paul; Saiz, Pablo; Sartirana, Andrea; Sciaba, Andrea; Sekhri, Vijay; Spiga, Daniele; Tuura, Lassi; Vaandering, Eric; Vanelderen, Lukas; Van Mulders, Petra; Vedaee, Aresh; Villella, Ilaria; Wicklund, Eric; Wildish, Tony; Wissing, Christoph; Wurthwein, Frank

    2009-01-01

    The CMS experiment expects to manage several PBytes of data each year during the LHC programme, distributing them over many computing sites around the world and enabling data access at those centers for analysis. CMS has identified the distributed sites as the primary location for physics analysis to support a wide community with thousands of potential users. This represents an unprecedented experimental challenge in terms of the scale of distributed computing resources and the number of users. An overview of the computing architecture, the software tools and the distributed infrastructure is reported. Summaries of the experience in establishing efficient and scalable operations to get prepared for CMS distributed analysis are presented, followed by the user experience in their current analysis activities.

  1. Failure analysis a practical guide for manufacturers of electronic components and systems

    CERN Document Server

    Bâzu, Marius

    2011-01-01

    Failure analysis is the preferred method to investigate product or process reliability and to ensure optimum performance of electrical components and systems. The physics-of-failure approach is the only internationally accepted solution for continuously improving the reliability of materials, devices and processes. The models have been developed from the physical and chemical phenomena that are responsible for degradation or failure of electronic components and materials and now replace popular distribution models for failure mechanisms such as Weibull or lognormal. Reliability engineers nee

  2. Analysis of Flexural Fatigue Strength of Self Compacting Fibre Reinforced Concrete Beams

    Science.gov (United States)

    Murali, G.; Sudar Celestina, J. P. Arul; Subhashini, N.; Vigneshwari, M.

    2017-07-01

    This study presents an extensive statistical investigation of the variations in flexural fatigue life of self-compacting fibrous concrete (FC) beams. For this purpose, the experimental data of earlier researchers were examined using the two-parameter Weibull distribution. Two methods, namely the graphical method and the method of moments, were used to analyse the variations in the experimental data, and the results have been presented in the form of probability of survival. The Weibull parameter values obtained from the graphical method and the method of moments are precise. At the 0.7 stress level, the fatigue life is 59861 cycles for a reliability of 90%.
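    A sketch of the method-of-moments estimation mentioned above, with illustrative fatigue lives rather than the data of the cited researchers: the coefficient of variation fixes the shape parameter, the mean then gives the scale, and any survival probability can be converted into a fatigue life.

        import numpy as np
        from scipy.special import gamma
        from scipy.optimize import brentq

        fatigue_life = np.array([42e3, 55e3, 61e3, 78e3, 93e3, 120e3, 160e3, 210e3])  # illustrative cycles

        mean, std = fatigue_life.mean(), fatigue_life.std(ddof=1)
        cv = std / mean

        def cv_mismatch(k):
            # Weibull coefficient of variation as a function of the shape parameter, minus the sample CV
            g1, g2 = gamma(1 + 1 / k), gamma(1 + 2 / k)
            return np.sqrt(g2 - g1 ** 2) / g1 - cv

        k = brentq(cv_mismatch, 0.1, 50.0)          # shape parameter from the CV
        c = mean / gamma(1 + 1 / k)                 # scale parameter from the mean

        life_90 = c * (-np.log(0.90)) ** (1 / k)    # fatigue life at 90 % probability of survival
        print(f"k = {k:.2f}, c = {c:.3g} cycles, N(90% survival) = {life_90:.3g} cycles")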

  3. Analytical and geometric characterization of the general methodology of determination of Weibull distributions for the wind regime and its applications

    Directory of Open Access Journals (Sweden)

    Luís R. A Gabriel Filho

    2011-02-01

    Full Text Available The wind regime of a region can be described by frequency distributions that provide the information and characteristics needed for a possible deployment of wind energy capture systems in the region and the resulting applications in rural areas of remote regions. These characteristics, such as the annual mean speed, the variance of the recorded speeds and the hourly mean wind power density, can be obtained from the frequency of occurrence of a given speed, which in turn should be studied through analytical expressions. The analytical function most suitable for wind distributions is the Weibull density function, which can be determined by numerical methods and linear regressions. The objective of this work is to characterize, analytically and geometrically, all the methodological procedures needed to carry out a complete characterization of the wind regime of a region and its applications in the region of Botucatu - SP, Brazil, aiming to determine the energy potential for the installation of wind turbines. It was thus possible to establish theorems related to the way the wind regime is characterized, setting out a concise analytical methodology for defining the wind parameters of any region to be studied. A CAMPBELL anemometer was used for the development of this research.

  4. An analysis of confidence limit calculations used in AAPM Task Group No. 119

    International Nuclear Information System (INIS)

    Knill, Cory; Snyder, Michael

    2011-01-01

    Purpose: The report issued by AAPM Task Group No. 119 outlined a procedure for evaluating the effectiveness of IMRT commissioning. The procedure involves measuring gamma pass-rate indices for IMRT plans of standard phantoms and determining if the results fall within a confidence limit set by assuming normally distributed data. As stated in the TG report, the assumption of normally distributed gamma pass rates is a convenient approximation for commissioning purposes, but may not accurately describe the data. Here the authors attempt to better describe gamma pass-rate data by fitting it to different distributions. The authors then calculate updated confidence limits using those distributions and compare them to those derived using TG No. 119 method. Methods: Gamma pass-rate data from 111 head and neck patients are fitted using the TG No. 119 normal distribution, a truncated normal distribution, and a Weibull distribution. Confidence limits to 95% are calculated for each and compared. A more general analysis of the expected differences between the TG No. 119 method of determining confidence limits and a more time-consuming curve fitting method is performed. Results: The TG No. 119 standard normal distribution does not fit the measured data. However, due to the small range of measured data points, the inaccuracy of the fit has only a small effect on the final value of the confidence limits. The confidence limits for the 111 patient plans are within 0.1% of each other for all distributions. The maximum expected difference in confidence limits, calculated using TG No. 119's approximation and a truncated distribution, is 1.2%. Conclusions: A three-parameter Weibull probability distribution more accurately fits the clinical gamma index pass-rate data than the normal distribution adopted by TG No. 119. However, the sensitivity of the confidence limit on distribution fit is low outside of exceptional circumstances.

  5. Failure rate and reliability of the KOMATSU hydraulic excavator in surface limestone mine

    Science.gov (United States)

    Harish Kumar N., S.; Choudhary, R. P.; Murthy, Ch. S. N.

    2018-04-01

    A model with a bathtub-shaped failure rate function is helpful in the reliability analysis of any system, and particularly in reliability-related preventive maintenance. The usual Weibull distribution is, however, not capable of modelling the complete lifecycle of a system with a bathtub-shaped failure rate function. In this paper, the failure rate and reliability analysis of the KOMATSU hydraulic excavator/shovel in a surface mine is presented, with the aim of improving the reliability and decreasing the failure rate of each subsystem of the shovel through preventive maintenance. The bathtub-shaped model for the shovel can also be seen as a simplification of the Weibull distribution.
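    The limitation noted here follows directly from the Weibull hazard (failure rate) function,

        h(t) = \frac{\beta}{\eta}\left(\frac{t}{\eta}\right)^{\beta - 1},

    which is monotonically decreasing for \beta < 1, constant for \beta = 1 and increasing for \beta > 1; a single two-parameter Weibull can therefore reproduce only one branch of the bathtub curve at a time, which is why modified or piecewise Weibull models are used when the whole lifecycle must be covered.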

  6. Use of the Weibull function to evaluate the seedling emergence of Albizia lebbeck (L.) Benth.

    Directory of Open Access Journals (Sweden)

    Marlen Navarro

    Full Text Available With the objective of determining the vigour of Albizia lebbeck seeds by evaluating seedling emergence through the modified Weibull function, sowing was carried out under three environmental conditions and at different seed storage times. The design was completely randomized, with a factorial arrangement. Analysis of variance was performed for the parameters M (maximum cumulative emergence), k (emergence rate) and Z (delay to the onset of emergence) of the modified Weibull function. From six months of storage onwards (44.1%), a sharp loss in the percentage M was observed in the nursery (A), and slight variations in the cabin (C), in comparison with A and B (shade house). The dispersion range of the parameter k was 0.4-2.6, 0.29-1.9 and 0.5-1.4% emergence per day for the evaluations carried out in A, B and C, respectively. From the analysis of Z it was interpreted that the time to the onset of emergence, regardless of the sowing environment, fell between 3.0 and 7.3 days after sowing. In the full-sun nursery, in the evaluation at 6 months of storage, the best results for the biological parameters of the Weibull equation were obtained, which allowed an overall analysis indicating a high degree of vigour in the A. lebbeck seeds in comparison with the remaining evaluations.

  7. Comparison of susceptibility to pitting corrosion of AA2024-T4, AA7075-T651 and AA7475-T761 aluminium alloys in neutral chloride solutions using electrochemical noise analysis

    International Nuclear Information System (INIS)

    Na, Kyung-Hwan; Pyun, Su-Il

    2008-01-01

    The susceptibility to pitting corrosion of AA2024-T4, AA7075-T651 and AA7475-T761 aluminium alloys was investigated in aqueous neutral chloride solution for the purpose of comparison using electrochemical noise measurement. The experimentally measured electrochemical noises were analysed based upon the combined stochastic theory and shot-noise theory using the Weibull distribution function. From the occurrence of two linear regions on one Weibull probability plot, it was suggested that there existed two stochastic processes of uniform corrosion and pitting corrosion; pitting corrosion was distinguished from uniform corrosion in terms of the frequency of events in the stochastic analysis. Accordingly, the present analysis method allowed us to investigate pitting corrosion independently. The susceptibility to pitting corrosion was appropriately evaluated by determining pit embryo formation rate in the stochastic analysis. The susceptibility was decreased in the following order: AA2024-T4 (the naturally aged condition), AA7475-T761 (the overaged condition) and AA7075-T651 (the near-peak-aged condition)

  8. Effects of specimen size on the flexural strength and Weibull modulus of nuclear graphite IG-110, NBG-18, and PCEA

    International Nuclear Information System (INIS)

    Chi, Se-Hwan

    2015-01-01

    Changes in flexural strength and Weibull modulus due to specimen size were investigated for three nuclear graphite grades, IG-110, NBG-18, and PCEA, using four-point 1/3-point (4-1/3) loading with specimens of three different sizes: 3.18 (Thickness) × 6.35 (Width) × 50.8 (Length), 6.50 (T) × 12.0 (W) × 52.0 (L), and 18.0 (T) × 16.0 (W) × 64 (L) mm (total: 210 specimens). The results showed that the specimen size effects were grade dependent: while NBG-18 showed rather significant specimen size effects (a 37% difference between the 3 T and 18 T specimens), the differences in IG-110 and PCEA were 7.6–15%. The maximum differences in flexural strength due to specimen size were larger in PCEA and NBG-18, which have larger coke particles (medium grain size: >300 μm), than in IG-110 with its super-fine coke particle size (25 μm). The Weibull modulus showed a data population dependency, in that it decreased with an increasing number of data points used for modulus determination. A good correlation between the fracture surface roughness and the flexural strength was confirmed.

  9. determination of weibull parameters and analysis of wind power

    African Journals Online (AJOL)

    HOD

    shape parameter (k) and the scale factor (c) were obtained to be 6.7 m/s and 4.3 m/s, 0.91 MW and 0.25 MW, k ~ 5.4 and 2.1, and c ... China, the forecast is not different as the report of the ... "Distribution for Wind Energy Analysis", J. Wind Eng.

  10. WEIBULL MULTIPLICATIVE MODEL AND MACHINE LEARNING MODELS FOR FULL-AUTOMATIC DARK-SPOT DETECTION FROM SAR IMAGES

    Directory of Open Access Journals (Sweden)

    A. Taravat

    2013-09-01

    Full Text Available As a major aspect of marine pollution, oil release into the sea has serious biological and environmental impacts. Among remote sensing systems (which offer a non-destructive investigation method), synthetic aperture radar (SAR) can provide valuable synoptic information about the position and size of an oil spill due to its wide area coverage and day/night, all-weather capabilities. In this paper we present a new automated method for oil-spill monitoring. The new approach is based on the combination of a Weibull Multiplicative Model and machine learning techniques to differentiate between dark spots and the background. First, the filter created based on the Weibull Multiplicative Model is applied to each sub-image. Second, the sub-image is segmented by two different neural network techniques (Pulse Coupled Neural Networks and Multilayer Perceptron Neural Networks). As the last step, a very simple filtering process is used to eliminate the false targets. The proposed approaches were tested on 20 ENVISAT and ERS2 images which contained dark spots. The same parameters were used in all tests. For the overall dataset, average accuracies of 94.05% and 95.20% were obtained for the PCNN and MLP methods, respectively. The average computational time for dark-spot detection with a 256 × 256 image is about 4 s for PCNN segmentation using IDL software, which is the fastest in this field at present. Our experimental results demonstrate that the proposed approach is very fast, robust and effective. The proposed approach can be applied to future spaceborne SAR images.

  11. Weibull Multiplicative Model and Machine Learning Models for Full-Automatic Dark-Spot Detection from SAR Images

    Science.gov (United States)

    Taravat, A.; Del Frate, F.

    2013-09-01

    As a major aspect of marine pollution, oil release into the sea has serious biological and environmental impacts. Among remote sensing systems (which offer a non-destructive investigation method), synthetic aperture radar (SAR) can provide valuable synoptic information about the position and size of an oil spill due to its wide area coverage and day/night, all-weather capabilities. In this paper we present a new automated method for oil-spill monitoring. The new approach is based on the combination of a Weibull Multiplicative Model and machine learning techniques to differentiate between dark spots and the background. First, the filter created based on the Weibull Multiplicative Model is applied to each sub-image. Second, the sub-image is segmented by two different neural network techniques (Pulse Coupled Neural Networks and Multilayer Perceptron Neural Networks). As the last step, a very simple filtering process is used to eliminate the false targets. The proposed approaches were tested on 20 ENVISAT and ERS2 images which contained dark spots. The same parameters were used in all tests. For the overall dataset, average accuracies of 94.05 % and 95.20 % were obtained for the PCNN and MLP methods, respectively. The average computational time for dark-spot detection with a 256 × 256 image is about 4 s for PCNN segmentation using IDL software, which is the fastest in this field at present. Our experimental results demonstrate that the proposed approach is very fast, robust and effective. The proposed approach can be applied to future spaceborne SAR images.

  12. Handling of computational in vitro/in vivo correlation problems by Microsoft Excel II. Distribution functions and moments.

    Science.gov (United States)

    Langenbucher, Frieder

    2003-01-01

    MS Excel is a useful tool to handle in vitro/in vivo correlation (IVIVC) distribution functions, with emphasis on the Weibull and the biexponential distribution, which are most useful for the presentation of cumulative profiles, e.g. release in vitro or urinary excretion in vivo, and differential profiles such as the plasma response in vivo. The discussion includes moments (AUC and mean) as summarizing statistics, and data-fitting algorithms for parameter estimation.
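    In a commonly used parameterization (the article's own notation may differ), the Weibull function applied to a cumulative release or excretion profile is

        F(t) = F_\infty \left[ 1 - \exp\!\left( -\left( \frac{t - T_{lag}}{T_d} \right)^{\beta} \right) \right],

    where F_\infty is the plateau value, T_{lag} a lag time, T_d the time at which 63.2% of the plateau is reached and \beta the shape parameter; an expression of this form is straightforward to evaluate cell by cell in a spreadsheet and to fit by least squares.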

  13. Multi-choice stochastic transportation problem involving general form of distributions.

    Science.gov (United States)

    Quddoos, Abdul; Ull Hasan, Md Gulzar; Khalid, Mohammad Masood

    2014-01-01

    Many authors have presented studies of the multi-choice stochastic transportation problem (MCSTP) where the availability and demand parameters follow a particular probability distribution (such as the exponential, Weibull, Cauchy or extreme value distribution). In this paper an MCSTP is considered where the availability and demand parameters follow a general form of distribution, and a generalized equivalent deterministic model (GMCSTP) of the MCSTP is obtained. It is also shown that all previous models obtained by different authors can be deduced with the help of the GMCSTP. MCSTPs with Pareto, power function or Burr-XII distributions are also considered and equivalent deterministic models are obtained. To illustrate the proposed model two numerical examples are presented and solved using the LINGO 13.0 software package.

  14. Reliability Analysis and Overload Capability Assessment of Oil-Immersed Power Transformers

    Directory of Open Access Journals (Sweden)

    Chen Wang

    2016-01-01

    Full Text Available Smart grids have been constructed in recent years to guarantee the security and stability of the power grid. Power transformers are among the most vital components in the complicated smart grid network. Any transformer failure can cause damage to the whole power system, and among such failures those caused by overloading cannot be ignored. This research gives a new insight into the overload capability assessment of transformers. The hot-spot temperature of the winding is the most critical factor in measuring the overload capacity of power transformers. Thus, the hot-spot temperature is calculated to obtain the duration running time of the power transformers under overloading conditions. Then the overloading probability is fitted with the mature and widely accepted Weibull probability density function. To guarantee the accuracy of this fitting, a new objective function is proposed to obtain the desired parameters of the Weibull distribution. In addition, ten different mutation scenarios are adopted in the differential evolution algorithm to optimize the parameters of the Weibull distribution. The final comprehensive overload capability of the power transformer is assessed by the duration running time as well as the overloading probability. Compared with previous studies that take no account of the overloading probability, the assessment results obtained in this research are much more reliable.
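    A sketch of the general mechanics of fitting Weibull parameters with differential evolution, using illustrative overload durations and a plain least-squares objective against the empirical CDF (the paper's own objective function is new and is not reproduced here):

        import numpy as np
        from scipy.optimize import differential_evolution
        from scipy.stats import weibull_min

        overload_hours = np.sort(np.array([1.2, 2.5, 3.1, 4.8, 5.0, 6.7, 8.9, 12.4]))  # illustrative
        ecdf = (np.arange(1, len(overload_hours) + 1) - 0.5) / len(overload_hours)

        def objective(params):
            # Squared error between the candidate Weibull CDF and the empirical CDF
            k, lam = params
            return np.sum((weibull_min.cdf(overload_hours, k, scale=lam) - ecdf) ** 2)

        result = differential_evolution(objective, bounds=[(0.1, 10.0), (0.1, 50.0)], seed=1)
        print("Weibull shape and scale:", result.x)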

  15. Historical floods in flood frequency analysis: Is this game worth the candle?

    Science.gov (United States)

    Strupczewski, Witold G.; Kochanek, Krzysztof; Bogdanowicz, Ewa

    2017-11-01

    In flood frequency analysis (FFA) the profit from inclusion of historical information on the largest historical pre-instrumental floods depends primarily on the reliability of the information, i.e. the accuracy of the magnitude and return period of the floods. This study is focused on the possible theoretical maximum gain in accuracy of estimates of upper quantiles that can be obtained by incorporating the largest historical floods of known return periods into the FFA. We assumed a simple case: N years of systematic records of annual maximum flows and either one largest (XM1) or two largest (XM1 and XM2) flood peak flows in a historical M-year long period. The problem is explored by Monte Carlo simulations with the maximum likelihood (ML) method. Both correct and false distributional assumptions are considered. In the first case the two-parameter extreme value models (Gumbel, log-Gumbel, Weibull) with various coefficients of variation serve as parent distributions. In the case of unknown parent distribution, the Weibull distribution was assumed as the estimating model and the truncated Gumbel as the parent distribution. The return periods of XM1 and XM2 are determined from the parent distribution. The results are then compared with the case when the return periods of XM1 and XM2 are defined by their plotting positions. The results are presented in terms of bias, root mean square error and the probability of overestimation of the quantile with 100-year return period. The results of the research indicate that the maximal profit of inclusion of pre-instrumental floods in the FFA may prove smaller than the cost of reconstruction of historical hydrological information.

  16. A Weibull model to describe antimicrobial kinetics of oregano and lemongrass essential oils against Salmonella Enteritidis in ground beef during refrigerated storage.

    Science.gov (United States)

    de Oliveira, Thales Leandro Coutinho; Soares, Rodrigo de Araújo; Piccoli, Roberta Hilsdorf

    2013-03-01

    The antimicrobial effect of oregano (Origanum vulgare L.) and lemongrass (Cymbopogon citratus (DC.) Stapf.) essential oils (EOs) against Salmonella enterica serotype Enteritidis was evaluated in in vitro experiments and in inoculated ground bovine meat during refrigerated storage (4±2 °C) for 6 days. The Weibull model was tested to fit the survival/inactivation bacterial curves (estimating the p and δ parameters). The minimum inhibitory concentration (MIC) value for both EOs on S. Enteritidis was 3.90 μl/ml. The EO concentrations applied in the ground beef were 3.90, 7.80 and 15.60 μl/g, based on the MIC levels and on the possible reduction of activity by food constituents. Both evaluated EOs, at all tested levels, showed antimicrobial effects, with microbial populations decreasing (p≤0.05) over the storage time. Based on the fit-quality parameters (RSS and RSE), the Weibull model is able to describe the inactivation curves of the EOs against S. Enteritidis. The application of EOs in processed meats can be used to control pathogens during refrigerated shelf-life. Copyright © 2012 Elsevier Ltd. All rights reserved.
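    The p and δ parameters estimated here are usually those of the Weibullian survival model, commonly written as

        \log_{10}\frac{N(t)}{N_0} = -\left(\frac{t}{\delta}\right)^{p},

    where \delta is the time required for the first decimal (tenfold) reduction and p sets the curve shape (p < 1 gives upward concavity, i.e. tailing; p > 1 gives downward concavity, i.e. a shoulder); this is the common form, and the authors' exact parameterization is not reproduced in this record.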

  17. Sensitivity Weaknesses in Application of some Statistical Distribution in First Order Reliability Methods

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Enevoldsen, I.

    1993-01-01

    It has been observed and shown that in some examples a sensitivity analysis of the first-order reliability index results in an increasing reliability index when the standard deviation of a stochastic variable is increased while the expected value is fixed. This unfortunate behaviour can occur when...... a stochastic variable is modelled by an asymmetrical density function. For lognormally, Gumbel- and Weibull-distributed stochastic variables it is shown for which combinations of the β-point, the expected value and the standard deviation the weakness can occur. In relation to practical application the behaviour...... is probably rather infrequent. A simple example is shown as illustration and to exemplify that for second-order reliability methods and for exact calculations of the probability of failure this behaviour is much more infrequent....

  18. Reliability analysis of component of affination centrifugal 1 machine by using reliability engineering

    Science.gov (United States)

    Sembiring, N.; Ginting, E.; Darnello, T.

    2017-12-01

    A problem that appears in a company producing refined sugar is that the production floor has not reached the required level of critical machine availability because the machines often suffer damage (breakdowns). This results in sudden losses of production time and production opportunities. This problem can be addressed with the reliability engineering method, in which a statistical approach to the historical damage data is used to identify the pattern of the distribution. The method can provide the reliability, the failure rate, and the availability level of a machine over the scheduled maintenance interval. The distribution test on the time-between-failures (MTTF) data gives a lognormal distribution for the flexible hose component and a Weibull distribution for the teflon cone lifting component. From the distribution test on the repair time (MTTR) data, the flexible hose component follows an exponential distribution and the teflon cone lifting component a Weibull distribution. For the flexible hose component, the actual replacement schedule of every 720 hours gives a reliability of 0.2451 and an availability of 0.9960, while for the critical teflon cone lifting component the actual replacement schedule of every 1944 hours gives a reliability of 0.4083 and an availability of 0.9927.

  19. Statistical investigation of the crack initiation lives of piping structural welded joint in low cycle fatigue test of 240 degree C

    International Nuclear Information System (INIS)

    Zhao Yongxiang; Gao Qing; Cai Lixun

    1999-01-01

    A statistical investigation into the fitting of four possible assumed fatigue distributions (the three-parameter Weibull, two-parameter Weibull, lognormal and extreme maximum value distributions) to the crack initiation lives of a piping structural welded joint in low cycle fatigue tests at 240 °C is performed by linear regression and least squares methods. The results reveal that the three-parameter Weibull distribution may give misleading results in fatigue reliability analysis because the shape parameter is often less than 1. This means that the failure rate decreases with fatigue cycling, which is contrary to the general understanding of the behaviour of a welded joint. Reliability analyses may also be affected by the slightly non-conservative evaluations in the tail regions of this distribution. The other three distributions are slightly poorer in their overall fit, but they can be safely assumed in reliability analyses, given that their non-conservative evaluations are confined mostly to the tail regions and that they are consistent with the fatigue physics of the structural behaviour of the welded joint in the range of engineering practice. In addition, the extreme maximum value distribution is in good agreement with the general physical understanding of the structural behaviour of the welded joint.

  20. Distributed analysis at LHCb

    International Nuclear Information System (INIS)

    Williams, Mike; Egede, Ulrik; Paterson, Stuart

    2011-01-01

    The distributed analysis experience to date at LHCb has been positive: job success rates are high and wait times for high-priority jobs are low. LHCb users access the grid using the GANGA job-management package, while the LHCb virtual organization manages its resources using the DIRAC package. This clear division of labor has benefitted LHCb and its users greatly; it is a major reason why distributed analysis at LHCb has been so successful. The newly formed LHCb distributed analysis support team has also proved to be a success.

  1. Comprehensive evaluation of wind speed distribution models: A case study for North Dakota sites

    International Nuclear Information System (INIS)

    Zhou Junyi; Erdem, Ergin; Li Gong; Shi Jing

    2010-01-01

    Accurate analysis of long-term wind data is critical to the estimation of wind energy potential for a candidate location and its nearby area. Investigating the wind speed distribution is one critical task for this purpose. This paper presents a comprehensive evaluation of probability density functions for the wind speed data from five representative sites in North Dakota. Besides the popular Weibull and Rayleigh distributions, we also include other distributions such as the gamma, lognormal, inverse Gaussian, and maximum entropy principle (MEP) derived probability density functions (PDFs). Six goodness-of-fit (GOF) statistics are used to determine the appropriate distributions for the wind speed data for each site. It is found that no particular distribution outperforms the others for all five sites, while the Rayleigh distribution performs poorly for most of the sites. As with the other models, the performance of MEP-derived PDFs in fitting wind speed data varies from site to site. Also, the results demonstrate that MEP-derived PDFs are flexible and have the potential to capture other possible distribution patterns of wind speed data. Meanwhile, different GOF statistics may generate inconsistent ranking orders of fit performance among the candidate PDFs. In addition, one comprehensive metric that combines all individual statistics is proposed to rank the overall performance of the chosen statistical distributions.

  2. Wind energy potential in Peshawar, Pakistan

    International Nuclear Information System (INIS)

    Nasir, S.M.; Raza, S.M.

    1994-01-01

    Hourly wind data at Peshawar airport, received from the Headquarters, Pakistan Air Force, have been used to determine the diurnal variations, speed duration and speed frequency curves. The applicability of the Weibull distribution is then tested against the observed probability density function, which shows that the Weibull distribution fits the wind data satisfactorily and with good precision, provided the observations of calm spells are omitted. Our analysis shows that the monthly mean wind speed and wind power vary from 0.6 to 2.0 m/s and from 0.2 to 4.0 W/m², respectively, giving fair prospects for wind power applications over the summer months. (author)

  3. Handbook of exponential and related distributions for engineers and scientists

    CERN Document Server

    Pal, Nabendu; Lim, Wooi K

    2005-01-01

    The normal distribution is widely known and used by scientists and engineers. However, there are many cases when the normal distribution is not appropriate, due to the data being skewed. Rather than leaving you to search through journal articles, advanced theoretical monographs, or introductory texts for alternative distributions, the Handbook of Exponential and Related Distributions for Engineers and Scientists provides a concise, carefully selected presentation of the properties and principles of selected distributions that are most useful for application in the sciences and engineering.The book begins with all the basic mathematical and statistical background necessary to select the correct distribution to model real-world data sets. This includes inference, decision theory, and computational aspects including the popular Bootstrap method. The authors then examine four skewed distributions in detail: exponential, gamma, Weibull, and extreme value. For each one, they discuss general properties and applicabi...

  4. The stochastic distribution of available coefficient of friction for human locomotion of five different floor surfaces.

    Science.gov (United States)

    Chang, Wen-Ruey; Matz, Simon; Chang, Chien-Chi

    2014-05-01

    The maximum coefficient of friction that can be supported at the shoe and floor interface without a slip is usually called the available coefficient of friction (ACOF) for human locomotion. The probability of a slip could be estimated using a statistical model by comparing the ACOF with the required coefficient of friction (RCOF), assuming that both coefficients have stochastic distributions. An investigation of the stochastic distributions of the ACOF of five different floor surfaces under dry, water and glycerol conditions is presented in this paper. One hundred friction measurements were performed on each floor surface under each surface condition. The Kolmogorov-Smirnov goodness-of-fit test was used to determine if the distribution of the ACOF was a good fit with the normal, log-normal and Weibull distributions. The results indicated that the ACOF distributions had a slightly better match with the normal and log-normal distributions than with the Weibull in only three out of 15 cases with a statistical significance. The results are far more complex than what had heretofore been published and different scenarios could emerge. Since the ACOF is compared with the RCOF for the estimate of slip probability, the distribution of the ACOF in seven cases could be considered a constant for this purpose when the ACOF is much lower or higher than the RCOF. A few cases could be represented by a normal distribution for practical reasons based on their skewness and kurtosis values without a statistical significance. No representation could be found in three cases out of 15. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.
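
    Since the slip probability is defined by comparing the ACOF with the RCOF, a minimal Monte Carlo sketch of that comparison is given below; the normal distributions and parameter values are hypothetical and do not come from the study.

    # Illustrative sketch (not the paper's model): if ACOF and RCOF are both treated
    # as random variables, the slip probability can be estimated as P(ACOF < RCOF).
    # Parameter values below are hypothetical.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n = 100_000
    acof = stats.norm(loc=0.45, scale=0.05).rvs(n, random_state=rng)   # available COF
    rcof = stats.norm(loc=0.25, scale=0.04).rvs(n, random_state=rng)   # required COF

    p_slip = np.mean(acof < rcof)
    print(f"estimated slip probability: {p_slip:.4f}")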

  5. Statistical aspects of fatigue crack growth life of base metal, weld metal and heat affected zone in FSWed 7075-T651 aluminum alloy

    International Nuclear Information System (INIS)

    Sohn, Hye Jeong; Haryadi, Gunawan Dwi; Kim, Seon Jin

    2014-01-01

    The statistical aspects of the fatigue crack growth life of base metal (BM), weld metal (WM) and heat affected zone (HAZ) specimens in friction stir welded (FSWed) 7075-T651 aluminum alloy have been studied by Weibull statistical analysis. The fatigue crack growth tests were performed at room temperature on ASTM standard CT specimens under three different constant stress intensity factor range controls. The main objective of this paper is to investigate how the stress intensity factor range and the material zone (BM, WM and HAZ) affect the statistical aspects of fatigue crack growth life. In this work, the Weibull distribution was employed to estimate the statistical aspects of fatigue crack growth life. The shape parameter of the Weibull distribution for fatigue crack growth life was significantly affected by the material zone and the stress intensity factor range. The scale parameter of the WM specimen exhibited the lowest value at all stress intensity factor ranges.

  6. Assessing the Adequacy of Probability Distributions for Estimating the Extreme Events of Air Temperature in Dabaa Region

    International Nuclear Information System (INIS)

    El-Shanshoury, Gh.I.

    2015-01-01

    Assessing the adequacy of probability distributions for estimating the extreme events of air temperature in the Dabaa region is one of the prerequisites for any design purpose at the Dabaa site, which can be achieved by a probability approach. In the present study, three extreme value distributions are considered and compared to estimate the extreme events of monthly and annual maximum and minimum temperature. These distributions include the Gumbel/Frechet distributions for estimating the extreme maximum values and the Gumbel/Weibull distributions for estimating the extreme minimum values. The Lieblein technique and the Method of Moments are applied for estimating the distribution parameters. Subsequently, the required design values with a given return period of exceedance are obtained. Goodness-of-fit tests involving Kolmogorov-Smirnov and Anderson-Darling are used for checking the adequacy of fitting the method/distribution for the estimation of maximum/minimum temperature. Mean Absolute Relative Deviation, Root Mean Square Error and Relative Mean Square Deviation are calculated, as performance indicators, to judge which distribution and method of parameter estimation are the most appropriate for estimating the extreme temperatures. The present study indicated that the Weibull distribution combined with Method of Moments estimators gives the best fit and the most reliable, accurate predictions for estimating the extreme monthly and annual minimum temperature. The Gumbel distribution combined with Method of Moments estimators showed the best fit and accurate predictions for the estimation of the extreme monthly and annual maximum temperature, except for July, August, October and November. The study shows that the combination of the Frechet distribution with the Method of Moments is the most accurate for estimating the extreme maximum temperature in July, August and November, while the Gumbel distribution with the Lieblein technique is the best for October
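
    As an illustration of the Method of Moments fit mentioned above, the short sketch below estimates Gumbel parameters from a simulated annual-maximum temperature series and computes a design value for a chosen return period; all numbers are hypothetical, not the Dabaa data.

    # Minimal sketch of the Method-of-Moments fit for a Gumbel (maxima) distribution
    # and the design value for a chosen return period; the annual-maximum series is
    # simulated here for illustration.
    import numpy as np

    rng = np.random.default_rng(1)
    annual_max_temp = rng.gumbel(loc=42.0, scale=1.8, size=40)   # hypothetical annual maxima (degC)

    gamma_e = 0.5772156649                      # Euler-Mascheroni constant
    scale = annual_max_temp.std(ddof=1) * np.sqrt(6.0) / np.pi
    loc = annual_max_temp.mean() - gamma_e * scale

    T = 50.0                                    # return period in years
    design_value = loc - scale * np.log(-np.log(1.0 - 1.0 / T))
    print(f"Gumbel MoM: loc={loc:.2f}, scale={scale:.2f}, {T:.0f}-yr value={design_value:.2f} degC")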

  7. CARES/PC - CERAMICS ANALYSIS AND RELIABILITY EVALUATION OF STRUCTURES

    Science.gov (United States)

    Szatmary, S. A.

    1994-01-01

    The beneficial properties of structural ceramics include their high-temperature strength, light weight, hardness, and corrosion and oxidation resistance. For advanced heat engines, ceramics have demonstrated functional abilities at temperatures well beyond the operational limits of metals. This is offset by the fact that ceramic materials tend to be brittle. When a load is applied, their lack of significant plastic deformation causes the material to crack at microscopic flaws, destroying the component. CARES/PC performs statistical analysis of data obtained from the fracture of simple, uniaxial tensile or flexural specimens and estimates the Weibull and Batdorf material parameters from this data. CARES/PC is a subset of the program CARES (COSMIC program number LEW-15168) which calculates the fast-fracture reliability or failure probability of ceramic components utilizing the Batdorf and Weibull models to describe the effects of multi-axial stress states on material strength. CARES additionally requires that the ceramic structure be modeled by a finite element program such as MSC/NASTRAN or ANSYS. The more limited CARES/PC does not perform fast-fracture reliability estimation of components. CARES/PC estimates ceramic material properties from uniaxial tensile or from three- and four-point bend bar data. In general, the parameters are obtained from the fracture stresses of many specimens (30 or more are recommended) whose geometry and loading configurations are held constant. Parameter estimation can be performed for single or multiple failure modes by using the least-squares analysis or the maximum likelihood method. Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit tests measure the accuracy of the hypothesis that the fracture data comes from a population with a distribution specified by the estimated Weibull parameters. Ninety-percent confidence intervals on the Weibull parameters and the unbiased value of the shape parameter for complete samples are provided

  8. Assessment of wind characteristics for energy generation

    Energy Technology Data Exchange (ETDEWEB)

    Koray Ulgen [Ege University, Izmir (Turkey). Solar Energy Institute; Asir Genc [Selcuk University, Konya (Turkey). Dept. of Statistics; Arif Hepbasli [Ege University, Izmir (Turkey). Dept. of Mechanical Engineering; Galip Oturanc [Selcuk University, Konya (Turkey). Dept. of Mathematics

    2004-11-15

    Wind technology in Turkey has gained considerable maturity over the last five years, and wind energy projects are becoming commercially attractive in the country. In practice, it is essential to describe the variation of wind speeds in order to optimize the design of the systems, resulting in lower energy generating costs. The wind variation for a typical site is usually described using the so-called Weibull distribution. In this study, the two Weibull parameters of the wind speed distribution function, the shape parameter k (dimensionless) and the scale parameter c (m/s), were computed from the wind speed data for Aksehir in Konya, located in Central Anatolia in Turkey (latitude: 38.35° and longitude: 31.42°). Wind data, consisting of hourly wind speed records over a 6 year period, 1997-2002, were obtained from the Aksehir State Meteorological Station. Based on the experimental data, it was found that the numerical values of both Weibull parameters (k and c) for Aksehir vary over a wide range. The yearly values of k range from 1.756 to 2.076, while those of c are in the range of 2.956 to 3.444 m/s. Average seasonal Weibull distributions for Aksehir are given. The wind speed distributions are represented by the Weibull distribution and also by the Rayleigh distribution, a special case of the Weibull distribution with k = 2. The Rayleigh distribution is found to be suitable to represent the actual probability of wind speed data for the site studied. (author)
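
    A minimal sketch of estimating k and c from hourly wind speeds, and of the Rayleigh special case (k = 2), is given below; the sample is simulated and the code is illustrative rather than the authors' procedure.

    # Sketch: estimate the Weibull shape k and scale c (m/s) from hourly wind speeds
    # by maximum likelihood, and compare with the Rayleigh special case (k fixed at 2).
    # The sample is simulated; a real analysis would read the measured records.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    speeds = rng.weibull(1.9, size=8760) * 3.2          # stand-in for one year of hourly data

    k, _, c = stats.weibull_min.fit(speeds, floc=0)     # returns (shape, loc, scale)
    print(f"Weibull fit:  k = {k:.3f}, c = {c:.3f} m/s")

    # Rayleigh distribution = Weibull with k = 2; only the scale is estimated.
    _, sigma = stats.rayleigh.fit(speeds, floc=0)
    print(f"Rayleigh fit: scale = {sigma:.3f} m/s (equivalent c = {sigma * np.sqrt(2):.3f} m/s)")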

  9. A study of optimization problem for amplify-and-forward relaying over weibull fading channels

    KAUST Repository

    Ikki, Salama Said

    2010-09-01

    This paper addresses the power allocation and relay positioning problems in amplify-and-forward cooperative networks operating in Weibull fading environments. We study adaptive power allocation (PA) with fixed relay location, optimal relay location with fixed power allocation, and joint optimization of the PA and relay location under a total transmit power constraint, in order to minimize the outage probability and average error probability at high signal-to-noise ratios (SNR). Analytical results are validated by numerical simulations, and comparisons between the different optimization schemes and their performance are provided. Results show that optimum PA brings only a coding gain, while optimum relay location yields diversity gains in addition to coding gains. Joint optimization improves both the diversity gain and the coding gain. Furthermore, results illustrate that the analyzed adaptive algorithms outperform uniform schemes. ©2010 IEEE.

  10. Reliability analysis for wind turbines with incomplete failure data collected from after the date of initial installation

    International Nuclear Information System (INIS)

    Guo Haitao; Watson, Simon; Tavner, Peter; Xiang Jiangping

    2009-01-01

    Reliability has an impact on wind energy project costs and benefits. Both life test data and field failure data can be used for reliability analysis. In the wind energy industry, wind farm operators have great interest in recording wind turbine operating data. However, field failure data may be tainted or incomplete, so a more general mathematical model, and algorithms to solve it, are needed. The aim of this paper is to provide a solution to this problem. A three-parameter Weibull failure rate function is discussed for wind turbines and the parameters are estimated by maximum likelihood and least squares. Two populations of German and Danish wind turbines are analyzed. The traditional Weibull failure rate function is also employed for comparison. Analysis shows that the three-parameter Weibull function can describe the reliability growth of wind turbines more accurately. This work will be helpful in understanding the reliability growth of wind energy systems as wind energy technologies evolve. The proposed three-parameter Weibull function is also applicable to life tests of components that have been used for a period of time, not only in wind energy but also in other industries
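
    The three-parameter (threshold) Weibull fit discussed above can be sketched as follows with maximum likelihood estimation; the simulated failure times and starting assumptions are illustrative only, not the German or Danish turbine data.

    # Sketch of a three-parameter Weibull fit (shape, location/threshold, scale) to
    # failure times by maximum likelihood, compared with the usual two-parameter fit.
    # Failure times are simulated; field data would replace them.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    failures = 500.0 + rng.weibull(1.4, size=60) * 2000.0        # hours, with a threshold

    shape3, loc3, scale3 = stats.weibull_min.fit(failures)       # 3-parameter: loc is free
    shape2, _, scale2 = stats.weibull_min.fit(failures, floc=0)  # 2-parameter: loc fixed at 0

    print(f"3-parameter: shape={shape3:.2f}, threshold={loc3:.0f} h, scale={scale3:.0f} h")
    print(f"2-parameter: shape={shape2:.2f}, scale={scale2:.0f} h")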

  11. A distributed delay approach for modeling delayed outcomes in pharmacokinetics and pharmacodynamics studies.

    Science.gov (United States)

    Hu, Shuhua; Dunlavey, Michael; Guzy, Serge; Teuscher, Nathan

    2018-04-01

    A distributed delay approach was proposed in this paper to model delayed outcomes in pharmacokinetics and pharmacodynamics studies. This approach was shown to be general enough to incorporate a wide array of pharmacokinetic and pharmacodynamic models as special cases including transit compartment models, effect compartment models, typical absorption models (either zero-order or first-order absorption), and a number of atypical (or irregular) absorption models (e.g., parallel first-order, mixed first-order and zero-order, inverse Gaussian, and Weibull absorption models). Real-life examples were given to demonstrate how to implement distributed delays in Phoenix ® NLME™ 8.0, and to numerically show the advantages of the distributed delay approach over the traditional methods.

  12. Statistical study on the strength of structural materials and elements

    International Nuclear Information System (INIS)

    Blume, J.A.; Dalal, J.S.; Honda, K.K.

    1975-07-01

    Strength data for structural materials and elements including concrete, reinforcing steel, structural steel, plywood elements, reinforced concrete beams, reinforced concrete columns, brick masonry elements, and concrete masonry walls were statistically analyzed. Sample statistics were computed for these data, and distribution parameters were derived for normal, lognormal, and Weibull distributions. Goodness-of-fit tests were performed on these distributions. Most data, except those for masonry elements, displayed fairly small dispersion. Dispersion in data for structural materials was generally found to be smaller than for structural elements. Lognormal and Weibull distributions displayed better overall fits to data than normal distribution, although either Weibull or lognormal distribution can be used to represent the data analyzed. (auth)

  13. Hyperbolic Cosine–Exponentiated Exponential Lifetime Distribution and its Application in Reliability

    Directory of Open Access Journals (Sweden)

    Omid Kharazmi

    2017-02-01

    Full Text Available Recently, Kharazmi and Saadatinik (2016) introduced a new family of lifetime distributions called the hyperbolic cosine-F (HCF) distribution. The present paper focuses on a special case of the HCF family with the exponentiated exponential distribution as the baseline distribution (HCEE). Various properties of the proposed distribution are derived, including explicit expressions for the moments, quantiles, mode, moment generating function, failure rate function, mean residual lifetime, order statistics and an expression for the entropy. The parameters of the HCEE distribution are estimated by eight methods: maximum likelihood, Bayesian, maximum product of spacings, parametric bootstrap, non-parametric bootstrap, percentile, least-squares and weighted least-squares. A simulation study is conducted to examine the bias and mean square error of the maximum likelihood estimators. Finally, one real data set is analyzed for illustrative purposes, and it is observed that the proposed model fits better than the Weibull, gamma and generalized exponential distributions.

  14. Performance Analysis of Methods for Estimating Weibull Parameters ...

    African Journals Online (AJOL)

    The performance analysis revealed that the MLM was the most accurate model followed by the EPF and the GM. Furthermore, the comparison between the wind speed standard deviation predicted by the proposed models and the measured data showed that the MLM has a smaller relative error of -3.33% on average ...

  15. Cell-size distribution and scaling in a one-dimensional Kolmogorov-Johnson-Mehl-Avrami lattice model with continuous nucleation

    Science.gov (United States)

    Néda, Zoltán; Járai-Szabó, Ferenc; Boda, Szilárd

    2017-10-01

    The Kolmogorov-Johnson-Mehl-Avrami (KJMA) growth model is considered on a one-dimensional (1D) lattice. Cells can grow with constant speed and continuously nucleate on the empty sites. We offer an alternative mean-field-like approach for describing theoretically the dynamics and derive an analytical cell-size distribution function. Our method reproduces the same scaling laws as the KJMA theory and has the advantage that it leads to a simple closed form for the cell-size distribution function. It is shown that a Weibull distribution is appropriate for describing the final cell-size distribution. The results are discussed in comparison with Monte Carlo simulation data.

  16. Frequency distribution of rainfall for the South-Central region of Ceará, Brazil

    Directory of Open Access Journals (Sweden)

    Ítalo Nunes Silva

    2013-09-01

    Full Text Available Seven probability distributions were analysed: Exponential, Gamma, Log-Normal, Normal, Weibull, Gumbel and Beta, for monthly and annual rainfall in the south-central region of Ceará, Brazil. In order to verify the adjustments of the data to the probability density functions, the non-parametric Kolmogorov-Smirnov test was used with a 5% level of significance. The rainfall data were obtained from the database at SUDENE, recorded from 1913 to 1989. For the total annual rainfall, adjustment of the data to the Gamma, Gumbel, Normal and Weibull distributions was satisfactory, and there was no adjustment to the Exponential, Log-normal and Beta distributions. Use of the Normal distribution is recommended to estimate the values of probable annual rainfall in the region, this being a procedure of easy application which also performed well in the tests. The Gumbel frequency distribution best represented the monthly rainfall data, with the largest number of fits in the rainy season. In the dry season, the rainfall data were better represented by the Exponential distribution.

  17. Application of Weibull analysis and artificial neural networks to predict the useful life of the vacuum packed soft cheese

    Directory of Open Access Journals (Sweden)

    Jesús Alexander Sánchez-González

    2017-01-01

    Full Text Available The objective of this work was to evaluate the ability of artificial neural networks (ANN) to predict the shelf life and acidity of vacuum-packed fresh cheese. First, cheese samples of 200 g per unit were prepared. These samples were then stored at intervals of 2 to 4 days at temperatures of 4, 10 and 16 °C and a relative humidity of 67.5%. Throughout storage, the acidity (AC) and the sensory acceptability were determined. This acceptability was used to determine the shelf life (TVU) by the modified Weibull sensory hazard method. A set of artificial neural networks (ANN) was created and trained, with temperature (T), ripening time (M) and failure probability (F(x)) used as inputs and TVU and AC as outputs. From this set, the networks with the lowest mean square error (MSE) and the best fit (R2) were selected. These networks showed correlation coefficients (R2) of 0.9996 and 0.6897 for TVU and AC, respectively, and good precision in comparison with regression models. It is shown that ANN can be used to adequately model TVU and, to a lesser degree, AC of vacuum-packed fresh cheeses.

  18. The statitistical evaluation of the uniaxial compressive strength of the Ruskov andesite

    Directory of Open Access Journals (Sweden)

    Krepelka František

    2002-03-01

    Full Text Available The selection of a suitable model of the statistical distribution of the uniaxial compressive strength is discussed in the paper. The uniaxial compressive strength was studied on 180 specimens of the Ruskov andesite. The rate of loading was 1 MPa.s-1. The experimental specimens had a prismatic form with a square base; the slenderness ratio of the specimens was 2:1. Three sets of specimens with different lengths of the base edge were studied, namely 50, 30 and 10 mm. The results of the measurements were three sets, each with 60 values of the uniaxial compressive strength. The basic statistical parameters (the sample mean, the sample standard deviation, the variational interval, the minimum and maximum values, and the sample skewness and kurtosis coefficients) were evaluated for each set. Two types of distribution that can be linked with the real physical fundamentals of the disintegration of rocks (the normal and the two-parametric Weibull distribution) were tested. The basic characteristics of both distributions were evaluated for each set, and the agreement of the model distribution with the experimental distribution was tested using the χ2-test. Following the comparison of the test results for both model distributions, the two-parametric Weibull distribution was selected as a suitable distribution model for characterizing the uniaxial compressive strength of the Ruskov andesite. The two-parametric Weibull distribution showed better results in the goodness-of-fit test. The normal distribution was suitable for two sets; one of the sets showed a negative result in the goodness-of-fit testing. For the uniaxial compressive strength of the Ruskov andesite, a scale effect was registered: the mean value of the uniaxial compressive strength decreases with increasing specimen base edge. This is another argument for using the Weibull distribution as a suitable statistical model of the

  19. Distribution functions for the linear region of the S-N curve

    Energy Technology Data Exchange (ETDEWEB)

    Mueller, Christian; Waechter, Michael; Masendorf, Rainer; Esderts, Alfons [TU Clausthal, Clausthal-Zellerfeld (Germany). Inst. for Plant Engineering and Fatigue Analysis

    2017-08-01

    This study establishes a database containing the results of fatigue tests from the linear region of the S-N curve using sources from the literature. Each set of test results originates from testing metallic components on a single load level. Eighty-nine test series with sample sizes of 14 ≤ n ≤ 500 are included in the database, resulting in a sum of 6,086 individual test results. The test series are tested in terms of the type of distribution function (log-normal or 2-parameter Weibull) using the Shapiro-Wilk test, the Anderson-Darling test and probability plots. The majority of the tested individual test results follows a log-normal distribution.
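
    A minimal sketch of this kind of distribution check, assuming a vector of fatigue lives (simulated here): Shapiro-Wilk applied to the log-transformed lives tests log-normality, and a KS test is run against a fitted two-parameter Weibull. The probability plots used in the study are omitted, and the data are not from the database described above.

    # Sketch of the distribution checks described: Shapiro-Wilk on log-transformed
    # lives (log-normality) and a KS test against a fitted two-parameter Weibull.
    # Fatigue lives are simulated for illustration.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(11)
    lives = rng.lognormal(mean=12.0, sigma=0.35, size=100)     # cycles to failure

    # Log-normal hypothesis: the log of the lives should pass a normality test.
    W, p_logn = stats.shapiro(np.log(lives))
    print(f"Shapiro-Wilk on log(lives): W = {W:.3f}, p = {p_logn:.3f}")

    # Two-parameter Weibull hypothesis: fit with location fixed at zero, then KS test.
    shape, _, scale = stats.weibull_min.fit(lives, floc=0)
    p_weib = stats.kstest(lives, stats.weibull_min.cdf, args=(shape, 0, scale)).pvalue
    print(f"KS against fitted Weibull:  p = {p_weib:.3f}")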

  20. Scaling strength distributions in quasi-brittle materials from micro-to macro-scales: A computational approach to modeling Nature-inspired structural ceramics

    International Nuclear Information System (INIS)

    Genet, Martin; Couegnat, Guillaume; Tomsia, Antoni P.; Ritchie, Robert O.

    2014-01-01

    This paper presents an approach to predict the strength distribution of quasi-brittle materials across multiple length-scales, with emphasis on Nature-inspired ceramic structures. It permits the computation of the failure probability of any structure under any mechanical load, solely based on considerations of the microstructure and its failure properties by naturally incorporating the statistical and size-dependent aspects of failure. We overcome the intrinsic limitations of single periodic unit-based approaches by computing the successive failures of the material components and associated stress redistributions on arbitrary numbers of periodic units. For large size samples, the microscopic cells are replaced by a homogenized continuum with equivalent stochastic and damaged constitutive behavior. After establishing the predictive capabilities of the method, and illustrating its potential relevance to several engineering problems, we employ it in the study of the shape and scaling of strength distributions across differing length-scales for a particular quasi-brittle system. We find that the strength distributions display a Weibull form for samples of size approaching the periodic unit; however, these distributions become closer to normal with further increase in sample size before finally reverting to a Weibull form for macroscopic sized samples. In terms of scaling, we find that the weakest link scaling applies only to microscopic, and not macroscopic scale, samples. These findings are discussed in relation to failure patterns computed at different size-scales. (authors)

  1. Characteristics of service requests and service processes of fire and rescue service dispatch centers: analysis of real world data and the underlying probability distributions.

    Science.gov (United States)

    Krueger, Ute; Schimmelpfeng, Katja

    2013-03-01

    A sufficient staffing level in fire and rescue dispatch centers is crucial for saving lives. Therefore, it is important to estimate the expected workload properly. For this purpose, we analyzed whether a dispatch center can be considered as a call center. Current call center publications very often model call arrivals as a non-homogeneous Poisson process. This is based on the underlying assumption of the caller's independent decision to call or not to call. In case of an emergency, however, there are often calls from more than one person reporting the same incident and thus, these calls are not independent. Therefore, this paper focuses on the dependency of calls in a fire and rescue dispatch center. We analyzed and evaluated several distributions in this setting. Results are illustrated using real-world data collected from a typical German dispatch center in Cottbus ("Leitstelle Lausitz"). We identified the Pólya distribution as being superior to the Poisson distribution in describing the call arrival rate and the Weibull distribution to be more suitable than the exponential distribution for interarrival times and service times. However, the commonly used distributions offer acceptable approximations. This is important for estimating a sufficient staffing level in practice using, e.g., the Erlang-C model.
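
    The exponential-versus-Weibull comparison for interarrival times can be sketched as below, using the log-likelihoods of maximum likelihood fits; the data are simulated, not the Cottbus records, and the criterion is illustrative.

    # Sketch: compare an exponential and a Weibull fit to call interarrival times
    # via log-likelihood; the times below are simulated for illustration.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    interarrival = rng.weibull(0.8, size=2000) * 90.0        # seconds between calls

    expon_params = stats.expon.fit(interarrival, floc=0)
    weib_params = stats.weibull_min.fit(interarrival, floc=0)

    ll_expon = np.sum(stats.expon.logpdf(interarrival, *expon_params))
    ll_weib = np.sum(stats.weibull_min.logpdf(interarrival, *weib_params))
    print(f"log-likelihood  exponential: {ll_expon:.1f}   Weibull: {ll_weib:.1f}")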

  2. The ATLAS distributed analysis system

    International Nuclear Information System (INIS)

    Legger, F

    2014-01-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of Grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high and steadily improving; Grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters provides user support and communicates user problems to the sites. Both the user support techniques and the direct feedback of users have been effective in improving the success rate and user experience when utilizing the distributed computing environment. In this contribution a description of the main components, activities and achievements of ATLAS distributed analysis is given. Several future improvements being undertaken will be described.

  3. The ATLAS distributed analysis system

    Science.gov (United States)

    Legger, F.; Atlas Collaboration

    2014-06-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of Grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high and steadily improving; Grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters provides user support and communicates user problems to the sites. Both the user support techniques and the direct feedback of users have been effective in improving the success rate and user experience when utilizing the distributed computing environment. In this contribution a description of the main components, activities and achievements of ATLAS distributed analysis is given. Several future improvements being undertaken will be described.

  4. Moment and maximum likelihood estimators for Weibull distributions under length- and area-biased sampling

    Science.gov (United States)

    Jeffrey H. Gove

    2003-01-01

    Many of the most popular sampling schemes used in forestry are probability proportional to size methods. These methods are also referred to as size biased because sampling is actually from a weighted form of the underlying population distribution. Length- and area-biased sampling are special cases of size-biased sampling where the probability weighting comes from a...

  5. Assessing biomass based on canopy height profiles using airborne laser scanning data in eucalypt plantations

    Directory of Open Access Journals (Sweden)

    André Gracioso Peres Silva

    2015-12-01

    Full Text Available This study aimed to map the stem biomass of an even-aged eucalyptus plantation in southeastern Brazil based on canopy height profile (CHP) statistics using wall-to-wall discrete return airborne laser scanning (ALS), and to compare the results with alternative maps generated by ordinary kriging interpolation from field-derived measurements. The assessment of stem biomass with ALS data was carried out using regression analysis methods. Initially, CHPs were determined to express the distribution of laser point heights in the ALS cloud for each sample plot. The probability density function (pdf) used was the Weibull distribution, whose two parameters were, in a secondary task, used as explanatory variables to model stem biomass. ALS metrics such as height percentiles, dispersion of heights, and proportion of points were also investigated. A simple linear regression model of stem biomass as a function of the Weibull scale parameter showed high correlation (adj.R2 = 0.89). The alternative model considering the 30th percentile and the Weibull shape parameter slightly improved the quality of the estimation (adj.R2 = 0.93). Stem biomass maps based on the Weibull scale parameter doubled the accuracy of the ordinary kriging approach (relative root mean square error = 6 % and 13 %, respectively).

  6. Empirical model based on Weibull distribution describing the destruction kinetics of natural microbiota in pineapple (Ananas comosus L.) puree during high-pressure processing.

    Science.gov (United States)

    Chakraborty, Snehasis; Rao, Pavuluri Srinivasa; Mishra, Hari Niwas

    2015-10-15

    High pressure inactivation of the natural microbiota, viz. aerobic mesophiles (AM), psychrotrophs (PC), yeasts and molds (YM), total coliforms (TC) and lactic acid bacteria (LAB), in pineapple puree was studied within the experimental domain of 0.1-600 MPa and 30-50 °C with treatment times up to 20 min. Complete destruction of yeasts and molds was obtained at 500 MPa/50 °C/15 min, whereas no counts were detected for TC and LAB at 300 MPa/30 °C/15 min. A maximum of two log cycle reductions was obtained for YM during pulse pressurization at the severe process intensity of 600 MPa/50 °C/20 min. The Weibull model clearly described the non-linearity of the survival curves during the isobaric period. The tailing effect, as confirmed by the shape parameter (β) of the survival curve, was obtained in the case of YM (β < 1), whereas a shoulder (β > 1) was observed for the other microbial groups. Analogous to thermal death kinetics, the activation energy (Ea, kJ·mol(-1)) and the activation volume (Va, mL·mol(-1)) values were computed further to describe the temperature and pressure dependencies of the scale parameter (δ, min), respectively. A higher δ value was obtained for each microbe at a lower temperature, and it decreased with an increase in pressure. A secondary kinetic model was developed describing the inactivation rate (k, min(-1)) as a function of pressure (P, MPa) and temperature (T, K), including the dependencies of Ea and Va on P and T, respectively. Copyright © 2015 Elsevier B.V. All rights reserved.
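
    The Weibull survival model referred to above is commonly written as log10(N/N0) = -(t/δ)^β; a minimal least-squares sketch of fitting δ and β to hypothetical log-reduction data follows. The data points and parameter bounds are illustrative, not the pineapple puree results.

    # Sketch of the Weibull survival model for non-linear inactivation curves,
    # log10(N/N0) = -(t/delta)**beta, fitted to hypothetical survival data.
    import numpy as np
    from scipy.optimize import curve_fit

    def weibull_log_survival(t, delta, beta):
        return -(t / delta) ** beta            # log10 reduction at time t (min)

    t = np.array([1.0, 2.0, 5.0, 10.0, 15.0, 20.0])            # minutes under pressure
    log_red = np.array([-0.3, -0.6, -1.1, -1.6, -1.9, -2.1])   # hypothetical log10(N/N0)

    (delta, beta), _ = curve_fit(weibull_log_survival, t, log_red,
                                 p0=(5.0, 1.0), bounds=(1e-6, np.inf))
    print(f"scale delta = {delta:.2f} min, shape beta = {beta:.2f}  "
          f"({'tailing' if beta < 1 else 'shoulder'} behaviour)")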

  7. Sunflower petals: Some physical properties and modeling distribution of their number, dimensions, and mass

    Directory of Open Access Journals (Sweden)

    Amir Hossein Mirzabe

    2018-06-01

    Full Text Available The sunflower petal is one of the parts of the sunflower which has drawn attention and has several applications these days. These applications justify obtaining information about physical properties, mechanical properties, drying trends, etc. in order to design new machines and use new methods to harvest or dry the sunflower petals. For three varieties of sunflower, the picking force of the petals was measured; the number of petals on each head was counted; the unit mass and 1000-unit mass of fresh petals were measured; and the length, width, and projected area of fresh petals were calculated based on an image processing technique. The frequency distributions of these parameters were modeled using statistical distribution models, namely Gamma, Generalized Extreme Value (G. E. V), Lognormal, and Weibull. Results for the picking force showed that, with an increasing number of days after the appearance of the first petal on each head from 5 to 14 and a decreasing loading rate from 150 g min−1 to 50 g min−1, the values of the picking force decreased for the three varieties, but the diameter of the sunflower head had different effects on the picking force for each variety. The length, width, and number of petals of the Dorsefid variety ranged from 38.52 to 95.44 mm, 3.80 to 9.28 mm and 29 to 89, respectively. The corresponding values ranged from 34.19 to 88.18 mm, 4.28 to 10.60 mm and 21 to 89 for the Shamshiri variety, and from 44.47 to 114.63 mm, 7.03 to 20.31 mm and 29 to 89 for the Sirena variety. Results of the frequency distribution modeling indicated that, in most cases, the G. E. V and Weibull distributions performed better than the other distributions. Keywords: Sunflower (Helianthus annuus L.) petal, Picking force, Image processing, Fibonacci sequence, Lucas sequence

  8. Gumbel Weibull distribution function for Sahel precipitation ...

    African Journals Online (AJOL)

    insecurity, migration, social conflicts, etc.). An efficient management of under and over ground water is a ... affects their incomes (Udual and Ini, 2012). Researches on modeling, prediction and forecasting ... Douentza in Mopti region, situated on the national road 15 highway linking Mopti to Gao and Kidal regions. This small ...

  9. Distributed analysis with PROOF in ATLAS collaboration

    International Nuclear Information System (INIS)

    Panitkin, S Y; Ernst, M; Ito, H; Maeno, T; Majewski, S; Rind, O; Tarrade, F; Wenaus, T; Ye, S; Benjamin, D; Montoya, G Carillo; Guan, W; Mellado, B; Xu, N; Cranmer, K; Shibata, A

    2010-01-01

    The Parallel ROOT Facility (PROOF) is a distributed analysis system which makes it possible to exploit the inherent event-level parallelism of high energy physics data. PROOF can be configured to work with centralized storage systems, but it is especially effective together with distributed local storage systems, like Xrootd, when data are distributed over computing nodes. It works efficiently on different types of hardware and scales well from a multi-core laptop to large computing farms. From that point of view it is well suited for both large central analysis facilities and Tier 3 type analysis farms. PROOF can be used in interactive or batch-like regimes. The interactive regime allows the user to work with typically distributed data from the ROOT command prompt and get real-time feedback on analysis progress and intermediate results. We will discuss our experience with PROOF in the context of ATLAS Collaboration distributed analysis. In particular we will discuss PROOF performance in various analysis scenarios and in multi-user, multi-session environments. We will also describe PROOF integration with the ATLAS distributed data management system and prospects of running PROOF on geographically distributed analysis farms.

  10. Distributed analysis with PROOF in ATLAS collaboration

    Energy Technology Data Exchange (ETDEWEB)

    Panitkin, S Y; Ernst, M; Ito, H; Maeno, T; Majewski, S; Rind, O; Tarrade, F; Wenaus, T; Ye, S [Brookhaven National Laboratory, Upton, NY 11973 (United States); Benjamin, D [Duke University, Durham, NC 27708 (United States); Montoya, G Carillo; Guan, W; Mellado, B; Xu, N [University of Wisconsin-Madison, Madison, WI 53706 (United States); Cranmer, K; Shibata, A [New York University, New York, NY 10003 (United States)

    2010-04-01

    The Parallel ROOT Facility (PROOF) is a distributed analysis system which makes it possible to exploit the inherent event-level parallelism of high energy physics data. PROOF can be configured to work with centralized storage systems, but it is especially effective together with distributed local storage systems, like Xrootd, when data are distributed over computing nodes. It works efficiently on different types of hardware and scales well from a multi-core laptop to large computing farms. From that point of view it is well suited for both large central analysis facilities and Tier 3 type analysis farms. PROOF can be used in interactive or batch-like regimes. The interactive regime allows the user to work with typically distributed data from the ROOT command prompt and get real-time feedback on analysis progress and intermediate results. We will discuss our experience with PROOF in the context of ATLAS Collaboration distributed analysis. In particular we will discuss PROOF performance in various analysis scenarios and in multi-user, multi-session environments. We will also describe PROOF integration with the ATLAS distributed data management system and prospects of running PROOF on geographically distributed analysis farms.

  11. Estimation of some stochastic models used in reliability engineering

    International Nuclear Information System (INIS)

    Huovinen, T.

    1989-04-01

    The work aims to study the estimation of some stochastic models used in reliability engineering. In reliability engineering, continuous probability distributions have been used as models for the lifetime of technical components. We consider here the following distributions: exponential, 2-mixture exponential, conditional exponential, Weibull, lognormal and gamma. The maximum likelihood method is used to estimate the distributions from observed data, which may be either complete or censored. We consider models based on homogeneous Poisson processes, such as the gamma-Poisson and lognormal-Poisson models, for the analysis of failure intensity, and we also study a beta-binomial model for the analysis of failure probability. The parameters of these three models are estimated by the method of matching moments and, in the case of the gamma-Poisson and beta-binomial models, also by the maximum likelihood method. Many mathematical and statistical problems that arise in reliability engineering can be solved by utilizing point processes. Here we consider the statistical analysis of non-homogeneous Poisson processes with a Weibull intensity function to describe the failure behaviour of a set of components. We use the method of maximum likelihood to estimate the parameters of the Weibull model. A common cause failure can seriously reduce the reliability of a system. We consider a binomial failure rate (BFR) model, an application of marked point processes, for modelling common cause failures in a system. The parameters of the binomial failure rate model are estimated with the maximum likelihood method

  12. Development of probabilistic fatigue curve for asphalt concrete based on viscoelastic continuum damage mechanics

    Directory of Open Access Journals (Sweden)

    Himanshu Sharma

    2016-07-01

    Full Text Available Due to its roots in a fundamental thermodynamic framework, the continuum damage approach is popular for modeling asphalt concrete behavior. Currently used continuum damage models use mixture-averaged values for model parameters and assume a deterministic damage process. On the other hand, significant scatter is found in fatigue data generated even under extremely controlled laboratory testing conditions. Thus, currently used continuum damage models fail to account for the scatter observed in fatigue data. This paper illustrates a novel approach for probabilistic fatigue life prediction based on the viscoelastic continuum damage approach. Several specimens were tested for their viscoelastic properties and damage properties under a uniaxial mode of loading. The data thus generated were analyzed using viscoelastic continuum damage mechanics principles to predict fatigue life. Weibull (2-parameter and 3-parameter) and lognormal distributions were fit to the fatigue life predicted using the viscoelastic continuum damage approach. It was observed that fatigue damage could be best described using the Weibull distribution when compared to the lognormal distribution. Due to its flexibility, the 3-parameter Weibull distribution was found to fit better than the 2-parameter Weibull distribution. Further, significant differences were found between the probabilistic fatigue curves developed in this research and the traditional deterministic fatigue curve. The proposed methodology combines the advantages of continuum damage mechanics as well as probabilistic approaches. These probabilistic fatigue curves can be conveniently used for reliability based pavement design. Keywords: Probabilistic fatigue curve, Continuum damage mechanics, Weibull distribution, Lognormal distribution

  13. Hazard function analysis for flood planning under nonstationarity

    Science.gov (United States)

    Read, Laura K.; Vogel, Richard M.

    2016-05-01

    The field of hazard function analysis (HFA) involves a probabilistic assessment of the "time to failure" or "return period," T, of an event of interest. HFA is used in epidemiology, manufacturing, medicine, actuarial statistics, reliability engineering, economics, and elsewhere. While for a stationary process the probability distribution function (pdf) of the return period always follows an exponential distribution, the same is not true for nonstationary processes. When the process of interest, X, exhibits nonstationary behavior, HFA can provide a complementary approach to risk analysis with analytical tools particularly useful for hydrological applications. After a general introduction to HFA, we describe a new mathematical linkage between the magnitude of the flood event, X, and its return period, T, for nonstationary processes. We derive the probabilistic properties of T for a nonstationary one-parameter exponential model of X, and then use both Monte-Carlo simulation and HFA to generalize the behavior of T when X arises from a nonstationary two-parameter lognormal distribution. For this case, our findings suggest that a two-parameter Weibull distribution provides a reasonable approximation for the pdf of T. We document how HFA can provide an alternative approach to characterize the probabilistic properties of both nonstationary flood series and the resulting pdf of T.
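
    The idea can be illustrated with a small simulation: the waiting time to the first exceedance of a design level is fitted with a Weibull distribution, whose shape parameter stays near 1 (exponential) for a stationary process and departs from 1 when the exceedance probability drifts. All settings below are illustrative, not the authors' models.

    # Sketch of the hazard-function idea: simulate the "time to failure" T (years
    # until the design level is first exceeded) and fit a Weibull distribution to it.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2016)

    def first_exceedance_years(p_by_year, n_rep=5000):
        """Return the year of first exceedance for n_rep simulated replicates."""
        out = []
        while len(out) < n_rep:
            hits = rng.random(p_by_year.size) < p_by_year
            if hits.any():
                out.append(np.argmax(hits) + 1)      # first year with an exceedance
        return np.asarray(out, dtype=float)

    years = np.arange(500)
    stationary = np.full(500, 0.01)                  # constant 1% annual exceedance
    nonstationary = 0.01 * (1.0 + 0.02 * years)      # exceedance probability drifts upward

    for label, p in [("stationary", stationary), ("nonstationary", nonstationary)]:
        T = first_exceedance_years(p)
        shape, _, scale = stats.weibull_min.fit(T, floc=0)
        print(f"{label:13s}: Weibull shape = {shape:.2f}, scale = {scale:.1f} years")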

  14. The ATLAS Distributed Analysis System

    CERN Document Server

    Legger, F; The ATLAS collaboration; Pacheco Pages, A; Stradling, A

    2013-01-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high but steadily improving; grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters ...

  15. The ATLAS Distributed Analysis System

    CERN Document Server

    Legger, F; The ATLAS collaboration

    2014-01-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high but steadily improving; grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters ...

  16. Computer Model to Estimate Reliability Engineering for Air Conditioning Systems

    International Nuclear Information System (INIS)

    Afrah Al-Bossly, A.; El-Berry, A.; El-Berry, A.

    2012-01-01

    Reliability engineering is used to predict the performance and to optimize the design and maintenance of air conditioning systems. Air conditioning systems are exposed to a number of failures. The failures of an air conditioner, such as failure to turn on, loss of cooling capacity, reduced output air temperature, loss of cool air supply and complete loss of air flow, can be due to a variety of problems with one or more components of an air conditioner or air conditioning system. Forecasting system failure rates is very important for maintenance. This paper focuses on the reliability of air conditioning systems. Two statistical distributions commonly applied in reliability settings were used: the standard (2-parameter) Weibull and Gamma distributions. After the distribution parameters had been estimated, reliability estimates and predictions were used for the evaluation. To evaluate the operating condition of a building, the reliability of the air conditioning system that supplies conditioned air to the company's several departments was assessed. This air conditioning system is divided into two parts, namely the main chilled water system and the ten air handling systems that serve the ten departments. In a chilled-water system the air conditioner cools water down to 40-45 degree F (4-7 degree C). The chilled water is distributed throughout the building in a piping system and connected to air conditioning cooling units wherever needed. Data analysis was carried out with the support of computer-aided reliability software; the Weibull and Gamma distributions indicated that the reliability of the system equals 86.012% and 77.7%, respectively. A comparison between the two important families of distribution functions, namely the Weibull and Gamma families, was made. It was found that the Weibull method performed better for decision making.
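
    Once distribution parameters are fitted, the reliability at a mission time follows from the survival function; a minimal sketch with hypothetical Weibull and Gamma parameters (not the values of the study) is given below.

    # Sketch: reliability at a mission time from fitted Weibull and Gamma models.
    # Parameter values here are hypothetical, not those of the study.
    from scipy import stats

    weib_shape, weib_scale = 1.6, 12000.0       # hypothetical Weibull fit (hours)
    gamma_shape, gamma_scale = 2.0, 6000.0      # hypothetical Gamma fit (hours)
    t_mission = 2000.0                          # hours of operation

    R_weibull = stats.weibull_min.sf(t_mission, weib_shape, scale=weib_scale)
    R_gamma = stats.gamma.sf(t_mission, gamma_shape, scale=gamma_scale)
    print(f"Weibull reliability at {t_mission:.0f} h: {R_weibull:.3f}")
    print(f"Gamma   reliability at {t_mission:.0f} h: {R_gamma:.3f}")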

  17. Assessment and analysis of wind energy generation and power ...

    African Journals Online (AJOL)

    ... a statistical analysis of wind characteristics and the extrapolation of Weibull parameters are presented. ... The wind speed probability density function (PDF) can ... be adjusted using the following expression [28, 30, 31] ...

  18. Probability distribution of pitting corrosion depth and rate in underground pipelines: A Monte Carlo study

    International Nuclear Information System (INIS)

    Caleyo, F.; Velazquez, J.C.; Valor, A.; Hallen, J.M.

    2009-01-01

    The probability distributions of external-corrosion pit depth and pit growth rate were investigated in underground pipelines using Monte Carlo simulations. The study combines a predictive pit growth model developed by the authors with the observed distributions of the model variables in a range of soils. Depending on the pipeline age, any of the three maximal extreme value distributions, i.e. Weibull, Frechet or Gumbel, can arise as the best fit to the pitting depth and rate data. The Frechet distribution best fits the corrosion data for long exposure periods. This can be explained by considering the long-term stabilization of the diffusion-controlled pit growth. The findings of the study provide reliability analysts with accurate information regarding the stochastic characteristics of the pitting damage in underground pipelines.

  19. Probability distribution of pitting corrosion depth and rate in underground pipelines: A Monte Carlo study

    Energy Technology Data Exchange (ETDEWEB)

    Caleyo, F. [Departamento de Ingenieria Metalurgica, ESIQIE, IPN, UPALM Edif. 7, Zacatenco, 07738 Mexico, D.F. (Mexico)], E-mail: fcaleyo@gmail.com; Velazquez, J.C. [Departamento de Ingenieria Metalurgica, ESIQIE, IPN, UPALM Edif. 7, Zacatenco, 07738 Mexico, D.F. (Mexico); Valor, A. [Facultad de Fisica, Universidad de La Habana, San Lazaro y L, Vedado, 10400, La Habana (Cuba); Hallen, J.M. [Departamento de Ingenieria Metalurgica, ESIQIE, IPN, UPALM Edif. 7, Zacatenco, 07738 Mexico, D.F. (Mexico)

    2009-09-15

    The probability distributions of external-corrosion pit depth and pit growth rate were investigated in underground pipelines using Monte Carlo simulations. The study combines a predictive pit growth model developed by the authors with the observed distributions of the model variables in a range of soils. Depending on the pipeline age, any of the three maximal extreme value distributions, i.e. Weibull, Frechet or Gumbel, can arise as the best fit to the pitting depth and rate data. The Frechet distribution best fits the corrosion data for long exposure periods. This can be explained by considering the long-term stabilization of the diffusion-controlled pit growth. The findings of the study provide reliability analysts with accurate information regarding the stochastic characteristics of the pitting damage in underground pipelines.

  20. Variation of local critical current and its influence on overall current of bent multifilamentary Bi2223/Ag tape

    International Nuclear Information System (INIS)

    Ochiai, S.; Doko, D.; Rokkaku, H.; Fujimoto, M.; Okuda, H.; Hojo, M.; Tanaka, M.; Sugano, M.; Osamura, K.; Mimura, M.

    2006-01-01

    The correlation between the local and overall currents in a multifilamentary Bi2223/Ag/Ag alloy composite tape under bending strain was studied. The correlation of the measured distributed local critical current and n-value to the overall critical current was described comprehensively with a voltage summation model that regards the overall sample as a series circuit of local elements. The analysis of the measured critical current and n-value revealed that the distribution of the local critical current could be described with the Weibull distribution function and that the n-value could be expressed as a function of the critical current as a first approximation. By combining the Weibull distribution function of the local critical current, the empirical formula of the n-value as a function of critical current, the voltage summation model and the Monte Carlo method, the overall critical current and n-value could be predicted fairly well from those of the local elements

  1. The probability distribution model of air pollution index and its dominants in Kuala Lumpur

    Science.gov (United States)

    AL-Dhurafi, Nasr Ahmed; Razali, Ahmad Mahir; Masseran, Nurulkamal; Zamzuri, Zamira Hasanah

    2016-11-01

    This paper focuses on the statistical modeling of the distributions of the air pollution index (API) and its sub-index data observed at Kuala Lumpur in Malaysia. Five pollutants or sub-indexes are measured, including carbon monoxide (CO), sulphur dioxide (SO2), nitrogen dioxide (NO2) and particulate matter (PM10). Four probability distributions are considered, namely log-normal, exponential, Gamma and Weibull, in the search for the best-fitting distribution for the Malaysian air pollutant data. In order to determine the best distribution for describing the air pollutant data, five goodness-of-fit criteria are applied. This will help in minimizing the uncertainty in pollution resource estimates and improving the assessment phase of planning. The conflict among criterion results in selecting the best distribution was overcome by using the weight of ranks method. We found that the Gamma distribution is the best distribution for the majority of the air pollutant data in Kuala Lumpur.

  2. Distributed computing and nuclear reactor analysis

    International Nuclear Information System (INIS)

    Brown, F.B.; Derstine, K.L.; Blomquist, R.N.

    1994-01-01

    Large-scale scientific and engineering calculations for nuclear reactor analysis can now be carried out effectively in a distributed computing environment, at costs far lower than for traditional mainframes. The distributed computing environment must include support for traditional system services, such as a queuing system for batch work, reliable filesystem backups, and parallel processing capabilities for large jobs. All ANL computer codes for reactor analysis have been adapted successfully to a distributed system based on workstations and X-terminals. Distributed parallel processing has been demonstrated to be effective for long-running Monte Carlo calculations

  3. Comparison of ductile-to-brittle transition curve fitting approaches

    International Nuclear Information System (INIS)

    Cao, L.W.; Wu, S.J.; Flewitt, P.E.J.

    2012-01-01

    Ductile-to-brittle transition (DBT) curve fitting approaches are compared over the transition temperature range for reactor pressure vessel steels with different kinds of data, including Charpy-V notch impact energy data and fracture toughness data. Three DBT curve fitting methods have been frequently used in the past, based on the Burr, S-Weibull and tanh distributions. In general there is greater scatter associated with test data obtained within the transition region. Therefore these methods give results with different accuracies, especially when fitting to small quantities of data. The comparison shows that the Burr and tanh distributions fit well distributed, large data sets extending across the test temperature range, including the upper and lower shelves, almost equally well. The S-Weibull distribution fits the lower shelf of the DBT curve poorly. Overall, for both large and small quantities of measured data, the Burr distribution provides the best description. - Highlights: ► The Burr distribution offers a better fit than the S-Weibull and tanh fits. ► The Burr and tanh methods show similar fitting ability for a large data set. ► The Burr method can fit sparse data well distributed across the test temperature range. ► The S-Weibull method cannot fit the lower shelf well and shows poor fitting quality.

  4. Determination of the Mechanical Properties of Plasma-Sprayed Hydroxyapatite Coatings Using the Knoop Indentation Technique

    Science.gov (United States)

    Hasan, Md. Fahad; Wang, James; Berndt, Christopher

    2015-06-01

    The microhardness and elastic modulus of plasma-sprayed hydroxyapatite coatings were evaluated using Knoop indentation on the cross section and on the top surface. The effects of indentation angle, testing direction, measurement location and applied load on the microhardness and elastic modulus were investigated. The variability and distribution of the microhardness and elastic modulus data were statistically analysed using the Weibull modulus distribution. The results indicate that the dependence of microhardness and elastic modulus on the indentation angle exhibits a parabolic shape. Dependence of the microhardness values on the indentation angle follows Pythagoras's theorem. The microhardness, Weibull modulus of microhardness and Weibull modulus of elastic modulus reach their maximum at the central position (175 µm) on the cross section of the coatings. The Weibull modulus of microhardness revealed similar values throughout the thickness, and the Weibull modulus of elastic modulus shows higher values on the top surface compared to the cross section.
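
    The Weibull modulus reported above is conventionally obtained by rank regression on a set of repeated measurements; a small sketch with hypothetical hardness values (not the coating data) is given below.

      # Estimate a Weibull modulus from repeated microhardness measurements by
      # regressing ln(ln(1/(1-F))) on ln(H), with median-rank estimates for F.
      import numpy as np

      H = np.sort(np.array([4.1, 4.4, 4.6, 4.8, 5.0, 5.1, 5.3, 5.6, 5.8, 6.2]))  # GPa, hypothetical
      n = H.size
      F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)       # median-rank probability estimator

      x = np.log(H)
      y = np.log(np.log(1.0 / (1.0 - F)))
      m, c = np.polyfit(x, y, 1)                        # slope = Weibull modulus

      print(f"Weibull modulus m ~ {m:.1f}, characteristic hardness ~ {np.exp(-c / m):.2f} GPa")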

  5. Probability Analysis of the Wave-Slamming Pressure Values of the Horizontal Deck with Elastic Support

    Science.gov (United States)

    Zuo, Weiguang; Liu, Ming; Fan, Tianhui; Wang, Pengtao

    2018-06-01

    This paper presents the probability distribution of the slamming pressure from an experimental study of regular wave slamming on an elastically supported horizontal deck. The time series of the slamming pressure during the wave impact were first obtained through statistical analyses of the experimental data. The exceedance probability distribution of the maximum slamming pressure peak and its distribution parameters were analyzed, and the results show that the exceedance probability distribution of the maximum slamming pressure peak accords with the three-parameter Weibull distribution. Furthermore, the range and relationships of the distribution parameters were studied. The sum of the location parameter D and the scale parameter L was approximately equal to 1.0, and the exceedance probability was more than 36.79% when the random peak was equal to the sample average during the wave impact. The variations of the distribution parameters and the slamming pressure under different model conditions were presented comprehensively, and the parameter values of the Weibull distribution of wave-slamming pressure peaks differed between test models. The parameter values were found to decrease as the stiffness of the elastic support increased. The damage criterion of the structure model under wave impact was initially discussed, and the structure model was destroyed when the average slamming time exceeded a certain value during the wave impact. The conclusions of the experimental study were then described.

  6. ATLAS Distributed Analysis Tools

    CERN Document Server

    Gonzalez de la Hoz, Santiago; Liko, Dietrich

    2008-01-01

    The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale; up to 10,000 jobs were processed in one day. The experience obtained operating the system on several grid flavours was essential for performing user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase, data were registered in the LHC File Catalog (LFC) and replicated to external sites. For the main test, only a few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS management computing board decided to integrate the collaboration's efforts in distributed analysis into a single project, GANGA. The goal is to test the reconstruction and analysis software in a large-scale data production using grid flavours at several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting a...

  7. Distributed analysis challenges in ATLAS

    Energy Technology Data Exchange (ETDEWEB)

    Duckeck, Guenter; Legger, Federica; Mitterer, Christoph Anton; Walker, Rodney [Ludwig-Maximilians-Universitaet Muenchen (Germany)

    2016-07-01

    The ATLAS computing model has undergone massive changes to meet the high luminosity challenge of the second run of the Large Hadron Collider (LHC) at CERN. The production system and distributed data management have been redesigned, a new data format and event model for analysis have been introduced, and common reduction and derivation frameworks have been developed. We report on the impact these changes have on the distributed analysis system, study the various patterns of grid usage for user analysis, focusing on the differences between the first and the second LHC runs, and measure the performance of user jobs.

  8. New considerations on variability of creep rupture data and life prediction

    International Nuclear Information System (INIS)

    Kim, Seon Jin; Jeong, Won Taek; Kong, Yu Sik

    2009-01-01

    This paper deals with the variability analysis of short-term creep rupture test data based on previous creep rupture tests and the possibility of creep life prediction. From creep tests performed at constant uniaxial stresses at elevated temperatures of 600, 650 and 700 °C, the creep curves were analyzed for the normalized creep strain divided by the initial strain, in order to investigate the variability of the short-term creep rupture data. There is some variability in the creep rupture data, and the differences between the general creep curves and the normalized creep curves were obtained. The effects of the creep rupture time and the steady-state creep rate on the Weibull distribution parameters were investigated. There was a good relationship between the normal Weibull parameters and the normalized Weibull parameters. Finally, the predicted creep lives were compared with the Monkman-Grant model.

  9. New Considerations on Variability of Creep Rupture Data and Life Prediction

    International Nuclear Information System (INIS)

    Jung, Won Taek; Kong, Yu Sik; Kim, Seon Jin

    2009-01-01

    This paper deals with the variability analysis of short-term creep rupture test data based on previous creep rupture tests and the possibility of creep life prediction. From creep tests performed at constant uniaxial stresses at elevated temperatures of 600, 650 and 700 °C, the creep curves were analyzed for the normalized creep strain divided by the initial strain, in order to investigate the variability of the short-term creep rupture data. There is some variability in the creep rupture data, and the differences between the general creep curves and the normalized creep curves were obtained. The effects of the creep rupture time (RT) and the steady-state creep rate (SSCR) on the Weibull distribution parameters were investigated. There was a good relationship between the normal Weibull parameters and the normalized Weibull parameters. Finally, the predicted creep lives were compared with the Monkman-Grant model

  10. Evaluación poscosecha y estimación de vida útil de guayaba fresca utilizando el modelo de Weibull

    Directory of Open Access Journals (Sweden)

    Carlos García Mogollón

    2010-07-01

    Full Text Available Guava (Psidium guajava L.) is a tropical fruit susceptible to undesirable alterations that affect its shelf life, owing to inadequate storage and handling conditions. In this work the shelf life of fresh guava was estimated using the Weibull probabilistic model, and the quality of the fruit was evaluated during storage under different temperature and packaging conditions. The postharvest evaluation was carried out for 15 days with guavas of the regional red variety. A completely randomized design with a factorial arrangement was used, consisting of three factors: storage time with six levels (0, 3, 6, 9, 12 and 15 days); storage temperature with two levels, ambient (37 °C and relative humidity (RH) between 85 and 90%) and refrigerated (9±2 °C and RH of 85-90%); and two types of packaging, polystyrene tray with PVC plastic film and aluminium foil. A structured three-point satisfaction scale was used for the sensory evaluation during the storage period. The Weibull model proved adequate for predicting the shelf life of fresh guava, based on the fitting criteria and the acceptance and failure confidence limits. During the storage period it was observed that storage time, temperature and type of packaging have a statistically significant effect (P < 0.05) on the equivalent diameter, sphericity, apparent density, total soluble solids, pH, acidity and sensory evaluation of the fruit. The product can be consumed as fresh fruit for up to ten days of storage at ambient temperature and a maximum of fifteen days in refrigerated storage.

  11. Probabilistic analysis in normal operation of distribution system with distributed generation

    DEFF Research Database (Denmark)

    Villafafila-Robles, R.; Sumper, A.; Bak-Jensen, B.

    2011-01-01

    Nowadays, the incorporation of high levels of small-scale non-dispatchable distributed generation is leading to the transition from the traditional 'vertical' power system structure to a 'horizontally-operated' power system, where the distribution networks contain both stochastic generation...... and load. This fact increases the number of stochastic inputs and dependence structures between them need to be considered. The deterministic analysis is not enough to cope with these issues and a new approach is needed. Probabilistic analysis provides a better approach. Moreover, as distribution systems...

  12. Cluster analysis for determining distribution center location

    Science.gov (United States)

    Lestari Widaningrum, Dyah; Andika, Aditya; Murphiyanto, Richard Dimas Julian

    2017-12-01

    Determination of distribution facilities is highly important for surviving the high level of competition in today's business world. Companies can operate multiple distribution centers to mitigate supply chain risk. Thus, new problems arise, namely how many facilities should be provided and where. This study examines a fast-food restaurant brand located in Greater Jakarta. This brand is included in the category of the top five fast-food restaurant chains based on retail sales. There were three stages in this study: compiling spatial data, cluster analysis and network analysis. The cluster analysis results are used to consider the location of an additional distribution center. The network analysis results show a more efficient process, referring to a shorter distance in the distribution process.

  13. Comparative Distributions of Hazard Modeling Analysis

    Directory of Open Access Journals (Sweden)

    Rana Abdul Wajid

    2006-07-01

    Full Text Available In this paper we present a comparison among the distributions used in hazard analysis. A simulation technique has been used to study the behaviour of the hazard distribution models. The fundamentals of hazard issues are discussed using failure criteria. We present the flexibility of the hazard modeling distribution, which approaches different distributions.

  14. Comparison of the Gini and Zenga Indexes using Some Theoretical Income Distributions Abstract

    Directory of Open Access Journals (Sweden)

    Katarzyna Ostasiewicz

    2013-01-01

    Full Text Available The most common measure of inequality used in scientific research is the Gini index. In 2007, Zenga proposed a new index of inequality that has all the appropriate properties of an inequality measure. In this paper, we compared the Gini and Zenga indexes, calculating these quantities for a few distributions frequently used for approximating distributions of income, that is, the lognormal, gamma, inverse Gaussian, Weibull and Burr distributions. Within this limited examination, we have observed three main differences. First, the Zenga index increases more rapidly for low values of the variation and decreases more slowly when the variation approaches intermediate values from above. Second, the Zenga index seems to be better predicted by the variation. Third, although the Zenga index is always higher than the Gini one, the ordering of some pairs of cases may be inverted. (original abstract)

  15. Prediction of Mean and Design Fatigue Lives of Self Compacting Concrete Beams in Flexure

    Science.gov (United States)

    Goel, S.; Singh, S. P.; Singh, P.; Kaushik, S. K.

    2012-02-01

    In this paper, the results of an investigation conducted to study the flexural fatigue characteristics of self-compacting concrete (SCC) beams are presented. An experimental programme was planned in which approximately 60 SCC beam specimens of size 100 × 100 × 500 mm were tested under flexural fatigue loading. Approximately 45 static flexural tests were also conducted to facilitate fatigue testing. The flexural fatigue and static flexural strength tests were conducted on a 100 kN servo-controlled actuator. The fatigue-life data thus obtained have been used to establish the probability distributions of fatigue life of SCC using the two-parameter Weibull distribution. The parameters of the Weibull distribution have been obtained by different methods of analysis. Using the distribution parameters, the mean and design fatigue lives of SCC have been estimated and compared with those of normally vibrated concrete (NVC), the data for which have been taken from the literature. It has been observed that SCC exhibits higher mean and design fatigue lives compared to NVC.
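
    A sketch of how mean and design fatigue lives follow from a fitted two-parameter Weibull model; the cycle counts and the 90% survival level are illustrative assumptions, not the SCC test results.

      # Fit a two-parameter Weibull distribution to fatigue-life data and derive
      # the mean life and a design life at a chosen survival probability.
      import numpy as np
      from scipy import stats
      from scipy.special import gamma as gamma_fn

      lives = np.array([52e3, 78e3, 95e3, 120e3, 150e3, 185e3, 230e3, 310e3])  # cycles, hypothetical

      shape, loc, scale = stats.weibull_min.fit(lives, floc=0)   # two-parameter fit (loc fixed at 0)
      mean_life = scale * gamma_fn(1.0 + 1.0 / shape)

      R = 0.90                                                   # required survival probability
      design_life = scale * (-np.log(R)) ** (1.0 / shape)        # N_R = scale * (-ln R)^(1/shape)

      print(f"shape={shape:.2f}, scale={scale:.0f} cycles")
      print(f"mean fatigue life   ~ {mean_life:.0f} cycles")
      print(f"design life (R=0.9) ~ {design_life:.0f} cycles")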

  16. Wind distribution and capacity factor estimation for wind turbines in the coastal region of South Africa

    International Nuclear Information System (INIS)

    Ayodele, T.R.; Jimoh, A.A.; Munda, J.L.; Agee, J.T.

    2012-01-01

    Highlights: ► We evaluate the capacity factor of some commercially available wind turbines. ► Wind speed at the sites studied can best be modelled using the Weibull distribution. ► Site WM05 has the highest wind power potential while site WM02 has the lowest. ► More wind power can be harnessed during the day period compared to the night. ► Turbine K seems to be the best turbine for the coastal region of South Africa. - Abstract: The operating curve parameters of a wind turbine should match the local wind regime optimally to ensure maximum exploitation of the available energy in a mass of moving air. This paper provides estimates of the capacity factor of 20 commercially available wind turbines, based on the local wind characteristics of ten different sites located in the Western Cape region of South Africa. Ten-minute average time-series wind-speed data for a period of one year are used for the study. First, the wind distribution that best models the local wind regime of the sites is determined, based on the root mean square error (RMSE) and the coefficient of determination (R²), which are used to test goodness of fit. Next, the annual, seasonal, diurnal and peak-period capacity factors are estimated analytically. Then, the influence of the turbine power-curve parameters on the capacity factor is investigated. The key results show that the wind distribution of all the sites can best be modelled statistically using the Weibull distribution. Site WM05 (Napier) presents the highest capacity factor for all the turbines, indicating that this site has the highest wind power potential of all the available sites. Site WM02 (Calvinia) has the lowest capacity factor, i.e. the lowest wind power potential. This paper can assist in the planning and development of large-scale wind power-generating sites in South Africa.
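
    The analytical capacity-factor estimate described above amounts to integrating a power curve against the fitted Weibull density; a minimal sketch with invented Weibull parameters and a generic idealised turbine (not one of the 20 turbines or ten sites studied) follows.

      # Capacity factor from a Weibull wind-speed model and an idealised power curve.
      import numpy as np
      from scipy import stats
      from scipy.integrate import quad

      k, c = 2.1, 7.5                       # hypothetical Weibull shape and scale (m/s)
      v_ci, v_r, v_co = 3.0, 12.0, 25.0     # cut-in, rated and cut-out speeds (m/s)
      P_rated = 2000.0                      # rated power (kW)

      def power(v):
          if v < v_ci or v > v_co:
              return 0.0
          if v >= v_r:
              return P_rated
          # simple cubic rise between cut-in and rated speed
          return P_rated * (v**3 - v_ci**3) / (v_r**3 - v_ci**3)

      pdf = lambda v: stats.weibull_min.pdf(v, k, scale=c)
      mean_power, _ = quad(lambda v: power(v) * pdf(v), 0.0, v_co)
      print(f"capacity factor ~ {mean_power / P_rated:.3f}")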

  17. Log-concavity property for some well-known distributions

    Directory of Open Access Journals (Sweden)

    G. R. Mohtashami Borzadaran

    2011-12-01

    Full Text Available Interesting properties and propositions in many branches of science, such as economics, have been obtained according to the property that the cumulative distribution function of a random variable is a concave function. Caplin and Nalebuff (1988, 1989), Bagnoli and Khanna (1989) and Bagnoli and Bergstrom (1989, 2005) have discussed the log-concavity property of probability distributions and their applications, especially in economics. Log-concavity concerns a twice-differentiable real-valued function g whose domain is an interval on the extended real line. The function g is said to be log-concave on the interval (a,b) if ln(g) is a concave function on (a,b). Log-concavity of g on (a,b) is equivalent to g'/g being monotone decreasing on (a,b), i.e. to (ln(g))'' being non-positive there. Previous studies [6] have obtained log-concavity for distributions such as the normal, logistic, extreme-value, exponential, Laplace, Weibull, power function, uniform, gamma, beta, Pareto, log-normal, Student's t, Cauchy and F distributions. We have discussed and introduced the continuous versions of the Pearson family, found the log-concavity for this family in general cases, and then obtained the log-concavity property for each distribution that is a member of the Pearson family. The same has been done for the Burr family and for each distribution that belongs to it. Also, log-concavity results have been obtained for distributions such as the generalized gamma, Feller-Pareto, generalized inverse Gaussian and generalized log-normal distributions.

  18. On the performance of dual-hop mixed RF/FSO wireless communication system in urban area over aggregated exponentiated Weibull fading channels with pointing errors

    Science.gov (United States)

    Wang, Yue; Wang, Ping; Liu, Xiaoxia; Cao, Tian

    2018-03-01

    The performance of decode-and-forward dual-hop mixed radio frequency / free-space optical system in urban area is studied. The RF link is modeled by the Nakagami-m distribution and the FSO link is described by the composite exponentiated Weibull (EW) fading channels with nonzero boresight pointing errors (NBPE). For comparison, the ABER results without pointing errors (PE) and those with zero boresight pointing errors (ZBPE) are also provided. The closed-form expression for the average bit error rate (ABER) in RF link is derived with the help of hypergeometric function, and that in FSO link is obtained by Meijer's G and generalized Gauss-Laguerre quadrature functions. Then, the end-to-end ABERs with binary phase shift keying modulation are achieved on the basis of the computed ABER results of RF and FSO links. The end-to-end ABER performance is further analyzed with different Nakagami-m parameters, turbulence strengths, receiver aperture sizes and boresight displacements. The result shows that with ZBPE and NBPE considered, FSO link suffers a severe ABER degradation and becomes the dominant limitation of the mixed RF/FSO system in urban area. However, aperture averaging can bring significant ABER improvement of this system. Monte Carlo simulation is provided to confirm the validity of the analytical ABER expressions.

  19. STATISTICS, Program System for Statistical Analysis of Experimental Data

    International Nuclear Information System (INIS)

    Helmreich, F.

    1991-01-01

    1 - Description of problem or function: The package is composed of 83 routines, the most important of which are the following: BINDTR: Binomial distribution; HYPDTR: Hypergeometric distribution; POIDTR: Poisson distribution; GAMDTR: Gamma distribution; BETADTR: Beta-1 and Beta-2 distributions; NORDTR: Normal distribution; CHIDTR: Chi-square distribution; STUDTR : Distribution of 'Student's T'; FISDTR: Distribution of F; EXPDTR: Exponential distribution; WEIDTR: Weibull distribution; FRAKTIL: Calculation of the fractiles of the normal, chi-square, Student's, and F distributions; VARVGL: Test for equality of variance for several sample observations; ANPAST: Kolmogorov-Smirnov test and chi-square test of goodness of fit; MULIRE: Multiple linear regression analysis for a dependent variable and a set of independent variables; STPRG: Performs a stepwise multiple linear regression analysis for a dependent variable and a set of independent variables. At each step, the variable entered into the regression equation is the one which has the greatest amount of variance between it and the dependent variable. Any independent variable can be forced into or deleted from the regression equation, irrespective of its contribution to the equation. LTEST: Tests the hypotheses of linearity of the data. SPRANK: Calculates the Spearman rank correlation coefficient. 2 - Method of solution: VARVGL: The Bartlett's Test, the Cochran's Test and the Hartley's Test are performed in the program. MULIRE: The Gauss-Jordan method is used in the solution of the normal equations. STPRG: The abbreviated Doolittle method is used to (1) determine variables to enter into the regression, and (2) complete regression coefficient calculation. 3 - Restrictions on the complexity of the problem: VARVGL: The Hartley's Test is only performed if the sample observations are all of the same size

  20. Protein dynamics and stability: The distribution of atomic fluctuations in thermophilic and mesophilic dihydrofolate reductase derived using elastic incoherent neutron scattering

    International Nuclear Information System (INIS)

    Meinhold, Lars; Clement, David; Tehei, M.; Daniel, R.M.; Finney, J.L.; Smith, Jeremy C.

    2008-01-01

    The temperature dependence of the dynamics of mesophilic and thermophilic dihydrofolate reductase is examined using elastic incoherent neutron scattering. It is demonstrated that the distribution of atomic displacement amplitudes can be derived from the elastic scattering data by assuming a (Weibull) functional form that resembles distributions seen in molecular dynamics simulations. The thermophilic enzyme has a significantly broader distribution than its mesophilic counterpart. Furthermore, although the rate of increase with temperature of the atomic mean-square displacements extracted from the dynamic structure factor is found to be comparable for both enzymes, the amplitudes are found to be slightly larger for the thermophilic enzyme. Therefore, these results imply that the thermophilic enzyme is the more flexible of the two

  1. The effect of roughness model on scattering properties of ice crystals

    International Nuclear Information System (INIS)

    Geogdzhayev, Igor; Diedenhoven, Bastiaan van

    2016-01-01

    We compare stochastic models of microscale surface roughness assuming uniform and Weibull distributions of crystal facet tilt angles to calculate scattering by roughened hexagonal ice crystals using the geometric optics (GO) approximation. Both distributions are determined by similar roughness parameters, while the Weibull model depends on the additional shape parameter. Calculations were performed for two visible wavelengths (864 nm and 410 nm) for roughness values between 0.2 and 0.7 and Weibull shape parameters between 0 and 1.0 for crystals with aspect ratios of 0.21, 1 and 4.8. For this range of parameters we find that, for a given roughness level, varying the Weibull shape parameter can change the asymmetry parameter by up to about 0.05. The largest effect of the shape parameter variation on the phase function is found in the backscattering region, while the degree of linear polarization is most affected at the side-scattering angles. For high roughness, scattering properties calculated using the uniform and Weibull models are in relatively close agreement for a given roughness parameter, especially when a Weibull shape parameter of 0.75 is used. For smaller roughness values, a shape parameter close to unity provides a better agreement. Notable differences are observed in the phase function over the scattering angle range from 5° to 20°, where the uniform roughness model produces a plateau while the Weibull model does not. - Highlights: • We compare scattering by hexagonal crystals for uniform and Weibull roughness models. • The Weibull shape parameter has a stronger effect on the phase function at backscattering. • DoLP is mostly affected at the side-scattering angles. • For high roughness, the two models are in relatively close agreement for a given roughness. • A plateau from 5° to 20° is observed in the phase function when using the uniform model.

  2. Do Insect Populations Die at Constant Rates as They Become Older? Contrasting Demographic Failure Kinetics with Respect to Temperature According to the Weibull Model.

    Directory of Open Access Journals (Sweden)

    Petros Damos

    Full Text Available Temperature implies contrasting biological causes of demographic aging in poikilotherms. In this work, we used reliability theory to describe the consistency of mortality with age in moth populations and to show that differentiation in hazard rates is related to extrinsic environmental causes such as temperature. Moreover, experiments that manipulate extrinsic mortality were used to distinguish temperature-related death rates and the pertinence of the Weibull aging model. The Newton-Raphson optimization method was applied to calculate parameters for small samples of ages at death by estimating the maximum likelihood surfaces using scored gradient vectors and the Hessian matrix. The study reveals for the first time that the Weibull function is able to describe contrasting biological causes of demographic aging for moth populations maintained at different temperature regimes. We demonstrate that under favourable conditions the insect death rate accelerates as age advances, in contrast to the extreme temperatures at which each individual drifts toward death in a linear fashion and has a constant chance of passing away. Moreover, the slope of the hazard rate shifts towards a constant initial rate, a pattern demonstrated by systems that are not wearing out (i.e. non-aging), since failure, or death, is then a random event independent of time. This finding may appear surprising because, traditionally, it was mostly thought, as a rule, that in an aging population the force of mortality increases exponentially until all individuals have died. Moreover, in relation to other studies, we have not observed any typical decelerating aging patterns at late life (mortality leveling-off), but rather accelerated hazard rates at optimum temperatures and a stabilized increase at the extremes. In most cases, the increase in aging-related mortality was simulated reasonably well by the Weibull survivorship model that is applied. Moreover, semi-log probability hazard
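
    The contrast described above is governed by the Weibull shape parameter: a shape of 1 gives a constant hazard (deaths independent of age), while larger shapes give a hazard that accelerates with age. A small illustration with invented parameter values follows.

      # Weibull hazard h(t) = (k/lam) * (t/lam)**(k-1) for different shape parameters k.
      import numpy as np

      def weibull_hazard(t, k, lam):
          return (k / lam) * (t / lam) ** (k - 1.0)

      t = np.linspace(1.0, 30.0, 4)          # ages (days), hypothetical
      for k in (1.0, 1.5, 3.0):              # k = 1: constant hazard; k > 1: wear-out / aging
          h = weibull_hazard(t, k, lam=20.0)
          print(f"k={k:3.1f}  hazard at t={np.round(t, 1).tolist()}: {np.round(h, 3).tolist()}")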

  3. Modeling wind speed and wind power distributions in Rwanda

    Energy Technology Data Exchange (ETDEWEB)

    Safari, Bonfils [Department of Physics, National University of Rwanda, P.O. Box 117, Huye District, South Province (Rwanda)

    2011-02-15

    Utilization of wind energy as an alternative energy source may offer many environmental and economic advantages compared to energy sources based on fossil fuels, which pollute the lower layers of the atmosphere. Wind energy, like other forms of alternative energy, may offer the promise of meeting energy demand in direct, grid-connected modes as well as in stand-alone and remote applications. Wind speed is the most significant parameter of wind energy; hence, an accurate determination of the probability distribution of wind speed values is very important in estimating the wind energy potential over a region. In the present study, the parameters of five probability density functions, namely the Weibull, Rayleigh, lognormal, normal and gamma distributions, were calculated in the light of long-term hourly observed data at four meteorological stations in Rwanda for the period of the year with fairly useful wind energy potential (monthly hourly mean wind speed ≥ 2 m s⁻¹). In order to select well-fitting probability density functions, graphical comparisons with the empirical distributions were made. In addition, the RMSE and MBE were computed for each distribution and the magnitudes of the errors were compared. Residuals of the theoretical distributions were analyzed graphically. Finally, a selection of the three distributions that best fit the empirical distribution of the measured wind speed data was performed with the aid of a χ² goodness-of-fit test for each station. (author)

  4. On the Weibull distribution for wind energy assessment

    DEFF Research Database (Denmark)

    Batchvarova, Ekaterina; Gryning, Sven-Erik

    2014-01-01

    -term measurements performed by a wind lidar, the vertical profile of the shape parameter will be discussed for a sub-urban site, a coastal site and a marine site. The profile of the shape parameter was found to be substantially different over land and sea. A parameterization of the vertical behavior of the shape...

  5. Percentile-based Weibull diameter distribution model for Pinus ...

    African Journals Online (AJOL)

    Using a site index equation and stem volume model developed for Pinus kesiya in the Philippines, a yield prediction system was created to predict the volume per ha (VPH) for each diameter class and, subsequently, the total volume of a stand. To evaluate the yield prediction system, the predicted mean VPH for each ...

  6. Probabilistic analysis of a materially nonlinear structure

    Science.gov (United States)

    Millwater, H. R.; Wu, Y.-T.; Fossum, A. F.

    1990-01-01

    A probabilistic finite element program is used to perform probabilistic analysis of a materially nonlinear structure. The program used in this study is NESSUS (Numerical Evaluation of Stochastic Structure Under Stress), under development at Southwest Research Institute. The cumulative distribution function (CDF) of the radial stress of a thick-walled cylinder under internal pressure is computed and compared with the analytical solution. In addition, sensitivity factors showing the relative importance of the input random variables are calculated. Significant plasticity is present in this problem and has a pronounced effect on the probabilistic results. The random input variables are the material yield stress and internal pressure with Weibull and normal distributions, respectively. The results verify the ability of NESSUS to compute the CDF and sensitivity factors of a materially nonlinear structure. In addition, the ability of the Advanced Mean Value (AMV) procedure to assess the probabilistic behavior of structures which exhibit a highly nonlinear response is shown. Thus, the AMV procedure can be applied with confidence to other structures which exhibit nonlinear behavior.

  7. Pocket Handbook on Reliability

    Science.gov (United States)

    1975-09-01

    exponential distributions, the Weibull distribution, estimating reliability, confidence intervals, reliability growth, OC curves, Bayesian analysis. ... An introduction for those not familiar with reliability and a good refresher for those who are currently working in the area. ... Includes one or both of the following objectives: a) prediction of the current system reliability, b) projection of the system reliability for some future...

  8. Statistical study on applied stress dependence of failure time in stress corrosion cracking of Zircaloy-4 alloy

    International Nuclear Information System (INIS)

    Hirao, Keiichi; Yamane, Toshimi; Minamino, Yoritoshi; Tanaka, Akiei.

    1988-01-01

    The effects of applied stress on the failure time in stress corrosion cracking of Zircaloy-4 alloy were investigated by the Weibull distribution method. Test pieces in evacuated silica tubes were annealed at 1,073 K for 7.2 × 10³ s and then quenched into ice-water. These specimens, under constant applied stresses of 40-90% of the yield stress, were immersed in a CH₃OH-1 wt% I₂ solution at room temperature. The probability distribution of failure times under an applied stress of 40% of the yield stress was described by a single Weibull distribution with one shape parameter. The probability distributions of failure times under applied stresses above 60% of the yield stress were described by composite and mixed Weibull distributions, which had two shape parameters corresponding to the shorter-time and longer-time regions of failure. The values of these shape parameters in this study were larger than 1, which corresponds to wear-out failure. The observation of fracture surfaces and the stress dependence of the shape parameters indicated that the shape parameters both for the failure times under 40% of the yield stress and for the longer failure times above 60% of the yield stress corresponded to intergranular cracking, and that for the shorter failure times corresponded to transgranular cracking and dimple fracture. (author)

  9. Universal behaviour in the stock market: Time dynamics of the electronic orderbook

    Energy Technology Data Exchange (ETDEWEB)

    Kızılersü, Ayşe, E-mail: ayse.kizilersu@adelaide.edu.au [Special Research Centre for the Subatomic Structure of Matter (CSSM), Department of Physics, School of Chemistry and Physics, Adelaide University, 5005 (Australia); Kreer, Markus [phi-t products & services, Karlsruher Strasse 88, 76139 Karlsruhe (Germany); Thomas, Anthony W. [CoEPP and CSSM, Department of Physics, Adelaide University, SA 5005 (Australia); Feindt, Michael [Blue Yonder GmbH, Ohiostraße 8, 756139 Karlsruhe (Germany)

    2016-07-29

    A consequence of the digital revolution is that share trading at the stock exchange takes place via electronic order books which are accessed by traders and investors via the internet. Our empirical findings for the London Stock Exchange demonstrate that, once ultra-high-frequency manipulation on time scales less than around ten milliseconds is excluded, all relevant changes in the order book happen with time differences that are randomly distributed and well described by a left-truncated Weibull distribution with a universal shape parameter (independent of time and the same for all stocks). The universal shape parameter corresponds to maximum entropy of the distribution. - Highlights: • After the ultra-high-frequency manipulation is excluded, all the time differences in the EOB are described by a left-truncated Weibull distribution. • The Weibull shape parameter is universal, i.e. independent of time and the same for all stocks, and it is equal to the Euler–Mascheroni constant. • The universal shape parameter corresponds to maximum entropy of the distribution.
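
    A minimal sketch of fitting a left-truncated Weibull by maximum likelihood: the truncated density is the ordinary Weibull density divided by its survival function at the threshold. The sample, the threshold and the starting values are all synthetic assumptions, not the order-book data.

      # Maximum-likelihood fit of a left-truncated Weibull to inter-event times.
      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(1)
      t0 = 0.01                                      # truncation threshold (e.g. ~10 ms)
      raw = 0.5 * rng.weibull(0.57, 20000)           # synthetic Weibull(k=0.57, scale=0.5) times
      times = raw[raw > t0]                          # keep only observations above t0

      def neg_loglik(params):
          k, lam = params
          if k <= 0 or lam <= 0:
              return np.inf
          z = times / lam
          loglik = np.sum(np.log(k) - np.log(lam) + (k - 1.0) * np.log(z) - z**k)
          loglik += times.size * (t0 / lam) ** k     # account for left truncation at t0
          return -loglik

      res = minimize(neg_loglik, x0=[1.0, np.mean(times)], method="Nelder-Mead")
      k_hat, lam_hat = res.x
      print(f"shape k ~ {k_hat:.3f} (Euler-Mascheroni ~ 0.5772), scale ~ {lam_hat:.3f}")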

  10. Universal behaviour in the stock market: Time dynamics of the electronic orderbook

    International Nuclear Information System (INIS)

    Kızılersü, Ayşe; Kreer, Markus; Thomas, Anthony W.; Feindt, Michael

    2016-01-01

    A consequence of the digital revolution is that share trading at the stock exchange takes place via electronic order books which are accessed by traders and investors via the internet. Our empirical findings for the London Stock Exchange demonstrate that, once ultra-high-frequency manipulation on time scales less than around ten milliseconds is excluded, all relevant changes in the order book happen with time differences that are randomly distributed and well described by a left-truncated Weibull distribution with a universal shape parameter (independent of time and the same for all stocks). The universal shape parameter corresponds to maximum entropy of the distribution. - Highlights: • After the ultra-high-frequency manipulation is excluded, all the time differences in the EOB are described by a left-truncated Weibull distribution. • The Weibull shape parameter is universal, i.e. independent of time and the same for all stocks, and it is equal to the Euler–Mascheroni constant. • The universal shape parameter corresponds to maximum entropy of the distribution.

  11. Evaluación poscosecha y estimación de vida útil de guayaba fresca utilizando el modelo de Weibull

    Directory of Open Access Journals (Sweden)

    García Mogollón Carlos

    2010-09-01

    Full Text Available

    Guava (Psidium guajava L.) is a tropical fruit susceptible to undesirable alterations that affect its shelf life, owing to inadequate storage and handling conditions. In this work the shelf life of fresh guava was estimated using the Weibull probabilistic model, and the quality of the fruit was evaluated during storage under different temperature and packaging conditions. The postharvest evaluation was carried out for 15 days with guavas of the regional red variety. A completely randomized design with a factorial arrangement was used, consisting of three factors: storage time with six levels (0, 3, 6, 9, 12 and 15 days); storage temperature with two levels, ambient (37 °C and relative humidity (RH) between 85 and 90%) and refrigerated (9±2 °C and RH of 85-90%); and two types of packaging, polystyrene tray with PVC plastic film and aluminium foil. A structured three-point satisfaction scale was used for the sensory evaluation during the storage period. The Weibull model proved adequate for predicting the shelf life of fresh guava, based on the fitting criteria and the acceptance and failure confidence limits. During the storage period it was observed that storage time, temperature and type of packaging have a statistically significant effect (P < 0.05) on the equivalent diameter, sphericity, apparent density, total soluble solids, pH, acidity and sensory evaluation of the fruit. The product can be consumed as fresh fruit for up to ten days of storage at ambient temperature and a maximum of fifteen days in refrigerated storage.

  12. Estimating the joint survival probabilities of married individuals

    NARCIS (Netherlands)

    Sanders, Lisanne; Melenberg, Bertrand

    We estimate the joint survival probability of spouses using a large random sample drawn from a Dutch census. As benchmarks we use two bivariate Weibull models. We consider more flexible models, using a semi-nonparametric approach, by extending the independent Weibull distribution using squared

  13. Hourly wind speed analysis in Sicily

    Energy Technology Data Exchange (ETDEWEB)

    Bivona, S.; Leone, C. [Palermo Univ., Dip di Fisica e Technologie Relative, Palermo (Italy); Burlon, R. [Palermo Univ., Dip. di Ingegnaria Nucleare, Palermo (Italy)

    2003-07-01

    The hourly average wind speed data recorded by CNMCA (Centro Nazionale di Meteorologia e Climatologia Aeronautica) have been used to study the statistical properties of the wind speed at nine locations on Sicily. By grouping the observations month by month, we show that the hourly average wind speed, with calms omitted, is represented by a Weibull function. The suitability of the distribution is judged by the discrepancies between the observed and calculated values of the monthly average wind speed and of the standard deviation. (Author)

  14. Reliability Characteristics of Power Plants

    Directory of Open Access Journals (Sweden)

    Zbynek Martinek

    2017-01-01

    Full Text Available This paper describes the reliability of power plants. It explains the terms connected with this topic, since their proper understanding is important for understanding the relations and equations that model possible real situations. The reliability phenomenon is analysed using both the exponential distribution and the Weibull distribution. The results of our analysis are specific equations giving information about the characteristics of the power plants, the mean time of operation and the probability of failure-free operation. The equations solved for the Weibull distribution take into account the failures as well as the actual operating hours. Thanks to our results, we are able to create a model of dynamic reliability for the prediction of future states. It can be useful for improving the current situation of the unit as well as for creating an optimal maintenance plan, and thus has an impact on the overall economics of the operation of these power plants.
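
    The two quantities named above, the probability of failure-free operation R(t) and the mean time to failure, take simple closed forms under the exponential and Weibull models; a short illustration with invented parameter values follows.

      # Reliability R(t) and MTTF under exponential and Weibull models (illustrative values).
      import numpy as np
      from scipy.special import gamma as gamma_fn

      t = 5000.0                      # mission time, hours (hypothetical)
      lam = 1.0 / 20000.0             # exponential failure rate (1/h)
      k, c = 1.6, 22000.0             # Weibull shape and scale (h)

      R_exp = np.exp(-lam * t)                  # exponential: constant hazard
      mttf_exp = 1.0 / lam

      R_weib = np.exp(-(t / c) ** k)            # Weibull: hazard increases with time for k > 1
      mttf_weib = c * gamma_fn(1.0 + 1.0 / k)

      print(f"exponential: R({t:.0f} h) = {R_exp:.3f}, MTTF = {mttf_exp:.0f} h")
      print(f"Weibull:     R({t:.0f} h) = {R_weib:.3f}, MTTF = {mttf_weib:.0f} h")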

  15. Distribution system modeling and analysis

    CERN Document Server

    Kersting, William H

    2001-01-01

    For decades, distribution engineers did not have the sophisticated tools developed for analyzing transmission systems-often they had only their instincts. Things have changed, and we now have computer programs that allow engineers to simulate, analyze, and optimize distribution systems. Powerful as these programs are, however, without a real understanding of the operating characteristics of a distribution system, engineers using the programs can easily make serious errors in their designs and operating procedures. Distribution System Modeling and Analysis helps prevent those errors. It gives readers a basic understanding of the modeling and operating characteristics of the major components of a distribution system. One by one, the author develops and analyzes each component as a stand-alone element, then puts them all together to analyze a distribution system comprising the various shunt and series devices for power-flow and short-circuit studies. He includes the derivation of all models and includes many num...

  16. Empirical analysis for Distributed Energy Resources' impact on future distribution network

    DEFF Research Database (Denmark)

    Han, Xue; Sandels, Claes; Zhu, Kun

    2012-01-01

    There has been a large body of statements claiming that the large-scale deployment of Distributed Energy Resources (DERs) will eventually reshape future distribution grid operation in various ways. Thus, it is interesting to introduce a platform to interpret to what extent the power system...... operation will be altered. In this paper, quantitative results are presented in terms of how the future distribution grid will be changed by the deployment of distributed generation, active demand and electric vehicles. The analysis is based on the conditions for both a radial and a meshed distribution...... network. The input parameters are based on the current and envisioned DER deployment scenarios proposed for Sweden....

  17. Utilização da função pearson tipo V, Weibull e hiperbólica para modelagem da distribuição de diâmetros

    Directory of Open Access Journals (Sweden)

    Daniel Henrique Breda Binoti

    2013-09-01

    Full Text Available The objective of this study was to evaluate the efficiency of the log-Pearson type V function for describing the diameter structure of even-aged eucalyptus stands, as well as to propose a diameter-distribution model using this function. The modelling performed with the log-Pearson type V function was compared with modelling using the Weibull and hyperbolic functions. Data from permanent eucalyptus plots located in the centre-west region of the state of Minas Gerais were used. The Pearson type V function was tested in three different configurations: with three and with two parameters, and with the location parameter replaced by the minimum diameter of the plot. The adherence of the functions to the data was verified by applying the Kolmogorov-Smirnov (K-S) test. All fits showed adherence to the data according to the K-S test. The Weibull and hyperbolic functions performed better than the Pearson type V function.

  18. ASEP of MIMO System with MMSE-OSIC Detection over Weibull-Gamma Fading Channel Subject to AWGGN

    Directory of Open Access Journals (Sweden)

    Keerti Tiwari

    2016-01-01

    Full Text Available Ordered successive interference cancellation (OSIC is adopted with minimum mean square error (MMSE detection to enhance the multiple-input multiple-output (MIMO system performance. The optimum detection technique improves the error rate performance but increases system complexity. Therefore, MMSE-OSIC detection is used which reduces error rate compared to traditional MMSE with low complexity. The system performance is analyzed in composite fading environment that includes multipath and shadowing effects known as Weibull-Gamma (WG fading. Along with the composite fading, a generalized noise that is additive white generalized Gaussian noise (AWGGN is considered to show the impact of wireless scenario. This noise model includes various forms of noise as special cases such as impulsive, Gamma, Laplacian, Gaussian, and uniform. Consequently, generalized Q-function is used to model noise. The average symbol error probability (ASEP of MIMO system is computed for 16-quadrature amplitude modulation (16-QAM using MMSE-OSIC detection in WG fading perturbed by AWGGN. Analytical expressions are given in terms of Fox-H function (FHF. These expressions demonstrate the best fit to simulation results.

  19. Author Details

    African Journals Online (AJOL)

    Kaoga, D.K. Performance Analysis of Methods for Estimating Weibull Parameters for Wind Speed Distribution in the District of Maroua. Vol. 6, No. 2 (2014), Articles. ISSN: 1112-9867.

  20. Assessment of wind speed and wind power through three stations in Egypt, including air density variation and analysis results with rough set theory

    International Nuclear Information System (INIS)

    Essa, K.S.M.; Embaby, M.; Marrouf, A.A.; Koza, A.M.; Abd El-Monsef, M.E.

    2007-01-01

    It is well known that the wind energy potential is proportional to both the air density and the third power of the wind speed, averaged over a suitable time period. The wind speed and air density are random variables depending on both time and location. The main objective of this work is to derive a general formulation of the wind energy potential that takes the time variation of both wind speed and air density into consideration. The correction factor is derived explicitly in terms of the cross-correlation and the coefficients of variation. The application is performed for environmental and wind speed measurements at Cairo Airport, Kosseir and Hurguada, Egypt. Comparisons are made between the Weibull, Rayleigh and actual data distributions of wind speed and wind power for one year (2005). A Weibull distribution is the best match to the actual probability distribution of the wind speed data for most stations. The maximum wind energy potential was 373 W/m² in June at Hurguada (Red Sea coast), where the annual mean value was 207 W/m². Using rough set theory, we find that the wind power depends more strongly on the wind speed than on the air density
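
    The point about averaging can be illustrated numerically: the time-averaged power density 0.5·ρ·v³ differs from an estimate that uses the mean air density alone, and the ratio between the two plays the role of a correction factor. The correlated ρ and v series below are synthetic, not the station data.

      # Effect of air-density variation (and its correlation with wind speed) on the
      # mean wind power density, using synthetic hourly series.
      import numpy as np

      rng = np.random.default_rng(2)
      n = 8760                                       # one year of hourly values
      v = 6.0 * rng.weibull(2.0, n)                  # wind speed (m/s), Weibull-like
      rho = 1.20 - 0.01 * (v - v.mean()) + rng.normal(0.0, 0.01, n)   # weakly anti-correlated density

      p_true = np.mean(0.5 * rho * v**3)             # time-resolved mean power density (W/m^2)
      p_mean_rho = 0.5 * rho.mean() * np.mean(v**3)  # same, but using the mean density only
      print(f"mean power density {p_true:.1f} W/m^2, correction factor {p_true / p_mean_rho:.4f}")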

  1. Optimal design of accelerated life tests for an extension of the exponential distribution

    International Nuclear Information System (INIS)

    Haghighi, Firoozeh

    2014-01-01

    Accelerated life tests provide information quickly on the lifetime distribution of the products by testing them at higher than usual levels of stress. In this paper, the lifetime of a product at any level of stress is assumed to have an extension of the exponential distribution. This new family has been recently introduced by Nadarajah and Haghighi (2011 [1]); it can be used as an alternative to the gamma, Weibull and exponentiated exponential distributions. The scale parameter of lifetime distribution at constant stress levels is assumed to be a log-linear function of the stress levels and a cumulative exposure model holds. For this model, the maximum likelihood estimates (MLEs) of the parameters, as well as the Fisher information matrix, are derived. The asymptotic variance of the scale parameter at a design stress is adopted as an optimization objective and its expression formula is provided using the maximum likelihood method. A Monte Carlo simulation study is carried out to examine the performance of these methods. The asymptotic confidence intervals for the parameters and hypothesis test for the parameter of interest are constructed

  2. The ATLAS distributed analysis system

    OpenAIRE

    Legger, F.

    2014-01-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During...

  3. Wear-Out Sensitivity Analysis Project Abstract

    Science.gov (United States)

    Harris, Adam

    2015-01-01

    During the course of the Summer 2015 internship session, I worked in the Reliability and Maintainability group of the ISS Safety and Mission Assurance department. My project was a statistical analysis of how sensitive ORUs (Orbital Replacement Units) are to a reliability parameter called the wear-out characteristic. The intended goal of this was to determine a worst-case scenario of how many spares would be needed if multiple systems started exhibiting wear-out characteristics simultaneously. The goal was also to determine which parts would be most likely to do so. In order to do this, my duties were to take historical data of operational times and failure times of these ORUs and use them to build predictive models of failure using probability distribution functions, mainly the Weibull distribution. Then, I ran Monte Carlo simulations to see how an entire population of these components would perform. From here, my final duty was to vary the wear-out characteristic from the intrinsic value to extremely high wear-out values and determine how much the probability of sufficiency of the population would shift. This was done for around 30 different ORU populations on board the ISS.
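
    A toy version of that Monte Carlo exercise: simulate Weibull failure times for a small fleet, vary the wear-out (shape) parameter, and estimate the probability that a fixed number of spares suffices. Fleet size, mission length and spare count are invented values, and replacement of failed units is ignored.

      # Monte Carlo sensitivity of spares sufficiency to the Weibull wear-out parameter.
      import numpy as np

      rng = np.random.default_rng(3)
      n_units, n_spares = 20, 4
      mission = 5.0            # years
      char_life = 12.0         # Weibull scale (characteristic life, years), hypothetical
      n_trials = 20000

      for shape in (1.0, 2.0, 4.0):       # 1.0 = no wear-out; larger = stronger wear-out
          fails = char_life * rng.weibull(shape, size=(n_trials, n_units))
          n_failed = np.sum(fails < mission, axis=1)       # failures before end of mission
          p_sufficient = np.mean(n_failed <= n_spares)     # spares cover all failures (no replacement)
          print(f"shape={shape:.1f}: P(spares sufficient) ~ {p_sufficient:.3f}")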

  4. Repeated Time-to-event Analysis of Consecutive Analgesic Events in Postoperative Pain

    DEFF Research Database (Denmark)

    Juul, Rasmus Vestergaard; Rasmussen, Sten; Kreilgaard, Mads

    2015-01-01

    BACKGROUND: Reduction in consumption of opioid rescue medication is often used as an endpoint when investigating analgesic efficacy of drugs by adjunct treatment, but appropriate methods are needed to analyze analgesic consumption in time. Repeated time-to-event (RTTE) modeling is proposed as a way...... to describe analgesic consumption by analyzing the timing of consecutive analgesic events. METHODS: Retrospective data were obtained from 63 patients receiving standard analgesic treatment including morphine on request after surgery following hip fracture. Times of analgesic events up to 96 h after surgery...... were extracted from hospital medical records. Parametric RTTE analysis was performed with exponential, Weibull, or Gompertz distribution of analgesic events using NONMEM®, version 7.2 (ICON Development Solutions, USA). The potential influences of night versus day, sex, and age were investigated...

  5. Probabilistic analysis of flaw distribution on structure under cyclic load

    International Nuclear Information System (INIS)

    Kwak, Sang Log; Choi, Young Hwan; Kim, Hho Jung

    2003-01-01

    Flaw geometries, applied stress and material properties are major input variables for fracture mechanics analysis. A probabilistic approach can be applied to account for the uncertainties in these input variables, but probabilistic analysis requires many assumptions due to the lack of initial flaw distribution data. In this study, correlations are examined between initial flaw distributions and in-service flaw distributions on structures under cyclic load. For the analysis, LEFM theories and Monte Carlo simulation are applied. The results show that in-service flaw distributions are determined by the initial flaw distributions rather than by the fatigue crack growth rate, so the initial flaw distribution can be derived from the in-service flaw distributions.
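
    A compact sketch of the kind of Monte Carlo exercise described: sample an assumed initial flaw-size distribution and grow each flaw with a Paris-law LEFM model, then compare the initial and in-service distributions. All material constants, the stress range and the geometry factor are illustrative assumptions, not values from the study.

      # Monte Carlo propagation of an initial flaw-size distribution under cyclic load
      # using Paris-law crack growth: da/dN = C * (Y * d_sigma * sqrt(pi*a))**m.
      import numpy as np

      rng = np.random.default_rng(5)
      a0 = rng.lognormal(mean=np.log(0.5e-3), sigma=0.4, size=5000)   # initial depths (m)

      C, m = 1.0e-11, 3.0          # Paris-law constants (assumed units: m/cycle, MPa*sqrt(m))
      d_sigma, Y = 120.0, 1.12     # stress range (MPa) and geometry factor
      cycles, block = 2.0e5, 1.0e3 # total cycles, integration block size

      a = a0.copy()
      for _ in range(int(cycles / block)):
          dK = Y * d_sigma * np.sqrt(np.pi * a)        # stress-intensity factor range
          a = a + C * dK**m * block                    # explicit growth over one block

      for label, depths in (("initial", a0), ("in-service", a)):
          print(f"{label:11s} flaws: median={np.median(depths)*1e3:.2f} mm, "
                f"95th pct={np.percentile(depths, 95)*1e3:.2f} mm")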

  6. Entropy-based derivation of generalized distributions for hydrometeorological frequency analysis

    Science.gov (United States)

    Chen, Lu; Singh, Vijay P.

    2018-02-01

    Frequency analysis of hydrometeorological and hydrological extremes is needed for the design of hydraulic and civil infrastructure facilities as well as for water resources management. A multitude of distributions have been employed for frequency analysis of these extremes. However, no single distribution has been accepted as a global standard. Employing the entropy theory, this study derived five generalized distributions for frequency analysis that used different kinds of information encoded as constraints. These distributions were the generalized gamma (GG), the generalized beta distribution of the second kind (GB2), the Halphen type A (Hal-A), the Halphen type B (Hal-B) and the Halphen type inverse B (Hal-IB) distributions, among which the GG and GB2 distributions were previously derived by Papalexiou and Koutsoyiannis (2012) and the Halphen family was first derived using entropy theory in this paper. The entropy theory allowed the parameters of the distributions to be estimated in terms of the constraints used for their derivation. The distributions were tested using extreme daily and hourly rainfall data. Results show that the root mean square error (RMSE) values were very small, which indicated that the five generalized distributions fitted the extreme rainfall data well. Among them, according to the Akaike information criterion (AIC) values, the GB2 and the Halphen family generally gave a better fit. Therefore, these general distributions are among the best choices for frequency analysis. The entropy-based derivation led to a new way for frequency analysis of hydrometeorological extremes.

  7. Experiment research on cognition reliability model of nuclear power plant

    International Nuclear Information System (INIS)

    Zhao Bingquan; Fang Xiang

    1999-01-01

    The objective of this paper is to improve the operational reliability of operators in real nuclear power plants through simulation research on the cognition reliability of nuclear power plant operators. The research method is to use a nuclear power plant simulator as the research platform, to take the present international research model of human cognition reliability based on the three-parameter Weibull distribution as a reference, and to develop a research model for Chinese nuclear power plant operators based on the two-parameter Weibull distribution. Using the two-parameter Weibull distribution model of cognition reliability, experiments on the cognition reliability of nuclear power plant operators have been carried out. Compared with the results of other countries, such as the USA and Hungary, the same results can be obtained, which is beneficial to the safe operation of nuclear power plants

  8. Fluctuations in a quasi-stationary shallow cumulus cloud ensemble

    Directory of Open Access Journals (Sweden)

    M. Sakradzija

    2015-01-01

    Full Text Available We propose an approach to stochastic parameterisation of shallow cumulus clouds to represent the convective variability and its dependence on the model resolution. To collect information about the individual cloud lifecycles and the cloud ensemble as a whole, we employ a large eddy simulation (LES model and a cloud tracking algorithm, followed by conditional sampling of clouds at the cloud-base level. In the case of a shallow cumulus ensemble, the cloud-base mass flux distribution is bimodal, due to the different shallow cloud subtypes, active and passive clouds. Each distribution mode can be approximated using a Weibull distribution, which is a generalisation of exponential distribution by accounting for the change in distribution shape due to the diversity of cloud lifecycles. The exponential distribution of cloud mass flux previously suggested for deep convection parameterisation is a special case of the Weibull distribution, which opens a way towards unification of the statistical convective ensemble formalism of shallow and deep cumulus clouds. Based on the empirical and theoretical findings, a stochastic model has been developed to simulate a shallow convective cloud ensemble. It is formulated as a compound random process, with the number of convective elements drawn from a Poisson distribution, and the cloud mass flux sampled from a mixed Weibull distribution. Convective memory is accounted for through the explicit cloud lifecycles, making the model formulation consistent with the choice of the Weibull cloud mass flux distribution function. The memory of individual shallow clouds is required to capture the correct convective variability. The resulting distribution of the subgrid convective states in the considered shallow cumulus case is scale-adaptive – the smaller the grid size, the broader the distribution.
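
    The compound process described above can be sampled directly: draw the number of clouds from a Poisson distribution and each cloud-base mass flux from a two-component (active/passive) Weibull mixture. All parameter values below are invented for illustration and are not the LES-derived values.

      # Sample a compound Poisson process with a mixed (active/passive) Weibull
      # cloud-base mass-flux distribution, as a sketch of the stochastic model above.
      import numpy as np

      rng = np.random.default_rng(4)

      def sample_total_mass_flux(mean_clouds, p_active, k_act, c_act, k_pas, c_pas):
          n = rng.poisson(mean_clouds)                    # number of convective elements
          active = rng.random(n) < p_active               # which clouds are "active"
          m = np.where(active,
                       c_act * rng.weibull(k_act, n),     # active-cloud mass fluxes
                       c_pas * rng.weibull(k_pas, n))     # passive-cloud mass fluxes
          return m.sum()

      totals = [sample_total_mass_flux(mean_clouds=40, p_active=0.4,
                                       k_act=0.8, c_act=2.0e7, k_pas=1.2, c_pas=2.0e6)
                for _ in range(5000)]
      print(f"subgrid mass flux: mean = {np.mean(totals):.3e}, std = {np.std(totals):.3e} (kg/s)")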

  9. Probabilistic distributions of wind velocity for the evaluation of the wind power potential; Distribuicoes probabilisticas de velocidades do vento para avaliacao do potencial energetico eolico

    Energy Technology Data Exchange (ETDEWEB)

    Vendramini, Elisa Zanuncio

    1986-10-01

    Theoretical models of wind speed distributions provide valuable information about the probability of events relative to the variable under study, eliminating the need for a new experiment. The most commonly used distributions have been the Weibull and the Rayleigh. These distributions are examined in the present investigation, as well as the exponential, gamma, chi-square and lognormal distributions. Three years of hourly average wind data recorded by an anemometer located at the city of Ataliba Leonel, Sao Paulo State, Brazil, were used. From the wind speed records, the theoretical relative frequency was calculated for each of the distributions examined. Results from the Kolmogorov-Smirnov test lead to the conclusion that the lognormal distribution fits the wind speed data best, followed by the gamma and Rayleigh distributions. Using the lognormal probability density function, the yearly energy output from a wind generator installed at the site was calculated. 30 refs, 4 figs, 14 tabs
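
    As an illustration of what such a fitted distribution is used for, the sketch below computes the mean wind power density from a lognormal speed model as P/A = 0.5·ρ·E[v³]; the lognormal parameters are assumptions for illustration, not the values estimated in the thesis.

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

rho = 1.225                                      # air density, kg/m^3
# Assumed lognormal fit to hourly wind speeds (m/s); not the thesis values.
speed = stats.lognorm(s=0.5, scale=np.exp(1.5))

mean_power_density, _ = quad(lambda v: 0.5 * rho * v**3 * speed.pdf(v), 0.0, np.inf)
print(f"mean wind power density ~ {mean_power_density:.1f} W/m^2")
```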

  10. A Comparison of Distribution Free and Non-Distribution Free Factor Analysis Methods

    Science.gov (United States)

    Ritter, Nicola L.

    2012-01-01

    Many researchers recognize that factor analysis can be conducted on both correlation matrices and variance-covariance matrices. Although most researchers extract factors from non-distribution free or parametric methods, researchers can also extract factors from distribution free or non-parametric methods. The nature of the data dictates the method…

  11. Storage Capacity Calculation of Wind Power Based on Weibull Model

    Institute of Scientific and Technical Information of China (English)

    王树超

    2013-01-01

    The wind speed model and wind generator output model of a wind farm are analysed. A Weibull function is applied to establish the wind speed distribution model, and the expectation concept from probability theory is used to calculate the power capacity of the energy storage system. Simulation experiments show that the resulting ratio of wind power to storage capacity is reasonable and meets the requirements of the power system. Under the condition of satisfying China's wind power grid-connection standard, the energy storage scale should be minimized as far as possible; the approach is verified with actual wind farm data.
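
    A hedged sketch of the first step of such a study: the expected output of one turbine under a Weibull wind-speed model and a generic piecewise power curve. The Weibull parameters and turbine data are assumptions for illustration, not values from the paper; storage sizing would build on this expectation.

```python
from scipy import stats
from scipy.integrate import quad

k, c = 2.0, 7.5                                   # assumed Weibull shape and scale (m/s)
speed = stats.weibull_min(k, scale=c)

v_in, v_rated, v_out, p_rated = 3.0, 12.0, 25.0, 2.0e6   # assumed turbine data (W)

def power_curve(v):
    """Generic cut-in / cubic / rated / cut-out turbine curve."""
    if v < v_in or v >= v_out:
        return 0.0
    if v >= v_rated:
        return p_rated
    return p_rated * (v**3 - v_in**3) / (v_rated**3 - v_in**3)

expected_power, _ = quad(lambda v: power_curve(v) * speed.pdf(v), 0.0, v_out)
print(f"expected output per turbine ~ {expected_power / 1e6:.2f} MW")
```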

  12. Generalized Analysis of a Distribution Separation Method

    Directory of Open Access Journals (Sweden)

    Peng Zhang

    2016-04-01

    Separating two probability distributions from a mixture model that is made up of the combinations of the two is essential to a wide range of applications. For example, in information retrieval (IR), there often exists a mixture distribution consisting of a relevance distribution that we need to estimate and an irrelevance distribution that we hope to get rid of. Recently, a distribution separation method (DSM) was proposed to approximate the relevance distribution, by separating a seed irrelevance distribution from the mixture distribution. It was successfully applied to an IR task, namely pseudo-relevance feedback (PRF), where the query expansion model is often a mixture term distribution. Although initially developed in the context of IR, DSM is indeed a general mathematical formulation for probability distribution separation. Thus, it is important to further generalize its basic analysis and to explore its connections to other related methods. In this article, we first extend DSM’s theoretical analysis, which was originally based on the Pearson correlation coefficient, to entropy-related measures, including the KL-divergence (Kullback–Leibler divergence), the symmetrized KL-divergence and the JS-divergence (Jensen–Shannon divergence). Second, we investigate the distribution separation idea in a well-known method, namely the mixture model feedback (MMF) approach. We prove that MMF also complies with the linear combination assumption, and then, DSM’s linear separation algorithm can largely simplify the EM algorithm in MMF. These theoretical analyses, as well as further empirical evaluation results, demonstrate the advantages of our DSM approach.

  13. Stochastic distribution of the required coefficient of friction for level walking--an in-depth study.

    Science.gov (United States)

    Chang, Wen-Ruey; Matz, Simon; Chang, Chien-Chi

    2012-01-01

    This study investigated the stochastic distribution of the required coefficient of friction (RCOF) which is a critical element for estimating slip probability. Fifty participants walked under four walking conditions. The results of the Kolmogorov-Smirnov two-sample test indicate that 76% of the RCOF data showed a difference in distribution between both feet for the same participant under each walking condition; the data from both feet were kept separate. The results of the Kolmogorov-Smirnov goodness-of-fit test indicate that most of the distribution of the RCOF appears to have a good match with the normal (85.5%), log-normal (84.5%) and Weibull distributions (81.5%). However, approximately 7.75% of the cases did not have a match with any of these distributions. It is reasonable to use the normal distribution for representation of the RCOF distribution due to its simplicity and familiarity, but each foot had a different distribution from the other foot in 76% of cases. The stochastic distribution of the required coefficient of friction (RCOF) was investigated for use in a statistical model to improve the estimate of slip probability in risk assessment. The results indicate that 85.5% of the distribution of the RCOF appears to have a good match with the normal distribution.
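
    A minimal sketch of the goodness-of-fit step reported above, using a synthetic RCOF sample rather than the study's measurements: fit the normal, log-normal and Weibull distributions and run a Kolmogorov-Smirnov test on each (the p-values are only approximate when parameters are estimated from the same sample).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
rcof = rng.normal(0.18, 0.03, size=200)      # placeholder sample of RCOF values

for label, dist in [("normal", stats.norm),
                    ("log-normal", stats.lognorm),
                    ("Weibull", stats.weibull_min)]:
    params = dist.fit(rcof)
    ks_stat, p_value = stats.kstest(rcof, dist.name, args=params)
    print(f"{label:10s}  KS = {ks_stat:.3f}  p = {p_value:.3f}")
```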

  14. Master equation approach to the intra-urban passenger flow and application to the Metropolitan Seoul Subway system

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Keumsook [Department of Geography, Sungshin University, Seoul 136-742 (Korea, Republic of); Goh, Segun; Choi, M Y [Department of Physics and Astronomy and Center for Theoretical Physics, Seoul National University, Seoul 151-747 (Korea, Republic of); Park, Jong Soo [School of Information Technology, Sungshin University, Seoul 136-742 (Korea, Republic of); Jung, Woo-Sung, E-mail: kslee@sungshin.ac.kr, E-mail: mychoi@snu.ac.kr [Department of Physics and Basic Science Research Institute, Pohang University of Science and Technology, Pohang 790-784 (Korea, Republic of)

    2011-03-18

    The master equation approach is proposed to describe the evolution of passengers in a subway system. With the transition rate constructed from simple geographical consideration, the evolution equation for the distribution of subway passengers is found to bear skew distributions including log-normal, Weibull, and power-law distributions. This approach is then applied to the Metropolitan Seoul Subway system: analysis of the trip data of all passengers in a day reveals that the data in most cases fit well to the log-normal distributions. Implications of the results are also discussed.

  15. Master equation approach to the intra-urban passenger flow and application to the Metropolitan Seoul Subway system

    International Nuclear Information System (INIS)

    Lee, Keumsook; Goh, Segun; Choi, M Y; Park, Jong Soo; Jung, Woo-Sung

    2011-01-01

    The master equation approach is proposed to describe the evolution of passengers in a subway system. With the transition rate constructed from simple geographical consideration, the evolution equation for the distribution of subway passengers is found to bear skew distributions including log-normal, Weibull, and power-law distributions. This approach is then applied to the Metropolitan Seoul Subway system: analysis of the trip data of all passengers in a day reveals that the data in most cases fit well to the log-normal distributions. Implications of the results are also discussed.

  16. Uncertainty Analysis of Multi-Model Flood Forecasts

    Directory of Open Access Journals (Sweden)

    Erich J. Plate

    2015-12-01

    This paper demonstrates, by means of a systematic uncertainty analysis, that the use of outputs from more than one model can significantly improve conditional forecasts of discharges or water stages, provided the models are structurally different. Discharge forecasts from two models and the actual forecasted discharge are assumed to form a three-dimensional joint probability density distribution (jpdf), calibrated on long time series of data. The jpdf is decomposed into conditional probability density distributions (cpdf) by means of Bayes formula, as suggested and explored by Krzysztofowicz in a series of papers. In this paper his approach is simplified to optimize conditional forecasts for any set of two forecast models. Its application is demonstrated by means of models developed in a study of flood forecasting for station Stung Treng on the middle reach of the Mekong River in South-East Asia. Four different forecast models were used and pairwise combined: forecast with no model, with a persistence model, with a regression model, and with a rainfall-runoff model. Working with cpdfs requires determination of the dependency among variables, for which linear regressions are required, as was done by Krzysztofowicz. His Bayesian approach, based on transforming observed probability distributions of discharges and forecasts into normal distributions, is also explored. Results obtained with his method for normal prior and likelihood distributions are identical to results from direct multiple regressions. Furthermore, it is shown that in the present case forecast accuracy is only marginally improved if Weibull-distributed basic data are converted into normally distributed variables.

  17. Fractal analysis of SEM images and mercury intrusion porosimetry data for the microstructural characterization of microcrystalline cellulose-based pellets

    International Nuclear Information System (INIS)

    Gomez-Carracedo, A.; Alvarez-Lorenzo, C.; Coca, R.; Martinez-Pacheco, R.; Concheiro, A.; Gomez-Amoza, J.L.

    2009-01-01

    The microstructure of theophylline pellets prepared from microcrystalline cellulose, carbopol and dicalcium phosphate dihydrate, according to a mixture design, was characterized using textural analysis of gray-level scanning electron microscopy (SEM) images and thermodynamic analysis of the cumulative pore volume distribution obtained by mercury intrusion porosimetry. Surface roughness evaluated in terms of gray-level non-uniformity and fractal dimension of pellet surface depended on agglomeration phenomena during extrusion/spheronization. Pores at the surface, mainly 1-15 μm in diameter, determined both the mechanism and the rate of theophylline release, and a strong negative correlation between the fractal geometry and the b parameter of the Weibull function was found for pellets containing >60% carbopol. Theophylline mean dissolution time from these pellets was about two to four times greater. Textural analysis of SEM micrographs and fractal analysis of mercury intrusion data are complementary techniques that enable complete characterization of multiparticulate drug dosage forms

  18. A statistical model for horizontal mass flux of erodible soil

    International Nuclear Information System (INIS)

    Babiker, A.G.A.G.; Eltayeb, I.A.; Hassan, M.H.A.

    1986-11-01

    It is shown that the mass flux of erodible soil transported horizontally by a statistically distributed wind flow itself follows a statistical distribution. An explicit expression for the probability density function (p.d.f.) of the flux is derived for the case in which the wind speed has a Weibull distribution. The statistical distribution for a mass flux characterized by a generalized Bagnold formula is found to be Weibull for the case of zero threshold speed. Analytic and numerical values for the average horizontal mass flux of soil are obtained for various values of the wind parameters, by evaluating the first moment of the flux density function. (author)
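
    An illustrative sketch of the final step (constants and Weibull parameters are assumptions, not the paper's values): the average horizontal mass flux as the first moment of the flux distribution when the flux follows a Bagnold-type law q = C·u³ with zero threshold speed and the wind speed u is Weibull distributed.

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad
from scipy.special import gamma

C = 1.5e-4                       # assumed Bagnold-type constant
k, c = 2.0, 6.0                  # assumed Weibull shape and scale of wind speed (m/s)
wind = stats.weibull_min(k, scale=c)

# E[q] = C * E[u^3]; for a Weibull, E[u^3] = c^3 * Gamma(1 + 3/k).
analytic = C * c**3 * gamma(1.0 + 3.0 / k)
numeric, _ = quad(lambda u: C * u**3 * wind.pdf(u), 0.0, np.inf)
print(f"average mass flux: analytic {analytic:.4e}, numeric {numeric:.4e}")
```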

  19. Quantification of the Water-Energy Nexus in Beijing City Based on Copula Analysis

    Science.gov (United States)

    Cai, J.; Cai, Y.

    2017-12-01

    Water resources and energy resources are intimately interwoven, a relationship called the "water-energy nexus", which poses challenges for the sustainable management of both. In this research, the Copula analysis method is proposed, for the first time, for application in the "water-energy nexus" field to clarify the internal relationship between water resources and energy resources; it is a favorable tool for exploring the dependence among random variables. Beijing City, the capital of China, is chosen as a case study. The marginal distribution functions of water resources and energy resources are analyzed first. Then the bivariate Copula function is employed to construct the joint distribution function of the "water-energy nexus" to quantify the inherent relationship between water resources and energy resources. The results show that it is more appropriate to apply the Lognormal distribution to establish the marginal distribution function of water resources, while the Weibull distribution is more feasible for describing the marginal distribution function of energy resources. Furthermore, the bivariate Normal Copula function is the most suitable for constructing the joint distribution function of the "water-energy nexus" in Beijing City. The findings help to identify and quantify the "water-energy nexus" and can provide reasonable policy recommendations on the sustainable management of water resources and energy resources to promote regional coordinated development.
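
    A hedged sketch of the construction described above, with illustrative parameters rather than the Beijing estimates: a lognormal marginal for water use, a Weibull marginal for energy use, and a bivariate Normal (Gaussian) copula joining them, from which joint probabilities and samples can be obtained.

```python
import numpy as np
from scipy import stats

rho = 0.6                                        # assumed copula correlation
water = stats.lognorm(s=0.4, scale=30.0)         # assumed marginal for water use
energy = stats.weibull_min(3.0, scale=65.0)      # assumed marginal for energy use

# Sample from the joint distribution via the Gaussian copula.
rng = np.random.default_rng(2)
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=5000)
u = stats.norm.cdf(z)                            # copula sample on the unit square
samples = np.column_stack([water.ppf(u[:, 0]), energy.ppf(u[:, 1])])

# Joint CDF at a point (x, y) under the Gaussian copula.
x, y = 35.0, 70.0
biv_norm = stats.multivariate_normal(mean=[0, 0], cov=[[1, rho], [rho, 1]])
joint_cdf = biv_norm.cdf([stats.norm.ppf(water.cdf(x)), stats.norm.ppf(energy.cdf(y))])
print(f"P(water <= {x}, energy <= {y}) = {joint_cdf:.3f}")
```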

  20. Distributed Algorithms for Time Optimal Reachability Analysis

    DEFF Research Database (Denmark)

    Zhang, Zhengkui; Nielsen, Brian; Larsen, Kim Guldstrand

    2016-01-01

    Time optimal reachability analysis is a novel model-based technique for solving scheduling and planning problems. After modeling them as reachability problems using timed automata, a real-time model checker can compute the fastest trace to the goal states, which constitutes a time optimal schedule. We propose distributed computing to accelerate time optimal reachability analysis. We develop five distributed state exploration algorithms and implement them in UPPAAL, enabling it to exploit the compute resources of a dedicated model-checking cluster. We experimentally evaluate the implemented algorithms with four models in terms of their ability to compute near- or proven-optimal solutions, their scalability, time and memory consumption and communication overhead. Our results show that distributed algorithms work much faster than sequential algorithms and have good speedup in general.

  1. Modelling and analysis of distributed simulation protocols with distributed graph transformation

    OpenAIRE

    Lara, Juan de; Taentzer, Gabriele

    2005-01-01

    J. de Lara and G. Taentzer, "Modelling and analysis of distributed simulation protocols with distributed graph transformation...

  2. Effect of particle size and distribution of the sizing agent on the carbon fibers surface and interfacial shear strength (IFSS) of its composites

    International Nuclear Information System (INIS)

    Zhang, R.L.; Liu, Y.; Huang, Y.D.; Liu, L.

    2013-01-01

    The effect of particle size and distribution of the sizing agent on the performance of carbon fibers and carbon fiber composites has been investigated. Atomic force microscopy (AFM) and scanning electron microscopy (SEM) were used to characterize carbon fiber surface topographies. At the same time, the single fiber strength and its Weibull distribution were also studied in order to investigate the effect of coatings on the fibers. The interfacial shear strength and hygrothermal aging of the carbon fiber/epoxy resin composites were also measured. The results indicated that the particle size and distribution are important for improving the surface of carbon fibers and the performance of its composites. Different particle sizes and distributions of the sizing agent make different contributions to the wetting performance of carbon fibers. The fibers sized with P-2 had a higher value of IFSS and better hygrothermal aging resistance.

  3. Study on constant-step stress accelerated life tests in white organic light-emitting diodes.

    Science.gov (United States)

    Zhang, J P; Liu, C; Chen, X; Cheng, G L; Zhou, A X

    2014-11-01

    In order to obtain reliability information for a white organic light-emitting diode (OLED), two constant-stress tests and one step-stress test were conducted with increased working current. The Weibull function was applied to describe the OLED life distribution, and maximum likelihood estimation (MLE) and its iterative flow chart were used to calculate the shape and scale parameters. Furthermore, the accelerated life equation was determined using the least squares method, a Kolmogorov-Smirnov test was performed to assess whether the white OLED life follows a Weibull distribution, and self-developed software was used to predict the average and the median lifetimes of the OLED. The numerical results indicate that white OLED life conforms to a Weibull distribution, and that the accelerated life equation completely satisfies the inverse power law. The estimated life of a white OLED may provide significant guidelines for its manufacturers and customers. Copyright © 2014 John Wiley & Sons, Ltd.
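
    A hedged sketch of the two estimation steps named above, using synthetic failure times rather than the OLED test data: Weibull maximum-likelihood fits at two constant current stresses, then a least-squares fit of the inverse power law relating the Weibull characteristic life to the stress current.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
currents = np.array([40.0, 60.0])                    # assumed stress currents (mA)
true_scales = 5000.0 * (currents / 20.0) ** -2.0     # synthetic ground truth (hours)

etas = []
for current, scale in zip(currents, true_scales):
    lifetimes = scale * rng.weibull(2.5, size=30)    # synthetic constant-stress test data
    shape, _, eta = stats.weibull_min.fit(lifetimes, floc=0)   # MLE, location fixed at 0
    etas.append(eta)
    print(f"I = {current:4.0f} mA:  shape = {shape:.2f}, eta = {eta:7.0f} h")

# Inverse power law: eta = A * I^(-n), i.e. log(eta) is linear in log(I).
slope, intercept = np.polyfit(np.log(currents), np.log(etas), 1)
print(f"inverse power law exponent n ~ {-slope:.2f}")
```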

  4. On dielectric breakdown statistics

    International Nuclear Information System (INIS)

    Tuncer, Enis; James, D Randy; Sauers, Isidor; Ellis, Alvin R; Pace, Marshall O

    2006-01-01

    In this paper, we investigate the dielectric breakdown data of some insulating materials and focus on the applicability of the two- and three-parameter Weibull distributions. A new distribution function is also proposed. In order to assess the model distribution's trustworthiness, we employ the Monte Carlo technique and, randomly selecting data-subsets from the whole dielectric breakdown data, determine whether the selected probability functions accurately describe the breakdown data. The utility and strength of the proposed expression are illustrated distinctly by the numerical procedure. The proposed expression is shown to be a valuable alternative to the Weibull ones

  5. Monitoring and quantifying future climate projections of dryness and wetness extremes: SPI bias

    Directory of Open Access Journals (Sweden)

    F. Sienz

    2012-07-01

    The adequacy of the gamma distribution (GD) for monthly precipitation totals is reconsidered. The motivation for this study is the observation that the GD fails to represent precipitation in considerable areas of global observed and simulated data. This misrepresentation may lead to erroneous estimates of the Standardised Precipitation Index (SPI), evaluations of models, and assessments of climate change. In this study, the GD is compared to the Weibull (WD), Burr Type III (BD), exponentiated Weibull (EWD) and generalised gamma (GGD) distributions. These distributions extend the GD in terms of possible shapes (skewness and kurtosis) and the behaviour for large arguments. The comparison is based on the Akaike information criterion, which maximises information entropy and reveals a trade-off between deviation and the number of parameters used. We use monthly sums of observed and simulated precipitation for the 12 calendar months of the year. Assessing observed and simulated data, (i) the Weibull type distributions give distinctly improved fits compared to the GD and (ii) the SPI resulting from the GD overestimates (underestimates) extreme dryness (wetness).

  6. Transient stability analysis of a distribution network with distributed generators

    NARCIS (Netherlands)

    Xyngi, I.; Ishchenko, A.; Popov, M.; Sluis, van der L.

    2009-01-01

    This letter describes the transient stability analysis of a 10-kV distribution network with wind generators, microturbines, and CHP plants. The network being modeled in Matlab/Simulink takes into account detailed dynamic models of the generators. Fault simulations at various locations are

  7. Response Time Analysis of Distributed Web Systems Using QPNs

    Directory of Open Access Journals (Sweden)

    Tomasz Rak

    2015-01-01

    A performance model is used for studying distributed Web systems. Performance evaluation is done by obtaining load test measurements. Queueing Petri Nets formalism supports modeling and performance analysis of distributed World Wide Web environments. The proposed distributed Web systems modeling and design methodology has been applied in the evaluation of several system architectures under different external loads. Furthermore, performance analysis is done to determine the system response time.

  8. Wind climate modeling using Weibull and extreme value distribution ...

    African Journals Online (AJOL)

    It is very important to fit wind speed data to a suitable statistical model for two reasons. One is fatigue failure due to periodic vortex shedding, and the other is to estimate the wind energy potential of a particular location. For fatigue failure due to periodic vortex shedding, it is important to analyse the load cycle.

  9. Distribution of lod scores in oligogenic linkage analysis.

    Science.gov (United States)

    Williams, J T; North, K E; Martin, L J; Comuzzie, A G; Göring, H H; Blangero, J

    2001-01-01

    In variance component oligogenic linkage analysis it can happen that the residual additive genetic variance bounds to zero when estimating the effect of the ith quantitative trait locus. Using quantitative trait Q1 from the Genetic Analysis Workshop 12 simulated general population data, we compare the observed lod scores from oligogenic linkage analysis with the empirical lod score distribution under a null model of no linkage. We find that zero residual additive genetic variance in the null model alters the usual distribution of the likelihood-ratio statistic.

  10. Estimation of the lifetime distribution of mechatronic systems in the presence of a covariate: A comparison among parametric, semiparametric and nonparametric models

    International Nuclear Information System (INIS)

    Bobrowski, Sebastian; Chen, Hong; Döring, Maik; Jensen, Uwe; Schinköthe, Wolfgang

    2015-01-01

    In practice manufacturers may have lots of failure data of similar products using the same technology basis under different operating conditions. Thus, one can try to derive predictions for the distribution of the lifetime of newly developed components or new application environments through the existing data using regression models based on covariates. Three categories of such regression models are considered: a parametric, a semiparametric and a nonparametric approach. First, we assume that the lifetime is Weibull distributed, where its parameters are modelled as linear functions of the covariate. Second, the Cox proportional hazards model, well-known in Survival Analysis, is applied. Finally, a kernel estimator is used to interpolate between empirical distribution functions. In particular the last case is new in the context of reliability analysis. We propose a goodness of fit measure (GoF), which can be applied to all three types of regression models. Using this GoF measure we discuss a new model selection procedure. To illustrate this method of reliability prediction, the three classes of regression models are applied to real test data of motor experiments. Further the performance of the approaches is investigated by Monte Carlo simulations. - Highlights: • We estimate the lifetime distribution in the presence of a covariate. • Three types of regression models are considered and compared. • A new nonparametric estimator based on our particular data structure is introduced. • We propose a goodness of fit measure and show a new model selection procedure. • A case study with real data and Monte Carlo simulations are performed
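
    A minimal sketch of the parametric approach named above, on synthetic data: a Weibull lifetime model whose scale parameter is a linear function of a covariate, fitted by maximum likelihood. In the paper both parameters depend on the covariate; here only the scale does and the shape is held constant, purely to keep the example short.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

rng = np.random.default_rng(4)
x = rng.uniform(1.0, 5.0, size=200)                    # covariate, e.g. load level
t = weibull_min.rvs(2.0, scale=100.0 + 40.0 * x, random_state=rng)   # lifetimes

def neg_loglik(theta):
    shape, a0, a1 = theta
    scale = a0 + a1 * x                                # scale linear in the covariate
    if shape <= 0 or np.any(scale <= 0):
        return np.inf
    return -np.sum(weibull_min.logpdf(t, shape, scale=scale))

fit = minimize(neg_loglik, x0=[1.0, 50.0, 10.0], method="Nelder-Mead")
print("shape, a0, a1 =", np.round(fit.x, 2))
```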

  11. Analysis of meteorological droughts and dry spells in semiarid regions: a comparative analysis of probability distribution functions in the Segura Basin (SE Spain)

    Science.gov (United States)

    Pérez-Sánchez, Julio; Senent-Aparicio, Javier

    2017-08-01

    Dry spells are an essential concept of drought climatology that clearly defines the semiarid Mediterranean environment and whose consequences are a defining feature for an ecosystem so vulnerable with regard to water. The present study was conducted to characterize rainfall drought in the Segura River basin located in eastern Spain, marked by the seasonal nature of these latitudes. A daily precipitation set has been utilized for 29 weather stations during a period of 20 years (1993-2013). Furthermore, four sets of dry spell length (complete series, monthly maximum, seasonal maximum, and annual maximum) are used and simulated for all the weather stations with the following probability distribution functions: Burr, Dagum, error, generalized extreme value, generalized logistic, generalized Pareto, Gumbel Max, inverse Gaussian, Johnson SB, Log-Logistic, Log-Pearson 3, Triangular, Weibull, and Wakeby. Only the series of annual maximum spells offers a good adjustment for all the weather stations, with Wakeby emerging as the best result, with a mean p value of 0.9424 for the Kolmogorov-Smirnov test (0.2 significance level). Maps of the probability of dry spell duration for return periods of 2, 5, 10, and 25 years reveal the northeast-southeast gradient, with increasing periods with annual rainfall of less than 0.1 mm in the eastern third of the basin, in the proximity of the Mediterranean slope.

  12. Temporal distribution of earthquakes using renewal process in the Dasht-e-Bayaz region

    Science.gov (United States)

    Mousavi, Mehdi; Salehi, Masoud

    2018-01-01

    Temporal distribution of earthquakes with M w > 6 in the Dasht-e-Bayaz region, eastern Iran has been investigated using time-dependent models. Based on these types of models, it is assumed that the times between consecutive large earthquakes follow a certain statistical distribution. For this purpose, four time-dependent inter-event distributions including the Weibull, Gamma, Lognormal, and the Brownian Passage Time (BPT) are used in this study and the associated parameters are estimated using the method of maximum likelihood estimation. The suitable distribution is selected based on logarithm likelihood function and Bayesian Information Criterion. The probability of the occurrence of the next large earthquake during a specified interval of time was calculated for each model. Then, the concept of conditional probability has been applied to forecast the next major ( M w > 6) earthquake in the site of our interest. The emphasis is on statistical methods which attempt to quantify the probability of an earthquake occurring within a specified time, space, and magnitude windows. According to obtained results, the probability of occurrence of an earthquake with M w > 6 in the near future is significantly high.
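
    A hedged sketch of the conditional-probability step described above (the Weibull parameters are placeholders, not the Dasht-e-Bayaz estimates): the probability that the next large event occurs within a forecast window, given the time already elapsed since the last one.

```python
from scipy.stats import weibull_min

shape, scale = 1.4, 35.0          # assumed Weibull inter-event parameters (years)
elapsed = 20.0                    # years since the last M_w > 6 earthquake
window = 10.0                     # forecast window (years)

F = weibull_min(shape, scale=scale).cdf
p_cond = (F(elapsed + window) - F(elapsed)) / (1.0 - F(elapsed))
print(f"P(event within next {window:.0f} yr | {elapsed:.0f} yr elapsed) = {p_cond:.2f}")
```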

  13. Spatial Distribution Characteristics of Healthcare Facilities in Nanjing: Network Point Pattern Analysis and Correlation Analysis

    Directory of Open Access Journals (Sweden)

    Jianhua Ni

    2016-08-01

    The spatial distribution of urban service facilities is largely constrained by the road network. In this study, network point pattern analysis and correlation analysis were used to analyze the relationship between road network and healthcare facility distribution. The weighted network kernel density estimation method proposed in this study identifies significant differences between the outside and inside areas of the Ming city wall. The results of network K-function analysis show that private hospitals are more evenly distributed than public hospitals, and pharmacy stores tend to cluster around hospitals along the road network. After computing the correlation analysis between different categorized hospitals and street centrality, we find that the distribution of these hospitals correlates highly with the street centralities, and that the correlations are higher with private and small hospitals than with public and large hospitals. The comprehensive analysis results could help examine the reasonability of existing urban healthcare facility distribution and optimize the location of new healthcare facilities.

  14. Wind energy potential on Malaysian Resort Islands: a case study of Tioman, Redang and Perhentian Island

    International Nuclear Information System (INIS)

    Kamaruzzaman Sopian

    2000-01-01

    Wind data collected at three east coast islands of Peninsular Malaysia, namely Tioman, Redang and Perhentian Island, were analyzed for wind energy potential. The results were presented as Weibull distributions, and preliminary analysis indicates that the site at Redang Island has the greatest potential, with a mean power density of 85.1 W/m2 at 10 meters above sea level. (Author)

  15. Analysis of survival in breast cancer patients by using different parametric models

    Science.gov (United States)

    Enera Amran, Syahila; Asrul Afendi Abdullah, M.; Kek, Sie Long; Afiqah Muhamad Jamil, Siti

    2017-09-01

    In biomedical applications or clinical trials, right censoring often arises when studying time-to-event data: some individuals are still alive at the end of the study or are lost to follow-up at a certain time. It is important to handle censored data properly in order to prevent bias in the analysis. Therefore, this study was carried out to analyze right-censored data with three different parametric models: the exponential model, the Weibull model and the log-logistic model. Data on breast cancer patients from Hospital Sultan Ismail, Johor Bahru, from 30 December 2008 until 15 February 2017 were used in this study to illustrate right censoring. The covariates included in this study are the patients' survival time t, the age of each patient X1 and the treatment given to the patient X2. In order to determine the best parametric model for analysing the survival of breast cancer patients, the performance of each model was compared based on the Akaike Information Criterion (AIC), Bayesian Information Criterion (BIC) and log-likelihood value using the statistical software R. When analysing the breast cancer data, all three distributions showed consistency with the data, with the line graph of the cumulative hazard function resembling a straight line going through the origin. As a result, the log-logistic model was the best-fitting parametric model compared with the exponential and Weibull models, since it has the smallest AIC and BIC values and the largest log-likelihood.

  16. Bayesian analysis of longitudinal Johne's disease diagnostic data without a gold standard test

    DEFF Research Database (Denmark)

    Wang, C.; Turnbull, B.W.; Nielsen, Søren Saxmose

    2011-01-01

    A change-point process with a Weibull survival hazard function was used to model the progression of the hidden disease status. The model adjusted for the fixed effects of covariate variables and random effects of subject on the diagnostic testing procedure. Markov chain Monte Carlo methods were used to compute the posterior estimates of the model parameters that provide the basis for inference concerning the accuracy of the diagnostic procedure. Based on the Bayesian approach, the posterior probability distribution of the change-point onset time can be obtained and used as a criterion for infection diagnosis. An application is presented to an analysis of ELISA and fecal culture test outcomes in the diagnostic testing of paratuberculosis (Johne's disease) for a Danish longitudinal study from January 2000 to March 2003. The posterior probability criterion based on the Bayesian model with 4 repeated observations has...

  17. Study for increasing micro-drill reliability by vibrating drilling

    International Nuclear Information System (INIS)

    Yang Zhaojun; Li Wei; Chen Yanhong; Wang Lijiang

    1998-01-01

    A study for increasing micro-drill reliability by vibrating drilling is described. Under the experimental conditions of this study it is observed, from reliability testing and the fitting of a life-distribution function, that the lives of micro-drills under ordinary drilling follow the log-normal distribution and the lives of micro-drills under vibrating drilling follow the Weibull distribution. Calculations for reliability analysis show that vibrating drilling can increase the lives of micro-drills and correspondingly reduce the scatter of drill lives. Therefore, vibrating drilling increases the reliability of micro-drills

  18. About normal distribution on SO(3) group in texture analysis

    Science.gov (United States)

    Savyolova, T. I.; Filatov, S. V.

    2017-12-01

    This article studies and compares different normal distributions (NDs) on SO(3) group, which are used in texture analysis. Those NDs are: Fisher normal distribution (FND), Bunge normal distribution (BND), central normal distribution (CND) and wrapped normal distribution (WND). All of the previously mentioned NDs are central functions on SO(3) group. CND is a subcase for normal CLT-motivated distributions on SO(3) (CLT here is Parthasarathy’s central limit theorem). WND is motivated by CLT in R 3 and mapped to SO(3) group. A Monte Carlo method for modeling normally distributed values was studied for both CND and WND. All of the NDs mentioned above are used for modeling different components of crystallites orientation distribution function in texture analysis.

  19. Nonparametric Fine Tuning of Mixtures: Application to Non-Life Insurance Claims Distribution Estimation

    Science.gov (United States)

    Sardet, Laure; Patilea, Valentin

    When pricing a specific insurance premium, the actuary needs to evaluate the claims cost distribution for the warranty. Traditional actuarial methods use parametric specifications to model the claims distribution, such as the lognormal, Weibull and Pareto laws. Mixtures of such distributions allow the flexibility of the parametric approach to be improved and seem to be quite well adapted to capture the skewness, the long tails as well as the unobserved heterogeneity among the claims. In this paper, instead of looking for a finely tuned mixture with many components, we choose a parsimonious mixture modeling, typically a two- or three-component mixture. Next, we use the mixture cumulative distribution function (CDF) to transform data into the unit interval, where we apply a beta-kernel smoothing procedure. A bandwidth rule adapted to our methodology is proposed. Finally, the beta-kernel density estimate is back-transformed to recover an estimate of the original claims density. The beta-kernel smoothing provides an automatic fine-tuning of the parsimonious mixture and thus avoids inference in more complex mixture models with many parameters. We investigate the empirical performance of the new method in the estimation of the quantiles with simulated nonnegative data and the quantiles of the individual claims distribution in a non-life insurance application.

  20. Minimum K-S estimator using PH-transform technique

    Directory of Open Access Journals (Sweden)

    Somchit Boonthiem

    2016-07-01

    In this paper, we propose an improvement of the minimum Kolmogorov-Smirnov (K-S) estimator using the proportional hazards transform (PH-transform) technique. The experimental data are 47 fire accident claims from an insurance company in Thailand. The experiment has two steps: in the first, we minimize the K-S statistic using a grid search technique for nine distributions (Rayleigh, gamma, Pareto, log-logistic, logistic, normal, Weibull, lognormal, and exponential); in the second, we improve the K-S statistic using the PH-transform. The results show that the PH-transform technique can improve the minimum K-S estimator. The algorithms give a better minimum K-S estimator for seven distributions (Rayleigh, gamma, Pareto, log-logistic, Weibull, lognormal, and exponential), while the minimum K-S estimators of the normal and logistic distributions are unchanged.
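
    A minimal sketch of the first step, the grid-search minimum K-S estimator, using a synthetic claim sample and only two of the nine candidate families; the PH-transform refinement is not shown.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
claims = stats.lognorm(s=1.0, scale=50.0).rvs(size=47, random_state=rng)  # stand-in data

best = None
for dist_name, grid in [
    ("lognorm", [(s, 0.0, sc) for s in np.linspace(0.5, 2.0, 16)
                               for sc in np.linspace(20.0, 120.0, 21)]),
    ("weibull_min", [(c, 0.0, sc) for c in np.linspace(0.5, 2.0, 16)
                                   for sc in np.linspace(20.0, 200.0, 19)]),
]:
    for params in grid:
        ks = stats.kstest(claims, dist_name, args=params).statistic
        if best is None or ks < best[2]:
            best = (dist_name, params, ks)

name, params, ks = best
print(f"best fit: {name}, params {tuple(round(p, 2) for p in params)}, KS = {ks:.4f}")
```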

  1. The time interval distribution of sand–dust storms in theory: testing with observational data for Yanchi, China

    International Nuclear Information System (INIS)

    Liu, Guoliang; Zhang, Feng; Hao, Lizhen

    2012-01-01

    We previously introduced a time record model for use in studying the duration of sand–dust storms. In the model, X is the normalized wind speed and Xr is the normalized wind speed threshold for the sand–dust storm. X is represented by a random signal with a normal Gaussian distribution. The storms occur when X ≥ Xr. From this model, the time interval distribution of N = Aexp(−bt) can be deduced, wherein N is the number of time intervals with length greater than t, A and b are constants, and b is related to Xr. In this study, sand–dust storm data recorded in spring at the Yanchi meteorological station in China were analysed to verify whether the time interval distribution of the sand–dust storms agrees with the above time interval distribution. We found that the distribution of the time interval between successive sand–dust storms in April agrees well with the above exponential equation. However, the interval distribution for the sand–dust storm data for the entire spring period displayed a better fit to the Weibull equation and depended on the variation of the sand–dust storm threshold wind speed. (paper)

  2. Weibull modulus of hardness, bend strength, and tensile strength of Ni−Ta−Co−X metallic glass ribbons

    Energy Technology Data Exchange (ETDEWEB)

    Neilson, Henry J., E-mail: hjn2@case.edu [Case Western Reserve University, 10900 Euclid Ave, Cleveland, OH (United States); Petersen, Alex S.; Cheung, Andrew M.; Poon, S. Joseph; Shiflet, Gary J. [University of Virginia, 395 McCormick Road, P.O. Box 400745, Charlottesville, VA 22904 (United States); Widom, Mike [Carnegie Mellon University, 5000 Forbes Avenue, Wean Hall 3325, Pittsburgh, PA 15213 (United States); Lewandowski, John J. [Case Western Reserve University, 10900 Euclid Ave, Cleveland, OH (United States)

    2015-05-14

    In this study, the variations in mechanical properties of Ni−Co−Ta-based metallic glasses have been analyzed. Three different chemistries of metallic glass ribbons were analyzed: Ni45Ta35Co20, Ni40Ta35Co20Nb5, and Ni30Ta35Co30Nb5. These alloys possess very high density (approximately 12.5 g/cm^3) and very high strength (e.g. >3 GPa). Differential scanning calorimetry (DSC) and x-ray diffraction (XRD) were used to characterize the amorphicity of the ribbons. Mechanical properties were measured via a combination of Vickers hardness, bending strength, and tensile strength for each chemistry. At least 50 tests were conducted for each chemistry and each test technique in order to quantify the variability of properties using both 2- and 3-parameter Weibull statistics. The variability in properties and their source(s) were compared to that of other engineering materials, while the nature of deformation via shear bands as well as fracture surface features have been determined using scanning electron microscopy (SEM). Toughness, the role of defects, and volume effects are also discussed.

  3. A joint probability density function of wind speed and direction for wind energy analysis

    International Nuclear Information System (INIS)

    Carta, Jose A.; Ramirez, Penelope; Bueno, Celia

    2008-01-01

    A very flexible joint probability density function of wind speed and direction is presented in this paper for use in wind energy analysis. A method that enables angular-linear distributions to be obtained with specified marginal distributions has been used for this purpose. For the marginal distribution of wind speed we use a Normal-Weibull mixture distribution singly truncated from below. The marginal distribution of wind direction comprises a finite mixture of von Mises distributions. The proposed model is applied in this paper to wind direction and wind speed hourly data recorded at several weather stations located in the Canary Islands (Spain). The suitability of the distributions is judged from the coefficient of determination R2. The conclusions reached are that the joint distribution proposed in this paper: (a) can represent unimodal, bimodal and bitangential wind speed frequency distributions, (b) takes into account the frequency of null winds, (c) represents the wind direction regimes in zones with several modes or prevailing wind directions, and (d) takes into account the correlation between wind speeds and their directions. It can therefore be used in several tasks involved in the evaluation process of the wind resources available at a potential site. We also conclude that, in the case of the Canary Islands, the proposed model provides better fits in all the cases analysed than those obtained with the models used in the specialised literature on wind energy.

  4. Climate Informed Low Flow Frequency Analysis Using Nonstationary Modeling

    Science.gov (United States)

    Liu, D.; Guo, S.; Lian, Y.

    2014-12-01

    Stationarity is often assumed for frequency analysis of low flows in water resources management and planning. However, many studies have shown that flow characteristics, particularly the frequency spectrum of extreme hydrologic events, were modified by climate change and human activities, and the conventional frequency analysis without considering the non-stationary characteristics may lead to costly designs. The analysis presented in this paper was based on more than 100 years of daily flow data from the Yichang gaging station 44 kilometers downstream of the Three Gorges Dam. The Mann-Kendall trend test under the scaling hypothesis showed that the annual low flows had a significant monotonic trend, whereas an abrupt change point was identified in 1936 by the Pettitt test. The climate-informed low flow frequency analysis and the divided and combined method are employed to account for the impacts of related climate variables and the nonstationarities in annual low flows. Without prior knowledge of the probability density function for the gaging station, six distribution functions including the Generalized Extreme Value (GEV), Pearson Type III, Gumbel, Gamma, Lognormal, and Weibull distributions have been tested to find the best fit, in which the local likelihood method is used to estimate the parameters. Analyses show that the GEV had the best fit for the observed low flows. This study has also shown that the climate-informed low flow frequency analysis is able to exploit the link between climate indices and low flows, which accounts for the dynamic features in reservoir management and provides more accurate and reliable designs for infrastructure and water supply.

  5. Four theorems on the psychometric function.

    Science.gov (United States)

    May, Keith A; Solomon, Joshua A

    2013-01-01

    In a 2-alternative forced-choice (2AFC) discrimination task, observers choose which of two stimuli has the higher value. The psychometric function for this task gives the probability of a correct response for a given stimulus difference, Δx. This paper proves four theorems about the psychometric function. Assuming the observer applies a transducer and adds noise, Theorem 1 derives a convenient general expression for the psychometric function. Discrimination data are often fitted with a Weibull function. Theorem 2 proves that the Weibull "slope" parameter, β, can be approximated by β(Noise) x β(Transducer), where β(Noise) is the β of the Weibull function that fits best to the cumulative noise distribution, and β(Transducer) depends on the transducer. We derive general expressions for β(Noise) and β(Transducer), from which we derive expressions for specific cases. One case that follows naturally from our general analysis is Pelli's finding that, when d' ∝ (Δx)(b), β ≈ β(Noise) x b. We also consider two limiting cases. Theorem 3 proves that, as sensitivity improves, 2AFC performance will usually approach that for a linear transducer, whatever the actual transducer; we show that this does not apply at signal levels where the transducer gradient is zero, which explains why it does not apply to contrast detection. Theorem 4 proves that, when the exponent of a power-function transducer approaches zero, 2AFC performance approaches that of a logarithmic transducer. We show that the power-function exponents of 0.4-0.5 fitted to suprathreshold contrast discrimination data are close enough to zero for the fitted psychometric function to be practically indistinguishable from that of a log transducer. Finally, Weibull β reflects the shape of the noise distribution, and we used our results to assess the recent claim that internal noise has higher kurtosis than a Gaussian. Our analysis of β for contrast discrimination suggests that, if internal noise is stimulus

  6. THEORY OF RELIABILITY APPLIED IN THE EVALUATION OF THE LIFE IN CONTACT FATIGUE

    Directory of Open Access Journals (Sweden)

    Nelson Vanegas M

    2009-12-01

    Reliability is a fundamental tool in the development of projects, since it allows the improvement of performance through the reduction of the failure probability of the products. In this study, reliability theory was applied in the criteria for design and selection of cast irons with high-hardness matrices used in different applications. Based on these concepts and on information generated in laboratory tests in a rolling contact fatigue testing machine, reliability was evaluated using nonparametric and parametric (Weibull distribution) methods, in order to characterize the reliability, the probability of failure, and the failure rate of the two materials studied (gray cast iron and ductile cast iron). When comparing both methods, similar results were obtained, but the parametric approach (Weibull distribution) better represented the phenomenon under study.

  7. Four theorems on the psychometric function.

    Directory of Open Access Journals (Sweden)

    Keith A May

    In a 2-alternative forced-choice (2AFC) discrimination task, observers choose which of two stimuli has the higher value. The psychometric function for this task gives the probability of a correct response for a given stimulus difference, Δx. This paper proves four theorems about the psychometric function. Assuming the observer applies a transducer and adds noise, Theorem 1 derives a convenient general expression for the psychometric function. Discrimination data are often fitted with a Weibull function. Theorem 2 proves that the Weibull "slope" parameter, β, can be approximated by β(Noise) x β(Transducer), where β(Noise) is the β of the Weibull function that fits best to the cumulative noise distribution, and β(Transducer) depends on the transducer. We derive general expressions for β(Noise) and β(Transducer), from which we derive expressions for specific cases. One case that follows naturally from our general analysis is Pelli's finding that, when d' ∝ (Δx)^b, β ≈ β(Noise) x b. We also consider two limiting cases. Theorem 3 proves that, as sensitivity improves, 2AFC performance will usually approach that for a linear transducer, whatever the actual transducer; we show that this does not apply at signal levels where the transducer gradient is zero, which explains why it does not apply to contrast detection. Theorem 4 proves that, when the exponent of a power-function transducer approaches zero, 2AFC performance approaches that of a logarithmic transducer. We show that the power-function exponents of 0.4-0.5 fitted to suprathreshold contrast discrimination data are close enough to zero for the fitted psychometric function to be practically indistinguishable from that of a log transducer. Finally, Weibull β reflects the shape of the noise distribution, and we used our results to assess the recent claim that internal noise has higher kurtosis than a Gaussian. Our analysis of β for contrast discrimination suggests that, if internal noise is …

  8. Distributed Data Analysis in ATLAS

    CERN Document Server

    Nilsson, P; The ATLAS collaboration

    2012-01-01

    Data analysis using grid resources is one of the fundamental challenges to be addressed before the start of LHC data taking. The ATLAS detector will produce petabytes of data per year, and roughly one thousand users will need to run physics analyses on this data. Appropriate user interfaces and helper applications have been made available to ensure that the grid resources can be used without requiring expertise in grid technology. These tools enlarge the number of grid users from a few production administrators to potentially all participating physicists. ATLAS makes use of three grid infrastructures for the distributed analysis: the EGEE sites, the Open Science Grid, and NorduGrid. These grids are managed by the gLite workload management system, the PanDA workload management system, and ARC middleware; many sites can be accessed via both the gLite WMS and PanDA. Users can choose between two front-end tools to access the distributed resources. Ganga is a tool co-developed with LHCb to provide a common interfa...

  9. Normal and Extreme Wind Conditions for Power at Coastal Locations in China.

    Science.gov (United States)

    Gao, Meng; Ning, Jicai; Wu, Xiaoqing

    2015-01-01

    In this paper, the normal and extreme wind conditions for power at 12 coastal locations along China's coastline were investigated. For this purpose, the daily meteorological data measured at the standard 10-m height above ground for periods of 40-62 years are statistically analyzed. The East Asian Monsoon that affects almost China's entire coastal region is considered as the leading factor determining wind energy resources. For most stations, the mean wind speed is higher in winter and lower in summer. Meanwhile, the wind direction analysis indicates that the prevalent winds in summer are southerly, while those in winter are northerly. The air densities at different coastal locations differ significantly, resulting in the difference in wind power density. The Weibull and lognormal distributions are applied to fit the yearly wind speeds. The lognormal distribution performs better than the Weibull distribution at 8 coastal stations according to two judgement criteria, the Kolmogorov-Smirnov test and absolute error (AE). Regarding the annual maximum extreme wind speed, the generalized extreme value (GEV) distribution performs better than the commonly-used Gumbel distribution. At these southeastern coastal locations, strong winds usually occur in typhoon season. These 4 coastal provinces, that is, Guangdong, Fujian, Hainan, and Zhejiang, which have abundant wind resources, are also prone to typhoon disasters.

  10. Increase in the accuracy of approximating the profile of the erosion zone in planar magnetrons

    Science.gov (United States)

    Rogov, A. V.; Kapustin, Yu. V.

    2017-09-01

    It has been shown that the use of the survival function of the Weibull distribution shifted along the ordinate axis allows one to increase the accuracy of the approximation of the normalized profile of an erosion zone in the area from the axis to the maximum sputtering region compared with the previously suggested distribution function of the extremum values. The survival function of the Weibull distribution is used in the area from the maximum to the outer boundary of an erosion zone. The major advantage of using the new approximation is observed for magnetrons with a large central nonsputtered spot and for magnetrons with substantial sputtering in the paraxial zone.

  11. Leveraging comprehensive baseline datasets to quantify property variability in nuclear-grade graphites

    Energy Technology Data Exchange (ETDEWEB)

    Carroll, Mark C., E-mail: mark.carroll@inl.gov [Idaho National Laboratory, PO Box 1625, Idaho Falls, ID 83415-2213 (United States); Windes, William E.; Rohrbaugh, David T. [Idaho National Laboratory, PO Box 1625, Idaho Falls, ID 83415-2213 (United States); Strizak, Joseph P.; Burchell, Timothy D. [Oak Ridge National Laboratory, PO Box 2008, Oak Ridge, TN 37831-6088 (United States)

    2016-10-15

    Highlights: • An effort is underway to fully quantify the properties of nuclear-grade graphites. • Physical and mechanical properties of graphite are best characterized by distributions. • The Weibull distribution is most representative of graphite based on goodness-of-fit. • Fine-grained isomolded grades exhibit higher Weibull modulus values, indicative of more homogeneous properties. - Abstract: The full characterization of the physical and mechanical properties of candidate nuclear-grade graphites is highly dependent upon an understanding of the distribution of values that are inherent to graphite. Not only do the material properties of graphites vary considerably between grades owing to the raw materials sources, filler particle type and size, methods of compaction, and production process parameters, but variability is observed between billets of the same grade from a single batch and even across spatial positions within a single billet. Properly enveloping the expected properties of interest requires both a substantial amount of data to statistically capture this variability and a representative distribution capable of accurately describing the range of values. A two-parameter Weibull distribution is confirmed to be representative of the distribution of physical (density, modulus) and mechanical (compressive, flexure, and tensile strength) values in five different nuclear-grades of graphite. The fine-grained isomolded grades tend toward higher Weibull modulus and characteristic strength values, while the extruded grade being examined exhibits relatively large distributions in property values. With the number of candidate graphite specimens that can undergo full irradiation exposure and subsequent testing having limited feasibility with regard to economics and timely evaluations, a proper capture of the raw material variability in an unirradiated state can provide crucial supplementary resolution to the limited amount of available data on irradiated
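
    A short sketch of the distribution fit named in the highlights, on synthetic strength values rather than the baseline dataset: a two-parameter Weibull fit (location fixed at zero) giving the Weibull modulus and characteristic strength used to compare grades.

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(6)
strength = 25.0 * rng.weibull(12.0, size=60)       # synthetic flexural strengths (MPa)

m, _, sigma0 = weibull_min.fit(strength, floc=0)   # two-parameter fit
print(f"Weibull modulus m = {m:.1f}, characteristic strength = {sigma0:.1f} MPa")
```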

  12. Measurement based scenario analysis of short-range distribution system planning

    DEFF Research Database (Denmark)

    Chen, Peiyuan; Bak-Jensen, Birgitte; Chen, Zhe

    2009-01-01

    This paper focuses on short-range distribution system planning using a probabilistic approach. Empirical probabilistic distributions of load demand and distributed generations are derived from the historical measurement data and incorporated into the system planning. Simulations with various feasible scenarios are performed based on a local distribution system at Støvring in Denmark. Simulation results provide more accurate and insightful information for the decision-maker when using the probabilistic analysis than when using the worst-case analysis, so that better planning can be achieved.

  13. Probabilistic Analysis of Space Shuttle Body Flap Actuator Ball Bearings

    Science.gov (United States)

    Oswald, Fred B.; Jett, Timothy R.; Predmore, Roamer E.; Zaretsky, Erwin V.

    2008-01-01

    A probabilistic analysis, using the 2-parameter Weibull-Johnson method, was performed on experimental life test data from space shuttle actuator bearings. Experiments were performed on a test rig under simulated conditions to determine the life and failure mechanism of the grease lubricated bearings that support the input shaft of the space shuttle body flap actuators. The failure mechanism was wear that can cause loss of bearing preload. These tests established life and reliability data for both shuttle flight and ground operation. Test data were used to estimate the failure rate and reliability as a function of the number of shuttle missions flown. The Weibull analysis of the test data for the four actuators on one shuttle, each with a 2-bearing shaft assembly, established a reliability level of 96.9 percent for a life of 12 missions. A probabilistic system analysis for four shuttles, each of which has four actuators, predicts a single bearing failure in one actuator of one shuttle after 22 missions (a total of 88 missions for a 4-shuttle fleet). This prediction is comparable with actual shuttle flight history in which a single actuator bearing was found to have failed by wear at 20 missions.
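
    The reliability arithmetic behind such a prediction can be sketched as follows; the Weibull shape and characteristic life used here are hypothetical stand-ins, not the values derived from the shuttle test data, and the eight-bearing series assumption simply mirrors four actuators with two bearings each.

```python
import numpy as np

# hypothetical Weibull life parameters for a single actuator bearing,
# with life measured in shuttle missions (illustrative, not the test-derived values)
beta, eta = 2.0, 120.0

def r_single(missions):
    return np.exp(-(missions / eta) ** beta)

def r_shuttle(missions, n_bearings=8):
    # one shuttle: four actuators, each with a 2-bearing shaft assembly, in series
    return r_single(missions) ** n_bearings

for t in (12, 20, 22):
    print(f"missions = {t:2d}: single bearing R = {r_single(t):.4f}, shuttle R = {r_shuttle(t):.4f}")
```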

  14. Automatic Threshold Detector Techniques

    Science.gov (United States)

    1976-07-15

    Cited references include "Averaging CFAR in Non-Stationary Weibull Clutter," L. Novak (1974 IEEE Symposium on Information Theory), and "The Weibull Distribution Applied to the..."

  15. Diameter distribution in a Brazilian tropical dry forest domain: predictions for the stand and species.

    Science.gov (United States)

    Lima, Robson B DE; Bufalino, Lina; Alves, Francisco T; Silva, José A A DA; Ferreira, Rinaldo L C

    2017-01-01

    Currently, there is a lack of studies on the correct utilization of continuous distributions for dry tropical forests. Therefore, this work aims to investigate the diameter structure of a Brazilian tropical dry forest and to select suitable continuous distributions by means of statistical tools for the stand and the main species. Two subsets were randomly selected from 40 plots. Diameter at base height was obtained. The following functions were tested: log-normal; gamma; Weibull 2P and Burr. The best fits were selected by Akaike's information criterion. Overall, the diameter distribution of the dry tropical forest was better described by negative exponential curves and positive skewness. The forest studied showed diameter distributions with decreasing probability for larger trees. This behavior was observed for both the main species and the stand. The generalization of the function fitted for the main species shows that the development of individual models is needed. The Burr function showed good flexibility to describe the diameter structure of the stand and the behavior of the Mimosa ophthalmocentra and Bauhinia cheilantha species. For Poincianella bracteosa, Aspidosperma pyrifolium and Myracrodum urundeuva, better fitting was obtained with the log-normal function.
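
    A hedged sketch of the model-selection workflow described above, fitting several candidate distributions to hypothetical diameter data and ranking them by AIC (scipy's burr12 is used as one of several possible Burr forms; it may differ from the variant used in the paper):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
d = rng.weibull(1.3, 400) * 12.0          # hypothetical diameters (cm), skewed toward small trees

candidates = {
    "log-normal": stats.lognorm,
    "gamma": stats.gamma,
    "Weibull 2P": stats.weibull_min,
    "Burr (XII)": stats.burr12,           # scipy's Burr XII; the paper's Burr variant may differ
}

aic = {}
for name, dist in candidates.items():
    params = dist.fit(d, floc=0)          # location pinned at zero for all candidates
    k_free = len(params) - 1              # number of free parameters (loc is fixed)
    loglik = np.sum(dist.logpdf(d, *params))
    aic[name] = 2 * k_free - 2 * loglik

for name in sorted(aic, key=aic.get):
    print(f"{name:12s} AIC = {aic[name]:9.1f}")
```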

  16. Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data.

    Science.gov (United States)

    Tekwe, Carmen D; Carroll, Raymond J; Dabney, Alan R

    2012-08-01

    Protein abundance in quantitative proteomics is often based on observed spectral features derived from liquid chromatography mass spectrometry (LC-MS) or LC-MS/MS experiments. Peak intensities are largely non-normal in distribution. Furthermore, LC-MS-based proteomics data frequently have large proportions of missing peak intensities due to censoring mechanisms on low-abundance spectral features. Recognizing that the observed peak intensities detected with the LC-MS method are all positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney rank sum tests, and the parametric survival model and accelerated failure time-model with log-normal, log-logistic and Weibull distributions were used to detect any differentially expressed proteins. The statistical operating characteristics of each method are explored using both real and simulated datasets. Survival methods generally have greater statistical power than standard differential expression methods when the proportion of missing protein level data is 5% or more. In particular, the AFT models we consider consistently achieve greater statistical power than standard testing procedures, with the discrepancy widening with increasing missingness in the proportions. The testing procedures discussed in this article can all be performed using readily available software such as R. The R codes are provided as supplemental materials. ctekwe@stat.tamu.edu.
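
    The authors work in R; purely as an illustration of the underlying idea, the following Python sketch maximizes a Weibull likelihood in which intensities below a detection limit contribute through the CDF (left-censoring). All data and the detection limit are simulated.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(7)
true_shape, true_scale, lod = 1.5, 4.0, 1.5          # hypothetical Weibull intensities and detection limit
x = rng.weibull(true_shape, 300) * true_scale
observed = x >= lod                                   # intensities below the limit are left-censored

def neg_loglik(theta):
    k, lam = np.exp(theta)                            # log-parameterisation keeps both positive
    ll = stats.weibull_min.logpdf(x[observed], k, scale=lam).sum()
    ll += (~observed).sum() * stats.weibull_min.logcdf(lod, k, scale=lam)
    return -ll

res = optimize.minimize(neg_loglik, x0=np.log([1.0, 1.0]), method="Nelder-Mead")
k_hat, lam_hat = np.exp(res.x)
print(f"estimated shape = {k_hat:.2f}, scale = {lam_hat:.2f}")
```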

  17. LHCb Distributed Data Analysis on the Computing Grid

    CERN Document Server

    Paterson, S; Parkes, C

    2006-01-01

    LHCb is one of the four Large Hadron Collider (LHC) experiments based at CERN, the European Organisation for Nuclear Research. The LHC experiments will start taking an unprecedented amount of data when they come online in 2007. Since no single institute has the compute resources to handle this data, resources must be pooled to form the Grid. Where the Internet has made it possible to share information stored on computers across the world, Grid computing aims to provide access to computing power and storage capacity on geographically distributed systems. LHCb software applications must work seamlessly on the Grid allowing users to efficiently access distributed compute resources. It is essential to the success of the LHCb experiment that physicists can access data from the detector, stored in many heterogeneous systems, to perform distributed data analysis. This thesis describes the work performed to enable distributed data analysis for the LHCb experiment on the LHC Computing Grid.

  18. An assessment on seasonal analysis of wind energy characteristics and wind turbine characteristics

    International Nuclear Information System (INIS)

    Akpinar, E. Kavak; Akpinar, S.

    2005-01-01

    This paper presents seasonal variations of the wind characteristics and wind turbine characteristics in the regions around Elazig, namely Maden, Agin and Keban. Mean wind speed data, measured in hourly time series format, are statistically analyzed for the six-year period 1998-2003. The probability density distributions are derived from the time series data and their distributional parameters are identified. Two probability density functions are fitted to the measured probability distributions on a seasonal basis. The wind energy characteristics of all the regions are studied based on the Weibull and Rayleigh distributions. Energy production and capacity factors were determined for wind machines of different sizes between 300 and 2300 kW. It was found that Maden is the best of the regions analyzed with respect to both wind characteristics and wind turbine characteristics.
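
    For a Weibull wind speed model with shape k and scale c, the mean wind power density follows from the third moment, P/A = 0.5 * rho * c^3 * Gamma(1 + 3/k); the Rayleigh distribution is the special case k = 2. A small sketch with made-up seasonal parameters (not the Elazig values):

```python
import numpy as np
from scipy.special import gamma as G

rho = 1.225  # air density (kg/m^3)

def weibull_power_density(k, c):
    """Mean wind power density (W/m^2) for Weibull shape k and scale c (m/s)."""
    return 0.5 * rho * c**3 * G(1.0 + 3.0 / k)

# hypothetical seasonal Weibull parameters; k = 2 recovers the Rayleigh case
for season, (k, c) in {"winter": (1.6, 5.2), "summer": (2.1, 6.8)}.items():
    print(f"{season}: {weibull_power_density(k, c):7.1f} W/m^2")
```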

  19. Determining the best population-level alcohol consumption model and its impact on estimates of alcohol-attributable harms

    Directory of Open Access Journals (Sweden)

    Kehoe Tara

    2012-04-01

    Background: The goals of our study are to determine the most appropriate model for alcohol consumption as an exposure for burden of disease, to analyze the effect of the chosen alcohol consumption distribution on the estimation of the alcohol Population-Attributable Fractions (PAFs), and to characterize the chosen alcohol consumption distribution by exploring whether there is a global relationship within the distribution. Methods: To identify the best model, the Log-Normal, Gamma, and Weibull prevalence distributions were examined using data from 41 surveys from Gender, Alcohol and Culture: An International Study (GENACIS) and from the European Comparative Alcohol Study. To assess the effect of these distributions on the estimated alcohol PAFs, we calculated the alcohol PAF for diabetes, breast cancer, and pancreatitis using the three above-named distributions and using the more traditional approach based on categories. The relationship between the mean and the standard deviation from the Gamma distribution was estimated using data from 851 datasets for 66 countries from GENACIS and from the STEPwise approach to Surveillance from the World Health Organization. Results: The Log-Normal distribution provided a poor fit for the survey data, with the Gamma and Weibull distributions providing better fits. Additionally, our analyses showed that there were no marked differences in the alcohol PAF estimates based on the Gamma or Weibull distributions compared to PAFs based on categorical alcohol consumption estimates. The standard deviation of the alcohol distribution was highly dependent on the mean, with a unit increase in mean alcohol consumption associated with an increase in the standard deviation of 1.258 (95% CI: 1.223 to 1.293; R2 = 0.9207) for women and 1.171 (95% CI: 1.144 to 1.197; R2 = 0.9474) for men. Conclusions: Although the Gamma distribution and the Weibull distribution provided similar results, the Gamma distribution is recommended to model alcohol

  20. Preventative maintenance cycle of contact switches for nuclear power plants based on lifetime assessment and economic analysis

    International Nuclear Information System (INIS)

    Shi Jie

    2010-01-01

    An approach to determining the preventive maintenance cycle was proposed in consideration of lifetime, optimal cost and economy. A two-parameter Weibull distribution was used to calculate the lifetime of the contact switch. Block replacement and age replacement models were built with the objective of optimal cost, and the preventive replacement cycle was calculated. Eight proposals for the preventive replacement cycle were given. An economic model was applied to assess those proposals and the optimal proposal was confirmed. (authors)
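
    A minimal sketch of an age-replacement calculation of the kind described above, assuming a hypothetical Weibull lifetime and illustrative cost ratios (not the paper's data): the long-run cost rate C(T) = [c_p R(T) + c_f (1 - R(T))] / integral_0^T R(t) dt is minimized over the replacement age T.

```python
import numpy as np
from scipy import stats, integrate, optimize

beta, eta = 2.5, 8.0                 # hypothetical Weibull life parameters of the switch (years)
c_p, c_f = 1.0, 10.0                 # relative costs of preventive vs. failure replacement

def reliability(t):
    return stats.weibull_min.sf(t, beta, scale=eta)

def cost_rate(T):
    expected_cycle, _ = integrate.quad(reliability, 0.0, T)
    return (c_p * reliability(T) + c_f * (1.0 - reliability(T))) / expected_cycle

res = optimize.minimize_scalar(cost_rate, bounds=(0.5, 20.0), method="bounded")
print(f"optimal age-replacement cycle ~ {res.x:.2f} years (cost rate {res.fun:.3f}/year)")
```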

  1. HammerCloud: A Stress Testing System for Distributed Analysis

    CERN Document Server

    van der Ster, Daniel C; Ubeda Garcia, Mario; Paladin, Massimo

    2011-01-01

    Distributed analysis of LHC data is an I/O-intensive activity which places large demands on the internal network, storage, and local disks at remote computing facilities. Commissioning and maintaining a site to provide an efficient distributed analysis service is therefore a challenge which can be aided by tools to help evaluate a variety of infrastructure designs and configurations. HammerCloud (HC) is one such tool; it is a stress testing service which is used by central operations teams, regional coordinators, and local site admins to (a) submit an arbitrary number of analysis jobs to a number of sites, (b) maintain at steady state a predefined number of jobs running at the sites under test, (c) produce web-based reports summarizing the efficiency and performance of the sites under test, and (d) present a web interface for historical test results to both evaluate progress and compare sites. HC was built around the distributed analysis framework Ganga, exploiting its API for grid job management. HC has been ...

  2. Three-Phase Harmonic Analysis Method for Unbalanced Distribution Systems

    Directory of Open Access Journals (Sweden)

    Jen-Hao Teng

    2014-01-01

    Due to the unbalanced features of distribution systems, a three-phase harmonic analysis method is essential to accurately analyze the harmonic impact on distribution systems. Moreover, harmonic analysis is the basic tool for harmonic filter design and harmonic resonance mitigation; therefore, the computational performance should also be efficient. An accurate and efficient three-phase harmonic analysis method for unbalanced distribution systems is proposed in this paper. The variations of bus voltages, bus current injections and branch currents affected by harmonic current injections can be analyzed by two relationship matrices developed from the topological characteristics of distribution systems. Some useful formulas are then derived to solve the three-phase harmonic propagation problem. After the harmonic propagation for each harmonic order is calculated, the total harmonic distortion (THD) for bus voltages can be calculated accordingly. The proposed method has better computational performance, since the time-consuming full admittance matrix inverse employed by the commonly used harmonic analysis methods is not necessary in the solution procedure. In addition, the proposed method can provide novel viewpoints in calculating the branch currents and bus voltages under harmonic pollution, which are vital for harmonic filter design. Test results demonstrate the effectiveness and efficiency of the proposed method.

  3. Effect of oxide films, inclusions and Fe on reproducibility of tensile properties in cast Al–Si–Mg alloys: Statistical and image analysis

    International Nuclear Information System (INIS)

    Eisaabadi B, G.; Davami, P.; Kim, S.K.; Varahram, N.; Yoon, Y.O.; Yeom, G.Y.

    2012-01-01

    Double oxide films (hereafter: oxides), inclusions and Fe-rich phases are known to be the most detrimental defects in cast Al–Si–Mg alloys. The effects of these defects on the reproducibility of tensile properties in an Al–7Si–0.35Mg alloy have been investigated in this study. Four different casting conditions (low oxide—low Fe, high oxide—low Fe, low oxide—high Fe and high oxide—high Fe) were studied. In each case, 30 tensile test samples were prepared by casting in a metallic mold and machining (a total of 120 tensile test samples). Results of the tensile tests were analyzed by Weibull three-parameter and mixture analyses. The microstructure and fracture surfaces of samples were studied by optical and scanning electron microscopes. A total of 800 metallographic images (200 images for each experiment) were taken and analyzed by image analysis software. Finally, the relationship between tensile properties and defect characteristics was discussed. According to the results, Fe (Fe-related phases) had a larger negative impact on the tensile properties of the alloy than oxides. On the other hand, Weibull analysis revealed that the scattering of tensile properties was mainly due to the presence of oxides in the microstructure. Results of image analysis showed that the shape factor and number of pores were mainly controlled by oxides and Fe, respectively. Also, there was a clear relationship between the Weibull moduli of UTS and El% and the shape factor of pores. Furthermore, the tensile properties of the examined alloy showed strong dependence on the number of pores.

  4. A planning and analysis framework for evaluating distributed generation and utility strategies

    International Nuclear Information System (INIS)

    Ault, Graham W.

    2000-01-01

    The numbers of smaller scale distributed power generation units connected to the distribution networks of electricity utilities in the UK and elsewhere have grown significantly in recent years. Numerous economic and political drivers have stimulated this growth and continue to provide the environment for future growth in distributed generation. The simple fact that distributed generation is independent from the distribution utility complicates planning and operational tasks for the distribution network. The uncertainty relating to the number, location and type of distributed generating units to connect to the distribution network in the future makes distribution planning a particularly difficult activity. This thesis concerns the problem of distribution network and business planning in the era of distributed generation. A distributed generation strategic analysis framework is proposed to provide the required analytical capability and planning and decision making framework to enable distribution utilities to deal effectively with the challenges and opportunities presented to them by distributed generation. The distributed generation strategic analysis framework is based on the best features of modern planning and decision making methodologies and facilitates scenario based analysis across many utility strategic options and uncertainties. Case studies are presented and assessed to clearly illustrate the potential benefits of such an approach to distributed generation planning in the UK electricity supply industry. (author)

  5. Estimation of design wave heights based on extreme value statistics for Kakinada coast, Bay of Bengal

    Digital Repository Service at National Institute of Oceanography (India)

    Chandramohan, P.; Nayak, B.U.; Raju, N.S.N.

    Statistical analyses for the long-term distribution of significant wave heights were performed using the Lognormal, Weibull, Gumbel and Fréchet distributions for waves measured off Kakinada, Andhra Pradesh, India from June 1983 to May 1984. Fréchet...
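
    As an illustration of the long-term extreme-value fitting involved (with simulated wave heights and an arbitrary exceedance probability, not the Kakinada data), one can fit candidate distributions and read off a design value from the fitted quantile function:

```python
import numpy as np
from scipy import stats

# simulated significant wave heights (m); the paper fitted measured Kakinada data
hs = np.random.default_rng(3).gumbel(loc=1.2, scale=0.4, size=400)

gum = stats.gumbel_r.fit(hs)
wei = stats.weibull_min.fit(hs, floc=0)

p_exceed = 1.0e-3                    # arbitrary exceedance probability for the "design" value
print("Gumbel design Hs  :", round(stats.gumbel_r.ppf(1.0 - p_exceed, *gum), 2), "m")
print("Weibull design Hs :", round(stats.weibull_min.ppf(1.0 - p_exceed, *wei), 2), "m")
```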

  6. Comparative analysis through probability distributions of a data set

    Science.gov (United States)

    Cristea, Gabriel; Constantinescu, Dan Mihai

    2018-02-01

    In practice, probability distributions are applied in such diverse fields as risk analysis, reliability engineering, chemical engineering, hydrology, image processing, physics, market research, business and economic research, customer support, medicine, sociology, demography, etc. This article highlights important aspects of fitting probability distributions to data and applying the analysis results to make informed decisions. There are a number of statistical methods available which can help us to select the best-fitting model. Some of the graphs display both input data and fitted distributions at the same time, as probability density and cumulative distribution. The goodness-of-fit tests can be used to determine whether a certain distribution is a good fit. The main idea is to measure the "distance" between the data and the tested distribution, and compare that distance to some threshold values. Calculating the goodness-of-fit statistics also enables us to order the fitted distributions according to how well they fit the data. This particular feature is very helpful for comparing the fitted models. The paper presents a comparison of the most commonly used goodness-of-fit tests: Kolmogorov-Smirnov, Anderson-Darling, and Chi-Squared. A large set of data is analyzed and conclusions are drawn by visualizing the data, comparing multiple fitted distributions and selecting the best model. These graphs should be viewed as an addition to the goodness-of-fit tests.
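
    A short sketch of the goodness-of-fit comparison discussed above, applying a Kolmogorov-Smirnov test and a binned chi-squared test to a fitted Weibull model on simulated data (note that the chi-squared p-value here ignores the usual degrees-of-freedom correction for estimated parameters):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
data = rng.gamma(shape=2.0, scale=3.0, size=1000)    # hypothetical data set

params = stats.weibull_min.fit(data, floc=0)         # candidate model fitted by maximum likelihood

# Kolmogorov-Smirnov test against the fitted Weibull
ks = stats.kstest(data, "weibull_min", args=params)

# binned chi-squared test: observed vs. expected counts in decile bins
edges = np.quantile(data, np.linspace(0.0, 1.0, 11))
obs, _ = np.histogram(data, bins=edges)
expected = np.diff(stats.weibull_min.cdf(edges, *params)) * data.size
expected *= obs.sum() / expected.sum()               # rescale so the totals match
chi2 = stats.chisquare(obs, expected)

print("KS  :", ks.statistic, ks.pvalue)
print("Chi2:", chi2.statistic, chi2.pvalue)
```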

  7. Evaluation of the climate change impact on wind resources in Taiwan Strait

    International Nuclear Information System (INIS)

    Chang, Tsang-Jung; Chen, Chun-Lung; Tu, Yi-Long; Yeh, Hung-Te; Wu, Yu-Ting

    2015-01-01

    Highlights: • We propose a new statistical downscaling framework to evaluate the climate change impact on wind resources in Taiwan Strait. • The statistical model relates Weibull distribution parameters to output of a GCM model and regression coefficients. • Validation of the simulated wind speed distribution presents an acceptable agreement with meteorological data. • Three chosen GCMs show the same tendency that the eastern half of Taiwan Strait stores higher wind resources. - Abstract: A new statistical downscaling framework is proposed to evaluate the climate change impact on wind resources in Taiwan Strait. In this framework, a two-parameter Weibull distribution function is used to estimate the wind energy density distribution in the strait. An empirically statistical downscaling model that relates the Weibull parameters to output of a General Circulation Model (GCM) and regression coefficients is adopted. The regression coefficients are calculated using wind speed results obtained from a past climate (1981–2000) simulation reconstructed by a Weather Research and Forecasting (WRF) model. These WRF-reconstructed wind speed results are validated with data collected at a weather station on an islet inside the strait. The comparison shows that the probability distributions of the monthly wind speeds obtained from WRF-reconstructed and measured wind speed data are in acceptable agreement, with small discrepancies of 10.3% and 7.9% for the shape and scale parameters of the Weibull distribution, respectively. The statistical downscaling framework with output from three chosen GCMs (i.e., ECHAM5, CM2.1 and CGCM2.3.2) is applied to evaluate the wind energy density distribution in Taiwan Strait for three future climate periods of 2011–2040, 2041–2070, and 2071–2100. The results show that the wind energy density distributions in the future climate periods are higher in the eastern half of Taiwan Strait, but reduce slightly by 3% compared with that in the

  8. Distributed bearing fault diagnosis based on vibration analysis

    Science.gov (United States)

    Dolenc, Boštjan; Boškoski, Pavle; Juričić, Đani

    2016-01-01

    Distributed bearing faults appear under various circumstances, for example due to electroerosion or the progression of localized faults. Bearings with distributed faults tend to generate more complex vibration patterns than those with localized faults. Despite the frequent occurrence of such faults, their diagnosis has attracted limited attention. This paper examines a method for the diagnosis of distributed bearing faults employing vibration analysis. The vibrational patterns generated are modeled by incorporating the geometrical imperfections of the bearing components. Comparing envelope spectra of vibration signals shows that one can distinguish between localized and distributed faults. Furthermore, a diagnostic procedure for the detection of distributed faults is proposed. This is evaluated on several bearings with naturally born distributed faults, which are compared with fault-free bearings and bearings with localized faults. It is shown experimentally that features extracted from vibrations in fault-free, localized and distributed fault conditions form clearly separable clusters, thus enabling diagnosis.

  9. FE Analysis of Rock with Hydraulic-Mechanical Coupling Based on Continuum Damage Evolution

    Directory of Open Access Journals (Sweden)

    Yongliang Wang

    2016-01-01

    A numerical finite element (FE) analysis technique is presented for efficient and reliable solution of rock with hydraulic-mechanical (HM) coupling, used to study the seepage characteristics and simulate the damage evolution of rock. To be in accord with the actual situation, the rock is viewed as a heterogeneous material in which Young's modulus, permeability, and strength obey the typical Weibull distribution function. The classic Biot constitutive relation for rock as a porous medium is introduced to establish a set of equations coupling elastic solid deformation and seepage flow. The rock is subsequently developed into a conceptual and practical model accounting for the damage evolution of Young's modulus and permeability, in which several auxiliary techniques, for example the Drucker-Prager strength criterion, statistical strength theory, and continuum damage evolution, together yield the procedure for calculating the damage variable. On this basis, an effective and reliable numerical FE analysis strategy is established. Numerical examples are given to show that the proposed method can represent heterogeneous rock, is suitable for different load conditions, and is effective and reliable for analyzing the seepage and damage characteristics of rock.
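
    The heterogeneity assignment described above is commonly implemented by sampling element properties from a Weibull distribution; a minimal sketch with an assumed homogeneity index and characteristic modulus (not values from the paper):

```python
import numpy as np

rng = np.random.default_rng(42)
n_elements = 10_000
m, E0 = 3.0, 30.0e9          # assumed homogeneity index and characteristic Young's modulus (Pa)

# element-wise Young's modulus sampled from a Weibull distribution;
# permeability and strength can be sampled the same way, each with its own (m, x0)
E = E0 * rng.weibull(m, n_elements)

print(f"mean E = {E.mean() / 1e9:.1f} GPa, coefficient of variation = {E.std() / E.mean():.2f}")
```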

  10. System analysis and planning of a gas distribution network

    Energy Technology Data Exchange (ETDEWEB)

    Salas, Edwin F.M.; Farias, Helio Monteiro [AUTOMIND, Rio de Janeiro, RJ (Brazil); Costa, Carla V.R. [Universidade Salvador (UNIFACS), BA (Brazil)

    2009-07-01

    The increase in demand by gas consumers requires that projects or improvements in gas distribution networks be made carefully and safely to ensure a continuous, efficient and economical supply. Gas distribution companies must ensure that the networks and equipment involved are defined and designed at the appropriate time to attend to the demands of the market. To do that, a gas distribution network analysis and planning tool should use distribution network and transmission models for the current situation and for the future changes to be implemented. These models are used to evaluate project options and help in making appropriate decisions in order to minimize the capital investment in new components or simple changes in operational procedures. Gas demands are increasing and it is important that gas distributors design new distribution systems to ensure this growth, considering the financial constraints of the company, as well as local legislation and regulation. In this study some steps in developing a flexible system that attends to those needs will be described. The analysis of distribution requires geographically referenced data for the models as well as accurate connectivity and equipment attributes. GIS systems are often used as a repository that holds the majority of this information. GIS systems are constantly updated as distribution network equipment is modified. The distribution network model built from this system ensures that the model represents the current network condition. The benefits of this architecture drastically reduce the creation and maintenance cost of the network models, because network component data are conveniently made available to populate the distribution network model. This architecture ensures that the models continually reflect the reality of the distribution network. (author)

  11. Distributed analysis in ATLAS using GANGA

    International Nuclear Information System (INIS)

    Elmsheuser, Johannes; Brochu, Frederic; Egede, Ulrik; Reece, Will; Williams, Michael; Gaidioz, Benjamin; Maier, Andrew; Moscicki, Jakub; Vanderster, Daniel; Lee, Hurng-Chun; Pajchel, Katarina; Samset, Bjorn; Slater, Mark; Soroko, Alexander; Cowan, Greig

    2010-01-01

    Distributed data analysis using Grid resources is one of the fundamental applications in high energy physics to be addressed and realized before the start of LHC data taking. The needs to manage the resources are very high. In every experiment up to a thousand physicists will be submitting analysis jobs to the Grid. Appropriate user interfaces and helper applications have to be made available to assure that all users can use the Grid without expertise in Grid technology. These tools enlarge the number of Grid users from a few production administrators to potentially all participating physicists. The GANGA job management system (http://cern.ch/ganga), developed as a common project between the ATLAS and LHCb experiments, provides and integrates these kinds of tools. GANGA provides a simple and consistent way of preparing, organizing and executing analysis tasks within the experiment analysis framework, implemented through a plug-in system. It allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid, hiding Grid technicalities. We will be reporting on the plug-ins and our experiences of distributed data analysis using GANGA within the ATLAS experiment. Support for all Grids presently used by ATLAS, namely the LCG/EGEE, NDGF/NorduGrid, and OSG/PanDA, is provided. The integration and interaction with the ATLAS data management system DQ2 into GANGA is a key functionality. Intelligent job brokering is set up by using the job splitting mechanism together with data-set and file location knowledge. The brokering is aided by an automated system that regularly processes test analysis jobs at all ATLAS DQ2 supported sites. Large numbers of analysis jobs can be sent to the locations of data following the ATLAS computing model. GANGA supports amongst other things tasks of user analysis with reconstructed data and small scale production of Monte Carlo data.

  12. Modelling Framework and the Quantitative Analysis of Distributed Energy Resources in Future Distribution Networks

    DEFF Research Database (Denmark)

    Han, Xue; Sandels, Claes; Zhu, Kun

    2013-01-01

    There has been a large body of statements claiming that the large-scale deployment of Distributed Energy Resources (DERs) could eventually reshape the future distribution grid operation in numerous ways. Thus, it is necessary to introduce a framework to measure to what extent the power system is affected by DERs, comprising distributed generation, active demand and electric vehicles. Subsequently, quantitative analysis was made on the basis of the current and envisioned DER deployment scenarios proposed for Sweden. Simulations are performed in two typical distribution network models for four seasons. The simulation results show that in general the DER deployment brings in the possibility of reducing power losses and voltage drops by compensating power from the local generation and optimizing the local load profiles.

  13. Retrospective analysis of 'gamma distribution' based IMRT QA criteria

    International Nuclear Information System (INIS)

    Wen, C.; Chappell, R.A.

    2010-01-01

    Full text: IMRT has been implemented into clinical practice at Royal Hobart Hospital (RHH) since mid 2006 for treating patients with Head and Neck (H and N) or prostate tumours. A local quality assurance (QA) acceptance criterion based on the 'gamma distribution' for approving IMRT plans was developed and implemented in early 2007. A retrospective analysis of this criterion over 194 clinical cases is presented. The RHH IMRT criterion was established on the assumption that the gamma distribution obtained through inter-comparison of 2D dose maps between planned and delivered doses is governed by a positive-half normal distribution. A commercial system, MapCheck, was used for 2D dose map comparison with a built-in gamma analysis tool. A gamma distribution histogram was generated and recorded for all cases. By retrospectively analysing those distributions using a curve-fitting technique, a statistical gamma distribution can be obtained and evaluated. This analytical result can be used for future IMRT planning and treatment delivery. The analyses indicate that the gamma distribution obtained through MapCheck is well under the normal distribution, particularly for prostate cases. The applied pass/fail criterion is not overly sensitive in identifying 'false fails' but can be further tightened up for smaller fields, while for the larger fields found in both H and N and prostate cases the criterion was correctly applied. The non-uniform distribution of detectors in MapCheck and the experience level of planners are two major factors contributing to variation in gamma distribution among clinical cases. This criterion, derived from clinical statistics, is superior to and more accurate than a single-valued criterion for the IMRT QA acceptance procedure. (author)

  14. Predicting The Exit Time Of Employees In An Organization Using Statistical Model

    Directory of Open Access Journals (Sweden)

    Ahmed Al Kuwaiti

    2015-08-01

    Employees are considered an asset to any organization, and each organization provides a better and flexible working environment to retain its best and most resourceful workforce. As such, continuous efforts are made to avoid or delay the exit/withdrawal of employees from the organization. Human resource managers face a challenge in predicting the exit time of employees, and there is no precise model existing at present in the literature. This study was conducted to predict the probability of exit of an employee in an organization using an appropriate statistical model. Accordingly, the authors designed a model using the Additive Weibull distribution to predict the expected exit time of an employee in an organization. In addition, a shock model approach is also executed to check how well the Additive Weibull distribution suits an organization. The analytical results showed that when the inter-arrival time increases, the expected time for the employees to exit also increases. This study concluded that the Additive Weibull distribution can be considered as an alternative to the shock model approach to predict the exit time of an employee in an organization.
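
    A small sketch of how an additive Weibull survival model yields an expected exit time; the hazard form and all parameter values below are assumptions for illustration, not the authors' fitted model:

```python
import numpy as np
from scipy import integrate

# assumed additive Weibull cumulative hazard H(t) = (t/a)**b + (t/c)**d, t in years
a, b, c, d = 8.0, 0.7, 12.0, 3.5      # early-attrition and wear-out components (hypothetical)

def survival(t):
    return np.exp(-((t / a) ** b + (t / c) ** d))

# expected exit time is the integral of the survival function over [0, infinity)
expected_exit, _ = integrate.quad(survival, 0.0, np.inf)
print(f"expected exit time ~ {expected_exit:.1f} years")
```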

  15. Study on the product estimation of small wind turbines; Kogata fusha no hatsudenryo yosoku ni kansuru kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    Matsuzawa, K.; Kimura, Y.; Ushiyama, I. [Ashikaga Institute of Technology, Tochigi (Japan); Nagai, H. [Nihon Univ., Chiba (Japan). Coll. of Industrial Technology

    1998-09-01

    In order to clarify the problems involved in applying the Weibull probability distribution, which is used for estimating the power production of large wind turbines, to small wind turbines, and to find solutions to them, the estimated results are compared with observed ones. The conventional estimation method, when applied to a small wind turbine, tends to overestimate power production, because production is overestimated in the high wind velocity range, which occurs infrequently. Estimation of the power produced by a wind turbine is usually based on the working wind velocity range determined from the furling mechanism and the power generation characteristics of the wind turbine concerned. In the case of a small wind turbine, however, better estimates are obtained by restricting the working wind velocity range over which the Weibull wind velocity distribution is used to determine the probability of occurrence. For wind turbines working at low to medium wind velocities, such as the Savonius wind turbine, the estimates are in fairly good agreement with the observed results, which means that the conventional estimation method aided by the Weibull distribution can be applied directly to such small wind turbines. 4 refs., 3 figs., 3 tabs.
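
    A hedged sketch of the restricted-range estimate described above: the expected power is obtained by integrating a placeholder turbine power curve against the Weibull probability density only over the working wind speed range, so the rarely occurring high-wind tail does not inflate the estimate. All parameters below are made up.

```python
import numpy as np
from scipy import stats, integrate

k, c = 1.8, 4.5                      # hypothetical Weibull wind parameters at hub height (m/s)
v_cut_in, v_furl = 2.5, 12.0         # assumed working wind speed range of the small turbine (m/s)

def power_curve(v):
    """Placeholder power curve (W): cubic up to a rated speed, then constant."""
    rated_v, rated_p = 10.0, 1000.0
    return rated_p * min(v / rated_v, 1.0) ** 3

pdf = lambda v: stats.weibull_min.pdf(v, k, scale=c)
mean_power, _ = integrate.quad(lambda v: power_curve(v) * pdf(v), v_cut_in, v_furl)
annual_energy_kwh = mean_power * 8760.0 / 1000.0
print(f"estimated annual production ~ {annual_energy_kwh:.0f} kWh")
```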

  16. An integrated economic and distributional analysis of energy policies

    Energy Technology Data Exchange (ETDEWEB)

    Labandeira, Xavier [Facultade de CC. Economicas, University of Vigo, 36310 Vigo (Spain); Labeaga, Jose M. [Instituto de Estudios Fiscales, Avda. Cardenal Herrera Oria 378, 28035 Madrid (Spain); Rodriguez, Miguel [Facultade de CC. Empresariais e Turismo, University of Vigo, 32004 Ourense (Spain)

    2009-12-15

    Most public policies, particularly those in the energy sphere, have not only efficiency but also distributional effects. However, there is a trade-off between modelling approaches suitable for calculating those impacts on the economy. For the former most of the studies have been conducted with general equilibrium models, whereas partial equilibrium models represent the main approach for distributional analysis. This paper proposes a methodology to simultaneously carry out an analysis of the distributional and efficiency consequences of changes in energy taxation. In order to do so, we have integrated a microeconomic household demand model and a computable general equilibrium model for the Spanish economy. We illustrate the advantages of this approach by simulating a revenue-neutral reform in Spanish indirect taxation, with a large increase of energy taxes that serve an environmental purpose. The results show that the reforms bring about significant efficiency and distributional effects, in some cases counterintuitive, and demonstrate the academic and social utility of this approximation. (author)

  17. An integrated economic and distributional analysis of energy policies

    International Nuclear Information System (INIS)

    Labandeira, Xavier; Labeaga, Jose M.; Rodriguez, Miguel

    2009-01-01

    Most public policies, particularly those in the energy sphere, have not only efficiency but also distributional effects. However, there is a trade-off between modelling approaches suitable for calculating those impacts on the economy. For the former most of the studies have been conducted with general equilibrium models, whereas partial equilibrium models represent the main approach for distributional analysis. This paper proposes a methodology to simultaneously carry out an analysis of the distributional and efficiency consequences of changes in energy taxation. In order to do so, we have integrated a microeconomic household demand model and a computable general equilibrium model for the Spanish economy. We illustrate the advantages of this approach by simulating a revenue-neutral reform in Spanish indirect taxation, with a large increase of energy taxes that serve an environmental purpose. The results show that the reforms bring about significant efficiency and distributional effects, in some cases counterintuitive, and demonstrate the academic and social utility of this approximation. (author)

  18. Joint analysis of air pollution in street canyons in St. Petersburg and Copenhagen

    Science.gov (United States)

    Genikhovich, E. L.; Ziv, A. D.; Iakovleva, E. A.; Palmgren, F.; Berkowicz, R.

    The bi-annual data set of concentrations of several traffic-related air pollutants, measured continuously in street canyons in St. Petersburg and Copenhagen, is analysed jointly using different statistical techniques. Annual mean concentrations of NO 2, NO x and, especially, benzene are found systematically higher in St. Petersburg than in Copenhagen but for ozone the situation is opposite. In both cities probability distribution functions (PDFs) of concentrations and their daily or weekly extrema are fitted with the Weibull and double exponential distributions, respectively. Sample estimates of bi-variate distributions of concentrations, concentration roses, and probabilities of concentration of one pollutant being extreme given that another one reaches its extremum are presented in this paper as well as auto- and co-spectra. It is demonstrated that there is a reasonably high correlation between seasonally averaged concentrations of pollutants in St. Petersburg and Copenhagen.

  19. Mediation Analysis with Survival Outcomes: Accelerated Failure Time vs. Proportional Hazards Models.

    Science.gov (United States)

    Gelfand, Lois A; MacKinnon, David P; DeRubeis, Robert J; Baraldi, Amanda N

    2016-01-01

    Survival time is an important type of outcome variable in treatment research. Currently, limited guidance is available regarding performing mediation analyses with survival outcomes, which generally do not have normally distributed errors, and contain unobserved (censored) events. We present considerations for choosing an approach, using a comparison of semi-parametric proportional hazards (PH) and fully parametric accelerated failure time (AFT) approaches for illustration. We compare PH and AFT models and procedures in their integration into mediation models and review their ability to produce coefficients that estimate causal effects. Using simulation studies modeling Weibull-distributed survival times, we compare statistical properties of mediation analyses incorporating PH and AFT approaches (employing SAS procedures PHREG and LIFEREG, respectively) under varied data conditions, some including censoring. A simulated data set illustrates the findings. AFT models integrate more easily than PH models into mediation models. Furthermore, mediation analyses incorporating LIFEREG produce coefficients that can estimate causal effects, and demonstrate superior statistical properties. Censoring introduces bias in the coefficient estimate representing the treatment effect on outcome-underestimation in LIFEREG, and overestimation in PHREG. With LIFEREG, this bias can be addressed using an alternative estimate obtained from combining other coefficients, whereas this is not possible with PHREG. When Weibull assumptions are not violated, there are compelling advantages to using LIFEREG over PHREG for mediation analyses involving survival-time outcomes. Irrespective of the procedures used, the interpretation of coefficients, effects of censoring on coefficient estimates, and statistical properties should be taken into account when reporting results.

  20. Mediation Analysis with Survival Outcomes: Accelerated Failure Time vs. Proportional Hazards Models

    Science.gov (United States)

    Gelfand, Lois A.; MacKinnon, David P.; DeRubeis, Robert J.; Baraldi, Amanda N.

    2016-01-01

    Objective: Survival time is an important type of outcome variable in treatment research. Currently, limited guidance is available regarding performing mediation analyses with survival outcomes, which generally do not have normally distributed errors, and contain unobserved (censored) events. We present considerations for choosing an approach, using a comparison of semi-parametric proportional hazards (PH) and fully parametric accelerated failure time (AFT) approaches for illustration. Method: We compare PH and AFT models and procedures in their integration into mediation models and review their ability to produce coefficients that estimate causal effects. Using simulation studies modeling Weibull-distributed survival times, we compare statistical properties of mediation analyses incorporating PH and AFT approaches (employing SAS procedures PHREG and LIFEREG, respectively) under varied data conditions, some including censoring. A simulated data set illustrates the findings. Results: AFT models integrate more easily than PH models into mediation models. Furthermore, mediation analyses incorporating LIFEREG produce coefficients that can estimate causal effects, and demonstrate superior statistical properties. Censoring introduces bias in the coefficient estimate representing the treatment effect on outcome—underestimation in LIFEREG, and overestimation in PHREG. With LIFEREG, this bias can be addressed using an alternative estimate obtained from combining other coefficients, whereas this is not possible with PHREG. Conclusions: When Weibull assumptions are not violated, there are compelling advantages to using LIFEREG over PHREG for mediation analyses involving survival-time outcomes. Irrespective of the procedures used, the interpretation of coefficients, effects of censoring on coefficient estimates, and statistical properties should be taken into account when reporting results. PMID:27065906

  1. Mediation Analysis with Survival Outcomes: Accelerated Failure Time Versus Proportional Hazards Models

    Directory of Open Access Journals (Sweden)

    Lois A Gelfand

    2016-03-01

    Objective: Survival time is an important type of outcome variable in treatment research. Currently, limited guidance is available regarding performing mediation analyses with survival outcomes, which generally do not have normally distributed errors, and contain unobserved (censored) events. We present considerations for choosing an approach, using a comparison of semi-parametric proportional hazards (PH) and fully parametric accelerated failure time (AFT) approaches for illustration. Method: We compare PH and AFT models and procedures in their integration into mediation models and review their ability to produce coefficients that estimate causal effects. Using simulation studies modeling Weibull-distributed survival times, we compare statistical properties of mediation analyses incorporating PH and AFT approaches (employing SAS procedures PHREG and LIFEREG, respectively) under varied data conditions, some including censoring. A simulated data set illustrates the findings. Results: AFT models integrate more easily than PH models into mediation models. Furthermore, mediation analyses incorporating LIFEREG produce coefficients that can estimate causal effects, and demonstrate superior statistical properties. Censoring introduces bias in the coefficient estimate representing the treatment effect on outcome – underestimation in LIFEREG, and overestimation in PHREG. With LIFEREG, this bias can be addressed using an alternative estimate obtained from combining other coefficients, whereas this is not possible with PHREG. Conclusions: When Weibull assumptions are not violated, there are compelling advantages to using LIFEREG over PHREG for mediation analyses involving survival-time outcomes. Irrespective of the procedures used, the interpretation of coefficients, effects of censoring on coefficient estimates, and statistical properties should be taken into account when reporting results.

  2. First Experiences with LHC Grid Computing and Distributed Analysis

    CERN Document Server

    Fisk, Ian

    2010-01-01

    In this presentation the experiences of the LHC experiments using grid computing were presented with a focus on experience with distributed analysis. After many years of development, preparation, exercises, and validation the LHC (Large Hadron Collider) experiments are in operations. The computing infrastructure has been heavily utilized in the first 6 months of data collection. The general experience of exploiting the grid infrastructure for organized processing and preparation is described, as well as the successes employing the infrastructure for distributed analysis. At the end the expected evolution and future plans are outlined.

  3. Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data

    KAUST Repository

    Tekwe, C. D.

    2012-05-24

    MOTIVATION: Protein abundance in quantitative proteomics is often based on observed spectral features derived from liquid chromatography mass spectrometry (LC-MS) or LC-MS/MS experiments. Peak intensities are largely non-normal in distribution. Furthermore, LC-MS-based proteomics data frequently have large proportions of missing peak intensities due to censoring mechanisms on low-abundance spectral features. Recognizing that the observed peak intensities detected with the LC-MS method are all positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney rank sum tests, and the parametric survival model and accelerated failure time-model with log-normal, log-logistic and Weibull distributions were used to detect any differentially expressed proteins. The statistical operating characteristics of each method are explored using both real and simulated datasets. RESULTS: Survival methods generally have greater statistical power than standard differential expression methods when the proportion of missing protein level data is 5% or more. In particular, the AFT models we consider consistently achieve greater statistical power than standard testing procedures, with the discrepancy widening with increasing missingness in the proportions. AVAILABILITY: The testing procedures discussed in this article can all be performed using readily available software such as R. The R codes are provided as supplemental materials. CONTACT: ctekwe@stat.tamu.edu.

  4. GIS-based poverty and population distribution analysis in China

    Science.gov (United States)

    Cui, Jing; Wang, Yingjie; Yan, Hong

    2009-07-01

    Geographically, poverty status is not only related to socio-economic factors but is also strongly affected by the geographical environment. In this paper, a GIS-based poverty and population distribution analysis method is introduced for revealing their regional differences. More than 100000 poor villages and 592 national key poor counties are chosen for the analysis. The results show that poverty distribution tends to concentrate in most of west China and in the mountainous rural areas of mid China. Furthermore, the fifth census data are overlaid on those poor areas in order to capture their internal diversity of socio-economic characteristics. By overlaying poverty-related socio-economic parameters, such as sex ratio, illiteracy, education level, percentage of ethnic minorities and family composition, the findings show that poverty distribution is strongly correlated with a high illiteracy rate, a high percentage of ethnic minorities, and larger family size.

  5. Size distribution of magnetic iron oxide nanoparticles using Warren-Averbach XRD analysis

    Science.gov (United States)

    Mahadevan, S.; Behera, S. P.; Gnanaprakash, G.; Jayakumar, T.; Philip, J.; Rao, B. P. C.

    2012-07-01

    We use the Fourier transform based Warren-Averbach (WA) analysis to separate the contributions of X-ray diffraction (XRD) profile broadening due to crystallite size and microstrain for magnetic iron oxide nanoparticles. The profile shape of the column length distribution, obtained from WA analysis, is used to analyze the shape of the magnetic iron oxide nanoparticles. From the column length distribution, the crystallite size and its distribution are estimated for these nanoparticles which are compared with size distribution obtained from dynamic light scattering measurements. The crystallite size and size distribution of crystallites obtained from WA analysis are explained based on the experimental parameters employed in preparation of these magnetic iron oxide nanoparticles. The variation of volume weighted diameter (Dv, from WA analysis) with saturation magnetization (Ms) fits well to a core shell model wherein it is known that Ms=Mbulk(1-6g/Dv) with Mbulk as bulk magnetization of iron oxide and g as magnetic shell disorder thickness.
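
    The core-shell relation quoted above, Ms = Mbulk(1 - 6g/Dv), can be fitted directly by nonlinear least squares; the diameter and magnetization values below are hypothetical:

```python
import numpy as np
from scipy.optimize import curve_fit

# hypothetical volume-weighted diameters (nm) and saturation magnetizations (emu/g)
Dv = np.array([6.0, 8.0, 10.0, 12.0, 15.0])
Ms = np.array([42.0, 51.0, 57.0, 61.0, 66.0])

def core_shell(D, M_bulk, g):
    # Ms = M_bulk * (1 - 6*g/D), g being the magnetically disordered shell thickness
    return M_bulk * (1.0 - 6.0 * g / D)

(M_bulk, g), _ = curve_fit(core_shell, Dv, Ms, p0=[75.0, 0.5])
print(f"M_bulk ~ {M_bulk:.1f} emu/g, shell thickness g ~ {g:.2f} nm")
```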

  6. CLEAVAGE FRACTURE ANALYSIS OF CLADDED BEAMS WITH AN EMBEDDED FLAW UNDER FOUR-POINT BENDING

    International Nuclear Information System (INIS)

    Yin, Shengjun; Williams, Paul T; Bass, Bennett Richard

    2008-01-01

    Semi-large scale embedded-flaw beams were tested at the Nuclear Research Institute (NRI) Rez in the Czech Republic for the 6th Network for Evaluating Structural Components (NESC-VI) project. The experiments included, among others, a series of semi-large scale tests on cladded beam specimens containing simulated sub-clad flaws. Oak Ridge National Laboratory (ORNL) conducted numerical studies to analyze the constraint issues associated with embedded flaws using various fracture mechanics methods, including T-stress, the hydrostatic-stress-based QH stress, and the Weibull stress model. The recently developed local approach using the modified Weibull stress model combined with the Master Curve methodology was also utilized to predict the failure probability (Pf) of the semi-large scale beams. For this study, the Weibull statistical model associated with the Master Curve methodology was employed to stochastically simulate the fracture toughness data using the available Master Curve reference temperature T0 for the tested base material from the 'aged' WWER-440 Reactor Pressure Vessel (RPV). The study also investigated the sensitivity of the predicted probability of failure of the semi-large scale beams with embedded flaws to different Weibull shape parameters, m

  7. Uncertainty analysis for secondary energy distributions

    International Nuclear Information System (INIS)

    Gerstl, S.A.W.

    1978-01-01

    In many transport calculations the integral design parameter of interest (response) is determined mainly by secondary particles such as gamma rays from (n,γ) reactions or secondary neutrons from inelastic scattering events or (n,2n) reactions. Standard sensitivity analysis usually allows one to calculate the sensitivities to the production cross sections of such secondaries, but an extended formalism is needed to also obtain the sensitivities to the energy distribution of the generated secondary particles. For a 30-group standard cross-section set, 84% of all non-zero table positions pertain to the description of secondary energy distributions (SEDs) and only 16% to the actual reaction cross sections. Therefore, any sensitivity/uncertainty analysis which does not consider the effects of SEDs is incomplete and neglects most of the input data. This paper describes the methods by which sensitivity profiles for SEDs are obtained and used to estimate the uncertainty of an integral response due to uncertainties in these SEDs. The detailed theory is documented elsewhere and implemented in the LASL sensitivity code SENSIT. SED sensitivity profiles have proven particularly valuable in cross-section uncertainty analyses for fusion reactors. Even when the production cross sections for secondary neutrons were assumed to be without error, the uncertainties in the energy distribution of these secondaries produced appreciable uncertainties in the calculated tritium breeding rate. However, complete error files for SEDs are presently nonexistent. Therefore, methods are described that allow rough error estimates due to estimated SED uncertainties based on integral SED sensitivities.

  8. Modelización de la antigüedad de las citas en la literatura científica con datos censurados a la derecha

    Directory of Open Access Journals (Sweden)

    Basulto Santos, Jesús

    2002-06-01

    We explore the application of the Weibull model to the age distribution of citations in the references of scientific articles, when this age is right-censored. The Weibull model has been applied to twelve journals in the field of applied economics, where the basic source of information is the ISI. The censored age is due to the fact that citations of 10 or more years appear accumulated in the ISI database. After the model has been fitted, we have built a two-dimensional analysis combining the impact factor, which captures short-term citations, with the 90th percentile, which measures the durability or validity period of the scientific articles.

  9. Location and Size Planning of Distributed Photovoltaic Generation in Distribution network System Based on K-means Clustering Analysis

    Science.gov (United States)

    Lu, Siqi; Wang, Xiaorong; Wu, Junyong

    2018-01-01

    The paper presents a method to generate planning scenarios, based on a data-driven K-means clustering analysis algorithm, for the location and size planning of distributed photovoltaic (PV) units in the network. Taking the power losses of the network, the installation and maintenance costs of distributed PV, the profit of distributed PV and the voltage offset as objectives, and the locations and sizes of distributed PV as decision variables, the Pareto optimal front is obtained through a self-adaptive genetic algorithm (GA) and solutions are ranked by the technique for order preference by similarity to an ideal solution (TOPSIS). Finally, the planning schemes at the top of the ranking list are selected, according to different planning emphases, after detailed analysis. The proposed method is applied to a 10-kV distribution network in Gansu Province, China and the results are discussed.

  10. Wind Resource Assessment in Abadan Airport in Iran

    Directory of Open Access Journals (Sweden)

    Mojtaba Nedaei

    2012-11-01

    Renewable energies have potential for supplying relatively clean and mostly local energy. Wind energy generation is expected to increase in the near future and has experienced dramatic growth over the past decade in many countries. Wind speed is the most important parameter in the design and study of wind energy conversion systems. Probability density functions such as the Weibull and Rayleigh are often used in wind speed and wind energy analyses. This paper presents an assessment of wind energy at three heights over nearly two years, based on the Weibull distribution function, at Abadan Airport. Extrapolation of the 10 m and 40 m data, using the power law, has been used to determine the wind speed at a height of 80 m. According to the results, the wind speed at 80 m height in Abadan ranges from 5.8 m/s in November to 8.5 m/s in June, with an average value of 7.15 m/s. In this study, different parameters such as the Weibull parameters, diurnal and monthly wind speeds, cumulative distribution and turbulence intensity have been estimated and analyzed. In addition, energy production of different wind turbines at different heights was estimated. The results show that the studied site has good potential for installation of large and commercial wind turbines at a height of 80 m or higher. Keywords: Abadan, Iran, wind energy, wind resource, wind turbine, Weibull
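
    The vertical extrapolation step mentioned above follows the power law v(z2) = v(z1) * (z2/z1)^alpha, with the shear exponent alpha inferred from the two measurement heights; the speeds used below are placeholders, not the Abadan measurements:

```python
import numpy as np

v10, v40 = 4.1, 5.6                  # placeholder mean speeds (m/s) at the measurement heights
z1, z2, z_target = 10.0, 40.0, 80.0  # heights (m)

# shear exponent inferred from the two measured levels, then applied upward
alpha = np.log(v40 / v10) / np.log(z2 / z1)
v80 = v40 * (z_target / z2) ** alpha

print(f"shear exponent alpha = {alpha:.3f}, estimated mean speed at 80 m = {v80:.2f} m/s")
```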

  11. Distributed analysis environment for HEP and interdisciplinary applications

    CERN Document Server

    Moscicki, J T

    2003-01-01

    Huge data volumes of Large Hadron Collider experiments require parallel end-user analysis on clusters of hundreds of machines. While the focus of end-user High-Energy Physics analysis is on ntuples, the master-worker model of parallel processing may also be used in other contexts such as detector simulation. The aim of the DIANE R&D project (http://cern.ch/it-proj-diane), currently held by the CERN IT/API group, is to create a generic, component-based framework for distributed, parallel data processing in the master-worker model. Pre-compiled user analysis code is loaded dynamically at runtime in component libraries and called back when appropriate. Such an application-oriented framework must be flexible enough to integrate with the emerging GRID technologies as they become available in the time to come. Therefore, common services such as environment reconstruction, code distribution, load balancing and authentication are designed and implemented as pluggable modules. This approach makes it easy to replace them with modul...

  12. Statistical damage analysis of transverse cracking in high temperature composite laminates

    International Nuclear Information System (INIS)

    Sun Zuo; Daniel, I.M.; Luo, J.J.

    2003-01-01

    High temperature polymer composites are receiving special attention because of their potential applications to high speed transport airframe structures and aircraft engine components exposed to elevated temperatures. In this study, a statistical analysis was used to study the progressive transverse cracking in a typical high temperature composite. The mechanical properties of this unidirectional laminate were first characterized both at room and high temperatures. Damage mechanisms of transverse cracking in cross-ply laminates were studied by X-ray radiography at room temperature and by an in-test photography technique at high temperature. Since the tensile strength of the unidirectional laminate along the transverse direction was found to follow a Weibull distribution, a Monte Carlo simulation technique based on experimentally obtained parameters was applied to predict transverse cracking at different temperatures. Experiments and simulations agree well, both at room temperature and at 149 deg. C (the stress-free temperature), in terms of applied stress versus crack density. The probability density function (PDF) of transverse crack spacing considering the statistical strength distribution was also developed, and good agreement with simulation and experimental results is reached. Finally, a generalized master curve that predicts the normalized applied stress versus normalized crack density for various lay-ups and various temperatures was established

  13. Objective Bayesian Analysis of Skew- t Distributions

    KAUST Repository

    BRANCO, MARCIA D'ELIA

    2012-02-27

    We study the Jeffreys prior and its properties for the shape parameter of univariate skew-t distributions with linear and nonlinear Student's t skewing functions. In both cases, we show that the resulting priors for the shape parameter are symmetric around zero and proper. Moreover, we propose a Student's t approximation of the Jeffreys prior that makes an objective Bayesian analysis easy to perform. We carry out a Monte Carlo simulation study that demonstrates an overall better behaviour of the maximum a posteriori estimator compared with the maximum likelihood estimator. We also compare the frequentist coverage of the credible intervals based on the Jeffreys prior and its approximation and show that they are similar. We further discuss location-scale models under scale mixtures of skew-normal distributions and show some conditions for the existence of the posterior distribution and its moments. Finally, we present three numerical examples to illustrate the implications of our results on inference for skew-t distributions. © 2012 Board of the Foundation of the Scandinavian Journal of Statistics.

  14. Statistical analysis of the sustained lava dome emplacement and destruction processes at Popocatépetl volcano, Central México

    Science.gov (United States)

    Mendoza-Rosas, Ana Teresa; Gómez-Vázquez, Ángel; De la Cruz-Reyna, Servando

    2017-06-01

    Popocatépetl volcano reawakened in 1994 after nearly 70 years of quiescence. Between 1996 and 2015, a succession of at least 38 lava domes have been irregularly emplaced and destroyed, with each dome reaching particular volumes at specific emplacement rates. The complexity of this sequence is analyzed using statistical methods in an attempt to gain insight into the physics and dynamics of the lava dome emplacement and destruction process and to objectively assess the hazards related to that volcano. The time series of emplacements, dome residences, lava effusion lulls, and emplaced dome volumes and thicknesses are modeled using the simple exponential and Weibull distributions, the compound non-homogeneous generalized Pareto-Poisson process (NHPPP), and the mixture of exponentials distribution (MOED). The statistical analysis reveals that the sequence of dome emplacements is a non-stationary, self-regulating process most likely controlled by the balance between buoyancy-driven magma ascent and volatile exsolution crystallization. This balance has supported the sustained effusive activity for decades and may persist for an undetermined amount of time. However, the eruptive history of Popocatépetl includes major Plinian phases that may have resulted from a breach in that balance. Certain criteria to recognize such breaching conditions are inferred from this statistical analysis.

  15. Unified nano-mechanics based probabilistic theory of quasibrittle and brittle structures: I. Strength, static crack growth, lifetime and scaling

    Science.gov (United States)

    Le, Jia-Liang; Bažant, Zdeněk P.; Bazant, Martin Z.

    2011-07-01

    strength and tests of the power law for the crack growth rate. The theory is shown to match closely numerous test data on strength and static lifetime of ceramics and concrete, and explains why their histograms deviate systematically from the straight line in Weibull scale. Although the present unified theory is built on several previous advances, new contributions are here made to address: (i) a crack in a disordered nano-structure (such as that of hydrated Portland cement), (ii) tail probability of a fiber bundle (or parallel coupling) model with softening elements, (iii) convergence of this model to the Gaussian distribution, (iv) the stress-life curve under constant load, and (v) a detailed random walk analysis of crack front jumps in an atomic lattice. The nonlocal behavior is captured in the present theory through the finiteness of the number of links in the weakest-link model, which explains why the mean size effect coincides with that of the previously formulated nonlocal Weibull theory. Brittle structures correspond to the large-size limit of the present theory. An important practical conclusion is that the safety factors for strength and tolerable minimum lifetime for large quasibrittle structures (e.g., concrete structures and composite airframes or ship hulls, as well as various micro-devices) should be calculated as a function of structure size and geometry.

  16. Analysis of Faraday Mirror in Auto-Compensating Quantum Key Distribution

    International Nuclear Information System (INIS)

    Wei Ke-Jin; Ma Hai-Qiang; Li Rui-Xue; Zhu Wu; Liu Hong-Wei; Zhang Yong; Jiao Rong-Zhen

    2015-01-01

    The ‘plug and play’ quantum key distribution system is the most stable and the earliest commercial system in the quantum communication field. Jones matrix and Jones calculus are widely used in the analysis of this system and the improved version, which is called the auto-compensating quantum key distribution system. Unfortunately, existing analysis has two drawbacks: only the auto-compensating process is analyzed and existing systems do not fully consider laser phase affected by a Faraday mirror (FM). In this work, we present a detailed analysis of the output of light pulse transmitting in a plug and play quantum key distribution system that contains only an FM, by Jones calculus. A similar analysis is made to a home-made auto-compensating system which contains two FMs to compensate for environmental effects. More importantly, we show that theoretical and experimental results are different in the plug and play interferometric setup due to the fact that a conventional Jones matrix of FM neglected an additional phase π on alternative polarization direction. To resolve the above problem, we give a new Jones matrix of an FM according to the coordinate rotation. This new Jones matrix not only resolves the above contradiction in the plug and play interferometric setup, but also is suitable for the previous analyses about auto-compensating quantum key distribution. (paper)

  17. User-friendly Tool for Power Flow Analysis and Distributed Generation Optimisation in Radial Distribution Networks

    Directory of Open Access Journals (Sweden)

    M. F. Akorede

    2017-06-01

    Full Text Available The intent of power distribution companies (DISCOs) is to deliver electric power to their customers in an efficient and reliable manner – with minimal energy loss cost. One major way to minimise power loss on a given power system is to install distributed generation (DG) units on the distribution networks. However, to maximise benefits, it is highly crucial for a DISCO to ensure that these DG units are of optimal size and sited in the best locations on the network. This paper gives an overview of a software package developed in this study, called the Power Flow Analysis and DG Optimisation Tool (PFADOT). The main purpose of the graphical user interface-based package is to guide a DISCO in finding the optimal size and location for DG placement in radial distribution networks. The package, which is also suitable for load flow analysis, employs the GUI feature of MATLAB. Three objective functions are formulated into a single optimisation problem and solved with a fuzzy genetic algorithm to simultaneously obtain the optimal DG size and location. The accuracy and reliability of the developed tool were validated using several radial test systems, and the results obtained are evaluated against an existing similar package cited in the literature and found to be impressive and computationally efficient.

  18. Influence of microscopic inhomogeneity on macroscopic transport current of Ag/Bi2223 tapes

    International Nuclear Information System (INIS)

    Ogawa, Kazuhiro; Osamura, Kozo

    2004-01-01

    In Ag/Bi2223 tapes, inhomogeneities such as spatially distributed weak links or non-superconducting oxides are inevitably introduced because of the complicated manufacturing process and thermodynamic instability. In order to clarify the effect of the difference in such microscopic inhomogeneities on the macroscopic current transport properties, we carried out a numerical analysis. By changing the volume fraction (Vf) of the Bi2223 phase and the shape of the local distribution of critical current at each weak link, it is revealed that the I-V characteristics are largely affected by the breadth of the local distributions, with a different dependence on the Vf of Bi2223, and that the calculated results can be analyzed by a Weibull distribution function with parameters that include the information of the two-dimensional distribution

  19. Use of Frequency Distribution Functions to Establish Safe Conditions in Relation to the Foodborne Pathogen Bacillus cereus

    Directory of Open Access Journals (Sweden)

    Begoña Delgado

    2005-01-01

    Full Text Available Minimal processing implementation greatly depends on a detailed knowledge of the effects of preservation factors and their combinations on spoilage and foodborne pathogenic microorganisms. The effectiveness of mild preservation conditions will become increasingly dependent on a more stochastic approach linking microbial physiological factors with product preservation factors. In this study, the validity of frequency distributions to efficiently describe the inactivation and growth of Bacillus cereus in the presence of natural antimicrobials (essential oils) has been studied. For this purpose, vegetative cells were exposed to 0.6 mM of thymol or cymene, obtaining survival curves that were best described by the Weibull distribution, since a tailing effect was observed. B. cereus was also exposed in a growth medium to a low concentration (0.1 mM) of both antimicrobials, separately or combined, and the lag times obtained were fitted to a normal distribution, which allowed a description of the dispersion of the start of growth. This allowed a more efficient evaluation of the experimental data to establish safe processing conditions according to accurate parameters and their implementation in risk assessment.
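
    A small sketch of the Weibull-type survival fit the abstract refers to, using the common log-linear form log10(N/N0) = -(t/delta)^p; the exposure times and log reductions below are invented for illustration, and a fitted p below 1 corresponds to the tailing behaviour mentioned in the record.

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_log_survival(t, delta, p):
    """Weibull survival model: log10(N/N0) = -(t / delta)**p."""
    return -(t / delta) ** p

t = np.array([5.0, 10.0, 20.0, 30.0, 45.0, 60.0])           # minutes (illustrative)
log_red = np.array([-0.8, -1.4, -2.1, -2.5, -2.9, -3.1])    # log10 reductions (illustrative)

(delta_hat, p_hat), _ = curve_fit(weibull_log_survival, t, log_red, p0=[10.0, 1.0])
print(delta_hat, p_hat)   # p < 1 reflects an upward-concave (tailing) survival curve
```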

  20. A Modeling Framework for Schedulability Analysis of Distributed Avionics Systems

    DEFF Research Database (Denmark)

    Han, Pujie; Zhai, Zhengjun; Nielsen, Brian

    2018-01-01

    This paper presents a modeling framework for schedulability analysis of distributed integrated modular avionics (DIMA) systems that consist of spatially distributed ARINC-653 modules connected by a unified AFDX network. We model a DIMA system as a set of stopwatch automata (SWA) in UPPAAL...

  1. A Stand-Class Growth and Yield Model for Mexico’s Northern Temperate, Mixed and Multiaged Forests

    Directory of Open Access Journals (Sweden)

    José Návar

    2014-12-01

    Full Text Available The aim of this research was to develop a stand-class growth and yield model based on the diameter growth dynamics of Pinus spp. and Quercus spp. in Mexico's mixed temperate forests. Using a total of 2663 temporary circular sampling plots of 1000 m2 each, nine Weibull parameter-estimation techniques were used to fit the distribution to the diameter structures of pines and oaks. Statistical equations using stand attributes and the first three moments of the diameter distribution predicted and recovered the Weibull parameters. Using nearly 1200 and 100 harvested trees for pines and oaks, respectively, I developed the total height versus diameter at breast height relationship by fitting three non-linear functions. The Newnham model predicted stem taper, and numerical integration was used to estimate merchantable timber volume for all trees in the stand for each diameter class. The independence of the diameter structures of pines and oaks was tested by regressing the Weibull parameters and projecting diameter structures. The model predicts that diameter distributions transition from exponential (inverse-J, logarithmic) to well-balanced distributions with increasing mean stand diameter at breast height. Pine diameter distributions transition faster, and the model predicts independent growth rates between pines and oaks. The stand-class growth and yield model must be completed with the diameter-age relationship for oaks in order to carry out a full optimization procedure to find the stand density and genera composition that maximize forest growth.
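
    One simple way to recover Weibull parameters from stand attributes, in the spirit of the moment-based recovery mentioned above, is to solve for the shape from the coefficient of variation of the diameters and then back out the scale from the mean; the mean and standard deviation below are illustrative stand attributes, not values from the 2663 plots.

```python
from math import gamma, sqrt
from scipy.optimize import brentq

def recover_weibull(mean_dbh, sd_dbh):
    """Recover 2-parameter Weibull (shape k, scale c) from the first two moments."""
    cv = sd_dbh / mean_dbh
    # The coefficient of variation of a Weibull depends on the shape k only:
    #   CV(k) = sqrt(Gamma(1 + 2/k) - Gamma(1 + 1/k)^2) / Gamma(1 + 1/k)
    def cv_gap(k):
        g1, g2 = gamma(1 + 1 / k), gamma(1 + 2 / k)
        return sqrt(g2 - g1**2) / g1 - cv
    k = brentq(cv_gap, 0.2, 50.0)
    c = mean_dbh / gamma(1 + 1 / k)
    return k, c

print(recover_weibull(mean_dbh=22.0, sd_dbh=8.5))   # cm, illustrative stand attributes
```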

  2. Thermographic Analysis of Stress Distribution in Welded Joints

    Directory of Open Access Journals (Sweden)

    Domazet Ž.

    2010-06-01

    Full Text Available The fatigue life prediction of welded joints based on S-N curves in conjunction with nominal stresses generally is not reliable. Stress distribution in the welded area, affected by geometrical inhomogeneity, irregular welded surface and weld toe radius, is quite complex, so the local (structural) stress concept is accepted in recent papers. The aim of this paper is to determine the stress distribution in plate type aluminum welded joints, to analyze the reliability of TSA (Thermal Stress Analysis) in this kind of investigation, and to obtain numerical values for stress concentration factors for practical use. Stress distribution in aluminum butt and fillet welded joints is determined by using three different methods: strain gauge measurement, thermal stress analysis and FEM. Obtained results show good agreement - the TSA mutually confirmed the FEM model and the stresses measured by strain gauges. According to the obtained results, it may be stated that TSA, as a relatively new measurement technique, may in the future become a standard tool for the experimental investigation of stress concentration and fatigue in welded joints that can help to develop more accurate numerical tools for fatigue life prediction.

  3. Thermographic Analysis of Stress Distribution in Welded Joints

    Science.gov (United States)

    Piršić, T.; Krstulović Opara, L.; Domazet, Ž.

    2010-06-01

    The fatigue life prediction of welded joints based on S-N curves in conjunction with nominal stresses generally is not reliable. Stress distribution in welded area affected by geometrical inhomogeneity, irregular welded surface and weld toe radius is quite complex, so the local (structural) stress concept is accepted in recent papers. The aim of this paper is to determine the stress distribution in plate type aluminum welded joints, to analyze the reliability of TSA (Thermal Stress Analysis) in this kind of investigations, and to obtain numerical values for stress concentration factors for practical use. Stress distribution in aluminum butt and fillet welded joints is determined by using the three different methods: strain gauges measurement, thermal stress analysis and FEM. Obtained results show good agreement - the TSA mutually confirmed the FEM model and stresses measured by strain gauges. According to obtained results, it may be stated that TSA, as a relatively new measurement technique may in the future become a standard tool for the experimental investigation of stress concentration and fatigue in welded joints that can help to develop more accurate numerical tools for fatigue life prediction.

  4. PM10 Analysis for Three Industrialized Areas using Extreme Value

    International Nuclear Information System (INIS)

    Hasfazilah Ahmat; Ahmad Shukri Yahaya; Nor Azam Ramli; Hasfazilah Ahmat

    2015-01-01

    One of the concerns of air pollution studies is to compute the concentrations of one or more pollutant species in space and time in relation to the independent variables, for instance emissions into the atmosphere, meteorological factors and parameters. One of the most significant statistical disciplines developed for the applied sciences and many other disciplines over the last few decades is extreme value theory (EVT). This study assesses the use of the extreme value distributions of the two-parameter Gumbel, the two- and three-parameter Weibull, the Generalized Extreme Value (GEV) and the two- and three-parameter Generalized Pareto Distribution (GPD) on the maximum concentrations of daily PM10 data recorded in the years 2010 - 2012 in Pasir Gudang, Johor; Bukit Rambai, Melaka; and Nilai, Negeri Sembilan. Parameters for all distributions are estimated using the Method of Moments (MOM) and the Maximum Likelihood Estimator (MLE). Six performance indicators, namely the accuracy measures, which include predictive accuracy (PA), Coefficient of Determination (R2) and Index of Agreement (IA), and the error measures, which consist of Root Mean Square Error (RMSE), Mean Absolute Error (MAE) and Normalized Absolute Error (NAE), are used to assess the goodness-of-fit of the distributions. The best distribution is selected based on the highest accuracy measures and the smallest error measures. The results showed that the GEV is the best fit for the daily maximum concentration of PM10 for all monitoring stations. The analysis also demonstrates that the estimated number of days in which the concentration of PM10 exceeded the Malaysian Ambient Air Quality Guidelines (MAAQG) of 150 µg/m3 is between 0.5 and 1.5 days. (author)
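
    As a hedged sketch of the GEV fitting and error-measure step described above, the snippet below fits a GEV by maximum likelihood to synthetic daily maxima and reports RMSE and MAE between observed order statistics and fitted quantiles, plus the expected number of exceedances of 150 µg/m3; the data are simulated, and the study's PA, R2, IA and NAE indicators are not reproduced.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
pm10_max = stats.genextreme.rvs(-0.1, loc=90, scale=25, size=365, random_state=rng)

# Maximum likelihood fit of the GEV to the daily maximum concentrations
c_hat, loc_hat, scale_hat = stats.genextreme.fit(pm10_max)

# Compare observed order statistics with fitted quantiles at plotting positions
obs = np.sort(pm10_max)
pp = (np.arange(1, obs.size + 1) - 0.44) / (obs.size + 0.12)       # Gringorten positions
pred = stats.genextreme.ppf(pp, c_hat, loc=loc_hat, scale=scale_hat)

rmse = np.sqrt(np.mean((pred - obs) ** 2))
mae = np.mean(np.abs(pred - obs))
expected_exceedances = 365 * stats.genextreme.sf(150.0, c_hat, loc=loc_hat, scale=scale_hat)
print(rmse, mae, expected_exceedances)
```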

  5. Study of Solid State Drives performance in PROOF distributed analysis system

    Science.gov (United States)

    Panitkin, S. Y.; Ernst, M.; Petkus, R.; Rind, O.; Wenaus, T.

    2010-04-01

    Solid State Drives (SSDs) are a promising storage technology for High Energy Physics parallel analysis farms. Their combination of low random access time and relatively high read speed is very well suited to situations where multiple jobs concurrently access data located on the same drive. They also have lower energy consumption and higher vibration tolerance than Hard Disk Drives (HDDs), which makes them an attractive choice in many applications ranging from personal laptops to large analysis farms. The Parallel ROOT Facility (PROOF) is a distributed analysis system which allows the inherent event-level parallelism of high energy physics data to be exploited. PROOF is especially efficient together with distributed local storage systems like Xrootd, when data are distributed over computing nodes. In such an architecture the local disk subsystem I/O performance becomes a critical factor, especially when computing nodes use multi-core CPUs. We will discuss our experience with SSDs in the PROOF environment. We will compare the performance of HDDs with SSDs in I/O intensive analysis scenarios. In particular we will discuss PROOF system performance scaling with the number of simultaneously running analysis jobs.

  6. Web Based Distributed Coastal Image Analysis System, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This project develops a Web-based distributed image analysis system that processes Moderate Resolution Imaging Spectroradiometer (MODIS) data to provide decision...

  7. Scaling analysis of meteorite shower mass distributions

    DEFF Research Database (Denmark)

    Oddershede, Lene; Meibom, A.; Bohr, Jakob

    1998-01-01

    Meteorite showers are the remains of extraterrestrial objects which are captured by the gravitational field of the Earth. We have analyzed the mass distribution of fragments from 16 meteorite showers for scaling. The distributions exhibit distinct scaling behavior over several orders of magnitude...; the observed scaling exponents vary from shower to shower. Half of the analyzed showers show a single scaling region while the other half show multiple scaling regimes. Such an analysis can provide knowledge about the fragmentation process and about the original meteoroid. We also suggest comparing... the observed scaling exponents to exponents observed in laboratory experiments and discuss the possibility that one can derive insight into the original shapes of the meteoroids....

  8. Robustness of power systems under a democratic-fiber-bundle-like model.

    Science.gov (United States)

    Yağan, Osman

    2015-06-01

    We consider a power system with N transmission lines whose initial loads (i.e., power flows) L_1, ..., L_N are independent and identically distributed with P_L(x) = P[L ≤ x]. The capacity C_i defines the maximum flow allowed on line i and is assumed to be given by C_i = (1 + α)L_i, with α > 0. We study the robustness of this power system against random attacks (or failures) that target a p fraction of the lines, under a democratic fiber-bundle-like model. Namely, when a line fails, the load it was carrying is redistributed equally among the remaining lines. Our contributions are as follows. (i) We show analytically that the final breakdown of the system always takes place through a first-order transition at the critical attack size p* = 1 − E[L] / max_x{P[L > x](αx + E[L | L > x])}, where E[·] is the expectation operator; (ii) we derive conditions on the distribution P_L(x) for which the first-order breakdown of the system occurs abruptly without any preceding diverging rate of failure; (iii) we provide a detailed analysis of the robustness of the system under three specific load distributions (uniform, Pareto, and Weibull), showing that with the minimum load L_min and mean load E[L] fixed, the Pareto distribution is the worst (in terms of robustness) among the three, whereas the Weibull distribution is the best when its shape parameter is selected relatively large; (iv) we provide numerical results that confirm our mean-field analysis; and (v) we show that p* is maximized when the load distribution is a Dirac delta function centered at E[L], i.e., when all lines carry the same load. This last finding is particularly surprising given that heterogeneity is known to lead to high robustness against random failures in many other systems.
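
    A hedged numerical sketch of the critical attack size expression quoted above: p* is estimated by Monte Carlo for the three load families discussed in the abstract, with arbitrary parameter choices; this is not the paper's code.

```python
import numpy as np

def critical_attack_size(loads, alpha, grid_size=400):
    """Estimate p* = 1 - E[L] / max_x { P[L > x] * (alpha*x + E[L | L > x]) } from samples."""
    loads = np.asarray(loads, dtype=float)
    xs = np.linspace(loads.min(), np.quantile(loads, 0.999), grid_size)
    best = 0.0
    for x in xs:
        tail = loads > x
        if not tail.any():
            break
        best = max(best, tail.mean() * (alpha * x + loads[tail].mean()))
    return 1.0 - loads.mean() / best

rng = np.random.default_rng(3)
alpha = 0.5
samples = {
    "uniform": rng.uniform(1.0, 3.0, 200_000),
    "Pareto": 1.0 + rng.pareto(3.0, 200_000),          # classical Pareto with minimum load 1
    "Weibull": 2.0 * rng.weibull(4.0, 200_000),        # relatively large shape parameter
}
for name, loads in samples.items():
    print(name, round(critical_attack_size(loads, alpha), 3))
```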

  9. Performance optimisations for distributed analysis in ALICE

    CERN Document Server

    Betev, L; Gheata, M; Grigoras, C; Hristov, P

    2014-01-01

    Performance is a critical issue in a production system accommodating hundreds of analysis users. Compared to a local session, distributed analysis is exposed to services and network latencies, remote data access and heterogeneous computing infrastructure, creating a more complex performance and efficiency optimization matrix. During the last 2 years, ALICE analysis shifted from a fast development phase to more mature and stable code. At the same time, the frameworks and tools for deployment, monitoring and management of large productions have evolved considerably too. The ALICE Grid production system is currently used by a fair share of organized and individual user analysis, consuming up to 30% of the available resources and ranging from fully I/O-bound analysis code to CPU intensive correlations or resonances studies. While the intrinsic analysis performance is unlikely to improve by a large factor during the LHC long shutdown (LS1), the overall efficiency of the system has still to be improved by a...

  10. Parametric distribution approach for flow availability in small hydro potential analysis

    Science.gov (United States)

    Abdullah, Samizee; Basri, Mohd Juhari Mat; Jamaluddin, Zahrul Zamri; Azrulhisham, Engku Ahmad; Othman, Jamel

    2016-10-01

    Small hydro systems are one of the important sources of renewable energy and have been recognized worldwide as clean energy sources. The power generated by small hydropower systems, which use the potential energy in flowing water to produce electricity, is often questionable due to its inconsistent and intermittent nature. Potential analysis of a small hydro system, which is mainly dependent on the availability of water, requires knowledge of the water flow or stream flow distribution. This paper presents the possibility of applying the Pearson system to approximate the stream flow availability distribution in small hydro systems. By considering the stochastic nature of stream flow, the Pearson parametric distribution approximation was computed based on the significant characteristic of the Pearson system, which applies a direct correlation between the first four statistical moments of the distribution. The advantage of applying various statistical moments in small hydro potential analysis is the ability to analyze the varying shapes of the stream flow distribution.

  11. The effect of surface corrosion damage on the fatigue life of 6061-T6 aluminum alloy extrusions

    Energy Technology Data Exchange (ETDEWEB)

    Weber, Matthew; Eason, Paul D.; Özdeş, Hüseyin; Tiryakioğlu, Murat, E-mail: m.tiryakioglu@unf.edu

    2017-04-06

    An investigation was performed where 6061-T6 extrusions were exposed to a 3.5% NaCl solution at pH 2 for 2 days and 24 days to create distinct surface flaws. The effect of these flaws on the rotating beam fatigue life was then investigated and analyzed by using Wöhler curves, Weibull statistics and scanning electron microscopy (SEM). It was determined that corrosion damage reduced the fatigue life significantly and specimens corroded for both 2-days and 24-days exhibited similar fatigue lives. Statistical analyses showed that fatigue life of all three datasets followed the 3-parameter Weibull distribution and the difference between the fatigue lives of two corroded datasets was statistically insignificant. Analysis of fracture surfaces showed that sizes of pits that led to fatigue crack initiation were very different in the two corroded datasets. Implications of the similarity in fatigue lives despite disparity in surface condition are discussed in detail in the paper.
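
    A brief sketch of a 3-parameter Weibull fit of the kind used for the fatigue-life statistics above, using scipy's weibull_min with a free location (threshold) parameter; the cycles-to-failure values are synthetic, and threshold fits of this kind can be numerically delicate for small samples.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Hypothetical cycles-to-failure for one dataset (not the paper's measurements)
cycles = 2.0e4 + stats.weibull_min.rvs(1.8, scale=1.5e5, size=30, random_state=rng)

shape, loc, scale = stats.weibull_min.fit(cycles)     # shape = Weibull modulus, loc = threshold
b10 = stats.weibull_min.ppf(0.10, shape, loc=loc, scale=scale)   # life by which 10% have failed
print(f"modulus={shape:.2f}  threshold={loc:.3g}  characteristic life={scale:.3g}  B10={b10:.3g}")
```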

  12. Reliability analysis and optimisation of subsea compression system facing operational covariate stresses

    International Nuclear Information System (INIS)

    Okaro, Ikenna Anthony; Tao, Longbin

    2016-01-01

    This paper proposes an enhanced Weibull-Corrosion Covariate model for reliability assessment of a system facing operational stresses. The newly developed model is applied to a Subsea Gas Compression System planned for offshore West Africa to predict its reliability index. System technical failure was modelled by developing a Weibull failure model incorporating a physically tested corrosion profile as stress, in order to quantify the survival rate of the system under additional operational covariates including marine pH, temperature and pressure. Using Reliability Block Diagrams and enhanced Fussell-Vesely formulations, the whole system was systematically decomposed into sub-systems to analyse the criticality of each component and optimise them. Human reliability was addressed using an enhanced barrier weighting method. A rapid degradation curve is obtained for a subsea system, relative to the base case, when subjected to a time-dependent corrosion stress factor. It reveals that subsea system components failed faster than their mean-time-to-failure specifications from the Offshore Reliability Database as a result of the cumulative exertion of marine stresses. The case study demonstrated that the reliability of a subsea system can be systematically optimised by modelling the system under higher technical and organisational stresses, prioritising the critical sub-systems and making befitting provisions for redundancy and tolerances. - Highlights: • Novel Weibull Corrosion-Covariate model for reliability analysis of subsea assets. • Predicts the accelerated degradation profile of a subsea gas compression system. • An enhanced optimisation method based on the Fussell-Vesely decomposition process. • New optimisation approach for smoothing of over- and under-designed components. • Demonstrated a significant improvement in producing more realistic failure rates.

  13. Distributed Analysis Experience using Ganga on an ATLAS Tier2 infrastructure

    International Nuclear Information System (INIS)

    Fassi, F.; Cabrera, S.; Vives, R.; Fernandez, A.; Gonzalez de la Hoz, S.; Sanchez, J.; March, L.; Salt, J.; Kaci, M.; Lamas, A.; Amoros, G.

    2007-01-01

    The ATLAS detector will explore the high-energy frontier of Particle Physics by collecting the proton-proton collisions delivered by the LHC (Large Hadron Collider). Starting in spring 2008, the LHC will produce more than 10 petabytes of data per year. The tiered hierarchy adopted for the LHC computing model is: Tier-0 (CERN) plus Tier-1 and Tier-2 centres distributed around the world. The ATLAS Distributed Analysis (DA) system has the goal of enabling physicists to perform Grid-based analysis on distributed data using distributed computing resources. The IFIC Tier-2 facility is participating in several aspects of DA. In support of the ATLAS DA activities a prototype is being tested, deployed and integrated. The analysis data processing applications are based on the Athena framework. GANGA, developed by the LHCb and ATLAS experiments, allows simple switching between testing on a local batch system and large-scale processing on the Grid, hiding Grid complexities. GANGA provides physicists with an integrated environment for job preparation, bookkeeping and archiving, and job splitting and merging. The experience with the deployment, configuration and operation of the DA prototype will be presented. Experience gained in using the DA system and GANGA in top physics analysis will be described. (Author)

  14. HammerCloud: A Stress Testing System for Distributed Analysis

    International Nuclear Information System (INIS)

    Ster, Daniel C van der; García, Mario Úbeda; Paladin, Massimo; Elmsheuser, Johannes

    2011-01-01

    Distributed analysis of LHC data is an I/O-intensive activity which places large demands on the internal network, storage, and local disks at remote computing facilities. Commissioning and maintaining a site to provide an efficient distributed analysis service is therefore a challenge which can be aided by tools to help evaluate a variety of infrastructure designs and configurations. HammerCloud is one such tool; it is a stress testing service which is used by central operations teams, regional coordinators, and local site admins to (a) submit an arbitrary number of analysis jobs to a number of sites, (b) maintain at a steady state a predefined number of jobs running at the sites under test, (c) produce web-based reports summarizing the efficiency and performance of the sites under test, and (d) present a web-interface for historical test results to both evaluate progress and compare sites. HammerCloud was built around the distributed analysis framework Ganga, exploiting its API for grid job management. HammerCloud has been employed by the ATLAS experiment for continuous testing of many sites worldwide, and also during large scale computing challenges such as STEP'09 and UAT'09, where the scale of the tests exceeded 10,000 concurrently running and 1,000,000 total jobs over multi-day periods. In addition, HammerCloud is being adopted by the CMS experiment; the plugin structure of HammerCloud allows the execution of CMS jobs using their official tool (CRAB).

  15. Flexural strength and the probability of failure of cold isostatic pressed zirconia core ceramics.

    Science.gov (United States)

    Siarampi, Eleni; Kontonasaki, Eleana; Papadopoulou, Lambrini; Kantiranis, Nikolaos; Zorba, Triantafillia; Paraskevopoulos, Konstantinos M; Koidis, Petros

    2012-08-01

    The flexural strength of zirconia core ceramics must predictably withstand the high stresses developed during oral function. The in-depth interpretation of strength parameters and the probability of failure during clinical performance could assist the clinician in selecting the optimum materials while planning treatment. The purpose of this study was to evaluate the flexural strength, based on survival probability and Weibull statistical analysis, of 2 zirconia cores for ceramic restorations. Twenty bar-shaped specimens were milled from 2 core ceramics, IPS e.max ZirCAD and Wieland ZENO Zr (WZ), and were loaded until fracture according to ISO 6872 (3-point bending test). An independent samples t test was used to assess significant differences of fracture strength (α=.05). Weibull statistical analysis of the flexural strength data provided 2 parameter estimates: the Weibull modulus (m) and the characteristic strength (σ0). The fractured surfaces of the specimens were evaluated by scanning electron microscopy (SEM) and energy dispersive spectroscopy (EDS). The investigation of the crystallographic state of the materials was performed with x-ray diffraction analysis (XRD) and Fourier transform infrared (FTIR) spectroscopy. A higher mean flexural strength was found for the WZ ceramics. Both groups primarily sustained the tetragonal phase of zirconia and a negligible amount of the monoclinic phase. Although both zirconia ceramics presented similar fractographic and crystallographic properties, the higher flexural strength of WZ ceramics was associated with a lower m and more voids in their microstructure. These findings suggest a greater scattering of strength values and a flaw distribution that are expected to increase failure probability. Copyright © 2012 The Editorial Council of the Journal of Prosthetic Dentistry. Published by Mosby, Inc. All rights reserved.

  16. How Many Model Evaluations Are Required To Predict The AEP Of A Wind Power Plant?

    DEFF Research Database (Denmark)

    Murcia Leon, Juan Pablo; Réthoré, Pierre-Elouan; Natarajan, Anand

    2015-01-01

    (AEP) predictions expensive. The objective of the present paper is to minimize the number of model evaluations required to capture the wind power plant's AEP using stationary wind farm flow models. Polynomial chaos techniques are proposed based on an arbitrary Weibull-distributed wind speed and a von Mises-distributed wind direction. The correlation between wind direction and wind speed is captured by defining the Weibull parameters as functions of wind direction. In order to evaluate the accuracy of these methods, the expectation and variance of the wind farm power distributions are compared against the traditional binning method with trapezoidal and Simpson's integration rules. The wind farm flow model used in this study is the semi-empirical wake model developed by Larsen [1]. Three test cases are studied: a single turbine, a simple and a real offshore wind power plant. A reduced number of model...

  17. The cost of electricity distribution in Italy: a quantitative analysis

    International Nuclear Information System (INIS)

    Scarpa, C.

    1998-01-01

    This paper presents a quantitative analysis of the cost of medium and low tension electricity distribution in Italy. An econometric analysis of the cost function is proposed, on the basis of data on 147 zones of the dominant firm, ENEL. Data are available only for 1996, which forced us to carry out only a cross-section OLS analysis. The econometric estimate shows the existence of significant scale economies that the current organisational structure does not exploit. On this basis it is also possible to check to what extent exogenous cost drivers affect costs. The role of the numerous exogenous factors considered seems, however, quite limited. The area of the distribution zone and an indicator of quality are the only elements that appear significant from an economic viewpoint

  18. Calling patterns in human communication dynamics.

    Science.gov (United States)

    Jiang, Zhi-Qiang; Xie, Wen-Jie; Li, Ming-Xia; Podobnik, Boris; Zhou, Wei-Xing; Stanley, H Eugene

    2013-01-29

    Modern technologies not only provide a variety of communication modes (e.g., texting, cell phone conversation, and online instant messaging), but also detailed electronic traces of these communications between individuals. These electronic traces indicate that the interactions occur in temporal bursts. Here, we study the intercall durations of communications of the 100,000 most active cell phone users of a Chinese mobile phone operator. We confirm that the intercall durations follow a power-law distribution with an exponential cutoff at the population level but find differences when focusing on individual users. We apply statistical tests at the individual level and find that the intercall durations follow a power-law distribution for only 3,460 individuals (3.46%). The intercall durations for the majority (73.34%) follow a Weibull distribution. We quantify individual users using three measures: out-degree, percentage of outgoing calls, and communication diversity. We find that the cell phone users with a power-law duration distribution fall into three anomalous clusters: robot-based callers, telecom fraud, and telephone sales. This information is of interest to both academics and practitioners, mobile telecom operators in particular. In contrast, the individual users with a Weibull duration distribution form the fourth cluster of ordinary cell phone users. We also discover more information about the calling patterns of these four clusters (e.g., the probability that a user will call the c_r-th most contacted person and the probability distribution of burst sizes). Our findings may enable a more detailed analysis of the huge body of data contained in the logs of massive numbers of users.
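
    A per-user comparison in the spirit of the individual-level tests described above: fit a Weibull (location fixed at zero) and a pure power law above the sample minimum by maximum likelihood and compare them by AIC. The durations are synthetic, and the paper's actual statistical testing procedure is not reproduced here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
durations = stats.weibull_min.rvs(0.6, scale=3000.0, size=2000, random_state=rng) + 1.0  # seconds

# Weibull MLE with the location fixed at zero
k, _, lam = stats.weibull_min.fit(durations, floc=0)
ll_weibull = stats.weibull_min.logpdf(durations, k, scale=lam).sum()

# Continuous power-law MLE above x_min (Clauset-style estimate of the exponent)
x_min = durations.min()
alpha = 1.0 + durations.size / np.log(durations / x_min).sum()
ll_power = (np.log((alpha - 1.0) / x_min) - alpha * np.log(durations / x_min)).sum()

aic_weibull = 2 * 2 - 2 * ll_weibull      # two free parameters
aic_power = 2 * 1 - 2 * ll_power          # one free parameter (x_min fixed at the minimum)
print("Weibull preferred" if aic_weibull < aic_power else "power law preferred")
```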

  19. An extensive evaluation of wind resource using new methods and strategies for development and utilizing wind power in Mah-shahr station in Iran

    International Nuclear Information System (INIS)

    Nedaei, Mojtaba; Assareh, Ehsanolah; Biglari, Mojtaba

    2014-01-01

    Highlights: • In this research, results suggest that the Weibull distribution is the best function to model the wind data in Mah-shahr. • The compatibility of the wind data with three methods of estimating the Weibull parameters was assessed. • An extreme analysis of wind in Mah-shahr was done to determine the maximum gust wind speeds over a 50-year return period. • An extensive economic evaluation of installing a wind park was performed using RETScreen® software. - Abstract: In this study, the short-term wind speed data measured at 10-min intervals at heights of 10 m and 40 m at the Mah-shahr station in Iran were statistically analyzed to determine the potential of wind power generation. In addition, different distribution functions are compared in order to find the distribution that best fits the wind data in Mah-shahr. It is found that the Weibull distribution is the best function to model the wind data in Mah-shahr. Different methods for calculating the Weibull parameters were also used and compared. Moreover, an extreme analysis of the wind data in Mah-shahr was carried out to determine the maximum gust wind speed over a 50-year return period (Ve50). Using the method of "periodic maxima" for the analysis of extreme winds, it was revealed that the 50-year wind speed at 80 m in Mah-shahr, which is 39.5 m/s, is lower than the usual Ve50 limits for different wind turbines. This means that the risk of extreme wind gusts in Mah-shahr might not be a problem for the installation of wind turbines. From the primary evaluation of the wind data it is found that the studied site has an acceptable wind power potential for electricity generation. The energy production of different wind turbines at different heights is determined. A simple economic evaluation was then carried out to determine whether the studied site is suitable for the development of commercial or small residential wind turbines. It became clear
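
    A minimal "periodic maxima" sketch for a Ve50-style estimate of the kind mentioned above: a method-of-moments Gumbel fit to periodic maximum gusts and the corresponding 50-period return level. The maxima below are invented for illustration and do not come from the Mah-shahr record.

```python
import numpy as np

# Hypothetical periodic (e.g. monthly) maximum gust speeds at 80 m, in m/s
maxima = np.array([24.1, 27.6, 22.9, 30.4, 26.2, 28.8, 25.5, 23.7,
                   29.9, 26.8, 31.2, 27.0, 25.1, 28.3, 24.8, 26.5])

# Method-of-moments Gumbel fit
beta = np.sqrt(6.0) * maxima.std(ddof=1) / np.pi
mu = maxima.mean() - 0.5772 * beta

T = 50.0                                              # return period, in the same units as the maxima
v_T = mu - beta * np.log(-np.log(1.0 - 1.0 / T))      # speed exceeded on average once per T periods
print(v_T)
```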

  20. User-friendly Tool for Power Flow Analysis and Distributed ...

    African Journals Online (AJOL)

    Akorede


  1. Análisis cinético y estadístico de la precipitación en una aleación de Cu-0,49Co-0,44Ti, mediante microcalorimetría y medidas de microdureza

    Directory of Open Access Journals (Sweden)

    Donoso, E.

    2009-12-01

    Full Text Available Starting with a solid solution of Cu-0,49Co-0,44Ti quenched from 1173 K, the kinetics of precipitation of cobalt and titanium atoms was studied by means of differential scanning calorimetry (DSC). The analysis of the calorimetric curves shows the presence of an exothermic reaction that is attributed to the formation of CoTi particles in the copper matrix. The activation energy of the reaction was estimated by means of a modified Kissinger method. The kinetic parameters were estimated using the Johnson-Mehl-Avrami formalism. In addition, a statistical analysis of the precipitation process was performed through Vickers microhardness measurements, employing a Weibull probability distribution function. The Weibull parameters were estimated by the least-squares method. The goodness of fit was analyzed using the chi-square test at a 95 percent confidence level. For the same annealing temperature, the Weibull modulus increases with increasing aging time, which may be attributed to precipitation of the CoTi phase.
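
    A small sketch of the least-squares (probability plot) Weibull estimation mentioned in the abstract, using median-rank plotting positions on the linearised CDF; the microhardness values are illustrative, not the measured Cu-Co-Ti data.

```python
import numpy as np

hv = np.sort(np.array([118., 121., 124., 126., 128., 129., 131., 133., 135., 138.]))  # HV, illustrative
n = hv.size
F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)          # Bernard's median ranks

# Linearised Weibull CDF: ln(-ln(1 - F)) = m*ln(x) - m*ln(x0)
y = np.log(-np.log(1.0 - F))
x = np.log(hv)
m, intercept = np.polyfit(x, y, 1)
x0 = np.exp(-intercept / m)                          # characteristic value
print(f"Weibull modulus m = {m:.1f}, characteristic hardness = {x0:.1f} HV")
```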

  2. Molecular Isotopic Distribution Analysis (MIDAs) with adjustable mass accuracy.

    Science.gov (United States)

    Alves, Gelio; Ogurtsov, Aleksey Y; Yu, Yi-Kuo

    2014-01-01

    In this paper, we present Molecular Isotopic Distribution Analysis (MIDAs), a new software tool designed to compute molecular isotopic distributions with adjustable accuracies. MIDAs offers two algorithms, one polynomial-based and one Fourier-transform-based, both of which compute molecular isotopic distributions accurately and efficiently. The polynomial-based algorithm contains few novel aspects, whereas the Fourier-transform-based algorithm consists mainly of improvements to other existing Fourier-transform-based algorithms. We have benchmarked the performance of the two algorithms implemented in MIDAs with that of eight software packages (BRAIN, Emass, Mercury, Mercury5, NeutronCluster, Qmass, JFC, IC) using a consensus set of benchmark molecules. Under the proposed evaluation criteria, MIDAs's algorithms, JFC, and Emass compute with comparable accuracy the coarse-grained (low-resolution) isotopic distributions and are more accurate than the other software packages. For fine-grained isotopic distributions, we compared IC, MIDAs's polynomial algorithm, and MIDAs's Fourier transform algorithm. Among the three, IC and MIDAs's polynomial algorithm compute isotopic distributions that better resemble their corresponding exact fine-grained (high-resolution) isotopic distributions. MIDAs can be accessed freely through a user-friendly web-interface at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/midas/index.html.
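
    As an illustration of the polynomial idea behind coarse-grained (nominal-mass) isotopic distributions, the sketch below repeatedly convolves per-element abundance vectors; the isotope abundances are rounded standard values, glucose is just an example molecule, and this is not the MIDAs implementation itself. Exponentiation by squaring or FFT-based convolution would be the natural speed-ups for large molecules.

```python
import numpy as np

# Relative abundances indexed by nominal-mass offset from the lightest isotope (approximate values)
ISOTOPES = {
    "C": np.array([0.9893, 0.0107]),
    "H": np.array([0.999885, 0.000115]),
    "O": np.array([0.99757, 0.00038, 0.00205]),
}

def coarse_isotopic_distribution(formula):
    """formula: dict element -> atom count, e.g. glucose C6H12O6."""
    dist = np.array([1.0])
    for element, count in formula.items():
        for _ in range(count):
            dist = np.convolve(dist, ISOTOPES[element])   # polynomial multiplication
    return dist / dist.sum()

dist = coarse_isotopic_distribution({"C": 6, "H": 12, "O": 6})
for offset, abundance in enumerate(dist[:5]):
    print(f"M+{offset}: {abundance:.5f}")
```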

  3. Sensitivity Analysis of Dynamic Tariff Method for Congestion Management in Distribution Networks

    DEFF Research Database (Denmark)

    Huang, Shaojun; Wu, Qiuwei; Liu, Zhaoxi

    2015-01-01

    The dynamic tariff (DT) method is designed for the distribution system operator (DSO) to alleviate the congestion that might occur in a distribution network with high penetration of distributed energy resources (DERs). Sensitivity analysis of the DT method is crucial because of its decentralized... control manner. The sensitivity analysis can obtain the changes of the optimal energy planning, and thereby of the line loading profiles, with respect to infinitesimally small changes of parameters by differentiating the KKT conditions of the convex quadratic programming problem on which the DT method is formed. Three case...

  4. Beyond reliability, multi-state failure analysis of satellite subsystems: A statistical approach

    International Nuclear Information System (INIS)

    Castet, Jean-Francois; Saleh, Joseph H.

    2010-01-01

    Reliability is widely recognized as a critical design attribute for space systems. In recent articles, we conducted nonparametric analyses and Weibull fits of satellite and satellite subsystems reliability for 1584 Earth-orbiting satellites launched between January 1990 and October 2008. In this paper, we extend our investigation of failures of satellites and satellite subsystems beyond the binary concept of reliability to the analysis of their anomalies and multi-state failures. In reliability analysis, the system or subsystem under study is considered to be either in an operational or failed state; multi-state failure analysis introduces 'degraded states' or partial failures, and thus provides more insights through finer resolution into the degradation behavior of an item and its progression towards complete failure. The database used for the statistical analysis in the present work identifies five states for each satellite subsystem: three degraded states, one fully operational state, and one failed state (complete failure). Because our dataset is right-censored, we calculate the nonparametric probability of transitioning between states for each satellite subsystem with the Kaplan-Meier estimator, and we derive confidence intervals for each probability of transitioning between states. We then conduct parametric Weibull fits of these probabilities using the Maximum Likelihood Estimation (MLE) approach. After validating the results, we compare the reliability versus multi-state failure analyses of three satellite subsystems: the thruster/fuel; the telemetry, tracking, and control (TTC); and the gyro/sensor/reaction wheel subsystems. The results are particularly revealing of the insights that can be gleaned from multi-state failure analysis and the deficiencies, or blind spots, of the traditional reliability analysis. In addition to the specific results provided here, which should prove particularly useful to the space industry, this work highlights the importance
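
    A minimal Kaplan-Meier sketch for right-censored transition times of the kind described above; the times and censoring flags are invented, and tied times are handled only in the simplest way.

```python
import numpy as np

def kaplan_meier(times, observed):
    """Kaplan-Meier survival estimate from right-censored data.
    times: time to transition or censoring; observed: True if the transition was seen."""
    times = np.asarray(times, dtype=float)
    observed = np.asarray(observed, dtype=bool)
    order = np.argsort(times)
    times, observed = times[order], observed[order]
    event_times, survival = [], []
    s, n_at_risk = 1.0, times.size
    for t, seen in zip(times, observed):
        if seen:
            s *= (n_at_risk - 1) / n_at_risk
            event_times.append(t)
            survival.append(s)
        n_at_risk -= 1
    return np.array(event_times), np.array(survival)

# Years to first degraded state for a hypothetical subsystem; False = still fully operational
t = [0.5, 1.2, 2.0, 3.1, 4.0, 5.5, 6.0, 7.2, 8.0, 9.5]
seen = [True, False, True, True, False, True, False, False, True, False]
print(kaplan_meier(t, seen))
```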

  5. Analysis of rainfall distribution in Kelantan river basin, Malaysia

    Science.gov (United States)

    Che Ros, Faizah; Tosaka, Hiroyuki

    2018-03-01

    Using rain gauges on their own as input carries great uncertainty regarding runoff estimation, especially when the area is large and the rainfall is measured and recorded at irregularly spaced gauging stations. Hence spatial interpolation is the key to obtaining a continuous and orderly rainfall distribution at ungauged points as input to the rainfall-runoff processes for distributed and semi-distributed numerical modelling. It is crucial to study and predict the behaviour of rainfall and river runoff to reduce flood damage in the affected area along the Kelantan river. Thus, a good knowledge of the rainfall distribution is essential in early flood prediction studies. Forty-six rainfall stations and their daily time-series were used to interpolate gridded rainfall surfaces using the inverse-distance weighting (IDW) and inverse-distance and elevation weighting (IDEW) methods and the average rainfall distribution. A sensitivity analysis for the distance and elevation parameters was conducted to see the variation produced. The accuracy of these interpolated datasets was examined using cross-validation assessment.
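
    A plain inverse-distance-weighting sketch of the IDW step named above; station coordinates and rainfall values are invented, and the elevation-weighted variant (IDEW) is omitted.

```python
import numpy as np

def idw(station_xy, station_rain, grid_xy, power=2.0):
    """Inverse-distance-weighted rainfall at grid points from gauge observations."""
    d = np.linalg.norm(grid_xy[:, None, :] - station_xy[None, :, :], axis=2)
    d = np.maximum(d, 1e-9)                   # guard against a grid point sitting on a gauge
    w = 1.0 / d**power
    return (w * station_rain).sum(axis=1) / w.sum(axis=1)

stations = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])   # km, illustrative
rain = np.array([12.0, 30.0, 18.0, 25.0])                                    # mm/day, illustrative
grid = np.array([[2.0, 3.0], [5.0, 5.0], [8.0, 7.0]])
print(idw(stations, rain, grid))
```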

  6. Theory model and experiment research about the cognition reliability of nuclear power plant operators

    International Nuclear Information System (INIS)

    Fang Xiang; Zhao Bingquan

    2000-01-01

    In order to improve the reliability of NPP operation, simulation research on the reliability of nuclear power plant operators is needed. Using a nuclear power plant simulator as the research platform, and taking the present international reliability research model, human cognition reliability (HCR), as a reference, part of the model was modified according to the actual status of Chinese nuclear power plant operators, and a research model for Chinese nuclear power plant operators was obtained based on the two-parameter Weibull distribution. Experiments on the reliability of nuclear power plant operators were carried out using the two-parameter Weibull research model, and the results agree with those obtained elsewhere in the world. The research would be beneficial to the operational safety of nuclear power plants

  7. Effect of nonlinear stress-strain relationship on bending strength of isotropic graphite

    International Nuclear Information System (INIS)

    Arai, Taketoshi; Oku, Tatsuo

    1978-05-01

    Four-point bending tests were made on rectangular isotropic 7477PT graphite specimens of different sizes to observe the relation between load and outermost fiber strain. Analytical methods, allowing for nonlinear stress-strain relationships that differ between tension and compression, were developed for calculating the fiber stress distribution in a beam and the failure probability based on the Weibull statistical theory for bending fracture. With increasing stress, the stress-strain curves for tension deviate from linearity and also from those for compression. The true bending strengths of the rectangular bars are 10-20 percent lower than the elastic bending strengths. The revised Weibull theory gives failure probability distributions that agree with the measured ones better than the theory based on elastic behavior. (auth.)

  8. Silicon Bipolar Distributed Oscillator Design and Analysis | Aku ...

    African Journals Online (AJOL)

    The design and analysis of a high frequency silicon bipolar oscillator using a common emitter (CE) configuration with distributed output is carried out. The general condition for oscillation and the resulting analytical expressions for the frequency of oscillators were reviewed. Transmission line design was carried out using Butterworth LC ...

  9. On process capability and system availability analysis of the inverse Rayleigh distribution

    Directory of Open Access Journals (Sweden)

    Sajid Ali

    2015-04-01

    Full Text Available In this article, process capability and system availability analysis are discussed for the inverse Rayleigh lifetime distribution. A Bayesian approach with a conjugate gamma distribution is adopted for the analysis. Different types of loss functions are considered to find the Bayes estimates of process capability and system availability. A simulation study is conducted to compare the different loss functions.
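
    A minimal sketch of the conjugate gamma update, assuming the inverse Rayleigh parametrization f(x; λ) = (2λ/x³)·exp(−λ/x²); under that form the posterior for λ stays gamma, and the Bayes estimate under squared-error loss is the posterior mean. The hyperparameters and data are illustrative, and only one of the loss functions mentioned in the abstract is shown.

```python
import numpy as np

rng = np.random.default_rng(6)
lam_true = 4.0
x = np.sqrt(lam_true / rng.exponential(1.0, 50))   # inverse Rayleigh samples: lam / X^2 ~ Exp(1)

a0, b0 = 2.0, 1.0                                  # gamma prior hyperparameters (assumed)
a_post = a0 + x.size
b_post = b0 + np.sum(1.0 / x**2)

lam_sel = a_post / b_post                          # Bayes estimate under squared-error loss
print(lam_sel)
```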

  10. AND LANDSCAPE-ECOLOGICAL ANALYSIS OF ITS DISTRIBUTION

    OpenAIRE

    S. M. Musaeva

    2012-01-01

    The article is devoted to the study of the helminth fauna of the striped lizard in the Lankaran natural region. A landscape and ecological analysis of the distribution of the helminth fauna is provided. As a result of studies on 99 individuals of the striped lizard, a total of 14 helminth species were found, including 1 trematode species, 1 cestode species, 3 acanthocephalan species and 9 nematode species.

  11. AND LANDSCAPE-ECOLOGICAL ANALYSIS OF ITS DISTRIBUTION

    Directory of Open Access Journals (Sweden)

    S. M. Musaeva

    2012-01-01

    Full Text Available The article is devoted to the study of the helminth fauna of the striped lizard in the Lankaran natural region. A landscape and ecological analysis of the distribution of the helminth fauna is provided. As a result of studies on 99 individuals of the striped lizard, a total of 14 helminth species were found, including 1 trematode species, 1 cestode species, 3 acanthocephalan species and 9 nematode species.

  12. Analytical method for optimization of maintenance policy based on available system failure data

    International Nuclear Information System (INIS)

    Coria, V.H.; Maximov, S.; Rivas-Dávalos, F.; Melchor, C.L.; Guardado, J.L.

    2015-01-01

    An analytical optimization method for a preventive maintenance (PM) policy with minimal repair at failure, periodic maintenance, and replacement is proposed for systems with historical failure time data influenced by a current PM policy. The method includes a new imperfect PM model based on the Weibull distribution and incorporates the current maintenance interval T0 and the optimal maintenance interval T to be found. The Weibull parameters are analytically estimated using maximum likelihood estimation. Based on this model, the optimal number of PM actions and the optimal maintenance interval for minimizing the expected cost over an infinite time horizon are also analytically determined. A number of examples involving different failure time data and current maintenance intervals are presented to analyze how the proposed analytical optimization method for the periodic PM policy performs in response to changes in the distribution of the failure data and the current maintenance interval. - Highlights: • An analytical optimization method for preventive maintenance (PM) policy is proposed. • A new imperfect PM model is developed. • The Weibull parameters are analytically estimated using maximum likelihood. • The optimal maintenance interval and number of PM actions are also analytically determined. • The model is validated by several numerical examples
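
    A small sketch of an analytic-style Weibull maximum likelihood fit of the kind the method builds on: the shape parameter solves a one-dimensional equation and the scale then follows in closed form. The failure times are synthetic, and the imperfect-PM extension and cost optimisation of the paper are not reproduced.

```python
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(7)
t = rng.weibull(2.2, 40) * 1500.0                  # hypothetical times between failures (hours)
log_t = np.log(t)

# MLE shape equation: sum(t^k ln t)/sum(t^k) - 1/k - mean(ln t) = 0
def shape_eq(k):
    tk = t**k
    return (tk * log_t).sum() / tk.sum() - 1.0 / k - log_t.mean()

k_hat = brentq(shape_eq, 0.1, 20.0)
lam_hat = np.mean(t**k_hat) ** (1.0 / k_hat)       # scale parameter
print(k_hat, lam_hat)
```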

  13. Asymptotic Ergodic Capacity Analysis of Composite Lognormal Shadowed Channels

    KAUST Repository

    Ansari, Imran Shafique

    2015-05-01

    Capacity analysis of composite lognormal (LN) shadowed links, such as Rician-LN, Gamma-LN, and Weibull-LN, is addressed in this work. More specifically, an exact closed-form expression for the moments of the end-to-end signal-to-noise ratio (SNR) of a single composite link transmission system is presented in terms of well-known elementary functions. Capitalizing on these new moment expressions, we present asymptotically tight lower bounds for the ergodic capacity at high SNR. All the presented results are verified via computer-based Monte-Carlo simulations. © 2015 IEEE.

  14. Asymptotic Ergodic Capacity Analysis of Composite Lognormal Shadowed Channels

    KAUST Repository

    Ansari, Imran Shafique; Alouini, Mohamed-Slim

    2015-01-01

    Capacity analysis of composite lognormal (LN) shadowed links, such as Rician-LN, Gamma-LN, and Weibull-LN, is addressed in this work. More specifically, an exact closed-form expression for the moments of the end-to-end signal-to-noise ratio (SNR) of a single composite link transmission system is presented in terms of well-known elementary functions. Capitalizing on these new moment expressions, we present asymptotically tight lower bounds for the ergodic capacity at high SNR. All the presented results are verified via computer-based Monte-Carlo simulations. © 2015 IEEE.

  15. Residual stress distribution analysis of heat treated APS TBC using image based modelling.

    Science.gov (United States)

    Li, Chun; Zhang, Xun; Chen, Ying; Carr, James; Jacques, Simon; Behnsen, Julia; di Michiel, Marco; Xiao, Ping; Cernik, Robert

    2017-08-01

    We carried out a residual stress distribution analysis in an APS TBC throughout the depth of the coatings. The samples were heat treated at 1150 °C for 190 h, and the data analysis used image-based modelling built on real 3D images measured by computed tomography (CT). The stress distribution in several 2D slices from the 3D model is included in this paper, as well as the stress distribution along several paths shown on the slices. Our analysis can explain the occurrence of the "jump" features near the interface between the top coat and the bond coat. These features in the residual stress distribution trend were measured (as a function of depth) by high-energy synchrotron XRD (as shown in our related research article entitled 'Understanding the Residual Stress Distribution through the Thickness of Atmosphere Plasma Sprayed (APS) Thermal Barrier Coatings (TBCs) by high energy Synchrotron XRD; Digital Image Correlation (DIC) and Image Based Modelling') (Li et al., 2017) [1].

  16. Analysis and Comparison of Typical Models within Distribution Network Design

    DEFF Research Database (Denmark)

    Jørgensen, Hans Jacob; Larsen, Allan; Madsen, Oli B.G.

    Efficient and cost-effective transportation and logistics plays a vital role in the supply chains of the modern world's manufacturers. Global distribution of goods is a very complicated matter as it involves many distinct planning problems. The focus of this presentation is to demonstrate a number of important issues which have been identified when addressing the Distribution Network Design problem from a modelling angle. More specifically, we present an analysis of the research which has been performed in utilizing operational research in developing and optimising distribution systems.

  17. An improved algorithm for connectivity analysis of distribution networks

    International Nuclear Information System (INIS)

    Kansal, M.L.; Devi, Sunita

    2007-01-01

    In the present paper, an efficient algorithm for connectivity analysis of moderately sized distribution networks is suggested. The algorithm is based on the generation of all possible minimal system cutsets. It is efficient as it identifies only the necessary and sufficient conditions for system failure in n-out-of-n type distribution networks. The proposed algorithm is demonstrated with the help of saturated and unsaturated distribution networks. Its computational efficiency is justified by comparing the computational effort with that of the previously suggested appended spanning tree (AST) algorithm. The proposed technique has the added advantage that it can be utilized to generate system inequalities, which is useful in the reliability estimation of capacitated networks.
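
    A brute-force sketch of minimal-cutset enumeration between a source and a sink on a small example network; this is not the AST-based method or the proposed algorithm, only an illustration of what a minimal system cutset is. Node and edge names are hypothetical.

```python
from itertools import combinations

def connected(nodes, edges, s, t):
    """Depth-first reachability check: is t reachable from s using the given edges?"""
    adj = {n: set() for n in nodes}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    seen, stack = {s}, [s]
    while stack:
        u = stack.pop()
        if u == t:
            return True
        for w in adj[u] - seen:
            seen.add(w)
            stack.append(w)
    return False

def minimal_cutsets(nodes, edges, s, t):
    """Enumerate minimal edge cutsets separating s from t (brute force, small networks only)."""
    cutsets = []
    for size in range(1, len(edges) + 1):
        for cand in combinations(edges, size):
            cand_set = set(cand)
            if any(c <= cand_set for c in cutsets):
                continue                      # contains a smaller cutset: not minimal
            remaining = [e for e in edges if e not in cand_set]
            if not connected(nodes, remaining, s, t):
                cutsets.append(cand_set)
    return cutsets

# Hypothetical small distribution network: source S feeds sink T through a loop.
nodes = ["S", "A", "B", "T"]
edges = [("S", "A"), ("S", "B"), ("A", "T"), ("B", "T"), ("A", "B")]
for cs in minimal_cutsets(nodes, edges, "S", "T"):
    print(sorted(cs))
```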

  18. Risk analysis for a local gas distribution network

    International Nuclear Information System (INIS)

    Peters, J.W.

    1991-01-01

    Cost control and service reliability are popular topics when discussing strategic issues facing local distribution companies (LDCs) in the 1990s. The ability to provide secure and uninterrupted gas service is crucial for growth and company image, both with the public and with regulatory agencies. At the same time, the industry is facing unprecedented competition from alternate fuels, and cost control is essential for maintaining a competitive edge in the market. On the surface, it would appear that cost control and service reliability are contradictory terms. Improvement in service reliability should cost something, or does it? Risk analysis can provide the answer from a distribution design perspective. From a gas distribution engineer's perspective, projects such as loops, backfeeds and even valve placement are designed to reduce, minimize and/or eliminate potential customer outages. These projects improve service reliability by acting as backups should a failure occur on a component of the distribution network. These contingency projects are cost-effective, but their long-term benefit or true value is open to question. Their purpose is to maintain supply to an area of the distribution network in the event of a failure somewhere else. Two phrases, "potential customer outages" and "in the event of failure", identify uncertainty.

  19. Aftershocks: an attempted analysis of their frequency and severity in view of risk assessment

    International Nuclear Information System (INIS)

    Mohammadioun, G.

    1985-08-01

    A catalogue comprising some 30,000 strong and moderate-sized earthquakes together with their associated aftershocks, collected worldwide, has been established for the purpose of studying time and magnitude distributions with engineering applications in mind. An attempt is made to model this database using, notably, Poisson and Weibull relations. Preliminary results include regional variations in a coefficient evaluated by means of the magnitude law, and the computed probability of occurrence of an aftershock of given magnitude within a given time after the main shock. A relationship is likewise shown to exist between maximum-density magnitudes and the maximum-density time intervals separating individual events. 13 refs., 3 figs
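
    A sketch of the Weibull part of such a study: fitting a Weibull distribution to aftershock waiting times and reading off the probability of an aftershock within a given time after the main shock. The waiting-time data here are synthetic stand-ins, not the 30,000-event catalogue.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Hypothetical waiting times (days) between a main shock and its aftershocks.
waiting_days = rng.weibull(0.8, 200) * 12.0   # synthetic stand-in for catalogue data

# Fit a Weibull distribution to the waiting times (location fixed at zero).
shape, _, scale = stats.weibull_min.fit(waiting_days, floc=0.0)
print(f"Weibull shape = {shape:.2f}, scale = {scale:.2f} days")

# Probability that an aftershock occurs within t days of the main shock.
for t in (1, 7, 30, 90):
    p = stats.weibull_min.cdf(t, shape, loc=0.0, scale=scale)
    print(f"P(waiting time <= {t:3d} d) = {p:.3f}")
```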

  20. A Script Analysis of the Distribution of Counterfeit Alcohol Across Two European Jurisdictions

    OpenAIRE

    Lord, Nicholas; Spencer, Jonathan; Bellotti, Elisa; Benson, Katie

    2017-01-01

    This article presents a script analysis of the distribution of counterfeit alcohols across two European jurisdictions. Based on an analysis of case file data from a European regulator and interviews with investigators, the article deconstructs the organisation of the distribution of the alcohol across jurisdictions into five scenes (collection, logistics, delivery, disposal, proceeds/finance) and analyses the actual (or likely permutations of) behaviours within each scene. The analysis also i...

  1. Distributed analysis environment for HEP and interdisciplinary applications

    International Nuclear Information System (INIS)

    Moscicki, J.T.

    2003-01-01

    Huge data volumes of the Large Hadron Collider experiments require parallel end-user analysis on clusters of hundreds of machines. While the focus of end-user High-Energy Physics analysis is on ntuples, the master-worker model of parallel processing may also be used in other contexts such as detector simulation. The aim of the DIANE R&D project (http://cern.ch/it-proj-diane), currently conducted by the CERN IT/API group, is to create a generic, component-based framework for distributed, parallel data processing in the master-worker model. Pre-compiled user analysis code is loaded dynamically at runtime from component libraries and called back when appropriate. Such an application-oriented framework must be flexible enough to integrate with the emerging GRID technologies as they become available in the time to come. Therefore, common services such as environment reconstruction, code distribution, load balancing and authentication are designed and implemented as pluggable modules. This approach makes it easy to replace them with modules implemented with newer technology as necessary. The paper gives an overview of the DIANE architecture and explains the main design choices. Selected examples of diverse applications from a variety of domains applicable to DIANE are presented, along with preliminary benchmarking results.
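
    DIANE itself is a CERN framework; the following is only a generic illustration of the master-worker model it implements, written with Python's multiprocessing pool. The function names and the toy "analysis" are hypothetical.

```python
import multiprocessing as mp

def analyse_chunk(chunk):
    """Worker callback: count selected events in one chunk of data
    (a stand-in for dynamically loaded user analysis code)."""
    return sum(1 for event in chunk if event > 0.5)

def master(chunks, n_workers=4):
    """Master distributes chunks to workers and merges the partial results."""
    with mp.Pool(processes=n_workers) as pool:
        partial_results = pool.map(analyse_chunk, chunks)
    return sum(partial_results)

if __name__ == "__main__":
    import random
    random.seed(0)
    data = [[random.random() for _ in range(10_000)] for _ in range(40)]
    print("selected events:", master(data))
```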

  2. Location Analysis of Freight Distribution Terminal of Jakarta City, Indonesia

    Directory of Open Access Journals (Sweden)

    Nahry Nahry

    2016-03-01

    Jakarta currently has two freight terminals, namely Pulo Gebang and Tanah Merdeka, but both are used only for parking and have not yet been utilized properly, e.g. for consolidation. Goods consolidation, which is usually performed in a distribution terminal, may reduce the amount of freight flow within the city. This paper aims to determine the best location for a distribution terminal in Jakarta among those two terminals and two additional alternative sites, namely Lodan and Rawa Buaya. The work begins with the identification of important factors that affect the location selection, carried out through Likert analysis of questionnaires distributed to logistics firms. The best location is then determined by applying Overlay Analysis using ArcGIS 9.2. Four grid maps are produced to represent accessibility, cost, time, and environment as the important location factors. The result shows that the ranking, from best to worst, is: Lodan, Tanah Merdeka, Pulo Gebang, and Rawa Buaya.

  3. Distributed resistance model for the analysis of wire-wrapped rod bundles

    International Nuclear Information System (INIS)

    Ha, K. S.; Jung, H. Y.; Kwon, Y. M.; Jang, W. P.; Lee, Y. B.

    2003-01-01

    A partial flow blockage within a fuel assembly in a liquid metal reactor may result in localized boiling or a failure of the fuel cladding. Thus, precise analysis of this phenomenon is required for a safe LMR design. The MATRA-LMR code developed by KAERI models the flow distribution in an assembly by using the wire forcing function to account for the effects of wire-wrap spacers, which is important for the analysis of flow blockage. However, the wire forcing function is not capable of handling a flow blockage. This model was therefore replaced by the distributed resistance model, and a validation calculation was carried out against the FFM 2A experiment.

  4. Stochastic modeling of pitting corrosion: A new model for initiation and growth of multiple corrosion pits

    International Nuclear Information System (INIS)

    Valor, A.; Caleyo, F.; Alfonso, L.; Rivas, D.; Hallen, J.M.

    2007-01-01

    In this work, a new stochastic model capable of simulating pitting corrosion is developed and validated. Pitting corrosion is modeled as the combination of two stochastic processes: pit initiation and pit growth. Pit generation is modeled as a nonhomogeneous Poisson process, in which induction time for pit initiation is simulated as the realization of a Weibull process. In this way, the exponential and Weibull distributions can be considered as the possible distributions for pit initiation time. Pit growth is simulated using a nonhomogeneous Markov process. Extreme value statistics is used to find the distribution of maximum pit depths resulting from the combination of the initiation and growth processes for multiple pits. The proposed model is validated using several published experiments on pitting corrosion. It is capable of reproducing the experimental observations with higher quality than the stochastic models available in the literature for pitting corrosion
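
    A simplified sketch of the two-stage idea: Weibull pit-initiation times combined with a deterministic power-law growth stand-in, with extreme-value statistics taken as the maximum depth over many pits. The paper's nonhomogeneous Poisson initiation and nonhomogeneous Markov growth processes are replaced by these simpler assumptions, and all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

def max_pit_depth(t_obs, n_pits=100, init_shape=1.5, init_scale=2.0,
                  growth_rate=0.15, growth_exp=0.5):
    """Maximum pit depth (mm) at time t_obs (years) for one specimen.

    Pit initiation times follow a Weibull(init_shape, init_scale) distribution;
    each initiated pit then grows as depth = growth_rate * (t - t_init)**growth_exp.
    """
    t_init = init_scale * rng.weibull(init_shape, n_pits)
    active = t_init < t_obs
    depths = growth_rate * (t_obs - t_init[active]) ** growth_exp
    return depths.max() if depths.size else 0.0

# Empirical distribution of the deepest pit over many specimens (extreme values).
samples = np.array([max_pit_depth(t_obs=10.0) for _ in range(2000)])
print(f"mean max depth = {samples.mean():.3f} mm, "
      f"95th percentile = {np.percentile(samples, 95):.3f} mm")
```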

  5. Stochastic modeling of pitting corrosion: A new model for initiation and growth of multiple corrosion pits

    Energy Technology Data Exchange (ETDEWEB)

    Valor, A. [Facultad de Fisica, Universidad de La Habana, San Lazaro y L, Vedado, 10400 Havana (Cuba); Caleyo, F. [Departamento de Ingenieria, Metalurgica, IPN-ESIQIE, UPALM Edif. 7, Zacatenco, Mexico DF 07738 (Mexico)]. E-mail: fcaleyo@gmail.com; Alfonso, L. [Departamento de Ingenieria, Metalurgica, IPN-ESIQIE, UPALM Edif. 7, Zacatenco, Mexico DF 07738 (Mexico); Rivas, D. [Departamento de Ingenieria, Metalurgica, IPN-ESIQIE, UPALM Edif. 7, Zacatenco, Mexico DF 07738 (Mexico); Hallen, J.M. [Departamento de Ingenieria, Metalurgica, IPN-ESIQIE, UPALM Edif. 7, Zacatenco, Mexico DF 07738 (Mexico)

    2007-02-15

    In this work, a new stochastic model capable of simulating pitting corrosion is developed and validated. Pitting corrosion is modeled as the combination of two stochastic processes: pit initiation and pit growth. Pit generation is modeled as a nonhomogeneous Poisson process, in which induction time for pit initiation is simulated as the realization of a Weibull process. In this way, the exponential and Weibull distributions can be considered as the possible distributions for pit initiation time. Pit growth is simulated using a nonhomogeneous Markov process. Extreme value statistics is used to find the distribution of maximum pit depths resulting from the combination of the initiation and growth processes for multiple pits. The proposed model is validated using several published experiments on pitting corrosion. It is capable of reproducing the experimental observations with higher quality than the stochastic models available in the literature for pitting corrosion.

  6. Reliability Worth Analysis of Distribution Systems Using Cascade Correlation Neural Networks

    DEFF Research Database (Denmark)

    Heidari, Alireza; Agelidis, Vassilios; Pou, Josep

    2018-01-01

    Reliability worth analysis is of great importance in the area of distribution network planning and operation. The precision of the reliability worth can be affected greatly by the customer interruption cost model used, and the choice of cost model can change system and load point reliability indices. In this study, a cascade correlation neural network is adopted to further develop two cost models, comprising a probabilistic distribution model and an average or aggregate model. A contingency-based analytical technique is adopted to conduct the reliability worth analysis. Furthermore, the possible effects...

  7. Precision Statistical Analysis of Images Based on Brightness Distribution

    Directory of Open Access Journals (Sweden)

    Muzhir Shaban Al-Ani

    2017-07-01

    Studying the content of images is an important topic through which reasonable and accurate image analysis can be performed. Image analysis has recently become a vital field because of the huge number of images transferred via transmission media in our daily life, and these image-crowded media have brought the research area of image analysis to the fore. In this paper, the implemented system proceeds through several steps to compute the statistical measures of standard deviation and mean for both colour and grey images, and the last step of the proposed method compares the results obtained for the different cases of the test phase. The statistical parameters are used to characterize the content of an image and its texture: standard deviation, mean and correlation values are used to study the intensity distribution of the tested images. Reasonable results are obtained for both standard deviation and mean value through the implementation of the system. The major issue addressed in the work is brightness distribution as captured by statistical measures under different types of lighting.
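
    A numpy sketch of the per-image statistics mentioned above (mean, standard deviation and correlation of brightness), using synthetic arrays in place of real test images; the luma weights and the brightness shift are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic stand-ins for a grey image and an RGB image (values in 0..255).
grey = rng.integers(0, 256, size=(480, 640)).astype(float)
rgb = rng.integers(0, 256, size=(480, 640, 3)).astype(float)

# Brightness distribution statistics.
print(f"grey: mean = {grey.mean():.1f}, std = {grey.std():.1f}")
luminance = rgb @ np.array([0.299, 0.587, 0.114])   # ITU-R BT.601 luma weights
print(f"rgb : mean = {luminance.mean():.1f}, std = {luminance.std():.1f}")

# Correlation between two images (e.g., original vs. brightness-shifted version).
shifted = np.clip(grey + 30, 0, 255)
corr = np.corrcoef(grey.ravel(), shifted.ravel())[0, 1]
print(f"correlation(original, shifted) = {corr:.3f}")
```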

  8. Challenges in risk estimation using routinely collected clinical data: The example of estimating cervical cancer risks from electronic health-records.

    Science.gov (United States)

    Landy, Rebecca; Cheung, Li C; Schiffman, Mark; Gage, Julia C; Hyun, Noorie; Wentzensen, Nicolas; Kinney, Walter K; Castle, Philip E; Fetterman, Barbara; Poitras, Nancy E; Lorey, Thomas; Sasieni, Peter D; Katki, Hormuzd A

    2018-06-01

    Electronic health-records (EHR) are increasingly used by epidemiologists studying disease following surveillance testing to provide evidence for screening intervals and referral guidelines. Although cost-effective, undiagnosed prevalent disease and interval censoring (in which asymptomatic disease is only observed at the time of testing) raise substantial analytic issues when estimating risk that cannot be addressed using Kaplan-Meier methods. Based on our experience analysing EHR from cervical cancer screening, we previously proposed the logistic-Weibull model to address these issues. Here we demonstrate how the choice of statistical method can impact risk estimates. We use observed data on 41,067 women in the cervical cancer screening program at Kaiser Permanente Northern California, 2003-2013, as well as simulations to evaluate the ability of different methods (Kaplan-Meier, Turnbull, Weibull and logistic-Weibull) to accurately estimate risk within a screening program. Cumulative risk estimates from the statistical methods varied considerably, with the largest differences occurring for prevalent disease risk when baseline disease ascertainment was random but incomplete. Kaplan-Meier underestimated risk at earlier times and overestimated risk at later times in the presence of interval censoring or undiagnosed prevalent disease. Turnbull performed well, though was inefficient and not smooth. The logistic-Weibull model performed well, except when event times didn't follow a Weibull distribution. We have demonstrated that methods for right-censored data, such as Kaplan-Meier, result in biased estimates of disease risks when applied to interval-censored data, such as screening programs using EHR data. The logistic-Weibull model is attractive, but the model fit must be checked against Turnbull non-parametric risk estimates. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
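
    A sketch of why interval censoring matters, assuming a plain Weibull onset model (without the logistic prevalent-disease component of the logistic-Weibull model): disease onset is only observed between scheduled screening visits, and the likelihood uses S(L) − S(R) for each observation interval. All parameter values and the visit schedule are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(5)

# Simulate disease onset times (years) and yearly screening visits up to 5 years.
shape_true, scale_true = 1.4, 6.0
onset = scale_true * rng.weibull(shape_true, 5000)
visits = np.arange(1.0, 6.0)

left = np.zeros_like(onset)            # last visit known disease-free
right = np.full_like(onset, np.inf)    # first visit with disease detected (inf = censored)
for v in visits:
    left[v < onset] = v
    detected_now = (onset <= v) & np.isinf(right)
    right[detected_now] = v

def neg_loglik(params):
    k, lam = np.exp(params)                          # enforce positivity
    S = lambda t: np.exp(-(t / lam) ** k)
    interval = np.isfinite(right)
    ll = np.log(S(left[interval]) - S(right[interval])).sum()
    ll += np.log(S(left[~interval])).sum()           # right-censored at last visit
    return -ll

fit = minimize(neg_loglik, x0=np.log([1.0, 3.0]), method="Nelder-Mead")
k_hat, lam_hat = np.exp(fit.x)
print(f"true (k, lambda) = ({shape_true}, {scale_true}); "
      f"estimated = ({k_hat:.2f}, {lam_hat:.2f})")
print(f"estimated 5-year cumulative risk = {1 - np.exp(-(5 / lam_hat) ** k_hat):.3f}")
```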

  9. Energy system analysis of fuel cells and distributed generation

    DEFF Research Database (Denmark)

    Mathiesen, Brian Vad; Lund, Henrik

    2007-01-01

    This chapter introduces Energy System Analysis methodologies and tools, which can be used for identifying the best application of different Fuel Cell (FC) technologies to different regional or national energy systems. The main point is that the benefits of using FC technologies indeed depend...... on the energy system in which they are used. Consequently, coherent energy systems analyses of specific and complete energy systems must be conducted in order to evaluate the benefits of FC technologies and in order to be able to compare alternative solutions. In relation to distributed generation, FC...... technologies are very often connected to the use of hydrogen, which has to be provided e.g. from electrolysers. Decentralised and distributed generation has the possibility of improving the overall energy efficiency and flexibility of energy systems. Therefore, energy system analysis tools and methodologies...

  10. Fracture criterion for brittle materials based on statistical cells of finite volume

    International Nuclear Information System (INIS)

    Cords, H.; Kleist, G.; Zimmermann, R.

    1986-06-01

    An analytical consideration of the Weibull statistical analysis of brittle materials established the necessity of including one additional material constant for a more comprehensive description of the failure behaviour. The Weibull analysis is restricted to infinitesimal volume elements as a consequence of the differential calculus applied. It was found that infinitesimally small elements are in conflict with the basic statistical assumption, and that the differential calculus is in fact not needed, since nowadays most stress analyses are based on finite element calculations, which are most suitable for a subsequent statistical analysis of strength. The size of a finite statistical cell has been introduced as the third material parameter. It should represent the minimum volume containing all statistical features of the material, such as the distribution of pores, flaws and grains. The new approach also contains a unique treatment of failure under multiaxial stresses: the quantity responsible for failure under multiaxial stresses is introduced as a modified strain energy. Sixteen different tensile specimens, including CT specimens, have been investigated experimentally and analyzed with the probabilistic fracture criterion. As a result, it can be stated that the failure rates of all types of specimens made from three different grades of graphite are predictable. The accuracy of the prediction is one standard deviation. (orig.)
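
    A sketch of the weakest-link failure probability evaluated over finite statistical cells, P_f = 1 − exp(−Σᵢ (Vᵢ/V₀)(σᵢ/σ₀)^m), as it might be applied to post-processed finite element results; the modified strain-energy treatment of multiaxial stresses is not reproduced, and all numbers are hypothetical.

```python
import numpy as np

def failure_probability(stresses, volumes, sigma0, m, v0):
    """Weakest-link failure probability from per-cell stresses (MPa) and volumes (mm^3),
    using the two-parameter Weibull form summed over finite statistical cells."""
    stresses = np.asarray(stresses, dtype=float)
    volumes = np.asarray(volumes, dtype=float)
    # Compressive (negative) stresses are assumed to contribute no failure risk here.
    risk = np.sum((volumes / v0) * (np.clip(stresses, 0.0, None) / sigma0) ** m)
    return 1.0 - np.exp(-risk)

# Hypothetical post-processed finite element results for a graphite component.
cell_stress = [12.0, 18.5, 25.0, 31.0, 22.0]    # maximum principal stress per cell
cell_volume = [40.0, 40.0, 25.0, 10.0, 30.0]    # cell volumes
p_f = failure_probability(cell_stress, cell_volume, sigma0=30.0, m=8.0, v0=25.0)
print(f"P_f = {p_f:.4f}")
```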

  11. Making distributed ALICE analysis simple using the GRID plug-in

    International Nuclear Information System (INIS)

    Gheata, A; Gheata, M

    2012-01-01

    We have developed an interface within the ALICE analysis framework that allows transparent usage of the experiment's distributed resources. This analysis plug-in makes it possible to configure back-end specific parameters from a single interface and to run the same custom user analysis, without changes, in many computing environments, from local workstations to PROOF clusters or GRID resources. The tool is now used extensively in the ALICE collaboration for both end-user analysis and large-scale productions.

  12. 5_29 - 37_Dikko et al.,_A New Generalized-Exponential-Weibull ...

    African Journals Online (AJOL)

    The survival and hazard functions and the order statistics of the distribution were estimated using ... the mathematical properties of the ... Weibull distribution ... (2 December, 2017). Applying binomial expansion and further simplification, equation (15) becomes ...

  13. Distribution System Reliability Analysis for Smart Grid Applications

    Science.gov (United States)

    Aljohani, Tawfiq Masad

    Reliability of power systems is a key aspect in modern power system planning, design, and operation. The ascendance of the smart grid concept has raised high hopes of developing an intelligent, self-healing network able to overcome the interruption problems that face the utility and cost it tens of millions in repair and loss. To address these reliability concerns, the power utilities and interested parties have spent an extensive amount of time and effort analyzing and studying the reliability of the generation and transmission sectors of the power grid. Only recently has attention shifted to improving the reliability of the distribution network, the connection point between power providers and consumers, where most electricity supply problems occur. In this work, we examine the effect of smart grid applications in improving the reliability of power distribution networks. The test system used in conducting this thesis is the IEEE 34-node test feeder, released in 2003 by the Distribution System Analysis Subcommittee of the IEEE Power Engineering Society. The objective is to analyze the feeder for the optimal placement of automatic switching devices and to quantify their proper installation based on the performance of the distribution system. The measures are the changes in the system reliability indices, including SAIDI, SAIFI, and EUE. A further goal is to design and simulate the effect of installing Distributed Generators (DGs) on the utility's distribution system and to measure the potential improvement of its reliability. The software used in this work is DISREL, an intelligent power distribution software package developed by General Reliability Co.
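
    A minimal illustration of the reliability indices named above (SAIFI, SAIDI, EUE) computed from a toy set of interruption records; this is not the IEEE 34-node study or DISREL, and all figures are hypothetical.

```python
# Each interruption record: (customers_interrupted, duration_hours, unserved_load_kW)
interruptions = [
    (120, 1.5, 80.0),
    (450, 0.5, 300.0),
    (60, 3.0, 45.0),
]
customers_served = 2_000

saifi = sum(c for c, _, _ in interruptions) / customers_served       # interruptions / customer / yr
saidi = sum(c * d for c, d, _ in interruptions) / customers_served   # hours / customer / yr
eue = sum(d * p for _, d, p in interruptions)                        # kWh not supplied / yr

print(f"SAIFI = {saifi:.3f} int./cust., SAIDI = {saidi:.3f} h/cust., EUE = {eue:.1f} kWh")
```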

  14. Comparing Distributions of Environmental Outcomes for Regulatory Environmental Justice Analysis

    Directory of Open Access Journals (Sweden)

    Glenn Sheriff

    2011-05-01

    Economists have long been interested in measuring the distributional impacts of policy interventions. Since environmental justice (EJ) emerged as an ethical issue in the 1970s, the academic literature has provided statistical analyses of the incidence and causes of various environmental outcomes as they relate to race, income, and other demographic variables. In the context of regulatory impacts, however, there is a lack of consensus regarding what information is relevant for EJ analysis and how best to present it. This paper helps frame the discussion by suggesting a set of questions fundamental to regulatory EJ analysis, reviewing past approaches to quantifying distributional equity, and discussing the potential for adapting existing tools to the regulatory context.

  15. Analysis and Comparison of Typical Models within Distribution Network Design

    DEFF Research Database (Denmark)

    Jørgensen, Hans Jacob; Larsen, Allan; Madsen, Oli B.G.

    This paper investigates the characteristics of typical optimisation models within Distribution Network Design. In the paper, fourteen models known from the literature are thoroughly analysed. Through this analysis, a schematic approach to the categorisation of distribution network design models is presented for educational purposes. Furthermore, the paper can be seen as a practical introduction to network design modelling as well as a manual or recipe to follow when constructing such a model.

  16. DISCRN: A Distributed Storytelling Framework for Intelligence Analysis.

    Science.gov (United States)

    Shukla, Manu; Dos Santos, Raimundo; Chen, Feng; Lu, Chang-Tien

    2017-09-01

    Storytelling connects entities (people, organizations) using their observed relationships to establish meaningful storylines. This can be extended to spatiotemporal storytelling that incorporates locations, time, and graph computations to enhance coherence and meaning. But when performed sequentially, these computations become a bottleneck because the massive number of entities makes space and time complexity untenable. This article presents DISCRN, or distributed spatiotemporal ConceptSearch-based storytelling, a distributed framework for performing spatiotemporal storytelling. The framework extracts entities from microblogs and event data, and links these entities using a novel ConceptSearch to derive storylines in a distributed fashion utilizing a key-value pair paradigm. Performing these operations at scale allows deeper and broader analysis of storylines. The novel parallelization techniques speed up the generation and filtering of storylines on massive datasets. Experiments with microblog posts such as Twitter data and Global Database of Events, Language, and Tone events show the efficiency of the techniques in DISCRN.

  17. Determination of utilizable wind energy for indoor ventilation in ...

    African Journals Online (AJOL)

    Journal of the Nigerian Association of Mathematical Physics ... Determination of utilizable wind energy for indoor ventilation in buildings across selected locations in Nigeria ... Weibull's distribution function was used for modeling of wind speed ...

  18. Valuing carbon assets for high-tech with application to the wind energy industry

    International Nuclear Information System (INIS)

    Han, Liyan; Liu, Yang; Lin, Qiang; Huang, Gubo

    2015-01-01

    In contrast to traditional methods for high-tech evaluation, we introduce a new, more active idea that considers the carbon asset effect in addition to the economic and technological considerations of strategic significance. The method proposed in this paper treats a reduction of carbon emissions below the current industry baseline as an asset that benefits a firm adopting a new technology. The measured carbon asset values vary across technologies, across industries and over time. The new method is applied to the valuation of wind energy technology, using the Weibull distribution to estimate the wind energy capacity, together with a concrete sensitivity analysis. These applications support the validity of the new method and show that the impact of fluctuations in carbon sinks on the values of carbon assets is significantly greater than that of volatility in the production output. The paper also presents some policy recommendations based on the results. - Highlights: • Carbon asset dimension for high-tech evaluation. • Valuing wind energy technology using the Weibull distribution. • Greater impact of the carbon sink price on the carbon asset value than that of production output. • Environmental risk can be measured based on the carbon asset assessment.
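
    A sketch of the Weibull wind-resource step: the mean wind power density implied by a Weibull speed model is P/A = ½ ρ E[v³] with E[v³] = c³ Γ(1 + 3/k). The shape and scale values below are illustrative, not those of the study.

```python
from math import gamma

def weibull_power_density(k, c, rho=1.225):
    """Mean wind power density (W/m^2) for Weibull shape k and scale c (m/s)."""
    mean_v3 = c**3 * gamma(1.0 + 3.0 / k)   # E[v^3] for a Weibull distribution
    return 0.5 * rho * mean_v3

# Illustrative site parameters.
k, c = 2.0, 7.5
print(f"mean speed = {c * gamma(1 + 1 / k):.2f} m/s")
print(f"power density = {weibull_power_density(k, c):.1f} W/m^2")
```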

  19. Recognition of battery aging variations for LiFePO4 batteries in 2nd use applications combining incremental capacity analysis and statistical approaches

    Science.gov (United States)

    Jiang, Yan; Jiang, Jiuchun; Zhang, Caiping; Zhang, Weige; Gao, Yang; Guo, Qipei

    2017-08-01

    To assess the economic benefits of battery reuse, the consistency and aging characteristics of a retired LiFePO4 battery pack are studied in this paper. The consistency of battery modules is analyzed from the perspective of capacity and internal resistance. Test results indicate that battery module parameter dispersion increases as the batteries age; however, modules with better capacity consistency do not necessarily show better resistance consistency. The aging characteristics of the battery pack are then analyzed, and the main results are as follows: (1) the Weibull and normal distributions are suitable for fitting the capacity and resistance distributions of battery modules, respectively; (2) SOC imbalance is the dominant factor in the capacity fading process of the battery pack; (3) by employing incremental capacity (IC) and IC peak area analysis, a consistency evaluation method representing the aging mechanism variations of the battery modules is proposed, and an accurate battery screening strategy is then put forward. This study not only provides data support for evaluating the economic benefits of retired batteries but also presents a method to recognize battery aging variations, which is helpful for the rapid evaluation and screening of retired batteries for 2nd use.
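
    A sketch of the distribution-fitting step described in result (1), fitting a Weibull to module capacities and a normal to module resistances with scipy.stats; the data are synthetic, and the simple median-based screen at the end is an illustrative stand-in for the IC peak-area method.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

# Synthetic retired-pack data: module capacities (Ah) and internal resistances (mOhm).
capacity = 60.0 * rng.weibull(25.0, 96)       # high shape parameter -> narrow spread
resistance = rng.normal(1.8, 0.12, 96)

# Weibull fit for capacity (location fixed at zero), normal fit for resistance.
c_shape, _, c_scale = stats.weibull_min.fit(capacity, floc=0.0)
r_mu, r_sigma = stats.norm.fit(resistance)

print(f"capacity   ~ Weibull(shape={c_shape:.1f}, scale={c_scale:.1f} Ah)")
print(f"resistance ~ Normal(mu={r_mu:.2f} mOhm, sigma={r_sigma:.3f} mOhm)")

# Simple consistency screen: flag modules far from the pack median capacity.
flagged = np.where(np.abs(capacity - np.median(capacity)) > 2 * capacity.std())[0]
print("modules flagged for re-test:", flagged.tolist())
```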

  20. Combining Static Analysis and Runtime Checking in Security Aspects for Distributed Tuple Spaces

    DEFF Research Database (Denmark)

    Yang, Fan; Aotani, Tomoyuki; Masuhara, Hidehiko

    2011-01-01

    Enforcing security policies on distributed systems is difficult, in particular for a system containing untrusted components. We designed AspectKE*, an aspect-oriented programming language based on distributed tuple spaces, to tackle this issue. One of the key features of AspectKE* is the program analysis predicates and functions that provide information on the future behavior of a program. With a dual value evaluation mechanism that handles the results of static analysis and runtime values at the same time, these functions and predicates enable users to specify security policies in a uniform manner. Our two-stage implementation strategy gathers fundamental static analysis information at load time, so as to avoid performing all analysis at runtime. We built a compiler for AspectKE* and successfully implemented security aspects for a distributed chat system and an electronic healthcare record...