An EOQ Model with Two-Parameter Weibull Distribution Deterioration and Price-Dependent Demand
Mukhopadhyay, Sushanta; Mukherjee, R. N.; Chaudhuri, K. S.
2005-01-01
An inventory replenishment policy is developed for a deteriorating item and price-dependent demand. The rate of deterioration is taken to be time-proportional and the time to deterioration is assumed to follow a two-parameter Weibull distribution. A power law form of the price dependence of demand is considered. The model is solved analytically…
Statistical analysis of wind speed using two-parameter Weibull distribution in Alaçatı region
International Nuclear Information System (INIS)
Ozay, Can; Celiktas, Melih Soner
2016-01-01
Highlights: • Wind speed & direction data from September 2008 to March 2014 have been analyzed. • Mean wind speed for the whole data set has been found to be 8.11 m/s. • The highest wind speed is observed in July, with a monthly mean value of 9.10 m/s. • The wind speed with the most energy has been calculated as 12.77 m/s. • The observed data have been fit to a Weibull distribution, and the k and c parameters have been calculated as 2.05 and 9.16. - Abstract: The Weibull statistical distribution is a common method for analyzing wind speed measurements and determining wind energy potential. The Weibull probability density function can be used to forecast wind speed, wind density and wind energy potential. In this study a two-parameter Weibull statistical distribution is used to analyze the wind characteristics of the Alaçatı region, located in Çeşme, İzmir. The data used in the density function were acquired from a wind measurement station in Alaçatı. Measurements were gathered at three different heights (70, 50 and 30 m) at 10 min intervals for five and a half years. As a result of this study, the wind speed frequency distribution, wind direction trends, mean wind speed, and the shape and scale (k and c) Weibull parameters have been calculated for the region. The mean wind speed for the entirety of the data set is found to be 8.11 m/s, and the k and c parameters are found to be 2.05 and 9.16, respectively. A wind direction analysis, along with a wind rose graph for the region, is also provided. The analysis suggests that higher wind speeds, ranging from 6 to 12 m/s, are prevalent between the sectors 340–360°, while lower wind speeds, from 3 to 6 m/s, occur between sectors 10–29°. The results of this study contribute to the general knowledge about the region's wind energy potential and can be used as a source for investors and academics.
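The two-parameter fit and mean-speed calculation described in the abstract above can be sketched with SciPy's `weibull_min`; the wind speed sample here is synthetic, drawn from the paper's reported parameters (k ≈ 2.05, c ≈ 9.16), not the actual Alaçatı measurements:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Synthetic 10-min mean wind speeds (m/s), standing in for station data
speeds = stats.weibull_min.rvs(c=2.05, scale=9.16, size=5000, random_state=rng)

# Two-parameter fit: fix the location at 0 so only shape k and scale c are estimated
k, loc, c = stats.weibull_min.fit(speeds, floc=0)

mean_speed = stats.weibull_min.mean(k, loc=0, scale=c)
print(f"k = {k:.2f}, c = {c:.2f} m/s, mean = {mean_speed:.2f} m/s")
```

With these parameters the fitted mean lands near the paper's reported 8.11 m/s, since the Weibull mean is c·Γ(1 + 1/k).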
Directory of Open Access Journals (Sweden)
Chris Bambey Guure
2012-01-01
Full Text Available The Weibull distribution has been observed as one of the most useful distributions for modelling and analysing lifetime data in engineering, biology, and other fields. Studies have been carried out vigorously in the literature to determine the best method for estimating its parameters. Recently, much attention has been given to the Bayesian approach to parameter estimation, which is in contention with other estimation methods. In this paper, we examine the performance of the maximum likelihood estimator and the Bayesian estimator using an extension of Jeffreys' prior information with three loss functions, namely, the linear exponential loss, the general entropy loss, and the squared error loss function, for estimating the two-parameter Weibull failure time distribution. These methods are compared using mean squared error through a simulation study with varying sample sizes. The results show that the Bayesian estimator using the extension of Jeffreys' prior under the linear exponential loss function in most cases gives the smallest mean squared error and absolute bias for both the scale parameter α and the shape parameter β for the given values of the extension of Jeffreys' prior.
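The maximum likelihood baseline in the comparison above can be sketched with a small Monte Carlo study of mean squared error; the true α and β, the sample size, and the replication count below are illustrative choices, and the Bayesian estimators under the three loss functions are not reproduced here:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
alpha_true, beta_true = 2.0, 1.5   # scale α and shape β (hypothetical values)
n, reps = 50, 200

errs = []
for _ in range(reps):
    x = stats.weibull_min.rvs(c=beta_true, scale=alpha_true, size=n, random_state=rng)
    # MLE with location fixed at 0: returns (shape, loc, scale)
    beta_hat, _, alpha_hat = stats.weibull_min.fit(x, floc=0)
    errs.append((alpha_hat - alpha_true) ** 2)

mse_alpha = float(np.mean(errs))
print(f"MSE of the MLE for scale α at n={n}: {mse_alpha:.4f}")
```

The same loop, with the estimator swapped out, is how the paper's mean-squared-error comparison across methods would be organized.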
A MULTIVARIATE WEIBULL DISTRIBUTION
Directory of Open Access Journals (Sweden)
Cheng Lee
2010-07-01
Full Text Available A multivariate survival function of the Weibull distribution is developed by expanding the theorem by Lu and Bhattacharyya. From the survival function, the probability density function, the cumulative probability function, the determinant of the Jacobian matrix, and the general moment are derived.
Transmuted Generalized Inverse Weibull Distribution
Merovci, Faton; Elbatal, Ibrahim; Ahmed, Alaa
2013-01-01
A generalization of the generalized inverse Weibull distribution, the so-called transmuted generalized inverse Weibull distribution, is proposed and studied. We use the quadratic rank transmutation map (QRTM) in order to generate a flexible family of probability distributions, taking the generalized inverse Weibull distribution as the base distribution and introducing a new parameter that offers more distributional flexibility. Various structural properties including explicit expression...
Transmuted Complementary Weibull Geometric Distribution
Directory of Open Access Journals (Sweden)
Ahmed Z. Afify
2014-12-01
Full Text Available This paper provides a new generalization of the complementary Weibull geometric distribution introduced by Tojeiro et al. (2014), using the quadratic rank transmutation map studied by Shaw and Buckley (2007). The new distribution is referred to as the transmuted complementary Weibull geometric distribution (TCWGD). The TCWG distribution includes as special cases the complementary Weibull geometric distribution (CWGD), the complementary exponential geometric distribution (CEGD), the Weibull distribution (WD) and the exponential distribution (ED). Various structural properties of the new distribution, including moments, quantiles, the moment generating function and the Rényi entropy, are derived. We propose the method of maximum likelihood for estimating the model parameters and obtain the observed information matrix. A real data set is used to compare the flexibility of the transmuted version versus the complementary Weibull geometric distribution.
The Weibull distribution a handbook
Rinne, Horst
2008-01-01
The most comprehensive book on the subject, chronicling the development of the Weibull distribution in statistical theory and applied statistics. Exploring one of the most important distributions in statistics, The Weibull Distribution: A Handbook focuses on its origin, statistical properties, and related distributions. The book also presents various approaches to estimate the parameters of the Weibull distribution under all possible situations of sampling data, as well as approaches to parameter and goodness-of-fit testing. Describes the Statistical Methods, Concepts, Theories, and Applications of T
The Beta Transmuted Weibull Distribution
Directory of Open Access Journals (Sweden)
Manisha Pal
2014-06-01
Full Text Available The paper introduces a beta transmuted Weibull distribution, which contains a number of distributions as special cases. The properties of the distribution are discussed and explicit expressions are derived for the mean deviations, Bonferroni and Lorenz curves, and reliability. The distribution and moments of order statistics are also studied. Estimation of the model parameters by the method of maximum likelihood is discussed. The log beta transmuted Weibull model is introduced to analyze censored data. Finally, the usefulness of the new distribution in analyzing positive data is illustrated.
Modifications of the Weibull distribution: A review
International Nuclear Information System (INIS)
Almalki, Saad J.; Nadarajah, Saralees
2014-01-01
It is well known that the Weibull distribution is the most popular and the most widely used distribution in reliability and in analysis of lifetime data. Unfortunately, its hazard function cannot exhibit non-monotonic shapes like the bathtub shape or the unimodal shape. Since 1958, the Weibull distribution has been modified by many researchers to allow for non-monotonic hazard functions. This paper gives an extensive review of some discrete and continuous versions of the modifications of the Weibull distribution. - Highlights: • A comprehensive review of known discrete modifications and generalizations of the Weibull distribution. • A comprehensive review of known continuous modifications and generalizations of the Weibull distribution. • Over 110 references on modifications/generalizations of the Weibull distribution. • More than 55% of the cited references appeared in the last 5 years
The Transmuted Generalized Inverse Weibull Distribution
Directory of Open Access Journals (Sweden)
Faton Merovci
2014-05-01
Full Text Available A generalization of the generalized inverse Weibull distribution, the so-called transmuted generalized inverse Weibull distribution, is proposed and studied. We use the quadratic rank transmutation map (QRTM) in order to generate a flexible family of probability distributions, taking the generalized inverse Weibull distribution as the base distribution and introducing a new parameter that offers more distributional flexibility. Various structural properties, including explicit expressions for the moments, quantiles, and moment generating function of the new distribution, are derived. We propose the method of maximum likelihood for estimating the model parameters and obtain the observed information matrix. A real data set is used to compare the flexibility of the transmuted version versus the generalized inverse Weibull distribution.
Reliability Estimation of Aero-engine Based on Mixed Weibull Distribution Model
Yuan, Zhongda; Deng, Junxiang; Wang, Dawei
2018-02-01
The aero-engine is a complex mechanical-electronic system; in the reliability analysis of such systems, the Weibull distribution model plays an irreplaceable role. To date, only the two-parameter and three-parameter Weibull distribution models have been widely used. Due to the diversity of engine failure modes, a single Weibull distribution model carries a large error. By contrast, a mixed Weibull distribution model can take a variety of engine failure modes into account, so it is a good statistical analysis model. In addition to the concept of a dynamic weight coefficient, a three-parameter correlation coefficient optimization method is applied to enhance the Weibull distribution model so that the reliability estimation is more accurate, greatly improving the precision of the mixed-distribution reliability model. All of this is advantageous for popularizing the Weibull distribution model in engineering applications.
A CLASS OF WEIGHTED WEIBULL DISTRIBUTION
Directory of Open Access Journals (Sweden)
Saman Shahbaz
2010-07-01
Full Text Available The weighted Weibull model is proposed following the method of Azzalini (1985). Basic properties of the distribution, including moments, the generating function, the hazard rate function and estimation of parameters, have been studied.
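The Azzalini-style construction can be sketched numerically: the weighted density is proportional to f(x)·F(λx), normalized so that it integrates to one. The λ, shape, and scale values below are illustrative, and the normalization is done by quadrature rather than via the paper's closed-form results:

```python
import numpy as np
from scipy import stats, integrate

lam, k, c = 1.5, 2.0, 1.0   # illustrative skew parameter, shape and scale

def unnorm(x):
    # Azzalini-style weighting: base pdf times the cdf evaluated at lam * x
    return stats.weibull_min.pdf(x, k, scale=c) * stats.weibull_min.cdf(lam * x, k, scale=c)

norm, _ = integrate.quad(unnorm, 0, np.inf)

def weighted_pdf(x):
    return unnorm(x) / norm

area, _ = integrate.quad(weighted_pdf, 0, np.inf)
print(round(area, 6))  # integrates to 1 by construction
```

When the weighting cdf has the same shape k as the base Weibull, the normalizing constant works out to λᵏ/(1 + λᵏ), which the quadrature reproduces.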
The McDonald Modified Weibull Distribution: Properties and Applications
Merovci, Faton; Elbatal, Ibrahim
2013-01-01
A six-parameter distribution, the so-called McDonald modified Weibull distribution, is defined and studied. The new distribution contains, as special submodels, several important distributions discussed in the literature, such as the beta modified Weibull, Kumaraswamy modified Weibull, McDonald Weibull and modified Weibull distributions, among others. The new distribution can be used effectively in the analysis of survival data since it accommodates monotone, unimodal and bathtub-shaped hazard fu...
(AJST) MULTIPLE DEFECT DISTRIBUTIONS ON WEIBULL ...
African Journals Online (AJOL)
such as ceramics, which cannot correctly be statistically described by single Weibull distribution models. (Equations (1) and (2)) [6]. .... bottom filling through a filter at an initial runner velocity of less than 0.5 ms−1 beyond the filter, producing turbulence free conditions. The as-cast test bars were subjected to identical T6 heat.
Weibull distribution in reliability data analysis in nuclear power plant
International Nuclear Information System (INIS)
Ma Yingfei; Zhang Zhijian; Zhang Min; Zheng Gangyang
2015-01-01
Reliability is an important issue affecting each stage of the life cycle, from the birth to the death of a product or a system. Reliability engineering includes equipment failure data processing, quantitative assessment of system reliability, maintenance, etc. Reliability data refers to the variety of data that describe the reliability of a system or component during its operation. These data may be in the form of numbers, graphics, symbols, texts and curves. Quantitative reliability assessment is the task of reliability data analysis. It provides the information needed to prevent, detect, and correct defects in the reliability design. Reliability data analysis proceeds along with the various stages of the product life cycle and reliability activities. Reliability data of Systems, Structures and Components (SSCs) in nuclear power plants is a key factor in probabilistic safety assessment (PSA), reliability-centered maintenance and life cycle management. The Weibull distribution is widely used in reliability engineering, failure analysis and industrial engineering to represent manufacturing and delivery times. It is commonly used to model time to fail, time to repair and material strength. In this paper, an improved Weibull distribution is introduced to analyze the reliability data of SSCs in nuclear power plants. An example is given to present the result of the new method. The Weibull distribution has a very strong ability to fit the reliability data of mechanical equipment in nuclear power plants and is a widely used mathematical model for reliability analysis. The methods currently in common use are the two-parameter and three-parameter Weibull distributions. Through comparison and analysis, the three-parameter Weibull distribution fits the data better: it can reflect the reliability characteristics of the equipment and is closer to the actual situation. (author)
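The two- versus three-parameter comparison discussed above can be sketched as follows; the lifetimes are synthetic, with an assumed 500 h failure-free period standing in for plant data, and the Kolmogorov-Smirnov statistic is used as a simple stand-in for the paper's comparison criteria:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical times-to-failure (hours) with a 500 h failure-free threshold
t = 500.0 + stats.weibull_min.rvs(c=1.8, scale=200.0, size=300, random_state=rng)

# Two-parameter fit (location fixed at zero) vs. three-parameter fit (free location)
shape2, _, scale2 = stats.weibull_min.fit(t, floc=0)
shape3, loc3, scale3 = stats.weibull_min.fit(t)

# Kolmogorov-Smirnov distance as a simple goodness-of-fit comparison
d2 = stats.kstest(t, "weibull_min", args=(shape2, 0, scale2)).statistic
d3 = stats.kstest(t, "weibull_min", args=(shape3, loc3, scale3)).statistic
print(f"KS distance: 2-parameter {d2:.3f}, 3-parameter {d3:.3f}")
```

With a genuine failure-free period in the data, the free location parameter lets the three-parameter model track the empirical distribution more closely, mirroring the paper's conclusion.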
Bayesian estimation of Weibull distribution parameters
International Nuclear Information System (INIS)
Bacha, M.; Celeux, G.; Idee, E.; Lannoy, A.; Vasseur, D.
1994-11-01
In this paper, we present the SEM (Stochastic Expectation Maximization) and WLB-SIR (Weighted Likelihood Bootstrap - Sampling Importance Re-sampling) methods, which are used to estimate Weibull distribution parameters when data are heavily censored. The second method is based on Bayesian inference and allows available prior information on the parameters to be taken into account. An application of this method, with real data provided by nuclear power plant operation feedback analysis, has been carried out. (authors). 8 refs., 2 figs., 2 tabs
Transmuted New Generalized Inverse Weibull Distribution
Directory of Open Access Journals (Sweden)
Muhammad Shuaib Khan
2017-06-01
Full Text Available This paper introduces the transmuted new generalized inverse Weibull distribution by using the quadratic rank transmutation map (QRTM) scheme studied by Shaw et al. (2007). The proposed model contains twenty-three lifetime distributions as special sub-models. Some mathematical properties of the new distribution are formulated, such as the quantile function, Rényi entropy, mean deviations, moments, the moment generating function and order statistics. The method of maximum likelihood is used for estimating the model parameters. We illustrate the flexibility and potential usefulness of the new distribution by using reliability data.
Using the Weibull distribution reliability, modeling and inference
McCool, John I
2012-01-01
Understand and utilize the latest developments in Weibull inferential methods While the Weibull distribution is widely used in science and engineering, most engineers do not have the necessary statistical training to implement the methodology effectively. Using the Weibull Distribution: Reliability, Modeling, and Inference fills a gap in the current literature on the topic, introducing a self-contained presentation of the probabilistic basis for the methodology while providing powerful techniques for extracting information from data. The author explains the use of the Weibull distribution
Comparison of estimation methods for fitting weibull distribution to ...
African Journals Online (AJOL)
Comparison of estimation methods for fitting weibull distribution to the natural stand of Oluwa Forest Reserve, Ondo State, Nigeria. ... Journal of Research in Forestry, Wildlife and Environment ... The result revealed that maximum likelihood method was more accurate in fitting the Weibull distribution to the natural stand.
The McDonald’s Inverse Weibull Distribution
Directory of Open Access Journals (Sweden)
Saman Hanif Shahbaz
2016-12-01
Full Text Available We have proposed a new inverse Weibull distribution by using the generalized beta distribution of McDonald (1984). Basic properties of the proposed distribution have been studied. Parameter estimation has been discussed alongside an illustrative example.
Bayesian Estimation of the Kumaraswamy InverseWeibull Distribution
Directory of Open Access Journals (Sweden)
Felipe R.S. de Gusmao
2017-05-01
Full Text Available The Kumaraswamy Inverse Weibull distribution has the ability to model failure rates that have unimodal shapes, which are quite common in reliability and biological studies. The three-parameter Kumaraswamy Inverse Weibull distribution with decreasing and unimodal failure rate is introduced. We provide a comprehensive treatment of the mathematical properties of the Kumaraswamy Inverse Weibull distribution and derive expressions for its moment generating function and its r-th generalized moment. Some properties of the model, with some graphs of the density and hazard function, are discussed. We also discuss a Bayesian approach for this distribution, and an application is made to a real data set.
A Weibull distribution accrual failure detector for cloud computing.
Liu, Jiaxi; Wu, Zhibo; Wu, Jin; Dong, Jian; Zhao, Yao; Wen, Dongxin
2017-01-01
Failure detectors are used as a fundamental component to build high-availability distributed systems. To meet the requirements of complicated large-scale distributed systems, accrual failure detectors that can adapt to multiple applications have been studied extensively. However, several implementations of accrual failure detectors do not adapt well to the cloud service environment. To solve this problem, a new accrual failure detector based on the Weibull distribution, called the Weibull Distribution Failure Detector, has been proposed specifically for cloud computing. It can adapt to the dynamic and unexpected network conditions in cloud computing. The performance of the Weibull Distribution Failure Detector is evaluated and compared based on public classical experiment data and cloud computing experiment data. The results show that the Weibull Distribution Failure Detector has better performance in terms of speed and accuracy in unstable scenarios, especially in cloud computing.
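The accrual idea behind such detectors can be sketched by fitting a Weibull to recent heartbeat inter-arrival times and reporting a suspicion level that grows as the current silence becomes improbable; the sample times and the φ = -log10(survival probability) form below are illustrative, not the paper's exact algorithm:

```python
import numpy as np
from scipy import stats

# Hypothetical heartbeat inter-arrival times (seconds) observed from a node
samples = np.array([1.02, 0.98, 1.10, 1.05, 0.95, 1.20, 1.00, 1.08, 0.97, 1.15])

# Fit a two-parameter Weibull to the sliding window of inter-arrival times
k, loc, c = stats.weibull_min.fit(samples, floc=0)

def suspicion(elapsed: float) -> float:
    """Accrual suspicion level: phi = -log10(P(next heartbeat arrives even later)),
    with P given by the fitted Weibull survival function."""
    p_later = stats.weibull_min.sf(elapsed, k, loc=0, scale=c)
    return float(-np.log10(max(p_later, 1e-12)))

print(suspicion(1.0), suspicion(3.0))  # suspicion grows with elapsed silence
```

The application then compares φ against a threshold chosen for its own trade-off between detection speed and false positives, which is the knob an accrual detector exposes.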
Evaluation of burst probability for tubes by Weibull distributions
International Nuclear Information System (INIS)
Kao, S.
1975-10-01
Investigations of the candidate distributions that best describe the burst pressure failure probability characteristics of nuclear power steam generator tubes have been continued. To date it has been found that the Weibull distribution provides an acceptable fit for the available data from both the statistical and physical viewpoints. The reasons for the acceptability of the Weibull distribution are stated together with the results of tests for the suitability of fit. In exploring the acceptability of the Weibull distribution for the fitting, a graphical method called the ''density-gram'' is employed instead of the usual histogram. With this method a more sensible graphical observation of the empirical density may be made for cases where the available data are very limited. Based on these methods, estimates of failure pressure are made for the left-tail probabilities.
Single versus mixture Weibull distributions for nonparametric satellite reliability
International Nuclear Information System (INIS)
Castet, Jean-Francois; Saleh, Joseph H.
2010-01-01
Long recognized as a critical design attribute for space systems, satellite reliability has not yet received the proper attention, as only limited on-orbit failure data and statistical analyses can be found in the technical literature. To fill this gap, we recently conducted a nonparametric analysis of satellite reliability for 1584 Earth-orbiting satellites launched between January 1990 and October 2008. In this paper, we provide an advanced parametric fit, based on a mixture of Weibull distributions, and compare it with the single Weibull distribution model obtained with the Maximum Likelihood Estimation (MLE) method. We demonstrate that both parametric fits are good approximations of the nonparametric satellite reliability, but that the mixture Weibull distribution is significantly more accurate in capturing all the failure trends in the failure data, as evidenced by the analysis of the residuals and their quasi-normal dispersion.
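A mixture-of-Weibulls reliability function of the kind fitted in such studies is a weighted sum of Weibull survival functions; the weights and parameters below are illustrative (one early-failure component and one wear-out component), not the paper's satellite estimates:

```python
import numpy as np
from scipy import stats

w = 0.3   # weight of the early-failure subpopulation (illustrative)

def mixture_reliability(t):
    """Two-component Weibull mixture survival function R(t) = w*R1(t) + (1-w)*R2(t)."""
    r1 = stats.weibull_min.sf(t, 0.6, scale=2.0)    # shape < 1: decreasing hazard (infant mortality)
    r2 = stats.weibull_min.sf(t, 3.0, scale=12.0)   # shape > 1: increasing hazard (wear-out)
    return w * r1 + (1 - w) * r2

t = np.linspace(0.1, 15, 5)
print(mixture_reliability(t))
```

A single Weibull forces one monotone hazard trend, while this weighted sum can bend through both regimes, which is what the residual analysis in the abstract is detecting.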
Statistical Analysis Of Failure Strength Of Material Using Weibull Distribution
International Nuclear Information System (INIS)
Entin Hartini; Mike Susmikanti; Antonius Sitompul
2008-01-01
In the evaluation of ceramic and glass material strength, a statistical approach is necessary. The strength of ceramic and glass depends on the specimen size and on the size distribution of flaws in these materials. The distribution of strength for ductile materials is narrow and close to a Gaussian distribution, while the strength of brittle materials such as ceramic and glass follows a Weibull distribution. The Weibull distribution is an indicator of the failure of material strength resulting from a distribution of flaw sizes. In this paper, the cumulative probability of material strength to failure probability, the cumulative probability of failure versus fracture stress, and the cumulative probability of reliability of the material were calculated. Statistical criteria calculations supporting the strength analysis of silicon nitride material were performed using MATLAB. (author)
Analysis of the upper-truncated Weibull distribution for wind speed
International Nuclear Information System (INIS)
Kantar, Yeliz Mert; Usta, Ilhan
2015-01-01
Highlights: • The upper-truncated Weibull distribution is proposed to model wind speed. • The upper-truncated Weibull distribution nests the Weibull distribution as a special case. • Maximum likelihood is the best method for the upper-truncated Weibull distribution. • The fitting accuracy of the upper-truncated Weibull is analyzed on wind speed data. - Abstract: Accurately modeling wind speed is critical in estimating the wind energy potential of a certain region. In order to model wind speed data smoothly, several statistical distributions have been studied. A truncated distribution is defined as the conditional distribution that results from restricting the domain of a statistical distribution, and it nests the base distribution as a special case. This paper proposes, for the first time, the use of the upper-truncated Weibull distribution in modeling wind speed data and in estimating wind power density. In addition, a comparison is made between the upper-truncated Weibull distribution and the well-known Weibull distribution using wind speed data measured in various regions of Turkey. The obtained results indicate that the upper-truncated Weibull distribution performs better than the Weibull distribution in estimating the wind speed distribution and wind power. Therefore, the upper-truncated Weibull distribution can be an alternative for use in the assessment of wind energy potential.
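The upper-truncated density itself can be sketched directly: restrict a Weibull(k, c) to (0, T] and renormalize by F(T). The parameter values below are illustrative, and the nesting claim in the highlights corresponds to T → ∞:

```python
import numpy as np
from scipy import stats

def upper_truncated_weibull_pdf(x, k, c, T):
    """PDF of a Weibull(k, c) truncated above at T: f(x) / F(T) on (0, T], else 0."""
    x = np.asarray(x, dtype=float)
    norm = stats.weibull_min.cdf(T, k, scale=c)
    pdf = stats.weibull_min.pdf(x, k, scale=c) / norm
    return np.where((x > 0) & (x <= T), pdf, 0.0)

# As T grows large, the truncated model recovers the ordinary Weibull
x = np.linspace(0.1, 10, 50)
plain = stats.weibull_min.pdf(x, 2.05, scale=9.16)
trunc = upper_truncated_weibull_pdf(x, 2.05, 9.16, T=1e6)
print(np.max(np.abs(plain - trunc)))  # ~0
```

For a finite T (say an anemometer's measurement ceiling), the same function integrates to one over (0, T], which is what makes it a proper density for bounded wind speed records.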
Using Weibull Distribution Analysis to Evaluate ALARA Performance
International Nuclear Information System (INIS)
Frome, E.L.; Watkins, J.P.; Hagemeyer, D.A.
2009-01-01
As Low as Reasonably Achievable (ALARA) is the underlying principle for protecting nuclear workers from potential health outcomes related to occupational radiation exposure. Radiation protection performance is currently evaluated by measures such as collective dose and average measurable dose, which do not indicate ALARA performance. The purpose of this work is to show how statistical modeling of individual doses using the Weibull distribution can provide objective supplemental performance indicators for comparing ALARA implementation among sites and for insights into ALARA practices within a site. Maximum likelihood methods were employed to estimate the Weibull shape and scale parameters used for performance indicators. The shape parameter reflects the effectiveness of maximizing the number of workers receiving lower doses and is represented as the slope of the fitted line on a Weibull probability plot. Additional performance indicators derived from the model parameters include the 99th percentile and the exceedance fraction. When grouping sites by collective total effective dose equivalent (TEDE) and ranking by 99th percentile with confidence intervals, differences in performance among sites can be readily identified. Applying this methodology will enable more efficient and complete evaluation of the effectiveness of ALARA implementation.
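The model-based indicators described above (fitted shape and scale, the 99th percentile, and the exceedance fraction) can be sketched as follows; the dose sample is synthetic and the Weibull parameters and 5 mSv level of interest are illustrative, so no ALARA claim is implied:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Hypothetical individual measurable doses (mSv) at one site
doses = stats.weibull_min.rvs(c=0.9, scale=1.2, size=400, random_state=rng)

# Maximum likelihood fit of the two-parameter Weibull (location fixed at 0)
shape, loc, scale = stats.weibull_min.fit(doses, floc=0)

p99 = stats.weibull_min.ppf(0.99, shape, loc=0, scale=scale)   # 99th percentile dose
limit = 5.0                                                    # illustrative dose level, mSv
exceedance = stats.weibull_min.sf(limit, shape, loc=0, scale=scale)
print(f"99th percentile = {p99:.2f} mSv, exceedance fraction above {limit} mSv = {exceedance:.3%}")
```

Ranking sites by the fitted 99th percentile (with confidence intervals, as in the abstract) then gives a comparison that collective dose alone does not provide.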
Steve P. Verrill; James W. Evans; David E. Kretschmann; Cherilyn A. Hatfield
2014-01-01
Two important wood properties are the modulus of elasticity (MOE) and the modulus of rupture (MOR). In the past, the statistical distribution of the MOE has often been modeled as Gaussian, and that of the MOR as lognormal or as a two- or three-parameter Weibull distribution. It is well known that MOE and MOR are positively correlated. To model the simultaneous behavior...
Improvement for Amelioration Inventory Model with Weibull Distribution
Directory of Open Access Journals (Sweden)
Han-Wen Tuan
2017-01-01
Full Text Available Most inventory models deal with deteriorating items. On the contrary, just a few papers have considered inventory systems in an amelioration environment. We study an amelioration inventory model with Weibull distribution. However, there are some questionable results in the amelioration paper. We first point out those questionable results in the previous paper, which did not derive the optimal solution, and then provide some improvements. We provide a rigorous analytical treatment for different cases depending on the size of the shape parameter. We present a detailed numerical example for different ranges of the shape parameter to illustrate that our solution method attains the optimal solution. We developed a new amelioration model and then provided a detailed analysis procedure to find the optimal solution. Our findings will help researchers develop their new inventory models.
A Study on The Mixture of Exponentiated-Weibull Distribution
Directory of Open Access Journals (Sweden)
Adel Tawfik Elshahat
2016-12-01
Full Text Available Mixtures of measures or distributions occur frequently in the theory and applications of probability and statistics. In the simplest case it may, for example, be reasonable to assume that one is dealing with the mixture, in given proportions, of a finite number of normal populations with different means or variances. The mixture parameter may also be denumerably infinite, as in the theory of sums of a random number of random variables, or continuous, as in the compound Poisson distribution. The use of finite mixture distributions, to control for unobserved heterogeneity, has become increasingly popular among those estimating dynamic discrete choice models. One of the barriers to using mixture models is that parameters that could previously be estimated in stages must now be estimated jointly: using mixture distributions destroys any additive separability of the log likelihood function. In this thesis, the maximum likelihood estimators have been obtained for the parameters of the mixture of exponentiated Weibull distributions when the sample is available from a censoring scheme. The maximum likelihood estimators of the parameters and the asymptotic variance-covariance matrix have also been obtained. A numerical illustration of these new results is given.
Directory of Open Access Journals (Sweden)
Manna S.K.
2008-01-01
Full Text Available In this paper, we consider the problem of simultaneous determination of retail price and lot-size (RPLS under the assumption that the supplier offers a fixed credit period to the retailer. It is assumed that the item in stock deteriorates over time at a rate that follows a two-parameter Weibull distribution and that the price-dependent demand is represented by a constant-price-elasticity function of retail price. The RPLS decision model is developed and solved analytically. Results are illustrated with the help of a base example. Computational results show that the supplier earns more profits when the credit period is greater than the replenishment cycle length. Sensitivity analysis of the solution to changes in the value of input parameters of the base example is also discussed.
International Nuclear Information System (INIS)
Bebbington, Mark; Lai, C.-D.; Zitikis, Ricardas
2007-01-01
We propose a new two-parameter ageing distribution which is a generalization of the Weibull and study its properties. It has a simple failure rate (hazard rate) function. With appropriate choice of parameter values, it is able to model various ageing classes of life distributions including IFR, IFRA and modified bathtub (MBT). The ranges of the two parameters are clearly demarcated to separate these classes. It thus provides an alternative to many existing life distributions. Details of parameter estimation are provided through a Weibull-type probability plot and maximum likelihood. We also derive explicit formulas for the turning points of the failure rate function in terms of its parameters. This, combined with the parameter estimation procedures, will allow empirical estimation of the turning points for real data sets, which provides useful information for reliability policies
Statistical distribution of the estimator of Weibull modulus
Barbero, Enrique; Fernández-Sáez, José; Navarro Ugena, Carlos
2001-01-01
3 pages, 3 figures. The Weibull statistic has been widely used to study the inherent scatter existing in the strength properties of many advanced materials, as well as in the fracture toughness of steels in the ductile-brittle transition region. The authors are indebted to the Fundación Ramón Areces (Área de Materiales, IX Concurso Nacional) for its financial support of this research.
On alternative q-Weibull and q-extreme value distributions: Properties and applications
Zhang, Fode; Ng, Hon Keung Tony; Shi, Yimin
2018-01-01
Tsallis statistics and Tsallis distributions have been attracting a significant amount of research work in recent years. Importantly, Tsallis q-distributions have been applied in different disciplines. Yet, a relationship between existing q-Weibull distributions and q-extreme value distributions that parallels the well-established relationship between the conventional Weibull and extreme value distributions through a logarithmic transformation has not been established. In this paper, we propose an alternative q-Weibull distribution that leads to a q-extreme value distribution via the q-logarithm transformation. Some important properties of the proposed q-Weibull and q-extreme value distributions are studied. Maximum likelihood and least squares estimation methods are used to estimate the parameters of the q-Weibull distribution, and their performances are investigated through a Monte Carlo simulation study. The methodologies and the usefulness of the proposed distributions are illustrated by fitting the 2014 traffic fatalities data from The National Highway Traffic Safety Administration.
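The q-logarithm transformation at the center of the abstract above is the standard Tsallis pair shown below; this sketch only verifies the inverse relationship between ln_q and exp_q and does not reproduce the paper's particular q-Weibull parameterization:

```python
import numpy as np

def q_log(x, q):
    """Tsallis q-logarithm: ln_q(x) = (x**(1-q) - 1) / (1-q); reduces to ln(x) as q -> 1."""
    if np.isclose(q, 1.0):
        return np.log(x)
    return (x ** (1 - q) - 1) / (1 - q)

def q_exp(x, q):
    """Tsallis q-exponential, the inverse of ln_q on its domain."""
    if np.isclose(q, 1.0):
        return np.exp(x)
    base = 1 + (1 - q) * x
    return np.where(base > 0, base ** (1 / (1 - q)), 0.0)

x = np.linspace(0.5, 5, 10)
print(np.max(np.abs(q_exp(q_log(x, 1.5), 1.5) - x)))  # round-trip error ~0
```

Applying ln_q to a q-Weibull variate is the step that, in the paper's construction, plays the role the ordinary logarithm plays in mapping Weibull to extreme value.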
International Nuclear Information System (INIS)
Rauhut, J.
1982-01-01
Established methods are presented by which life distributions of machine elements can be determined on the basis of laboratory experiments and operational observations. Practical observations are given special attention, as the results estimated on the basis of conventional methods have not been accurate enough. As an introduction, the stochastic life concept, the general method of determining life distributions, various sampling methods, and the Weibull distribution are explained. Further, possible life testing schedules and maximum-likelihood estimates are discussed for the complete sample case and for censored sampling without replacement in laboratory experiments. Finally, censored sampling with replacement in laboratory experiments is discussed; it is shown how suitable parameter estimates can be obtained for given life distributions by means of the maximum-likelihood method. (orig./RW) [de]
Inference on the reliability of Weibull distribution with multiply Type-I censored data
International Nuclear Information System (INIS)
Jia, Xiang; Wang, Dong; Jiang, Ping; Guo, Bo
2016-01-01
In this paper, we focus on the reliability of the Weibull distribution under multiply Type-I censoring, which is a general form of Type-I censoring. In the multiply Type-I censoring of this study, all units in the life testing experiment are terminated at different times. Reliability estimation with the maximum likelihood estimate of the Weibull parameters is conducted. With the delta method and Fisher information, we propose a confidence interval for reliability and compare it with the bias-corrected and accelerated bootstrap confidence interval. Furthermore, a scenario involving a few expert judgments of reliability is considered. A method is developed to generate extended estimations of reliability according to the original judgments and transform them into estimations of the Weibull parameters. With Bayes theory and the Markov Chain Monte Carlo method, a posterior sample is obtained to compute the Bayes estimate and credible interval for reliability. Monte Carlo simulation demonstrates that the proposed confidence interval outperforms the bootstrap one. The Bayes estimate and credible interval for reliability are both satisfactory. Finally, a real example is analyzed to illustrate the application of the proposed methods. - Highlights: • We focus on the reliability of the Weibull distribution under multiply Type-I censoring. • The proposed confidence interval for the reliability is superior after comparison. • The Bayes estimates with a few expert judgements on reliability are satisfactory. • We specify the cases where the MLEs do not exist and present methods to remedy this. • The distribution of the reliability estimate should be used for an accurate estimate.
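A minimal sketch of the point-estimation step, for a complete (uncensored) sample only: the Weibull shape MLE solves a one-dimensional equation, here by bisection, and the reliability estimate follows by plug-in. The paper's multiply Type-I censoring, delta-method interval and Bayesian machinery are omitted; names and data are illustrative.

```python
import math
import random

# Weibull MLE for a complete sample: the shape equation
#   sum(t**b * ln t)/sum(t**b) - 1/b - mean(ln t) = 0
# is increasing in b, so bisection finds its unique root; the scale
# then has the closed form eta = (mean(t**b))**(1/b).
def weibull_mle(ts):
    mlog = sum(math.log(t) for t in ts) / len(ts)
    def g(b):
        num = sum(t ** b * math.log(t) for t in ts)
        den = sum(t ** b for t in ts)
        return num / den - 1.0 / b - mlog
    lo, hi = 0.01, 50.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if g(mid) > 0:
            hi = mid
        else:
            lo = mid
    beta = 0.5 * (lo + hi)
    eta = (sum(t ** beta for t in ts) / len(ts)) ** (1.0 / beta)
    return beta, eta

def reliability(t, beta, eta):
    # plug-in reliability estimate R(t) = exp(-(t/eta)**beta)
    return math.exp(-((t / eta) ** beta))

random.seed(7)
ts = [5.0 * (-math.log(1.0 - random.random())) ** (1.0 / 1.5)
      for _ in range(1000)]  # synthetic Weibull(beta=1.5, eta=5) lifetimes
beta_hat, eta_hat = weibull_mle(ts)
r_hat = reliability(5.0, beta_hat, eta_hat)
```

At t equal to the true scale, the true reliability is exp(-1), which the plug-in estimate should approach for a sample of this size.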
Comparison of estimation methods for fitting Weibull distribution
African Journals Online (AJOL)
Tersor
Tree diameter characterisation using probability distribution functions is essential for determining the structure of forest stands. This has been an intrinsic part of forest management planning, decision-making and research in recent times. The distribution of species and tree size in a forest area gives the structure of the stand.
Probabilistic analysis of glass elements with three-parameter Weibull distribution
International Nuclear Information System (INIS)
Ramos, A.; Muniz-Calvente, M.; Fernandez, P.; Fernandez Cantel, A.; Lamela, M. J.
2015-01-01
Glass and ceramics exhibit brittle behaviour, so a large scatter in test results is obtained. This dispersion is mainly due to the inevitable presence of surface micro-cracks, edge defects or internal defects, which must be taken into account using a failure criterion that is probabilistic rather than deterministic. Among the existing probability distributions, the two- or three-parameter Weibull distribution is generally used to fit material strength results, although it is not always applied correctly. Firstly, in this work, a large experimental programme was performed using annealed glass specimens of different dimensions, based on four-point bending and coaxial double ring tests. Then, the finite element models made for each type of test, the fitting of the parameters of the three-parameter Weibull cumulative distribution function (cdf) (λ: location, β: shape, d: scale) for a certain failure criterion, and the calculation of the effective areas from the cdf are presented. In summary, this work aims to generalize the use of the three-parameter Weibull function to structural glass elements with stress distributions that are not analytically described, allowing the proposed probabilistic model to be applied under general loading distributions. (Author)
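A minimal sketch of the three-parameter Weibull cdf referenced above (λ: location, β: shape, d: scale), together with the weakest-link area scaling behind the "effective area" idea; the reference area and all numeric values are illustrative assumptions, not the paper's data.

```python
import math

# Three-parameter Weibull cdf: F(s) = 1 - exp(-((s - lam)/delta)**beta)
# for s >= lam, and 0 otherwise (lam: location, beta: shape, delta: scale).
def weibull3_cdf(s, lam, beta, delta):
    if s <= lam:
        return 0.0
    return 1.0 - math.exp(-(((s - lam) / delta) ** beta))

def failure_prob(s, area, a_ref, lam, beta, delta):
    # weakest link: an area A behaves like A/a_ref reference elements in
    # series, so survival probabilities multiply
    surv = 1.0 - weibull3_cdf(s, lam, beta, delta)
    return 1.0 - surv ** (area / a_ref)
```

The area scaling is what makes the effective-area calculation matter: at the same stress, a larger specimen has a strictly higher failure probability.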
Directory of Open Access Journals (Sweden)
Anupam Pathak
2014-11-01
Full Text Available Abstract: Problem Statement: The two-parameter exponentiated Rayleigh distribution has been widely used, especially in the modelling of lifetime event data. It provides a statistical model which has a wide variety of applications in many areas, and its main advantage is its ability in the context of lifetime events among other distributions. The uniformly minimum variance unbiased and maximum likelihood estimation methods are the ways to estimate the parameters of the distribution. In this study we explore and compare the performance of the uniformly minimum variance unbiased and maximum likelihood estimators of the reliability functions R(t)=P(X>t) and P=P(X>Y) for the two-parameter exponentiated Rayleigh distribution. Approach: A new technique for obtaining these parametric functions is introduced, in which the major role is played by the powers of the parameter(s), and the functional forms of the parametric functions to be estimated are not needed. We explore the performance of these estimators numerically under varying conditions. Through the simulation study a comparison is made on the performance of these estimators with respect to bias, mean square error (MSE), 95% confidence length and corresponding coverage percentage. Conclusion: Based on the results of the simulation study, the UMVUEs of R(t) and ‘P’ for the two-parameter exponentiated Rayleigh distribution are found to be superior to the MLEs of R(t) and ‘P’.
International Nuclear Information System (INIS)
Xie, M.; Goh, T.N.; Tang, Y.
2004-01-01
The failure rate function and mean residual life function are two important characteristics in reliability analysis. Although many papers have studied distributions with bathtub-shaped failure rate and their properties, few have focused on the underlying associations between the mean residual life and failure rate function of these distributions, especially with respect to their changing points. It is known that the change point for mean residual life can be much earlier than that of failure rate function. In fact, the failure rate function should be flat for a long period of time for a distribution to be useful in practice. When the difference between the change points is large, the flat portion tends to be longer. This paper investigates the change points and focuses on the difference of the changing points. The exponentiated Weibull, a modified Weibull, and an extended Weibull distribution, all with bathtub-shaped failure rate function will be used. Some other issues related to the flatness of the bathtub curve are discussed
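Since the explicit change-point formulas are not reproduced here, a numeric sketch can still illustrate the idea: locate the turning point of a bathtub-shaped failure rate by grid search and check it against a closed form. The additive-Weibull hazard below is a stand-in for the exponentiated, modified and extended Weibull families named above; all parameter values are illustrative.

```python
# Additive-Weibull hazard h(t) = a*b*t**(b-1) + c*d*t**(d-1): with
# b > 1 and d < 1 it is bathtub shaped, and setting h'(t) = 0 gives the
# change point t* = (c*d*(1-d) / (a*b*(b-1)))**(1/(b-d)).
def hazard(t, a, b, c, d):
    return a * b * t ** (b - 1) + c * d * t ** (d - 1)

def change_point(a, b, c, d, lo=0.01, hi=2.0, step=0.001):
    # brute-force grid minimisation of the hazard over [lo, hi]
    t, best_t, best_h = lo, lo, float("inf")
    while t <= hi:
        h = hazard(t, a, b, c, d)
        if h < best_h:
            best_h, best_t = h, t
        t += step
    return best_t
```

For a = 1, b = 3, c = 1, d = 0.5 the grid minimum agrees with the closed-form change point to the grid resolution.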
Statistics for Ratios of Rayleigh, Rician, Nakagami-m, and Weibull Distributed Random Variables
Directory of Open Access Journals (Sweden)
Dragana Č. Pavlović
2013-01-01
Full Text Available The distributions of ratios of random variables are of interest in many areas of the sciences. In this brief paper, we present the joint probability density function (PDF) and the PDF of the maximum of the ratios μ1=R1/r1 and μ2=R2/r2 for the cases where R1, R2, r1, and r2 are Rayleigh, Rician, Nakagami-m, and Weibull distributed random variables. Random variables R1 and R2, as well as random variables r1 and r2, are correlated. Given the suitability of the Weibull distribution for describing fading in both indoor and outdoor environments, special attention is dedicated to the case of Weibull random variables. For this case, analytical expressions for the joint PDF, PDF of the maximum, PDF of the minimum, and product moments of an arbitrary number of ratios μi=Ri/ri, i=1,…,L are obtained. The random variables in the numerator, Ri, as well as those in the denominator, ri, are exponentially correlated. To the best of the authors' knowledge, the analytical expressions for the PDF of the minimum and the product moments of {μi}, i=1,…,L are novel in the open technical literature. The proposed mathematical analysis is complemented by various numerical results. An application of the presented theoretical results is illustrated with respect to performance assessment of wireless systems.
A spatial scan statistic for survival data based on Weibull distribution.
Bhatt, Vijaya; Tiwari, Neeraj
2014-05-20
The spatial scan statistic has been developed as a geographical cluster detection analysis tool for different types of data sets, such as Bernoulli, Poisson, ordinal, normal and exponential. We propose a scan statistic for survival data based on the Weibull distribution. It may also be used for other survival distributions, such as the exponential, gamma, and log-normal. The proposed method is applied to the survival data of tuberculosis patients for the years 2004-2005 in Nainital district of Uttarakhand, India. Simulation studies reveal that the proposed method performs well for different survival distribution functions. Copyright © 2013 John Wiley & Sons, Ltd.
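A toy sketch of the scan idea, specialised to the exponential case (Weibull with shape fixed at 1) so the zone log-likelihoods have a closed form; the zones, data and function names are illustrative, not the authors' statistic.

```python
import math

# For exponential lifetimes the maximised log-likelihood of a group of
# n failures with total time T is n*ln(n/T) - n.  The scan statistic
# maximises, over candidate zones, the log-likelihood ratio of
# "zone fitted separately from the rest" versus "one common rate".
def exp_loglik(times):
    n, total = len(times), sum(times)
    return n * math.log(n / total) - n

def scan_llr(times, zones):
    full = exp_loglik(times)
    best, best_zone = 0.0, None
    for z in zones:  # each zone is a set of observation indices
        inside = [times[i] for i in z]
        outside = [t for i, t in enumerate(times) if i not in z]
        if not inside or not outside:
            continue
        llr = exp_loglik(inside) + exp_loglik(outside) - full
        if llr > best:
            best, best_zone = llr, z
    return best, best_zone

# indices 0-2 fail much earlier than the rest, so that zone should win
times = [0.2, 0.3, 0.25, 5.0, 6.0, 4.0, 5.5, 7.0]
best_llr, best_zone = scan_llr(times, [{0, 1, 2}, {3, 4, 5}, {0, 1}])
```

In practice the maximised statistic's significance is assessed by Monte Carlo replication, as in the simulation studies the abstract describes.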
Directory of Open Access Journals (Sweden)
Farhad Yahgmaei
2013-01-01
Full Text Available This paper proposes different methods of estimating the scale parameter of the inverse Weibull distribution (IWD). Specifically, the maximum likelihood estimator of the scale parameter of the IWD is introduced. We then derive the Bayes estimators for the scale parameter of the IWD by considering quasi, gamma, and uniform prior distributions under the squared error, entropy, and precautionary loss functions. Finally, the different proposed estimators are compared by extensive simulation studies in terms of the mean square errors and the evaluation of risk functions.
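The ML part has a closed form worth sketching, assuming the parameterisation f(x) = βλx^(-(β+1)) exp(-λx^(-β)) with known shape β; the Bayes estimators under the various priors and loss functions are beyond a short sketch, and the simulated data below are illustrative.

```python
import math
import random

# MLE of the inverse Weibull scale lambda with shape beta known: setting
# the log-likelihood derivative n/lam - sum(x**-beta) to zero gives the
# closed form lam_hat = n / sum(x**-beta).
def iw_scale_mle(xs, beta):
    return len(xs) / sum(x ** (-beta) for x in xs)

random.seed(3)
lam_true, beta_true = 2.0, 1.5
# inverse-transform sampling from F(x) = exp(-lam * x**-beta)
xs = [(-math.log(random.random()) / lam_true) ** (-1.0 / beta_true)
      for _ in range(5000)]
lam_hat = iw_scale_mle(xs, beta_true)
```

The closed form follows because X^(-β) is exponentially distributed with rate λ under this parameterisation, so the estimator is just the reciprocal mean of those transformed values.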
Anomalous diffusion and q-Weibull velocity distributions in epithelial cell migration.
Directory of Open Access Journals (Sweden)
Tatiane Souza Vilela Podestá
Full Text Available In multicellular organisms, cell motility is central to all morphogenetic processes, tissue maintenance, wound healing and immune surveillance. Hence, the control of cell motion is a major demand in the creation of artificial tissues and organs. Here, cell migration assays on plastic 2D surfaces involving normal (MDCK) and tumoral (B16F10) epithelial cell lines were performed, varying the initial density of plated cells. Through time-lapse microscopy, quantities such as speed distributions, velocity autocorrelations and spatial correlations, as well as the scaling of mean-squared displacements, were determined. We find that these cells exhibit anomalous diffusion with q-Weibull speed distributions that evolve non-monotonically to a Maxwellian distribution as the initial density of plated cells increases. Although short-ranged spatial velocity correlations mark the formation of small cell clusters, the emergence of collective motion was not observed. Finally, simulational results from a correlated random walk and the Vicsek model of collective dynamics indicate that fluctuations in cell velocity orientations are sufficient to produce the q-Weibull speed distributions seen in our migration assays.
International Nuclear Information System (INIS)
Bai, D.S.; Chun, Y.R.; Kim, J.G.
1995-01-01
This paper considers the design of life-test sampling plans based on failure-censored accelerated life tests. The lifetime distribution of products is assumed to be Weibull with a scale parameter that is a log linear function of a (possibly transformed) stress. Two levels of stress higher than the use condition stress, high and low, are used. Sampling plans with equal expected test times at high and low test stresses which satisfy the producer's and consumer's risk requirements and minimize the asymptotic variance of the test statistic used to decide lot acceptability are obtained. The properties of the proposed life-test sampling plans are investigated
Directory of Open Access Journals (Sweden)
Ozlem Senvar
2016-08-01
Full Text Available Purpose: This study examines Clements’ Approach (CA), Box-Cox transformation (BCT), and Johnson transformation (JT) methods for process capability assessments through Weibull-distributed data with different parameters to figure out the effects of the tail behaviours on process capability and compares their estimation performances in terms of accuracy and precision. Design/methodology/approach: Usage of the process performance index (PPI) Ppu is handled for process capability analysis (PCA) because the comparison issues are performed through generating Weibull data without subgroups. Box plots, descriptive statistics, the root-mean-square deviation (RMSD), which is used as a measure of error, and a radar chart are utilized all together for evaluating the performances of the methods. In addition, the bias of the estimated values is as important as the efficiency measured by the mean square error. In this regard, the Relative Bias (RB) and the Relative Root Mean Square Error (RRMSE) are also considered. Findings: The results reveal that the performance of a method is dependent on its capability to fit the tail behavior of the Weibull distribution and on targeted values of the PPIs. It is observed that the effect of tail behavior is more significant when the process is more capable. Research limitations/implications: Some other methods, such as the Weighted Variance method, which also give good results, were also conducted. However, we later realized that it would be confusing in terms of comparison issues between the methods for consistent interpretations. Practical implications: The Weibull distribution covers a wide class of non-normal processes due to its capability to yield a variety of distinct curves based on its parameters. Weibull distributions are known to have significantly different tail behaviors, which greatly affects the process capability. In quality and reliability applications, they are widely used for the analyses of failure data in order to understand how
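A sketch of a percentile-based (Clements-type) upper process performance index for Weibull data, using the closed-form Weibull quantile; the (USL - median)/(99.865th percentile - median) form below is the standard Clements construction, used here as an illustrative simplification of the study's procedure.

```python
import math

# Weibull quantile Q(p) = eta * (-ln(1-p))**(1/beta) and a
# Clements-type upper index:
#   Ppu = (USL - median) / (p99.865 - median),
# which replaces the normal 3-sigma distance with the corresponding
# Weibull percentile distance.
def weibull_quantile(p, beta, eta):
    return eta * (-math.log(1.0 - p)) ** (1.0 / beta)

def ppu_clements(usl, beta, eta):
    med = weibull_quantile(0.5, beta, eta)
    upper = weibull_quantile(0.99865, beta, eta)
    return (usl - med) / (upper - med)
```

By construction the index equals 1 exactly when the specification limit sits at the 99.865th percentile, which is how the heavy upper tail of a low-shape Weibull drags the index down.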
The Transmuted Geometric-Weibull distribution: Properties, Characterizations and Regression Models
Directory of Open Access Journals (Sweden)
Zohdy M Nofal
2017-06-01
Full Text Available We propose a new lifetime model called the transmuted geometric-Weibull distribution. Some of its structural properties, including ordinary and incomplete moments, quantile and generating functions, probability weighted moments, Rényi and q-entropies and order statistics, are derived. The maximum likelihood method is discussed to estimate the model parameters by means of a Monte Carlo simulation study. A new location-scale regression model is introduced based on the proposed distribution. The new distribution is applied to two real data sets to illustrate its flexibility. Empirical results indicate that the proposed distribution can be an alternative model to other lifetime models available in the literature for modeling real data in many areas.
International Nuclear Information System (INIS)
Xu, Meng; Droguett, Enrique López; Lins, Isis Didier; Chagas Moura, Márcio das
2017-01-01
The q-Weibull model is based on the Tsallis non-extensive entropy and is able to model various behaviors of the hazard rate function, including bathtub curves, by using a single set of parameters. Despite its flexibility, the q-Weibull has not been widely used in reliability applications, partly because of its complicated parameter estimation. In this work, the parameters of the q-Weibull are estimated by the maximum likelihood (ML) method. Due to the intricate system of nonlinear equations, derivative-based optimization methods may fail to converge. Thus, the heuristic optimization method of artificial bee colony (ABC) is used instead. To deal with the slow convergence of ABC, an adaptive hybrid ABC (AHABC) algorithm is proposed that dynamically combines the Nelder-Mead simplex search method with ABC for the ML estimation of the q-Weibull parameters. Interval estimates for the q-Weibull parameters, including confidence intervals based on the ML asymptotic theory and on bootstrap methods, are also developed. The AHABC is validated via numerical experiments involving the q-Weibull ML for reliability applications, and the results show that it produces faster and more accurate convergence when compared to ABC and similar approaches. The estimation procedure is applied to real reliability failure data characterized by a bathtub-shaped hazard rate. - Highlights: • Development of an Adaptive Hybrid ABC (AHABC) algorithm for the q-Weibull distribution. • AHABC combines the local Nelder-Mead simplex method with ABC to enhance local search. • AHABC efficiently finds the optimal solution for the q-Weibull ML problem. • AHABC outperforms ABC and self-adaptive hybrid ABC in accuracy and convergence speed. • Useful model for reliability data with non-monotonic hazard rate.
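The bathtub flexibility that motivates all this machinery can be sketched directly from the q-Weibull hazard; the parameterisation below is one common form and is assumed, not taken from the paper.

```python
# q-Weibull hazard rate in one common parameterisation:
#   h(t) = (2-q) * (beta/eta) * (t/eta)**(beta-1) / (1 - (1-q)*(t/eta)**beta)
# For q < 1 the support is bounded, and with beta < 1 as well the hazard
# first decreases and then rises toward the support limit: a bathtub
# from a single parameter set.
def q_weibull_hazard(t, q, beta, eta):
    u = (t / eta) ** beta
    den = 1.0 - (1.0 - q) * u
    if den <= 0:
        raise ValueError("t outside the distribution's support")
    return (2.0 - q) * (beta / eta) * (t / eta) ** (beta - 1.0) / den
```

For example, q = 0.5, beta = 0.5, eta = 1 has support on (0, 4) and gives a hazard that decreases from small t, bottoms out, and then climbs steeply near the support limit.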
International Nuclear Information System (INIS)
Musleh, Rola M.; Helu, Amal
2014-01-01
In this article we consider statistical inference for the unknown parameters of the Inverse Weibull distribution based on progressively Type-II censoring, using classical and Bayesian procedures. For the classical procedures we propose using the maximum likelihood, the least squares and the approximate maximum likelihood estimators. The Bayes estimators are obtained based on both symmetric and asymmetric (Linex, General Entropy and Precautionary) loss functions. There are no explicit forms for the Bayes estimators; therefore, we propose Lindley's approximation method to compute them. A comparison between these estimators is provided by using extensive simulation and three criteria, namely, bias, mean squared error and Pitman nearness (PN) probability. It is concluded that the approximate Bayes estimators outperform the classical estimators most of the time. A real-life data example is provided to illustrate our proposed estimators. - Highlights: • We consider progressively Type-II censored data from the Inverse Weibull distribution (IW). • We derive MLEs, approximate MLEs, LS and Bayes estimates of the scale and shape parameters of the IW. • The Bayes estimator of the shape parameter cannot be expressed in closed form. • We suggest using Lindley's approximation. • We conclude that the Bayes estimates outperform the classical methods
International Nuclear Information System (INIS)
Kantar, Yeliz Mert; Usta, Ilhan
2008-01-01
In this study, the minimum cross entropy (MinxEnt) principle is applied for the first time to the wind energy field. This principle allows the inclusion of prior information on a wind speed distribution and covers the maximum entropy (MaxEnt) principle, which is also discussed by Li and Li and Ramirez as special cases in their wind power studies. The MinxEnt probability density functions (pdfs) derived from the MinxEnt principle are used to determine the diurnal, monthly, seasonal and annual wind speed distributions. A comparison between the MinxEnt pdfs defined on the basis of the MinxEnt principle and the Weibull pdf is conducted on wind speed data taken from different sources and measured in various regions. The wind power densities of the considered regions obtained from the Weibull and MinxEnt pdfs are also compared. The results indicate that the pdfs derived from the MinxEnt principle fit a variety of measured wind speed data better than the conventionally applied empirical Weibull pdf. Therefore, it is shown that the MinxEnt principle can be used as an alternative method to estimate both wind distribution and wind power accurately
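The wind power density that both pdfs are compared on has a closed form under the Weibull model; a sketch, assuming the standard relation P = ½ρc³Γ(1 + 3/k) and an air density of 1.225 kg/m³.

```python
import math

# Mean wind power density (W/m^2) implied by a fitted Weibull pdf with
# shape k and scale c (m/s): P = 0.5 * rho * c**3 * Gamma(1 + 3/k),
# since E[v**3] = c**3 * Gamma(1 + 3/k) for a Weibull wind speed v.
def weibull_power_density(k, c, rho=1.225):
    return 0.5 * rho * c ** 3 * math.gamma(1.0 + 3.0 / k)
```

Because the density scales with the third moment, small errors in the fitted scale parameter are amplified, which is why the choice of fitted pdf matters for wind power estimates.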
A robust approach based on Weibull distribution for clustering gene expression data
Directory of Open Access Journals (Sweden)
Gong Binsheng
2011-05-01
Full Text Available Abstract Background Clustering is a widely used technique for the analysis of gene expression data. Most clustering methods group genes based on distances, while few methods group genes according to the similarities of the distributions of the gene expression levels. Furthermore, as biological annotation resources have accumulated, an increasing number of genes have been annotated into functional categories. As a result, evaluating the performance of clustering methods in terms of the functional consistency of the resulting clusters is of great interest. Results In this paper, we propose the WDCM (Weibull Distribution-based Clustering Method), a robust approach for clustering gene expression data, in which the gene expressions of individual genes are considered as random variables following unique Weibull distributions. Our WDCM is based on the concept that genes with similar expression profiles have similar distribution parameters, and thus the genes are clustered via the Weibull distribution parameters. We used the WDCM to cluster three cancer gene expression data sets from lung cancer, B-cell follicular lymphoma and bladder carcinoma and obtained well-clustered results. We compared the performance of WDCM with k-means and Self Organizing Map (SOM) using functional annotation information given by the Gene Ontology (GO). The results showed that the functional annotation ratios of WDCM are higher than those of the other methods. We also utilized the external measure Adjusted Rand Index to validate the performance of the WDCM. The comparative results demonstrate that the WDCM provides better clustering performance than the k-means and SOM algorithms. The merit of the proposed WDCM is that it can be applied to cluster incomplete gene expression data without imputing the missing values. Moreover, the robustness of WDCM is also evaluated on the incomplete data sets. Conclusions The results demonstrate that our WDCM produces clusters
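The feature-extraction step behind the WDCM idea can be sketched as follows: each gene's expression values are reduced to fitted Weibull (shape, scale) parameters, which then serve as the clustering features. A crude moment-matching fit is used here for brevity; the estimator, names and simulated data are illustrative stand-ins, not the authors' algorithm.

```python
import math
import random

# Moment-matching Weibull fit: the squared coefficient of variation
#   CV^2(k) = Gamma(1 + 2/k) / Gamma(1 + 1/k)**2 - 1
# is decreasing in the shape k, so the sample CV^2 pins down k by
# bisection, and the scale follows from the mean.
def fit_weibull_moments(xs):
    n = len(xs)
    m = sum(xs) / n
    v = sum((x - m) ** 2 for x in xs) / n
    cv2 = v / (m * m)
    lo, hi = 0.1, 20.0
    for _ in range(100):
        k = 0.5 * (lo + hi)
        model_cv2 = math.gamma(1 + 2 / k) / math.gamma(1 + 1 / k) ** 2 - 1
        if model_cv2 > cv2:
            lo = k  # model too variable: shape must be larger
        else:
            hi = k
    k = 0.5 * (lo + hi)
    return k, m / math.gamma(1 + 1 / k)

random.seed(5)
expr = [3.0 * (-math.log(1.0 - random.random())) ** 0.5
        for _ in range(4000)]  # one synthetic "gene", Weibull(k=2, eta=3)
k_hat, eta_hat = fit_weibull_moments(expr)
```

Once every gene is mapped to its (k, eta) pair, any standard clustering routine operating on those pairs groups genes with similar expression distributions, which is the essence of the approach.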
Directory of Open Access Journals (Sweden)
Kalaba Dragan V.
2014-01-01
Full Text Available The main subject of this paper is the presentation of a probabilistic technique for thermal power system reliability assessment. Exploitation research on the reliability of the fossil fuel power plant system has defined the function, or probabilistic law, according to which the random variable (occurrence of a complete unplanned standstill) behaves. Based on these data, and by applying reliability theory to this particular system using the simple and complex Weibull distributions, the hypothesis has been confirmed that the distribution of the observed random variable fully describes the behaviour of such a system in terms of reliability. Establishing a comprehensive insight into the probabilistic power system reliability assessment technique could serve as an input for further research and development in the area of power system planning and operation.
Directory of Open Access Journals (Sweden)
Hamdy Mohamed Salem
2018-03-01
Full Text Available This paper considers life-testing experiments and how they are affected by stress factors: namely temperature, electrical load, cycling rate and pressure. A major type of accelerated life test is the step-stress model, which allows the experimenter to increase stress levels beyond normal use during the experiment in order to observe failures. The test items are assumed to follow the Gamma Dual Weibull distribution. Different methods for estimating the parameters are discussed. These include maximum likelihood estimation and confidence interval estimation, which is based on asymptotic normality and generates narrow intervals for the unknown distribution parameters with high probability. The MathCAD (2001) program is used to illustrate the optimal time procedure through numerical examples.
Directory of Open Access Journals (Sweden)
Abeer Abd-Alla EL-Helbawy
2016-09-01
Full Text Available The accelerated life tests provide quick information on the lifetime distributions by testing materials or products at higher than basic conditional levels of stress, such as pressure, high temperature, vibration, voltage or load, to induce failures. In this paper, the acceleration model assumed is the log-linear model. Constant stress tests are discussed based on Type I and Type II censoring. The Kumaraswamy Weibull distribution is used. The estimators of the parameters, reliability, hazard rate functions and p-th percentile at normal condition, low stress, and high stress are obtained. In addition, credible intervals for the parameters of the models are constructed. Optimum test plans are designed. Some numerical studies are used to solve the complicated integrals, such as Laplace and Markov Chain Monte Carlo methods.
International Nuclear Information System (INIS)
Bistouni, Fathollah; Jahanshahi, Mohsen
2015-01-01
Fault-tolerant multistage interconnection networks (MINs) play a vital role in the performance of multiprocessor systems, where reliability evaluation becomes one of the main concerns in analyzing these networks properly. In many cases, the primary objective in system reliability analysis is to compute the failure distribution of the entire system from those of its components. However, since the problem is known to be NP-hard, none of the previous efforts has performed a precise evaluation of the system failure rate. Therefore, our goal is to investigate this parameter for different fault-tolerant MINs using the Weibull life distribution, which is one of the most commonly used distributions in reliability. In this paper, four important groups of fault-tolerant MINs are examined to find the best fault-tolerance techniques in terms of failure rate: (1) Extra-stage MINs, (2) Parallel MINs, (3) Rearrangeable non-blocking MINs, and (4) Replicated MINs. This paper comprehensively analyzes all perspectives of reliability (terminal, broadcast, and network reliability). Moreover, in this study, all reliability equations are calculated for different network sizes. - Highlights: • The failure rate of different MINs is analyzed by using the Weibull life distribution. • This article tries to find the best fault-tolerance technique in the field of MINs. • Complex series-parallel RBDs are used to determine the reliability of the MINs. • All aspects of reliability (i.e. terminal, broadcast, and network) are analyzed. • All reliability equations are calculated for different sizes N×N.
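The series-parallel RBD computation behind such reliability equations can be sketched for identical Weibull components; the topology here (two parallel branches, each a series of three blocks) and the parameters are illustrative, not one of the paper's MINs.

```python
import math

# Component reliability under a Weibull life distribution, plus the two
# reliability-block-diagram composition rules: series multiplies
# reliabilities, parallel multiplies unreliabilities.
def comp_rel(t, beta, eta):
    return math.exp(-((t / eta) ** beta))

def series(rels):
    out = 1.0
    for r in rels:
        out *= r
    return out

def parallel(rels):
    out = 1.0
    for r in rels:
        out *= (1.0 - r)
    return 1.0 - out

def network_rel(t, beta, eta):
    # illustrative topology: two redundant branches of three blocks each
    branch = series([comp_rel(t, beta, eta)] * 3)
    return parallel([branch, branch])
```

Redundancy shows up directly: at any t > 0 the two-branch network is strictly more reliable than a single branch, which is the effect the fault-tolerance comparisons quantify.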
The discrete additive Weibull distribution: A bathtub-shaped hazard for discontinuous failure data
International Nuclear Information System (INIS)
Bebbington, Mark; Lai, Chin-Diew; Wellington, Morgan; Zitikis, Ričardas
2012-01-01
Although failure data are usually treated as being continuous, they may have been collected in a discrete manner, or in fact be discrete in nature. Reliability models with bathtub-shaped hazard rate are fundamental to the concepts of burn-in and maintenance, but how well do they incorporate discrete data? We explore discrete versions of the additive Weibull distribution, which has the twin virtues of mathematical tractability and the ability to produce bathtub-shaped hazard rate functions. We derive conditions on the parameters for the hazard rate function to be increasing, decreasing, or bathtub shaped. While discrete versions may have the same shaped hazard rate for the same parameter values, we find that when fitted to data the fitted hazard rate shapes can vary between versions. Our results are illustrated using several real-life data sets, and the implications of using continuous models for discrete data are discussed.
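One concrete discrete version can be sketched by multiplying two Nakagawa-Osaki type-I discrete Weibull survival functions, S(x) = q^(x^β); this is one of the constructions such a family admits, used here illustratively with made-up parameters.

```python
# Discrete additive Weibull sketch: the survival function is the product
# of two type-I discrete Weibull survival functions,
#   S(x) = q1**(x**b1) * q2**(x**b2),
# and the discrete hazard is h(x) = 1 - S(x+1)/S(x).  Choosing b1 < 1
# (early-failure component) and b2 > 1 (wear-out component) yields a
# bathtub-shaped hazard over the integers.
def disc_surv(x, q1, b1, q2, b2):
    return q1 ** (x ** b1) * q2 ** (x ** b2)

def disc_hazard(x, q1, b1, q2, b2):
    return 1.0 - disc_surv(x + 1, q1, b1, q2, b2) / disc_surv(x, q1, b1, q2, b2)
```

With q1 = 0.5, b1 = 0.5, q2 = 0.999, b2 = 3, the hazard starts high, falls over the first few integers, and rises again as the wear-out term dominates.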
Directory of Open Access Journals (Sweden)
P Bhattacharya
2016-09-01
Full Text Available The wind resource varies with the time of day and the season of the year, and even to some extent from year to year. Wind energy has inherent variability and hence it has been expressed by distribution functions. In this paper, we present some methods for estimating the Weibull parameters in the case of low wind speed characterization, namely the shape parameter (k) and the scale parameter (c), and characterize the discrete wind data sample by the discrete Hilbert transform. The Weibull distribution is an important distribution, especially for reliability and maintainability analysis. Suitable values for both the shape and scale parameters of the Weibull distribution are important for selecting locations for installing wind turbine generators. The scale parameter of the Weibull distribution is also important for determining whether a wind farm is good or not. The use of the discrete Hilbert transform (DHT) for wind speed characterization opens a new avenue for applying the DHT beyond its applications in digital signal processing. In this paper, the discrete Hilbert transform has been applied to characterize the wind sample data measured at the College of Engineering and Management, Kolaghat, East Midnapore, India, in January 2011.
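One of the standard estimators for (k, c), the empirical method of Justus, can be sketched from the sample mean and standard deviation; the simulated "measurements" below are an illustrative stand-in for the Kolaghat data.

```python
import math
import random

# Empirical (Justus) estimators of the Weibull parameters from a wind
# speed sample: k = (sigma/vbar)**-1.086 and c = vbar / Gamma(1 + 1/k).
def justus_estimate(speeds):
    n = len(speeds)
    vbar = sum(speeds) / n
    sigma = (sum((v - vbar) ** 2 for v in speeds) / (n - 1)) ** 0.5
    k = (sigma / vbar) ** -1.086
    c = vbar / math.gamma(1.0 + 1.0 / k)
    return k, c

random.seed(11)
speeds = [9.0 * (-math.log(1.0 - random.random())) ** 0.5
          for _ in range(5000)]  # synthetic Weibull(k=2, c=9) wind speeds
k_j, c_j = justus_estimate(speeds)
```

The -1.086 exponent is an empirical approximation to inverting the Weibull coefficient of variation, accurate across the shape range typical of wind data.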
Shabetia, Alexander; Rodichev, Yurii; Veer, F.A.; Soroka, Elena; Louter, Christian; Bos, Freek; Belis, Jan; Veer, Fred; Nijsse, Rob
An analytical approach based on the sequential partitioning of the data and the Weibull statistical distribution for inhomogeneous, defective materials is proposed. It allows assessing the guaranteed strength of glass structures at a low probability of fracture with a higher degree of
International Nuclear Information System (INIS)
Mohammadi, Kasra; Alavi, Omid; Mostafaeipour, Ali; Goudarzi, Navid; Jalilvand, Mahdi
2016-01-01
Highlights: • Effectiveness of six numerical methods is evaluated to determine wind power density. • The more appropriate method for computing the daily wind power density is identified. • Four windy stations located in the southern part of Alberta, Canada are investigated. • The more appropriate parameter estimation method was not identical among all examined stations. - Abstract: In this study, the effectiveness of six numerical methods is evaluated to determine the shape (k) and scale (c) parameters of the Weibull distribution function for the purpose of calculating the wind power density. The selected methods are the graphical method (GP), empirical method of Justus (EMJ), empirical method of Lysen (EML), energy pattern factor method (EPF), maximum likelihood method (ML) and modified maximum likelihood method (MML). The purpose of this study is to identify the more appropriate method for computing the wind power density at four stations distributed in the Alberta province of Canada, namely Edmonton City Center Awos, Grande Prairie A, Lethbridge A and Waterton Park Gate. To provide a complete analysis, the evaluations are performed on both daily and monthly scales. The results indicate that the precision of the computed wind power density values changes when different parameter estimation methods are used to determine the k and c parameters. The four methods EMJ, EML, EPF and ML offer very favorable efficiency, while the GP method shows weak ability for all stations. However, it is found that the most effective method is not the same across stations, owing to differences in wind characteristics.
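Of the estimators compared above, the empirical method of Justus has the simplest closed form: k from the coefficient of variation, then c from the mean via the gamma-function identity E[v] = c·Γ(1 + 1/k). A sketch, with hypothetical station statistics (not values from the study):

```python
import math

def justus_empirical(mean_v, std_v):
    """Empirical method of Justus (EMJ): k from the coefficient of
    variation, then c from the mean via E[v] = c * Gamma(1 + 1/k)."""
    k = (std_v / mean_v) ** -1.086
    c = mean_v / math.gamma(1.0 + 1.0 / k)
    return k, c

# Hypothetical station statistics (not from the paper):
k, c = justus_empirical(mean_v=6.5, std_v=3.0)
print(round(k, 2), round(c, 2))
```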
International Nuclear Information System (INIS)
Iskandar, Ismed; Gondokaryono, Yudi Satria
2016-01-01
In reliability theory, the most important problem is to determine the reliability of a complex system from the reliability of its components. The weakness of most reliability theories is that the systems are described and explained as simply functioning or failed. In many real situations, the failures may be from many causes depending upon the age and the environment of the system and its components. Another problem in reliability theory is that of estimating the parameters of the assumed failure models. The estimation may be based on data collected over censored or uncensored life tests. In many reliability problems, the failure data are simply quantitatively inadequate, especially in engineering design and maintenance systems. Bayesian analyses are more beneficial than classical ones in such cases. Bayesian estimation allows us to combine past knowledge or experience, in the form of an a priori distribution, with life test data to make inferences about the parameter of interest. In this paper, we have investigated the application of Bayesian estimation to competing risk systems. The cases are limited to models with independent causes of failure, using the Weibull distribution as our model. A simulation is conducted for this distribution with the objectives of verifying the models and the estimators and investigating the performance of the estimators for varying sample sizes. The simulation data are analyzed using both Bayesian and maximum likelihood analyses. The simulation results show that a change in the true value of one parameter relative to another changes the standard deviation in the opposite direction. Given perfect information on the prior distribution, the Bayesian estimation methods are better than those of maximum likelihood. The sensitivity analyses show some amount of sensitivity to shifts of the prior locations. They also show the robustness of the Bayesian analysis within the range
Ben Ali, Jaouher; Chebel-Morello, Brigitte; Saidi, Lotfi; Malinowski, Simon; Fnaiech, Farhat
2015-05-01
Accurate remaining useful life (RUL) prediction of critical assets is an important challenge in condition based maintenance to improve reliability and decrease machine breakdowns and maintenance costs. Bearings are among the most important components in industry; they need to be monitored and their RUL predicted. The challenge of this study is to propose an original feature able to evaluate the health state of bearings and to estimate their RUL by Prognostics and Health Management (PHM) techniques. In this paper, the proposed method is based on the data-driven prognostic approach. The combination of a Simplified Fuzzy Adaptive Resonance Theory Map (SFAM) neural network and the Weibull distribution (WD) is explored. WD is used only in the training phase, to fit measurements and to avoid areas of fluctuation in the time domain. The SFAM training process is based on fitted measurements at present and previous inspection time points as input, whereas the SFAM testing process is based on real measurements at present and previous inspections. Thanks to the fuzzy learning process, SFAM has an important ability and a good performance in learning nonlinear time series. As output, seven classes are defined: healthy bearing and six states of bearing degradation. In order to find the optimal RUL prediction, a smoothing phase is proposed in this paper. Experimental results show that the proposed method can reliably predict the RUL of rolling element bearings (REBs) based on vibration signals. The proposed prediction approach can be applied to the prognostics of various other mechanical assets.
Directory of Open Access Journals (Sweden)
Chris Bambey Guure
2012-01-01
The survival function of the Weibull distribution determines the probability that a unit or an individual will survive beyond a certain specified time, while the failure rate is the rate at which a randomly selected individual known to be alive at time t will die in the interval (t, t + dt). The classical approach for estimating the survival function and the failure rate is the maximum likelihood method. In this study, we strive to determine the best method by comparing the classical maximum likelihood against Bayesian estimators using an informative prior and a proposed data-dependent prior known as the generalised noninformative prior. The Bayesian estimation is considered under three loss functions. Due to the complexity of the integrals involved in the Bayesian estimator, Lindley's approximation procedure is employed to reduce the ratio of the integrals. For the purpose of comparison, the mean squared error (MSE) and the absolute bias are obtained. The study is conducted via simulation utilising different sample sizes. We observed that the generalised prior we assumed performed better than the others under the linear exponential loss function with respect to MSE and under the general entropy loss function with respect to absolute bias.
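The survival function and failure (hazard) rate discussed above have closed forms for the Weibull; a minimal sketch (parameter values are illustrative):

```python
import math

def weibull_survival(t, k, lam):
    """Survival function S(t) = P(T > t) = exp(-(t/lam)**k)."""
    return math.exp(-((t / lam) ** k))

def weibull_hazard(t, k, lam):
    """Failure (hazard) rate h(t) = f(t)/S(t) = (k/lam)*(t/lam)**(k-1)."""
    return (k / lam) * (t / lam) ** (k - 1)

# The hazard decreases for k < 1, is constant for k = 1 (exponential),
# and increases for k > 1:
for k in (0.5, 1.0, 2.0):
    print(k, weibull_hazard(2.0, k, 1.0) > weibull_hazard(1.0, k, 1.0))
```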
Power Loss Analysis for Wind Power Grid Integration Based on Weibull Distribution
Directory of Open Access Journals (Sweden)
Ahmed Al Ameri
2017-04-01
The growth of electrical demand increases the need for renewable energy sources, such as wind energy, to meet that need. Electrical power losses are an important factor when wind farm location and size are selected. The capitalized cost of constant power losses during the life of a wind farm can reach high levels. During the operation period, a method to determine whether the losses meet the design requirements is greatly needed. This article presents a Simulink simulation of wind farm integration into the grid; the aim is to achieve a better understanding of the impact of wind variation on grid losses. The real power losses are set as a function of the annual variation, considering a Weibull distribution. An analytical method has been used to select the size and placement of a wind farm, taking into account active power loss reduction. It proposes a fast linear model estimation to find the optimal capacity of a wind farm based on DC power flow and graph theory. The results show that the analytical approach is capable of predicting the optimal size and location of wind turbines. Furthermore, it revealed that the annual variation of wind speed can have a strong effect on real power loss calculations. In addition to helping to improve utility efficiency, the proposed method can support specific designs that speed up the integration of wind farms into grids.
International Nuclear Information System (INIS)
Li, M.; Li, X.
2005-01-01
The probabilistic distribution of wind speed is one of the important wind characteristics for the assessment of wind energy potential and for the performance of wind energy conversion systems, as well as for the structural and environmental design and analysis. In this study, an exponential family of distribution functions has been developed for the description of the probabilistic distribution of wind speed, and comparison with the wind speed data taken from different sources and measured at different geographical locations in the world has been made. This family of distributions is developed by introducing a pre-exponential term to the theoretical distribution derived from the Maximum Entropy Principle (MEP). The statistical analysis parameter based on the wind power density is used as the suitability judgement for the distribution functions. It is shown that the MEP-type distributions not only agree better with a variety of the measured wind speed data than the conventionally used empirical Weibull distribution, but also can represent the wind power density much more accurately. Therefore, the MEP-type distributions are more suitable for the assessment of the wind energy potential and the performance of wind energy conversion systems. (author)
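For distributions judged by how well they represent wind power density, note that a Weibull fit implies the mean power density in closed form, since E[v^3] = c^3·Γ(1 + 3/k). A sketch (the air density default and the k, c values are assumptions for illustration):

```python
import math

def weibull_power_density(k, c, rho=1.225):
    """Mean wind power density (W/m^2) implied by a Weibull(k, c) fit,
    using E[v^3] = c**3 * Gamma(1 + 3/k); rho is air density (kg/m^3)."""
    return 0.5 * rho * c ** 3 * math.gamma(1.0 + 3.0 / k)

# Illustrative: a site fitted with k = 2.0, c = 7.0 m/s.
print(round(weibull_power_density(2.0, 7.0), 1))
```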
Janković, Bojan
2009-10-01
The decomposition process of sodium bicarbonate (NaHCO3) has been studied by thermogravimetry under isothermal conditions at four operating temperatures (380 K, 400 K, 420 K, and 440 K). It was found that the experimental integral and differential conversion curves at the different operating temperatures can be successfully described by the isothermal Weibull distribution function with a unique value of the shape parameter (β = 1.07). It was also established that the Weibull distribution parameters (β and η) are independent of the operating temperature. Using the integral and differential (Friedman) isoconversional methods, in the conversion range 0.20 ≤ α ≤ 0.80, the apparent activation energy (Ea) was approximately constant (Ea,int = 95.2 kJ mol^-1 and Ea,diff = 96.6 kJ mol^-1, respectively). The values of Ea calculated by both isoconversional methods are in good agreement with the value of Ea evaluated from the Arrhenius equation (94.3 kJ mol^-1), which was expressed through the scale distribution parameter (η). The Málek isothermal procedure was used to estimate the kinetic model of the investigated decomposition process. It was found that the two-parameter Šesták-Berggren (SB) autocatalytic model best describes the NaHCO3 decomposition process, with conversion function f(α) = α^0.18 (1 − α)^1.19. It was also concluded that the calculated density distribution functions of the apparent activation energies (ddfEa's) do not depend on the operating temperature and exhibit highly symmetrical behavior (shape factor = 1.00). The isothermal decomposition results were compared with the corresponding results for the nonisothermal decomposition of NaHCO3.
Energy Technology Data Exchange (ETDEWEB)
Cardil Forradellas, A.; Molina Terrén, D.M.; Oliveres, J.; Castellnou, M.
2016-07-01
Aim of study: In this study we compare the accuracy of three bivariate distributions: Johnson’s SBB, Weibull-2P and LL-2P functions for characterizing the joint distribution of tree diameters and heights. Area of study: North-West of Spain. Material and methods: Diameter and height measurements of 128 plots of pure and even-aged Tasmanian blue gum (Eucalyptus globulus Labill.) stands located in the North-west of Spain were considered in the present study. The SBB bivariate distribution was obtained from SB marginal distributions using a Normal Copula based on a four-parameter logistic transformation. The Plackett Copula was used to obtain the bivariate models from the Weibull and Logit-logistic univariate marginal distributions. The negative logarithm of the maximum likelihood function was used to compare the results and the Wilcoxon signed-rank test was used to compare the related samples of these logarithms calculated for each sample plot and each distribution. Main results: The best results were obtained by using the Plackett copula and the best marginal distribution was the Logit-logistic. Research highlights: The copulas used in this study have shown a good performance for modeling the joint distribution of tree diameters and heights. They could be easily extended for modelling multivariate distributions involving other tree variables, such as tree volume or biomass. (Author)
A study of two estimation approaches for parameters of Weibull distribution based on WPP
International Nuclear Information System (INIS)
Zhang, L.F.; Xie, M.; Tang, L.C.
2007-01-01
Least-squares estimation (LSE) based on the Weibull probability plot (WPP) is the most basic method for estimating the Weibull parameters. The common procedure is to use the least-squares regression of Y on X, i.e. minimizing the sum of squares of the vertical residuals, to fit a straight line to the data points on the WPP and then calculate the LS estimators. This method is known to be biased. In the existing literature, the least-squares regression of X on Y, i.e. minimizing the sum of squares of the horizontal residuals, has also been used by Weibull researchers. This motivated us to carry out a comparison between the estimators of the two LS regression methods using intensive Monte Carlo simulations. Both complete and censored data are examined. Surprisingly, the results show that LS Y on X performs better for small, complete samples, while LS X on Y performs better in other cases in terms of bias of the estimators. The two methods are also compared in terms of other model statistics. In general, when the shape parameter is less than one, LS Y on X provides a better model; otherwise, LS X on Y tends to be better.
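Both regressions compared above work on the same plot coordinates X = ln t, Y = ln(−ln(1 − F)). A sketch of the two LS directions, assuming Benard's median-rank plotting positions (the paper's exact plotting positions are not stated here, so that choice is an assumption):

```python
import math
import random

def wpp_points(sample):
    """WPP coordinates X = ln t, Y = ln(-ln(1 - F)), with median-rank
    plotting positions F_i = (i - 0.3)/(n + 0.4) (Benard)."""
    n = len(sample)
    xs, ys = [], []
    for i, t in enumerate(sorted(sample), start=1):
        f = (i - 0.3) / (n + 0.4)
        xs.append(math.log(t))
        ys.append(math.log(-math.log(1.0 - f)))
    return xs, ys

def ls_fit(u, v):
    """Ordinary least squares of v on u; returns (slope, intercept)."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    s = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    s /= sum((a - mu) ** 2 for a in u)
    return s, mv - s * mu

def weibull_lse(sample, y_on_x=True):
    """On the WPP the model line is Y = k*X - k*ln(c): Y-on-X reads k
    and c off that line directly; X-on-Y inverts X = Y/k + ln(c)."""
    xs, ys = wpp_points(sample)
    if y_on_x:
        k, a = ls_fit(xs, ys)
    else:
        b, a2 = ls_fit(ys, xs)
        k, a = 1.0 / b, -a2 / b
    return k, math.exp(-a / k)

# Illustrative comparison on one simulated sample (scale 9, shape 2):
random.seed(1)
data = [random.weibullvariate(9.0, 2.0) for _ in range(300)]
print(weibull_lse(data, y_on_x=True), weibull_lse(data, y_on_x=False))
```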
An EOQ model for Weibull distribution deterioration with time-dependent cubic demand and backlogging
Santhi, G.; Karthikeyan, K.
2017-11-01
In this article we introduce an economic order quantity model with Weibull deterioration and a time-dependent cubic demand rate, where the holding cost is a linear function of time. Shortages are allowed in the inventory system and are partially or fully backlogged. The objective of this model is to minimize the total inventory cost by finding the optimal order quantity and cycle length. The proposed model is illustrated by numerical examples, and a sensitivity analysis is performed to study the effect of changes in the parameters on the optimum solutions.
Dependence of Weibull distribution parameters on the CNR threshold in wind lidar data
DEFF Research Database (Denmark)
Gryning, Sven-Erik; Batchvarova, Ekaterina; Floors, Rogier Ralph
2015-01-01
in the boundary layer. Observations from tall towers in combination with observations from a lidar of wind speed up to 600 m are used to study the long-term variability of the wind profile over sub-urban, rural, coastal and marine areas. The variability is expressed in terms of the shape parameter in the Weibull...... over land, both terms are about equally important in the coastal area where the height of the reversal height is low and in the marine conditions, the second term dominates....
Directory of Open Access Journals (Sweden)
Ruben M. Mouangue
2014-05-01
The modeling of the wind speed distribution is of great importance for the assessment of wind energy potential and the performance of wind energy conversion systems. In this paper, the choice between two methods of determining the Weibull parameters shows their influence on the performance of the Weibull distribution. Because of the significant occurrence of calm winds at the site of Ngaoundere airport, we characterize the wind potential using the Weibull distribution with parameters determined by the modified maximum likelihood method. This approach is compared to the Weibull distribution with parameters determined by the maximum likelihood method, and to the hybrid distribution, which is recommended for wind potential assessment of sites having a nonzero probability of calm. Using data provided by the ASECNA Weather Service (Agency for the Safety of Air Navigation in Africa and Madagascar), we evaluate the goodness of fit of the various fitted distributions to the wind speed data using Q-Q plots, Pearson's coefficient of correlation, the mean wind speed, the mean square error, the energy density and its relative error. It appears from the results that the accuracy of the Weibull distribution with parameters determined by the modified maximum likelihood method is higher than that of the others. This approach is then used to estimate the monthly and annual energy production of the site of the Ngaoundere airport. The largest energy contribution is made in March, with 255.7 MWh. It also appears from the results that a wind turbine generator installed on this particular site would not operate for at least half of the time because of the high frequency of calm. For this kind of site, the modified maximum likelihood method proposed by Seguro and Lambert in 2000 is one of the best methods that can be used to determine the Weibull parameters.
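The Seguro-Lambert modified maximum likelihood method iterates the ML equation for k on binned wind speed data, counting only nonzero-speed bins. A sketch of that iteration (the bin layout, starting value, and stopping rule are assumptions of this sketch, not prescriptions from the paper):

```python
import math

def weibull_mml(speeds, freqs, k0=2.0, tol=1e-6, max_iter=200):
    """Modified maximum likelihood fit of Weibull k and c from wind
    speeds binned into (bin-centre speed, relative frequency) pairs.

    speeds -- bin-centre wind speeds v_i (> 0)
    freqs  -- relative frequency of each bin, f(v_i)
    """
    fz = sum(freqs)  # relative frequency of observing v > 0
    k = k0
    for _ in range(max_iter):
        num = sum(v ** k * math.log(v) * f for v, f in zip(speeds, freqs))
        den = sum(v ** k * f for v, f in zip(speeds, freqs))
        mean_ln = sum(math.log(v) * f for v, f in zip(speeds, freqs)) / fz
        k_new = 1.0 / (num / den - mean_ln)
        if abs(k_new - k) < tol:
            k = k_new
            break
        k = k_new
    c = (sum(v ** k * f for v, f in zip(speeds, freqs)) / fz) ** (1.0 / k)
    return k, c
```

Feeding it frequencies binned from a known Weibull recovers the true parameters up to discretization error, which is a convenient self-check before applying it to measured histograms.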
Directory of Open Access Journals (Sweden)
D. Kidmo Kaoga
2015-07-01
In this study, five numerical Weibull distribution methods, namely the maximum likelihood method, the modified maximum likelihood method (MLM), the energy pattern factor method (EPF), the graphical method (GM), and the empirical method (EM), were explored using hourly synoptic data collected from 1985 to 2013 in the district of Maroua in Cameroon. The performance analysis revealed that the MLM was the most accurate model, followed by the EPF and the GM. Furthermore, the comparison between the wind speed standard deviation predicted by the proposed models and the measured data showed that the MLM has a smaller relative error of -3.33% on average, compared to -11.67% on average for the EPF and -8.86% on average for the GM. As a result, the MLM is recommended for estimating the scale and shape parameters for an accurate and efficient wind energy potential evaluation.
International Nuclear Information System (INIS)
Diaz, Gerardo; Donoso, Eduardo; Varschavsky, Ari
2004-01-01
A statistical analysis was carried out of the distribution of Vickers microhardness values of nickel and aluminum atom precipitates from a solid solution of Cu-Ni-Al. Non-isothermal calorimetric curves confirmed the formation of two types of precipitates: NiAl from 45 K to 600 K, and Ni3Al from 650 K to 800 K. The microhardness measurements were made at room temperature on the previously quenched material submitted to isothermal and isochronal annealing treatments. A lower dispersion in the distribution of the Vickers microhardness values of the NiAl precipitate over the entire formation temperature range was determined, with a lower average microhardness than the Ni3Al precipitate. The Weibull moduli were estimated from the respective Weibull diagrams. The lower dispersion was confirmed by the elevated values of the Weibull moduli. The maximum average microhardness attained by the NiAl phase was 148, with a Weibull modulus of 26 and an annealing temperature of 553 K maintained for 40 minutes. The Ni3Al reached a maximum average microhardness of 248, with a Weibull modulus of 10 and an annealing temperature of 793 K during 40 minutes (CW)
International Nuclear Information System (INIS)
Guilani, Pedram Pourkarim; Azimi, Parham; Niaki, S.T.A.; Niaki, Seyed Armin Akhavan
2016-01-01
The redundancy allocation problem (RAP) is a useful method to enhance system reliability. In most works involving RAP, the failure rates of the system components are assumed to follow either exponential or k-Erlang distributions. In real-world problems, however, many systems have components with increasing failure rates: as time passes, the failure rates of the system components increase in comparison to their initial failure rates. In this paper, the redundancy allocation problem of a series-parallel system with components having an increasing failure rate based on the Weibull distribution is investigated. An optimization method via simulation is proposed for modeling, and a genetic algorithm is developed to solve the problem. - Highlights: • The redundancy allocation problem of a series-parallel system is addressed. • Components possess an increasing failure rate based on the Weibull distribution. • An optimization method via simulation is proposed for modeling. • A genetic algorithm is developed to solve the problem.
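The "optimization via simulation" approach rests on estimating system reliability by Monte Carlo. A minimal sketch for a series system of parallel (redundant) subsystems with Weibull components, where shape > 1 gives the increasing failure rate the paper assumes (the layout and parameter values are illustrative, not the paper's test problem):

```python
import random

def system_reliability(t, subsystems, trials=20000, seed=7):
    """Monte Carlo estimate of the reliability at mission time t of a
    series system of parallel subsystems; each component lifetime is
    Weibull(scale, shape), so shape > 1 means an increasing hazard.

    subsystems -- one list of (scale, shape) pairs per parallel block."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(trials):
        if all(any(rng.weibullvariate(a, b) > t for a, b in comps)
               for comps in subsystems):
            ok += 1
    return ok / trials

# Illustrative: two subsystems, each with two redundant components.
layout = [[(100.0, 1.5), (100.0, 1.5)], [(100.0, 1.5), (100.0, 1.5)]]
print(system_reliability(50.0, layout))
```

For this layout the exact value is (1 − (1 − e^(−0.5^1.5))^2)^2 ≈ 0.83, which the simulation should reproduce to within Monte Carlo noise.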
Abas, Norzaida; Daud, Zalina M.; Yusof, Fadhilah
2014-11-01
A stochastic rainfall model is presented for the generation of hourly rainfall data in an urban area in Malaysia. In view of the high temporal and spatial variability of rainfall within the tropical rain belt, the Spatial-Temporal Neyman-Scott Rectangular Pulse model was used. The model, which is governed by the Neyman-Scott process, employs a reasonable number of parameters to represent the physical attributes of rainfall. A common approach is to attach each attribute to a mathematical distribution. With respect to rain cell intensity, this study proposes the use of a mixed exponential distribution. The performance of the proposed model was compared to a model that employs the Weibull distribution. Hourly and daily rainfall data from four stations in the Damansara River basin in Malaysia were used as input to the models, and simulations of hourly series were performed for an independent site within the basin. The performance of the models was assessed based on how closely the statistical characteristics of the simulated series resembled the statistics of the observed series. The findings obtained based on graphical representation revealed that the statistical characteristics of the simulated series for both models compared reasonably well with the observed series. However, a further assessment using the AIC, BIC and RMSE showed that the proposed model yields better results. The results of this study indicate that for tropical climates, the proposed model, using a mixed exponential distribution, is the best choice for generation of synthetic data for ungauged sites or for sites with insufficient data within the limit of the fitted region.
Directory of Open Access Journals (Sweden)
Asoke Kumar Bhunia
2014-06-01
In this paper, an attempt is made to develop two inventory models for deteriorating items with variable demand dependent on the selling price and the frequency of advertisement of the items. In the first model, shortages are not allowed, whereas in the second they are allowed and partially backlogged with a variable rate dependent on the duration of the waiting time up to the arrival of the next lot. In both models, the deterioration rate follows a three-parameter Weibull distribution, and the transportation cost is considered explicitly for replenishing the order quantity. This cost depends on the lot size as well as the distance from the source to the destination. The corresponding models have been formulated and solved. Two numerical examples are considered to illustrate the results, and their significant features are discussed. Finally, based on these examples, the effects of different parameters on the initial stock level, shortage level (in the case of the second model only) and cycle length, along with the optimal profit, have been studied by sensitivity analyses taking one parameter at a time and keeping the others fixed.
Jeffrey H. Gove
2003-01-01
Many of the most popular sampling schemes used in forestry are probability proportional to size methods. These methods are also referred to as size biased because sampling is actually from a weighted form of the underlying population distribution. Length- and area-biased sampling are special cases of size-biased sampling where the probability weighting comes from a...
Steve P. Verrill; David E. Kretschmann; James W. Evans
2016-01-01
Two important wood properties are stiffness (modulus of elasticity, MOE) and bending strength (modulus of rupture, MOR). In the past, MOE has often been modeled as a Gaussian and MOR as a lognormal or a two- or three-parameter Weibull. It is well known that MOE and MOR are positively correlated. To model the simultaneous behavior of MOE and MOR for the purposes of wood...
Energy Technology Data Exchange (ETDEWEB)
Gabriel Filho, Luis Roberto Almeida [Universidade Estadual Paulista (CE/UNESP), Tupa, SP (Brazil). Coordenacao de Estagio; Cremasco, Camila Pires [Faculdade de Tecnologia de Presidente Prudente, SP (Brazil); Seraphim, Odivaldo Jose [Universidade Estadual Paulista (FCA/UNESP), Botucatu, SP (Brazil). Fac. de Ciencias Agronomicas; Cagnon, Jose Angelo [Universidade Estadual Paulista (FEB/UNESP), Bauru, SP (Brazil). Faculdade de Engenharia
2008-07-01
The wind behavior of a region can be described by frequency distributions that provide the information and characteristics needed for a possible deployment of wind energy harvesting in the region. These characteristics, such as the annual average speed, the variance and standard deviation of the recorded speeds, and the hourly average wind power density, can be obtained from the frequency of occurrence of a given speed, which in turn must be studied through analytical expressions. The analytical function best suited to wind distributions is the Weibull density function, which can be determined by numerical methods and linear regressions. Once this function has been determined, all the wind characteristics mentioned above can be determined accurately. The objective of this work is to characterize the wind behavior in the region of Botucatu-SP and to determine the energy potential for the installation of wind turbines. For the present research, a Young Wind Monitor anemometer from the Campbell company installed at a height of 10 meters was used. The experiment was developed at the Nucleus of Alternative and Renewable Energies (NEAR) of the Agricultural Energy Laboratory of the Department of Agricultural Engineering of UNESP, Faculty of Agronomic Sciences, Lageado Experimental Farm, located in the city of Botucatu - SP. The geographic location is defined by the coordinates 22 deg 51' South latitude (S) and 48 deg 26' West longitude (W) and an average altitude of 786 meters above sea level. The analysis was carried out using wind speed records from September 2004 to September 2005. After the frequency distribution of the hourly average wind speed was determined, the associated Weibull function was determined, thus making possible the determination of the annual average wind speed (2.77 m/s), of the standard deviation of the recorded speeds (0.55 m/s), of the
Directory of Open Access Journals (Sweden)
Jianwei Yang
2016-06-01
In order to address the reliability assessment of a braking system component of high-speed electric multiple units, this article, based on the two-parameter exponential distribution, provides maximum likelihood estimation and Bayes estimation under a type-I life test. First of all, we evaluate the failure probability value according to the classical estimation method and then obtain the maximum likelihood estimates of the parameters of the two-parameter exponential distribution using a modified likelihood function. On the other hand, based on Bayesian theory, this article selects the beta and gamma distributions as the prior distributions, combines them with the modified maximum likelihood function, and applies a Markov chain Monte Carlo algorithm to parameter assessment based on the Bayes estimation method for the two-parameter exponential distribution, so that two reliability mathematical models of the electromagnetic valve are obtained. Finally, through the type-I life test, the failure rates according to maximum likelihood estimation and the Bayes estimation method based on the Markov chain Monte Carlo algorithm are, respectively, 2.650 × 10−5 and 3.037 × 10−5. Compared with the known failure rate of the electromagnetic valve, 3.005 × 10−5, this shows that the Bayes method can use a Markov chain Monte Carlo algorithm to estimate reliability for the two-parameter exponential distribution, and that the Bayes estimate is closer to the valve's actual value. By fully integrating multi-source information, the Bayes estimation method can better refine and more precisely estimate the parameters, which can provide a theoretical basis for the safe operation of high-speed electric multiple units.
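For the two-parameter exponential model used above, the classical (maximum likelihood) point estimates have a simple closed form. A minimal sketch on illustrative data (not the paper's life-test measurements):

```python
import random

def two_param_exp_mle(sample):
    """MLEs for the two-parameter exponential
    f(t) = (1/theta) * exp(-(t - mu)/theta), t >= mu:
    the location mu is the sample minimum and the scale theta is
    the mean excess over that minimum."""
    mu = min(sample)
    theta = sum(sample) / len(sample) - mu
    return mu, theta

# Illustrative lifetimes with true mu = 5, theta = 2 (rate 0.5).
rng = random.Random(3)
lifetimes = [5.0 + rng.expovariate(0.5) for _ in range(1000)]
print(two_param_exp_mle(lifetimes))
```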
Estimation for a Weibull accelerated life testing model
International Nuclear Information System (INIS)
Glaser, R.E.
1984-01-01
It is sometimes reasonable to assume that the lifetime distribution of an item belongs to a certain parametric family, and that actual parameter values depend upon the testing environment of the item. In the two-parameter Weibull family setting, suppose both the shape and scale parameters are expressible as functions of the testing environment. For various models of functional dependency on environment, maximum likelihood methods are used to estimate characteristics of interest at specified environmental levels. The methodology presented handles exact, censored, and grouped data. A detailed accelerated life testing analysis of stress-rupture data for Kevlar/epoxy composites is given. 10 references, 1 figure, 2 tables
Steve P. Verrill; James W. Evans; David E. Kretschmann; Cherilyn A. Hatfield
2012-01-01
Two important wood properties are stiffness (modulus of elasticity or MOE) and bending strength (modulus of rupture or MOR). In the past, MOE has often been modeled as a Gaussian and MOR as a lognormal or a two- or three-parameter Weibull. It is well known that MOE and MOR are positively correlated. To model the simultaneous behavior of MOE and MOR for the purposes of...
Directory of Open Access Journals (Sweden)
Ingo W Nader
Full Text Available Parameters of the two-parameter logistic model are generally estimated via the expectation-maximization algorithm, which improves initial values for all parameters iteratively until convergence is reached. Effects of initial values are rarely discussed in item response theory (IRT), but initial values were recently found to affect item parameters when estimating the latent distribution with full non-parametric maximum likelihood. However, this method is rarely used in practice. Hence, the present study investigated effects of initial values on item parameter bias and on recovery of item characteristic curves in BILOG-MG 3, a widely used IRT software package. Results showed notable effects of initial values on item parameters. For tighter convergence criteria, effects of initial values decreased, but item parameter bias increased, and the recovery of the latent distribution worsened. For practical application, it is advised to use the BILOG default convergence criterion with appropriate initial values when estimating the latent distribution from data.
Probability Distribution Function of the Upper Equatorial Pacific Current Speeds
National Research Council Canada - National Science Library
Chu, Peter C
2005-01-01
...), constructed from hourly ADCP data (1990-2007) at six stations for the Tropical Atmosphere Ocean project satisfies the two-parameter Weibull distribution reasonably well with different characteristics between El Nino and La Nina events...
Weibull Model Allowing Nearly Instantaneous Failures
Directory of Open Access Journals (Sweden)
C. D. Lai
2007-01-01
expressed as a mixture of the uniform distribution and the Weibull distribution. Properties of the resulting distribution are derived; in particular, the probability density function, survival function, and the hazard rate function are obtained. Some selected plots of these functions are also presented. An R script was written to fit the model parameters. An application of the modified model is illustrated.
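A sketch of such a uniform–Weibull mixture — probability density, survival, and hazard rate — under hypothetical parameter names (mixing weight p, uniform support [0, a], Weibull shape k and scale lam; the paper's exact parametrization may differ):

```python
import math

def mix_pdf(t, p, a, k, lam):
    # mixture: weight p on Uniform(0, a) (nearly instantaneous failures),
    # weight 1 - p on Weibull(k, lam)
    unif = (1.0 / a) if 0.0 <= t <= a else 0.0
    weib = (k / lam) * (t / lam) ** (k - 1) * math.exp(-(t / lam) ** k) if t > 0 else 0.0
    return p * unif + (1.0 - p) * weib

def mix_survival(t, p, a, k, lam):
    s_unif = max(0.0, 1.0 - t / a) if t >= 0 else 1.0
    s_weib = math.exp(-(t / lam) ** k) if t >= 0 else 1.0
    return p * s_unif + (1.0 - p) * s_weib

def mix_hazard(t, p, a, k, lam):
    # hazard rate = density / survival
    return mix_pdf(t, p, a, k, lam) / mix_survival(t, p, a, k, lam)
```

The early spike in the hazard from the uniform component is what lets the model capture "nearly instantaneous" failures.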
Aragao, Glaucia M F; Corradini, Maria G; Normand, Mark D; Peleg, Micha
2007-11-01
Published survival curves of Escherichia coli in two growth media, with and without the presence of salt, at various temperatures and in a Greek eggplant salad having various levels of essential oil, all had a characteristic downward concavity when plotted on semilogarithmic coordinates. Some also exhibited what appeared as a 'shoulder' of considerable length. Regardless of whether a shoulder was noticed, the survival pattern could be considered as a manifestation of an underlying unimodal distribution of the cells' death times. Mathematically, the data could be described equally well by the Weibull and lognormal distribution functions, which had similar modes, means, standard deviations and coefficients of skewness. When plotted in their probability density function (PDF) form, the curves also appeared very similar visually. This enabled us to quantify and compare the effect of temperature or essential oil concentration on the organism's survival in terms of these temporal distributions' characteristics. Increased lethality was generally expressed in a shorter mean and mode, a smaller standard deviation and increased overall symmetry as judged by the distributions' degree of skewness. The 'shoulder', as expected, simply indicated that the distribution's standard deviation was much smaller than its mode. Rate models based on the two distribution functions could be used to predict non-isothermal survival patterns. They were derived on the assumption that the momentary inactivation rate is the isothermal rate at the momentary temperature at a time that corresponds to the momentary survival ratio. In this application, however, the Weibullian model with a fixed power was not only simpler and more convenient mathematically than the one based on the lognormal distribution, but it also provided more accurate estimates of the dynamic inactivation patterns.
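The Weibullian semilogarithmic model and the distribution characteristics (mode, mean, standard deviation) compared in the abstract can be sketched as follows — a minimal illustration with generic parameter names, not the authors' fitted values:

```python
import math

def log10_survival(t, b, n):
    """Weibullian ('power-law') survival model in semilogarithmic form:
    log10 S(t) = -b * t**n (downward concave on semilog axes when n > 1)."""
    return -b * t ** n

# characteristics of an underlying Weibull distribution of death times
# (shape k, scale lam)
def weibull_mode(k, lam):
    return lam * ((k - 1.0) / k) ** (1.0 / k) if k > 1 else 0.0

def weibull_mean(k, lam):
    return lam * math.gamma(1.0 + 1.0 / k)

def weibull_std(k, lam):
    return lam * math.sqrt(math.gamma(1.0 + 2.0 / k) - math.gamma(1.0 + 1.0 / k) ** 2)
```

A long 'shoulder' corresponds, as the abstract notes, to a standard deviation much smaller than the mode.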
Liu, Jinping; Tang, Zhaohui; Xu, Pengfei; Liu, Wenzhong; Zhang, Jin; Zhu, Jianyong
2016-06-29
The topic of online product quality inspection (OPQI) with smart visual sensors is attracting increasing interest in both the academic and industrial communities on account of the natural connection between the visual appearance of products and their underlying qualities. Visual images captured from granulated products (GPs), e.g., cereal products or fabric textiles, are composed of a large number of independent particles or stochastically stacked, locally homogeneous fragments, whose analysis and understanding remain challenging. A method of image-statistical-modeling-based OPQI for GP quality grading and monitoring is presented, using a Weibull distribution (WD) model with a semi-supervised learning classifier. WD-model parameters (WD-MPs) of the spatial structures of GP images, obtained with omnidirectional Gaussian derivative filtering (OGDF) and demonstrated theoretically to obey a specific WD model of integral form, were extracted as the visual features. Then, a co-training-style semi-supervised classifier algorithm, named COSC-Boosting, was exploited for semi-supervised GP quality grading, integrating two independent classifiers of complementary nature in the face of scarce labeled samples. The effectiveness of the proposed OPQI method was verified in the field of automated rice quality grading, where it showed superior performance compared with commonly used methods, laying a foundation for the quality control of GPs on assembly lines.
Generalized renewal process for repairable systems based on finite Weibull mixture
International Nuclear Information System (INIS)
Veber, B.; Nagode, M.; Fajdiga, M.
2008-01-01
Repairable systems can be brought to one of possible states following a repair. These states are: 'as good as new', 'as bad as old' and 'better than old but worse than new'. The probabilistic models traditionally used to estimate the expected number of failures account for the first two states, but they do not properly apply to the last one, which is more realistic in practice. In this paper, a probabilistic model that is applicable to all three after-repair states, called the generalized renewal process (GRP), is applied. Simplistically, GRP addresses the repair assumption by introducing the concept of virtual age into the stochastic point processes to enable them to represent the full spectrum of repair assumptions. The shape of measured or design life distributions of systems can vary considerably, and therefore frequently cannot be approximated by simple distribution functions. The scope of the paper is to prove that a finite Weibull mixture, with positive component weights only, can be used as the underlying distribution of the time to first failure (TTFF) of the GRP model, on condition that the unknown parameters can be estimated. To support the main idea, three examples are presented. In order to estimate the unknown parameters of the GRP model with an m-fold Weibull mixture, the EM algorithm is applied. The GRP model with an m-component mixture distribution is compared to the standard GRP model based on the two-parameter Weibull distribution by calculating the expected number of failures. It can be concluded that the suggested GRP model with a Weibull mixture with an arbitrary but finite number of components is suitable for predicting failures based on the past performance of the system.
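The virtual-age idea can be sketched with a Monte Carlo estimate of the expected number of failures under the Kijima Type-I form (a common GRP variant; here with a single Weibull TTFF rather than the paper's Weibull mixture, and a hypothetical repair-effectiveness parameter q):

```python
import math
import random

def simulate_grp_failures(k, lam, q, horizon, n_runs=20000, seed=1):
    """Monte Carlo estimate of the expected number of failures on [0, horizon]
    for a GRP with Weibull(k, lam) time to first failure and Kijima Type-I
    virtual age: after each repair, v_i = v_{i-1} + q * x_i, where q is the
    repair effectiveness (q=0: 'as good as new', q=1: 'as bad as old')."""
    rng = random.Random(seed)
    total = 0
    for _ in range(n_runs):
        t = v = 0.0
        while True:
            u = rng.random()
            # inverse-CDF sample of the next inter-failure time,
            # conditional on the current virtual age v
            x = lam * ((v / lam) ** k - math.log(u)) ** (1.0 / k) - v
            t += x
            if t > horizon:
                break
            v += q * x
            total += 1
    return total / n_runs
```

With q = 0 and k = 1 the process reduces to an ordinary Poisson/renewal process, which provides a useful sanity check.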
Weibull and lognormal Taguchi analysis using multiple linear regression
International Nuclear Information System (INIS)
Piña-Monarrez, Manuel R.; Ortiz-Yañez, Jesús F.
2015-01-01
The paper provides reliability practitioners with a method (1) to estimate the robust Weibull family when the Taguchi method (TM) is applied, (2) to estimate the normal operational Weibull family in an accelerated life testing (ALT) analysis to give confidence to the extrapolation and (3) to perform the ANOVA analysis on both the robust and the normal operational Weibull family. On the other hand, because the Weibull distribution neither has the normal additive property nor has a direct relationship with the normal parameters (µ, σ), in this paper the issues of estimating a Weibull family by using a design of experiments (DOE) are first addressed by using an L_9(3^4) orthogonal array (OA) in both the TM and the Weibull proportional hazard model approach (WPHM). Then, by using the Weibull/Gumbel and the lognormal/normal relationships and multiple linear regression, the direct relationships between the Weibull and the lifetime parameters are derived and used to formulate the proposed method. Moreover, since the derived direct relationships always hold, the method is generalized to the lognormal and ALT analysis. Finally, the method's efficiency is shown through its application to the used OA and to a set of ALT data. - Highlights: • It gives the statistical relations and steps to use the Taguchi Method (TM) to analyze Weibull data. • It gives the steps to determine the unknown Weibull family for both the robust TM setting and the normal ALT level. • It gives a method to determine the expected lifetimes and to perform the ANOVA analysis in TM and ALT analysis. • It gives a method to give confidence to the extrapolation in an ALT analysis by using the Weibull family of the normal level.
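The Weibull/Gumbel linear-regression link that the method exploits can be sketched with ordinary median-rank regression on a Weibull plot (Bernard's median-rank approximation assumed; a generic illustration, not the authors' exact procedure):

```python
import math

def weibull_mrr(failure_times):
    """Median-rank regression: fit the linearized Weibull CDF
    ln(-ln(1 - F)) = beta * ln(t) - beta * ln(eta)
    by least squares, returning shape beta and scale eta."""
    t = sorted(failure_times)
    n = len(t)
    xs, ys = [], []
    for i, ti in enumerate(t, start=1):
        f = (i - 0.3) / (n + 0.4)  # Bernard's median-rank approximation
        xs.append(math.log(ti))
        ys.append(math.log(-math.log(1.0 - f)))
    mx = sum(xs) / n
    my = sum(ys) / n
    beta = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
    eta = math.exp(mx - my / beta)
    return beta, eta
```

Because the transformed points are exactly linear in ln(t), data lying on a true Weibull quantile line are recovered without error.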
Directory of Open Access Journals (Sweden)
Islam Khandaker Dahirul
2016-01-01
Full Text Available This paper explores wind speed distribution using the Weibull probability distribution and Rayleigh distribution methods, which are proven to provide accurate and efficient estimation of energy output for wind energy conversion systems. The two Weibull parameters (shape k and scale c) and the scale parameter of the Rayleigh distribution have been determined based on hourly time-series wind speed data recorded from October 2014 to October 2015 at Saint Martin's island, Bangladesh. This research examines three numerical methods, namely the Graphical Method (GM), the Empirical Method (EM) and the Energy Pattern Factor method (EPF), to estimate the Weibull parameters; the Rayleigh distribution method has also been analyzed throughout the study. The results revealed that the Graphical method, followed by the Empirical method and the Energy Pattern Factor method, was the most accurate and efficient way to determine the values of k and c for approximating the wind speed distribution in terms of power estimation error. The Rayleigh distribution gave the largest power error in the research. The potential for wind energy development in Saint Martin's island, Bangladesh, as found from the data analysis, is also explained in this paper.
Wang, Mingyu; Han, Lijuan; Liu, Shasha; Zhao, Xuebing; Yang, Jinghua; Loh, Soh Kheang; Sun, Xiaomin; Zhang, Chenxi; Fang, Xu
2015-09-01
Renewable energy from lignocellulosic biomass has been deemed an alternative to depleting fossil fuels. In order to improve this technology, we aim to develop robust mathematical models for the enzymatic lignocellulose degradation process. By analyzing 96 groups of previously published and newly obtained lignocellulose saccharification results and fitting them to the Weibull distribution, we discovered that Weibull statistics can accurately predict lignocellulose saccharification data, regardless of the type of substrates, enzymes and saccharification conditions. A mathematical model for enzymatic lignocellulose degradation was subsequently constructed based on Weibull statistics. Further analysis of the mathematical structure of the model and experimental saccharification data showed the significance of the two parameters in this model. In particular, the λ value, defined as the characteristic time, represents the overall performance of the saccharification system. This suggestion was further supported by statistical analysis of experimental saccharification data and analysis of the glucose production levels when λ and n values change. In conclusion, the constructed Weibull statistics-based model can accurately predict lignocellulose hydrolysis behavior, and the λ parameter can be used to assess the overall performance of enzymatic lignocellulose degradation. Advantages and potential applications of the model and the λ value in saccharification performance assessment are discussed. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
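A minimal sketch of a Weibull-type saccharification curve and the role of the characteristic time λ (illustrative function and parameter names, not the authors' fitted model):

```python
import math

def weibull_yield(t, a_max, lam, n):
    """Weibull-type saccharification curve: yield
    Y(t) = a_max * (1 - exp(-(t / lam)**n)).
    lam (the characteristic time) is where Y reaches ~63.2% of a_max,
    independently of the shape parameter n."""
    return a_max * (1.0 - math.exp(-(t / lam) ** n))

y = weibull_yield(24.0, 100.0, 24.0, 0.8)  # evaluated at t = lam
# y = 100 * (1 - 1/e) ≈ 63.2
```

This is why λ can serve as a single-number summary of overall saccharification performance: a smaller λ means the system reaches most of its final yield sooner.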
Weibull statistic analysis of bending strength in the cemented carbide coatings
International Nuclear Information System (INIS)
Yi Yong; Shen Baoluo; Qiu Shaoyu; Li Cong
2003-01-01
The theoretical basis for using Weibull statistics to analyze coating strength has been established: the Weibull distribution is the asymptotic distribution of coating strength as the coating volume increases, provided that the local strengths of the coating are statistically independent. This has been confirmed in the following test of the bending strength of two cemented carbide coatings. The results show that Weibull statistics can be well used to analyze the strength of the two coatings. (authors)
Describing dynamical fluctuations and genuine correlations by Weibull regularity
arXiv
Nayak, Ranjit K.; Sarkisyan-Grinbaum, Edward K.; Tasevsky, Marek
The Weibull parametrization of the multiplicity distribution is used to describe the multidimensional local fluctuations and genuine multiparticle correlations measured by OPAL in the large-statistics $e^{+}e^{-} \to Z^{0} \to \mathrm{hadrons}$ sample. The data are found to be well reproduced by the Weibull model up to higher orders. The Weibull predictions are compared to the predictions of two other models, namely the negative binomial and modified negative binomial distributions, which mostly failed to fit the data. The Weibull regularity, which is found to reproduce the multiplicity distributions along with the genuine correlations, appears to be the optimal model for describing the multiparticle production process.
Directory of Open Access Journals (Sweden)
Ioan Ion
2011-09-01
Full Text Available The management of an investment program in wind power generators must properly evaluate the possibilities offered by the location where the park will be installed and operated: the existing electric networks, access roads, terrain, climate, extreme weather phenomena, the average wind speed, etc. Among the items listed above, the most important is the wind potential of the area, quantified mainly by the multi-annual average wind speed. Evaluation without special measurements can be done based on general information, such as measurements obtained from meteorological stations, using the Weibull distribution for wind speed. When the results of on-site weather measurements are used, the evaluation comes closer to the real multi-annual potential.
Energy Technology Data Exchange (ETDEWEB)
Vallee, T.; Keller, Th. [Ecole Polytech Fed Lausanne, CCLab, CH-1015 Lausanne, (Switzerland); Fourestey, G. [Ecole Polytech Fed Lausanne, IACS, Chair Modeling and Sci Comp, CH-1015 Lausanne, (Switzerland); Fournier, B. [CEA SACLAY ENSMP, DEN, DANS, DMN, SRMA, LC2M, F-91191 Gif Sur Yvette, (France); Correia, J.R. [Univ Tecn Lisbon, Inst Super Tecn, Civil Engn and Architecture Dept, P-1049001 Lisbon, (Portugal)
2009-07-01
The Weibull distribution, used to describe the scaling of strength of materials, has been verified on a wide range of materials and geometries; however, the quality of the fit tended to be poorer towards the upper tail. Based on a previously developed probabilistic strength prediction method for adhesively bonded joints composed of pultruded glass fiber-reinforced polymer (GFRP) adherends, where it was verified that a two-parameter Weibull probabilistic distribution was not able to model accurately the upper tail of a material strength distribution, different improved probabilistic distributions were compared to enhance the quality of strength predictions. The following probabilistic distributions were examined: a two-parameter Weibull (as a reference), an m-fold Weibull, a Grafted Distribution, a Birnbaum-Saunders Distribution and a Generalized Lambda Distribution. The Generalized Lambda Distribution turned out to be the best analytical approximation for the strength data, providing a good fit to the experimental data and leading to more accurate joint strength predictions than the original two-parameter Weibull distribution. It was found that a proper modeling of the upper tail leads to a noticeable increase in the quality of the predictions. (authors)
Chakraborty, Snehasis; Rao, Pavuluri Srinivasa; Mishra, Hari Niwas
2015-10-15
High pressure inactivation of natural microbiota viz. aerobic mesophiles (AM), psychrotrophs (PC), yeasts and molds (YM), total coliforms (TC) and lactic acid bacteria (LAB) in pineapple puree was studied within the experimental domain of 0.1-600 MPa and 30-50 °C with a treatment time up to 20 min. A complete destruction of yeasts and molds was obtained at 500 MPa/50 °C/15 min, whereas no counts were detected for TC and LAB at 300 MPa/30 °C/15 min. A maximum of two log cycle reductions was obtained for YM during pulse pressurization at the severe process intensity of 600 MPa/50 °C/20 min. The Weibull model clearly described the non-linearity of the survival curves during the isobaric period. The tailing effect, as confirmed by the shape parameter (β) of the survival curve, was obtained in the case of YM (β < 1), whereas the opposite trend (β > 1) was observed for the other microbial groups. Analogous to thermal death kinetics, the activation energy (Ea, kJ·mol⁻¹) and the activation volume (Va, mL·mol⁻¹) values were computed further to describe the temperature and pressure dependencies of the scale parameter (δ, min), respectively. A higher δ value was obtained for each microbe at a lower temperature, and it decreased with an increase in pressure. A secondary kinetic model was developed describing the inactivation rate (k, min⁻¹) as a function of pressure (P, MPa) and temperature (T, K), including the dependencies of Ea and Va on P and T, respectively. Copyright © 2015 Elsevier B.V. All rights reserved.
SEMI-COMPETING RISKS ON A TRIVARIATE WEIBULL SURVIVAL MODEL
Directory of Open Access Journals (Sweden)
Jenq-Daw Lee
2008-07-01
Full Text Available A setting of a trivariate survival function using the semi-competing risks concept is proposed, in which a terminal event can only occur after other events. The Stanford Heart Transplant data is reanalyzed using a trivariate Weibull distribution model with the proposed survival function.
Directory of Open Access Journals (Sweden)
Luís R. A Gabriel Filho
2011-02-01
Full Text Available The wind regime of a region can be described by frequency distributions that provide information and characteristics essential for a possible deployment of wind energy capture systems in the region and the resulting applications in rural areas and remote regions. These characteristics, such as the annual average speed, the variance of the recorded speeds and the hourly average wind power density, can be obtained from the frequency of occurrence of a given speed, which in turn should be studied through analytical expressions. The analytical function best suited to wind distributions is the Weibull density function, which can be determined by numerical methods and linear regressions. The objective of this work is to characterize, analytically and geometrically, all the methodological procedures necessary for a complete characterization of the wind regime of a region and its applications in the region of Botucatu - SP, Brazil, in order to determine the energy potential for the installation of wind turbines. It was thus possible to establish theorems related to the characterization of the wind regime, setting out a concise analytical methodology for defining the wind parameters of any region to be studied. A CAMPBELL anemometer was used in the development of this research.
Testing homogeneity in Weibull-regression models.
Bolfarine, Heleno; Valença, Dione M
2005-10-01
In survival studies with families or geographical units, it may be of interest to test whether such groups are homogeneous for given explanatory variables. In this paper we consider score-type tests for group homogeneity based on a mixing model in which the group effect is modelled as a random variable. As opposed to hazard-based frailty models, this model presents survival times that, conditioned on the random effect, have an accelerated failure time representation. The test statistic requires only estimation of the conventional regression model without the random effect and does not require specifying the distribution of the random effect. The tests are derived for a Weibull regression model, and in the uncensored situation a closed form is obtained for the test statistic. A simulation study is used to compare the power of the tests. The proposed tests are applied to real data sets with censored data.
Weibull analysis of fracture test data on bovine cortical bone: influence of orientation.
Khandaker, Morshed; Ekwaro-Osire, Stephen
2013-01-01
The fracture toughness, K_IC, of a cortical bone has been experimentally determined by several researchers. The variation in K_IC values arises from the variation of specimen orientation, shape, and size during the experiment. The fracture toughness of a cortical bone is governed by the severest flaw and, hence, may be analyzed using Weibull statistics. To the best of the authors' knowledge, however, no studies of this aspect have been published. The motivation of the study is the evaluation of the Weibull parameters in the circumferential-longitudinal (CL) and longitudinal-circumferential (LC) directions. We hypothesized that the Weibull parameters vary depending on the bone microstructure. In the present work, a two-parameter Weibull statistical model was applied to the plane-strain fracture toughness of bovine femoral cortical bone obtained using specimens extracted in the CL and LC directions of the bone. It was found that the Weibull modulus of fracture toughness was larger for CL specimens than for LC specimens, but the opposite trend was seen for the characteristic fracture toughness. The reason for these trends lies in the microstructural and extrinsic toughening mechanism differences between the CL and LC directions of the bone. The Weibull parameters found in this study can be applied to develop a damage-mechanics model for bone.
A log-Weibull spatial scan statistic for time to event data.
Usman, Iram; Rosychuk, Rhonda J
2018-06-13
Spatial scan statistics have been used for the identification of geographic clusters of elevated numbers of cases of a condition such as disease outbreaks. These statistics, accompanied by the appropriate distribution, can also identify geographic areas with either longer or shorter times to events. Other authors have proposed spatial scan statistics based on the exponential and Weibull distributions. We propose the log-Weibull as an alternative distribution for the spatial scan statistic for time-to-event data and compare and contrast the log-Weibull and Weibull distributions through simulation studies. The effects of type I differential censoring and the power of the test have been investigated through simulated data. Methods are also illustrated on time to specialist visit data for discharged patients presenting to emergency departments for atrial fibrillation and flutter in Alberta during 2010-2011. We found northern regions of Alberta had longer times to specialist visit than other areas. We proposed the spatial scan statistic for the log-Weibull distribution as a new approach for detecting spatial clusters for time-to-event data. The simulation studies suggest that the test performs well for log-Weibull data.
Gumbel Weibull distribution function for Sahel precipitation ...
African Journals Online (AJOL)
insecurity, migration, social conflicts, etc.). An efficient management of under and over ground water is a ... affects their incomes (Udual and Ini, 2012). Researches on modeling, prediction and forecasting ... Douentza in Mopti region, situated on the national road 15 highway linking Mopti to Gao and Kidal regions. This small ...
International Nuclear Information System (INIS)
Petković, Dalibor; Shamshirband, Shahaboddin; Anuar, Nor Badrul; Saboohi, Hadi; Abdul Wahab, Ainuddin Wahid; Protić, Milan; Zalnezhad, Erfan; Mirhashemi, Seyed Mohammad Amin
2014-01-01
Highlights: • Probabilistic distribution functions of wind speed. • Two-parameter Weibull probability distribution. • To build an effective prediction model of the distribution of wind speed. • Support vector regression application as probability function for wind speed. - Abstract: The probabilistic distribution of wind speed is among the more significant wind characteristics in examining wind energy potential and the performance of wind energy conversion systems. When the wind speed probability distribution is known, the wind energy distribution can be easily obtained. Therefore, the probability distribution of wind speed is a very important piece of information required in assessing wind energy potential. For this reason, a large number of studies have been conducted concerning the use of a variety of probability density functions to describe wind speed frequency distributions. Although the two-parameter Weibull distribution comprises a widely used and accepted method, solving the function is very challenging. In this study, the polynomial and radial basis functions (RBF) are applied as the kernel function of support vector regression (SVR) to estimate two parameters of the Weibull distribution function according to previously established analytical methods. Rather than minimizing the observed training error, SVR_poly and SVR_rbf attempt to minimize the generalization error bound, so as to achieve generalized performance. According to the experimental results, enhanced predictive accuracy and capability of generalization can be achieved using the SVR approach compared to other soft computing methodologies.
Calculation of the ceramics Weibull parameters
Czech Academy of Sciences Publication Activity Database
Fuis, Vladimír; Návrat, Tomáš
2011-01-01
Roč. 58, - (2011), s. 642-647 ISSN 2010-376X. [International Conference on Bioinformatics and Biomedicine 2011. Bali, 26.10.2011-28.10.2011] Institutional research plan: CEZ:AV0Z20760514 Keywords : biomaterial parameters * Weibull statistics * ceramics Subject RIV: BO - Biophysics http://www.waset.org/journals/waset/v58/v58-132.pdf
The distribution choice for the threshold of solid state relay
International Nuclear Information System (INIS)
Sun Beiyun; Zhou Hui; Cheng Xiangyue; Mao Congguang
2009-01-01
Either the normal distribution or the Weibull distribution can be accepted as the sample distribution of the threshold of a solid state relay. By the goodness-of-fit method, the bootstrap method and the Bayesian method, the Weibull distribution is finally chosen. (authors)
A general Bayes weibull inference model for accelerated life testing
International Nuclear Information System (INIS)
Dorp, J. Rene van; Mazzuchi, Thomas A.
2005-01-01
This article presents the development of a general Bayes inference model for accelerated life testing. The failure times at a constant stress level are assumed to belong to a Weibull distribution, but the specification of strict adherence to a parametric time-transformation function is not required. Rather, prior information is used to indirectly define a multivariate prior distribution for the scale parameters at the various stress levels and the common shape parameter. Using the approach, Bayes point estimates as well as probability statements for use-stress (and accelerated) life parameters may be inferred from a host of testing scenarios. The inference procedure accommodates both the interval data sampling strategy and type I censored sampling strategy for the collection of ALT test data. The inference procedure uses the well-known MCMC (Markov Chain Monte Carlo) methods to derive posterior approximations. The approach is illustrated with an example
Two-parameter fracture mechanics: Theory and applications
International Nuclear Information System (INIS)
O'Dowd, N.P.; Shih, C.F.
1993-02-01
A family of self-similar fields provides the two parameters required to characterize the full range of high- and low-triaxiality crack tip states. The two parameters, J and Q, have distinct roles: J sets the size scale of the process zone over which large stresses and strains develop, while Q scales the near-tip stress distribution relative to a high triaxiality reference stress state. An immediate consequence of the theory is this: it is the toughness values over a range of crack tip constraint that fully characterize the material's fracture resistance. It is shown that Q provides a common scale for interpreting cleavage fracture and ductile tearing data thus allowing both failure modes to be incorporated in a single toughness locus. The evolution of Q, as plasticity progresses from small scale yielding to fully yielded conditions, has been quantified for several crack geometries and for a wide range of material strain hardening properties. An indicator of the robustness of the J-Q fields is introduced; Q as a field parameter and as a pointwise measure of stress level is discussed
Weibull-Based Design Methodology for Rotating Structures in Aircraft Engines
Directory of Open Access Journals (Sweden)
Erwin V. Zaretsky
2003-01-01
Full Text Available The NASA Energy-Efficient Engine (E3-Engine) is used as the basis of a Weibull-based life and reliability analysis. Each component's life, and thus the engine's life, is defined by high-cycle fatigue or low-cycle fatigue. Knowing the cumulative life distribution of each of the components making up the engine, as represented by a Weibull slope, is a prerequisite to predicting the life and reliability of the entire engine. As the engine's Weibull slope increases, the predicted life decreases. The predicted engine lives L5 (95% probability of survival) of approximately 17,000 and 32,000 hr do correlate with current engine-maintenance practices without and with refurbishment, respectively. The individual high-pressure turbine (HPT) blade lives necessary to obtain a blade system life L0.1 (99.9% probability of survival) of 9000 hr for Weibull slopes of 3, 6, and 9 are 47,391; 20,652; and 15,658 hr, respectively. For a design life of the HPT disks having probable points of failure equal to or greater than 36,000 hr at a probability of survival of 99.9%, the predicted disk system life L0.1 can vary from 9408 to 24,911 hr.
A Monte Carlo study of Weibull reliability analysis for space shuttle main engine components
Abernethy, K.
1986-01-01
The incorporation of a number of additional capabilities into an existing Weibull analysis computer program, and the results of a Monte Carlo simulation study to evaluate the usefulness of the Weibull methods on samples with a very small number of failures and extensive censoring, are discussed. Since the censoring mechanism inherent in the Space Shuttle Main Engine (SSME) data is hard to analyze, it was decided to use a random censoring model, generating censoring times from a uniform probability distribution. Some of the statistical techniques and computer programs that are used in the SSME Weibull analysis are described. The previously documented methods were supplemented by adding computer calculations of approximate confidence intervals (obtained using iterative methods) for several parameters of interest. These calculations are based on a likelihood ratio statistic which is asymptotically a chi-squared statistic with one degree of freedom. The assumptions built into the computer simulations are described, as are the simulation program and the techniques used in it. Simulation results are tabulated for various combinations of Weibull shape parameters and numbers of failures in the samples.
Theoretical derivation of wind power probability distribution function and applications
International Nuclear Information System (INIS)
Altunkaynak, Abdüsselam; Erdik, Tarkan; Dabanlı, İsmail; Şen, Zekai
2012-01-01
Highlights: ► The wind power stochastic characteristics derived are the standard deviation and the dimensionless skewness. ► Perturbation expressions are obtained for the wind power statistics from the Weibull probability distribution function (PDF). ► Comparisons are made with the corresponding characteristics of the wind speed PDF. ► The wind power abides by the Weibull PDF. -- Abstract: The instantaneous wind power contained in the air current is directly proportional to the cube of the wind speed. In practice, there is a record of wind speeds in the form of a time series. It is, therefore, necessary to develop a formulation that takes into consideration the statistical parameters of such a time series. The purpose of this paper is to derive the general wind power formulation in terms of the statistical parameters by using perturbation theory, which leads to a general formulation of the wind power expectation and other statistical parameter expressions such as the standard deviation and the coefficient of variation. The formulation is very general and can be applied to any wind speed probability distribution function. Its application to the two-parameter Weibull probability distribution of wind speeds is presented in full detail. It is concluded that, provided wind speed is distributed according to a Weibull distribution, the wind power can be derived from wind speed data. It is possible to determine wind power at any desired risk level; in practical studies 5% or 10% risk levels are most often preferred, and the necessary simple procedure is presented for this purpose in this paper.
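As a minimal illustration of the expectation side of such a formulation: for Weibull-distributed speeds, E[v³] = c³Γ(1 + 3/k), so the mean wind power density follows in closed form. The parameter values below are illustrative, not taken from the paper:

```python
from math import gamma

def mean_wind_power_density(k: float, c: float, rho: float = 1.225) -> float:
    """Mean wind power density (W/m^2) for Weibull(k, c) wind speeds:
    E[P] = 0.5 * rho * E[v^3] = 0.5 * rho * c**3 * Gamma(1 + 3/k)."""
    return 0.5 * rho * c ** 3 * gamma(1.0 + 3.0 / k)

# Illustrative parameters only (k = 2, c = 9 m/s, standard air density)
print(round(mean_wind_power_density(2.0, 9.0), 1))  # ~593.6 W/m^2
```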
Directory of Open Access Journals (Sweden)
J. Szymszal
2007-07-01
Full Text Available The first part of the study describes the methods used to determine the Weibull modulus and the related reliability index of hypereutectic silumins containing about 17% Si, assigned for the manufacture of high-duty castings to be used in automotive applications and aviation. The second part of the study discusses the importance of chemical composition, including the additions of 3% Cu, 1.5% Ni and 1.5% Mg, while in the third part attention is focused on the effect of process history, including mould type (sand or metal) as well as the inoculation process and heat treatment (solutioning and ageing) applied to the cast AlSi17Cu3Mg1.5Ni1.5 alloy, on the shape of the Weibull distribution function and the reliability index calculated for the tensile strength Rm of the investigated alloys.
Weibull statistical analysis of Krouse type bending fatigue of nuclear materials
Energy Technology Data Exchange (ETDEWEB)
Haidyrah, Ahmed S., E-mail: ashdz2@mst.edu [Nuclear Engineering, Missouri University of Science & Technology, 301 W. 14th, Rolla, MO 65409 (United States); Nuclear Science Research Institute, King Abdulaziz City for Science and Technology (KACST), P.O. Box 6086, Riyadh 11442 (Saudi Arabia); Newkirk, Joseph W. [Materials Science & Engineering, Missouri University of Science & Technology, 1440 N. Bishop Ave, Rolla, MO 65409 (United States); Castaño, Carlos H. [Nuclear Engineering, Missouri University of Science & Technology, 301 W. 14th, Rolla, MO 65409 (United States)
2016-03-15
A bending fatigue mini-specimen (Krouse-type) was used to study the fatigue properties of nuclear materials. The objective of this paper is to study fatigue of Grade 91 ferritic-martensitic steel using a mini-specimen (Krouse-type) suitable for reactor irradiation studies. These mini-specimens are similar in design (but smaller) to those described in the ASTM B593 standard. The mini-specimen was machined by waterjet and tested as-received. The bending fatigue machine was modified to test the mini-specimen with a specially designed adapter. The cyclic bending fatigue behavior of Grade 91 was studied under constant deflection. The S–N curve was constructed and the mean fatigue life was analyzed. In this study, the Weibull function was used to predict the probability of failure from high to low stress levels: 563, 310 and 265 MPa. The commercial software Minitab 17 was used to calculate the distribution of fatigue life under the different stress levels. Two- and three-parameter Weibull analyses were used to estimate the probability of failure. The plots indicated that the three-parameter Weibull distribution fits the data well.
Directory of Open Access Journals (Sweden)
Abul Kalam Azad
2014-05-01
Full Text Available The best Weibull distribution methods for the assessment of wind energy potential at different altitudes in desired locations are statistically diagnosed in this study. Seven different methods, namely the graphical method (GM), method of moments (MOM), standard deviation method (STDM), maximum likelihood method (MLM), power density method (PDM), modified maximum likelihood method (MMLM) and equivalent energy method (EEM), were used to estimate the Weibull parameters, and six statistical tools, namely relative percentage of error, root mean square error (RMSE), mean percentage of error, mean absolute percentage of error, chi-square error and analysis of variance, were used to precisely rank the methods. The statistical fits of the measured and calculated wind speed data are assessed to judge the performance of the methods. The capacity factor and total energy generated by a small model wind turbine are calculated by numerical integration using the trapezoidal and Simpson's rules. The results show that MOM and MLM are the most efficient methods for determining the values of k and c to fit Weibull distribution curves.
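A sketch of one of the seven estimators, the method of moments (MOM), in its standard textbook form: solve the coefficient-of-variation equation for the shape k by bisection, then recover the scale c from the sample mean (synthetic data, not the study's measurements):

```python
import random
from math import gamma, log
from statistics import mean, stdev

def weibull_mom(xs):
    """Method-of-moments (MOM) estimate of Weibull shape k and scale c:
    solve Gamma(1 + 2/k) / Gamma(1 + 1/k)**2 = 1 + (s/m)**2 for k by
    bisection (the left side decreases in k), then c = m / Gamma(1 + 1/k)."""
    m, s = mean(xs), stdev(xs)
    target = 1.0 + (s / m) ** 2
    lo, hi = 0.1, 50.0
    for _ in range(100):
        k = 0.5 * (lo + hi)
        ratio = gamma(1 + 2 / k) / gamma(1 + 1 / k) ** 2
        lo, hi = (k, hi) if ratio > target else (lo, k)
    return k, m / gamma(1 + 1 / k)

# Synthetic wind speeds from a known Weibull(k=2, c=9) via inverse transform
random.seed(1)
sample = [9.0 * (-log(1.0 - random.random())) ** 0.5 for _ in range(20000)]
k_hat, c_hat = weibull_mom(sample)
print(round(k_hat, 2), round(c_hat, 2))  # close to the true (2, 9)
```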
Directory of Open Access Journals (Sweden)
S. P. Singh
2015-09-01
Full Text Available This paper develops an inventory model for items that deteriorate at a generalized Weibull-distributed rate when demand for the items depends on the selling price. Shortages are not allowed and price inflation is taken into consideration over a finite planning horizon. A brief theoretical analysis of the costs involved is carried out.
Lifetime modelling with a Weibull law: comparison of three Bayesian Methods
International Nuclear Information System (INIS)
Billy, F.; Remy, E.; Bousquet, N.; Celeux, G.
2006-01-01
For a nuclear power plant, being able to estimate the lifetime of important components is strategic, but data are usually insufficient to do so. Thus, it is relevant to use expertise, together with data, in order to assess the value of the lifetime on the grounds of both sources. The Bayesian framework and the choice of a Weibull law to model the random time to replacement are relevant, and have been chosen for this article. Two indicators are computed: the mean lifetime of any component and the mean residual lifetime of a given component after it has been inspected. Three different Bayesian methods are compared on three sets of data. The article shows that the three methods lead to coherent results and that uncertainties are strongly reduced. The method developed around PMC has two main advantages: it models a conditional dependence between the two parameters of the Weibull law, which enables more coherent priors; and it has a parameter that weights the strength of the expertise. This last point is very important for lifetime assessments, because expertise is then used not so much to augment overly small samples as to perform a real extrapolation, far beyond what the data themselves say. (authors)
Schur Convexity of Generalized Heronian Means Involving Two Parameters
Directory of Open Access Journals (Sweden)
Bencze Mihály
2008-01-01
Full Text Available The Schur convexity and Schur-geometric convexity of generalized Heronian means involving two parameters are studied; the main result is then used to obtain several interesting and significant inequalities for generalized Heronian means.
A drawback and an improvement of the classical Weibull probability plot
International Nuclear Information System (INIS)
Jiang, R.
2014-01-01
The classical Weibull Probability Paper (WPP) plot has been widely used to identify a model for fitting a given dataset, based on a match in shape between the WPP plots of the model and of the data. This paper carries out an analysis of the Weibull transformations that create the WPP plot and shows that the shape of the WPP plot of data randomly generated from a distribution model can differ significantly from the shape of the WPP plot of the model itself, due to the high non-linearity of the Weibull transformations. As such, choosing a model based on the shape of the WPP plot of the data can be unreliable. A cdf-based weighted least squares method is proposed to improve the parameter estimation accuracy, and an improved WPP plot is suggested to avoid the drawback of the classical WPP plot. The appropriateness and usefulness of the proposed estimation method and probability plot are illustrated by simulation and real-world examples.
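The classical WPP construction the paper analyzes can be sketched as follows; Bernard's median-rank approximation is assumed for the empirical cdf, and the paper's improved weighted variant is not reproduced here:

```python
import random
from math import exp, log

def wpp_points(failures):
    """Classical WPP coordinates: x = ln(t), y = ln(-ln(1 - F)), with F
    estimated by Bernard's median-rank approximation (i - 0.3)/(n + 0.4)."""
    ts = sorted(failures)
    n = len(ts)
    return [(log(t), log(-log(1.0 - (i - 0.3) / (n + 0.4))))
            for i, t in enumerate(ts, start=1)]

def fit_line(pts):
    """Ordinary least squares y = a*x + b on the WPP points; the slope a
    estimates the Weibull shape and exp(-b/a) the scale."""
    n = len(pts)
    sx = sum(x for x, _ in pts)
    sy = sum(y for _, y in pts)
    sxx = sum(x * x for x, _ in pts)
    sxy = sum(x * y for x, y in pts)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, exp(-b / a)

# Synthetic sample from Weibull(shape=1.5, scale=100) via inverse transform
random.seed(2)
data = [100.0 * (-log(1.0 - random.random())) ** (1.0 / 1.5) for _ in range(500)]
shape, scale = fit_line(wpp_points(data))
print(round(shape, 2), round(scale, 1))  # near the true (1.5, 100)
```

On a finite random sample the plotted points wander around the straight line of the parent model, which is exactly the shape mismatch the paper warns about.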
Weibull modeling of particle cracking in metal matrix composites
International Nuclear Information System (INIS)
Lewis, C.A.; Withers, P.J.
1995-01-01
An investigation into the occurrence of reinforcement cracking within a particulate ZrO2/2618 Al alloy metal matrix composite under tensile plastic straining has been carried out, special attention being paid to the dependence of fracture on particle size and shape. The probability of particle cracking has been modeled using a Weibull approach, giving good agreement with the experimental data. Values for the Weibull modulus and the stress required to crack the particles were found to be within the range expected for the cracking of ceramic particles. Additional information regarding the fracture behavior of the particles was provided by in-situ neutron diffraction monitoring of the internal strains, by measurement of the variation in the composite Young's modulus with straining and by direct observation of the cracked particles. The values of the particle stress required for the initiation of particle cracking deduced from these supplementary experiments were found to be in good agreement with each other and with the results from the Weibull analysis. Further, it is shown that while both the current experiments and the previous work of others can be well described by the Weibull approach, the exact values of the Weibull parameters so deduced are very sensitive to the approximations and assumptions made in constructing the model.
Use of finite mixture distribution models in the analysis of wind energy in the Canarian Archipelago
International Nuclear Information System (INIS)
Carta, Jose Antonio; Ramirez, Penelope
2007-01-01
The statistical characteristics of hourly mean wind speed data recorded at 16 weather stations located in the Canarian Archipelago are analyzed in this paper. As a result of this analysis we see that the typical two-parameter Weibull wind speed distribution (W-pdf) does not accurately represent all wind regimes observed in that region. However, a singly truncated from below Normal-Weibull mixture distribution (TNW-pdf) and a two-component mixture Weibull distribution (WW-pdf) developed here do provide very good fits for both unimodal and bimodal wind speed frequency distributions observed in that region, and offer smaller relative errors in determining the annual mean wind power density. The parameters of the distributions are estimated using the least squares method, which is resolved in this paper using the Levenberg-Marquardt algorithm. The suitability of the distributions is judged from the probability plot correlation coefficient R², adjusted for degrees of freedom. Based on the results obtained, we conclude that the two mixture distributions proposed here provide very flexible models for wind speed studies and can be applied in a widespread manner to represent the wind regimes in the Canarian Archipelago and in other regions with similar characteristics. The TNW-pdf takes into account the frequency of null winds, whereas the WW-pdf and W-pdf do not; it can, therefore, better represent wind regimes with high percentages of null wind speeds. However, calculation of the TNW-pdf is markedly slower.
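A minimal sketch of the two-component mixture (WW-pdf) as a convex combination of Weibull densities; the component parameters below are illustrative, not fitted values from the Canarian data:

```python
from math import exp

def weibull_pdf(v, k, c):
    """Two-parameter Weibull density."""
    return (k / c) * (v / c) ** (k - 1) * exp(-((v / c) ** k))

def ww_pdf(v, w, k1, c1, k2, c2):
    """Two-component Weibull mixture (WW-pdf): a convex combination of two
    Weibull densities, flexible enough to capture bimodal wind regimes."""
    return w * weibull_pdf(v, k1, c1) + (1.0 - w) * weibull_pdf(v, k2, c2)

# Sanity check: the mixture integrates to ~1 (rectangle rule on 0-40 m/s)
dv = 0.01
area = sum(ww_pdf(i * dv, 0.6, 2.0, 5.0, 3.5, 12.0) * dv for i in range(1, 4001))
print(round(area, 3))  # ~1.0
```

In practice the five mixture parameters would be tuned by nonlinear least squares (e.g. Levenberg-Marquardt, as in the paper) against the observed frequency histogram.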
Optimal Two Parameter Bounds for the Seiffert Mean
Directory of Open Access Journals (Sweden)
Hui Sun
2013-01-01
Full Text Available We obtain sharp bounds for the Seiffert mean in terms of a two parameter family of means. Our results generalize and extend the recent bounds presented in the Journal of Inequalities and Applications (2012 and Abstract and Applied Analysis (2012.
Bubbling and bistability in two parameter discrete systems
Ambika, G.; Sujatha, N. V.
2000-01-01
We present a graphical analysis of the mechanisms underlying the occurrences of bubbling sequences and bistability regions in the bifurcation scenario of a special class of one dimensional two parameter maps. The main result of the analysis is that whether it is bubbling or bistability is decided by the sign of the third derivative at the inflection point of the map function.
Energy Technology Data Exchange (ETDEWEB)
Pallocchia, G.; Laurenza, M.; Consolini, G. [INAF—Istituto di Astrofisica e Planetologia Spaziali, Via Fosso del Cavaliere 100, I-00133 Roma (Italy)
2017-03-10
Some interplanetary shocks are associated with short-term and sharp particle flux enhancements near the shock front. Such intensity enhancements, known as shock-spike events (SSEs), represent a class of relatively energetic phenomena as they may extend to energies of some tens of MeV or even beyond. Here we present an SSE case study in order to shed light on the nature of the particle acceleration involved in this kind of event. Our observations refer to an SSE registered on 2011 October 3 at 22:23 UT, by STEREO B instrumentation when, at a heliocentric distance of 1.08 au, the spacecraft was swept by a perpendicular shock moving away from the Sun. The main finding from the data analysis is that a Weibull distribution represents a good fitting function to the measured particle spectrum over the energy range from 0.1 to 30 MeV. To interpret such an observational result, we provide a theoretical derivation of the Weibull spectrum in the framework of the acceleration by “killed” stochastic processes exhibiting power-law growth in time of the velocity expectation, such as the classical Fermi process. We find an overall coherence between the experimental values of the Weibull spectrum parameters and their physical meaning within the above scenario. Hence, our approach based on the Weibull distribution proves to be useful for understanding SSEs. With regard to the present event, we also provide an alternative explanation of the Weibull spectrum in terms of shock-surfing acceleration.
International Nuclear Information System (INIS)
Anon
2009-01-01
In this presentation the author deals with the probabilistic evaluation of product life using the example of the exponential distribution. The exponential distribution is a special one-parameter case of the Weibull distribution.
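The reduction mentioned here is easy to verify numerically: with shape k = 1 the two-parameter Weibull density collapses to the exponential density whose mean equals the scale.

```python
from math import exp

def weibull_pdf(t, k, lam):
    """Two-parameter Weibull density with shape k and scale lam."""
    return (k / lam) * (t / lam) ** (k - 1) * exp(-((t / lam) ** k))

def exponential_pdf(t, lam):
    """Exponential density written with mean lam, i.e. rate 1/lam."""
    return (1.0 / lam) * exp(-t / lam)

# With shape k = 1 the Weibull density reduces exactly to the exponential
for t in (0.5, 1.0, 3.0):
    assert abs(weibull_pdf(t, 1.0, 2.0) - exponential_pdf(t, 2.0)) < 1e-12
print("Weibull with k = 1 matches the exponential distribution")
```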
Comparison of Weibull and Probit Analysis in Toxicity Testing of ...
African Journals Online (AJOL)
Keywords: Hunteria umbellata, Weibull model, Acute toxicity, Median lethal dose (LD50).
Bayesian and Classical Estimation of Stress-Strength Reliability for Inverse Weibull Lifetime Models
Directory of Open Access Journals (Sweden)
Qixuan Bi
2017-06-01
Full Text Available In this paper, we consider the problem of estimating stress-strength reliability for inverse Weibull lifetime models having the same shape parameters but different scale parameters. We obtain the maximum likelihood estimator and its asymptotic distribution. Since the classical estimator does not have an explicit form, we propose an approximate maximum likelihood estimator. The asymptotic confidence interval and two bootstrap intervals are obtained. Using the Gibbs sampling technique, the Bayesian estimator and the corresponding credible interval are obtained; the Metropolis-Hastings algorithm is used to generate random variates. Monte Carlo simulations are conducted to compare the proposed methods. Analysis of a real dataset is performed.
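A Monte Carlo sketch of the stress-strength reliability R = P(X > Y) for inverse Weibull variables, assuming one common parameterization (CDF(x) = exp(-(λ/x)^k)); when the two variables share the shape k, R has the closed form λx^k / (λx^k + λy^k), which gives a handy check:

```python
import random
from math import log

def inv_weibull(k, lam, rng):
    """Inverse Weibull draw under the convention CDF(x) = exp(-(lam/x)**k),
    via inverse transform: x = lam * (-ln U)**(-1/k)."""
    return lam * (-log(rng.random())) ** (-1.0 / k)

rng = random.Random(0)
k, lam_x, lam_y = 2.0, 3.0, 2.0   # common shape, different scales
n = 200_000
hits = sum(inv_weibull(k, lam_x, rng) > inv_weibull(k, lam_y, rng)
           for _ in range(n))
r_mc = hits / n

# With a common shape k the reliability has the closed form below
r_exact = lam_x ** k / (lam_x ** k + lam_y ** k)
print(round(r_mc, 3), round(r_exact, 3))  # both ~0.692
```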
On the Weibull distribution for wind energy assessment
DEFF Research Database (Denmark)
Batchvarova, Ekaterina; Gryning, Sven-Erik
2014-01-01
Based on long-term measurements performed by a wind lidar, the vertical profile of the shape parameter is discussed for a sub-urban site, a coastal site and a marine site. The profile of the shape parameter was found to be substantially different over land and sea. A parameterization of the vertical behavior of the shape parameter is presented.
Wind climate modeling using Weibull and extreme value distribution ...
African Journals Online (AJOL)
It is very much important to fit wind speed data into some suitable statistical model for two aspects. One is fatigue failure due to periodic vortex shedding and the other is to estimate the wind energy potential of a particular location. For the fatigue failure due to periodic vortex shedding, it is important to analyse the load cycle.
Percentile-based Weibull diameter distribution model for Pinus ...
African Journals Online (AJOL)
Using a site index equation and stem volume model developed for Pinus kesiya in the Philippines, a yield prediction system was created to predict the volume per ha (VPH) for each diameter class and, subsequently, the total volume of a stand. To evaluate the yield prediction system, the predicted mean VPH for each ...
Directory of Open Access Journals (Sweden)
Emanuele Pascale
2018-04-01
Full Text Available This paper presents the advantages of using Weibull distributions, within the context of railway signaling systems, for enabling safety-oriented decision-making. Failure rates are used to statistically model the basic events of fault-tree analysis, and their values size the maximum allowable latency of failures needed to fulfill the safety target for which the system has been designed. Relying on field-return failure data, Weibull parameters have been calculated for an existing electronic signaling system, and a comparison with existing predictive reliability data, based on the exponential distribution, is provided. Results are discussed in order to draw considerations on the fulfilment of quantitative targets and on the impact that a wrong hypothesis might have on the choice of a given architecture. Despite the huge amount of information gathered through the after-sales logbook used to build the reliability distribution, several key elements for reliable estimation of failure rate values are still missing; this might affect the uncertainty of reliability parameters and the effort required to collect all the information. We then present how to intervene when operational failure rates are higher than the theoretical approach predicts: increasing the redundancies of the system or performing preventive maintenance tasks. Possible consequences of the unjustified adoption of a constant failure rate are presented. Some recommendations are also shared on how to build reliability-oriented logbooks and avoid data-censoring phenomena by enhancing the functions of the electronic boards composing the system.
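The warning about unjustified constant failure rates can be illustrated with the Weibull hazard function, which reduces to a constant rate only when the slope β equals 1 (the numbers below are illustrative, not the paper's field data):

```python
def weibull_hazard(t, beta, eta):
    """Weibull hazard rate h(t) = (beta/eta) * (t/eta)**(beta - 1);
    beta = 1 recovers the constant failure rate 1/eta of the
    exponential model."""
    return (beta / eta) * (t / eta) ** (beta - 1)

eta = 10_000.0          # characteristic life in hours (illustrative)
const_rate = 1.0 / eta  # the constant-failure-rate assumption
for t in (1_000.0, 10_000.0, 30_000.0):
    print(f"t = {t:>7.0f} h: constant {const_rate:.2e}, "
          f"wear-out (beta = 2.5) {weibull_hazard(t, 2.5, eta):.2e}")
# For beta > 1 the constant-rate hypothesis overstates early-life risk
# and understates late-life risk, which is the safety concern above.
```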
pT spectra in pp and AA collisions at RHIC and LHC energies using the Tsallis-Weibull approach
Dash, Sadhana; Mahapatra, D. P.
2018-04-01
The Tsallis q-statistics have been incorporated in the Weibull model of particle production, in the form of the q-Weibull distribution, to describe the transverse momentum (pT) distribution of charged hadrons at mid-rapidity, measured at RHIC and LHC energies. The q-Weibull distribution is found to describe the observed pT distributions over all ranges of measured pT. Below 2.2 GeV/c, while going from peripheral to central collisions, the parameter q is found to decrease systematically towards unity, indicating an evolution from a non-equilibrated system in peripheral collisions towards a more thermalized system in central collisions. However, the trend is reversed in the all-inclusive pT regime, which can be attributed to an increased relative contribution of hard pQCD processes in central collisions. The λ parameter is found to be associated with the mean pT or the collective expansion velocity of the produced hadrons, which shows the expected increase with centrality of collisions. The k parameter is observed to increase with the onset of hard QCD scatterings, initial fluctuations and other processes leading to non-equilibrium conditions.
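One common parameterization of the q-Weibull density (an assumption; the paper's exact convention may differ) replaces the exponential in the Weibull pdf with the Tsallis q-exponential, recovering the ordinary Weibull as q → 1:

```python
from math import exp

def q_exp(x, q):
    """Tsallis q-exponential; reduces to exp(x) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0.0 else 0.0

def q_weibull_pdf(x, q, k, lam):
    """q-Weibull density in one common convention (valid for q < 2):
    f(x) = (2 - q) * (k/lam) * (x/lam)**(k-1) * e_q(-(x/lam)**k)."""
    return (2.0 - q) * (k / lam) * (x / lam) ** (k - 1) * q_exp(-((x / lam) ** k), q)

def weibull_pdf(x, k, lam):
    return (k / lam) * (x / lam) ** (k - 1) * exp(-((x / lam) ** k))

# As q -> 1 the q-Weibull approaches the ordinary Weibull
print(round(q_weibull_pdf(1.0, 1.001, 2.0, 1.0), 4),
      round(weibull_pdf(1.0, 2.0, 1.0), 4))
```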
Pal, Suvra; Balakrishnan, Narayanaswamy
2018-05-01
In this paper, we develop likelihood inference based on the expectation maximization algorithm for the Box-Cox transformation cure rate model assuming the lifetimes to follow a Weibull distribution. A simulation study is carried out to demonstrate the performance of the proposed estimation method. Through Monte Carlo simulations, we also study the effect of model misspecification on the estimate of cure rate. Finally, we analyze a well-known data on melanoma with the model and the inferential method developed here.
Time-dependent fiber bundles with local load sharing. II. General Weibull fibers.
Phoenix, S Leigh; Newman, William I
2009-12-01
Fiber bundle models (FBMs) are useful tools in understanding failure processes in a variety of material systems. While the fibers and load sharing assumptions are easily described, FBM analysis is typically difficult. Monte Carlo methods are also hampered by the severe computational demands of large bundle sizes, which overwhelm just as behavior relevant to real materials starts to emerge. For large size scales, interest continues in idealized FBMs that assume either equal load sharing (ELS) or local load sharing (LLS) among fibers, rules that reflect features of real load redistribution in elastic lattices. The present work focuses on a one-dimensional bundle of N fibers under LLS where life consumption in a fiber follows a power law in its load, with exponent rho, integrated over time. This life consumption function is further embodied in a functional form resulting in a Weibull distribution for lifetime under constant fiber stress, with Weibull exponent beta. Thus the failure rate of a fiber depends on its past load history, except for beta = 1. We develop asymptotic results validated by Monte Carlo simulation using a computational algorithm developed in our previous work [Phys. Rev. E 63, 021507 (2001)] that greatly increases the size, N, of treatable bundles (e.g., 10^6 fibers in 10^3 realizations). In particular, our algorithm is O(N ln N), in contrast with former algorithms which were O(N^2), making this investigation possible. Regimes are found for (beta, rho) pairs that yield contrasting behavior for large N. For rho > 1 and large N, brittle weakest-volume behavior emerges in terms of characteristic elements (groupings of fibers) derived from critical cluster formation, and the lifetime eventually goes to zero as N → infinity, unlike ELS, which yields a finite limiting mean. For 1/21 but with 0
Huang, Zhi-Hui; Tang, Ying-Chun; Dai, Kai
2016-05-01
Semiconductor materials and product qualification rates are directly related to manufacturing costs and the survival of the enterprise. A dynamic reliability growth analysis method is applied to study the reliability growth of a manufacturing execution system in order to improve product quality. Drawing on the assumptions of the classical Duane model and the tracking growth forecast (TGP) programming model, a Weibull distribution model was established from the failure data. Combining the median rank with the average rank method, Weibull reliability growth curves were fitted through linear regression and least squares estimation. This model overcomes a weakness of the Duane model, namely the limited accuracy of its MTBF point estimate; analysis of the failure data shows that the method is essentially consistent with the test and evaluation modeling process of a worked instance. The median rank is used in statistics to determine the distribution function of a random variable, and is a good way to handle problems such as the limited sample sizes of complex systems. The method therefore has great engineering application value.
A comparison of Weibull and βIc analyses of transition range data
International Nuclear Information System (INIS)
McCabe, D.E.
1991-01-01
Specimen size effects on K_Jc data scatter in the transition range of fracture toughness have been explained by extremal (weakest link) statistics. In this investigation, compact specimens of A533 grade B steel were tested in sizes ranging from 1/2TC(T) to 4TC(T), with sufficient replication to obtain good three-parameter Weibull characterization of the data distributions. The optimum fitting parameters for an assumed Weibull slope of 4 were calculated. Extremal statistics analysis was applied to the 1/2TC(T) data to predict median K_Jc values for 1TC(T), 2TC(T), and 4TC(T) specimens. The distributions from experimentally developed 1TC(T), 2TC(T), and 4TC(T) data tended to confirm the predictions. However, the extremal prediction model does not work well at lower-shelf toughness: at -150 °C the extremal model predicts a specimen size effect where in reality there is no size effect.
Unification of the Two-Parameter Equation of State and the Principle of Corresponding States
DEFF Research Database (Denmark)
Mollerup, Jørgen
1998-01-01
A two-parameter equation of state is a two-parameter corresponding states model. A two-parameter corresponding states model is composed of two scale factor correlations and a reference fluid equation of state. In a two-parameter equation of state the reference equation of state is the two-parameter equation of state itself.
Energy Technology Data Exchange (ETDEWEB)
Gorgoseo, J. J.; Rojo, A.; Camara-Obregon, A.; Dieguez-Aranda, U.
2012-07-01
The purpose of this study was to compare the accuracy of the Weibull, Johnson's SB and beta distributions, fitted with some of the most usual methods and with different fixed values for the location parameters, for describing diameter distributions in even-aged stands of Pinus pinaster, Pinus radiata and Pinus sylvestris in northwest Spain. A total of 155 permanent plots in Pinus sylvestris stands throughout Galicia, 183 plots in Pinus pinaster stands throughout Galicia and Asturias, and 325 plots in Pinus radiata stands in both regions were measured to describe the diameter distributions. Parameters of the Weibull function were estimated by moments and maximum likelihood approaches, those of Johnson's SB function by conditional maximum likelihood and by Knoebel and Burkhart's method, and those of the beta function by the method based on the moments of the distribution. The beta and Johnson's SB functions were slightly superior to the Weibull function for Pinus pinaster stands; the Johnson's SB and beta functions were more accurate in the best fits for Pinus radiata stands; and the best results of the Weibull and Johnson's SB functions were slightly superior to the beta function for Pinus sylvestris stands. However, all three functions are suitable for these stands given an appropriate value of the location parameter and method of parameter estimation. (Author) 44 refs.
On the Performance Analysis of Digital Communications over Weibull-Gamma Channels
Ansari, Imran Shafique
2015-05-01
In this work, the performance analysis of digital communications over a composite Weibull-Gamma (WG) multipath-fading and shadowing channel is presented, wherein the WG distribution is appropriate for modeling fading environments when multipath is superimposed on shadowing. More specifically, exact closed-form expressions are derived for the probability density function, the cumulative distribution function, the moment generating function, and the moments of a composite WG channel. Capitalizing on these results, new exact closed-form expressions are offered for the outage probability, the higher-order amount of fading, the average error rate for binary and M-ary modulation schemes, and the ergodic capacity under various types of transmission policies, mostly in terms of Meijer's G functions. These new analytical results were also verified via computer-based Monte-Carlo simulation results. © 2015 IEEE.
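A hedged Monte Carlo sketch of one common composite construction, where Gamma shadowing modulates the local mean of Weibull multipath; this is an illustration of the "multipath superimposed on shadowing" idea, not necessarily the paper's exact parameterization:

```python
import random
from math import gamma, log

def wg_sample(rng, k, m, mean_power):
    """One composite Weibull-Gamma draw: Gamma(m, mean_power/m) shadowing
    sets the local mean, and the multipath value conditioned on it is
    Weibull with shape k and that mean. (A sketch of one common composite
    construction, hypothetical parameter names.)"""
    omega = rng.gammavariate(m, mean_power / m)   # shadowing, mean = mean_power
    scale = omega / gamma(1.0 + 1.0 / k)          # Weibull scale giving mean omega
    return scale * (-log(1.0 - rng.random())) ** (1.0 / k)

rng = random.Random(7)
n, threshold = 100_000, 0.5
outage = sum(wg_sample(rng, 2.0, 3.0, 1.0) < threshold for _ in range(n)) / n
print(f"Monte-Carlo outage estimate: {outage:.3f}")
```

Such a simulation is the kind of Monte Carlo check the authors use to verify their closed-form outage expressions.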
Directory of Open Access Journals (Sweden)
Amany E. Aly
2016-04-01
Full Text Available For a system consisting of independent components of the same type, appropriate actions may be taken as soon as a portion of them has failed. It is, therefore, important to be able to predict later failure times from earlier ones. One of the well-known failure distributions commonly used to model component life is the modified Weibull distribution (MWD). In this paper, two pivotal quantities are proposed to construct prediction intervals for future unobservable lifetimes based on generalized order statistics (gos) from the MWD. Moreover, a pivotal quantity is developed to reconstruct missing observations at the beginning of the experiment. Furthermore, Monte Carlo simulation studies are conducted and numerical computations are carried out to investigate the efficiency of the presented results. Finally, two illustrative examples for real data sets are analyzed.
Two-parameter asymptotics in magnetic Weyl calculus
International Nuclear Information System (INIS)
Lein, Max
2010-01-01
This paper is concerned with small-parameter asymptotics of magnetic quantum systems. In addition to a semiclassical parameter ε, the case of small coupling λ to the magnetic vector potential naturally occurs in this context. Magnetic Weyl calculus is adapted to incorporate both parameters, at least one of which needs to be small. Of particular interest is the expansion of the Weyl product, which can be used to expand the product of operators in a small parameter, a technique commonly used to obtain perturbation expansions. Three asymptotic expansions for the magnetic Weyl product of two Hörmander class symbols are proven for (i) ε << 1 and λ << 1, (ii) ε << 1 and λ = 1, as well as (iii) ε = 1 and λ << 1. Expansions (i) and (iii) are impossible to obtain with ordinary Weyl calculus. Furthermore, I relate the results derived by ordinary Weyl calculus with those obtained with magnetic Weyl calculus by one- and two-parameter expansions. To show the power and versatility of magnetic Weyl calculus, I derive the semirelativistic Pauli equation as a scaling limit from the Dirac equation up to errors of fourth order in 1/c.
Directory of Open Access Journals (Sweden)
Luiz Claudio Pardini
2002-10-01
Full Text Available Carbon fibres and glass fibres are reinforcements for advanced composites, and fibre strength is the most influential factor on the strength of the composites. They are essentially brittle and fail with very little reduction in cross section. Composites made with these fibres are characterized by a high strength/density ratio, and their properties are intrinsically related to their microstructure, i.e., the amount and orientation of the fibres, surface treatment, among other factors. Processing parameters play an important role in the fibre mechanical behaviour (strength and modulus). Cracks, voids and impurities in the case of glass fibres, and fibrillar misalignments in the case of carbon fibres, are created during processing. Such inhomogeneities give rise to an appreciable scatter in properties. The statistical tool most used to deal with this characteristic variability in properties is the Weibull distribution. The present work investigates the influence of the testing gage length on the strength, Young's modulus and Weibull modulus of carbon fibres and glass fibres. The Young's modulus is calculated by two methods: (i) ASTM D 3379M, and (ii) the interaction between testing equipment and specimen. The first method resulted in a Young's modulus of 183 GPa for carbon fibre and 76 GPa for glass fibre. The second method gave a Young's modulus of 250 GPa for carbon fibre and 50 GPa for glass fibre. These differences reveal how the specimen/testing-machine interaction can affect the Young's modulus calculations. The Weibull modulus can be a tool to evaluate the fibre's homogeneity in terms of properties, and it is a good quality-control parameter during processing. In the range of specimen gage lengths tested, the Weibull modulus for carbon fibre is ~3.30 and for glass fibre ~5.65, which indicates that, for the batch of fibres tested, the glass fibre is more uniform in properties.
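A minimal sketch of the regression procedure commonly used in such gage-length studies: estimate the Weibull modulus as the least-squares slope of ln(-ln(1-F)) versus ln σ with median-rank plotting positions. The sample below is synthetic, drawn with the glass-fibre modulus ~5.65 quoted above; the 2.0 GPa scale is a purely illustrative assumption.

```python
import math
import random

def weibull_modulus(strengths):
    """Least-squares slope of ln(-ln(1 - F_i)) vs ln(sigma_i),
    using median-rank plotting positions F_i = (i + 0.5)/n."""
    s = sorted(strengths)
    n = len(s)
    xs = [math.log(v) for v in s]
    ys = [math.log(-math.log(1.0 - (i + 0.5) / n)) for i in range(n)]
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

random.seed(1)
# Synthetic strengths from Weibull(modulus 5.65, scale 2.0 GPa); only the
# modulus is taken from the text, the scale is an assumption.
data = [random.weibullvariate(2.0, 5.65) for _ in range(500)]
m_hat = weibull_modulus(data)
print(round(m_hat, 2))  # recovers a value near the input modulus 5.65
```

The same routine applied to measured strengths at each gage length gives the per-length moduli compared in the abstract.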
Weibull Parameters Estimation Based on Physics of Failure Model
DEFF Research Database (Denmark)
Kostandyan, Erik; Sørensen, John Dalsgaard
2012-01-01
Reliability estimation procedures are discussed for the example of fatigue development in solder joints using a physics of failure model. The accumulated damage is estimated based on a physics of failure model, the Rainflow counting algorithm and the Miner’s rule. A threshold model is used...... for degradation modeling and failure criteria determination. The time dependent accumulated damage is assumed linearly proportional to the time dependent degradation level. It is observed that the deterministic accumulated damage at the level of unity closely estimates the characteristic fatigue life of Weibull...
Test Population Selection from Weibull-Based, Monte Carlo Simulations of Fatigue Life
Vlcek, Brian L.; Zaretsky, Erwin V.; Hendricks, Robert C.
2012-01-01
Fatigue life is probabilistic and not deterministic. Experimentally establishing the fatigue life of materials, components, and systems is both time consuming and costly. As a result, conclusions regarding fatigue life are often inferred from a statistically insufficient number of physical tests. A proposed methodology for comparing life results as a function of variability due to Weibull parameters, variability between successive trials, and variability due to the size of the experimental population is presented. Using Monte Carlo simulation of randomly selected lives from a large Weibull distribution, the variation in the L10 fatigue life of aluminum alloy AL6061 rotating rod fatigue tests was determined as a function of population size. These results were compared to the L10 fatigue lives of small (10 each) populations from AL2024, AL7075 and AL6061. For aluminum alloy AL6061, a simple algebraic relationship was established for the upper and lower L10 fatigue life limits as a function of the number of specimens failed. For most engineering applications where less than 30 percent variability can be tolerated in the maximum and minimum values, at least 30 to 35 test samples are necessary. The variability of test results based on small sample sizes can be greater than the actual differences, if any, that exist between materials and can result in erroneous conclusions. The fatigue life of AL2024 is statistically longer than that of AL6061 and AL7075. However, there is no statistical difference between the fatigue lives of AL6061 and AL7075, even though AL7075 had a fatigue life 30 percent greater than AL6061.
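A minimal sketch of the Monte Carlo experiment described above, under assumed Weibull parameters (shape 2.0, scale 10^6 cycles; not the paper's values): repeatedly draw test populations of size n, estimate each population's L10 life as the empirical 10th percentile, and observe the scatter shrink as n grows.

```python
import math
import random

def sample_l10(n, beta=2.0, eta=1.0e6):
    """Estimate L10 (life at 10 percent failure probability) from n
    simulated Weibull lives via the empirical 10th percentile."""
    lives = sorted(random.weibullvariate(eta, beta) for _ in range(n))
    return lives[max(0, int(0.10 * n) - 1)]

random.seed(42)
# Exact L10 of the assumed distribution, for reference.
true_l10 = 1.0e6 * (-math.log(0.9)) ** (1 / 2.0)

spreads = {}
for n in (10, 35, 1000):
    trials = [sample_l10(n) for _ in range(200)]
    spreads[n] = (max(trials) - min(trials)) / true_l10  # relative scatter
    print(n, round(spreads[n], 2))
```

The shrinking spread with n is the effect behind the abstract's 30-to-35-specimen recommendation.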
The Weibull probabilities analysis on the single kenaf fiber
Ibrahim, I.; Sarip, S.; Bani, N. A.; Ibrahim, M. H.; Hassan, M. Z.
2018-05-01
Kenaf fiber has great potential to replace synthetic fibers in composites due to advantages such as being environmentally friendly and offering outstanding performance. However, the main issue with using this natural fiber in structural composites is the inconsistency of its mechanical properties. Here, the influence of the gage length on the mechanical properties of single kenaf fiber was evaluated. The fiber was tested using a universal testing machine at a loading rate of 1 mm/min following the ASTM D3822 standard. In this study, treated fibers of different lengths, namely 20, 30 and 40 mm, were tested. Weibull probability analysis was then used to characterize the tensile strength and Young's modulus of the kenaf fiber. The average tensile strength predicted from this approach is in good agreement with the experimental results for the obtained parameters.
Fetisova, Yu. A.; Ermolenko, B. V.; Ermolenko, G. V.; Kiseleva, S. V.
2017-04-01
We studied the information basis for the assessment of wind power potential on the territory of Russia. We described the methodology to determine the parameters of the Weibull function, which reflects the density of distribution of probabilities of wind flow speeds at a defined basic height above the surface of the earth using the available data on the average speed at this height and its repetition by gradations. The application of the least square method for determining these parameters, unlike the use of graphical methods, allows performing a statistical assessment of the results of approximation of empirical histograms by the Weibull formula. On the basis of the computer-aided analysis of the statistical data, it was shown that, at a fixed point where the wind speed changes at different heights, the range of parameter variation of the Weibull distribution curve is relatively small, the sensitivity of the function to parameter changes is quite low, and the influence of changes on the shape of speed distribution curves is negligible. Taking this into consideration, we proposed and mathematically verified the methodology of determining the speed parameters of the Weibull function at other heights using the parameter computations for this function at a basic height, which is known or defined by the average speed of wind flow, or the roughness coefficient of the geological substrate. We gave examples of practical application of the suggested methodology in the development of the Atlas of Renewable Energy Resources in Russia in conditions of deficiency of source meteorological data. The proposed methodology, to some extent, may solve the problem related to the lack of information on the vertical profile of repeatability of the wind flow speeds in the presence of a wide assortment of wind turbines with different ranges of wind-wheel axis heights and various performance characteristics in the global market; as a result, this methodology can become a powerful tool for
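The least-squares determination of the Weibull parameters from a binned wind-speed record, as described above, can be sketched as follows. Linearizing the Weibull CDF gives ln(-ln(1-F)) = k·ln v - k·ln c, so a straight-line fit to the cumulative frequencies yields k from the slope and c from the intercept. The histogram below is hypothetical, not the Atlas data.

```python
import math

# Hypothetical wind-speed histogram: (upper bin edge in m/s, relative frequency).
bins = [(2, 0.08), (4, 0.18), (6, 0.26), (8, 0.22),
        (10, 0.14), (12, 0.07), (14, 0.05)]

xs, ys, cum = [], [], 0.0
for edge, freq in bins:
    cum += freq
    if cum < 0.999:          # the last point (F = 1) cannot be linearized
        xs.append(math.log(edge))
        ys.append(math.log(-math.log(1.0 - cum)))

n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
k = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
c = math.exp(mx - my / k)    # intercept: ln c = mean(x) - mean(y)/k
print(round(k, 2), round(c, 2))  # roughly k ~ 2 and c ~ 7 m/s for this histogram
```

Unlike a graphical fit, the regression residuals give a statistical check of how well the Weibull formula approximates the empirical histogram, which is the point made in the abstract.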
International Nuclear Information System (INIS)
Nunes, M.E.C.; Noriega, H.C.; Melo, P.F.F.
1997-01-01
Among the features to take into account in the unavailability analysis of protective channels, one plays a dominant role: consideration of equipment aging. In this sense, the exponential failure model is not adequate, since some transition rates are no longer constant. As a consequence, Markovian models can no longer be used. As an alternative, one may use the device of stages, which allows a non-Markovian model to be transformed into an equivalent Markovian one by inserting a set of fictitious states, called stages. For a given time-dependent transition rate, its failure density is analysed as to the best combination of exponential distributions, and then the moments of the original distribution and those of the combination are matched to estimate the necessary parameters. In this paper, the aging of the protective channel is supposed to follow Weibull distributions. Typical means and variances for the times to failure are considered and combinations of stages are checked. Features of the initial conditions are discussed in connection with the fictitious states, and the validity of the developed models is checked. Alternative solutions by discretization of the failure rates are generated. The results obtained agree quite well. (author). 7 refs., 6 figs
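The device of stages replaces a non-exponential holding time with a chain of fictitious exponential stages whose moments match those of the original distribution. A minimal sketch for a Weibull time to failure, matching the first two moments with an Erlang chain (the β and η values are illustrative, not the paper's):

```python
import math

def weibull_moments(beta, eta):
    """Mean and variance of a Weibull(beta, eta) time to failure."""
    mean = eta * math.gamma(1 + 1 / beta)
    var = eta ** 2 * (math.gamma(1 + 2 / beta) - math.gamma(1 + 1 / beta) ** 2)
    return mean, var

def erlang_stages(mean, var):
    """Match the first two moments with a chain of n exponential stages of
    rate lam (Erlang): mean = n/lam, variance = n/lam**2.  Because n must
    be an integer, the variance is only matched approximately."""
    n = max(1, round(mean ** 2 / var))
    lam = n / mean            # preserves the mean exactly
    return n, lam

m, v = weibull_moments(beta=2.0, eta=1000.0)  # aging channel: beta > 1 (illustrative)
n, lam = erlang_stages(m, v)
print(n, round(m, 1), round(n / lam, 1))
```

The resulting n-stage chain is Markovian, so standard Markov unavailability machinery applies to the expanded state space.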
HIROSE,Hideo
1998-01-01
TYPES OF THE DISTRIBUTION: Normal distribution (2-parameter); Uniform distribution (2-parameter); Exponential distribution (2-parameter); Weibull distribution (2-parameter); Gumbel distribution (2-parameter); Weibull/Frechet distribution (3-parameter); Generalized extreme-value distribution (3-parameter); Gamma distribution (3-parameter); Extended Gamma distribution (3-parameter); Log-normal distribution (3-parameter); Extended Log-normal distribution (3-parameter); Generalized ...
Oubei, Hassan M.; Zedini, Emna; Elafandy, Rami T.; Kammoun, Abla; Ng, Tien Khee; Alouini, Mohamed-Slim; Ooi, Boon S.
2017-01-01
Recent advances in underwater wireless optical communications necessitate a better understanding of the underwater channel. We propose the Weibull model to characterize the fading of salinity induced turbulent underwater wireless optical channels
A study of optimization problem for amplify-and-forward relaying over weibull fading channels
Ikki, Salama Said; Aissa, Sonia
2010-01-01
This paper addresses the power allocation and relay positioning problems in amplify-and-forward cooperative networks operating in Weibull fading environments. We study adaptive power allocation (PA) with fixed relay location, optimal relay location
A Weibull-based compositional approach for hierarchical dynamic fault trees
International Nuclear Information System (INIS)
Chiacchio, F.; Cacioppo, M.; D'Urso, D.; Manno, G.; Trapani, N.; Compagno, L.
2013-01-01
The solution of a dynamic fault tree (DFT) for reliability assessment can be achieved using a wide variety of techniques. These techniques have a strong theoretical foundation, as both the analytical and the simulation methods have been extensively developed. Nevertheless, they all present the same limits, which appear as the size of the fault tree increases (i.e., state-space explosion, time-consuming simulations), compromising the resolution. We have tested the feasibility of a composition algorithm based on a Weibull distribution, addressed to the resolution of a general class of dynamic fault trees characterized by non-repairable basic events and generally distributed failure times. The proposed composition algorithm is used to generalize the traditional hierarchical technique that, as previous literature has extensively confirmed, is able to reduce the computational effort of a large DFT through the modularization of independent parts of the tree. The results of this study are achieved both through simulation and analytical techniques, thus confirming the capability to solve a quite general class of dynamic fault trees and overcome the limits of traditional techniques.
Reliability demonstration test for load-sharing systems with exponential and Weibull components.
Directory of Open Access Journals (Sweden)
Jianyu Xu
Full Text Available Conducting a Reliability Demonstration Test (RDT) is a crucial step in production. Products are tested under certain schemes to demonstrate whether their reliability indices reach pre-specified thresholds. Test schemes for RDT have been studied in different situations, e.g., lifetime testing, degradation testing and accelerated testing. Systems designed with several structures have also been investigated in many RDT plans. Despite the availability of a range of test plans for different systems, RDT planning for load-sharing systems hasn't yet received the attention it deserves. In this paper, we propose a demonstration method for two specific types of load-sharing systems with components subject to two distributions: exponential and Weibull. Based on the assumptions and interpretations made in several previous works on such load-sharing systems, we set the mean time to failure (MTTF) of the total system as the demonstration target. We represent the MTTF as a summation of the mean times between successive component failures. Next, we introduce generalized test statistics for both the underlying distributions. Finally, RDT plans for the two types of systems are established on the basis of these test statistics.
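The demonstration target above, the system MTTF written as a sum of mean times between successive component failures, can be illustrated for an equal-load-sharing system with exponential components. The rate model (each of k survivors carries load L/k and fails at rate proportional to its share) and all numbers are assumptions for illustration, not the paper's model.

```python
# Equal-load-sharing system of n exponential components: while k components
# survive, each is assumed to fail at rate lam0 * (L / k), so the total
# hazard is k * lam0 * (L / k) and the gap to the next failure is
# exponential with that rate.  The system MTTF is the sum of the mean gaps.
n, lam0, L = 4, 0.01, 1.0
mttf = 0.0
for k in range(n, 0, -1):
    total_rate = k * lam0 * (L / k)
    mttf += 1.0 / total_rate
print(round(mttf, 1))  # for this rate model every gap has the same mean
```

With Weibull components the gaps are no longer exponential, which is exactly why the paper needs generalized test statistics for that case.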
The effect of mis-specification on mean and selection between the Weibull and lognormal models
Jia, Xiang; Nadarajah, Saralees; Guo, Bo
2018-02-01
The lognormal and Weibull models are commonly used to analyse data. Although selection procedures have been extensively studied, it is possible that the lognormal model could be selected when the true model is Weibull or vice versa. As the mean is important in applications, we focus on the effect of mis-specification on mean. The effect on lognormal mean is first considered if the lognormal sample is wrongly fitted by a Weibull model. The maximum likelihood estimate (MLE) and quasi-MLE (QMLE) of lognormal mean are obtained based on lognormal and Weibull models. Then, the impact is evaluated by computing ratio of biases and ratio of mean squared errors (MSEs) between MLE and QMLE. For completeness, the theoretical results are demonstrated by simulation studies. Next, the effect of the reverse mis-specification on Weibull mean is discussed. It is found that the ratio of biases and the ratio of MSEs are independent of the location and scale parameters of the lognormal and Weibull models. The influence could be ignored if some special conditions hold. Finally, a model selection method is proposed by comparing ratios concerning biases and MSEs. We also present a published data to illustrate the study in this paper.
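A sketch of the mis-specification experiment described above: generate a lognormal sample, fit a Weibull model by maximum likelihood (the QMLE under the wrong model), and compare the implied means. The bisection solver for the Weibull shape and the parameter values (μ = 0, σ = 0.5, n = 2000) are illustrative assumptions, not the paper's setup.

```python
import math
import random

def weibull_mle(x):
    """Weibull maximum-likelihood fit via bisection on the profile
    equation for the shape parameter b (the left side is increasing in b)."""
    logs = [math.log(v) for v in x]
    mlog = sum(logs) / len(x)

    def g(b):
        sw = sum(v ** b for v in x)
        swl = sum(v ** b * math.log(v) for v in x)
        return swl / sw - 1.0 / b - mlog

    lo, hi = 0.01, 50.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if g(mid) > 0.0:
            hi = mid
        else:
            lo = mid
    b = 0.5 * (lo + hi)
    eta = (sum(v ** b for v in x) / len(x)) ** (1.0 / b)
    return b, eta

random.seed(7)
mu, sigma = 0.0, 0.5                      # illustrative lognormal parameters
sample = [math.exp(random.gauss(mu, sigma)) for _ in range(2000)]

b, eta = weibull_mle(sample)
qmle_mean = eta * math.gamma(1 + 1 / b)   # mean implied by the wrong (Weibull) model

logs = [math.log(v) for v in sample]
mu_hat = sum(logs) / len(logs)
s2_hat = sum((l - mu_hat) ** 2 for l in logs) / len(logs)
mle_mean = math.exp(mu_hat + s2_hat / 2)  # mean under the true (lognormal) model

true_mean = math.exp(mu + sigma ** 2 / 2)
print(round(qmle_mean, 3), round(mle_mean, 3), round(true_mean, 3))
```

Repeating this over many samples gives the bias and MSE ratios the paper uses to quantify the cost of mis-specification.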
Distribution of crushing strength of tablets
DEFF Research Database (Denmark)
Sonnergaard, Jørn
2002-01-01
The distribution of a given set of data is important since most parametric statistical tests are based on the assumption that the studied data are normal distributed. In analysis of fracture mechanics the Weibull distribution is widely used and the derived Weibull modulus is interpreted as a mate...... data from nine model tablet formulations and four commercial tablets are shown to follow the normal distribution. The importance of proper cleaning of the crushing strength apparatus is demonstrated....
International Nuclear Information System (INIS)
Khahro, Shahnawaz Farhan; Tabbassum, Kavita; Soomro, Amir Mahmood; Dong, Lei; Liao, Xiaozhong
2014-01-01
Highlights: • Weibull scale and shape parameters are calculated using 5 numerical methods. • Yearly mean wind speed is 6.712 m/s at 80 m height, with the highest in May at 9.595 m/s. • Yearly mean WPD is 310 W/m² and available energy density is 2716 kWh/m² at 80 m height. • Probability of higher wind speeds is greater in spring and summer than in autumn and winter. • Estimated cost per kWh of electricity from wind is calculated as 0.0263 US$/kWh. - Abstract: Pakistan is currently experiencing an acute shortage of energy and urgently needs new sources of affordable energy that could alleviate the misery of the energy-starved masses. At present the government is increasing not only the conventional energy sources like hydel and thermal but also focusing on the immense potential of renewable energy sources like solar, wind, biogas, waste-to-energy etc. The recent economic crisis worldwide, global warming and climate change have also emphasized the need for utilizing economically feasible energy sources having the lowest carbon emissions. Wind energy, with its sustainability and low environmental impact, is highly prominent. The aim of this paper is to explore the wind power production prospects of one of the sites in the south region of Pakistan. It is worth mentioning here that this type of detailed analysis is hardly done for any location in Pakistan. Wind power densities and frequency distributions of wind speed at four different altitudes, along with the estimated wind power expected to be generated through commercial wind turbines, are calculated. Analysis and comparison of 5 numerical methods is presented in this paper to determine the Weibull scale and shape parameters for the available wind data. The yearly mean wind speed of the considered site is 6.712 m/s and it has a power density of 310 W/m² at 80 m height, with high power density from April to August (highest in May, with wind speed 9.595 m/s and power density 732 W/m²). Economic evaluation, to exemplify feasibility
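Given fitted Weibull parameters, the mean speed and wind power density quoted in such studies follow from closed-form moments: E[v] = c·Γ(1+1/k) and WPD = ½ρc³·Γ(1+3/k). The k and c below are hypothetical values chosen so the mean roughly matches the reported 6.712 m/s; they are not the paper's fitted parameters.

```python
import math

rho = 1.225        # air density, kg/m^3 (sea-level standard value)
k, c = 2.0, 7.57   # hypothetical Weibull shape and scale (m/s)

mean_speed = c * math.gamma(1 + 1 / k)             # E[v] = c * Gamma(1 + 1/k)
wpd = 0.5 * rho * c ** 3 * math.gamma(1 + 3 / k)   # E[(1/2) rho v^3], W/m^2
print(round(mean_speed, 2), round(wpd))
```

Because WPD depends on the third moment, it is far more sensitive to k and c than the mean speed is, which is why the choice among the 5 estimation methods matters.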
International Nuclear Information System (INIS)
Hall, P.L.; Strutt, J.E.
2003-01-01
In reliability engineering, component failures are generally classified in one of three ways: (1) early life failures; (2) failures having random onset times; and (3) late life or 'wear out' failures. When the time-distribution of failures of a population of components is analysed in terms of a Weibull distribution, these failure types may be associated with shape parameters β having values β < 1, β ≈ 1, and β > 1, respectively. Early life failures are frequently attributed to poor design (e.g. poor materials selection) or problems associated with manufacturing or assembly processes. We describe a methodology for the implementation of physics-of-failure models of component lifetimes in the presence of parameter and model uncertainties. This treats uncertain parameters as random variables described by some appropriate statistical distribution, which may be sampled using Monte Carlo methods. The number of simulations required depends upon the desired accuracy of the predicted lifetime. Provided that the number of sampled variables is relatively small, an accuracy of 1-2% can be obtained using typically 1000 simulations. The resulting collection of times-to-failure is then sorted into ascending order and fitted to a Weibull distribution to obtain a shape factor β and a characteristic lifetime η. Examples are given of the results obtained using three different models: (1) the Eyring-Peck (EP) model for corrosion of printed circuit boards; (2) a power-law corrosion growth (PCG) model which represents the progressive deterioration of oil and gas pipelines; and (3) a random shock-loading model of mechanical failure. It is shown that for any specific model the values of the Weibull shape parameters obtained may be strongly dependent on the degree of uncertainty of the underlying input parameters. Both the EP and PCG models can yield a wide range of values of β, from β>1, characteristic of wear-out behaviour, to β<1, characteristic of early-life failure, depending on the degree of
Statistical Distribution of Fatigue Life for Cast TiAl Alloy
Directory of Open Access Journals (Sweden)
WAN Wenjuan
2016-08-01
Full Text Available The statistical distribution of fatigue life data and its controls for cast Ti-47.5Al-2.5V-1.0Cr-0.2Zr (atom fraction/%) alloy were investigated. Fatigue tests were conducted by means of load-controlled rotating bending fatigue tests (R = -1) performed at a frequency of 100 Hz at 750 ℃ in air. The fracture mechanism was analyzed by observing the fracture surface morphologies through a scanning electron microscope, and the obtained fatigue life data were analyzed by Weibull statistics. The results show that the fatigue life data present a remarkable scatter, ranging from 10³ to 10⁶ cycles, and are distributed mainly in the short- and long-life regimes. The reason for this phenomenon is that the fatigue crack initiators differ between specimens. The crack initiators for short-life specimens are caused by shrinkage porosity, and those for long-life ones are caused by bridged porosity interfaces and soft-oriented lamellar interfaces. Based on the observations of the fracture surface, a two-parameter Weibull distribution model for the fatigue life data can be used for the prediction of fatigue life at a certain failure probability. It is also shown that shrinkage porosity has the most detrimental effect on fatigue life.
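Once a two-parameter Weibull model is fitted, the fatigue life at a chosen failure probability is simply the Weibull quantile N_p = η·(-ln(1-p))^(1/β). The β and η below are illustrative assumptions, not the fitted values for the TiAl alloy.

```python
import math

# Hypothetical fitted values; beta < 1 reflects strong scatter, eta in cycles.
beta, eta = 0.9, 2.0e5

def life_at(p, beta, eta):
    """Weibull p-quantile: N_p = eta * (-ln(1 - p))**(1/beta), the life
    exceeded by a fraction (1 - p) of specimens."""
    return eta * (-math.log(1.0 - p)) ** (1.0 / beta)

for p in (0.01, 0.10, 0.50):
    print(p, round(life_at(p, beta, eta)))
```

A small β spreads these quantiles over orders of magnitude, mirroring the 10³-10⁶ cycle scatter reported above.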
Virtual walks in spin space: A study in a family of two-parameter models
Mullick, Pratik; Sen, Parongama
2018-05-01
We investigate the dynamics of classical spins mapped as walkers in a virtual "spin" space using a generalized two-parameter family of spin models characterized by parameters y and z [de Oliveira et al., J. Phys. A 26, 2317 (1993), 10.1088/0305-4470/26/10/006]. The behavior of S(x, t), the probability that the walker is at position x at time t, is studied in detail. In general, S(x, t) ~ t^{-α} f(x/t^{α}) with α ≈ 1 or 0.5 at large times, depending on the parameters. In particular, S(x, t) for the point y = 1, z = 0.5, corresponding to the Voter model, shows a crossover in time; associated with this crossover, two timescales can be defined which vary with the system size L as L² log L. We also show that as the Voter model point is approached from the disordered regions along different directions, the width of the Gaussian distribution S(x, t) diverges in a power-law manner with different exponents. For the majority Voter case, the results indicate that the virtual walk can detect the phase transition perhaps more efficiently than other nonequilibrium methods.
Nash equilibria in quantum games with generalized two-parameter strategies
International Nuclear Information System (INIS)
Flitney, Adrian P.; Hollenberg, Lloyd C.L.
2007-01-01
In the Eisert protocol for 2×2 quantum games [J. Eisert, et al., Phys. Rev. Lett. 83 (1999) 3077], a number of authors have investigated the features arising from making the strategic space a two-parameter subset of single-qubit unitary operators. We argue that the new Nash equilibria and the classical-quantum transitions that occur are simply an artifact of the particular strategy space chosen. By choosing a different, but equally plausible, two-parameter strategic space we show that different Nash equilibria with different classical-quantum transitions can arise. We generalize the two-parameter strategies and also consider these strategies in a multiplayer setting.
Energy Technology Data Exchange (ETDEWEB)
Lee, Jongk Uk; Lee, Kwan Hee; Kim, Sung Il; Yook, Dae Sik; Ahn, Sang Myeon [KINS, Daejeon (Korea, Republic of)
2016-05-15
Evaluation of the meteorological characteristics at a nuclear power plant and in the surrounding area should be performed when determining site suitability for safe operation of the nuclear power plant. Under unexpected emergency conditions, knowledge of meteorological information on the site area is important to provide the basis for estimating environmental impacts resulting from radioactive materials released in gaseous effluents during an accident condition. Among the meteorological information, wind speed and direction are the important factors for examination of the safety analysis in the nuclear power plant area. Wind characteristics were analyzed for the Hanbit NPP area. It was found that the Weibull parameters k and c vary from 2.56 to 4.77 and from 4.53 to 6.79 for the directional wind speed distribution, respectively. The maximum wind frequency was from the NE and the minimum from the NNW.
International Nuclear Information System (INIS)
Wang Jisuo; Sun Changyong; He Jinyu
1996-01-01
The eigenstates of the k-th power (k ≥ 3) of the annihilation operator a_qs of the two-parameter deformed harmonic oscillator are constructed. Their completeness is demonstrated in terms of the qs-integration.
Pal, Suvra; Balakrishnan, N
2017-10-01
In this paper, we consider a competing-cause scenario and assume the number of competing causes to follow a Conway-Maxwell Poisson distribution, which can capture both the over- and under-dispersion usually encountered in discrete data. Assuming that the population of interest has a cure component and that the data are interval censored, as opposed to the usually considered right-censored data, the main contribution is in developing the steps of the expectation maximization algorithm for the determination of the maximum likelihood estimates of the model parameters of the flexible Conway-Maxwell Poisson cure rate model with Weibull lifetimes. An extensive Monte Carlo simulation study is carried out to demonstrate the performance of the proposed estimation method. Model discrimination within the Conway-Maxwell Poisson distribution is addressed using the likelihood ratio test and information-based criteria to select a suitable competing-cause distribution that provides the best fit to the data. A simulation study is also carried out to demonstrate the loss in efficiency when selecting an improper competing-cause distribution, which justifies the use of a flexible family of distributions for the number of competing causes. Finally, the proposed methodology and the flexibility of the Conway-Maxwell Poisson distribution are illustrated with two known data sets from the literature: smoking cessation data and breast cosmesis data.
Oubei, Hassan M.
2017-12-13
Recent advances in underwater wireless optical communications necessitate a better understanding of the underwater channel. We propose the Weibull model to characterize the fading of salinity induced turbulent underwater wireless optical channels. The model shows an excellent agreement with the measured data under all channel conditions.
Modified Weibull theory and stress-concentration factors of polycrystalline graphite
International Nuclear Information System (INIS)
Ho, F.H.
1980-12-01
Stress concentration factors (SCF) due to geometric discontinuities in graphite specimens are observed to be much less than the theoretical SCF in an elastic material. In fact, the experimental SCF is always less than two and sometimes even less than one. A four-parameter Weibull theory which recognizes the grain-size effect is found to give an adequate explanation of the above observed discrepancies
determination of weibull parameters and analysis of wind power
African Journals Online (AJOL)
HOD
shape parameter (k) and the scale factor (c) were obtained to be 6.7 m/s and 4.3 m/s, 0.91 MW and 0.25 MW, k ~ 5.4 and 2.1, and c ... China, the forecast is not different as the report of the ..... Distribution for Wind Energy Analysis", J. Wind Eng.
An Application of a Multidimensional Extension of the Two-Parameter Logistic Latent Trait Model.
McKinley, Robert L.; Reckase, Mark D.
A latent trait model is described that is appropriate for use with tests that measure more than one dimension, and its application to both real and simulated test data is demonstrated. Procedures for estimating the parameters of the model are presented. The research objectives are to determine whether the two-parameter logistic model more…
Bending of an Infinite beam on a base with two parameters in the absence of a part of the base
Directory of Open Access Journals (Sweden)
Aleksandrovskiy Maxim
2018-01-01
Full Text Available Currently, in connection with the rapid development of high-rise construction and the improvement of models of the joint operation of high-rise structures and their bases, questions connected with the use of various calculation methods have become topical. The rigor of analytical methods is capable of characterizing the behavior of structures in more detail and more accurately, which affects the reliability of objects and can lead to a reduction in their cost. In this article, a model with two parameters is used as the computational model of the base; it can effectively take into account the distributive properties of the base by varying the coefficient reflecting the shear parameter. The paper constructs an effective analytical solution of the problem of a beam of infinite length interacting with a two-parameter base from which a part is absent (a voided section). Using Fourier integral transforms, the original differential equation is reduced to a Fredholm integral equation of the second kind with a degenerate kernel, and all the integrals are solved analytically and explicitly, which leads to an increase in the accuracy of the computations in comparison with approximate methods. The paper considers the solution of the problem of a beam loaded with a concentrated force applied at the origin, with a fixed value of the length of the dip section. The paper analyses the obtained results for various values of the coefficient that takes into account the cohesion of the ground.
Bending of an Infinite beam on a base with two parameters in the absence of a part of the base
Aleksandrovskiy, Maxim; Zaharova, Lidiya
2018-03-01
Currently, in connection with the rapid development of high-rise construction and the improvement of models of the joint operation of high-rise structures and their bases, questions connected with the use of various calculation methods have become topical. The rigor of analytical methods is capable of characterizing the behavior of structures in more detail and more accurately, which affects the reliability of objects and can lead to a reduction in their cost. In this article, a model with two parameters is used as the computational model of the base; it can effectively take into account the distributive properties of the base by varying the coefficient reflecting the shear parameter. The paper constructs an effective analytical solution of the problem of a beam of infinite length interacting with a two-parameter base from which a part is absent (a voided section). Using Fourier integral transforms, the original differential equation is reduced to a Fredholm integral equation of the second kind with a degenerate kernel, and all the integrals are solved analytically and explicitly, which leads to an increase in the accuracy of the computations in comparison with approximate methods. The paper considers the solution of the problem of a beam loaded with a concentrated force applied at the origin, with a fixed value of the length of the dip section. The paper analyses the obtained results for various values of the coefficient that takes into account the cohesion of the ground.
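As a baseline that any two-parameter-base solution must recover when the second (shear) parameter and the voided section vanish, the classical closed form for an infinite beam on a one-parameter (Winkler) base under a point load can be sketched; all numerical values below are illustrative assumptions.

```python
import math

# Illustrative values only.
E, I = 2.1e11, 8.0e-6    # Young's modulus (Pa) and second moment of area (m^4)
k = 5.0e7                # subgrade (Winkler) modulus, N/m^2
P = 1.0e5                # concentrated force at x = 0, N

lam = (k / (4.0 * E * I)) ** 0.25   # characteristic parameter, 1/m

def w(x):
    """Deflection of the infinite Winkler beam, symmetric about the load."""
    return (P * lam / (2.0 * k) * math.exp(-lam * abs(x))
            * (math.cos(lam * x) + math.sin(lam * abs(x))))

print(round(w(0.0) * 1000.0, 3))    # peak deflection under the load, in mm
```

The two-parameter (shear-interacting) base modifies this decaying oscillatory profile, which is what the Fredholm-equation solution in the paper captures analytically.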
Comparison of Weibull strength parameters from flexure and spin tests of brittle materials
Holland, Frederic A., Jr.; Zaretsky, Erwin V.
1991-01-01
Fracture data from five series of four-point bend tests of beams and spin tests of flat annular disks were reanalyzed. Silicon nitride and graphite were the test materials. The experimental fracture strengths of the disks were compared with the predicted strengths based on both volume-flaw and surface-flaw analyses of the four-point bend data. Volume-flaw analysis resulted in a better correlation between disks and beams in three of the five test series than did surface-flaw analysis. The Weibull moduli and characteristic gage strengths for the disks and beams were also compared. Differences in the experimental Weibull slopes were not statistically significant. It was shown that results from the beam tests can predict the fracture strength of rotating disks.
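The Weibull moduli and characteristic strengths referred to above are commonly extracted by median-rank regression on a linearized Weibull plot. A minimal sketch with synthetic fracture data; the modulus and characteristic strength values are assumed for illustration only, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic fracture strengths (MPa) drawn from a Weibull distribution
# with modulus m = 10 and characteristic strength s0 = 500 (assumed values).
m_true, s0_true = 10.0, 500.0
strengths = s0_true * rng.weibull(m_true, size=50)

# Median-rank regression: sort the data, assign failure probabilities via
# Bernard's approximation F_i = (i - 0.3)/(n + 0.4), then fit the
# linearized Weibull CDF  ln(-ln(1 - F)) = m*ln(s) - m*ln(s0).
s = np.sort(strengths)
n = len(s)
F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)
x = np.log(s)
y = np.log(-np.log(1.0 - F))
m_est, c = np.polyfit(x, y, 1)
s0_est = np.exp(-c / m_est)

print(f"estimated modulus m = {m_est:.2f}, characteristic strength = {s0_est:.1f} MPa")
```

The slope of the fitted line is the Weibull modulus; the intercept gives the characteristic strength.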
Lin, Wei-Shao; Ercoli, Carlo; Feng, Changyong; Morton, Dean
2012-07-01
The objective of this study was to compare the effect of veneering porcelain (monolithic or bilayer specimens) and core fabrication technique (heat-pressed or CAD/CAM) on the biaxial flexural strength and Weibull modulus of leucite-reinforced and lithium-disilicate glass ceramics. In addition, the effect of veneering technique (heat-pressed or powder/liquid layering) for zirconia ceramics on the biaxial flexural strength and Weibull modulus was studied. Five ceramic core materials (IPS Empress Esthetic, IPS Empress CAD, IPS e.max Press, IPS e.max CAD, IPS e.max ZirCAD) and three corresponding veneering porcelains (IPS Empress Esthetic Veneer, IPS e.max Ceram, IPS e.max ZirPress) were selected for this study. Each core material group contained three subgroups based on the core material thickness and the presence of corresponding veneering porcelain as follows: 1.5 mm core material only (subgroup 1.5C), 0.8 mm core material only (subgroup 0.8C), and 1.5 mm core/veneer group: 0.8 mm core with 0.7 mm corresponding veneering porcelain with a powder/liquid layering technique (subgroup 0.8C-0.7VL). The ZirCAD group had one additional 1.5 mm core/veneer subgroup with 0.7 mm heat-pressed veneering porcelain (subgroup 0.8C-0.7VP). The biaxial flexural strengths were compared for each subgroup (n = 10) according to ISO standard 6872:2008 with ANOVA and Tukey's post hoc multiple comparison test (p≤ 0.05). The reliability of strength was analyzed with the Weibull distribution. For all core materials, the 1.5 mm core/veneer subgroups (0.8C-0.7VL, 0.8C-0.7VP) had significantly lower mean biaxial flexural strengths (p Empress and e.max groups, regardless of core thickness and fabrication techniques. Comparing fabrication techniques, Empress Esthetic/CAD, e.max Press/CAD had similar biaxial flexural strength (p= 0.28 for Empress pair; p= 0.87 for e.max pair); however, e.max CAD/Press groups had significantly higher flexural strength (p Empress Esthetic/CAD groups. Monolithic core
Probabilistic Analysis for Comparing Fatigue Data Based on Johnson-Weibull Parameters
Vlcek, Brian L.; Hendricks, Robert C.; Zaretsky, Erwin V.
2013-01-01
Leonard Johnson published a methodology for establishing the confidence that two populations of data are different. Johnson's methodology is dependent on limited combinations of test parameters (Weibull slope, mean life ratio, and degrees of freedom) and a set of complex mathematical equations. In this report, a simplified algebraic equation for confidence numbers is derived based on the original work of Johnson. The confidence numbers calculated with this equation are compared to those obtained graphically by Johnson. Using the ratios of mean life, the resultant values of confidence numbers at the 99 percent level deviate less than 1 percent from those of Johnson. At a 90 percent confidence level, the calculated values differ by between 2 and 4 percent. The simplified equation is used to rank the experimental lives of three aluminum alloys (AL 2024, AL 6061, and AL 7075), each tested at three stress levels in rotating beam fatigue, analyzed using the Johnson-Weibull method, and compared to the ASTM Standard (E739-91) method of comparison. The ASTM Standard did not statistically distinguish between AL 6061 and AL 7075. However, it is possible to rank the fatigue lives of different materials with a reasonable degree of statistical certainty based on combined confidence numbers using the Johnson-Weibull analysis. AL 2024 was found to have the longest fatigue life, followed by AL 7075, and then AL 6061. The ASTM Standard and the Johnson-Weibull analysis result in the same stress-life exponent p for each of the three aluminum alloys at the median, or L_50, lives.
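Johnson's confidence numbers themselves are not reproduced here, but a closely related quantity, the probability that a life drawn from one Weibull population exceeds a life drawn from another with the same slope, has a simple closed form that can be checked by Monte Carlo. All parameter values below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two life populations with a common Weibull slope beta and a
# scale (and hence mean-life) ratio of 2 -- illustrative values only.
beta, eta_a, eta_b = 2.0, 2.0, 1.0

# Closed form when both populations are Weibull with the same slope:
# P(X_a > X_b) = eta_a^beta / (eta_a^beta + eta_b^beta)
p_closed = eta_a**beta / (eta_a**beta + eta_b**beta)

# Monte Carlo check of the closed form
xa = eta_a * rng.weibull(beta, 200_000)
xb = eta_b * rng.weibull(beta, 200_000)
p_mc = np.mean(xa > xb)

print(f"closed form: {p_closed:.3f}, Monte Carlo: {p_mc:.3f}")
```

This pairwise exceedance probability is not Johnson's confidence number, but it captures the same intuition: it grows with both the slope and the mean-life ratio.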
A two-parameter family of double-power-law biorthonormal potential-density expansions
Lilley, Edward J.; Sanders, Jason L.; Evans, N. Wyn
2018-05-01
We present a two-parameter family of biorthonormal double-power-law potential-density expansions. Both the potential and density are given in closed analytic form and may be rapidly computed via recurrence relations. We show that this family encompasses all the known analytic biorthonormal expansions: the Zhao expansions (themselves generalizations of ones found earlier by Hernquist & Ostriker and by Clutton-Brock) and the recently discovered Lilley et al. (2017a) expansion. Our new two-parameter family includes expansions based around many familiar spherical density profiles as zeroth-order models, including the γ models and the Jaffe model. It also contains a basis expansion that reproduces the famous Navarro-Frenk-White (NFW) profile at zeroth order. The new basis expansions have been found via a systematic methodology which has wide applications in finding other new expansions. In the process, we also uncovered a novel integral transform solution to Poisson's equation.
Thermodynamics of two-parameter quantum group Bose and Fermi gases
International Nuclear Information System (INIS)
Algin, A.
2005-01-01
The high and low temperature thermodynamic properties of the two-parameter deformed quantum group Bose and Fermi gases with SU_{p/q}(2) symmetry are studied. Starting with an SU_{p/q}(2)-invariant bosonic as well as fermionic Hamiltonian, several thermodynamic functions of the system such as the average number of particles, internal energy and equation of state are derived. The effects of the two real independent deformation parameters p and q on the properties of the systems are discussed. Particular emphasis is given to a discussion of the Bose-Einstein condensation phenomenon for the two-parameter deformed quantum group Bose gas. The results are also compared with earlier undeformed and one-parameter deformed versions of Bose and Fermi gas models. (author)
Band head spin assignment of superdeformed bands in 133Pr using two-parameter formulae
Sharma, Honey; Mittal, H. M.
2018-03-01
Two-parameter formulae, viz. the power index formula, the nuclear softness formula and the VMI model, are adopted to assign the band head spin (I0) of four superdeformed rotational bands in 133Pr. The technique of least-squares fitting is used to assign the band head spin of the four bands. The root-mean deviations between the computed transition energies and the well-known experimental transition energies are obtained by extracting the model parameters of the two-parameter formulae. The calculated transition energies are in excellent agreement with the experimental transition energies whenever the exact spins are assigned. The power index formula agrees well with the experimental data and yields the minimum root-mean deviation, so it is a more efficient tool than the nuclear softness formula and the VMI model. The variation of the dynamic moment of inertia J(2) with rotational frequency is also examined.
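The least-squares spin-assignment procedure described above can be sketched with a plain rigid-rotor energy formula, E_gamma(I -> I-2) = A(4I - 2), standing in for the paper's two-parameter formulae; the constant A and the "true" band head spin below are invented for the demonstration:

```python
import numpy as np

# Synthetic superdeformed-band gamma energies from a rigid-rotor model
# E_gamma(I -> I-2) = A*(4I - 2); A and the true band head spin are
# assumed values for illustration only (not the 133Pr data).
A_true, I0_true = 10.0, 16.5   # keV, hbar
spins = I0_true + 2.0 * np.arange(1, 9)   # initial spins of the observed transitions
e_obs = A_true * (4.0 * spins - 2.0)

def rms_for(i0, e_obs):
    """Least-squares fit of A for a candidate band head spin i0."""
    I = i0 + 2.0 * np.arange(1, len(e_obs) + 1)
    x = 4.0 * I - 2.0
    A = np.dot(x, e_obs) / np.dot(x, x)     # linear least squares through the origin
    return np.sqrt(np.mean((A * x - e_obs) ** 2))

candidates = I0_true + np.arange(-4, 5)     # trial spins in steps of 1 hbar
best = min(candidates, key=lambda i0: rms_for(i0, e_obs))
print("assigned band head spin:", best)
```

As in the paper, the assigned spin is the candidate that minimizes the root-mean deviation between fitted and observed transition energies; here the minimum is exactly zero at the true spin.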
Kosmidis, Kosmas; Argyrakis, Panos; Macheras, Panos
2003-07-01
To verify the Higuchi law and study drug release from cylindrical and spherical matrices by means of Monte Carlo computer simulation. A one-dimensional matrix, based on the theoretical assumptions of the derivation of the Higuchi law, was simulated and its time evolution was monitored. Cylindrical and spherical three-dimensional lattices were simulated, with sites at the boundary of the lattice denoted as leak sites. Particles were allowed to move inside the lattice using the random walk model. Excluded volume interactions between the particles were assumed. We monitored the system's time evolution for different lattice sizes and different initial particle concentrations. The Higuchi law was verified using the Monte Carlo technique in a one-dimensional lattice. It was found that Fickian drug release from cylindrical matrices can be approximated nicely with the Weibull function. A simple linear relation between the Weibull function parameters and the specific surface of the system was found. Drug release from a matrix, as a result of a diffusion process assuming excluded volume interactions between the drug molecules, can be described using a Weibull function. This model, although approximate and semiempirical, has the benefit of providing a simple physical connection between the model parameters and the system geometry, which was missing from other semiempirical models.
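A stripped-down version of the simulation described above: non-interacting random walkers on a one-dimensional lattice with a leak site at one end (the paper additionally includes excluded-volume interactions), with the release curve fitted by the linearized Weibull function:

```python
import numpy as np

rng = np.random.default_rng(2)

# Minimal sketch: non-interacting random walkers on a 1-D lattice of L sites
# with a leak site at x = 0 (excluded-volume interactions are ignored here,
# unlike the full simulation in the paper).
L, N, steps = 30, 2000, 4000
pos = rng.integers(1, L + 1, size=N)      # sites 1..L; site 0 is the leak
alive = np.ones(N, dtype=bool)
released = np.zeros(steps)

for t in range(steps):
    step = rng.choice((-1, 1), size=N)
    pos[alive] = np.clip(pos[alive] + step[alive], 0, L)   # reflecting at x = L
    alive &= pos > 0                       # walkers reaching site 0 escape
    released[t] = N - alive.sum()

frac = released / N
# Fit the Weibull release function 1 - exp(-a t^b) by linearization:
# ln(-ln(1 - frac)) = ln(a) + b*ln(t)
mask = (frac > 0.01) & (frac < 0.95)
tt = np.arange(1, steps + 1)[mask]
y = np.log(-np.log(1.0 - frac[mask]))
b, lna = np.polyfit(np.log(tt), y, 1)
print(f"fitted Weibull exponent b = {b:.2f}")
```

The fitted exponent b lies below 1, reflecting the early-time square-root (Higuchi-like) release before the late exponential decay.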
Quantum-classical crossover of the escape rate in the two-parameter doubly periodic potential
Energy Technology Data Exchange (ETDEWEB)
Zhou Bin [Department of Physics, Hubei University, Wuhan 430062, Hubei (China)]. E-mail: binzhoucn@yahoo.com
2005-05-09
The transition from quantum tunneling to classical hopping for a two-parameter doubly periodic potential is investigated. According to Chudnovsky's criterion for the first-order transition, it is shown that either a first- or a second-order transition occurs, depending on the parameter region. The phase boundary lines between the first- and second-order transitions are calculated, and a complete phase diagram is presented.
The two-parameter deformation of GL(2), its differential calculus, and Lie algebra
International Nuclear Information System (INIS)
Schirrmacher, A.; Wess, J.
1991-01-01
The Yang-Baxter equation is solved in two dimensions giving rise to a two-parameter deformation of GL(2). The transformation properties of quantum planes are briefly discussed. Non-central determinant and inverse are constructed. A right-invariant differential calculus is presented and the role of the different deformation parameters investigated. While the corresponding Lie algebra relations are simply deformed, the comultiplication exhibits both quantization parameters. (orig.)
Energy Technology Data Exchange (ETDEWEB)
Safari, Bonfils; Gasore, Jimmy [Department of Physics, National University of Rwanda, P.O. Box 117, Huye, South Province (Rwanda)
2010-12-15
A wind energy system converts the kinetic energy of the wind into mechanical or electrical energy that can be harnessed for practical uses; such systems can transform the economy of rural areas, where access to water and electricity is very restricted and industry is almost nonexistent, in most developing countries like Rwanda. Assessing the wind power potential of a location is an imperative requirement before making a decision on the installation of windmills or a wind electric generator and before evaluating plans for related projects. The aim of the present study was to evaluate the potential of the wind resource in Rwanda and to constitute a database for users of wind power. A time series of hourly measured wind speed and wind direction for the period between 1974 and 1993 at five main Rwandan meteorological stations was provided by the National Meteorology Department. Statistical methods applying the Weibull and Rayleigh distributions were used to evaluate the wind speed characteristics and the wind power potential at a height of 10 m above ground level using hourly monthly average data. Those characteristics were extrapolated to greater heights. The results give a global picture of the distribution of the wind potential in different locations of Rwanda. (author)
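The Weibull shape and scale factors that recur throughout these wind studies can be estimated from a station's mean wind speed and standard deviation alone. A minimal sketch using the Justus moment approximation, with assumed (not Rwandan) station statistics:

```python
import math

# Moment-based estimates of the Weibull parameters from the mean and
# standard deviation of measured wind speeds (illustrative values).
u_mean, u_std = 5.2, 2.1          # m/s, assumed station statistics
k = (u_std / u_mean) ** -1.086    # shape, Justus approximation (valid for 1 <= k <= 10)
c = u_mean / math.gamma(1.0 + 1.0 / k)   # scale, m/s

# Mean wind power density for a Weibull wind at air density rho:
rho = 1.225                        # kg/m^3
p = 0.5 * rho * c**3 * math.gamma(1.0 + 3.0 / k)   # W/m^2
print(f"k = {k:.2f}, c = {c:.2f} m/s, power density = {p:.0f} W/m^2")
```

Maximum-likelihood fits to the full speed histogram are more accurate but require the raw data; the moment method only needs the two summary statistics usually reported.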
Ghorbanpour Arani, A.; Zamani, M. H.
2018-06-01
The present work deals with the bending behavior of a nanocomposite beam resting on a two-parameter modified Vlasov model foundation (MVMF), with consideration of the agglomeration and distribution of carbon nanotubes (CNTs) in the beam matrix. An equivalent fiber based on the Eshelby-Mori-Tanaka approach is employed to determine the influence of CNT aggregation on the elastic properties of the CNT-reinforced beam. The governing equations are deduced using the principle of minimum potential energy under the assumptions of Euler-Bernoulli beam theory. The MVMF requires estimation of the γ parameter; for this purpose, an iterative technique based on variational principles is utilized to compute γ, and subsequently the fourth-order differential equation is solved analytically. Eventually, the transverse displacements and bending stresses are obtained and compared for different agglomeration parameters and various boundary conditions simultaneously, the elastic foundation being characterized without the need to assume values for the foundation parameters.
Transformation and Self-Similarity Properties of Gamma and Weibull Fragment Size Distributions
2015-12-01
the spatial dimension (1 ≤ m ≤ 3). For classic aerosols with nearly-spherical droplets, m = 3 and s = 6. Alternatively, Wittel et. al. (2006)…which come from the definitions of F_M(D), F_M(M), F(D), and F(M). The fourth transformation equation is as follows: f_M(D) ~ D^m f(D)…dimension defined by Equation (2), Γ(x) is the gamma function, and Γ(s, x) is the upper incomplete gamma function. Table 1a. Rosin-Rammler (a.k.a
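The mass-weighting transformation f_M(D) ~ D^m f(D) appearing in this record can be checked numerically for a gamma number distribution: weighting by D^m shifts the shape parameter from s to s + m while leaving the scale unchanged. Parameter values below are illustrative:

```python
import math
import numpy as np

# Numerical check of f_M(D) ~ D^m f(D) for a gamma number distribution;
# m = 3 corresponds to nearly-spherical fragments. Values are illustrative.
s, theta, m = 2.0, 1.5, 3

D = np.linspace(0.1, 20.0, 2000)
dx = D[1] - D[0]
f = D**(s - 1) * np.exp(-D / theta) / (math.gamma(s) * theta**s)

fM = D**m * f
fM /= fM.sum() * dx                      # renormalize the weighted density

# Analytic gamma density with shape s + m and the same scale theta
f_shift = D**(s + m - 1) * np.exp(-D / theta) / (math.gamma(s + m) * theta**(s + m))
err = np.max(np.abs(fM - f_shift))
print("max abs difference:", err)
```

The residual difference comes only from truncating the grid at D = 20 and from the Riemann-sum normalization.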
Some challenges of wind modelling for modern wind turbines: The Weibull distribution
DEFF Research Database (Denmark)
Gryning, Sven-Erik; Batchvarova, Ekatarina; Floors, Rogier
2012-01-01
Wind power assessments, as well as forecasts of wind energy production, are key issues in wind energy and grid related studies. However, the hub height of today’s wind turbines is well above the surface layer. Wind profile studies based on mast data show that the wind profile above the surface layer...
An Iterative Optimization Algorithm for Lens Distortion Correction Using Two-Parameter Models
Directory of Open Access Journals (Sweden)
Daniel Santana-Cedrés
2016-12-01
Full Text Available We present a method for the automatic estimation of two-parameter radial distortion models, considering polynomial as well as division models. The method first detects the longest distorted lines within the image by applying the Hough transform enriched with a radial distortion parameter. From these lines, the first distortion parameter is estimated, then we initialize the second distortion parameter to zero and the two-parameter model is embedded into an iterative nonlinear optimization process to improve the estimation. This optimization aims at reducing the distance from the edge points to the lines, adjusting two distortion parameters as well as the coordinates of the center of distortion. Furthermore, this allows detecting more points belonging to the distorted lines, so that the Hough transform is iteratively repeated to extract a better set of lines until no improvement is achieved. We present some experiments on real images with significant distortion to show the ability of the proposed approach to automatically correct this type of distortion as well as a comparison between the polynomial and division models.
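The two-parameter polynomial radial distortion model at the core of the method can be sketched as follows. The coefficients and the distortion center are assumed values, and the Hough-based line detection and the paper's full nonlinear optimization are not reproduced; only the forward model and its inversion by fixed-point iteration are shown:

```python
import numpy as np

# Two-parameter polynomial radial distortion model
#   x_d = x_u * (1 + k1*r^2 + k2*r^4)
# and its inversion by fixed-point iteration (coefficients and the
# distortion center are assumed values for illustration).
k1, k2 = -0.12, 0.01
center = np.array([0.0, 0.0])   # distortion center, assumed at the origin

def distort(p):
    r2 = np.sum((p - center) ** 2)
    return center + (p - center) * (1.0 + k1 * r2 + k2 * r2**2)

def undistort(pd, iters=50):
    p = pd.copy()
    for _ in range(iters):       # fixed-point iteration: p <- pd / L(r(p))
        r2 = np.sum((p - center) ** 2)
        p = center + (pd - center) / (1.0 + k1 * r2 + k2 * r2**2)
    return p

pu = np.array([0.8, 0.4])        # an undistorted point (normalized coordinates)
pd = distort(pu)
pu_rec = undistort(pd)
print("recovered undistorted point:", pu_rec)
```

For moderate distortion the iteration is a contraction and converges quickly; strong distortion may require a damped update or a Newton step instead.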
Weibull statistics effective area and volume in the ball-on-ring testing method
DEFF Research Database (Denmark)
Frandsen, Henrik Lund
2014-01-01
The ball-on-ring method is together with other biaxial bending methods often used for measuring the strength of plates of brittle materials, because machining defects are remote from the high stresses causing the failure of the specimens. In order to scale the measured Weibull strength...... to geometries relevant for the application of the material, the effective area or volume for the test specimen must be evaluated. In this work analytical expressions for the effective area and volume of the ball-on-ring test specimen is derived. In the derivation the multiaxial stress field has been accounted...
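Once an effective area or volume has been evaluated for each geometry, Weibull scaling between the test specimen and the application geometry is a one-line calculation. All numbers below are illustrative:

```python
# Weibull size scaling between two effective volumes: characteristic
# strengths scale as  s1 / s2 = (V2 / V1)^(1/m).  Values are illustrative.
m = 10.0                 # Weibull modulus of the ceramic
v_test = 2.0             # effective volume of the ball-on-ring specimen, mm^3
v_app = 200.0            # effective volume of the application geometry, mm^3
s_test = 400.0           # characteristic strength measured in the test, MPa

s_app = s_test * (v_test / v_app) ** (1.0 / m)
print(f"predicted characteristic strength at {v_app} mm^3: {s_app:.0f} MPa")
```

A hundredfold larger stressed volume lowers the predicted strength by a factor of 100^(1/m), about 37% for m = 10, which is why the effective-volume expressions derived in the paper matter in practice.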
DEFF Research Database (Denmark)
Silau, Harald; Stabell, Nicolai Bogø; Petersen, Frederik Riddersholm
2018-01-01
To realize the commercial potential of dielectric elastomers, reliable, large-scale film production is required. Ensuring proper mixing and subsequently avoiding demixing after, for example, pumping and coating of elastomer premix in an online process is not facile. Weibull analysis...... of the electrical breakdown strength of dielectric elastomer films is shown to be an effective means of evaluating the film quality. The analysis is shown to be capable of distinguishing between proper and improper mixing schemes where similar analysis of ultimate mechanical properties fails to distinguish....
J. Szymszal; J. Piątkowski; J. Przondziono
2007-01-01
The first part of the study describes the methods used to determine the Weibull modulus and the related reliability index of hypereutectic silumins containing about 17% Si, assigned for the manufacture of high-duty castings used in automotive applications and aviation. The second part of the study discusses the importance of chemical composition, including additions of 3% Cu, 1.5% Ni and 1.5% Mg, while in the third part attention is focused on the effect of process history, including moul...
Directory of Open Access Journals (Sweden)
Isis Didier Lins
2018-03-01
Full Text Available The Generalized Renewal Process (GRP is a probabilistic model for repairable systems that can represent the usual states of a system after a repair: as new, as old, or in a condition between new and old. It is often coupled with the Weibull distribution, widely used in the reliability context. In this paper, we develop novel GRP models based on probability distributions that stem from the Tsallis’ non-extensive entropy, namely the q-Exponential and the q-Weibull distributions. The q-Exponential and Weibull distributions can model decreasing, constant or increasing failure intensity functions. However, the power law behavior of the q-Exponential probability density function for specific parameter values is an advantage over the Weibull distribution when adjusting data containing extreme values. The q-Weibull probability distribution, in turn, can also fit data with bathtub-shaped or unimodal failure intensities in addition to the behaviors already mentioned. Therefore, the q-Exponential-GRP is an alternative for the Weibull-GRP model and the q-Weibull-GRP generalizes both. The method of maximum likelihood is used for their parameters’ estimation by means of a particle swarm optimization algorithm, and Monte Carlo simulations are performed for the sake of validation. The proposed models and algorithms are applied to examples involving reliability-related data of complex systems and the obtained results suggest GRP plus q-distributions are promising techniques for the analyses of repairable systems.
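The q-distributions used in the GRP models above are built on the q-exponential function, exp_q(x) = [1 + (1-q)x]^(1/(1-q)). A quick numerical check that the q-exponential density recovers the ordinary exponential density as q -> 1 (the rate value is arbitrary):

```python
import numpy as np

# The q-exponential density  f(t) = (2-q) * lam * expq(-lam*t),  with
# expq(x) = [1 + (1-q)*x]^(1/(1-q)),  reduces to the ordinary exponential
# density as q -> 1; for 1 < q < 2 it has a power-law tail instead.
def q_exp_pdf(t, lam, q):
    base = 1.0 + (1.0 - q) * (-lam * t)
    return (2.0 - q) * lam * np.power(base, 1.0 / (1.0 - q))

t = np.linspace(0.0, 5.0, 200)
lam = 0.7
near = q_exp_pdf(t, lam, 1.0 + 1e-6)   # q slightly above 1
exact = lam * np.exp(-lam * t)
print("max abs difference:", np.max(np.abs(near - exact)))
```

The same limiting behavior carries over to the q-Weibull, which is why the q-Weibull-GRP generalizes both the Weibull-GRP and the q-Exponential-GRP, as the abstract states.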
Ju Feng; Wen Zhong Shen
2015-01-01
Reliable wind modelling is of crucial importance for wind farm development. The common practice of using sector-wise Weibull distributions has been found inappropriate for wind farm layout optimization. In this study, we propose a simple and easily implementable method to construct joint distributions of wind speed and wind direction, which is based on the parameters of sector-wise Weibull distributions and interpolations between direction sectors. It is applied to the wind measurement data a...
A two-parameter nondiffusive heat conduction model for data analysis in pump-probe experiments
Ma, Yanbao
2014-12-01
Nondiffusive heat transfer has attracted intensive research interest in the last 50 years because of its importance in fundamental physics and engineering applications. It has unique features that cannot be described by the Fourier law. However, current studies of nondiffusive heat transfer still focus on studying the effective thermal conductivity within the framework of the Fourier law, due to the lack of a well-accepted replacement. Here, we show that nondiffusive heat conduction can be characterized by two inherent material properties: a diffusive thermal conductivity and a ballistic transport length. We also present a two-parameter heat conduction model and demonstrate its validity in different pump-probe experiments. This model not only offers new insights into nondiffusive heat conduction but also opens up new avenues for the study of nondiffusive heat transfer outside the framework of the Fourier law.
Vibrations And Stability Of Bernoulli-Euler And Timoshenko Beams On Two-Parameter Elastic Foundation
Directory of Open Access Journals (Sweden)
Obara P.
2014-12-01
Full Text Available The vibration and stability analysis of uniform beams supported on a two-parameter elastic foundation is performed. The second foundation parameter is a function of the total rotation of the beam. The effects of axial force, foundation stiffness parameters, transverse shear deformation and rotatory inertia are incorporated into an accurate vibration analysis. The work addresses the important question of the relationships between the parameters describing the beam vibration, the compressive force and the foundation parameters. For the freely supported beam, exact formulas for the natural vibration frequencies and the critical forces, together with a formula defining the relationship between the vibration frequency and the compressive force, are derived. For other support conditions of the beam, conditional equations are obtained; these equations determine the dependence of the vibration frequency on the compressive force for the assumed parameters of the elastic foundation and the slenderness of the beam.
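For the simply supported Euler-Bernoulli case, the frequency and buckling expressions on a two-parameter (Pasternak-type) foundation take a standard closed form; a sketch with illustrative section and foundation values (this is the classical textbook result, not the paper's conditional equations for other supports):

```python
import numpy as np

# Simply supported Euler-Bernoulli beam on a two-parameter foundation
# with axial force P (standard result, illustrative SI values):
#   omega_n^2 = (EI*a^4 + k + G*a^2 - P*a^2) / (rho*A),   a = n*pi/L
E, I = 210e9, 2.0e-5        # steel, second moment of area (m^4)
rho, A = 7850.0, 0.01       # density (kg/m^3), cross-section (m^2)
L = 6.0                     # span (m)
k, G = 5.0e6, 1.0e5         # Winkler and shear (second) foundation parameters
P = 0.0                     # no axial force

n = np.arange(1, 4)
a = n * np.pi / L
omega = np.sqrt((E * I * a**4 + k + G * a**2 - P * a**2) / (rho * A))
print("first three natural frequencies (rad/s):", np.round(omega, 1))

# Setting omega = 0 gives the buckling load, minimized over the mode number:
#   P_cr = min_n ( EI*a^2 + G + k/a^2 )
P_cr = np.min(E * I * a**2 + G + k / a**2)
print(f"critical force ~ {P_cr/1e6:.2f} MN")
```

Note that with a stiff Winkler term the buckling minimum can occur at n > 1, which the minimization over mode numbers captures.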
International Nuclear Information System (INIS)
Zhang, L.F.; Xie, M.; Tang, L.C.
2006-01-01
Estimation of the Weibull shape parameter is important in reliability engineering. However, commonly used methods such as the maximum likelihood estimation (MLE) and the least squares estimation (LSE) are known to be biased. Bias correction methods for MLE have been studied in the literature. This paper investigates methods for bias correction when model parameters are estimated with LSE based on the probability plot. The Weibull probability plot is very simple and commonly used by practitioners, and hence such a study is useful. The bias of the LS shape parameter estimator for multiply censored data is also examined. It is found that the bias can be modeled as a function of the sample size and the censoring level, and is mainly dependent on the latter. A simple bias function is introduced and bias-correcting formulas are proposed for both complete and censored data. Simulation results are also presented. The bias correction methods proposed are very easy to use and they can typically reduce the bias of the LSE of the shape parameter to less than half a percent.
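The small-sample behavior of the probability-plot (least-squares) shape estimator discussed above is easy to examine by Monte Carlo for complete samples; the sample size, slope and replication count below are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(3)

# Monte Carlo study of the probability-plot (least-squares) estimator of
# the Weibull shape parameter for small complete samples.
beta_true, n, reps = 2.0, 10, 3000
ratios = np.empty(reps)

for r in range(reps):
    x = np.sort(rng.weibull(beta_true, n))
    F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)    # median ranks
    slope, _ = np.polyfit(np.log(x), np.log(-np.log(1 - F)), 1)
    ratios[r] = slope / beta_true

# The deviation of the mean ratio from 1 measures the small-sample bias.
print(f"mean(beta_hat / beta) = {ratios.mean():.3f}")
```

A bias-correction formula of the kind proposed in the paper would multiply each estimate by a factor depending on n (and, for censored data, on the censoring level) to pull this ratio back toward 1.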
Utilization of Weibull equation to obtain soil-water diffusivity in horizontal infiltration
International Nuclear Information System (INIS)
Guerrini, I.A.
1982-06-01
Water movement was studied in horizontal infiltration experiments using laboratory columns of air-dry, homogeneous soil in order to obtain a simple and suitable equation for soil-water diffusivity. Many water content profiles were obtained for each of the ten soil columns through the gamma-ray attenuation technique using a 137 Cs source. During the measurement of a particular water content profile, the soil column was held in the same position in order to measure changes over time and thus reduce the errors in water content determination. The Weibull equation fitted the experimental water content profiles extremely well. The use of an analytical function for ν, the Boltzmann variable, according to the Weibull model allowed a simple equation for soil-water diffusivity to be obtained. Comparisons were made between the diffusivity equation obtained here and other solutions found in the literature, and the unsuitability of a simple exponential variation of diffusivity with water content over the full range of the latter was shown. The necessity of admitting a time dependency for diffusivity was confirmed, as was the possibility of fixing that dependency at a well-known value and extending it to generalized soil-water infiltration studies. Finally, it was shown that the soil-water diffusivity function given by the equation proposed here can be obtained simply by analyzing the advance of the wetting front as a function of time. (Author) [pt]
International Nuclear Information System (INIS)
Sun, Huarui; Bajo, Miguel Montes; Uren, Michael J.; Kuball, Martin
2015-01-01
Gate leakage degradation of AlGaN/GaN high electron mobility transistors under OFF-state stress is investigated using a combination of electrical, optical, and surface morphology characterizations. The generation of leakage “hot spots” at the edge of the gate is found to be strongly temperature accelerated. The time for the formation of each failure site follows a Weibull distribution with a shape parameter in the range of 0.7–0.9 from room temperature up to 120 °C. The average leakage per failure site is only weakly temperature dependent. The stress-induced structural degradation at the leakage sites exhibits a temperature dependence in the surface morphology, which is consistent with a surface defect generation process involving temperature-associated changes in the breakdown sites
Directory of Open Access Journals (Sweden)
Janine Treter
2010-01-01
Full Text Available Saponins are natural soap-like foam-forming compounds widely used in foods and in cosmetic and pharmaceutical preparations. In this work, the foamability and foam lifetime of foams obtained from Ilex paraguariensis unripe fruits were analyzed. Polysorbate 80 and sodium dodecyl sulfate were used as reference surfactants. Aiming at a better understanding of the data, a linearized four-parameter Weibull function was proposed. The mate hydroethanolic extract (ME) and a mate saponin enriched fraction (MSF) afforded foamability and foam lifetime comparable to the synthetic surfactants. The linearization of the Weibull equation allowed the statistical comparison of foam decay curves, improving on former mathematical approaches.
Statistical distributions as applied to environmental surveillance data
International Nuclear Information System (INIS)
Speer, D.R.; Waite, D.A.
1976-01-01
Application of normal, lognormal, and Weibull distributions to radiological environmental surveillance data was investigated for approximately 300 nuclide-medium-year-location combinations. The fit of data to distributions was compared through probability plotting (special graph paper provides a visual check) and W test calculations. Results show that 25% of the data fit the normal distribution, 50% fit the lognormal, and 90% fit the Weibull. Demonstration of how to plot each distribution shows that normal and lognormal distributions are comparatively easy to use while the Weibull distribution is complicated and difficult to use. Although current practice is to use normal distribution statistics, normal fit the least number of data groups considered in this study
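The probability plotting described above can be made quantitative with a probability-plot correlation coefficient (PPCC) for each candidate distribution. A numpy-only sketch on synthetic lognormal data; the inverse normal CDF is obtained by bisection on erf to avoid external dependencies:

```python
import numpy as np

rng = np.random.default_rng(4)

# Probability-plot correlation check of normal / lognormal / Weibull fits
# on synthetic lognormal data (distribution parameters are arbitrary).
data = np.sort(rng.lognormal(mean=0.5, sigma=0.6, size=200))
n = len(data)
p = (np.arange(1, n + 1) - 0.375) / (n + 0.25)      # Blom plotting positions

def norm_ppf(p):
    """Inverse standard normal CDF by bisection on erf (no scipy needed)."""
    from math import erf, sqrt
    lo, hi = np.full_like(p, -10.0), np.full_like(p, 10.0)
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        cdf = 0.5 * (1 + np.vectorize(erf)(mid / sqrt(2)))
        lo = np.where(cdf < p, mid, lo)
        hi = np.where(cdf < p, hi, mid)
    return 0.5 * (lo + hi)

z = norm_ppf(p)
r_normal = np.corrcoef(z, data)[0, 1]               # normal plot: z vs x
r_lognormal = np.corrcoef(z, np.log(data))[0, 1]    # lognormal: z vs ln x
r_weibull = np.corrcoef(np.log(-np.log(1 - p)), np.log(data))[0, 1]

print(f"PPCC normal={r_normal:.3f} lognormal={r_lognormal:.3f} weibull={r_weibull:.3f}")
```

The distribution whose plot is straightest (PPCC closest to 1) is the best candidate, mirroring the visual graph-paper check the abstract mentions.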
Fitting diameter distribution models to data from forest inventories with concentric plot design
Directory of Open Access Journals (Sweden)
Nikos Nanos
2017-10-01
Research highlights: We designed a new method to fit the Weibull distribution to forest inventory data from concentric plots that achieves high accuracy and precision in parameter estimates regardless of the within-plot spatial tree pattern.
Two-parameter nonlinear spacetime perturbations: gauge transformations and gauge invariance
International Nuclear Information System (INIS)
Bruni, Marco; Gualtieri, Leonardo; Sopuerta, Carlos F
2003-01-01
An implicit fundamental assumption in relativistic perturbation theory is that there exists a parametric family of spacetimes that can be Taylor expanded around a background. The choice of the latter is crucial to obtain a manageable theory, so that it is sometimes convenient to construct a perturbative formalism based on two (or more) parameters. The study of perturbations of rotating stars is a good example: in this case one can treat the stationary axisymmetric star using a slow rotation approximation (expansion in the angular velocity Ω), so that the background is spherical. Generic perturbations of the rotating star (say parametrized by λ) are then built on top of the axisymmetric perturbations in Ω. Clearly, any interesting physics requires nonlinear perturbations, as at least terms λΩ need to be considered. In this paper, we analyse the gauge dependence of nonlinear perturbations depending on two parameters, derive explicit higher-order gauge transformation rules and define gauge invariance. The formalism is completely general and can be used in different applications of general relativity or any other spacetime theory
TWO-PARAMETER ISOTHERMS OF METHYL ORANGE SORPTION BY PINECONE DERIVED ACTIVATED CARBON
Directory of Open Access Journals (Sweden)
M. R. Samarghandi ، M. Hadi ، S. Moayedi ، F. Barjasteh Askari
2009-10-01
Full Text Available The adsorption of a mono azo dye, methyl orange (MeO), onto granular pinecone derived activated carbon (GPAC) from aqueous solutions was studied in a batch system. Seven two-parameter isotherm models (Langmuir, Freundlich, Dubinin-Radushkevich, Temkin, Halsey, Jovanovic and Harkins-Jura) were used to fit the experimental data. The results revealed that the adsorption isotherm models fitted the data in the order Jovanovic (X2=1.374) > Langmuir > Dubinin-Radushkevich > Temkin > Freundlich > Halsey > Harkins-Jura. Adsorption isotherm modeling showed that the interaction of the dye with the activated carbon surface is localized monolayer adsorption. A comparison of kinetic models was evaluated for the pseudo-second-order, Elovich and Lagergren kinetic models. The Lagergren first-order model was found to agree well with the experimental data (X2=9.231). In order to determine the best-fit isotherm and kinetic models, two error analysis methods, the residual mean square error and the chi-square statistic (X2), were used to evaluate the data.
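Two of the two-parameter isotherm fits and the chi-square comparison used in the study can be sketched on synthetic equilibrium data; the GPAC measurements are not public here, so all concentrations and constants below are invented:

```python
import numpy as np

# Linearized Langmuir and Freundlich fits to synthetic equilibrium data,
# compared with the chi-square statistic (all numbers are invented).
Ce = np.array([2.0, 5.0, 10.0, 20.0, 40.0, 80.0])          # mg/L
qm_true, b_true = 120.0, 0.08
qe = qm_true * b_true * Ce / (1.0 + b_true * Ce)            # Langmuir-generated
qe = qe * (1 + 0.02 * np.array([1, -1, 1, -1, 1, -1]))      # small "noise"

# Langmuir linearization: Ce/qe = 1/(qm*b) + Ce/qm
slope, icept = np.polyfit(Ce, Ce / qe, 1)
qm, bq = 1.0 / slope, slope / icept
q_lang = qm * bq * Ce / (1.0 + bq * Ce)

# Freundlich linearization: ln qe = ln Kf + (1/n) * ln Ce
ninv, lnKf = np.polyfit(np.log(Ce), np.log(qe), 1)
q_freu = np.exp(lnKf) * Ce**ninv

chi2 = lambda q_fit: np.sum((qe - q_fit) ** 2 / q_fit)      # X2 error statistic
print(f"chi-square: Langmuir={chi2(q_lang):.3f}, Freundlich={chi2(q_freu):.3f}")
```

Because the synthetic data follow a Langmuir curve, the Langmuir chi-square comes out smaller; with real data the same comparison ranks the candidate isotherms, as in the paper.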
Modeling the reliability and maintenance costs of wind turbines using Weibull analysis
Energy Technology Data Exchange (ETDEWEB)
Vachon, W.A. [W.A. Vachon & Associates, Inc., Manchester, MA (United States)
1996-12-31
A general description is provided of the basic mathematics and use of Weibull statistical models for modeling component failures and maintenance costs as a function of time. The applicability of the model to wind turbine components and subsystems is discussed with illustrative examples of typical component reliabilities drawn from actual field experiences. Example results indicate the dominant role of key subsystems based on a combination of their failure frequency and repair/replacement costs. The value of the model is discussed as a means of defining (1) maintenance practices, (2) areas in which to focus product improvements, (3) spare parts inventory, and (4) long-term trends in maintenance costs as an important element in project cash flow projections used by developers, investors, and lenders. 6 refs., 8 figs., 3 tabs.
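The failure and maintenance-cost modeling described above can be sketched as a renewal simulation with Weibull failure times and good-as-new repairs; every parameter value below is illustrative, not drawn from field data:

```python
import numpy as np

rng = np.random.default_rng(5)

# Renewal simulation of one wind-turbine component: Weibull failure times,
# replacement on failure, costs accumulated over a project horizon
# (all numbers illustrative).
beta, eta = 2.2, 7.0       # wearout-type failures, characteristic life 7 yr
cost_per_repair = 15_000.0 # USD per replacement
horizon = 20.0             # project life, years
reps = 5000

total = np.empty(reps)
for r in range(reps):
    t, cost = 0.0, 0.0
    while True:
        t += eta * rng.weibull(beta)       # time to next failure (good-as-new repair)
        if t > horizon:
            break
        cost += cost_per_repair
    total[r] = cost

print(f"expected maintenance cost over {horizon:.0f} yr: ${total.mean():,.0f}")
```

Running this per subsystem and summing gives the kind of long-term maintenance cash-flow projection the abstract describes; shape parameters above 1 model wearout, below 1 infant mortality.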
A study of optimization problems for amplify-and-forward relaying over Weibull fading channels
Ikki, Salama Said
2010-09-01
This paper addresses the power allocation and relay positioning problems in amplify-and-forward cooperative networks operating in Weibull fading environments. We study adaptive power allocation (PA) with fixed relay location, optimal relay location with fixed power allocation, and joint optimization of the PA and relay location under total transmit power constraint, in order to minimize the outage probability and average error probability at high signal-to-noise ratios (SNR). Analytical results are validated by numerical simulations and comparisons between the different optimization schemes and their performance are provided. Results show that optimum PA brings only coding gain, while optimum relay location yields, in addition to the latter, diversity gains as well. Also, joint optimization improves both, the diversity gain and coding gain. Furthermore, results illustrate that the analyzed adaptive algorithms outperform uniform schemes. ©2010 IEEE.
Energy Technology Data Exchange (ETDEWEB)
Bass, B.R.; Williams, P.T.; McAfee, W.J.; Pugh, C.E. [Oak Ridge National Lab., Heavy-Section Steel Technology Program, Oak Ridge, TN (United States)
2001-07-01
A primary objective of the United States Nuclear Regulatory Commission (USNRC)-sponsored Heavy-Section Steel Technology (HSST) Program is to develop and validate technology applicable to quantitative assessments of fracture prevention margins in nuclear reactor pressure vessels (RPVs) containing flaws and subjected to service-induced material toughness degradation. This paper describes an experimental/analytical program for the development of a Weibull statistical model of cleavage fracture toughness for applications to shallow surface-breaking and embedded flaws in RPV materials subjected to multi-axial loading conditions. The experimental part includes both material characterization testing and larger fracture toughness experiments conducted using a special-purpose cruciform beam specimen developed by Oak Ridge National Laboratory for applying biaxial loads to shallow cracks. Test materials (pressure vessel steels) included plate product forms (conforming to ASTM A533 Grade B Class 1 specifications) and shell segments procured from a pressurized-water reactor vessel intended for a nuclear power plant. Results from tests performed on cruciform specimens demonstrated that biaxial loading can have a pronounced effect on shallow-flaw fracture toughness in the lower-transition temperature region. A local approach methodology based on a three-parameter Weibull model was developed to correlate these experimentally observed biaxial effects on fracture toughness. The Weibull model, combined with a new hydrostatic stress criterion in place of the more commonly used maximum principal stress in the kernel of the Weibull stress integral definition, is shown to provide a scaling mechanism between uniaxial and biaxial loading states for 2-dimensional flaws located in the A533-B plate material. The Weibull stress density was introduced as a metric for identifying regions along a semi-elliptical flaw front that have a higher probability of cleavage initiation. Cumulative
Energy Technology Data Exchange (ETDEWEB)
Yilmaz, Şeyda, E-mail: seydayilmaz@ktu.edu.tr; Bayrak, Erdem, E-mail: erdmbyrk@gmail.com [Karadeniz Technical University, Trabzon (Turkey); Bayrak, Yusuf, E-mail: bayrak@ktu.edu.tr [Ağrı İbrahim Çeçen University, Ağrı (Turkey)
2016-04-18
In this study we examined and compared three probability distributions to determine the most suitable model for probabilistic assessment of earthquake hazards. We analyzed a reliable homogeneous earthquake catalogue covering the period 1900-2015 for magnitudes M ≥ 6.0 and estimated the probabilistic seismic hazard in the North Anatolian Fault zone (39°-41° N, 30°-40° E) using three distributions: the Weibull distribution, the Frechet distribution, and the three-parameter Weibull distribution. The suitability of the distribution parameters was evaluated with the Kolmogorov-Smirnov (K-S) goodness-of-fit test. We also compared the estimated cumulative probability and the conditional probabilities of earthquake occurrence for different elapsed times using the three distributions. We used Easyfit and Matlab software to calculate the distribution parameters and plotted the conditional probability curves. We concluded that the Weibull distribution was more suitable than the other distributions in this region.
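The fit-and-compare workflow described above can be sketched with SciPy; the synthetic inter-event times below are a hypothetical stand-in for a catalogue (the study's data and its three-parameter variant are not reproduced here).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical stand-in for inter-event times (years) between large earthquakes
data = rng.weibull(1.5, 200) * 10.0

# Fit the two candidate families; floc=0 pins location for the 2-parameter forms
w_shape, _, w_scale = stats.weibull_min.fit(data, floc=0)
f_shape, _, f_scale = stats.invweibull.fit(data, floc=0)  # Frechet = inverse Weibull

# Kolmogorov-Smirnov statistic: smaller means closer agreement with the fitted CDF
d_weibull = stats.kstest(data, 'weibull_min', args=(w_shape, 0, w_scale)).statistic
d_frechet = stats.kstest(data, 'invweibull', args=(f_shape, 0, f_scale)).statistic
print(round(d_weibull, 3), round(d_frechet, 3))
```

Because the synthetic data are Weibull-generated, the Weibull fit should yield the smaller K-S statistic, mirroring the comparison logic of the study.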
Spatial and temporal patterns of global onshore wind speed distribution
International Nuclear Information System (INIS)
Zhou, Yuyu; Smith, Steven J
2013-01-01
Wind power, a renewable energy source, can play an important role in electrical energy generation. Information regarding wind energy potential is important both for energy-related modeling and for decision-making in the policy community. While wind speed datasets with high spatial and temporal resolution are often ultimately used for detailed planning, simpler assumptions are often used in analysis work. An accurate representation of the wind speed frequency distribution is needed in order to properly characterize wind energy potential. Using a power density method, this study estimated global variation in wind parameters as fitted to a Weibull density function using NCEP/Climate Forecast System Reanalysis (CFSR) data over land areas. The Weibull distribution performs well in fitting the time-series wind speed data at most locations according to R², root mean square error, and power density error. The wind speed frequency distribution, as represented by the Weibull k parameter, exhibits a large amount of spatial variation, a regionally varying amount of seasonal variation, and relatively low decadal variation. We also analyzed the potential error in wind power estimation when a commonly assumed Rayleigh distribution (Weibull k = 2) is used. We find that the assumption of the same Weibull parameter across large regions can result in non-negligible errors. While large-scale wind speed data are often presented in the form of mean wind speeds, these results highlight the need to also provide information on the wind speed frequency distribution. (letter)
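The Rayleigh-assumption error discussed in this record follows directly from the closed-form Weibull moment E[v³] = c³Γ(1 + 3/k); the sketch below quantifies the power density error when k = 2 is imposed at equal mean wind speed. The k and c values are illustrative, not site estimates from the study.

```python
import math

def power_density(k, c, rho=1.225):
    """Mean wind power density (W/m^2) for Weibull-distributed speeds:
    P = 0.5 * rho * c^3 * Gamma(1 + 3/k)."""
    return 0.5 * rho * c**3 * math.gamma(1.0 + 3.0 / k)

# Illustrative site fitted as Weibull(k, c); a Rayleigh assumption fixes k = 2
k, c = 2.4, 7.0
mean_v = c * math.gamma(1.0 + 1.0 / k)
c_rayleigh = mean_v / math.gamma(1.5)  # scale giving the same mean speed with k = 2

p_true = power_density(k, c)
p_rayleigh = power_density(2.0, c_rayleigh)
print(round(100 * (p_rayleigh - p_true) / p_true, 1))  # percent error from assuming k = 2
```

For k > 2 (narrower speed distribution than Rayleigh), the Rayleigh assumption overestimates the power density at the same mean speed, since the broader k = 2 shape inflates E[v³].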
The distribution of first-passage times and durations in FOREX and future markets
Sazuka, Naoya; Inoue, Jun-ichi; Scalas, Enrico
2009-07-01
Possible distributions are discussed for intertrade durations and first-passage processes in financial markets, from the viewpoint of renewal theory. In order to represent market data with relatively long durations, two types of distributions are used, namely a distribution derived from the Mittag-Leffler survival function and the Weibull distribution. For the Mittag-Leffler type distribution, the average waiting time (residual life time) is strongly dependent on the choice of a cut-off parameter t_max, whereas results based on the Weibull distribution do not depend on such a cut-off. Therefore, a Weibull distribution is more convenient than a Mittag-Leffler type if one wishes to evaluate relevant statistics, such as the average waiting time, in financial markets with long durations. On the other hand, we find that the Gini index is rather independent of the cut-off parameter. Based on the above considerations, we propose a good candidate for describing the distribution of first-passage times in a market: the Weibull distribution with a power-law tail. This distribution bridges the gap between theoretical and empirical results more effectively than a simple Weibull distribution. It should be stressed that a Weibull distribution with a power-law tail is more flexible than the Mittag-Leffler distribution, which itself can be approximated by a Weibull distribution and a power law. Indeed, the key point is that in the former case there is freedom of choice for the exponent of the power law attached to the Weibull distribution, which can exceed 1 in order to reproduce decays faster than possible with a Mittag-Leffler distribution. We also give a useful formula to determine an optimal crossover point minimizing the difference between the empirical average waiting time and the one predicted from renewal theory. Moreover, we discuss the limitation of our distributions by applying them to the analysis of the BTP future and calculating the average waiting
International Nuclear Information System (INIS)
Hu Naihong; Rosso, M.; Zhang Honglian
2006-12-01
We further find the defining structure of the two-parameter quantum affine algebra U_{r,s}(ŝl_n) (n > 2), in the sense of Benkart-Witherspoon [BW1] and following the work of [BGH1], [HS] and [BH], which turns out to be a Drinfeld double. Of more importance for the 'affine' cases, we work out the compatible two-parameter version of the Drinfeld realization as a quantum affinization of U_{r,s}(sl_n) and establish the Drinfeld isomorphism theorem in the two-parameter setting by developing a new combinatorial approach, the quantum 'affine' Lyndon basis with an explicit valid algorithm, based on the Drinfeld realization. (author)
A two-parameter family of exact asymptotically flat solutions to the Einstein-scalar field equations
International Nuclear Information System (INIS)
Nikonov, V V; Tchemarina, Ju V; Tsirulev, A N
2008-01-01
We consider a static spherically symmetric real scalar field, minimally coupled to Einstein gravity. A two-parameter family of exact asymptotically flat solutions is obtained by using the inverse problem method. This family includes non-singular solutions, black holes and naked singularities. For each of these solutions the respective potential is partially negative but positive near spatial infinity. (comments, replies and notes)
Marcoulides, Katerina M.
2018-01-01
This study examined the use of Bayesian analysis methods for the estimation of item parameters in a two-parameter logistic item response theory model. Using simulated data under various design conditions with both informative and non-informative priors, the parameter recovery of the Bayesian analysis methods was examined. Overall results showed that…
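The two-parameter logistic (2PL) model referenced above has a simple closed form; a minimal sketch, with purely illustrative parameter values:

```python
import math

def p_correct(theta, a, b):
    """Two-parameter logistic IRT model: probability of a correct response
    for ability theta, item discrimination a, and item difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# When ability equals difficulty (theta == b), the probability is exactly 0.5,
# regardless of the discrimination parameter a.
print(p_correct(0.0, 1.2, 0.0))
```

Bayesian estimation, as studied in the record, places priors on a and b and samples their posterior given observed response patterns; the function above is only the likelihood kernel of that procedure.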
International Nuclear Information System (INIS)
Baseilhac, P.; Fateev, V.A.
1998-01-01
We calculate the vacuum expectation values of local fields for the two-parameter family of integrable field theories introduced and studied by Fateev (1996). Using this result we propose an explicit expression for the vacuum expectation values of local operators in parafermionic sine-Gordon models and in integrable perturbed SU(2) coset conformal field theories. (orig.)
Directory of Open Access Journals (Sweden)
M.M. Mohie El-Din
2011-10-01
In this paper, two-sample Bayesian prediction intervals for order statistics (OS) are obtained. The prediction is based on a certain class of inverse exponential-type distributions, using a right-censored sample. A general class of prior density functions is used, and the predictive cumulative function is obtained in the two-sample case. The class of inverse exponential-type distributions includes several important distributions such as the inverse Weibull distribution, the inverse Burr distribution, the loglogistic distribution, the inverse Pareto distribution, and the inverse paralogistic distribution. Special cases of the inverse Weibull model, such as the inverse exponential model and the inverse Rayleigh model, are considered.
O modelo q-Weibull em confiabilidade, árvores de falha dinâmicas e implementação de manutenção [The q-Weibull model in reliability, dynamic fault trees, and maintenance implementation]
Assis, Edilson Machado de
2013-01-01
The q-Weibull distribution was applied to reliability analysis. It is a four-parameter generalization of a distribution widely used in reliability, the Weibull distribution, which has three parameters. The Weibull distribution is based on the exponential function of the negative of a power. The q-Weibull distribution uses a generalization of the exponential function, called the q-exponential, which exhibits asymptotic power-law behaviour and recovers…
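The q-exponential at the core of the q-Weibull generalization can be sketched as follows; this is an illustrative form showing the limit behaviour, not necessarily the thesis's exact parameterization.

```python
import math

def q_exp(x, q):
    """Tsallis q-exponential: reduces to exp(x) as q -> 1 and develops a
    power-law tail for q > 1, the ingredient that turns Weibull into q-Weibull."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0

def q_weibull_sf(t, q, beta, eta):
    """Illustrative q-Weibull survival-type function built on the q-exponential
    (shape beta, scale eta); a sketch, not the normalized published form."""
    return q_exp(-((t / eta) ** beta), q)
```

For q = 1 the survival function collapses to the ordinary Weibull exp(-(t/η)^β); for q > 1 the tail decays as a power law, which is what lets the q-Weibull cover heavier-tailed failure data.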
Distribution and Mobility of Wealth of Nations
R. Paap (Richard); H.K. van Dijk (Herman)
2009-01-01
We estimate the empirical bimodal cross-section distribution of real Gross Domestic Product per capita of 120 countries over the period 1960–1989 by a mixture of a Weibull and a truncated normal density. The components of the mixture represent a group of poor and a group of rich
Directory of Open Access Journals (Sweden)
Paulo Eduardo Teodoro
2014-11-01
The design of concrete structures and its mathematical modeling is rather subjective in nature. The purpose of this study was therefore to verify whether the Normal and Weibull distributions can be applied to the compressive strengths of commercially batched ready-mixed concrete. The study was conducted during 2011 in the city of Campo Grande/MS, Brazil. Compressive strength was evaluated in 189 test samples at 28 days, taken from different reinforced concrete constructions in the city. The tests were performed as prescribed by NBR 5739 (ABNT, 2007). To quantify how well the Normal and Weibull distributions fitted the experimental data, three goodness-of-fit tests were used: chi-square, Anderson-Darling, and Kolmogorov-Smirnov. Based on this study, the Weibull distribution can be applied to compressive strength data for concrete. This suggests that, despite the complex processes involved in the compressive failure of a quasi-brittle composite material such as concrete, a statistical strength model is effective. Moreover, when comparing the goodness-of-fit tests, there is a large practical difference between the Normal and Weibull distributions. This information is an important experimental addition to the scientific literature regarding the fracture of "semi-brittle" materials.
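The Normal-versus-Weibull comparison described above can be sketched with SciPy; the synthetic strengths below are a hypothetical stand-in for the 189 test results, which are not reproduced in this record.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical stand-in for 189 compressive-strength results (MPa)
strengths = rng.weibull(12.0, 189) * 30.0

shape, _, scale = stats.weibull_min.fit(strengths, floc=0)
mu, sigma = stats.norm.fit(strengths)

# Kolmogorov-Smirnov p-values for each fitted model (the paper also used
# chi-square and Anderson-Darling tests)
p_weibull = stats.kstest(strengths, 'weibull_min', args=(shape, 0, scale)).pvalue
p_normal = stats.kstest(strengths, 'norm', args=(mu, sigma)).pvalue
print(round(p_weibull, 3), round(p_normal, 3))
```

Note that a high-shape Weibull is close to a normal in the bulk, so the decisive differences typically appear in the lower tail, which is exactly the region that matters for design strength.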
International Nuclear Information System (INIS)
Arcos-Olalla, Rafael; Reyes, Marco A.; Rosu, Haret C.
2012-01-01
We introduce an alternative factorization of the Hamiltonian of the quantum harmonic oscillator which leads to a two-parameter self-adjoint operator from which the standard harmonic oscillator, the one-parameter oscillators introduced by Mielnik, and the Hermite operator are obtained in certain limits of the parameters. In addition, a single Bernoulli-type parameter factorization, which is different from the one introduced by M.A. Reyes, H.C. Rosu, and M.R. Gutiérrez [Phys. Lett. A 375 (2011) 2145], is briefly discussed in the final part of this work. -- Highlights: ► Factorizations with operators which are not mutually adjoint are presented. ► New two-parameter and one-parameter self-adjoint oscillator operators are introduced. ► Their eigenfunctions are two- and one-parameter deformed Hermite functions.
Directory of Open Access Journals (Sweden)
Mehmet KURBAN
2007-01-01
In this paper, the wind energy potential of the region is analyzed with the Weibull and Rayleigh statistical distribution functions, using wind speed data measured every 15 seconds during July, August, September, and October of 2005 at a height of 10 m on the 30-m observation pole of the wind observation station built for the scientific research project "The Construction of a Hybrid (Wind-Solar) Power Plant Model by Determining the Wind and Solar Potential in the Iki Eylul Campus of A.U.", supported by Anadolu University. The maximum likelihood method is used to find the parameters of these distributions. The analysis of the months considered shows that the Weibull distribution models the wind speeds better than the Rayleigh distribution. Furthermore, the error in the monthly power density values computed with the Weibull distribution is smaller than with the Rayleigh distribution.
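The maximum-likelihood comparison between Weibull and Rayleigh fits can be sketched as follows; the synthetic wind speeds stand in for the campus mast data, which are not reproduced here. Because the Rayleigh family is nested in the Weibull family (k = 2), the Weibull maximized log-likelihood can never be lower.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Hypothetical stand-in for 10 m wind-speed samples (m/s)
v = rng.weibull(2.3, 1000) * 6.5

k, _, c = stats.weibull_min.fit(v, floc=0)   # maximum likelihood estimates
_, sigma = stats.rayleigh.fit(v, floc=0)     # Rayleigh = Weibull restricted to k = 2

ll_weibull = np.sum(stats.weibull_min.logpdf(v, k, 0, c))
ll_rayleigh = np.sum(stats.rayleigh.logpdf(v, 0, sigma))
print(round(ll_weibull, 1), round(ll_rayleigh, 1))
```

A likelihood-ratio or information-criterion comparison would then tell whether the extra shape parameter is statistically justified for a given month of data.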
Mixture distributions of wind speed in the UAE
Shin, J.; Ouarda, T.; Lee, T. S.
2013-12-01
Wind speed probability distributions are commonly used to estimate potential wind energy. The two-parameter Weibull distribution has been the most widely used to characterize the distribution of wind speed. However, it cannot properly model wind speed regimes whose distributions present bimodal and kurtotic shapes. Several studies have concluded that the Weibull distribution should not be used for frequency analysis of wind speed without first investigating the wind speed distribution. Because of these mixture characteristics of wind speed data, the application of mixture distributions in the frequency analysis of wind speed deserves further investigation. A number of studies have investigated the potential wind energy in different parts of the Arabian Peninsula, and mixture distributional characteristics of wind speed were detected in some of them. Nevertheless, mixture distributions have not been employed for wind speed modeling in the Arabian Peninsula. To improve our understanding of the wind energy potential of the Arabian Peninsula, mixture distributions should be tested in the frequency analysis of wind speed. The aim of the current study is to assess the suitability of mixture distributions for the frequency analysis of wind speed in the UAE. Hourly mean wind speed data at 10-m height from seven stations were used. The Weibull and Kappa distributions were employed as representatives of conventional non-mixture distributions. Ten mixture distributions were constructed by mixing four probability distributions: Normal, Gamma, Weibull, and extreme value type-one (EV-1). Three parameter estimation methods, the expectation-maximization algorithm, the least squares method, and the meta-heuristic maximum likelihood (MHML) method, were employed to estimate the parameters of the mixture distributions. In order to compare the goodness-of-fit of the tested distributions and parameter estimation methods for
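A two-component Weibull mixture of the kind discussed above can be fitted by direct likelihood maximization; the bimodal synthetic sample and starting values below are illustrative (the study's stations, and its EM and MHML estimators, are not reproduced).

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(3)
# Hypothetical bimodal sample: a calm regime mixed with a stronger breeze regime
v = np.concatenate([rng.weibull(2.0, 600) * 3.0, rng.weibull(3.0, 400) * 9.0])

def neg_log_lik(params):
    """Negative log-likelihood of a two-component Weibull mixture."""
    w, k1, c1, k2, c2 = params
    pdf = (w * stats.weibull_min.pdf(v, k1, scale=c1)
           + (1 - w) * stats.weibull_min.pdf(v, k2, scale=c2))
    return -np.sum(np.log(pdf + 1e-300))  # guard against log(0)

res = optimize.minimize(
    neg_log_lik, x0=[0.5, 2.0, 2.0, 2.0, 8.0],
    bounds=[(0.01, 0.99), (0.5, 10), (0.1, 30), (0.5, 10), (0.1, 30)],
    method='L-BFGS-B')
w, k1, c1, k2, c2 = res.x
print(res.success, np.round(res.x, 2))
```

Initializing each component near one of the visible modes avoids label-swapping and poor local optima, which is the practical difficulty that motivates EM-style and meta-heuristic estimators.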
Probability distribution of machining center failures
International Nuclear Information System (INIS)
Jia Yazhou; Wang Molin; Jia Zhixin
1995-01-01
Through field tracing research on 24 Chinese cutter-changeable CNC machine tools (machining centers) over a period of one year, a database of machining center operation and maintenance was built. The failure data were fitted to the Weibull distribution and the exponential distribution, the goodness of the fits was tested, and the failure distribution pattern of machining centers was found. Finally, reliability characterizations for machining centers are proposed.
Storage Capacity Calculation of Wind Power Based on Weibull Model
Institute of Scientific and Technical Information of China (English)
王树超
2013-01-01
Wind speed and wind turbine output models are analyzed. A Weibull function is used to build the wind speed distribution model, and the expectation concept from probability theory is used to calculate the power capacity of the energy storage system. Simulation experiments show that the resulting ratio of wind power to storage capacity is reasonable and meets power system requirements. Under the condition of satisfying China's wind power grid-connection standard, the energy storage scale should be minimized; the results are analyzed and verified with actual wind farm data.
Wang, Yue; Wang, Ping; Liu, Xiaoxia; Cao, Tian
2018-03-01
The performance of a decode-and-forward dual-hop mixed radio-frequency/free-space optical (RF/FSO) system in an urban area is studied. The RF link is modeled by the Nakagami-m distribution, and the FSO link is described by composite exponentiated Weibull (EW) fading channels with nonzero boresight pointing errors (NBPE). For comparison, the average bit error rate (ABER) results without pointing errors (PE) and those with zero boresight pointing errors (ZBPE) are also provided. The closed-form expression for the ABER in the RF link is derived with the help of the hypergeometric function, and that in the FSO link is obtained by Meijer's G and generalized Gauss-Laguerre quadrature functions. Then, the end-to-end ABERs with binary phase shift keying modulation are obtained from the computed ABER results of the RF and FSO links. The end-to-end ABER performance is further analyzed for different Nakagami-m parameters, turbulence strengths, receiver aperture sizes, and boresight displacements. The results show that with ZBPE and NBPE considered, the FSO link suffers severe ABER degradation and becomes the dominant limitation of the mixed RF/FSO system in an urban area. However, aperture averaging can bring significant ABER improvement to this system. Monte Carlo simulation is provided to confirm the validity of the analytical ABER expressions.
Perfect-fluid models admitting a non-Abelian and maximal two-parameter group of isometries
International Nuclear Information System (INIS)
Van den Bergh, N.
1988-01-01
A proof is given that, when a spacetime admits an invariant timelike congruence orthogonal to the orbits of a non-Abelian two-parameter group of isometries, the given congruence is vorticity-free provided the group is maximal. The result is used to derive a canonical coordinate form for perfect-fluid solutions satisfying the above condition. It is also shown that such a group of isometries cannot be orthogonally transitive and a brief discussion is given of the self-similar case. (author)
International Nuclear Information System (INIS)
Chi, Se-Hwan
2015-01-01
Changes in flexural strength and Weibull modulus due to specimen size were investigated for three nuclear graphite grades, IG-110, NBG-18, and PCEA, using four-point, 1/3-point (4-1/3) loading with specimens of three different sizes: 3.18 (thickness) × 6.35 (width) × 50.8 (length), 6.50 (T) × 12.0 (W) × 52.0 (L), and 18.0 (T) × 16.0 (W) × 64 (L) mm (total: 210 specimens). Results showed that specimen size effects were grade dependent: while NBG-18 showed rather significant specimen size effects (37% difference between the 3 T and 18 T specimens), the differences in IG-110 and PCEA were 7.6–15%. The maximum differences in flexural strength due to specimen size were larger in PCEA and NBG-18, which have larger coke particles (medium grain size: >300 μm), than in IG-110 with its super-fine coke particle size (25 μm). The Weibull modulus showed a data population dependency, in that it decreased with an increasing number of data points used for modulus determination. A good correlation between fracture surface roughness and flexural strength was confirmed.
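The Weibull modulus reported above is conventionally estimated by ranking the strengths and regressing the double-log transform of the failure probability; a sketch with a synthetic sample (the graphite data themselves are not reproduced here):

```python
import numpy as np

def weibull_modulus(strengths):
    """Estimate the Weibull modulus m and characteristic strength sigma0 from
    a strength sample via median-rank probabilities and a least-squares fit of
    ln(-ln(1 - F)) against ln(sigma)."""
    s = np.sort(np.asarray(strengths, dtype=float))
    n = len(s)
    f = (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # Bernard's median-rank estimator
    y = np.log(-np.log(1.0 - f))
    x = np.log(s)
    m, intercept = np.polyfit(x, y, 1)
    sigma0 = np.exp(-intercept / m)               # F(sigma0) = 1 - 1/e
    return m, sigma0

rng = np.random.default_rng(5)
sample = 30.0 * rng.weibull(10.0, 90)  # hypothetical stand-in for 90 strengths (MPa)
m_est, s0 = weibull_modulus(sample)
print(round(m_est, 1), round(s0, 1))
```

The record's observation that the modulus estimate depends on the data population size is visible in this estimator: its sampling variance shrinks only slowly with n, so small specimen sets give noisy moduli.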
Assessing a Tornado Climatology from Global Tornado Intensity Distributions
Feuerstein, B.; Dotzek, N.; Grieser, J.
2005-01-01
Recent work demonstrated that the shape of tornado intensity distributions from various regions worldwide is well described by Weibull functions. This statistical modeling revealed a strong correlation between the fit parameters c for shape and b for scale regardless of the data source. In the present work it is shown that the quality of the Weibull fits is optimized if only tornado reports of F1 and higher intensity are used and that the c–b correlation does indeed reflect a universal featur...
Dikko et al., A New Generalized-Exponential-Weibull ...
African Journals Online (AJOL)
The survival and hazard functions and the order statistics of the distribution were estimated using … the mathematical properties of the … Weibull distribution … Applying binomial expansion and further simplification, equation (15) becomes …
International Nuclear Information System (INIS)
Wang, B.; Bergstrom, D.J.
2002-01-01
The dynamic two-parameter mixed model (DTPMM) has recently been introduced in large eddy simulation (LES). However, current approaches in the literature are mathematically inconsistent. In this paper, the DTPMM is optimized using the functional variational method. The mathematical inconsistency is removed, and a governing system of two integral equations for the model coefficients of the DTPMM, along with some significant features, is obtained. Coherent structures relating to the vortex motion of large vortices are investigated using the vortex λ₂-definition of Jeong and Hussain (1995). The numerical results agree with the classical wall law of von Karman (1939) and the experimental correlation of Aydin and Leutheusser (1991). (author)
John R. Jones
1985-01-01
Quaking aspen is the most widely distributed native North American tree species (Little 1971, Sargent 1890). It grows in a great diversity of regions, environments, and communities (Harshberger 1911). Only one deciduous tree species in the world, the closely related Eurasian aspen (Populus tremula), has a wider range (Weigle and Frothingham 1911)....
Statistical distributions as applied to environmental surveillance data
International Nuclear Information System (INIS)
Speer, D.R.; Waite, D.A.
1975-09-01
Application of normal, log normal, and Weibull distributions to environmental surveillance data was investigated for approximately 300 nuclide-medium-year-location combinations. Corresponding W test calculations were made to determine the probability of a particular data set falling within the distribution of interest. Conclusions are drawn as to the fit of any data group to the various distributions. The significance of fitting statistical distributions to the data is discussed
Seuba, Jordi; Deville, Sylvain; Guizard, Christian; Stevenson, Adam J.
2016-01-01
Macroporous ceramics exhibit an intrinsic strength variability caused by the random distribution of defects in their structure. However, the precise role of microstructural features, other than pore volume, on reliability is still unknown. Here, we analyze the applicability of Weibull analysis to unidirectional macroporous yttria-stabilized zirconia (YSZ) prepared by ice-templating. First, we performed crush tests on samples with controlled microstructural features, with the loading direction parallel to the porosity. The compressive strength data were fitted using two different fitting techniques, ordinary least squares and Bayesian Markov chain Monte Carlo, to evaluate whether Weibull statistics are an adequate descriptor of the strength distribution. The statistical descriptors indicated that the strength data are well described by the Weibull statistical approach, for both fitting methods used. Furthermore, we assess the effect of different microstructural features (volume, size, densification of the walls, and morphology) on Weibull modulus and strength. We found that the key microstructural parameter controlling reliability is wall thickness. In contrast, pore volume is the main parameter controlling the strength. The highest Weibull modulus and mean strength (198.2 MPa) were obtained for the samples with the smallest and narrowest wall thickness distribution (3.1 μm) and lowest pore volume (54.5%).
Directory of Open Access Journals (Sweden)
Carlos García Mogollón
2010-07-01
Guava (Psidium guajava L.) is a tropical fruit susceptible to undesirable alterations that affect its shelf life, caused by inadequate storage and packaging conditions. In this work, the shelf life of fresh guava was estimated using the Weibull probabilistic model, and fruit quality was evaluated during storage under different temperature and packaging conditions. The postharvest evaluation was carried out over 15 days with guavas of the red regional variety. A completely randomized design with a factorial arrangement of three factors was used: storage time with six levels (0, 3, 6, 9, 12, and 15 days); storage temperature with two levels, ambient (37 °C, relative humidity between 85 and 90%) and refrigerated (9 ± 2 °C, RH of 85-90%); and two types of packaging, a polystyrene tray with PVC plastic film and aluminum foil. A three-point structured satisfaction scale was used for sensory evaluation during the storage period. The Weibull model proved adequate for predicting the shelf life of fresh guava, based on the fitting criteria and the acceptance and failure confidence limits. During the storage period, storage time, temperature, and packaging type were observed to have statistically significant effects (P
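The Weibull shelf-life model used above relates storage time to the probability that a fruit has become unacceptable; a minimal sketch, with hypothetical shape and scale values rather than the paper's fitted parameters:

```python
import math

def failure_probability(t, beta, eta):
    """Weibull model for sensory failure: probability that a fruit has become
    unacceptable by storage day t (beta = shape, eta = scale in days)."""
    return 1.0 - math.exp(-((t / eta) ** beta))

def shelf_life(p_fail, beta, eta):
    """Storage time at which the failure probability reaches p_fail
    (inverse of the Weibull CDF)."""
    return eta * (-math.log(1.0 - p_fail)) ** (1.0 / beta)

beta, eta = 3.0, 10.0  # hypothetical shape/scale, not the study's fit
t50 = shelf_life(0.5, beta, eta)  # median shelf life in days
print(round(t50, 2))
```

In practice β and η would be fitted separately for each temperature/packaging combination, and the shelf life read off at whatever acceptance limit the sensory panel defines.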
Distributions of Autocorrelated First-Order Kinetic Outcomes: Illness Severity.
Directory of Open Access Journals (Sweden)
James D Englehardt
Full Text Available Many complex systems produce outcomes having recurring, power law-like distributions over wide ranges. However, the form necessarily breaks down at extremes, whereas the Weibull distribution has been demonstrated over the full observed range. Here the Weibull distribution is derived as the asymptotic distribution of generalized first-order kinetic processes, with convergence driven by autocorrelation, and entropy maximization subject to finite positive mean, of the incremental compounding rates. Process increments represent multiplicative causes. In particular, illness severities are modeled as such, occurring in proportion to products of, e.g., chronic toxicant fractions passed by organs along a pathway, or rates of interacting oncogenic mutations. The Weibull form is also argued theoretically and by simulation to be robust to the onset of saturation kinetics. The Weibull exponential parameter is shown to indicate the number and widths of the first-order compounding increments, the extent of rate autocorrelation, and the degree to which process increments are distributed exponential. In contrast with the Gaussian result in linear independent systems, the form is driven not by independence and multiplicity of process increments, but by increment autocorrelation and entropy. In some physical systems the form may be attracting, due to multiplicative evolution of outcome magnitudes towards extreme values potentially much larger and smaller than control mechanisms can contain. The Weibull distribution is demonstrated in preference to the lognormal and Pareto I for illness severities versus (a toxicokinetic models, (b biologically-based network models, (c scholastic and psychological test score data for children with prenatal mercury exposure, and (d time-to-tumor data of the ED01 study.
Directory of Open Access Journals (Sweden)
Lalit Mohan Pradhan
2014-03-01
Full Text Available Background: In the present competitive business scenario researchers have developed various inventory models for deteriorating items considering various practical situations for better inventory control. Permissible delay in payments with various demands and deteriorations is considerably a new concept introduced in developing various inventory models. These models are very useful for both the consumers and the manufacturer. Methods: In the present work an inventory model has been developed for a three parameter Weibull deteriorating item with ramp type demand and salvage value under trade credit system. Here we have considered a single item for developing the model. Results and conclusion: Optimal order quantity, optimal cycle time and total variable cost during a cycle have been derived for the proposed inventory model. The results obtained in this paper have been illustrated with the help of numerical examples and sensitivity analysis.
Directory of Open Access Journals (Sweden)
A. Taravat
2013-09-01
Full Text Available As a major aspect of marine pollution, oil release into the sea has serious biological and environmental impacts. Among remote sensing systems (which offer a non-destructive investigation method), synthetic aperture radar (SAR) can provide valuable synoptic information about the position and size of an oil spill due to its wide area coverage and day/night, all-weather capabilities. In this paper we present a new automated method for oil-spill monitoring. The new approach is based on the combination of the Weibull Multiplicative Model and machine learning techniques to differentiate between dark spots and the background. First, a filter based on the Weibull Multiplicative Model is applied to each sub-image. Second, the sub-image is segmented by two different neural network techniques (Pulse-Coupled Neural Networks and Multilayer Perceptron Neural Networks). As the last step, a very simple filtering process is used to eliminate false targets. The proposed approaches were tested on 20 ENVISAT and ERS2 images which contained dark spots. The same parameters were used in all tests. For the overall dataset, average accuracies of 94.05% and 95.20% were obtained for the PCNN and MLP methods, respectively. The average computational time for dark-spot detection in a 256 × 256 image is about 4 s for PCNN segmentation using IDL software, which is the fastest in this field at present. Our experimental results demonstrate that the proposed approach is very fast, robust and effective, and can be applied to future spaceborne SAR images.
Taravat, A.; Del Frate, F.
2013-09-01
Directory of Open Access Journals (Sweden)
M.J. Uddin
2016-09-01
Full Text Available The two-dimensional unsteady laminar free convective heat and mass transfer flow of a non-Newtonian fluid adjacent to a vertical plate has been analyzed numerically. The two-parameter Lie group transformation method, which reduces the three independent variables to a single variable, is used to transform the continuity, momentum, energy and concentration equations into a set of coupled similarity equations. The transformed equations have been solved by the Runge–Kutta–Fehlberg fourth-fifth order numerical method with a shooting technique. Numerical calculations were carried out for the various parameters entering the problem. The dimensionless velocity, temperature and concentration profiles are shown graphically, and the skin friction and heat and mass transfer rates are given in tables. It is found that the friction factor and heat transfer (mass transfer) rate for methanol are higher (lower) than those of hydrogen and water vapor. The friction factor decreases while the heat and mass transfer rates increase as the Prandtl number increases. The friction factor (heat and mass transfer rate) of a Newtonian fluid is higher (lower) than that of a dilatant fluid.
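The solution strategy described in this record, reducing the governing equations to similarity form and integrating them with a Runge–Kutta scheme plus shooting, can be sketched on a classical stand-in problem. The snippet below solves the Blasius similarity equation (not the paper's coupled system, whose equations are not given in the record), with SciPy's RK45 integrator standing in for the Runge–Kutta–Fehlberg pair and the far-field condition imposed at a finite truncation of eta.

```python
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

# Blasius similarity equation f''' + 0.5*f*f'' = 0, written as a
# first-order system in (f, f', f''); a stand-in for the paper's
# coupled similarity equations.
def rhs(eta, y):
    f, fp, fpp = y
    return [fp, fpp, -0.5 * f * fpp]

ETA_INF = 10.0  # numerical stand-in for eta -> infinity

def terminal_slope(fpp0):
    """Integrate from eta=0 with guessed f''(0) and return the
    mismatch in the far-field condition f'(inf) = 1."""
    sol = solve_ivp(rhs, (0.0, ETA_INF), [0.0, 0.0, fpp0],
                    method="RK45", rtol=1e-8, atol=1e-10)
    return sol.y[1, -1] - 1.0

# Shooting: root-find the unknown initial curvature f''(0).
fpp0 = brentq(terminal_slope, 0.1, 1.0)
```

For the Blasius problem the converged wall value f''(0) is known to be about 0.332, which provides a convenient check of the shooting loop.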
Inference for exponentiated general class of distributions based on record values
Directory of Open Access Journals (Sweden)
Samah N. Sindi
2017-09-01
Full Text Available The main objective of this paper is to suggest and study a new exponentiated general class (EGC) of distributions. Maximum likelihood, Bayesian and empirical Bayesian estimators of the parameter of the EGC of distributions based on lower record values are obtained. Furthermore, Bayesian prediction of future records is considered. Based on lower record values, the exponentiated Weibull distribution, its special cases, and the exponentiated Gompertz distribution are applied to the EGC of distributions.
Homogeneity and scale testing of generalized gamma distribution
International Nuclear Information System (INIS)
Stehlik, Milan
2008-01-01
The aim of this paper is to derive the exact distributions of the likelihood ratio tests of the homogeneity and scale hypotheses when the observations are generalized gamma distributed. The special cases of exponential, Rayleigh, Weibull or gamma distributed observations are discussed in detail. A photoemulsion experiment analysis and a scale test with missing time-to-failure observations are presented to illustrate the applications of the methods discussed
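The likelihood ratio testing discussed here relies on nesting: the exponential is the shape-one special case of the Weibull, which is itself a special case of the generalized gamma. A minimal sketch on synthetic data, using the asymptotic chi-square reference rather than the exact small-sample distributions derived in the paper:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic lifetimes; a real analysis would use observed data.
data = stats.weibull_min.rvs(1.8, scale=2.0, size=500, random_state=rng)

# Full model: two-parameter Weibull (location fixed at 0).
c_hat, _, scale_hat = stats.weibull_min.fit(data, floc=0)
ll_weibull = stats.weibull_min.logpdf(data, c_hat, scale=scale_hat).sum()

# Restricted model: exponential = Weibull with shape c = 1;
# the exponential MLE of the scale is the sample mean.
ll_expon = stats.expon.logpdf(data, scale=data.mean()).sum()

# Likelihood ratio statistic; asymptotically chi-square, 1 df.
lr = 2.0 * (ll_weibull - ll_expon)
p_value = stats.chi2.sf(lr, df=1)
```

Because the exponential model is nested in the Weibull family, the statistic is non-negative by construction; a small p-value rejects the exponential restriction.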
Directory of Open Access Journals (Sweden)
Alexander K. Volkov
2017-01-01
Full Text Available The modern approaches to assessing aviation security screeners' efficiency have been analyzed, and certain drawbacks have been considered. The main drawback is the complexity of implementing the ICAO recommendations concerning the accounting of shadow x-ray image complexity factors during the preparation and evaluation of prohibited item detection efficiency by aviation security screeners. X-ray image based factors are the specific properties of the x-ray image that influence the ability of aviation security screeners to detect prohibited items. The most important complexity factors are: geometric characteristics of a prohibited item; view difficulty of prohibited items; superposition of prohibited items by other objects in the bag; bag content complexity; and the color similarity of prohibited and usual items in the luggage. The one-dimensional two-parameter IRT model and the related criterion of aviation security screeners' qualification have been suggested. Within the suggested model, the probabilistic detection characteristics of aviation security screeners are considered as functions of such parameters as the difference between the level of qualification and the level of x-ray image complexity, and also between the aviation security screeners' responsibility and the structure of their professional knowledge. On the basis of the given model it is possible to consider two characteristic functions: first, the characteristic function of qualification level, which describes the multi-complexity-level x-ray image interpretation competency of the aviation security screener; second, the characteristic function of x-ray image complexity, which describes the range of x-ray image interpretation competency of aviation security screeners having various training levels to interpret an x-ray image of a certain level of complexity. The suggested complex criterion to assess the level of aviation security screener qualification allows the evaluation of his or
Scaling in the distribution of intertrade durations of Chinese stocks
Jiang, Zhi-Qiang; Chen, Wei; Zhou, Wei-Xing
2008-10-01
The distribution of intertrade durations, defined as the waiting times between two consecutive transactions, is investigated based upon the limit order book data of 23 liquid Chinese stocks listed on the Shenzhen Stock Exchange in the whole year 2003. A scaling pattern is observed in the distributions of intertrade durations, where the empirical density functions of the normalized intertrade durations of all 23 stocks collapse onto a single curve. The scaling pattern is also observed in the intertrade duration distributions for filled and partially filled trades and in the conditional distributions. The ensemble distributions for all stocks are modeled by the Weibull and the Tsallis q-exponential distributions. Maximum likelihood estimation shows that the Weibull distribution outperforms the q-exponential for not-too-large intertrade durations which account for more than 98.5% of the data. Alternatively, nonlinear least-squares estimation selects the q-exponential as a better model, in which the optimization is conducted on the distance between empirical and theoretical values of the logarithmic probability densities. The distribution of intertrade durations is Weibull followed by a power-law tail with an asymptotic tail exponent close to 3.
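The maximum likelihood comparison described above can be sketched as follows. This is a hedged illustration on synthetic unit-mean waiting times, not the Shenzhen order-book data; the Tsallis q-exponential density f(t) = (2-q)λ[1+(q-1)λt]^(-1/(q-1)), valid for 1 < q < 2, is fitted by direct numerical likelihood maximization.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(42)
# Stand-in for normalized intertrade durations (real data would come
# from the limit order book): unit-mean Weibull-like waiting times.
tau = stats.weibull_min.rvs(0.8, scale=1.0, size=2000, random_state=rng)
tau /= tau.mean()  # normalize, as in the ensemble analysis

# Weibull MLE (location fixed at zero).
c, _, s = stats.weibull_min.fit(tau, floc=0)
ll_weibull = stats.weibull_min.logpdf(tau, c, scale=s).mean()

# q-exponential negative log-likelihood, maximized numerically.
def neg_ll(params):
    q, lam = params
    if not (1.0 < q < 2.0 and lam > 0.0):
        return np.inf  # keep the search inside the valid region
    return -np.sum(np.log(2.0 - q) + np.log(lam)
                   - np.log1p((q - 1.0) * lam * tau) / (q - 1.0))

res = optimize.minimize(neg_ll, x0=[1.3, 1.0], method="Nelder-Mead")
q_hat, lam_hat = res.x
ll_qexp = -res.fun / tau.size

# Model comparison by mean log-likelihood per observation.
weibull_wins = ll_weibull > ll_qexp
```

On data actually generated from a Weibull, the Weibull fit should attain the higher mean log-likelihood, mirroring the paper's finding for the bulk of the durations.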
Rafal Podlaski; Francis A. Roesch
2014-01-01
Two-component mixtures of either the Weibull distribution or the gamma distribution and the kernel density estimator were used for describing the diameter at breast height (dbh) empirical distributions of two-cohort stands. The data consisted of study plots from the Świętokrzyski National Park (central Poland) and areas close to and including the North Carolina section...
Stress-strength reliability for general bivariate distributions
Directory of Open Access Journals (Sweden)
Alaa H. Abdel-Hamid
2016-10-01
Full Text Available An expression for the stress-strength reliability R = P(X1 < X2) is derived for general bivariate distributions
Directory of Open Access Journals (Sweden)
Jesús Alexander Sánchez-González
2017-01-01
Full Text Available The objective of this work was to evaluate the ability of artificial neural networks (ANNs) to predict shelf-life and acidity in vacuum-packed fresh cheese. First, 200 g cheese samples were prepared. These samples were then stored, with sampling every 2 to 4 days, at temperatures of 4, 10 and 16 °C and a relative humidity of 67.5%. Throughout storage, acidity (AC) and sensory acceptability were determined. Acceptability was used to determine shelf-life (TVU) by the modified Weibull sensory hazard method. A set of artificial neural networks (ANNs) was created and trained, with temperature (T), ripening time (M) and failure possibility (F(x)) as inputs, and TVU and AC as outputs. From this set, the networks with the lowest mean squared error (MSE) and the best fit (R2) were selected. These networks showed correlation coefficients (R2) of 0.9996 and 0.6897 for TVU and AC, respectively, and good accuracy compared with regression models. It is shown that ANNs can be used to model TVU adequately, and AC to a lesser degree, for vacuum-packed fresh cheeses.
Energy Technology Data Exchange (ETDEWEB)
Neilson, Henry J., E-mail: hjn2@case.edu [Case Western Reserve University, 10900 Euclid Ave, Cleveland, OH (United States); Petersen, Alex S.; Cheung, Andrew M.; Poon, S. Joseph; Shiflet, Gary J. [University of Virginia, 395 McCormick Road, P.O. Box 400745, Charlottesville, VA 22904 (United States); Widom, Mike [Carnegie Mellon University, 5000 Forbes Avenue, Wean Hall 3325, Pittsburgh, PA 15213 (United States); Lewandowski, John J. [Case Western Reserve University, 10900 Euclid Ave, Cleveland, OH (United States)
2015-05-14
In this study, the variations in mechanical properties of Ni−Co−Ta-based metallic glasses have been analyzed. Three different chemistries of metallic glass ribbons were analyzed: Ni45Ta35Co20, Ni40Ta35Co20Nb5, and Ni30Ta35Co30Nb5. These alloys possess very high density (approximately 12.5 g/cm³) and very high strength (e.g. >3 GPa). Differential scanning calorimetry (DSC) and x-ray diffraction (XRD) were used to characterize the amorphicity of the ribbons. Mechanical properties were measured via a combination of Vickers hardness, bending strength, and tensile strength for each chemistry. At least 50 tests were conducted for each chemistry and each test technique in order to quantify the variability of properties using both 2- and 3-parameter Weibull statistics. The variability in properties and their source(s) were compared to that of other engineering materials, while the nature of deformation via shear bands as well as fracture surface features have been determined using scanning electron microscopy (SEM). Toughness, the role of defects, and volume effects are also discussed.
ASEP of MIMO System with MMSE-OSIC Detection over Weibull-Gamma Fading Channel Subject to AWGGN
Directory of Open Access Journals (Sweden)
Keerti Tiwari
2016-01-01
Full Text Available Ordered successive interference cancellation (OSIC is adopted with minimum mean square error (MMSE detection to enhance the multiple-input multiple-output (MIMO system performance. The optimum detection technique improves the error rate performance but increases system complexity. Therefore, MMSE-OSIC detection is used which reduces error rate compared to traditional MMSE with low complexity. The system performance is analyzed in composite fading environment that includes multipath and shadowing effects known as Weibull-Gamma (WG fading. Along with the composite fading, a generalized noise that is additive white generalized Gaussian noise (AWGGN is considered to show the impact of wireless scenario. This noise model includes various forms of noise as special cases such as impulsive, Gamma, Laplacian, Gaussian, and uniform. Consequently, generalized Q-function is used to model noise. The average symbol error probability (ASEP of MIMO system is computed for 16-quadrature amplitude modulation (16-QAM using MMSE-OSIC detection in WG fading perturbed by AWGGN. Analytical expressions are given in terms of Fox-H function (FHF. These expressions demonstrate the best fit to simulation results.
Abaidoo-Ayin, Harold K; Boakye, Prince G; Jones, Kerby C; Wyatt, Victor T; Besong, Samuel A; Lumor, Stephen E
2017-08-01
This study investigated the compositional characteristics and shelf-life of Njangsa seed oil (NSO). Oil from Njangsa had a high polyunsaturated fatty acid (PUFA) content of which alpha eleostearic acid (α-ESA), an unusual conjugated linoleic acid was the most prevalent (about 52%). Linoleic acid was also present in appreciable amounts (approximately 34%). Our investigations also indicated that the acid-catalyzed transesterification of NSO resulted in lower yields of α-ESA methyl esters, due to isomerization, a phenomenon which was not observed under basic conditions. The triacylglycerol (TAG) profile analysis showed the presence of at least 1 α-ESA fatty acid chain in more than 95% of the oil's TAGs. Shelf-life was determined by the Weibull Hazard Sensory Method, where the end of shelf-life was defined as the time at which 50% of panelists found the flavor of NSO to be unacceptable. This was determined as 21 wk. Our findings therefore support the potential commercial viability of NSO as an important source of physiologically beneficial PUFAs. © 2017 Institute of Food Technologists®.
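Under a Weibull model for time-to-rejection, the time at which a given fraction of panelists find the product unacceptable follows directly from the fitted shape and scale parameters. A small sketch with hypothetical parameters (the record does not report the fitted values behind the 21-wk estimate):

```python
import math

def weibull_rejection_time(shape, scale, fraction=0.5):
    """Time at which `fraction` of panelists are expected to reject,
    for a Weibull time-to-rejection model F(t) = 1 - exp(-(t/scale)^shape).
    Inverting F gives t = scale * (-ln(1 - fraction))**(1/shape)."""
    return scale * (-math.log(1.0 - fraction)) ** (1.0 / shape)

# Hypothetical parameters, NOT fitted to the NSO data in the study
# (scale in weeks).
shape, scale = 2.5, 24.0
t50 = weibull_rejection_time(shape, scale)        # 50%-rejection shelf-life
t10 = weibull_rejection_time(shape, scale, 0.10)  # early-failure time
```

With these assumed parameters the 50%-rejection time works out to roughly 20.7 weeks; the shelf-life definition used in the study (50% of panelists finding the flavor unacceptable) corresponds to `fraction=0.5`.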
Recurrent frequency-size distribution of characteristic events
Directory of Open Access Journals (Sweden)
S. G. Abaimov
2009-04-01
Full Text Available Statistical frequency-size (frequency-magnitude) properties of earthquake occurrence play an important role in seismic hazard assessments. The behavior of earthquakes is represented by two different statistics: interoccurrent behavior in a region and recurrent behavior at a given point on a fault (or at a given fault). The interoccurrent frequency-size behavior has been investigated by many authors and generally obeys the power-law Gutenberg-Richter distribution to a good approximation. It is expected that the recurrent frequency-size behavior should obey different statistics. However, this problem has received little attention because historic earthquake sequences do not contain enough events to reconstruct the necessary statistics. To overcome this lack of data, this paper investigates the recurrent frequency-size behavior for several problems. First, the sequences of creep events on a creeping section of the San Andreas fault are investigated. The applicability of the Brownian passage-time, lognormal, and Weibull distributions to the recurrent frequency-size statistics of slip events is tested and the Weibull distribution is found to be the best-fit distribution. To verify this result the behaviors of numerical slider-block and sand-pile models are investigated and the Weibull distribution is confirmed as the applicable distribution for these models as well. Exponents β of the best-fit Weibull distributions for the observed creep event sequences and for the slider-block model are found to have similar values, ranging from 1.6 to 2.2, with the corresponding aperiodicities CV of the applied distribution ranging from 0.47 to 0.64. We also note similarities between recurrent time-interval statistics and recurrent frequency-size statistics.
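The aperiodicity CV quoted above is a function of the Weibull exponent β alone, since the coefficient of variation of a Weibull distribution does not depend on its scale. A short check of the quoted ranges:

```python
from math import gamma, sqrt

def weibull_cv(beta):
    """Coefficient of variation (aperiodicity) of a Weibull distribution
    with shape parameter beta. Using the raw moments
    E[X^r] = scale^r * Gamma(1 + r/beta), the scale cancels."""
    m1 = gamma(1.0 + 1.0 / beta)
    m2 = gamma(1.0 + 2.0 / beta)
    return sqrt(m2 - m1**2) / m1

# The quoted exponent range beta = 1.6 ... 2.2 should reproduce
# aperiodicities of about 0.64 down to about 0.48.
cv_low_beta = weibull_cv(1.6)   # ~0.64
cv_high_beta = weibull_cv(2.2)  # ~0.48
```

As a sanity check, β = 1 reduces the Weibull to the exponential distribution, whose coefficient of variation is exactly 1.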
Directory of Open Access Journals (Sweden)
Taliat Ola Yusuf
2014-01-01
Full Text Available This paper investigates the influence of blending metakaolin with silica-rich palm oil fuel ash (POFA) on the strength distribution of geopolymer mortar. The broadness of the strength distribution of quasi-brittle to brittle materials depends strongly on the existence of flaws such as voids, microcracks, and impurities in the material. Blending of materials containing alumina and silica with the objective of improving the performance of geopolymer makes comprehensive characterization necessary. The Weibull distribution is used to study the strength distribution and the reliability of geopolymer mortar specimens prepared from 100% metakaolin and from 50% and 70% POFA replacement, cured under ambient conditions. Mortar prisms and cubes were used to test the materials in flexure and compression, respectively, at 28 days, and the results were analyzed using the Weibull distribution. In flexure, the Weibull modulus increased with POFA replacement, indicating reduced broadness of the strength distribution from an increased homogeneity of the material. The modulus, however, decreased with increasing POFA replacement in the specimens tested under compression. It is concluded that the Weibull distribution is suitable for analyses of the blended geopolymer system. While the porous microstructure is mainly responsible for flexural failure, the heterogeneity of reaction relics is responsible for the compression failure.
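The Weibull modulus reported in strength studies of this kind is commonly estimated by linearizing the Weibull CDF and regressing ln(-ln(1-F)) on ln(strength) with median-rank plotting positions. A hedged sketch on synthetic strengths (not the geopolymer data):

```python
import numpy as np

def weibull_modulus(strengths):
    """Estimate the Weibull modulus m and characteristic strength s0
    by least squares on the linearized CDF
        ln(-ln(1 - F)) = m*ln(s) - m*ln(s0),
    using median-rank plotting positions F_i = (i - 0.3)/(n + 0.4)."""
    s = np.sort(np.asarray(strengths, dtype=float))
    n = s.size
    f = (np.arange(1, n + 1) - 0.3) / (n + 0.4)
    x = np.log(s)
    y = np.log(-np.log(1.0 - f))
    m, intercept = np.polyfit(x, y, 1)
    s0 = np.exp(-intercept / m)
    return m, s0

# Hypothetical flexural strengths in MPa, not the paper's data:
# Weibull-distributed with true modulus 8 and scale 60.
rng = np.random.default_rng(1)
sample = 60.0 * rng.weibull(8.0, size=100)
m_hat, s0_hat = weibull_modulus(sample)
```

A higher estimated modulus corresponds to a narrower strength distribution, which is the quantity the blended-geopolymer comparison above turns on.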
Stability of the laws for the distribution of the cumulative failures in railway transport
Kirill VOYNOV
2008-01-01
There are many different laws of distribution, for example the bell-shaped (Gaussian), lognormal, Weibull, exponential, uniform, Poisson and Student distributions, which help to describe the real picture of failures of elements in various mechanical systems, including locomotives and carriages. To diminish the possibility of a gross error in the output of mathematical data treatment, a new method is demonstrated in this article. The task is solved bo...
DEFF Research Database (Denmark)
Feng, Ju; Shen, Wen Zhong
2015-01-01
Reliable wind modelling is of crucial importance for wind farm development. The common practice of using sector-wise Weibull distributions has been found inappropriate for wind farm layout optimization. In this study, we propose a simple and easily implementable method to construct joint distribu...
A mixture of exponentials distribution for a simple and precise assessment of the volcanic hazard
Directory of Open Access Journals (Sweden)
A. T. Mendoza-Rosas
2009-03-01
Full Text Available The assessment of volcanic hazard is the first step for disaster mitigation. The distribution of repose periods between eruptions provides important information about the probability of new eruptions occurring within given time intervals. The quality of the probability estimate, i.e., of the hazard assessment, depends on the capacity of the chosen statistical model to describe the actual distribution of the repose times. In this work, we use a mixture of exponentials distribution, namely a sum of exponential distributions characterized by the different eruption occurrence rates that may be recognized by inspecting the cumulative number of eruptions with time in specific VEI (Volcanic Explosivity Index) categories. The most striking property of an exponential mixture density is that the shape of the density function is flexible, in a way similar to the frequently used Weibull distribution, matching long-tailed distributions and allowing clustering and time dependence of the eruption sequence, with distribution parameters that can be readily obtained from the observed occurrence rates. Thus, the mixture of exponentials turns out to be more precise and much easier to apply than the Weibull distribution. We recommend the use of a mixture of exponentials distribution when regimes with well-defined eruption rates can be identified in the cumulative series of events. As an example, we apply the mixture of exponential distributions to the repose-time sequences between explosive eruptions of the Colima and Popocatépetl volcanoes, México, and compare the results obtained with the Weibull and other distributions.
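The mixture-of-exponentials survival function described above is a weighted sum of exponential terms, one per eruption-rate regime, and the hazard quantity of interest is the probability of an eruption within a given interval. A minimal sketch with hypothetical regime rates and weights (not the fitted Colima or Popocatépetl values):

```python
import numpy as np

def mixture_survival(t, rates, weights):
    """P(repose time > t) for a mixture of exponential distributions,
    one component per eruption-rate regime:
        S(t) = sum_i w_i * exp(-rate_i * t)."""
    rates = np.asarray(rates, dtype=float)
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()  # ensure the weights sum to 1
    return float(np.sum(weights * np.exp(-rates * t)))

def eruption_probability(t, rates, weights):
    """Probability of at least one eruption within time t of the last one."""
    return 1.0 - mixture_survival(t, rates, weights)

# Hypothetical regimes (events per year), purely illustrative:
rates = [0.05, 0.5]    # a long-repose regime and a clustered regime
weights = [0.7, 0.3]   # fractions of eruptions in each regime
p20 = eruption_probability(20.0, rates, weights)
```

With these assumed values, the probability of an eruption within 20 years of the last one is about 0.74, dominated by the long-repose component once the clustered regime's contribution has decayed.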
International Nuclear Information System (INIS)
EI-Shanshoury, G.I.
2011-01-01
Several statistical distributions are used to model various reliability and maintainability parameters. The applied distribution depends on the nature of the data being analyzed. The present paper deals with the analysis of some statistical distributions used in reliability, in order to identify the best-fitting distribution. The calculations rely on circuit quantity parameters obtained by using the Relex 2009 computer program. The statistical analysis of ten different distributions indicated that the Weibull distribution gives the best fit for modeling the reliability of the data set of the Temperature Alarm Circuit (TAC). However, the exponential distribution is found to be the best fit for modeling the failure rate.
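Best-fit screening of the kind described in this record can be reproduced with maximum likelihood fits and an information criterion. A hedged sketch on synthetic failure times (Relex and the TAC data set are not available here), comparing a few candidate distributions by AIC:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Synthetic times-to-failure standing in for the circuit data.
ttf = stats.weibull_min.rvs(1.5, scale=1000.0, size=200, random_state=rng)

# Candidate reliability distributions, each fitted with location
# fixed at zero (lifetimes are non-negative).
candidates = {
    "weibull": stats.weibull_min,
    "exponential": stats.expon,
    "lognormal": stats.lognorm,
    "gamma": stats.gamma,
}

aic = {}
for name, dist in candidates.items():
    params = dist.fit(ttf, floc=0)
    ll = dist.logpdf(ttf, *params).sum()
    k = len(params) - 1  # loc was fixed, not estimated
    aic[name] = 2 * k - 2 * ll

best = min(aic, key=aic.get)  # lowest AIC wins
```

On Weibull-generated data the Weibull fit should beat the nested exponential by a wide AIC margin; on real data the ranking is, of course, an empirical question.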
On the Distribution of Earthquake Interevent Times and the Impact of Spatial Scale
Hristopulos, Dionissios
2013-04-01
The distribution of earthquake interevent times is a subject that has attracted much attention in the statistical physics literature [1-3]. A recent paper proposes that the distribution of earthquake interevent times follows from the interplay of the crustal strength distribution and the loading function (stress versus time) of the Earth's crust locally [4]. It was also shown that the Weibull distribution describes earthquake interevent times provided that the crustal strength also follows the Weibull distribution and that the loading function follows a power law during the loading cycle. I will discuss the implications of this work and will present supporting evidence based on the analysis of data from seismic catalogs. I will also discuss the theoretical evidence in support of the Weibull distribution based on models of statistical physics [5]. Since other-than-Weibull interevent time distributions are not excluded in [4], I will illustrate the use of the Kolmogorov-Smirnov test in order to determine which probability distributions are not rejected by the data. Finally, we propose a modification of the Weibull distribution if the size of the system under investigation (i.e., the area over which the earthquake activity occurs) is finite with respect to a critical link size. keywords: hypothesis testing, modified Weibull, hazard rate, finite size References [1] Corral, A., 2004. Long-term clustering, scaling, and universality in the temporal occurrence of earthquakes, Phys. Rev. Lett., 92(10), art. no. 108501. [2] Saichev, A., Sornette, D., 2007. Theory of earthquake recurrence times, J. Geophys. Res., Ser. B 112, B04313/1-26. [3] Touati, S., Naylor, M., Main, I.G., 2009. Origin and nonuniversality of the earthquake interevent time distribution, Phys. Rev. Lett., 102(16), art. no. 168501. [4] Hristopulos, D.T., 2003. Spartan Gibbs random field models for geostatistical applications, SIAM Jour. Sci. Comput., 24, 2125-2162. [5] I. Eliazar and J. Klafter, 2006
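The Kolmogorov-Smirnov screening mentioned above can be sketched as follows, on synthetic interevent times rather than catalog data. One caveat worth keeping in the code: estimating the parameters from the same sample makes the standard KS p-values optimistic, so non-rejection here is a necessary rather than sufficient check.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Synthetic interevent times (days); a real analysis would use a
# seismic catalog.
times = stats.weibull_min.rvs(0.9, scale=30.0, size=300, random_state=rng)

results = {}
for name, dist in {"weibull": stats.weibull_min,
                   "gamma": stats.gamma,
                   "expon": stats.expon}.items():
    params = dist.fit(times, floc=0)
    d, p = stats.kstest(times, dist.cdf, args=params)
    # Caveat: parameters estimated from the same sample make these
    # p-values optimistic (Lilliefors effect); a parametric bootstrap
    # would give calibrated p-values.
    results[name] = (d, p)

# Candidate distributions that the data do not reject at the 5% level.
not_rejected = [n for n, (d, p) in results.items() if p > 0.05]
```

The survivors of this screen are the candidates worth carrying into a more careful (e.g. bootstrap-calibrated) comparison.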
de Oliveira, Thales Leandro Coutinho; Soares, Rodrigo de Araújo; Piccoli, Roberta Hilsdorf
2013-03-01
The antimicrobial effect of oregano (Origanum vulgare L.) and lemongrass (Cymbopogon citratus (DC.) Stapf.) essential oils (EOs) against Salmonella enterica serotype Enteritidis was evaluated in in vitro experiments and in inoculated ground bovine meat during refrigerated storage (4±2 °C) for 6 days. The Weibull model was tested to fit the bacterial survival/inactivation curves (estimating the p and δ parameters). The minimum inhibitory concentration (MIC) value for both EOs against S. Enteritidis was 3.90 μl/ml. The EO concentrations applied in the ground beef were 3.90, 7.80 and 15.60 μl/g, based on the MIC level and the possible reduction of activity by food constituents. Both EOs, at all tested levels, showed antimicrobial effects, with microbial populations decreasing (p≤0.05) over storage time. Based on the goodness-of-fit parameters (RSS and RSE), Weibull models are able to describe the inactivation curves of the EOs against S. Enteritidis. The application of EOs in processed meats can be used to control pathogens during refrigerated shelf-life. Copyright © 2012 Elsevier Ltd. All rights reserved.
Idealized models of the joint probability distribution of wind speeds
Monahan, Adam H.
2018-05-01
The joint probability distribution of wind speeds at two separate locations in space or points in time completely characterizes the statistical dependence of these two quantities, providing more information than linear measures such as correlation. In this study, we consider two models of the joint distribution of wind speeds obtained from idealized models of the dependence structure of the horizontal wind velocity components. The bivariate Rice distribution follows from assuming that the wind components have Gaussian and isotropic fluctuations. The bivariate Weibull distribution arises from power law transformations of wind speeds corresponding to vector components with Gaussian, isotropic, mean-zero variability. Maximum likelihood estimates of these distributions are compared using wind speed data from the mid-troposphere, from different altitudes at the Cabauw tower in the Netherlands, and from scatterometer observations over the sea surface. While the bivariate Rice distribution is more flexible and can represent a broader class of dependence structures, the bivariate Weibull distribution is mathematically simpler and may be more convenient in many applications. The complexity of the mathematical expressions obtained for the joint distributions suggests that the development of explicit functional forms for multivariate speed distributions from distributions of the components will not be practical for more complicated dependence structure or more than two speed variables.
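The construction behind the bivariate Weibull case, power-law transforms of speeds built from Gaussian, isotropic, mean-zero velocity components, can be sampled directly. A sketch assuming unit component variances and a single cross-site correlation rho shared by the like components (u with u, v with v), which is one simple way to realize the dependence structure described:

```python
import numpy as np

rng = np.random.default_rng(0)

def bivariate_weibull_sample(n, rho, shapes=(2.0, 2.0), scales=(1.0, 1.0)):
    """Sample correlated wind speeds at two sites, each marginally
    Weibull, from isotropic mean-zero Gaussian velocity components
    whose like components at the two sites have correlation rho."""
    cov = np.array([[1.0, rho], [rho, 1.0]])
    u = rng.multivariate_normal([0.0, 0.0], cov, size=n)  # (n, 2)
    v = rng.multivariate_normal([0.0, 0.0], cov, size=n)  # (n, 2)
    r = np.hypot(u, v)       # Rayleigh speeds: Weibull shape 2, scale sqrt(2)
    r /= np.sqrt(2.0)        # normalize to unit Weibull scale
    w = np.empty_like(r)
    for i, (k, lam) in enumerate(zip(shapes, scales)):
        # Power-law transform: if R is Weibull(2, 1), then R**(2/k)
        # is Weibull with shape k; lam sets the scale.
        w[:, i] = lam * r[:, i] ** (2.0 / k)
    return w

w = bivariate_weibull_sample(20000, rho=0.8, shapes=(2.4, 1.9))
```

The marginal shape parameters are set independently per site, while the component correlation rho controls the strength of the dependence between the two transformed speeds.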
Directory of Open Access Journals (Sweden)
Petros Damos
Full Text Available Temperature implies contrasting biological causes of demographic aging in poikilotherms. In this work, we used reliability theory to describe the consistency of mortality with age in moth populations and to show that differentiation in hazard rates is related to extrinsic environmental causes such as temperature. Moreover, experiments that manipulate extrinsic mortality were used to distinguish temperature-related death rates and the pertinence of the Weibull aging model. The Newton-Raphson optimization method was applied to calculate parameters for small samples of ages at death, by estimating the maximum likelihood surfaces using scored gradient vectors and the Hessian matrix. The study reveals for the first time that the Weibull function is able to describe contrasting biological causes of demographic aging for moth populations maintained at different temperature regimes. We demonstrate that under favourable conditions the insect death rate accelerates as age advances, in contrast to the extreme temperatures at which each individual drifts toward death in a linear fashion and has a constant chance of passing away. Moreover, the slope of the hazard rates shifts towards a constant initial rate, a pattern demonstrated by systems which are not wearing out (i.e. non-aging), since failure, or death, is a random event independent of time. This finding may appear surprising because, traditionally, it was mostly thought as a rule that in an aging population the force of mortality increases exponentially until all individuals have died. Moreover, in contrast to other studies, we have not observed any typical decelerating aging patterns at late life (mortality leveling-off), but rather accelerated hazard rates at optimum temperatures and a stabilized increase at the extremes. In most cases, the increase in aging-related mortality was simulated reasonably well by the Weibull survivorship model that was applied. Moreover, semi log- probability hazard
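The contrast drawn above maps onto the Weibull hazard h(t) = (k/λ)(t/λ)^(k-1): a shape parameter k > 1 gives a death rate that accelerates with age (the favourable-temperature pattern), while k = 1 reduces to a constant, age-independent risk (the extreme-temperature, non-aging pattern). A minimal illustration:

```python
def weibull_hazard(t, k, lam=1.0):
    """Weibull hazard (force of mortality): h(t) = (k/lam)*(t/lam)**(k-1).
    k > 1: risk accelerates with age (wear-out / aging);
    k = 1: constant risk, death is a random age-independent event."""
    return (k / lam) * (t / lam) ** (k - 1)

# Illustrative shape values, not fitted to the moth data.
ages = (0.5, 1.0, 2.0)
aging = [weibull_hazard(t, k=2.5) for t in ages]         # accelerating
random_death = [weibull_hazard(t, k=1.0) for t in ages]  # constant
```

With k = 2.5 the hazard roughly octuples from age 0.5 to age 2.0, while with k = 1 it stays flat, which is exactly the qualitative distinction the abstract draws between optimal and extreme temperature regimes.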
Directory of Open Access Journals (Sweden)
Marlen Navarro
Full Text Available To assess the vigour of Albizia lebbeck seeds by evaluating seedling emergence with the modified Weibull function, sowing was carried out under three environmental conditions and at different seed storage times. The design was completely randomized with a factorial arrangement. Analysis of variance was performed on the parameters M (maximum cumulative emergence), k (emergence rate) and Z (lag before the onset of emergence) of the modified Weibull function. From six months of storage onwards (44.1%), a sharp loss in the percentage M was observed in the nursery (A), with slight variations in the growth cabinet (C) compared with A and B (shade house). The dispersion range of the parameter k was 0.4-2.6, 0.29-1.9 and 0.5-1.4% emergence d-1 for the evaluations in A, B and C, respectively. The analysis of Z indicated that the time to onset of emergence, regardless of the sowing environment, fell between 3.0 and 7.3 days after sowing. In the nursery under full sun, at the evaluation six months after the start of storage, the best values of the biological parameters of the Weibull equation were obtained, and the overall analysis indicated a high degree of vigour in A. lebbeck seeds compared with the remaining evaluations
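A modified Weibull emergence curve with the parameters named in this record (M, k, Z) is commonly written with an additional shape exponent; the exponent c, the functional form, and all data values below are assumptions for illustration, not the study's fitted values:

```python
import numpy as np
from scipy.optimize import curve_fit

def emergence(t, M, k, Z, c):
    """Modified Weibull cumulative emergence: zero before the lag Z,
    then a saturating rise toward the asymptote M."""
    dt = np.clip(np.asarray(t, dtype=float) - Z, 0.0, None)
    return M * (1.0 - np.exp(-k * dt**c))

# synthetic emergence (%): M = 90, k = 0.5, Z = 4 d, c = 1.3, small noise
t_obs = np.arange(0.0, 21.0, 1.0)
rng = np.random.default_rng(1)
y_obs = emergence(t_obs, 90.0, 0.5, 4.0, 1.3) + rng.normal(0.0, 1.0, t_obs.size)

popt, _ = curve_fit(emergence, t_obs, y_obs, p0=[80.0, 0.3, 3.0, 1.0],
                    bounds=([0.0, 0.0, 0.0, 0.2], [200.0, 5.0, 10.0, 5.0]))
M_hat, k_hat, Z_hat, c_hat = popt
```

An ANOVA over the fitted M, k and Z values per treatment, as in the study, would then follow from fits like this one.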
Attar, M.; Karrech, A.; Regenauer-Lieb, K.
2014-05-01
The free vibration of a shear deformable beam with multiple open edge cracks is studied using a lattice spring model (LSM). The beam is supported by a so-called two-parameter elastic foundation, in which both normal and shear foundation stiffnesses are considered. Through application of Timoshenko beam theory, the effects of transverse shear deformation and rotary inertia are taken into account. In the LSM, the beam is discretised into a one-dimensional assembly of segments interacting via rotational and shear springs, which represent the flexural and shear stiffnesses of the beam. The supporting action of the elastic foundation is also described by means of normal and shear springs acting on the centres of the segments. The relationships between the stiffnesses of the springs and the elastic properties of the one-dimensional structure are identified by comparing the homogenised equations of motion of the discrete system with Timoshenko beam theory.
Chen, Fan; Huang, Shaoxiong; Ding, Jinjin; Ding, Jinjin; Gao, Bo; Xie, Yuguang; Wang, Xiaoming
2018-01-01
This paper proposes a fast reliability assessment method for distribution grids with distributed renewable generation. First, the Weibull distribution and the Beta distribution are used to describe the probability distributions of wind speed and solar irradiance, respectively, and models of the wind farm, solar park and local load are built for reliability assessment. Then, based on production cost simulation, probability discretization and linearized power flow, an optimal power flow problem that minimizes the cost of conventional generation is solved. In this way, the reliability of the distribution grid is assessed quickly and accurately. The Loss Of Load Probability (LOLP) and Expected Energy Not Supplied (EENS) are selected as reliability indices; a simulation of the IEEE RBTS BUS6 system in MATLAB indicates that the proposed method computes these indices much faster than the Monte Carlo method while preserving accuracy.
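The Monte Carlo baseline that this record compares against can be sketched as follows: sample wind speed from a Weibull, irradiance from a Beta, convert to power, and accumulate LOLP and EENS. Every capacity, power curve breakpoint, and load parameter below is invented purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 200_000                                   # Monte Carlo hourly snapshots

# wind: Weibull(k=2, c=8 m/s); assumed 2 MW turbine with cut-in 3,
# rated 12 and cut-out 25 m/s, linear ramp between cut-in and rated
v = 8.0 * rng.weibull(2.0, N)
p_wind = np.where(v < 3.0, 0.0,
          np.where(v < 12.0, 2.0 * (v - 3.0) / (12.0 - 3.0),
          np.where(v < 25.0, 2.0, 0.0)))

# solar: Beta(2, 3) irradiance fraction scaled to an assumed 1 MW park
p_solar = 1.0 * rng.beta(2.0, 3.0, N)

# load: normal around 2.5 MW; assumed 1 MW of conventional backup
load = rng.normal(2.5, 0.3, N)
shortfall = np.clip(load - (p_wind + p_solar + 1.0), 0.0, None)

lolp = float(np.mean(shortfall > 0))          # Loss Of Load Probability
eens = float(shortfall.mean()) * 8760.0       # Expected Energy Not Supplied, MWh/yr
```

The paper's contribution is replacing this sampling loop with discretized probability distributions and linearized power flow, which removes the need for a large N.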
DEFF Research Database (Denmark)
Gryning, Sven-Erik; Batchvarova, Ekaterina; Floors, Rogier Ralph
2014-01-01
An investigation of the long-term variability of wind profiles for wind energy applications is presented. The observations consist of wind measurements obtained from a ground-based wind lidar at heights between 100 and 600 m, in combination with measurements from tall meteorological towers...... by the root-mean-square error was about 10% lower for the analysis compared to the forecast simulations. At the rural coastal site, the observed mean wind speeds above 60 m were underestimated by both the analysis and forecast model runs. For the inland suburban area, the mean wind speed is overestimated...
DEFF Research Database (Denmark)
Gryning, Sven-Erik; Floors, Rogier Ralph; Peña, Alfredo
2016-01-01
Wind-speed observations from tall towers are used in combination with observations up to 600 m in altitude from a Doppler wind lidar to study the long-term conditions over suburban (Hamburg), rural coastal (Høvsøre) and marine (FINO3) sites. The variability in the wind field among the sites is ex...... of the vertical profile of the shape parameter fits well with observations over land, coastal regions and over the sea. An applied model for the dependence of the reversal height on the surface roughness is in good agreement with the observations over land....
DEFF Research Database (Denmark)
Gryning, Sven-Erik; Batchvarova, Ekaterina; Floors, Rogier
2013-01-01
By use of 1 yr of measurements performed with a wind lidar up to 600-m height, in combination with a tall meteorological tower, the impact of nudging on the simulated wind profile at a flat coastal site (Høvsøre) in western Denmark using the Advanced Research version of the Weather Research...
Indrayani, Ervina; Dimara, Lisiard; Paiki, Kalvin; Reba, Felix
2018-01-01
The coastal waters of East Yapen is one of the spawning sites and areas of care for marine biota in Papua. Because of its very open location, it is widely used by human activities such as fishing, residential, industrial and cruise lines. This indirectly affects the balance of coastal waters condition of East Yapen that impact on the existence of…
Institute of Scientific and Technical Information of China (English)
黄献; 刘裕恒; 莫志江
2006-01-01
Objective: To introduce the use of SPSS to fit Weibull parameters for drug dissolution data. Methods: The Weibull model (Y = 1 - e^(-(t-τ)^m/t0)) was fitted to drug dissolution data by nonlinear regression in SPSS. Results: Compared with other current methods of fitting the Weibull model to dissolution data, SPSS gave high accuracy with a simple and fast computation. Conclusion: The method is easy to carry out and gives accurate results.
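The same nonlinear regression can be reproduced outside SPSS; below is a scipy-based sketch of fitting the lagged Weibull release model Y = 1 - exp(-(t-τ)^m/t0) to synthetic dissolution data (all time points and parameter values are made up for illustration):

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_release(t, tau, m, t0):
    """Cumulative fraction released: zero before the lag tau, then
    Y = 1 - exp(-(t - tau)^m / t0)."""
    dt = np.clip(np.asarray(t, dtype=float) - tau, 0.0, None)
    return 1.0 - np.exp(-dt**m / t0)

# synthetic dissolution profile (minutes): tau = 2, m = 1.1, t0 = 12
t_obs = np.array([5, 10, 15, 20, 30, 45, 60, 90], dtype=float)
rng = np.random.default_rng(7)
y_obs = np.clip(weibull_release(t_obs, 2.0, 1.1, 12.0)
                + rng.normal(0.0, 0.01, t_obs.size), 0.0, 1.0)

(tau_hat, m_hat, t0_hat), _ = curve_fit(
    weibull_release, t_obs, y_obs, p0=[1.0, 1.0, 10.0],
    bounds=([0.0, 0.1, 0.1], [5.0, 5.0, 100.0]))
```

The shape m distinguishes release mechanisms in the usual Weibull reading (m < 1 parabolic, m = 1 first-order, m > 1 sigmoid).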
Stability of the laws for the distribution of the cumulative failures in railway transport
Directory of Open Access Journals (Sweden)
Kirill VOYNOV
2008-01-01
Full Text Available There are very many different laws of distribution (for example, the bell-shaped (Gaussian), lognormal, Weibull, exponential, uniform, Poisson and Student distributions, and so on) which help to describe the real picture of failures of elements in various mechanical systems, including locomotives and carriages. To reduce the possibility of gross errors in the output of the mathematical data treatment, a new method is demonstrated in this article. The task is solved for both discrete and continuous distributions.
Langenbucher, Frieder
2003-01-01
MS Excel is a useful tool to handle in vitro/in vivo correlation (IVIVC) distribution functions, with emphasis on the Weibull and the biexponential distribution, which are most useful for the presentation of cumulative profiles, e.g. release in vitro or urinary excretion in vivo, and differential profiles such as the plasma response in vivo. The discussion includes moments (AUC and mean) as summarizing statistics, and data-fitting algorithms for parameter estimation.
Sun, Chengqi; Liu, Xiaolong; Hong, Youshi
2015-06-01
In this paper, ultrasonic (20 kHz) fatigue tests were performed on specimens of a high-strength steel in the very high cycle fatigue (VHCF) regime. Experimental results showed that for most tested specimens that failed in the VHCF regime, a fatigue crack originated from the interior of the specimen with a fish-eye pattern, which contained a fine granular area (FGA) centered on an inclusion as the crack origin. A two-parameter model is then proposed to predict the fatigue life of high-strength steels with fish-eye mode failure in the VHCF regime, which takes into account the inclusion size and the FGA size. The model was verified by the data of the present experiments and those in the literature. Furthermore, an analytic formula was obtained for estimating the equivalent crack growth rate within the FGA. The results also indicated that the stress intensity factor range at the front of the FGA varies within a small range, irrespective of stress amplitude and fatigue life.
Evaluación poscosecha y estimación de vida útil de guayaba fresca utilizando el modelo de Weibull
Directory of Open Access Journals (Sweden)
García Mogollón Carlos
2010-09-01
Full Text Available
Guava (Psidium guajava L.) is a tropical fruit susceptible to undesirable alterations that shorten its shelf life under inadequate storage and handling conditions. In this work the shelf life of fresh guava was estimated using the Weibull probabilistic model, and fruit quality during storage was evaluated under different temperature and packaging conditions. The postharvest evaluation was carried out for 15 days on guavas of the red regional variety. A completely randomized design with a factorial arrangement was used, with three factors: storage time with six levels (0, 3, 6, 9, 12 and 15 days); storage temperature with two levels, ambient (37 °C and 85-90% relative humidity (RH)) and refrigerated (9±2 °C and 85-90% RH); and two types of packaging, polystyrene tray with PVC plastic film and aluminium foil. Sensory evaluation during the storage period used a three-point structured satisfaction scale. The Weibull model proved adequate for predicting the shelf life of fresh guava based on the goodness-of-fit criteria and the acceptance and failure confidence limits. During the storage period, time, temperature and packaging type had a statistically significant effect (P < 0.05) on the equivalent diameter, sphericity, apparent density, total soluble solids, pH, acidity and sensory evaluation of the fruit. The product can be consumed as fresh fruit for up to ten days of storage at ambient temperature and a maximum of fifteen days under refrigerated storage.
International Nuclear Information System (INIS)
Ochiai, S; Matsubayashi, H; Okuda, H; Osamura, K; Otto, A; Malozemoff, A
2009-01-01
Distributions of the local and overall critical currents and the correlation of the n value to the critical current of bent Bi2223 composite tape were studied from a statistical viewpoint. Data on the local and overall transport critical currents and n values of the Bi2223 composite tape specimens were collected experimentally over a wide range of bending strain (0-1.1%), using specimens designed to characterize the local and overall critical currents and n values. The measured local and overall critical currents were analyzed with various types of Weibull distribution function. We determined which of the Weibull distribution functions is suitable for describing the distribution of local and overall critical currents at each bending strain, and how much the Weibull parameter values characterizing the distribution vary with bending strain. We then attempted to reproduce the overall critical current distribution and the correlation of the overall n value to the overall critical current from the distribution of local critical currents and the correlation of the local n value to the local critical current by a Monte Carlo simulation. The measured average values of critical current and n value at each bending strain, and the correlation of n value to critical current, were reproduced well by the present simulation, while the distribution of critical current values was reproduced fairly well but not fully. The reason for this is discussed.
Handbook of exponential and related distributions for engineers and scientists
Pal, Nabendu; Lim, Wooi K
2005-01-01
The normal distribution is widely known and used by scientists and engineers. However, there are many cases when the normal distribution is not appropriate, due to the data being skewed. Rather than leaving you to search through journal articles, advanced theoretical monographs, or introductory texts for alternative distributions, the Handbook of Exponential and Related Distributions for Engineers and Scientists provides a concise, carefully selected presentation of the properties and principles of selected distributions that are most useful for application in the sciences and engineering.The book begins with all the basic mathematical and statistical background necessary to select the correct distribution to model real-world data sets. This includes inference, decision theory, and computational aspects including the popular Bootstrap method. The authors then examine four skewed distributions in detail: exponential, gamma, Weibull, and extreme value. For each one, they discuss general properties and applicabi...
DEFF Research Database (Denmark)
Missov, Trifon I.; Schöley, Jonas
to this criterion admissible distributions are, for example, the gamma, the beta, the truncated normal, the log-logistic and the Weibull, while distributions like the log-normal and the inverse Gaussian do not satisfy this condition. In this article we show that models with admissible frailty distributions...... and a Gompertz baseline provide a better fit to adult human mortality data than the corresponding models with non-admissible frailty distributions. We implement estimation procedures for mixture models with a Gompertz baseline and frailty that follows a gamma, truncated normal, log-normal, or inverse Gaussian...
Directory of Open Access Journals (Sweden)
Alberto Cargnelutti Filho
2004-12-01
Full Text Available The objective of this work was to verify the fit of data series of ten-day mean global solar radiation to the normal, log-normal, gamma, Gumbel and Weibull probability distribution functions. Data were collected from 22 cities in Rio Grande do Sul State, Brazil. The Kolmogorov-Smirnov goodness-of-fit test was applied to the 792 data series (22 localities x 36 ten-day periods) of ten-day mean global solar radiation to verify the fit of the data to the normal, log-normal, gamma, Gumbel and Weibull probability distribution functions, totalling 3,960 tests. The ten-day means of global solar radiation fit all five probability distribution functions, with the best fit given by the normal distribution.
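The fit-and-test loop used in this record (and in several others in this section) can be sketched with scipy: fit each candidate distribution by maximum likelihood, then apply the Kolmogorov-Smirnov test. The synthetic "radiation" data below is an assumption for illustration; note also that KS p-values computed with fitted parameters are optimistic (the Lilliefors caveat), so the statistic D is the safer ranking criterion:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
radiation = rng.normal(18.0, 3.0, 360)   # synthetic ten-day means, MJ m-2 d-1

candidates = {
    "normal":    stats.norm,
    "lognormal": stats.lognorm,
    "gamma":     stats.gamma,
    "gumbel":    stats.gumbel_r,
    "weibull":   stats.weibull_min,
}

results = {}
for name, dist in candidates.items():
    params = dist.fit(radiation)                      # MLE fit of each candidate
    d, p = stats.kstest(radiation, dist.cdf, args=params)
    results[name] = (d, p)

best = min(results, key=lambda n: results[n][0])      # smallest KS statistic D
```

Repeating this over all 792 series and 5 candidates reproduces the study's 3,960 tests.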
Energy Technology Data Exchange (ETDEWEB)
Muñoz-Jaramillo, Andrés; Windmueller, John C.; Amouzou, Ernest C.; Longcope, Dana W. [Department of Physics, Montana State University, Bozeman, MT 59717 (United States); Senkpeil, Ryan R. [Department of Physics, Purdue University, West Lafayette, IN 47907 (United States); Tlatov, Andrey G. [Kislovodsk Mountain Astronomical Station of the Pulkovo Observatory, Kislovodsk 357700 (Russian Federation); Nagovitsyn, Yury A. [Pulkovo Astronomical Observatory, Russian Academy of Sciences, St. Petersburg 196140 (Russian Federation); Pevtsov, Alexei A. [National Solar Observatory, Sunspot, NM 88349 (United States); Chapman, Gary A.; Cookson, Angela M. [San Fernando Observatory, Department of Physics and Astronomy, California State University Northridge, Northridge, CA 91330 (United States); Yeates, Anthony R. [Department of Mathematical Sciences, Durham University, South Road, Durham DH1 3LE (United Kingdom); Watson, Fraser T. [National Solar Observatory, Tucson, AZ 85719 (United States); Balmaceda, Laura A. [Institute for Astronomical, Terrestrial and Space Sciences (ICATE-CONICET), San Juan (Argentina); DeLuca, Edward E. [Harvard-Smithsonian Center for Astrophysics, Cambridge, MA 02138 (United States); Martens, Petrus C. H., E-mail: munoz@solar.physics.montana.edu [Department of Physics and Astronomy, Georgia State University, Atlanta, GA 30303 (United States)
2015-02-10
In this work, we take advantage of 11 different sunspot group, sunspot, and active region databases to characterize the area and flux distributions of photospheric magnetic structures. We find that, when taken separately, different databases are better fitted by different distributions (as has been reported previously in the literature). However, we find that all our databases can be reconciled by the simple application of a proportionality constant, and that, in reality, different databases are sampling different parts of a composite distribution. This composite distribution is made up of a linear combination of Weibull and log-normal distributions, where a pure Weibull (log-normal) characterizes the distribution of structures with fluxes below (above) 10^21 Mx (10^22 Mx). Additionally, we demonstrate that the Weibull distribution shows the expected linear behavior of a power-law distribution (when extended to smaller fluxes), making our results compatible with the results of Parnell et al. We propose that this is evidence of two separate mechanisms giving rise to visible structures on the photosphere: one directly connected to the global component of the dynamo (and the generation of bipolar active regions), and the other with the small-scale component of the dynamo (and the fragmentation of magnetic structures due to their interaction with turbulent convection).
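A composite density of the kind described above can be sketched as a weighted sum of a Weibull and a log-normal component, with each component dominating its own flux range. The weight and all distribution parameters below are invented for illustration and are not the paper's fitted values:

```python
import numpy as np
from scipy import stats

# hypothetical components (flux in Mx); parameters are illustrative only
w = 0.7                                        # weight of the Weibull component
weib = stats.weibull_min(c=0.8, scale=1e21)    # small-flux component
logn = stats.lognorm(s=1.0, scale=3e22)        # large-flux component

def composite_pdf(flux):
    """Linear combination of a Weibull and a log-normal density."""
    return w * weib.pdf(flux) + (1.0 - w) * logn.pdf(flux)

flux = np.logspace(19, 24, 200)
pdf = composite_pdf(flux)

# check which component dominates well below 1e21 Mx and well above 1e22 Mx
small, large = 1e20, 1e23
ratio_small = w * weib.pdf(small) / composite_pdf(small)
ratio_large = (1.0 - w) * logn.pdf(large) / composite_pdf(large)
```

With parameters in these ranges, the Weibull term accounts for essentially all of the density at small fluxes and the log-normal term at large fluxes, mirroring the two-regime structure the paper reports.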
Directory of Open Access Journals (Sweden)
Pelumi E. Oguntunde
2017-01-01
Full Text Available Developing new compound distributions that are more flexible than existing distributions has become the new trend in distribution theory. In the present study, the Lomax distribution was extended using the Gompertz family of distributions; its resulting densities and statistical properties were carefully derived, and the method of maximum likelihood estimation was proposed for estimating the model parameters. A simulation study to assess the performance of the parameters of the Gompertz Lomax distribution was provided, and an application to real-life data was given to assess the potential of the newly derived distribution. The analysis indicates that the Gompertz Lomax distribution performed better than the Beta Lomax, Weibull Lomax and Kumaraswamy Lomax distributions.
Chang, Wen-Ruey; Matz, Simon; Chang, Chien-Chi
2012-01-01
The available coefficient of friction (ACOF) for human locomotion is the maximum coefficient of friction that can be supported without a slip at the shoe and floor interface. A statistical model was introduced to estimate the probability of slip by comparing the ACOF with the required coefficient of friction, assuming that both coefficients have stochastic distributions. This paper presents an investigation of the stochastic distributions of the ACOF of quarry tiles under dry, water and glycerol conditions. One hundred friction measurements were performed on a walkway under the surface conditions of dry, water and 45% glycerol concentration. The Kolmogorov-Smirnov goodness-of-fit test was used to determine if the distribution of the ACOF was a good fit with the normal, log-normal and Weibull distributions. The results indicated that the ACOF appears to fit the normal and log-normal distributions better than the Weibull distribution for the water and glycerol conditions. However, no match was found between the distribution of ACOF under the dry condition and any of the three continuous distributions evaluated. Based on limited data, a normal distribution might be more appropriate due to its simplicity, practicality and familiarity among the three distributions evaluated.
Optimal power flow for distribution networks with distributed generation
Directory of Open Access Journals (Sweden)
Radosavljević Jordan
2015-01-01
Full Text Available This paper presents a genetic algorithm (GA) based approach for the solution of the optimal power flow (OPF) in distribution networks with distributed generation (DG) units, including fuel cells, micro turbines, diesel generators, photovoltaic systems and wind turbines. The OPF is formulated as a nonlinear multi-objective optimization problem with equality and inequality constraints. Due to the stochastic nature of the energy produced from renewable sources, i.e. wind turbines and photovoltaic systems, as well as load uncertainties, a probabilistic algorithm is introduced in the OPF analysis. The Weibull and normal distributions are employed to model the input random variables, namely the wind speed, solar irradiance and load power. The 2m+1 point estimate method and the Gram-Charlier expansion theory are used to obtain the statistical moments and the probability density functions (PDFs) of the OPF results. The proposed approach is examined and tested on a modified IEEE 34 node test feeder with five different integrated DG units. The obtained results prove the efficiency of the proposed approach in solving both deterministic and probabilistic OPF problems for different forms of the multi-objective function. As such, it can serve as a useful decision-making support tool for distribution network operators. [Projekat Ministarstva nauke Republike Srbije, br. TR33046]
Determining the distribution of fitness effects using a generalized Beta-Burr distribution.
Joyce, Paul; Abdo, Zaid
2017-07-12
In Beisel et al. (2007), a likelihood framework based on extreme value theory (EVT) was developed for determining the distribution of fitness effects for adaptive mutations. In this paper we extend this framework beyond the extreme distributions and develop a likelihood framework for testing whether or not extreme value theory applies. By making two simple adjustments to the Generalized Pareto Distribution (GPD), we introduce a new simple five-parameter probability density function that incorporates nearly every common (continuous) probability model ever used, which means that all of the common models are nested. This has important implications for model selection beyond determining the distribution of fitness effects. We demonstrate the use of this distribution, employing likelihood ratio testing to evaluate alternative distributions to the Gumbel and Weibull domains of attraction of fitness effects. We use a bootstrap strategy, utilizing importance sampling, to determine where in the parameter space the test will be most powerful in detecting deviations from these domains, and at what sample size, with focus on small sample sizes (n<20). Our results indicate that the likelihood ratio test is most powerful in detecting deviation from the Gumbel domain when the shape parameters of the model are small, while the test is more powerful in detecting deviations from the Weibull domain when these parameters are large. As expected, an increase in sample size improves the power of the test. This improvement is observed to occur quickly with sample size: n≥10 in tests related to the Gumbel domain and n≥15 in the case of the Weibull domain. This manuscript is in tribute to the contributions of Dr. Paul Joyce to the areas of Population Genetics, Probability Theory and Mathematical Statistics. A Tribute section is provided at the end that includes Paul's original writing in the first iterations of this manuscript. The Introduction and Alternatives to the GPD sections
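The basic likelihood-ratio comparison underlying this record can be sketched for one nested pair: the Gumbel domain corresponds to a GPD with shape 0 (an exponential), which is nested inside the general GPD. This is only a schematic of the testing idea, on synthetic data, not the paper's five-parameter family or its bootstrap calibration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
effects = rng.exponential(1.0, 15)    # small sample of fitness effects (n = 15)

# null model: exponential (GPD with shape 0); scale MLE is the sample mean
loglik_null = stats.expon.logpdf(effects, scale=effects.mean()).sum()

# alternative: general GPD with location fixed at 0
c_hat, loc_hat, scale_hat = stats.genpareto.fit(effects, floc=0.0)
loglik_alt = stats.genpareto.logpdf(effects, c_hat, loc=0.0,
                                    scale=scale_hat).sum()

lr = 2.0 * (loglik_alt - loglik_null)          # likelihood-ratio statistic
p_value = stats.chi2.sf(max(lr, 0.0), df=1)    # asymptotic chi-square reference
```

At such small n, the chi-square reference is rough, which is precisely why the paper calibrates the test with a bootstrap instead.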
Stand diameter distribution modelling and prediction based on Richards function.
Directory of Open Access Journals (Sweden)
Ai-guo Duan
Full Text Available The objective of this study was to introduce the application of the Richards equation to modelling and prediction of stand diameter distributions. The long-term repeated measurement data sets, consisting of 309 diameter frequency distributions from Chinese fir (Cunninghamia lanceolata) plantations in southern China, were used; 150 stands served as fitting data and the other 159 stands were used for testing. The nonlinear regression method (NRM) or the maximum likelihood estimation method (MLEM) was applied to estimate the parameters of the models, and the parameter prediction method (PPM) and parameter recovery method (PRM) were used to predict the diameter distributions of unknown stands. Four main conclusions were obtained: (1) the R distribution presented a more accurate simulation than the three-parameter Weibull function; (2) the parameters p, q and r of the R distribution proved to be its scale, location and shape parameters, and have a deep relationship with stand characteristics, which means the parameters of the R distribution have good theoretical interpretation; (3) the ordinate of the inflection point of the R distribution is significantly related to its skewness and kurtosis, and the fitted main distribution range for the cumulative diameter distribution of Chinese fir plantations was 0.4∼0.6; (4) the goodness-of-fit test showed that diameter distributions of unknown stands can be well estimated by applying the R distribution based on PRM, or the combination of PPM and PRM, under the condition that only the quadratic mean DBH, or the quadratic mean DBH plus stand age, is known; the non-rejection rates were near 80%, higher than the 72.33% non-rejection rate of the three-parameter Weibull function based on the combination of PPM and PRM.
Tahouneh, Vahid; Naei, Mohammad Hasan
2016-03-01
The main purpose of this paper is to investigate the effect of bidirectional continuously graded nanocomposite materials on free vibration of thick shell panels rested on elastic foundations. The elastic foundation is considered as a Pasternak model after adding a shear layer to the Winkler model. The panels reinforced by randomly oriented straight single-walled carbon nanotubes are considered. The volume fractions of SWCNTs are assumed to be graded not only in the radial direction, but also in axial direction of the curved panel. This study presents a 2-D six-parameter power-law distribution for CNTs volume fraction of 2-D continuously graded nanocomposite that gives designers a powerful tool for flexible designing of structures under multi-functional requirements. The benefit of using generalized power-law distribution is to illustrate and present useful results arising from symmetric, asymmetric and classic profiles. The material properties are determined in terms of local volume fractions and material properties by Mori-Tanaka scheme. The 2-D differential quadrature method as an efficient numerical tool is used to discretize governing equations and to implement boundary conditions. The fast rate of convergence of the method is shown and results are compared against existing results in literature. Some new results for natural frequencies of the shell are prepared, which include the effects of elastic coefficients of foundation, boundary conditions, material and geometrical parameters. The interesting results indicate that a graded nanocomposite volume fraction in two directions has a higher capability to reduce the natural frequency than conventional 1-D functionally graded nanocomposite materials.
Modeling of speed distribution for mixed bicycle traffic flow
Directory of Open Access Journals (Sweden)
Cheng Xu
2015-11-01
Full Text Available Speed is a fundamental measure of traffic performance for highway systems. Many results exist for the speed characteristics of motorized vehicles. In this article, we study the speed distribution for mixed bicycle traffic, which has been ignored in the past. Field speed data were collected in Hangzhou, China, at different survey sites, under different traffic conditions and percentages of electric bicycles. The statistics of the field data show that the total mean speed of electric bicycles is 17.09 km/h, 3.63 km/h faster and 27.0% higher than that of regular bicycles. Normal, log-normal, gamma, and Weibull distribution models were tested against the speed data. The results of goodness-of-fit hypothesis tests imply that the log-normal and Weibull models fit the field data very well. The relationships between mean speed and electric bicycle proportion were then established using linear regression models, from which the mean speed for purely electric or purely regular bicycles can be obtained. The findings of this article will provide effective help for the safety and traffic management of mixed bicycle traffic.
International Nuclear Information System (INIS)
Jung, Christopher; Schindler, Dirk; Laible, Jessica; Buchholz, Alexander
2017-01-01
Highlights: • Evaluation of statistical properties of 10,016 empirical wind speed distributions. • Analysis of the shape of empirical wind speed distributions by L-moment ratios. • Introduction of a new system of wind speed distributions (Swd). • Random forests classification of the most appropriate distribution. • Comprehensive goodness of Swd fit evaluation on a global scale. - Abstract: Accurate modeling of empirical wind speed distributions is a crucial step in the estimation of average wind turbine power output. For this purpose, the Weibull distribution has often been fitted to empirical wind speed distributions. However, the Weibull distribution has been found to be insufficient to reproduce many wind speed regimes existing around the world. Results from previous studies demonstrate that numerous one-component distributions as well as mixture distributions provide a better goodness-of-fit to empirical wind speed distributions than the Weibull distribution. Moreover, there is considerable interest in applying a single system of distributions that can be utilized to reproduce the large majority of near-surface wind speed regimes existing around the world. Therefore, a system of wind speed distributions was developed that is capable of reproducing the main characteristics of existing wind speed regimes. The proposed system consists of two one-component distributions (Kappa and Wakeby) and one mixture distribution (Burr-Generalized Extreme Value). A random forests classifier was trained in order to select the most appropriate of these three distributions for each of 10,016 globally distributed empirical wind speed distributions. The shape of the empirical wind speed distributions was described by L-moment ratios. The L-moment ratios were used as predictor variables for the random forests classifier. The goodness-of-fit of the system of wind speed distributions was evaluated according to eleven goodness-of-fit metrics, which were merged into one
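The L-moment ratios used above as classifier features can be computed from a sample via probability-weighted moments. Below is a self-contained sketch on a synthetic Weibull wind speed sample (shape 2.05, scale 9.16 m/s, the values reported in the head abstract for Alaçatı); the formulas are the standard unbiased sample estimators:

```python
import numpy as np

def l_moment_ratios(x):
    """Sample L-moments via probability-weighted moments b0..b3.

    Returns (l1, l2, t3, t4): L-location, L-scale, L-skewness, L-kurtosis.
    """
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((i - 1) * x) / (n * (n - 1))
    b2 = np.sum((i - 1) * (i - 2) * x) / (n * (n - 1) * (n - 2))
    b3 = np.sum((i - 1) * (i - 2) * (i - 3) * x) / (n * (n - 1) * (n - 2) * (n - 3))
    l1 = b0
    l2 = 2.0 * b1 - b0
    l3 = 6.0 * b2 - 6.0 * b1 + b0
    l4 = 20.0 * b3 - 30.0 * b2 + 12.0 * b1 - b0
    return l1, l2, l3 / l2, l4 / l2

rng = np.random.default_rng(5)
wind = 9.16 * rng.weibull(2.05, 50_000)   # synthetic speeds, m/s
l1, l2, t3, t4 = l_moment_ratios(wind)
```

For this Weibull, l1 lands near the 8.11 m/s mean reported in the head abstract, and (t3, t4) give the shape coordinates the random forests classifier would consume.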
International Nuclear Information System (INIS)
El-Shanshoury, Gh. I.; El-Hemamy, S.T.
2013-01-01
The main objective of this paper is to identify an appropriate probability model and the best plotting-position formula to represent the maximum annual wind speed in east Cairo. This model can be used to estimate the extreme wind speed and return period at a particular site, as well as to determine the radioactive release distribution in case of an accident at a nuclear power plant. Wind speed probabilities can be estimated by using probability distributions. An accurate determination of the probability distribution for maximum wind speed data is very important in estimating the extreme value. The probability plots of the maximum annual wind speed (MAWS) in east Cairo are fitted to six major statistical distributions, namely Gumbel, Weibull, Normal, Log-Normal, Logistic and Log-Logistic, while eight plotting positions of Hosking and Wallis, Hazen, Gringorten, Cunnane, Blom, Filliben, Benard and Weibull are used for determining the exceedance probabilities. A proper probability distribution for representing the MAWS is selected by the statistical test criteria in frequency analysis. Therefore, the best plotting-position formula, which can be used to select the appropriate probability model representing the MAWS data, must be determined. The statistical test criteria, namely the probability plot correlation coefficient (PPCC), the root mean square error (RMSE), the relative root mean square error (RRMSE) and the maximum absolute error (MAE), are used to select the appropriate plotting position and distribution. The data obtained show that the maximum annual wind speed in east Cairo varies from 44.3 km/h to 96.1 km/h over a period of 39 years. The Weibull plotting position combined with the Normal distribution gave the best fit and the most reliable and accurate predictions of the wind speed in the study area, having the highest value of PPCC and the lowest values of RMSE, RRMSE and MAE
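The plotting-position-plus-PPCC selection procedure described above can be sketched numerically. The generalized formula below covers several of the listed plotting positions as special cases; the synthetic wind speed sample and the Normal-fit check are illustrative assumptions, not the paper's data:

```python
import numpy as np
from scipy import stats

def plotting_positions(n, a):
    """Generalized plotting-position formula p_i = (i - a) / (n + 1 - 2a).
    a = 0 gives Weibull, a = 0.375 Blom, a = 0.44 Gringorten, a = 0.5 Hazen."""
    i = np.arange(1, n + 1)
    return (i - a) / (n + 1 - 2 * a)

# Synthetic stand-in for 39 years of maximum annual wind speed (km/h)
x = np.sort(stats.norm.rvs(loc=70, scale=12, size=39, random_state=0))

# Probability plot correlation coefficient (PPCC) for a Normal fit with the
# Weibull plotting position: values near 1 indicate a good fit
p = plotting_positions(len(x), a=0.0)
q = stats.norm.ppf(p)
ppcc = np.corrcoef(q, x)[0, 1]
```

Repeating the PPCC computation for each candidate distribution and each plotting-position constant `a` gives the selection table the paper describes.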
International Nuclear Information System (INIS)
Das, Rabindra Nath; Kim, Jinseog; Park, Jeong-Soo
2015-01-01
In quality engineering, the most commonly used lifetime distributions are log-normal, exponential, gamma and Weibull. Experimental designs are useful for predicting the optimal operating conditions of the process in lifetime improvement experiments. In the present article, invariant robust first-order D-optimal designs are derived for correlated lifetime responses having the above four distributions. Robust designs are developed for some correlated error structures. It is shown that robust first-order D-optimal designs for these lifetime distributions are always robust rotatable but the converse is not true. Moreover, it is observed that these designs depend on the respective error covariance structure but are invariant to the above four lifetime distributions. This article generalizes the results of Das and Lin [7] for the above four lifetime distributions with general (intra-class, inter-class, compound symmetry, and tri-diagonal) correlated error structures. - Highlights: • This paper presents invariant robust first-order D-optimal designs under correlated lifetime responses. • The results of Das and Lin [7] are extended for the four lifetime (log-normal, exponential, gamma and Weibull) distributions. • This paper also generalizes the results of Das and Lin [7] to more general correlated error structures
Hyperbolic Cosine–Exponentiated Exponential Lifetime Distribution and its Application in Reliability
Directory of Open Access Journals (Sweden)
Omid Kharazmi
2017-02-01
Full Text Available Recently, Kharazmi and Saadatinik (2016) introduced a new family of lifetime distributions called the hyperbolic cosine-F (HCF) distribution. The present paper focuses on a special case of the HCF family with the exponentiated exponential distribution as the baseline (HCEE). Various properties of the proposed distribution are derived, including explicit expressions for the moments, quantiles, mode, moment generating function, failure rate function, mean residual lifetime, order statistics and entropy. The parameters of the HCEE distribution are estimated by eight methods: maximum likelihood, Bayesian, maximum product of spacings, parametric bootstrap, non-parametric bootstrap, percentile, least squares and weighted least squares. A simulation study is conducted to examine the bias and mean square error of the maximum likelihood estimators. Finally, one real data set is analyzed for illustrative purposes, and it is observed that the proposed model fits better than the Weibull, gamma and generalized exponential distributions.
The probability distribution model of air pollution index and its dominants in Kuala Lumpur
AL-Dhurafi, Nasr Ahmed; Razali, Ahmad Mahir; Masseran, Nurulkamal; Zamzuri, Zamira Hasanah
2016-11-01
This paper focuses on statistical modeling of the distributions of the air pollution index (API) and its sub-index data observed at Kuala Lumpur in Malaysia. Five pollutants or sub-indexes are measured, including carbon monoxide (CO), sulphur dioxide (SO2), nitrogen dioxide (NO2) and particulate matter (PM10). Four probability distributions are considered, namely log-normal, exponential, gamma and Weibull, in search of the best-fit distribution for the Malaysian air pollutant data. To determine the best distribution for describing the air pollutant data, five goodness-of-fit criteria are applied. This helps to minimize the uncertainty in pollution resource estimates and to improve the assessment phase of planning. The conflict in criterion results for selecting the best distribution was overcome by using the weight-of-ranks method. We found that the gamma distribution is the best distribution for the majority of the air pollutant data in Kuala Lumpur.
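A best-fit search of this kind (fit each candidate distribution, score it, pick the winner) can be sketched briefly. The synthetic data and the use of a single K-S statistic as the score are assumptions for illustration; the paper combines five criteria via the weight-of-ranks method:

```python
import numpy as np
from scipy import stats

# Synthetic positive-valued stand-in for air pollutant index data
data = stats.gamma.rvs(a=2.0, scale=30.0, size=500, random_state=1)

# The four candidate families from the study, keyed by their SciPy names
candidates = {
    "lognorm": stats.lognorm,
    "expon": stats.expon,
    "gamma": stats.gamma,
    "weibull_min": stats.weibull_min,
}
scores = {}
for name, dist in candidates.items():
    params = dist.fit(data, floc=0)                  # location pinned at 0
    scores[name] = stats.kstest(data, name, args=params).statistic

best = min(scores, key=scores.get)                   # smallest K-S statistic wins
```

With several goodness-of-fit criteria instead of one, each criterion produces a ranking and the ranks are summed per distribution, which is the weight-of-ranks idea.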
Directory of Open Access Journals (Sweden)
Daniel Henrique Breda Binoti
2013-09-01
Full Text Available The objective of this study was to evaluate the efficiency of the log-Pearson type V function for describing the diameter structure of even-aged eucalyptus stands, and to propose a diameter distribution model based on this function. Modeling with the log-Pearson type V function was compared with modeling using the Weibull and hyperbolic functions. Data from permanent eucalyptus plots located in the centre-west region of the state of Minas Gerais were used. The Pearson type V function was tested in three different configurations: with three parameters, with two parameters, and with the location parameter replaced by the minimum diameter of the plot. The adherence of the functions to the data was verified by the Kolmogorov-Smirnov (K-S) test. All fits showed adherence to the data by the K-S test. The Weibull and hyperbolic functions outperformed the Pearson type V function.
Néda, Zoltán; Járai-Szabó, Ferenc; Boda, Szilárd
2017-10-01
The Kolmogorov-Johnson-Mehl-Avrami (KJMA) growth model is considered on a one-dimensional (1D) lattice. Cells can grow with constant speed and continuously nucleate on the empty sites. We offer an alternative mean-field-like approach for describing theoretically the dynamics and derive an analytical cell-size distribution function. Our method reproduces the same scaling laws as the KJMA theory and has the advantage that it leads to a simple closed form for the cell-size distribution function. It is shown that a Weibull distribution is appropriate for describing the final cell-size distribution. The results are discussed in comparison with Monte Carlo simulation data.
Experiment research on cognition reliability model of nuclear power plant
International Nuclear Information System (INIS)
Zhao Bingquan; Fang Xiang
1999-01-01
The objective of this paper is to improve the operating reliability of nuclear power plant operators through simulation research on their cognition reliability. A nuclear power plant simulator was used as the research platform, and the current international model of human cognition reliability, based on the three-parameter Weibull distribution, was taken as a reference to develop a research model for Chinese nuclear power plant operators based on the two-parameter Weibull distribution. Using this two-parameter Weibull model of cognition reliability, experiments on the cognition reliability of nuclear power plant operators were carried out. The results agree with those obtained in other countries such as the USA and Hungary, which benefits the safe operation of nuclear power plants
Fitting Statistical Distributions Functions on Ozone Concentration Data at Coastal Areas
International Nuclear Information System (INIS)
Muhammad Yazid Nasir; Nurul Adyani Ghazali; Muhammad Izwan Zariq Mokhtar; Norhazlina Suhaimi
2016-01-01
Ozone is known as one of the pollutants that contribute to the air pollution problem; it is therefore important to study ozone. The objective of this study is to find the best statistical distribution for ozone concentration. Three distributions, namely inverse Gaussian, Weibull and lognormal, were chosen to fit one year of hourly average ozone concentration data recorded in 2010 at Port Dickson and Port Klang. The maximum likelihood estimation (MLE) method was used to estimate the parameters and to develop the probability density function (PDF) and cumulative distribution function (CDF) graphs. Three performance indicators (PI), namely the normalized absolute error (NAE), prediction accuracy (PA) and coefficient of determination (R²), were used to assess the goodness of fit of the distributions. Results show that the Weibull distribution is the best distribution, with the smallest error measure (NAE) at Port Klang and Port Dickson being 0.08 and 0.31, respectively. The best score for the highest adequacy measure (PA: 0.99) comes with R² values of 0.98 (Port Klang) and 0.99 (Port Dickson). These results provide useful information to local authorities for prediction purposes. (author)
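The MLE-fit-then-score workflow above is straightforward to sketch. The synthetic ozone sample and the particular NAE formula are illustrative assumptions; the paper's exact error definition may differ:

```python
import numpy as np
from scipy import stats

# Synthetic stand-in for one year of hourly-average ozone concentrations (ppm)
ozone = stats.weibull_min.rvs(c=2.2, scale=0.04, size=365, random_state=7)

# Two-parameter Weibull MLE fit (location fixed at zero)
c, loc, scale = stats.weibull_min.fit(ozone, floc=0)

# Normalized absolute error between sorted observations and fitted quantiles
# (one common definition of NAE; smaller is better)
obs = np.sort(ozone)
p = (np.arange(1, len(obs) + 1) - 0.5) / len(obs)
pred = stats.weibull_min.ppf(p, c, loc, scale)
nae = np.abs(pred - obs).sum() / obs.sum()
```

Swapping `weibull_min` for `invgauss` or `lognorm` repeats the comparison for the other two candidate families.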
SYVAC3 parameter distribution package
Energy Technology Data Exchange (ETDEWEB)
Andres, T; Skeet, A
1995-01-01
SYVAC3 (Systems Variability Analysis Code, generation 3) is a computer program that implements a method called systems variability analysis to analyze the behaviour of a system in the presence of uncertainty. This method is based on simulating the system many times to determine the variation in behaviour it can exhibit. SYVAC3 specializes in systems representing the transport of contaminants, and has several features to simplify the modelling of such systems. It provides a general tool for estimating environmental impacts from the dispersal of contaminants. This report describes a software object type (a generalization of a data type) called Parameter Distribution. This object type is used in SYVAC3, and can also be used independently. Parameter Distribution has the following subtypes: beta distribution; binomial distribution; constant distribution; lognormal distribution; loguniform distribution; normal distribution; piecewise uniform distribution; triangular distribution; and uniform distribution. Some of these distributions can be altered by correlating two parameter distribution objects. This report provides complete specifications for parameter distributions, and also explains how to use them. It should meet the needs of casual users, reviewers, and programmers who wish to add their own subtypes. (author). 30 refs., 75 tabs., 56 figs.
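A parameter-distribution object type of this kind naturally maps onto a small class hierarchy. The sketch below is a hypothetical re-creation in Python (SYVAC3 itself is not written in Python, and the parameter names are invented), showing three of the nine subtypes and the many-simulation loop of systems variability analysis:

```python
import random
from abc import ABC, abstractmethod

class ParameterDistribution(ABC):
    """Common interface for all parameter-distribution subtypes."""
    @abstractmethod
    def sample(self, rng: random.Random) -> float: ...

class ConstantDistribution(ParameterDistribution):
    def __init__(self, value): self.value = value
    def sample(self, rng): return self.value

class UniformDistribution(ParameterDistribution):
    def __init__(self, lo, hi): self.lo, self.hi = lo, hi
    def sample(self, rng): return rng.uniform(self.lo, self.hi)

class LognormalDistribution(ParameterDistribution):
    def __init__(self, mu, sigma): self.mu, self.sigma = mu, sigma
    def sample(self, rng): return rng.lognormvariate(self.mu, self.sigma)

# Systems variability analysis: run the model many times, each run drawing
# every uncertain parameter from its distribution object (names hypothetical)
rng = random.Random(42)
params = {"release_rate": LognormalDistribution(0.0, 0.5),
          "retardation": UniformDistribution(1.0, 3.0)}
runs = [{k: d.sample(rng) for k, d in params.items()} for _ in range(1000)]
```

The abstract base class is what lets programmers "add their own subtypes": a new distribution only has to implement `sample`.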
Sadhana | Indian Academy of Sciences
Indian Academy of Sciences (India)
In this study, we discuss the development of an inventory model in which the deterioration rate of the item follows a two-parameter Weibull distribution under the effect of selling-price- and time-dependent demand, since not only the selling price but also time is a crucial factor in enhancing market demand as ...
International Nuclear Information System (INIS)
Alavi, Omid; Mohammadi, Kasra; Mostafaeipour, Ali
2016-01-01
Highlights: • Suitability of different wind speed probability functions is assessed. • 5 stations distributed in the east and south-east of Iran are considered as case studies. • The Nakagami distribution is tested for the first time and compared with 7 other functions. • Due to differences in wind features, the best function is not the same for all stations. - Abstract: Precise information on the wind speed probability distribution is truly significant for many wind energy applications. The objective of this study is to evaluate the suitability of different probability functions for estimating the wind speed distribution at five stations distributed in the east and southeast of Iran. The Nakagami distribution function is utilized for the first time to estimate the distribution of wind speed. The performance of the Nakagami function is compared with seven typically used distribution functions. The results reveal that the most effective function is not the same for all stations. Wind speed characteristics and the quantity and quality of the recorded wind speed data can be considered influential parameters for the performance of the distribution functions. Also, the skewness of the recorded wind speed data may influence the accuracy of the Nakagami distribution. For the Chabahar and Khaf stations the Nakagami distribution shows the highest performance, while for the Lutak, Rafsanjan and Zabol stations the Gamma, Generalized Extreme Value and Inverse-Gaussian distributions offer the best fits, respectively. Based on the analysis, the Nakagami distribution can generally be considered an effective distribution, since it provides the best fit at 2 stations and ranks 3rd to 5th at the remaining stations; however, due to the close performance of the Nakagami and Weibull distributions, and the flexibility of the Weibull function as its widely proven feature, more assessment of the performance of the Nakagami distribution is required.
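A head-to-head Nakagami-versus-Weibull comparison of the kind reported above can be sketched with SciPy, where the two families are `nakagami` and `weibull_min`. The synthetic wind sample is an assumption; a Weibull with shape 2 (a Rayleigh) is also a Nakagami with m = 1, so both should fit it comparably well:

```python
import numpy as np
from scipy import stats

# Synthetic wind speeds (m/s): Weibull shape 2 == Rayleigh == Nakagami(m=1)
wind = stats.weibull_min.rvs(c=2.0, scale=7.0, size=2000, random_state=3)

# Fit both families with the location pinned at zero
nak_params = stats.nakagami.fit(wind, floc=0)
wei_params = stats.weibull_min.fit(wind, floc=0)

# Score each fit by its Kolmogorov-Smirnov statistic (smaller is better)
ks_nak = stats.kstest(wind, "nakagami", args=nak_params).statistic
ks_wei = stats.kstest(wind, "weibull_min", args=wei_params).statistic
```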
Fitting diameter distribution models to data from forest inventories with concentric plot design
Energy Technology Data Exchange (ETDEWEB)
Nanos, N.; Sjöstedt de Luna, S.
2017-11-01
Aim: Several national forest inventories use a complex plot design based on multiple concentric subplots, where smaller-diameter trees are inventoried when lying in the smaller-radius subplots and ignored otherwise. Data from these plots are truncated, with threshold (truncation) diameters varying according to the distance from the plot centre. In this paper we designed a maximum likelihood method to fit the Weibull diameter distribution to data from concentric plots. Material and methods: Our method (M1) was based on multiple truncated probability density functions to build the likelihood. In addition, we used an alternative method (M2) presented recently. We used methods M1 and M2, as well as two other reference methods, to estimate the Weibull parameters in 40,000 simulated plots. The spatial tree pattern of the simulated plots was generated using four models of spatial point patterns. Two error indices were used to assess the relative performance of M1 and M2 in estimating relevant stand-level variables. In addition, we estimated the quadratic mean plot diameter (QMD) using expansion factors (EFs). Main results: Methods M1 and M2 produced comparable estimation errors for random and clustered tree spatial patterns. Method M2 produced biased parameter estimates in plots with inhomogeneous Poisson patterns. Estimation of QMD using EFs also produced biased results in plots with inhomogeneous-intensity Poisson patterns. Research highlights: We designed a new method to fit the Weibull distribution to forest inventory data from concentric plots that achieves high accuracy and precision in parameter estimates regardless of the within-plot spatial tree pattern.
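A likelihood built from truncated densities, as in method M1, can be sketched as follows. Each observed diameter contributes f(d)/S(t), where t is its subplot's truncation threshold; the thresholds, sample sizes and true parameters below are illustrative assumptions, not the paper's simulation design:

```python
import numpy as np
from scipy import stats, optimize

# Left-truncated Weibull negative log-likelihood: a diameter d_i is observed
# only if it exceeds its subplot's threshold t_i, so its density is f(d)/S(t)
def neg_log_lik(theta, d, t):
    c, scale = theta
    if c <= 0 or scale <= 0:
        return np.inf
    return -(stats.weibull_min.logpdf(d, c, scale=scale)
             - stats.weibull_min.logsf(t, c, scale=scale)).sum()

rng = np.random.default_rng(5)
t_all = rng.choice([7.5, 12.5], size=4000)           # two concentric thresholds (cm)
d_all = stats.weibull_min.rvs(c=2.5, scale=20.0, size=4000, random_state=rng)
keep = d_all > t_all                                  # small trees in outer rings are lost
d, t = d_all[keep], t_all[keep]

res = optimize.minimize(neg_log_lik, x0=[2.0, 15.0], args=(d, t),
                        method="Nelder-Mead")
c_hat, scale_hat = res.x                              # should recover ~(2.5, 20.0)
```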
ESTIMATION ACCURACY OF EXPONENTIAL DISTRIBUTION PARAMETERS
Directory of Open Access Journals (Sweden)
muhammad zahid rashid
2011-04-01
Full Text Available The exponential distribution is commonly used to model the behavior of units that have a constant failure rate. The two-parameter exponential distribution provides a simple but nevertheless useful model for the analysis of lifetimes, especially when investigating the reliability of technical equipment. This paper is concerned with the estimation of the parameters (location and scale) of the two-parameter exponential distribution. We used the least squares method (LSM), relative least squares method (RELS), ridge regression method (RR), moment estimators (ME), modified moment estimators (MME), maximum likelihood estimators (MLE) and modified maximum likelihood estimators (MMLE). We used the mean square error (MSE) and total deviation (TD) as measures for the comparison between these methods, and determined the best estimation method using different values of the parameters and different sample sizes
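Two of the listed estimators have simple closed forms for the two-parameter (location-scale) exponential, which the sketch below compares on synthetic data; the true parameter values and sample size are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(11)
mu, theta = 5.0, 2.0                        # true location and scale
x = mu + rng.exponential(theta, size=1000)

# Maximum likelihood estimators: the sample minimum and the excess mean
mu_mle = x.min()
theta_mle = x.mean() - x.min()

# Moment estimators, from mean = mu + theta and sd = theta
theta_me = x.std(ddof=1)
mu_me = x.mean() - theta_me

# Squared-error comparison against the true values, as in an MSE study
mse_mle = (mu_mle - mu) ** 2 + (theta_mle - theta) ** 2
mse_me = (mu_me - mu) ** 2 + (theta_me - theta) ** 2
```

Repeating the draw many times and averaging the squared errors gives the Monte Carlo MSE comparison the paper performs across methods and sample sizes.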
Energy Technology Data Exchange (ETDEWEB)
Leiva, R.; Donoso, J. R.; Muehlich, U.; Labbe, F.
2004-07-01
The effect of mismatched weld metal on the stress field close to the crack tip was numerically analyzed for an idealized weld joint made up of base metal (BM) and weld metal (WM), with the crack located in the WM, parallel to the BM/WM interface. The analysis was performed with a J-Q type two-parameter approach using a modified boundary layer (MBL) model subject to a remote displacement field solely controlled by K₁, in order to eliminate the effect of geometry constraint. The numerical results show that the constraint level decreases for overmatched welds (yield stress of WM higher than that of BM) and increases for undermatched welds (yield stress of WM lower than that of BM). The constraint level depends on the degree of mismatch, the width of the weld, and the applied load level. (Author) 21 refs.
Modeling wind speed and wind power distributions in Rwanda
Energy Technology Data Exchange (ETDEWEB)
Safari, Bonfils [Department of Physics, National University of Rwanda, P.O. Box 117, Huye District, South Province (Rwanda)
2011-02-15
Utilization of wind energy as an alternative energy source may offer many environmental and economic advantages compared to fossil-fuel-based energy sources, which pollute the lower layers of the atmosphere. Wind energy, like other forms of alternative energy, offers promise for meeting energy demand in direct, grid-connected modes as well as in stand-alone and remote applications. Wind speed is the most significant parameter of wind energy; hence, an accurate determination of the probability distribution of wind speed values is very important in estimating the wind energy potential of a region. In the present study, the parameters of five probability density functions, namely Weibull, Rayleigh, lognormal, normal and gamma, were calculated in the light of long-term hourly observed data at four meteorological stations in Rwanda, for the period of the year with fairly useful wind energy potential (monthly hourly mean wind speed ≥ 2 m s⁻¹). In order to select well-fitting probability density functions, graphical comparisons with the empirical distributions were made. In addition, the RMSE and MBE were computed for each distribution and the magnitudes of the errors compared. Residuals of the theoretical distributions were analyzed graphically. Finally, three distributions fitting the empirical distribution of the measured wind speed data well were selected with the aid of a χ² goodness-of-fit test for each station. (author)
International Nuclear Information System (INIS)
Caleyo, F.; Velazquez, J.C.; Valor, A.; Hallen, J.M.
2009-01-01
The probability distributions of external-corrosion pit depth and pit growth rate were investigated in underground pipelines using Monte Carlo simulations. The study combines a predictive pit growth model developed by the authors with the observed distributions of the model variables in a range of soils. Depending on the pipeline age, any of the three maximal extreme value distributions, i.e. Weibull, Frechet or Gumbel, can arise as the best fit to the pitting depth and rate data. The Frechet distribution best fits the corrosion data for long exposure periods. This can be explained by considering the long-term stabilization of the diffusion-controlled pit growth. The findings of the study provide reliability analysts with accurate information regarding the stochastic characteristics of the pitting damage in underground pipelines.
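The three maximal extreme value families mentioned above (Weibull, Frechet, Gumbel) are the shape-parameter cases of the generalized extreme value (GEV) distribution, so one fit can classify the data. The synthetic pit-depth maxima and the classification threshold below are illustrative assumptions; note SciPy's sign convention, where a negative shape is Frechet-type:

```python
import numpy as np
from scipy import stats

# Synthetic yearly maxima of pit depth (mm); negative shape in SciPy's
# convention gives a heavy Frechet-type tail, as for long exposure periods
pit_max = stats.genextreme.rvs(c=-0.2, loc=2.0, scale=0.5, size=300,
                               random_state=9)

shape, loc, scale = stats.genextreme.fit(pit_max)

# shape < 0: Frechet-type tail; shape > 0: (reversed) Weibull; shape ~ 0: Gumbel
family = "Frechet" if shape < -0.05 else ("Weibull" if shape > 0.05 else "Gumbel")
```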
Chang, Wen-Ruey; Matz, Simon; Chang, Chien-Chi
2014-05-01
The maximum coefficient of friction that can be supported at the shoe-floor interface without a slip is usually called the available coefficient of friction (ACOF) for human locomotion. The probability of a slip could be estimated using a statistical model by comparing the ACOF with the required coefficient of friction (RCOF), assuming that both coefficients have stochastic distributions. An investigation of the stochastic distributions of the ACOF of five different floor surfaces under dry, water and glycerol conditions is presented in this paper. One hundred friction measurements were performed on each floor surface under each surface condition. The Kolmogorov-Smirnov goodness-of-fit test was used to determine if the distribution of the ACOF was a good fit with the normal, log-normal and Weibull distributions. The results indicated that the ACOF distributions matched the normal and log-normal distributions slightly better than the Weibull, in only three out of 15 cases with statistical significance. The results are far more complex than what had heretofore been published, and different scenarios could emerge. Since the ACOF is compared with the RCOF for the estimate of slip probability, the distribution of the ACOF in seven cases could be considered a constant for this purpose when the ACOF is much lower or higher than the RCOF. A few cases could be represented by a normal distribution for practical reasons, based on their skewness and kurtosis values, without statistical significance. No representation could be found in three cases out of 15. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Energy Technology Data Exchange (ETDEWEB)
Vendramini, Elisa Zanuncio
1986-10-01
Theoretical models of wind speed distributions provide valuable information about the probability of events relating to the variable under study, eliminating the need for a new experiment. The most used distributions have been the Weibull and the Rayleigh. These distributions are examined in the present investigation, as well as the exponential, gamma, chi-square and lognormal distributions. Three years of hourly average wind data recorded by an anemometer sited at the city of Ataliba Leonel, Sao Paulo State, Brazil, were used. The theoretical relative frequencies were calculated from each of the examined distributions. Results from the Kolmogorov-Smirnov test allow the conclusion that the lognormal distribution fits the wind speed data best, followed by the gamma and Rayleigh distributions. Using the lognormal probability density function, the yearly energy output of a wind generator installed at the site was calculated. 30 refs, 4 figs, 14 tabs
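Once a density has been selected, the energy calculation follows from the third moment of the wind speed: the mean wind power density per unit rotor area is P = ½ρE[v³]. A sketch with an illustrative lognormal fit (the parameter values are assumptions, not the station's):

```python
import numpy as np
from scipy import stats, integrate

rho = 1.225                     # air density, kg/m^3
mu_ln, sigma_ln = 1.8, 0.5      # illustrative lognormal parameters of ln(v)

# Mean wind power density (W/m^2): P = 0.5 * rho * E[v^3], by numerical quadrature
f = lambda v: stats.lognorm.pdf(v, sigma_ln, scale=np.exp(mu_ln))
Ev3, _ = integrate.quad(lambda v: v ** 3 * f(v), 0, np.inf)
power_density = 0.5 * rho * Ev3

# Cross-check against the lognormal closed form E[v^3] = exp(3*mu + 4.5*sigma^2)
Ev3_closed = np.exp(3 * mu_ln + 4.5 * sigma_ln ** 2)
```

Multiplying the power density by rotor swept area, hours per year and a turbine efficiency factor yields the yearly energy output estimate.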
International Nuclear Information System (INIS)
El-Shanshoury, Gh.I.
2015-01-01
Assessing the adequacy of probability distributions for estimating extreme events of air temperature in the Dabaa region is one of the prerequisites for any design purpose at the Dabaa site, and can be achieved by a probability approach. In the present study, three extreme value distributions are considered and compared to estimate the extreme events of monthly and annual maximum and minimum temperature. These include the Gumbel/Frechet distributions for estimating the extreme maximum values and the Gumbel/Weibull distributions for estimating the extreme minimum values. The Lieblein technique and the Method of Moments are applied for estimating the distribution parameters. Subsequently, the required design values with a given return period of exceedance are obtained. Goodness-of-fit tests involving Kolmogorov-Smirnov and Anderson-Darling are used for checking the adequacy of fitting the method/distribution for the estimation of maximum/minimum temperature. The Mean Absolute Relative Deviation, Root Mean Square Error and Relative Mean Square Deviation are calculated as performance indicators to judge which distribution and method of parameter estimation are the most appropriate for estimating the extreme temperatures. The present study indicated that the Weibull distribution combined with Method of Moments estimators gives the best fit and the most reliable and accurate predictions for the extreme monthly and annual minimum temperature. The Gumbel distribution combined with Method of Moments estimators showed the best fit and accurate predictions for the extreme monthly and annual maximum temperature, except for July, August, October and November. The study shows that the combination of the Frechet distribution with the Method of Moments is the most accurate for estimating the extreme maximum temperature in July, August and November, while the Gumbel distribution with the Lieblein technique is the best for October
Directory of Open Access Journals (Sweden)
A. Stankovic
2012-12-01
Full Text Available The distributions of random variables are of interest in many areas of science. In this paper, recognizing the importance of multi-hop transmission in contemporary wireless communication systems operating over fading channels in the presence of co-channel interference, the probability density functions (PDFs) of the minimum of an arbitrary number of ratios of Rayleigh, Rician, Nakagami-m, Weibull and α-µ random variables are derived. These expressions can be used to study the outage probability as an important multi-hop system performance measure. Various numerical results complement the proposed mathematical analysis.
Misra, Aalok
2008-01-01
We consider issues of moduli stabilization and "area codes" for type II flux compactifications, and the "Inverse Problem" and "Fake Superpotentials" for extremal (non)supersymmetric black holes in type II compactifications on (orientifold of) a compact two-parameter Calabi-Yau expressed as a degree-18 hypersurface in WCP^4[1,1,1,6,9] which has multiple singular loci in its moduli space. We argue the existence of extended "area codes" [1] wherein for the same set of large NS-NS and RR fluxes, one can stabilize all the complex structure moduli and the axion-dilaton modulus (to different sets of values) for points in the moduli space away as well as near the different singular conifold loci leading to the existence of domain walls. By including non-perturbative alpha' and instanton corrections in the Kaehler potential and superpotential [2], we show the possibility of getting a large-volume non-supersymmetric (A)dS minimum. Further, using techniques of [3] we explicitly show that given a set of moduli and choice...
Product of Ginibre matrices: Fuss-Catalan and Raney distributions
Penson, Karol A.; Życzkowski, Karol
2011-06-01
Squared singular values of a product of s square random Ginibre matrices are asymptotically characterized by probability distributions Ps(x), such that their moments are equal to the Fuss-Catalan numbers of order s. We find a representation of the Fuss-Catalan distributions Ps(x) in terms of a combination of s hypergeometric functions of the type ₛFₛ₋₁. The explicit formula derived here is exact for an arbitrary positive integer s, and for s=1 it reduces to the Marchenko-Pastur distribution. Using similar techniques, involving the Mellin transform and the Meijer G function, we find exact expressions for the Raney probability distributions, the moments of which are given by a two-parameter generalization of the Fuss-Catalan numbers. These distributions can also be considered as a two-parameter generalization of the Wigner semicircle law.
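The moment identity is easy to check numerically: the sketch below compares the Fuss-Catalan numbers FC_s(n) = binom(sn+n, n)/(sn+1) with Monte Carlo moments of the squared singular values of a product of s Ginibre matrices (the matrix size and normalization are illustrative choices):

```python
import numpy as np
from math import comb

def fuss_catalan(s, n):
    """Fuss-Catalan number FC_s(n) = binom(s*n + n, n) / (s*n + 1);
    s = 1 gives the Catalan numbers (Marchenko-Pastur moments)."""
    return comb(s * n + n, n) // (s * n + 1)

# Product of s complex Ginibre matrices with entries of variance 1/N
rng = np.random.default_rng(2)
N, s = 200, 2
G = np.eye(N, dtype=complex)
for _ in range(s):
    G = G @ ((rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N)))
             / np.sqrt(2 * N))

x = np.linalg.svd(G, compute_uv=False) ** 2        # squared singular values
moments = [np.mean(x ** n) for n in (1, 2, 3)]     # should approach FC_2(n) = 1, 3, 12
```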
Escort entropies and divergences and related canonical distribution
International Nuclear Information System (INIS)
Bercher, J.-F.
2011-01-01
We discuss two families of two-parameter entropies and divergences, derived from the standard Renyi and Tsallis entropies and divergences. These divergences and entropies are found as divergences or entropies of escort distributions. Exploiting the nonnegativity of the divergences, we derive the expression of the canonical distribution associated with the new entropies and an observable given as an escort mean value. We show that this canonical distribution extends, and smoothly connects, the results obtained in nonextensive thermodynamics for the standard and generalized mean value constraints. -- Highlights: → Two-parameter entropies are derived from q-entropies and escort distributions. → The related canonical distribution is derived. → This connects and extends known results in nonextensive statistics.
Bayesian analysis of general failure data from an ageing distribution: advances in numerical methods
International Nuclear Information System (INIS)
Procaccia, H.; Villain, B.; Clarotti, C.A.
1996-01-01
EDF and ENEA carried out a joint research program to develop the numerical methods and computer codes needed for Bayesian analysis of component lives in the case of ageing. Early results of this study were presented at ESREL'94. Since then, the following further steps have been taken: input data have been generalized to the case where observed lives are censored both on the right and on the left; allowable life distributions are Weibull and gamma, whose parameters are both unknown and can be statistically dependent; allowable priors are histograms relative to different parametrizations of the life distribution of concern; and first- and second-order moments of the posterior distributions can be computed. In particular, the covariance gives important information about the degree of statistical dependence between the parameters of interest. An application of the code to the appearance of stress corrosion cracking in a tube of a PWR steam generator system is presented. (authors)
Multi-choice stochastic transportation problem involving general form of distributions.
Quddoos, Abdul; Ull Hasan, Md Gulzar; Khalid, Mohammad Masood
2014-01-01
Many authors have presented studies of the multi-choice stochastic transportation problem (MCSTP) where the availability and demand parameters follow a particular probability distribution (such as exponential, Weibull, Cauchy or extreme value). In this paper an MCSTP is considered where the availability and demand parameters follow a general form of distribution, and a generalized equivalent deterministic model (GMCSTP) of the MCSTP is obtained. It is also shown that all previous models obtained by different authors can be deduced with the help of the GMCSTP. MCSTPs with Pareto, power function or Burr-XII distributions are also considered and equivalent deterministic models are obtained. To illustrate the proposed model, two numerical examples are presented and solved using the LINGO 13.0 software package.
Hu, Shuhua; Dunlavey, Michael; Guzy, Serge; Teuscher, Nathan
2018-04-01
A distributed delay approach was proposed in this paper to model delayed outcomes in pharmacokinetics and pharmacodynamics studies. This approach was shown to be general enough to incorporate a wide array of pharmacokinetic and pharmacodynamic models as special cases including transit compartment models, effect compartment models, typical absorption models (either zero-order or first-order absorption), and a number of atypical (or irregular) absorption models (e.g., parallel first-order, mixed first-order and zero-order, inverse Gaussian, and Weibull absorption models). Real-life examples were given to demonstrate how to implement distributed delays in Phoenix ® NLME™ 8.0, and to numerically show the advantages of the distributed delay approach over the traditional methods.
Comparison of the Gini and Zenga Indexes using Some Theoretical Income Distributions
Directory of Open Access Journals (Sweden)
Katarzyna Ostasiewicz
2013-01-01
The most common measure of inequality used in scientific research is the Gini index. In 2007, Zenga proposed a new index of inequality that has all the appropriate properties of a measure of inequality. In this paper, we compare the Gini and Zenga indexes, calculating both quantities for several distributions frequently used to approximate income distributions, namely the lognormal, gamma, inverse Gaussian, Weibull and Burr distributions. Within this limited examination, we observed three main differences. First, the Zenga index increases more rapidly for low values of the variation and decreases more slowly when the variation approaches intermediate values from above. Second, the Zenga index seems to be better predicted by the variation. Third, although the Zenga index is always higher than the Gini index, the ordering of some pairs of cases may be inverted. (original abstract)
Directory of Open Access Journals (Sweden)
Daniel Henrique Breda Binoti
2012-04-01
The objectives of this study were to fit and evaluate different forms of the Weibull and hyperbolic functions for describing the diameter structure of agrosilvopastoral systems. The forest component was established with hybrid eucalypt clones at a spacing of 10 x 4 m, aiming at wood production for energy and sawmilling. Annual crops were planted between the rows, rice in the first year and soybean in the second. From the following year onwards, brachiaria pastures were formed and managed for fattening beef cattle. The two functions were tested with three parameters, with two parameters, with the location parameter set to the minimum diameter, and in right-truncated form. The goodness of fit of each function to the data was evaluated by the Kolmogorov-Smirnov test. The functions were also evaluated by the sum of squared deviations (SSD) and by graphical analysis of observed versus estimated values. The results indicated that both functions can be used to describe the diameter distribution of thinned stands.
Analysis of Flexural Fatigue Strength of Self Compacting Fibre Reinforced Concrete Beams
Murali, G.; Sudar Celestina, J. P. Arul; Subhashini, N.; Vigneshwari, M.
2017-07-01
This study presents an extensive statistical investigation of variations in the flexural fatigue life of self-compacting fibrous concrete (FC) beams. For this purpose, the experimental data of earlier researchers were examined using the two-parameter Weibull distribution. Two methods, the graphical method and the method of moments, were used to analyse the variations in the experimental data, and the results are presented in the form of probability of survival. The Weibull parameter values obtained from the graphical method and the method of moments are in close agreement. At a stress level of 0.7, the fatigue life is 59,861 cycles for a reliability of 90%.
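The method-of-moments estimation named in this abstract can be sketched for a two-parameter Weibull: the coefficient of variation depends only on the shape parameter k, so k can be solved for numerically and the scale c recovered from the sample mean. A minimal Python sketch (not the authors' code; it assumes SciPy is available and uses synthetic data):

```python
import numpy as np
from scipy.special import gamma
from scipy.optimize import brentq

def weibull_moments_fit(x):
    """Estimate Weibull shape k and scale c by the method of moments."""
    x = np.asarray(x, dtype=float)
    mean, std = x.mean(), x.std(ddof=1)
    cv = std / mean
    # The coefficient of variation of a Weibull depends only on k:
    # CV(k) = sqrt(Gamma(1 + 2/k) / Gamma(1 + 1/k)^2 - 1)
    f = lambda k: np.sqrt(gamma(1 + 2 / k) / gamma(1 + 1 / k) ** 2 - 1) - cv
    k = brentq(f, 0.1, 50.0)          # root-find the shape parameter
    c = mean / gamma(1 + 1 / k)       # recover the scale from the mean
    return k, c

# Synthetic sample with known parameters, to check the estimator.
rng = np.random.default_rng(0)
sample = 9.16 * rng.weibull(2.05, size=5000)  # scale c = 9.16, shape k = 2.05
k_hat, c_hat = weibull_moments_fit(sample)
print(k_hat, c_hat)
```

The graphical (probability-plot) method mentioned in the abstract would instead regress log-transformed empirical CDF values against log fatigue life.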
Log-concavity property for some well-known distributions
Directory of Open Access Journals (Sweden)
G. R. Mohtashami Borzadaran
2011-12-01
Interesting properties and propositions in many branches of science, such as economics, have been obtained from the concavity of the cumulative distribution function of a random variable. Caplin and Nalebuff (1988, 1989), Bagnoli and Khanna (1989) and Bagnoli and Bergstrom (1989, 2005) have discussed the log-concavity property of probability distributions and their applications, especially in economics. Log-concavity concerns a twice-differentiable real-valued function g whose domain is an interval on the extended real line. The function g is said to be log-concave on the interval (a,b) if ln(g) is a concave function on (a,b). Log-concavity of g on (a,b) is equivalent to g'/g being monotone decreasing on (a,b), i.e. to (ln g)'' ≤ 0. Previous authors have obtained log-concavity for distributions such as the normal, logistic, extreme-value, exponential, Laplace, Weibull, power function, uniform, gamma, beta, Pareto, log-normal, Student's t, Cauchy and F distributions. We discuss and introduce the continuous versions of the Pearson family, establish log-concavity for this family in general, and then obtain the log-concavity property for each distribution that is a member of the Pearson family. The same cases have been worked out for the Burr family, including each distribution that belongs to it. Log-concavity results are also obtained for generalized gamma distributions, Feller-Pareto distributions, generalized inverse Gaussian distributions and generalized log-normal distributions.
Directory of Open Access Journals (Sweden)
Amir Hossein Mirzabe
2018-06-01
The sunflower petal is one of the parts of the sunflower that has drawn attention and has several applications these days. These applications justify obtaining information about physical properties, mechanical properties, drying trends, etc., in order to design new machines and use new methods to harvest or dry sunflower petals. For three varieties of sunflower, the picking force of the petals was measured; the number of petals on each head was counted; the unit mass and 1000-unit mass of fresh petals were measured; and the length, width, and projected area of fresh petals were calculated using an image-processing technique. The frequency distributions of these parameters were modelled using the Gamma, Generalized Extreme Value (G.E.V.), Lognormal, and Weibull statistical distributions. The picking-force results showed that, as the number of days after the appearance of the first petal on each head increased from 5 to 14 and the loading rate decreased from 150 g min−1 to 50 g min−1, the picking force decreased for all three varieties, but the diameter of the sunflower head affected the picking force differently for each variety. The length, width, and number of petals of the Dorsefid variety ranged from 38.52 to 95.44 mm, 3.80 to 9.28 mm and 29 to 89, respectively. The corresponding values ranged from 34.19 to 88.18 mm, 4.28 to 10.60 mm and 21 to 89 for the Shamshiri variety, and from 44.47 to 114.63 mm, 7.03 to 20.31 mm and 29 to 89 for the Sirena variety. The frequency-distribution modelling indicated that, in most cases, the G.E.V. and Weibull distributions performed better than the other distributions. Keywords: Sunflower (Helianthus annuus L.) petal, Picking force, Image processing, Fibonacci sequence, Lucas sequence
International Nuclear Information System (INIS)
Genet, Martin; Couegnat, Guillaume; Tomsia, Antoni P.; Ritchie, Robert O.
2014-01-01
This paper presents an approach to predict the strength distribution of quasi-brittle materials across multiple length-scales, with emphasis on Nature-inspired ceramic structures. It permits the computation of the failure probability of any structure under any mechanical load, solely based on considerations of the microstructure and its failure properties by naturally incorporating the statistical and size-dependent aspects of failure. We overcome the intrinsic limitations of single periodic unit-based approaches by computing the successive failures of the material components and associated stress redistributions on arbitrary numbers of periodic units. For large size samples, the microscopic cells are replaced by a homogenized continuum with equivalent stochastic and damaged constitutive behavior. After establishing the predictive capabilities of the method, and illustrating its potential relevance to several engineering problems, we employ it in the study of the shape and scaling of strength distributions across differing length-scales for a particular quasi-brittle system. We find that the strength distributions display a Weibull form for samples of size approaching the periodic unit; however, these distributions become closer to normal with further increase in sample size before finally reverting to a Weibull form for macroscopic sized samples. In terms of scaling, we find that the weakest link scaling applies only to microscopic, and not macroscopic scale, samples. These findings are discussed in relation to failure patterns computed at different size-scales. (authors)
Scaling theory of quantum resistance distributions in disordered systems
International Nuclear Information System (INIS)
Jayannavar, A.M.
1991-01-01
The large-scale distribution of the quantum Ohmic resistance of a disordered one-dimensional conductor is derived explicitly. It is shown that in the thermodynamic limit this distribution is characterized by two independent parameters for strong disorder, leading to a two-parameter scaling theory of localization. Only in the limit of weak disorder is single-parameter scaling, consistent with existing theoretical treatments, recovered. (author). 33 refs., 4 figs
Scaling theory of quantum resistance distributions in disordered systems
International Nuclear Information System (INIS)
Jayannavar, A.M.
1990-05-01
We have derived explicitly the large-scale distribution of the quantum Ohmic resistance of a disordered one-dimensional conductor. We show that in the thermodynamic limit this distribution is characterized by two independent parameters for strong disorder, leading to a two-parameter scaling theory of localization. Only in the limit of weak disorder do we recover single-parameter scaling, consistent with existing theoretical treatments. (author). 32 refs, 4 figs
Distribution functions for the linear region of the S-N curve
Energy Technology Data Exchange (ETDEWEB)
Mueller, Christian; Waechter, Michael; Masendorf, Rainer; Esderts, Alfons [TU Clausthal, Clausthal-Zellerfeld (Germany). Inst. for Plant Engineering and Fatigue Analysis
2017-08-01
This study establishes a database containing the results of fatigue tests from the linear region of the S-N curve using sources from the literature. Each set of test results originates from testing metallic components at a single load level. Eighty-nine test series with sample sizes of 14 ≤ n ≤ 500 are included in the database, giving a total of 6,086 individual test results. The test series are examined in terms of the type of distribution function (log-normal or two-parameter Weibull) using the Shapiro-Wilk test, the Anderson-Darling test and probability plots. The majority of the individual test results follow a log-normal distribution.
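The distribution tests named above can be reproduced in outline with SciPy: applying the Shapiro-Wilk and Anderson-Darling tests to the logarithm of the fatigue lives tests the log-normal hypothesis. A sketch using synthetic data, not the study's database:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic fatigue lives (cycles); a log-normal stand-in for one test series.
lives = rng.lognormal(mean=12.0, sigma=0.4, size=60)

log_lives = np.log(lives)

# Shapiro-Wilk on log(lives): a large p-value is consistent with log-normality.
w_stat, p_sw = stats.shapiro(log_lives)

# Anderson-Darling against the normal family, also on the log scale.
ad = stats.anderson(log_lives, dist='norm')

print('Shapiro-Wilk p =', round(p_sw, 3))
print('A-D statistic =', round(ad.statistic, 3),
      '5% critical value =', ad.critical_values[2])
```

Testing the two-parameter Weibull hypothesis works the same way after a `weibull_min` fit; probability plots correspond to `stats.probplot`.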
International Nuclear Information System (INIS)
Ayodele, T.R.; Jimoh, A.A.; Munda, J.L.; Agee, J.T.
2012-01-01
Highlights: ► We evaluate the capacity factor of some commercially available wind turbines. ► Wind speed at the sites studied can best be modelled using the Weibull distribution. ► Site WM05 has the highest wind power potential while site WM02 has the lowest. ► More wind power can be harnessed during the day than at night. ► Turbine K seems to be the best turbine for the coastal region of South Africa. - Abstract: The operating curve parameters of a wind turbine should match the local wind regime optimally to ensure maximum exploitation of the available energy in a mass of moving air. This paper provides estimates of the capacity factor of 20 commercially available wind turbines, based on the local wind characteristics of ten different sites located in the Western Cape region of South Africa. Ten-minute average time-series wind-speed data for a period of 1 year are used for the study. First, the wind distribution that best models the local wind regime of the sites is determined. This is based on the root mean square error (RMSE) and the coefficient of determination (R²), which are used to test goodness of fit. Next, the annual, seasonal, diurnal and peak-period capacity factors are estimated analytically. Then, the influence of the turbine power-curve parameters on the capacity factor is investigated. Some of the key results show that the wind distribution of every site can best be modelled statistically using the Weibull distribution. Site WM05 (Napier) presents the highest capacity factor for all the turbines. This indicates that this site has the highest wind power potential of all the available sites. Site WM02 (Calvinia) has the lowest capacity factor, i.e. the lowest wind power potential. This paper can assist in the planning and development of large-scale wind power-generating sites in South Africa.
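The analytical capacity-factor estimate described above amounts to integrating a turbine power curve against the site's Weibull wind-speed density. A minimal sketch with an idealized cubic-ramp power curve; the cut-in, rated and cut-out speeds are illustrative assumptions, not values from the paper:

```python
import numpy as np

def capacity_factor(k, c, v_ci=3.0, v_r=12.0, v_co=25.0):
    """Capacity factor of an idealized turbine under Weibull wind speeds.

    Power curve: zero below cut-in v_ci, cubic ramp between v_ci and
    rated speed v_r, rated power up to cut-out v_co. k, c are the
    Weibull shape and scale (m/s) of the site's wind-speed distribution.
    """
    v = np.linspace(0.0, 30.0, 3001)
    dv = v[1] - v[0]
    pdf = (k / c) * (v / c) ** (k - 1) * np.exp(-((v / c) ** k))
    power = np.where(
        (v >= v_ci) & (v < v_r),
        (v**3 - v_ci**3) / (v_r**3 - v_ci**3),   # normalized cubic ramp
        np.where((v >= v_r) & (v <= v_co), 1.0, 0.0),
    )
    # Rectangle-rule integral of power-weighted density.
    return float(np.sum(power * pdf) * dv)

# Illustrative Weibull parameters similar in magnitude to coastal sites.
cf = capacity_factor(k=2.05, c=9.16)
print(round(cf, 3))
```

Repeating the calculation with each turbine's actual power-curve parameters and each site's fitted (k, c) pair is what allows the site/turbine ranking described in the abstract.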
Houghton, J.C.
1988-01-01
The truncated shifted Pareto (TSP) distribution, a variant of the two-parameter Pareto distribution in which one parameter is added to shift the distribution right and left and the right-hand side is truncated, is used to model size distributions of oil and gas fields for resource assessment. Assumptions about limits to the left-hand and right-hand sides reduce the number of parameters to two. The TSP distribution has advantages over the more customary lognormal distribution because it has a simple analytic expression, allowing exact computation of several statistics of interest, has a "J-shape," and has more flexibility in the thickness of the right-hand tail. Oil field sizes from the Minnelusa play in the Powder River Basin, Wyoming and Montana, are used as a case study. Probability plotting procedures allow easy visualization of the fit and help the assessment. © 1988 International Association for Mathematical Geology.
Directory of Open Access Journals (Sweden)
Joel C. da Silva
2007-02-01
The objectives of this study were to analyze the distribution of daily rainfall totals and the number of rainy days, and to determine the variation in the probability of occurrence of daily precipitation during the months of the year in Santa Maria, Rio Grande do Sul State, Brazil. A 36-year rainfall database recorded at the Climatological Station of the 8th District of Meteorology, located at the Federal University of Santa Maria (29º 43' 23" S, 53º 43' 15" W, altitude 95 m), was used in the study. The following probability distribution functions were tested: gamma, Weibull, normal, log-normal and exponential. The functions that best described the frequency distribution were the gamma and Weibull. The largest number of rainy days occurred during the winter months, but with less rainfall on those days, resulting in similar monthly precipitation totals for all twelve months of the year.
Directory of Open Access Journals (Sweden)
Ítalo Nunes Silva
2013-09-01
Seven probability distributions were analysed for monthly and annual rainfall in the south-central region of Ceará, Brazil: exponential, gamma, log-normal, normal, Weibull, Gumbel and beta. To verify the fit of the data to the probability density functions, the non-parametric Kolmogorov-Smirnov test was used at the 5% level of significance. The rainfall data were obtained from the SUDENE database, recorded from 1913 to 1989. For the total annual rainfall, the fit of the data to the gamma, Gumbel, normal and Weibull distributions was satisfactory, and there was no fit to the exponential, log-normal and beta distributions. Use of the normal distribution is recommended to estimate probable annual rainfall values for the region, this being a procedure of easy application that also performed well in the tests. The Gumbel frequency distribution best represented the monthly rainfall data, with the largest number of fits in the rainy season. In the dry season the rainfall data were best represented by the exponential distribution.
Kroese, A.H.; van der Meulen, E.A.; Poortema, Klaas; Schaafsma, W.
1995-01-01
The making of statistical inferences in distributional form is conceptually complicated because the epistemic 'probabilities' assigned are mixtures of fact and fiction. In this respect they are essentially different from 'physical' or 'frequency-theoretic' probabilities. The distributional form is
Comprehensive evaluation of wind speed distribution models: A case study for North Dakota sites
International Nuclear Information System (INIS)
Zhou Junyi; Erdem, Ergin; Li Gong; Shi Jing
2010-01-01
Accurate analysis of long-term wind data is critical to the estimation of wind energy potential for a candidate location and its nearby area, and investigating the wind speed distribution is one critical task for this purpose. This paper presents a comprehensive evaluation of probability density functions for the wind speed data from five representative sites in North Dakota. Besides the popular Weibull and Rayleigh distributions, we also include other distributions such as the gamma, lognormal, inverse Gaussian, and maximum entropy principle (MEP) derived probability density functions (PDFs). Six goodness-of-fit (GOF) statistics are used to determine the appropriate distributions for the wind speed data for each site. It is found that no particular distribution outperforms the others for all five sites, while the Rayleigh distribution performs poorly for most of the sites. Similar to the other models, the performance of MEP-derived PDFs in fitting wind speed data varies from site to site. Also, the results demonstrate that MEP-derived PDFs are flexible and have the potential to capture other possible distribution patterns of wind speed data. Meanwhile, different GOF statistics may generate inconsistent ranking orders of fit performance among the candidate PDFs. In addition, one comprehensive metric that combines all individual statistics is proposed to rank the overall performance of the chosen statistical distributions.
Fissure formation in coke. 3: Coke size distribution and statistical analysis
Energy Technology Data Exchange (ETDEWEB)
D.R. Jenkins; D.E. Shaw; M.R. Mahoney [CSIRO, North Ryde, NSW (Australia). Mathematical and Information Sciences
2010-07-15
A model of coke stabilization, based on a fundamental model of fissuring during carbonisation is used to demonstrate the applicability of the fissuring model to actual coke size distributions. The results indicate that the degree of stabilization is important in determining the size distribution. A modified form of the Weibull distribution is shown to provide a better representation of the whole coke size distribution compared to the Rosin-Rammler distribution, which is generally only fitted to the lump coke. A statistical analysis of a large number of experiments in a pilot scale coke oven shows reasonably good prediction of the coke mean size, based on parameters related to blend rank, amount of low rank coal, fluidity and ash. However, the prediction of measures of the spread of the size distribution is more problematic. The fissuring model, the size distribution representation and the statistical analysis together provide a comprehensive capability for understanding and predicting the mean size and distribution of coke lumps produced during carbonisation. 12 refs., 16 figs., 4 tabs.
International Nuclear Information System (INIS)
Zhang, R.L.; Liu, Y.; Huang, Y.D.; Liu, L.
2013-01-01
The effect of the particle size and distribution of the sizing agent on the performance of carbon fiber and carbon fiber composites has been investigated. Atomic force microscopy (AFM) and scanning electron microscopy (SEM) were used to characterize the carbon fiber surface topographies. At the same time, the single-fiber strength and its Weibull distribution were also studied in order to investigate the effect of the coatings on the fibers. The interfacial shear strength and hygrothermal ageing of the carbon fiber/epoxy resin composites were also measured. The results indicated that the particle size and distribution are important for improving the surface of carbon fibers and the performance of their composites. Different particle sizes and distributions of the sizing agent contribute differently to the wetting performance of carbon fibers. The fibers sized with P-2 had a higher IFSS value and better hygrothermal-ageing resistance.
Chang, Wen-Ruey; Matz, Simon; Chang, Chien-Chi
2012-01-01
This study investigated the stochastic distribution of the required coefficient of friction (RCOF) which is a critical element for estimating slip probability. Fifty participants walked under four walking conditions. The results of the Kolmogorov-Smirnov two-sample test indicate that 76% of the RCOF data showed a difference in distribution between both feet for the same participant under each walking condition; the data from both feet were kept separate. The results of the Kolmogorov-Smirnov goodness-of-fit test indicate that most of the distribution of the RCOF appears to have a good match with the normal (85.5%), log-normal (84.5%) and Weibull distributions (81.5%). However, approximately 7.75% of the cases did not have a match with any of these distributions. It is reasonable to use the normal distribution for representation of the RCOF distribution due to its simplicity and familiarity, but each foot had a different distribution from the other foot in 76% of cases. The stochastic distribution of the required coefficient of friction (RCOF) was investigated for use in a statistical model to improve the estimate of slip probability in risk assessment. The results indicate that 85.5% of the distribution of the RCOF appears to have a good match with the normal distribution.
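The two-sample Kolmogorov-Smirnov comparison between feet and the goodness-of-fit tests used in this study are directly available in SciPy. The RCOF values below are synthetic illustrations, not the study's measurements:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Synthetic RCOF samples for the left and right foot of one participant.
left_foot = rng.normal(loc=0.20, scale=0.03, size=50)
right_foot = rng.normal(loc=0.23, scale=0.03, size=50)

# Two-sample KS test: do the two feet share one RCOF distribution?
d_stat, p_val = stats.ks_2samp(left_foot, right_foot)
print(f'two-sample KS: D={d_stat:.3f}, p={p_val:.4f}')

# Goodness of fit of one foot's RCOF to a fitted normal distribution.
mu, sigma = stats.norm.fit(left_foot)
d_fit, p_fit = stats.kstest(left_foot, 'norm', args=(mu, sigma))
print(f'normal fit: D={d_fit:.3f}, p={p_fit:.3f}')
```

A small two-sample p-value, as in 76% of the study's cases, argues for keeping each foot's data separate before fitting a distribution.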
International Nuclear Information System (INIS)
Fang Xiang; Zhao Bingquan
2000-01-01
In order to improve the reliability of NPP operation, simulation research on the reliability of nuclear power plant operators is needed. Using a nuclear power plant simulator as the research platform, and taking the present international reliability research model, human cognition reliability (HCR), as a reference, part of the model was modified according to the actual status of Chinese nuclear power plant operators, and a research model for Chinese nuclear power plant operators was obtained based on the two-parameter Weibull distribution. Experiments on the reliability of nuclear power plant operators were carried out using this two-parameter Weibull distribution model. The results agree with those obtained elsewhere in the world. This research should be beneficial to the operational safety of nuclear power plants.
DEFF Research Database (Denmark)
Sørensen, John Dalsgaard; Enevoldsen, I.
1993-01-01
It has been observed and shown that in some examples a sensitivity analysis of the first-order reliability index results in an increasing reliability index when the standard deviation of a stochastic variable is increased while the expected value is fixed. This unfortunate behaviour can occur when a stochastic variable is modelled by an asymmetrical density function. For lognormally, Gumbel and Weibull distributed stochastic variables it is shown for which combinations of the β-point, the expected value and the standard deviation this weakness can occur. In practical applications the behaviour is probably rather infrequent. A simple example is shown as an illustration and to exemplify that for second-order reliability methods and for exact calculations of the probability of failure this behaviour is much more infrequent.
Coupling of mass and charge distributions for low excited nuclear fission
International Nuclear Information System (INIS)
Salamatin, V.S.
2000-01-01
A simple model is offered for calculating the charge distributions of fission fragments in low-excited nuclear fission from experimental mass distributions. The model contains two parameters, determining the amplitude of the even-odd effect in the charge distributions and its dependence on the excitation energy. Results for the reactions 233 U(n th ,f), 235 U(n th ,f), 229 Th(n th ,f) and 249 Cf(n th ,f) are presented.
Hallin, M.; Piegorsch, W.; El Shaarawi, A.
2012-01-01
The random variable X taking values 0, 1, 2, …, x, … with probabilities p_λ(x) = e^(−λ) λ^x / x!, where λ > 0, is called a Poisson variable, and its distribution a Poisson distribution with parameter λ. The Poisson distribution with parameter λ can be obtained as the limit of the binomial distribution as n → ∞ and p → 0 in such a way that np → λ.
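The binomial-to-Poisson limit stated here can be checked numerically; the particular values of λ and x below are arbitrary illustrations:

```python
from math import comb, exp, factorial

lam, x = 3.0, 4

# Poisson pmf: p_lambda(x) = e^(-lam) * lam^x / x!
poisson_pmf = exp(-lam) * lam**x / factorial(x)

# Binomial(n, p) pmf with p = lam / n approaches the Poisson pmf as n grows.
for n in (10, 100, 10_000):
    p = lam / n
    binom_pmf = comb(n, x) * p**x * (1 - p) ** (n - x)
    print(n, abs(binom_pmf - poisson_pmf))
```

The printed gap shrinks as n increases, which is exactly the limiting statement np → λ made in the text.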
National Aeronautics and Space Administration — Distributed Visualization allows anyone, anywhere, to see any simulation, at any time. Development focuses on algorithms, software, data formats, data systems and...
International Nuclear Information System (INIS)
Shi Jie
2010-01-01
An approach to determining the preventive maintenance cycle was proposed in consideration of lifetime, optimal cost and economy. A two-parameter Weibull distribution was used to calculate the lifetime of the contact switch. Block replacement and age replacement models were built with the objective of optimal cost, and the preventive replacement cycle was computed. Eight proposals for the preventive replacement cycle were given. An economy model was applied to assess those proposals and the optimal proposal was confirmed. (authors)
International Nuclear Information System (INIS)
Golubov, B I
2007-01-01
On the basis of the concept of the pointwise dyadic derivative, dyadic distributions are introduced as continuous linear functionals on the linear space D_d(R+) of infinitely differentiable functions compactly supported on the positive half-axis R+ together with all dyadic derivatives. The completeness of the space D'_d(R+) of dyadic distributions is established. It is shown that a locally integrable function on R+ generates a dyadic distribution. In addition, the space S_d(R+) of infinitely dyadically differentiable functions on R+ rapidly decreasing in the neighbourhood of +∞ is defined. The space S'_d(R+) of dyadic distributions of slow growth is introduced as the space of continuous linear functionals on S_d(R+). The completeness of the space S'_d(R+) is established; it is proved that each integrable function on R+ with polynomial growth at +∞ generates a dyadic distribution of slow growth. Bibliography: 25 titles.
International Nuclear Information System (INIS)
Meinhold, Lars; Clement, David; Tehei, M.; Daniel, R.M.; Finney, J.L.; Smith, Jeremy C.
2008-01-01
The temperature dependence of the dynamics of mesophilic and thermophilic dihydrofolate reductase is examined using elastic incoherent neutron scattering. It is demonstrated that the distribution of atomic displacement amplitudes can be derived from the elastic scattering data by assuming a (Weibull) functional form that resembles distributions seen in molecular dynamics simulations. The thermophilic enzyme has a significantly broader distribution than its mesophilic counterpart. Furthermore, although the rate of increase with temperature of the atomic mean-square displacements extracted from the dynamic structure factor is found to be comparable for both enzymes, the amplitudes are found to be slightly larger for the thermophilic enzyme. Therefore, these results imply that the thermophilic enzyme is the more flexible of the two
Furbish, David; Schmeeckle, Mark; Schumer, Rina; Fathel, Siobhan
2016-01-01
We describe the most likely forms of the probability distributions of bed load particle velocities, accelerations, hop distances, and travel times, in a manner that formally appeals to inferential statistics while honoring mechanical and kinematic constraints imposed by equilibrium transport conditions. The analysis is based on E. Jaynes's elaboration of the implications of the similarity between the Gibbs entropy in statistical mechanics and the Shannon entropy in information theory. By maximizing the information entropy of a distribution subject to known constraints on its moments, our choice of the form of the distribution is unbiased. The analysis suggests that particle velocities and travel times are exponentially distributed and that particle accelerations follow a Laplace distribution with zero mean. Particle hop distances, viewed alone, ought to be distributed exponentially. However, the covariance between hop distances and travel times precludes this result. Instead, the covariance structure suggests that hop distances follow a Weibull distribution. These distributions are consistent with high-resolution measurements obtained from high-speed imaging of bed load particle motions. The analysis brings us closer to choosing distributions based on our mechanical insight.
Asymmetric Bimodal Exponential Power Distribution on the Real Line
Directory of Open Access Journals (Sweden)
Mehmet Niyazi Çankaya
2018-01-01
The asymmetric bimodal exponential power (ABEP) distribution is an extension of the generalized gamma distribution to the real line, obtained by adding two parameters that fit the shape of the peakedness of the bimodality. For special values of the peakedness parameters the distribution is a combination of half-Laplace and half-normal distributions on the real line. The distribution has two parameters fitting the height of the bimodality, so its capacity for bimodality is enhanced by these parameters. A skewness parameter is added to model asymmetry in the data. The location-scale form of this distribution is proposed. The Fisher information matrix of the parameters of the ABEP is obtained explicitly. Properties of the ABEP are examined. Real-data examples are given to illustrate the modelling capacity of the ABEP. Replicated artificial data, generated from the maximum likelihood estimates of the parameters of the ABEP and of other distributions that admit an artificial data generation procedure, are provided to test the similarity with the real data. A brief simulation study is presented.
DEFF Research Database (Denmark)
Borregaard, Michael Krabbe; Hendrichsen, Ditte Katrine; Nachman, Gøsta Støger
2008-01-01
Living organisms are distributed over the entire surface of the planet. The distribution of the individuals of each species is not random; on the contrary, it depends strongly on the biology and ecology of the species, and varies over different spatial scales. The structure of whole populations reflects the location and fragmentation pattern of the habitat types preferred by the species, and the complex dynamics of migration, colonization, and population growth taking place over the landscape. Within these, individuals are distributed among each other in regular or clumped patterns, depending on the nature of intraspecific interactions between them: while the individuals of some species repel each other and partition the available area, others form groups of varying size, determined by the fitness of each group member. The spatial distribution pattern of individuals again strongly...
International Nuclear Information System (INIS)
Gruenemeyer, D.
1991-01-01
This paper reports on a Distribution Automation (DA) system, which enhances the efficiency and productivity of a utility. It also provides intangible benefits such as improved public image and market advantages. A utility should evaluate the benefits and costs of such a system before committing funds. The expenditure for distribution automation is economical when justified by the deferral of a capacity increase, a decrease in peak power demand, or a reduction in operation and maintenance (O&M) requirements.
The analysis of annual dose distributions for radiation workers
International Nuclear Information System (INIS)
Mill, A.J.
1984-05-01
The system of dose limitation recommended by the ICRP includes the requirement that no worker shall exceed the current dose limit of 50 mSv/a. Continuous exposure at this limit would correspond to an annual death rate comparable with that of 'high-risk' industries. In practice, there is a distribution of doses with an arithmetic mean lower than the dose limit. In its 1977 report, UNSCEAR defined a reference dose distribution for the purposes of comparison. However, this two-parameter distribution does not show the departure from log-normality normally observed in actual distributions at doses which are a significant proportion of the annual limit. In this report an alternative model is suggested, based on a three-parameter log-normal distribution. The third parameter is an 'effective dose limit', and such a model fits very well the departure from log-normality observed in actual dose distributions. (author)
Sardet, Laure; Patilea, Valentin
When pricing a specific insurance premium, the actuary needs to evaluate the claims cost distribution for the warranty. Traditional actuarial methods use parametric specifications to model the claims distribution, such as the lognormal, Weibull and Pareto laws. Mixtures of such distributions improve the flexibility of the parametric approach and seem quite well adapted to capture the skewness, the long tails and the unobserved heterogeneity among the claims. In this paper, instead of looking for a finely tuned mixture with many components, we choose a parsimonious mixture model, typically with two or three components. Next, we use the mixture cumulative distribution function (CDF) to transform the data into the unit interval, where we apply a beta-kernel smoothing procedure. A bandwidth rule adapted to our methodology is proposed. Finally, the beta-kernel density estimate is back-transformed to recover an estimate of the original claims density. The beta-kernel smoothing provides an automatic fine-tuning of the parsimonious mixture and thus avoids inference in more complex mixture models with many parameters. We investigate the empirical performance of the new method in the estimation of quantiles with simulated nonnegative data, and of the quantiles of the individual claims distribution in a non-life insurance application.
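The transform-and-smooth pipeline described here can be sketched as follows. This is a simplified illustration, not the paper's method: it uses a single lognormal rather than a two- or three-component mixture, a Chen-style beta-kernel estimator, and an arbitrary bandwidth in place of the proposed rule:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# synthetic nonnegative "claims" with a long right tail
claims = rng.lognormal(mean=7.0, sigma=1.2, size=1000)

# step 1: parsimonious parametric start (a single lognormal here for brevity,
# where the paper uses a two- or three-component mixture)
s, loc, scale = stats.lognorm.fit(claims, floc=0)
u = stats.lognorm.cdf(claims, s, loc, scale)      # transform to the unit interval

# step 2: Chen-style beta-kernel density estimate on (0, 1)
def beta_kernel_pdf(u_eval, data, bw):
    return np.mean(stats.beta.pdf(data, u_eval / bw + 1.0,
                                  (1.0 - u_eval) / bw + 1.0))

bw = 0.05                                         # illustrative bandwidth
u_grid = np.linspace(0.05, 0.95, 19)
f_u = np.array([beta_kernel_pdf(ug, u, bw) for ug in u_grid])

# step 3: back-transform to the claims scale via the change of variables
x_grid = stats.lognorm.ppf(u_grid, s, loc, scale)
f_x = f_u * stats.lognorm.pdf(x_grid, s, loc, scale)
```

When the parametric start fits well, the transformed sample is nearly uniform and the kernel estimate on the unit interval stays close to 1; departures from 1 are exactly the fine-tuning the smoothing step contributes.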
DEFF Research Database (Denmark)
Glaveanu, Vlad Petre
This book challenges the standard view that creativity comes only from within an individual by arguing that creativity also exists 'outside' of the mind or, more precisely, that the human mind extends through the means of action into the world. The notion of 'distributed creativity' is not commonly used within the literature, and yet it has the potential to revolutionise the way we think about creativity, from how we define and measure it to what we can practically do to foster and develop creativity. Drawing on cultural psychology, ecological psychology and advances in cognitive science, this book offers a basic framework for the study of distributed creativity that considers three main dimensions of creative work: sociality, materiality and temporality. Starting from the premise that creativity is distributed between people, between people and objects and across time, the book reviews...
Van Steen, Maarten
2017-01-01
For this third edition of "Distributed Systems," the material has been thoroughly revised and extended, integrating principles and paradigms into nine chapters: 1. Introduction 2. Architectures 3. Processes 4. Communication 5. Naming 6. Coordination 7. Replication 8. Fault tolerance 9. Security A separation has been made between basic material and more specific subjects. The latter have been organized into boxed sections, which may be skipped on first reading. To assist in understanding the more algorithmic parts, example programs in Python have been included. The examples in the book leave out many details for readability, but the complete code is available through the book's Website, hosted at www.distributed-systems.net.
A two-parameter extension of classical nucleation theory
Lutsko, James F.; Durán-Olivencia, Miguel A.
2015-06-01
A two-variable stochastic model for diffusion-limited nucleation is developed using a formalism derived from fluctuating hydrodynamics. The model is a direct generalization of the standard classical nucleation theory (CNT). The nucleation rate and pathway are calculated in the weak-noise approximation and are shown to be in good agreement with direct numerical simulations for the weak-solution/strong-solution transition in globular proteins. We find that CNT underestimates the time needed for the formation of a critical cluster by two orders of magnitude and that this discrepancy is due to the more complex dynamics of the two variable model and not, as often is assumed, a result of errors in the estimation of the free energy barrier.
Chiral Recognition by Fluorescence: One Measurement for Two Parameters
Directory of Open Access Journals (Sweden)
Shanshan Yu
2014-01-01
Full Text Available This outlook describes two strategies to simultaneously determine the enantiomeric composition and concentration of a chiral substrate by a single fluorescence measurement. One strategy utilizes a pseudoenantiomeric sensor pair that is composed of a 1,1′-bi-2-naphthol-based amino alcohol and a partially hydrogenated 1,1′-bi-2-naphthol-based amino alcohol. These two molecules have opposite chiral configurations, with fluorescence enhancement at two different emission wavelengths when treated with the enantiomers of mandelic acid. Using the sum and difference of the fluorescence intensity at the two wavelengths allows simultaneous determination of both the concentration and the enantiomeric composition of the chiral acid. The other strategy employs a 1,1′-bi-2-naphthol-based trifluoromethyl ketone that exhibits fluorescence enhancement at two emission wavelengths upon interaction with a chiral diamine. One emission responds mostly to the concentration of the chiral diamine, and the ratio of the two emissions depends on the chiral configuration of the enantiomer but is independent of the concentration, allowing both the concentration and the enantiomeric composition of the chiral diamine to be determined simultaneously. These strategies would significantly simplify the practical application of enantioselective fluorescent sensors in high-throughput chiral assays.
Bubbling and bistability in two parameter discrete systems
Indian Academy of Sciences (India)
The parameter values for the onset of bistability and the higher-order bistability points are marked, and the characteristics of two-parameter 1-d maps that exhibit bubbling/bistability are related to their ...
Optimal design of accelerated life tests for an extension of the exponential distribution
International Nuclear Information System (INIS)
Haghighi, Firoozeh
2014-01-01
Accelerated life tests provide information quickly on the lifetime distribution of products by testing them at higher-than-usual levels of stress. In this paper, the lifetime of a product at any level of stress is assumed to have an extension of the exponential distribution. This new family has been recently introduced by Nadarajah and Haghighi (2011 [1]); it can be used as an alternative to the gamma, Weibull and exponentiated exponential distributions. The scale parameter of the lifetime distribution at constant stress levels is assumed to be a log-linear function of the stress levels, and a cumulative exposure model holds. For this model, the maximum likelihood estimates (MLEs) of the parameters, as well as the Fisher information matrix, are derived. The asymptotic variance of the scale parameter at a design stress is adopted as the optimization objective, and its expression is derived using the maximum likelihood method. A Monte Carlo simulation study is carried out to examine the performance of these methods. The asymptotic confidence intervals for the parameters and hypothesis tests for the parameter of interest are constructed.
Temporal distribution of earthquakes using renewal process in the Dasht-e-Bayaz region
Mousavi, Mehdi; Salehi, Masoud
2018-01-01
Temporal distribution of earthquakes with Mw > 6 in the Dasht-e-Bayaz region, eastern Iran has been investigated using time-dependent models. In these models, it is assumed that the times between consecutive large earthquakes follow a certain statistical distribution. For this purpose, four time-dependent inter-event distributions, including the Weibull, gamma, lognormal, and Brownian Passage Time (BPT) distributions, are used in this study, and the associated parameters are estimated by maximum likelihood estimation. The suitable distribution is selected based on the log-likelihood function and the Bayesian Information Criterion. The probability of occurrence of the next large earthquake during a specified interval of time was calculated for each model. Then, the concept of conditional probability was applied to forecast the next major (Mw > 6) earthquake at the site of interest. The emphasis is on statistical methods which attempt to quantify the probability of an earthquake occurring within specified time, space, and magnitude windows. According to the results obtained, the probability of occurrence of an earthquake with Mw > 6 in the near future is significantly high.
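The model-selection step (fit candidate inter-event distributions by maximum likelihood, then compare by BIC) can be sketched as below. The data here are synthetic, not the Dasht-e-Bayaz catalog, and the BPT distribution is omitted since SciPy has no built-in implementation:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# synthetic inter-event times (years); a real study would use catalog data
times = stats.weibull_min.rvs(1.8, scale=12.0, size=40, random_state=rng)

candidates = {"weibull": stats.weibull_min, "gamma": stats.gamma,
              "lognorm": stats.lognorm}
bic = {}
for name, dist in candidates.items():
    params = dist.fit(times, floc=0)          # location fixed at zero
    k = len(params) - 1                       # free parameters (loc is fixed)
    loglik = np.sum(dist.logpdf(times, *params))
    bic[name] = k * np.log(len(times)) - 2.0 * loglik
best = min(bic, key=bic.get)                  # smallest BIC wins
```

The fitted distribution with the smallest BIC would then feed the conditional-probability forecast of the next event.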
Lima, Robson B DE; Bufalino, Lina; Alves, Francisco T; Silva, José A A DA; Ferreira, Rinaldo L C
2017-01-01
Currently, there is a lack of studies on the correct utilization of continuous distributions for dry tropical forests. Therefore, this work aims to investigate the diameter structure of a Brazilian tropical dry forest and to select suitable continuous distributions by means of statistical tools for the stand and the main species. Two subsets were randomly selected from 40 plots. Diameter at base height was obtained. The following functions were tested: log-normal, gamma, Weibull 2P and Burr. The best fits were selected by Akaike's information criterion. Overall, the diameter distribution of the dry tropical forest was better described by negative exponential curves and positive skewness. The forest studied showed diameter distributions with decreasing probability for larger trees. This behavior was observed for both the main species and the stand. The generalization of the function fitted for the main species shows that the development of individual models is needed. The Burr function showed good flexibility to describe the diameter structure of the stand and the behavior of the Mimosa ophthalmocentra and Bauhinia cheilantha species. For Poincianella bracteosa, Aspidosperma pyrifolium and Myracrodruon urundeuva, better fitting was obtained with the log-normal function.
Lindström, Robin; Rosvall, Tobias
2013-01-01
A performance analysis was carried out on a SAAB 2000 as a reference object. Various methods for powering aircraft in a more environmentally friendly way were evaluated together with distributed propulsion. After this investigation, electric motors combined with zinc-air batteries were chosen to power the SAAB 2000 with distributed propulsion. A performance analysis was carried out on this aircraft in the same way as for the original SAAB 2000. The results were compared, and the conclusion was that the range was too short for the configuration to...
Quasihomogeneous distributions
von Grudzinski, O
1991-01-01
This is a systematic exposition of the basics of the theory of quasihomogeneous (in particular, homogeneous) functions and distributions (generalized functions). A major theme is the method of taking quasihomogeneous averages. It serves as the central tool for the study of the solvability of quasihomogeneous multiplication equations and of quasihomogeneous partial differential equations with constant coefficients. Necessary and sufficient conditions for solvability are given. Several examples are treated in detail, among them the heat and the Schrödinger equation. The final chapter is devoted to quasihomogeneous wave front sets and their application to the description of singularities of quasihomogeneous distributions, in particular to quasihomogeneous fundamental solutions of the heat and of the Schrödinger equation.
Binns, Lewis A.; Valachis, Dimitris; Anderson, Sean; Gough, David W.; Nicholson, David; Greenway, Phil
2002-07-01
Previously, we have developed techniques for Simultaneous Localization and Map Building based on the augmented state Kalman filter. Here we report the results of experiments conducted over multiple vehicles each equipped with a laser range finder for sensing the external environment, and a laser tracking system to provide highly accurate ground truth. The goal is simultaneously to build a map of an unknown environment and to use that map to navigate a vehicle that otherwise would have no way of knowing its location, and to distribute this process over several vehicles. We have constructed an on-line, distributed implementation to demonstrate the principle. In this paper we describe the system architecture, the nature of the experimental set up, and the results obtained. These are compared with the estimated ground truth. We show that distributed SLAM has a clear advantage in the sense that it offers a potential super-linear speed-up over single vehicle SLAM. In particular, we explore the time taken to achieve a given quality of map, and consider the repeatability and accuracy of the method. Finally, we discuss some practical implementation issues.
International Nuclear Information System (INIS)
Liu, Guoliang; Zhang, Feng; Hao, Lizhen
2012-01-01
We previously introduced a time record model for use in studying the duration of sand–dust storms. In the model, X is the normalized wind speed and Xr is the normalized wind speed threshold for the sand–dust storm. X is represented by a random signal with a normal Gaussian distribution. The storms occur when X ≥ Xr. From this model, the time interval distribution of N = Aexp(−bt) can be deduced, wherein N is the number of time intervals with length greater than t, A and b are constants, and b is related to Xr. In this study, sand–dust storm data recorded in spring at the Yanchi meteorological station in China were analysed to verify whether the time interval distribution of the sand–dust storms agrees with the above time interval distribution. We found that the distribution of the time interval between successive sand–dust storms in April agrees well with the above exponential equation. However, the interval distribution for the sand–dust storm data for the entire spring period displayed a better fit to the Weibull equation and depended on the variation of the sand–dust storm threshold wind speed. (paper)
Beare, Brendan K.
2009-01-01
Suppose that X and Y are random variables. We define a replicating function to be a function f such that f(X) and Y have the same distribution. In general, the set of replicating functions for a given pair of random variables may be infinite. Suppose we have some objective function, or cost function, defined over the set of replicating functions, and we seek to estimate the replicating function with the lowest cost. We develop an approach to estimating the cheapest replicating function that i...
2007-01-01
Please note that starting from 1 March 2007, the mail distribution and collection times will be modified for the following buildings: 6, 8, 9, 10, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 29, 69, 40, 70, 101, 102, 109, 118, 152, 153, 154, 155, 166, 167, 169, 171, 174, 261, 354, 358, 576, 579 and 580. Complementary Information on the new times will be posted on the entry doors and left in the mail boxes of each building. TS/FM Group
Stewart, Stan
2004-01-01
Switchgear plays a fundamental role within the power supply industry. It is required to isolate faulty equipment, divide large networks into sections for repair purposes, reconfigure networks in order to restore power supplies, and control other equipment. This book begins with the general principles of the switchgear function and leads on to discuss topics such as interruption techniques, fault level calculations, switching transients and electrical insulation, making this an invaluable reference source. Solutions to practical problems associated with distribution switchgear are also included.
Distributions of component failure rates estimated from LER data
International Nuclear Information System (INIS)
Atwood, C.L.
1985-01-01
Past analyses of Licensee Event Report (LER) data have noted that component failure rates vary from plant to plant, and have estimated the distributions by two-parameter gamma distributions. In this study, a more complicated distributional form is considered, a mixture of gammas. This could arise if the plants' failure rates cluster into distinct groups. The method was applied to selected published LER data for diesel generators, pumps, valves, and instrumentation and control assemblies. The improved fits from using a mixture rather than a single gamma distribution were minimal, and not statistically significant. There seem to be two possibilities: either explanatory variables affect the failure rates only in a gradual way, not a qualitative way; or, for estimating individual component failure rates, the published LER data have been analyzed to the limit of resolution. 9 refs
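Whether a gamma mixture improves on a single gamma can be checked directly by comparing maximized likelihoods, as in this sketch with synthetic "plant-to-plant" failure rates (all values illustrative, not LER data). Starting the mixture search at the single-gamma fit guarantees the mixture likelihood can only match or improve, so the size of the improvement is the quantity of interest:

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(1)
# synthetic plant-specific failure rates drawn from two clusters (illustrative)
rates = np.concatenate([
    stats.gamma.rvs(5.0, scale=0.002, size=30, random_state=rng),
    stats.gamma.rvs(5.0, scale=0.02, size=10, random_state=rng),
])

# single two-parameter gamma (location fixed at zero)
a1, _, s1 = stats.gamma.fit(rates, floc=0)
nll_single = -np.sum(stats.gamma.logpdf(rates, a1, scale=s1))

def nll_mix(p):
    # negative log-likelihood of a two-component gamma mixture
    a, sa, b, sb, w = p
    if min(a, sa, b, sb) <= 0.0 or not 0.0 < w < 1.0:
        return np.inf
    pdf = w * stats.gamma.pdf(rates, a, scale=sa) \
        + (1.0 - w) * stats.gamma.pdf(rates, b, scale=sb)
    return -np.sum(np.log(pdf))

# start at the single-gamma fit so the mixture can only match or improve
res = optimize.minimize(nll_mix, [a1, s1, a1, s1, 0.5],
                        method="Nelder-Mead", options={"maxiter": 5000})
nll_mix_val = res.fun
```

A formal comparison would then apply a likelihood-ratio or information-criterion test to decide whether the extra three parameters are statistically justified, which is the question the abstract answers in the negative for the LER data.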
Universality in the tail of musical note rank distribution
Beltrán del Río, M.; Cocho, G.; Naumis, G. G.
2008-09-01
Although power laws have been used to fit rank distributions in many different contexts, they usually fail at the tails. Languages as sequences of symbols have been a popular subject for ranking distributions, and for this purpose music can be treated as such. Here we show that more than 1800 musical compositions are very well fitted by the two-parameter beta distribution of the first kind, which arises in the ranking of multiplicative stochastic processes. The parameters a and b are obtained for classical, jazz and rock music, revealing interesting features. Specifically, we obtain a clear trend in the values of the parameters for major and minor tonal modes. Finally, we discuss the distribution of notes for each octave and its connection with the ranking of the notes.
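The two-parameter beta rank distribution referred to here is often written f(r) ∝ (N + 1 − r)^b / r^a (the discrete generalized beta distribution). A sketch of estimating a and b by least squares in log space, on synthetic rank data rather than the musical corpora of the study:

```python
import numpy as np

rng = np.random.default_rng(7)
N = 100
r = np.arange(1, N + 1)
a_true, b_true = 0.8, 0.3                       # illustrative values
f = (N + 1 - r) ** b_true / r ** a_true
f = f / f.sum()
# small multiplicative noise mimics empirical note frequencies
obs = f * np.exp(rng.normal(0.0, 0.02, size=N))

# linear model: log(obs) = log A + b*log(N + 1 - r) - a*log(r)
X = np.column_stack([np.ones(N), np.log(N + 1 - r), -np.log(r)])
coef, *_ = np.linalg.lstsq(X, np.log(obs), rcond=None)
logA, b_hat, a_hat = coef
```

The exponent a governs the power-law head of the ranking while b bends the tail down, which is why this form succeeds where a pure power law fails at the tail.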
Topology Counts: Force Distributions in Circular Spring Networks
Heidemann, Knut M.; Sageman-Furnas, Andrew O.; Sharma, Abhinav; Rehfeldt, Florian; Schmidt, Christoph F.; Wardetzky, Max
2018-02-01
Filamentous polymer networks govern the mechanical properties of many biological materials. Force distributions within these networks are typically highly inhomogeneous, and, although the importance of force distributions for structural properties is well recognized, they are far from being understood quantitatively. Using a combination of probabilistic and graph-theoretical techniques, we derive force distributions in a model system consisting of ensembles of random linear spring networks on a circle. We show that characteristic quantities, such as the mean and variance of the force supported by individual springs, can be derived explicitly in terms of only two parameters: (i) average connectivity and (ii) number of nodes. Our analysis shows that a classical mean-field approach fails to capture these characteristic quantities correctly. In contrast, we demonstrate that network topology is a crucial determinant of force distributions in an elastic spring network. Our results for 1D linear spring networks readily generalize to arbitrary dimensions.
Directory of Open Access Journals (Sweden)
André Gracioso Peres Silva
2015-12-01
Full Text Available This study aimed to map the stem biomass of an even-aged eucalyptus plantation in southeastern Brazil based on canopy height profile (CHP) statistics using wall-to-wall discrete-return airborne laser scanning (ALS), and to compare the results with alternative maps generated by ordinary kriging interpolation from field-derived measurements. The assessment of stem biomass with ALS data was carried out using regression analysis. Initially, CHPs were determined to express the distribution of laser point heights in the ALS cloud for each sample plot. The probability density function (pdf) used was the two-parameter Weibull distribution, whose parameters were in turn used as explanatory variables to model stem biomass. ALS metrics such as height percentiles, dispersion of heights, and proportion of points were also investigated. A simple linear regression model of stem biomass as a function of the Weibull scale parameter showed high correlation (adj. R2 = 0.89). An alternative model considering the 30th percentile and the Weibull shape parameter slightly improved the quality of the estimation (adj. R2 = 0.93). Stem biomass maps based on the Weibull scale parameter doubled the accuracy of the ordinary kriging approach (relative root mean square error = 6% and 13%, respectively).
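The core of the approach, fitting a Weibull to each plot's height distribution and then regressing biomass on the fitted scale parameter, can be sketched with synthetic plots (all numbers illustrative, not from the study):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n_plots = 25
true_scale = rng.uniform(10.0, 25.0, n_plots)               # per-plot Weibull scale (m)
biomass = 4.0 * true_scale + rng.normal(0.0, 2.0, n_plots)  # synthetic stem biomass

scale_hat = np.empty(n_plots)
for i in range(n_plots):
    # simulate the plot's canopy height profile and fit a two-parameter Weibull
    heights = stats.weibull_min.rvs(2.2, scale=true_scale[i], size=400,
                                    random_state=rng)
    _, _, scale_hat[i] = stats.weibull_min.fit(heights, floc=0)

slope, intercept, r, p, se = stats.linregress(scale_hat, biomass)
```

The fitted scale parameter condenses each plot's full height profile into a single predictor, which is why a one-variable regression on it can perform so well.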
Starn, J. J.; Belitz, K.; Carlson, C.
2017-12-01
Groundwater residence-time distributions (RTDs) are critical for assessing the susceptibility of water resources to contamination. In this novel approach for estimating regional RTDs, groundwater flow was first simulated using existing regional digital data sets in 13 intermediate-size watersheds (each an average of 7,000 square kilometers) that are representative of a wide range of glacial systems. RTDs were simulated with particle tracking. We refer to these models as "general models" because they are based on regional, as opposed to site-specific, digital data. Parametric RTDs were created from particle RTDs by fitting 1- and 2-component Weibull, gamma, and inverse Gaussian distributions, thus reducing a large number of particle travel times to 3 to 7 parameters (shape, location, and scale for each component, plus a mixing fraction) for each modeled area. The scale parameter of these distributions is related to the mean exponential age; the shape parameter controls departure from the ideal exponential distribution and is partly a function of interaction with bedrock and with drainage density. Given the flexible shape and mathematical similarity of these distributions, any of them is potentially a good fit to particle RTDs. The 1-component gamma distribution provided a good fit to basin-wide particle RTDs. RTDs at monitoring wells and streams often have more complicated shapes than basin-wide RTDs, caused in part by heterogeneity in the model, and generally require 2-component distributions. A machine learning model was trained on the RTD parameters using features derived from regionally available watershed characteristics such as recharge rate, material thickness, and stream density. RTDs appeared to vary systematically across the landscape in relation to watershed features. This relation was used to produce maps of useful metrics with respect to risk-based thresholds, such as the time to first exceedance, time to maximum concentration, time above the threshold
International Nuclear Information System (INIS)
Mo, In Gyu
1992-01-01
This book discusses business strategy and distribution innovation, the purpose of intelligent distribution, intelligent supply distribution, intelligent production distribution, intelligent sales distribution, software for intelligence, and the future of distribution. It also introduces component technologies supporting intelligent distribution, such as bar codes, OCR, packaging, and intelligent automated warehouses; system technologies; and case studies from America, Japan and other countries.
A Comparative Study of Distribution System Parameter Estimation Methods
Energy Technology Data Exchange (ETDEWEB)
Sun, Yannan; Williams, Tess L.; Gourisetti, Sri Nikhil Gup
2016-07-17
In this paper, we compare two parameter estimation methods for distribution systems: residual sensitivity analysis and state-vector augmentation with a Kalman filter. These two methods were originally proposed for transmission systems, and are still the most commonly used methods for parameter estimation. Distribution systems have much lower measurement redundancy than transmission systems. Therefore, estimating parameters is much more difficult. To increase the robustness of parameter estimation, the two methods are applied with combined measurement snapshots (measurement sets taken at different points in time), so that the redundancy for computing the parameter values is increased. The advantages and disadvantages of both methods are discussed. The results of this paper show that state-vector augmentation is a better approach for parameter estimation in distribution systems. Simulation studies are done on a modified version of IEEE 13-Node Test Feeder with varying levels of measurement noise and non-zero error in the other system model parameters.
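State-vector augmentation treats the unknown parameter as an extra state with (near-)zero process noise and lets a Kalman filter estimate it from measurement snapshots. A minimal scalar sketch, estimating a hypothetical line resistance from noisy voltage-drop measurements (not the paper's 13-node feeder setup):

```python
import numpy as np

rng = np.random.default_rng(5)
R_true = 0.4     # unknown line resistance (ohm), to be estimated
n = 200
current = rng.uniform(5.0, 15.0, n)                      # feeder current snapshots (A)
volt_drop = R_true * current + rng.normal(0.0, 0.05, n)  # noisy voltage drops (V)

# the augmented "state" here is just the parameter R, modeled as a random walk
x = 1.0          # initial guess for R
P = 1.0          # initial estimate variance
Q = 1e-8         # process noise: parameter assumed nearly constant
Rm = 0.05 ** 2   # measurement noise variance
for amp, z in zip(current, volt_drop):
    P = P + Q                           # predict step
    H = amp                             # measurement model: z = R * current
    K = P * H / (H * P * H + Rm)        # Kalman gain
    x = x + K * (z - H * x)             # update step
    P = (1.0 - K * H) * P
```

Combining many snapshots this way is what restores identifiability when any single snapshot carries too little redundancy, which is the core difficulty in distribution systems.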
Directory of Open Access Journals (Sweden)
Azam Zaka
2014-10-01
Full Text Available This paper is concerned with modifications of the maximum likelihood, moments and percentile estimators of the two-parameter power function distribution. The sampling behavior of the estimators is indicated by Monte Carlo simulation. For some combinations of parameter values, some of the modified estimators appear better than the traditional maximum likelihood, moments and percentile estimators with respect to bias, mean square error and total deviation.
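For the two-parameter power function distribution with density f(x) = a x^(a-1) / b^a on (0, b), the classical (unmodified) estimators are simple enough to sketch directly on synthetic data; the paper's modified estimators are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(11)
a_true, b_true = 2.5, 4.0     # illustrative shape and scale

# inverse-transform sampling from the quantile function Q(u) = b * u**(1/a)
x = b_true * rng.uniform(size=2000) ** (1.0 / a_true)

# maximum likelihood: b_hat = max(x), a_hat = n / sum(log(b_hat / x))
b_mle = x.max()
a_mle = len(x) / np.sum(np.log(b_mle / x))

# percentile estimator: Q(p) = b * p**(1/a)  =>  a = log(p2/p1) / log(q2/q1)
q1, q2 = np.quantile(x, [0.25, 0.75])
a_pct = np.log(0.75 / 0.25) / np.log(q2 / q1)
```

Monte Carlo comparison of such estimators (as in the paper) amounts to repeating this over many samples and tabulating bias and mean square error.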
Ceramics Analysis and Reliability Evaluation of Structures (CARES). Users and programmers manual
Nemeth, Noel N.; Manderscheid, Jane M.; Gyekenyesi, John P.
1990-01-01
This manual describes how to use the Ceramics Analysis and Reliability Evaluation of Structures (CARES) computer program. The primary function of the code is to calculate the fast-fracture reliability or failure probability of macroscopically isotropic ceramic components. These components may be subjected to complex thermomechanical loadings, such as those found in heat engine applications. The program uses results from the MSC/NASTRAN or ANSYS finite element analysis programs to evaluate component reliability due to inherent surface and/or volume type flaws. CARES utilizes the Batdorf model and the two-parameter Weibull cumulative distribution function to describe the effect of multiaxial stress states on material strength. The principle of independent action (PIA) and the Weibull normal stress averaging models are also included. Weibull material strength parameters, the Batdorf crack density coefficient, and other related statistical quantities are estimated from four-point bend bar or uniform uniaxial tensile specimen fracture strength data. Parameter estimation can be performed for single or multiple failure modes by using least-squares analysis or the maximum likelihood method. Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit tests, ninety percent confidence intervals on the Weibull parameters, and Kanofsky-Srinivasan ninety percent confidence band values are also provided. The probabilistic fast-fracture theories used in CARES, along with the input and output for CARES, are described. Example problems to demonstrate various features of the program are also included. This manual describes the MSC/NASTRAN version of the CARES program.
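For a uniformly stressed element of unit volume, the two-parameter Weibull strength model reduces to Pf = 1 − exp[−(σ/σ0)^m]. A hedged sketch with illustrative values (CARES itself integrates over finite element stress fields and supports the Batdorf and PIA multiaxial models, none of which is reproduced here):

```python
import numpy as np

def weibull_failure_prob(stress, m, sigma0):
    """Fast-fracture failure probability of a uniformly stressed element of
    unit volume under a two-parameter Weibull strength model (simplified)."""
    return 1.0 - np.exp(-(np.asarray(stress, dtype=float) / sigma0) ** m)

# illustrative values only (not taken from the CARES manual)
m, sigma0 = 10.0, 300.0        # Weibull modulus, characteristic strength (MPa)
pf_low = weibull_failure_prob(150.0, m, sigma0)    # well below sigma0
pf_char = weibull_failure_prob(300.0, m, sigma0)   # at the characteristic strength
```

The Weibull modulus m controls the scatter: large m makes the failure probability switch sharply from near 0 to near 1 around the characteristic strength, where Pf = 1 − e^(−1) by construction.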
Krueger, Ute; Schimmelpfeng, Katja
2013-03-01
A sufficient staffing level in fire and rescue dispatch centers is crucial for saving lives. Therefore, it is important to estimate the expected workload properly. For this purpose, we analyzed whether a dispatch center can be considered a call center. Current call center publications very often model call arrivals as a non-homogeneous Poisson process. This is based on the underlying assumption that each caller decides independently whether to call. In case of an emergency, however, there are often calls from more than one person reporting the same incident and thus, these calls are not independent. Therefore, this paper focuses on the dependency of calls in a fire and rescue dispatch center. We analyzed and evaluated several distributions in this setting. Results are illustrated using real-world data collected from a typical German dispatch center in Cottbus ("Leitstelle Lausitz"). We identified the Pólya distribution as being superior to the Poisson distribution in describing the call arrival rate and the Weibull distribution to be more suitable than the exponential distribution for interarrival times and service times. However, the commonly used distributions offer acceptable approximations. This is important for estimating a sufficient staffing level in practice using, e.g., the Erlang-C model.
International Nuclear Information System (INIS)
Zhao Yongxiang; Gao Qing; Cai Lixun
1999-01-01
A statistical investigation into the fitting of four assumed fatigue-life distributions (three-parameter Weibull, two-parameter Weibull, lognormal and extreme maximum value distributions) to the crack initiation lives of a piping structural welded joint in low cycle fatigue tests at 240 °C is performed by linear regression and least-squares methods. The results reveal that the three-parameter Weibull distribution may give misleading results in fatigue reliability analysis because the shape parameter is often less than 1. This means that the failure rate decreases with fatigue cycling, which is contrary to the general understanding of the behaviour of welded joints. Reliability analyses may also be affected by the slightly non-conservative evaluations in the tail regions of this distribution. The other three distributions give slightly poorer overall fits, but they can be safely assumed in reliability analyses, due mostly to their evaluations in the tail regions and their consistency with the fatigue physics of the structural behaviour of welded joints in the range of engineering practice. In addition, the extreme maximum value distribution is in good agreement with the general physical understanding of the structural behaviour of welded joints.
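The linear-regression fitting used above has a standard form for the two-parameter Weibull case: a least-squares line on the Weibull probability plot. A hedged Python sketch (the life data and the use of Benard's median-rank plotting positions are illustrative assumptions, not the paper's data or exact procedure):

```python
import math

def weibull_lsq(x):
    """Least-squares (probability-plot) estimates of two-parameter Weibull
    shape and scale: regress ln(-ln(1-F_i)) on ln(x_(i)) using Benard's
    median-rank approximation F_i = (i - 0.3) / (n + 0.4)."""
    xs = sorted(x)
    n = len(xs)
    z = [math.log(v) for v in xs]
    y = [math.log(-math.log(1.0 - (i - 0.3) / (n + 0.4)))
         for i in range(1, n + 1)]
    zbar = sum(z) / n
    ybar = sum(y) / n
    slope = (sum((zi - zbar) * (yi - ybar) for zi, yi in zip(z, y))
             / sum((zi - zbar) ** 2 for zi in z))
    intercept = ybar - slope * zbar
    shape = slope                       # slope of the Weibull plot is the shape
    scale = math.exp(-intercept / slope)  # x-intercept recovers the scale
    return shape, scale

# Hypothetical crack-initiation lives (arbitrary units), not the paper's data.
lives = [260, 310, 350, 380, 410, 450, 490, 540, 600, 700]
beta_hat, eta_hat = weibull_lsq(lives)
```

A shape estimate well above 1 on such data corresponds to an increasing failure rate, which is the physically expected behavior the abstract contrasts against.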
Chang, Wen-Ruey; Chang, Chien-Chi; Matz, Simon; Lesch, Mary F
2008-11-01
The required friction coefficient is defined as the minimum friction needed at the shoe and floor interface to support human locomotion. The available friction is the maximum friction coefficient that can be supported without a slip at the shoe and floor interface. A statistical model was recently introduced to estimate the probability of slip and fall incidents by comparing the available friction with the required friction, assuming that both the available and required friction coefficients have stochastic distributions. This paper presents a methodology to investigate the stochastic distributions of the required friction coefficient for level walking. In this experiment, a walkway with a layout of three force plates was specially designed in order to capture a large number of successful strikes without causing fatigue in participants. The required coefficient of friction data of one participant, who repeatedly walked on this walkway under four different walking conditions, are presented to demonstrate the methodology examined in this paper. The results of the Kolmogorov-Smirnov goodness-of-fit test indicated that the required friction coefficient generated from each foot and walking condition by this participant appears to fit the normal, log-normal or Weibull distributions with few exceptions. Among these three distributions, the normal distribution appears to fit all the data generated with this participant. The average number of successful strikes for each walk achieved with three force plates in this experiment was 2.49, ranging from 2.14 to 2.95 for each walking condition. The methodology and layout of the experimental apparatus presented in this paper are suitable for application in a full-scale study.
Imperfect Preventive Maintenance Model Study Based On Reliability Limitation
Directory of Open Access Journals (Sweden)
Zhou Qian
2016-01-01
Effective maintenance is crucial for equipment performance in industry, and imperfect maintenance conforms to the actual failure process. Taking the dynamic preventive maintenance cost into account, the preventive maintenance model was constructed by using an age reduction factor. The model takes the minimization of the repair cost rate as its final target and uses the smallest allowed reliability as the replacement condition. Equipment life was assumed to follow a two-parameter Weibull distribution, since it is one of the most commonly adopted distributions for fitting cumulative failure problems. Eventually, an example verifies the rationality and benefits of the model.
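The replacement condition based on a smallest allowed reliability has a simple closed form under a two-parameter Weibull life distribution: invert R(t) = exp(-(t/eta)**beta) at R = r_min. A minimal sketch, with illustrative parameter values rather than anything from the paper:

```python
import math

def replacement_age(beta, eta, r_min):
    """Largest preventive-maintenance age t such that Weibull reliability
    R(t) = exp(-(t/eta)**beta) still meets the allowed minimum r_min:
        t = eta * (-ln r_min)**(1/beta)
    """
    return eta * (-math.log(r_min)) ** (1.0 / beta)

# Assumed Weibull life parameters (hours); illustrative only.
beta, eta = 2.5, 1200.0
t_pm = replacement_age(beta, eta, r_min=0.90)

# Sanity check: reliability at the computed age equals the limit.
reliability_at_t = math.exp(-(t_pm / eta) ** beta)
```

In a full cost-rate model this age would then feed the repair-cost-rate objective; the closed form above only captures the reliability-limit constraint.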
Directory of Open Access Journals (Sweden)
Begoña Delgado
2005-01-01
Minimal processing implementation greatly depends on a detailed knowledge of the effects of preservation factors and their combinations on spoilage and foodborne pathogenic microorganisms. The effectiveness of mild preservation conditions will become increasingly dependent on a more stochastic approach linking microbial physiological factors with product preservation factors. In this study, the validity of frequency distributions to efficiently describe the inactivation and growth of Bacillus cereus in the presence of natural antimicrobials (essential oils) has been studied. For this purpose, vegetative cells were exposed to 0.6 mM of thymol or cymene, obtaining survival curves that were best described by the Weibull distribution, since a tailing effect was observed. B. cereus was also exposed in a growth medium to a low concentration (0.1 mM) of both antimicrobials, separately or combined, and the lag times obtained were fitted to a normal distribution, which allowed a description of the dispersion of the start of growth. This allowed a more efficient evaluation of the experimental data to establish safe processing conditions according to accurate parameters and their implementation in risk assessment.
Viana, Duarte S; Santamaría, Luis; Figuerola, Jordi
2016-02-01
Propagule retention time is a key factor in determining propagule dispersal distance and the shape of "seed shadows". Propagules dispersed by animal vectors are either ingested and retained in the gut until defecation or attached externally to the body until detachment. Retention time is a continuous variable, but it is commonly measured at discrete time points, according to pre-established sampling time-intervals. Although parametric continuous distributions have been widely fitted to these interval-censored data, the performance of different fitting methods has not been evaluated. To investigate the performance of five different fitting methods, we fitted parametric probability distributions to typical discretized retention-time data with known distribution using as data-points either the lower, mid or upper bounds of sampling intervals, as well as the cumulative distribution of observed values (using either maximum likelihood or non-linear least squares for parameter estimation); then compared the estimated and original distributions to assess the accuracy of each method. We also assessed the robustness of these methods to variations in the sampling procedure (sample size and length of sampling time-intervals). Fittings to the cumulative distribution performed better for all types of parametric distributions (lognormal, gamma and Weibull distributions) and were more robust to variations in sample size and sampling time-intervals. These estimated distributions had negligible deviations of up to 0.045 in cumulative probability of retention times (according to the Kolmogorov-Smirnov statistic) in relation to original distributions from which propagule retention time was simulated, supporting the overall accuracy of this fitting method. In contrast, fitting the sampling-interval bounds resulted in greater deviations that ranged from 0.058 to 0.273 in cumulative probability of retention times, which may introduce considerable biases in parameter estimates. We
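Fitting a parametric distribution to the cumulative distribution of interval-censored retention times by least squares, the best-performing approach above, can be sketched as follows (Python; the sampling times, cumulative fractions, grid resolution and the use of a plain grid search are all illustrative assumptions, not the study's method):

```python
import math

def fit_weibull_cdf(times, cum_probs, k_grid, lam_grid):
    """Fit a Weibull CDF F(t) = 1 - exp(-(t/lam)**k) to cumulative
    proportions observed by each sampling time, minimising the sum of
    squared errors over a simple parameter grid."""
    best = (None, None, float("inf"))
    for k in k_grid:
        for lam in lam_grid:
            sse = sum((p - (1.0 - math.exp(-(t / lam) ** k))) ** 2
                      for t, p in zip(times, cum_probs))
            if sse < best[2]:
                best = (k, lam, sse)
    return best

# Hypothetical retention-time sampling: interval upper bounds (hours) and
# cumulative fractions, generated from an assumed Weibull (k=1.2, lam=5)
# and rounded to two decimals to mimic observed data.
times = [1, 2, 4, 8, 12, 24]
cum_probs = [0.14, 0.28, 0.53, 0.83, 0.94, 1.00]
k_grid = [0.5 + 0.05 * i for i in range(50)]    # 0.50 .. 2.95
lam_grid = [1.0 + 0.25 * i for i in range(60)]  # 1.00 .. 15.75
k_hat, lam_hat, sse = fit_weibull_cdf(times, cum_probs, k_grid, lam_grid)
```

A grid search is a deliberately crude stand-in for non-linear least squares, but it illustrates why fitting the cumulative curve (rather than interval bounds directly) can recover the generating parameters with small error.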
Evaluation of the climate change impact on wind resources in Taiwan Strait
International Nuclear Information System (INIS)
Chang, Tsang-Jung; Chen, Chun-Lung; Tu, Yi-Long; Yeh, Hung-Te; Wu, Yu-Ting
2015-01-01
Highlights: • We propose a new statistical downscaling framework to evaluate the climate change impact on wind resources in Taiwan Strait. • The statistical model relates Weibull distribution parameters to output of a GCM model and regression coefficients. • Validation of the simulated wind speed distribution presents an acceptable agreement with meteorological data. • Three chosen GCMs show the same tendency that the eastern half of Taiwan Strait stores higher wind resources. - Abstract: A new statistical downscaling framework is proposed to evaluate the climate change impact on wind resources in Taiwan Strait. In this framework, a two-parameter Weibull distribution function is used to estimate the wind energy density distribution in the strait. An empirically statistical downscaling model that relates the Weibull parameters to output of a General Circulation Model (GCM) and regression coefficients is adopted. The regression coefficients are calculated using wind speed results obtained from a past climate (1981–2000) simulation reconstructed by a Weather Research and Forecasting (WRF) model. These WRF-reconstructed wind speed results are validated with data collected at a weather station on an islet inside the strait. The comparison shows that the probability distributions of the monthly wind speeds obtained from WRF-reconstructed and measured wind speed data are in acceptable agreement, with small discrepancies of 10.3% and 7.9% for the shape and scale parameters of the Weibull distribution, respectively. The statistical downscaling framework with output from three chosen GCMs (i.e., ECHAM5, CM2.1 and CGCM2.3.2) is applied to evaluate the wind energy density distribution in Taiwan Strait for three future climate periods of 2011–2040, 2041–2070, and 2071–2100. The results show that the wind energy density distributions in the future climate periods are higher in the eastern half of Taiwan Strait, but reduce slightly by 3% compared with that in the
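The wind statistics implied by a two-parameter Weibull fit follow from standard closed forms for the moments. A minimal sketch, assuming example parameter values rather than the study's estimates:

```python
import math

def weibull_wind_stats(k, c, rho=1.225):
    """Mean wind speed (m/s) and mean wind power density (W/m^2) implied
    by a two-parameter Weibull wind-speed distribution with shape k and
    scale c (m/s), at air density rho (kg/m^3):
        v_mean = c * Gamma(1 + 1/k)
        P_mean = 0.5 * rho * c**3 * Gamma(1 + 3/k)
    """
    v_mean = c * math.gamma(1.0 + 1.0 / k)
    p_mean = 0.5 * rho * c ** 3 * math.gamma(1.0 + 3.0 / k)
    return v_mean, p_mean

# Assumed coastal-site parameters for illustration (not the paper's values).
v_mean, p_mean = weibull_wind_stats(k=2.0, c=9.0)
```

Because the power density depends on the third moment, small discrepancies in the fitted shape and scale parameters (such as the 10.3% and 7.9% reported above) translate into noticeably larger energy-density errors, which is why validation of both parameters matters.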
Statistical Analysis of Data for Timber Strengths
DEFF Research Database (Denmark)
Sørensen, John Dalsgaard; Hoffmeyer, P.
Statistical analyses are performed for material strength parameters from approximately 6700 specimens of structural timber. Non-parametric statistical analyses and fits to the following distribution types have been investigated: Normal, Lognormal, 2-parameter Weibull and 3-parameter Weibull...
Determination of material distribution in heading process of small bimetallic bar
Presz, Wojciech; Cacko, Robert
2018-05-01
The electrical connectors mostly have silver contacts joined by riveting. In order to reduce costs, the core of the contact rivet can be replaced with a cheaper material, e.g. copper. There is a wide range of commercially available bimetallic (silver-copper) rivets on the market for the production of contacts. Following that, new conditions in the riveting process are created because a bimetallic object is riveted. In the analyzed example, it is a small-sized object, which can be placed at the border of microforming. Based on FEM modeling of the loading process of bimetallic rivets with different material distributions, the desired distribution was chosen and the choice was justified. Possible material distributions were parameterized with two parameters referring to desirable distribution characteristics. A parameter, the Coefficient of Mutual Interactions of Plastic Deformations, and the method of its determination have been proposed. The parameter is determined based on two-parameter stress-strain curves and is a function of these parameters and the range of equivalent strains occurring in the analyzed process. The proposed method was used for the upsetting process of the bimetallic head of an electrical contact. A nomogram was established to predict the distribution of materials in the head of the rivet and the appropriate selection of a pair of materials to achieve the desired distribution.
Modelling of extreme minimum rainfall using generalised extreme value distribution for Zimbabwe
Directory of Open Access Journals (Sweden)
Delson Chikobvu
2015-09-01
We modelled the mean annual rainfall for data recorded in Zimbabwe from 1901 to 2009. Extreme value theory was used to estimate the probabilities of meteorological droughts. Droughts can be viewed as extreme events which go beyond and/or below normal rainfall occurrences, such as exceptionally low mean annual rainfall. The duality between the distribution of the minima and maxima was exploited and used to fit the generalised extreme value distribution (GEVD) to the data and hence find probabilities of extreme low levels of mean annual rainfall. The augmented Dickey-Fuller test confirmed that the rainfall data were stationary, while the normal quantile-quantile plot indicated that the rainfall data deviated from the normality assumption at both ends of the tails of the distribution. The maximum likelihood estimation method and the Bayesian approach were used to find the parameters of the GEVD. The Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit tests showed that the Weibull class of distributions was a good fit to the minima of mean annual rainfall using the maximum likelihood estimation method. The mean return period estimate of a meteorological drought using the threshold value of mean annual rainfall of 473 mm was 8 years. This implies that if in a given year there is a meteorological drought, then another drought of the same intensity or greater is expected after 8 years. It is expected that the use of Bayesian inference may better quantify the level of uncertainty associated with the GEVD parameter estimates than the maximum likelihood estimation method. The Markov chain Monte Carlo algorithm for the GEVD was applied to construct the model parameter estimates using the Bayesian approach. These findings are significant because results based on non-informative priors (Bayesian method) and the maximum likelihood method are expected to be similar.
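The return-period arithmetic used above is simply the reciprocal of the probability of falling at or below the drought threshold. A hedged sketch, assuming a Weibull-type CDF for the lower tail with made-up parameters (not the fitted Zimbabwe values):

```python
import math

def drought_return_period(threshold, shape, scale):
    """Mean return period (years) of annual rainfall at or below a drought
    threshold, when the lower-tail behaviour is modelled by a Weibull CDF
    F(x) = 1 - exp(-(x/scale)**shape). Return period T = 1 / F(threshold).

    The shape and scale values used below are assumptions for illustration.
    """
    p = 1.0 - math.exp(-(threshold / scale) ** shape)  # P(rainfall <= threshold)
    return 1.0 / p

T = drought_return_period(threshold=473.0, shape=4.0, scale=800.0)
```

With these assumed parameters the threshold of 473 mm yields a return period of roughly 8 to 9 years, the same order as the estimate reported above; the point of the sketch is the T = 1/p relationship, not the specific numbers.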
Energy Technology Data Exchange (ETDEWEB)
Carroll, Mark C., E-mail: mark.carroll@inl.gov [Idaho National Laboratory, PO Box 1625, Idaho Falls, ID 83415-2213 (United States); Windes, William E.; Rohrbaugh, David T. [Idaho National Laboratory, PO Box 1625, Idaho Falls, ID 83415-2213 (United States); Strizak, Joseph P.; Burchell, Timothy D. [Oak Ridge National Laboratory, PO Box 2008, Oak Ridge, TN 37831-6088 (United States)
2016-10-15
Highlights: • An effort is underway to fully quantify the properties of nuclear-grade graphites. • Physical and mechanical properties of graphite are best characterized by distributions. • The Weibull distribution is most representative of graphite based on goodness-of-fit. • Fine-grained isomolded grades exhibit higher Weibull modulus values, indicative of more homogeneous properties. - Abstract: The full characterization of the physical and mechanical properties of candidate nuclear-grade graphites is highly dependent upon an understanding of the distribution of values that are inherent to graphite. Not only do the material properties of graphites vary considerably between grades owing to the raw materials sources, filler particle type and size, methods of compaction, and production process parameters, but variability is observed between billets of the same grade from a single batch and even across spatial positions within a single billet. Properly enveloping the expected properties of interest requires both a substantial amount of data to statistically capture this variability and a representative distribution capable of accurately describing the range of values. A two-parameter Weibull distribution is confirmed to be representative of the distribution of physical (density, modulus) and mechanical (compressive, flexure, and tensile strength) values in five different nuclear-grades of graphite. The fine-grained isomolded grades tend toward higher Weibull modulus and characteristic strength values, while the extruded grade being examined exhibits relatively large distributions in property values. With the number of candidate graphite specimens that can undergo full irradiation exposure and subsequent testing having limited feasibility with regard to economics and timely evaluations, a proper capture of the raw material variability in an unirradiated state can provide crucial supplementary resolution to the limited amount of available data on irradiated
Centrality dependence of baryon and meson momentum distributions in proton-nucleus collisions
International Nuclear Information System (INIS)
Hwa, Rudolph C.; Yang, C.B.
2002-01-01
The proton and neutron inclusive distributions in the projectile fragmentation region of pA collisions are studied in the valon model. Momentum degradation and flavor changes due to the nuclear medium are described at the valon level using two parameters. Particle production is treated by means of the recombination subprocess. The centrality dependences of the net proton and neutron spectra of the NA49 data are satisfactorily reproduced. The effective degradation length is determined to be 17 fm. Pion inclusive distributions can be calculated without any adjustable parameters
Probability distribution relationships
Directory of Open Access Journals (Sweden)
Yousry Abdelkader
2013-05-01
In this paper, we are interested in showing the most famous distributions and their relations to other distributions in collected diagrams. Four diagrams are sketched as networks. The first one is concerned with the continuous distributions and their relations. The second one presents the discrete distributions. The third diagram depicts the famous limiting distributions. Finally, the Balakrishnan skew-normal density and its relationship with the other distributions are shown in the fourth diagram.
Reactor power distribution monitor
International Nuclear Information System (INIS)
Hoizumi, Atsushi.
1986-01-01
Purpose: To grasp the margin for the limit value of the power distribution peaking factor inside the reactor under operation by using the reactor power distribution monitor. Constitution: The monitor is composed of a 'constant' file (storing in-reactor power distributions obtained from analysis), TIP and thermocouples, a lateral power distribution calibrating apparatus, an axial power distribution synthesizer and a peaking factor synthesizer. The lateral power distribution calibrating apparatus makes calibrations by comparing the power distribution obtained from the thermocouples to the power distribution obtained from the TIP, and then provides the lateral power distribution peaking factors. The axial power distribution synthesizer provides the axial power distribution peaking factors in accordance with the signals from the out-pile neutron flux detector. These axial and lateral power peaking factors are synthesized with high precision in the three-dimensional format and can be monitored at any time. (Kamimura, M.)
On bivariate geometric distribution
Directory of Open Access Journals (Sweden)
K. Jayakumar
2013-05-01
Characterizations of bivariate geometric distribution using univariate and bivariate geometric compounding are obtained. Autoregressive models with marginals as bivariate geometric distribution are developed. Various bivariate geometric distributions analogous to important bivariate exponential distributions, such as Marshall-Olkin's bivariate exponential, Downton's bivariate exponential and Hawkes' bivariate exponential, are presented.
Extended Poisson Exponential Distribution
Directory of Open Access Journals (Sweden)
Anum Fatima
2015-09-01
A new mixture of the Modified Exponential (ME) and Poisson distributions has been introduced in this paper. Taking the maximum of Modified Exponential random variables when the sample size follows a zero-truncated Poisson distribution, we have derived the new distribution, named the Extended Poisson Exponential distribution. This distribution possesses increasing and decreasing failure rates. The Poisson-Exponential, Modified Exponential and Exponential distributions are special cases of this distribution. We have also investigated some mathematical properties of the distribution along with information entropies and order statistics of the distribution. The estimation of parameters has been obtained using the maximum likelihood estimation procedure. Finally we have illustrated a real data application of our distribution.
Distributed Data Management and Distributed File Systems
Girone, Maria
2015-01-01
The LHC program has been successful in part due to the globally distributed computing resources used for collecting, serving, processing, and analyzing the large LHC datasets. The introduction of distributed computing early in the LHC program spawned the development of new technologies and techniques to synchronize information and data between physically separated computing centers. Two of the most challenging services are the distributed file systems and the distributed data management systems. In this paper I will discuss how we have evolved from local site services to more globally independent services in the areas of distributed file systems and data management and how these capabilities may continue to evolve into the future. I will address the design choices, the motivations, and the future evolution of the computing systems used for High Energy Physics.
ZERODUR: deterministic approach for strength design
Hartmann, Peter
2012-12-01
There is an increasing request for zero-expansion glass ceramic ZERODUR substrates capable of enduring higher operational static loads or accelerations. The integrity of structures such as optical or mechanical elements for satellites surviving rocket launches, filigree lightweight mirrors, wobbling mirrors, and reticle and wafer stages in microlithography must be guaranteed with low failure probability. Their design requires statistically relevant strength data. The traditional approach using the statistical two-parameter Weibull distribution suffered from two problems. The data sets were too small to obtain distribution parameters with sufficient accuracy and also too small to decide on the validity of the model. This holds especially for the low failure probability levels that are required for reliable applications. Extrapolation to 0.1% failure probability and below led to design strengths so low that higher load applications seemed not to be feasible. New data have been collected with numbers per set large enough to enable tests of the applicability of the three-parameter Weibull distribution. This distribution proved to provide a much better fit to the data. Moreover, it delivers a lower threshold value, i.e. a minimum value for breakage stress, allowing statistical uncertainty to be removed by introducing a deterministic method to calculate design strength. Considerations taken from the theory of fracture mechanics, which have proven reliable in proof-test qualifications of delicate structures made from brittle materials, enable including fatigue due to stress corrosion in a straightforward way. With the formulae derived, either lifetime can be calculated from given stress or allowable stress from minimum required lifetime. The data, distributions, and design strength calculations for several practically relevant surface conditions of ZERODUR are given. The values obtained are significantly higher than those resulting from the two-parameter Weibull approach.
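The deterministic role of the three-parameter Weibull threshold can be made concrete: below the threshold stress the predicted failure probability is exactly zero, which is what licenses a deterministic design strength. A minimal sketch with assumed parameter values (not ZERODUR data):

```python
import math

def failure_probability(stress, beta, eta, gamma):
    """Three-parameter Weibull failure probability at a given stress:
        F(s) = 1 - exp(-((s - gamma)/eta)**beta)  for s > gamma,
        F(s) = 0                                  for s <= gamma.
    The location (threshold) parameter gamma acts as a deterministic
    minimum breakage stress."""
    if stress <= gamma:
        return 0.0
    return 1.0 - math.exp(-((stress - gamma) / eta) ** beta)

# Assumed parameters in MPa-like units, for illustration only.
p_below = failure_probability(45.0, beta=4.0, eta=30.0, gamma=50.0)
p_above = failure_probability(70.0, beta=4.0, eta=30.0, gamma=50.0)
```

Any load held at or below the threshold gives zero predicted failure probability, while loads above it follow the usual two-parameter Weibull form shifted by the threshold.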
TAIL ASYMPTOTICS OF LIGHT-TAILED WEIBULL-LIKE SUMS
DEFF Research Database (Denmark)
Asmussen, Soren; Hashorva, Enkelejd; Laub, Patrick J.
2017-01-01
We consider sums of n i.i.d. random variables with tails close to exp(-x^beta) for some beta > 1. Asymptotics developed by Rootzen (1987) and Balkema, Kluppelberg, and Resnick (1993) are discussed from the point of view of tails rather than of densities, using a somewhat different angle...
Performance Analysis of Methods for Estimating Weibull Parameters ...
African Journals Online (AJOL)
The performance analysis revealed that the MLM was the most accurate model followed by the EPF and the GM. Furthermore, the comparison between the wind speed standard deviation predicted by the proposed models and the measured data showed that the MLM has a smaller relative error of -3.33% on average ...
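One of the simplest estimators in this family is the empirical (standard-deviation) method for Weibull wind parameters. The following hedged Python sketch implements it on made-up wind speeds; it is not claimed to be the MLM, EPF or GM variants compared above:

```python
import math

def weibull_empirical(speeds):
    """Empirical (standard-deviation) method for Weibull wind parameters:
        k = (sigma / v_mean) ** -1.086
        c = v_mean / Gamma(1 + 1/k)
    using the sample mean and sample standard deviation of wind speeds."""
    n = len(speeds)
    v_mean = sum(speeds) / n
    var = sum((v - v_mean) ** 2 for v in speeds) / (n - 1)
    sigma = math.sqrt(var)
    k = (sigma / v_mean) ** -1.086
    c = v_mean / math.gamma(1.0 + 1.0 / k)
    return k, c

# Made-up hourly mean wind speeds (m/s), for illustration only.
speeds = [4.2, 5.1, 6.3, 7.0, 7.8, 8.4, 9.1, 9.9, 10.6, 12.0]
k_hat, c_hat = weibull_empirical(speeds)
```

Comparing such closed-form estimates against maximum-likelihood fits on the same data is essentially the exercise the performance analysis above carries out.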
Unifying distribution functions: some lesser known distributions.
Moya-Cessa, J R; Moya-Cessa, H; Berriel-Valdos, L R; Aguilar-Loreto, O; Barberis-Blostein, P
2008-08-01
We show that there is a way to unify distribution functions that describe simultaneously a classical signal in space and (spatial) frequency and position and momentum for a quantum system. Probably the most well known of them is the Wigner distribution function. We show how to unify functions of the Cohen class, Rihaczek's complex energy function, and Husimi and Glauber-Sudarshan distribution functions. We do this by showing how they may be obtained from ordered forms of creation and annihilation operators and by obtaining them in terms of expectation values in different eigenbases.
Cumulative Poisson Distribution Program
Bowerman, Paul N.; Scheuer, Ernest M.; Nolty, Robert
1990-01-01
Overflow and underflow in sums prevented. Cumulative Poisson Distribution Program, CUMPOIS, one of two computer programs that make calculations involving cumulative Poisson distributions. Both programs, CUMPOIS (NPO-17714) and NEWTPOIS (NPO-17715), used independently of one another. CUMPOIS determines cumulative Poisson distribution, used to evaluate cumulative distribution function (cdf) for gamma distributions with integer shape parameters and cdf for X (sup2) distributions with even degrees of freedom. Used by statisticians and others concerned with probabilities of independent events occurring over specific units of time, area, or volume. Written in C.
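The cumulative Poisson calculation that CUMPOIS performs can be sketched in a few lines; building each term recursively from the previous one avoids evaluating large factorials directly (a Python illustration, not the original C program):

```python
import math

def cumulative_poisson(k, lam):
    """P(X <= k) for X ~ Poisson(lam), with each probability term built
    recursively (term_i = term_{i-1} * lam / i) to avoid factorial
    overflow in the direct formula e^-lam * lam^i / i!."""
    term = math.exp(-lam)  # P(X = 0)
    total = term
    for i in range(1, k + 1):
        term *= lam / i
        total += term
    return total

p = cumulative_poisson(5, 3.0)
```

As the abstract notes, the same value doubles as a gamma-distribution probability with integer shape parameter (and as a chi-squared cdf with even degrees of freedom), so one routine serves all three uses.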
Predictable return distributions
DEFF Research Database (Denmark)
Pedersen, Thomas Quistgaard
trace out the entire distribution. A univariate quantile regression model is used to examine stock and bond return distributions individually, while a multivariate model is used to capture their joint distribution. An empirical analysis on US data shows that certain parts of the return distributions......-of-sample analyses show that the relative accuracy of the state variables in predicting future returns varies across the distribution. A portfolio study shows that an investor with power utility can obtain economic gains by applying the empirical return distribution in portfolio decisions instead of imposing...
Drinking Water Distribution Systems
Learn about an overview of drinking water distribution systems, the factors that degrade water quality in the distribution system, assessments of risk, future research about these risks, and how to reduce cross-connection control risk.
Distributed multiscale computing
Borgdorff, J.
2014-01-01
Multiscale models combine knowledge, data, and hypotheses from different scales. Simulating a multiscale model often requires extensive computation. This thesis evaluates distributing these computations, an approach termed distributed multiscale computing (DMC). First, the process of multiscale
TRANSMUTED EXPONENTIATED EXPONENTIAL DISTRIBUTION
MEROVCI, FATON
2013-01-01
In this article, we generalize the exponentiated exponential distribution using the quadratic rank transmutation map studied by Shaw et al. [6] to develop a transmuted exponentiated exponential distribution. The properties of this distribution are derived and the estimation of the model parameters is discussed. An application to a real data set is finally presented for illustration.
Leadership for Distributed Teams
De Rooij, J.P.G.
2009-01-01
The aim of this dissertation was to study the little examined, yet important issue of leadership for distributed teams. Distributed teams are defined as: “teams of which members are geographically distributed and are therefore working predominantly via mediated communication means on an
Ahsanullah, Mohammad
2016-01-01
The aim of the book is to give a thorough account of the basic theory of extreme value distributions. The book covers a wide range of materials available to date and presents the central ideas and results of extreme value distributions. It will be useful to applied statisticians as well as statisticians interested in working in the area of extreme value distributions. The monograph gives a self-contained treatment of the theory and applications of extreme value distributions.
DEFF Research Database (Denmark)
Jensen, Lotte Groth; Bossen, Claus
2016-01-01
different socio-technical systems (paper-based and electronic patient records). Drawing on the theory of distributed cognition and narrative theory, primarily inspired by the work done within health care by Cheryl Mattingly, we propose that the creation of overview may be conceptualised as ‘distributed plot-making’. Distributed cognition focuses on the role of artefacts, humans and their interaction in information processing, while narrative theory focuses on how humans create narratives through the plot construction. Hence, the concept of distributed plot-making highlights the distribution of information processing...
Fitting and Analyzing Randomly Censored Geometric Extreme Exponential Distribution
Directory of Open Access Journals (Sweden)
Muhammad Yameen Danish
2016-06-01
The paper presents the Bayesian analysis of the two-parameter geometric extreme exponential distribution with randomly censored data. Since the continuous conjugate prior of the scale and shape parameters of the model does not exist, it is assumed while computing the Bayes estimates that the scale and shape parameters have independent gamma priors. It is seen that closed-form expressions for the Bayes estimators are not possible; we suggest Lindley's approximation to obtain the Bayes estimates. However, since Bayesian credible intervals cannot be constructed with this method, we propose Gibbs sampling to obtain the Bayes estimates and also to construct the Bayesian credible intervals. A Monte Carlo simulation study is carried out to observe the behavior of the Bayes estimators and to compare them with the maximum likelihood estimators. One real data analysis is performed for illustration.
Hierarchical species distribution models
Hefley, Trevor J.; Hooten, Mevin B.
2016-01-01
Determining the distribution pattern of a species is important to increase scientific knowledge, inform management decisions, and conserve biodiversity. To infer spatial and temporal patterns, species distribution models have been developed for use with many sampling designs and types of data. Recently, it has been shown that count, presence-absence, and presence-only data can be conceptualized as arising from a point process distribution. Therefore, it is important to understand properties of the point process distribution. We examine how the hierarchical species distribution modeling framework has been used to incorporate a wide array of regression and theory-based components while accounting for the data collection process and making use of auxiliary information. The hierarchical modeling framework allows us to demonstrate how several commonly used species distribution models can be derived from the point process distribution, highlight areas of potential overlap between different models, and suggest areas where further research is needed.
Directory of Open Access Journals (Sweden)
Jacques C. Tardif
2016-09-01
Full Text Available In central Canada, long fire history reconstructions are rare. In a context where both anthropogenic and climate influences on the fire regime have changed, Parks Canada has a mandate to maintain ecological integrity. Here we present a fire history derived from fire-scarred jack pine (Pinus banksiana Lamb.) trees growing at their southern distribution limit in Riding Mountain National Park (RMNP). In the Lake Katherine Fire Management Unit (LKFMU), a subregion within the park, fire history was reconstructed from archival records, tree-ring records, and charcoal in lake sediment. From about 1450 to 1850 common era (CE), the fire return intervals varied from 37 to 125 years, according to models. During the period 1864–1930 the study area burned frequently (Weibull Mean Fire Intervals between 2.66 and 5.62 years); this period coincided with the end of First Nations occupation and the start of European settlement. Major recruitment pulses were associated with the stand-replacing 1864 and 1894 fires. This period nevertheless corresponded to a reduction in charcoal accumulation. The current fire-free period in LKFMU (1930–today) coincides with RMNP establishment, the exclusion of First Nations land use, and increased fire suppression. Charcoal accumulation further decreased during this period. In the absence of fire, jack pine exclusion in LKFMU is foreseeable, and the use of prescribed burning is advocated to conserve this protected jack pine ecosystem at the southern margins of its range and in the face of potential climate change.
Janković, Bojan
2011-10-01
The non-isothermal pyrolysis kinetics of Acetocell (organosolv) and Lignoboost® (kraft) lignins in an inert atmosphere have been studied by thermogravimetric analysis. Using isoconversional analysis, it was concluded that the apparent activation energy for all lignins strongly depends on conversion, showing that the pyrolysis of lignins is not a single chemical process. It was identified that the pyrolysis of Acetocell and Lignoboost® lignin takes place over three reaction steps, which was confirmed by the appearance of the corresponding isokinetic relationships (IKR). It was found that the major pyrolysis stage of both lignins is characterized by stilbene pyrolysis reactions, which are subsequently followed by decomposition reactions of products derived from the stilbene pyrolytic process. It was concluded that the non-isothermal pyrolysis of Acetocell and Lignoboost® lignins can be best described by n-th order (n>1) reaction kinetics, using the Weibull mixture model (as a distributed reactivity model) with alternating shape parameters. Copyright © 2011 Elsevier Ltd. All rights reserved.
Kilany, N M
2016-01-01
The Lomax distribution (Pareto Type-II) is widely applicable in reliability and life-testing problems in engineering, as well as in survival analysis, as an alternative distribution. In this paper, a Weighted Lomax distribution is proposed and studied. The density function and its behavior, moments, hazard and survival functions, mean residual life and reversed failure rate, extreme value distributions, and order statistics are derived and studied. The parameters of this distribution are estimated by the method of moments and by maximum likelihood, and the observed information matrix is derived. Moreover, simulation schemes are derived. Finally, an application of the model to a real data set is presented and compared with some other well-known distributions.
Are Parton Distributions Positive?
Forte, Stefano; Ridolfi, Giovanni; Altarelli, Guido
1999-01-01
We show that the naive positivity conditions on polarized parton distributions which follow from their probabilistic interpretation in the naive parton model are reproduced in perturbative QCD at the leading log level if the quark and gluon distribution are defined in terms of physical processes. We show how these conditions are modified at the next-to-leading level, and discuss their phenomenological implications, in particular in view of the determination of the polarized gluon distribution
dftools: Distribution function fitting
Obreschkow, Danail
2018-05-01
dftools, written in R, finds the most likely P parameters of a D-dimensional distribution function (DF) generating N objects, where each object is specified by D observables with measurement uncertainties. For instance, if the objects are galaxies, it can fit a mass function (D=1), a mass-size distribution (D=2) or the mass-spin-morphology distribution (D=3). Unlike most common fitting approaches, this method accurately accounts for measurement uncertainties and complex selection functions.
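dftools itself is an R package; as a language-neutral sketch of its central idea, fitting DF parameters from a likelihood that convolves the DF with each object's measurement uncertainty, the following Python toy fits a one-dimensional Gaussian DF by grid search. All names and numbers are invented:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 1-D analogue of DF fitting: each object has a true value drawn
# from a Gaussian DF (mu, sigma) plus its own Gaussian measurement error e_i.
# Convolving the DF with the error kernel keeps the likelihood analytic.
mu_true, sig_true, n = 10.0, 0.5, 500
err = rng.uniform(0.1, 0.3, size=n)          # per-object uncertainties
obs = rng.normal(mu_true, sig_true, size=n) + rng.normal(0.0, err)

def neg_log_like(mu, sig):
    var = sig ** 2 + err ** 2                # DF width plus error width
    return 0.5 * np.sum(np.log(2 * np.pi * var) + (obs - mu) ** 2 / var)

# A coarse grid search stands in for a proper optimizer.
mus = np.linspace(9.0, 11.0, 201)
sigs = np.linspace(0.1, 1.5, 141)
nll = np.array([[neg_log_like(m, s) for s in sigs] for m in mus])
i, j = np.unravel_index(np.argmin(nll), nll.shape)
mu_hat, sig_hat = mus[i], sigs[j]
print(mu_hat, sig_hat)
```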
Bringi, V. N.; Chandrasekar, V.; Hubbert, J.; Gorgucci, E.; Randeu, W. L.; Schoenhuber, M.
2003-01-01
The application of polarimetric radar data to the retrieval of raindrop size distribution parameters and rain rate in samples of convective and stratiform rain types is presented. Data from the Colorado State University (CSU) CHILL, NCAR S-band polarimetric (S-Pol), and NASA Kwajalein radars are analyzed for the statistics and functional relation of these parameters with rain rate. Surface drop size distribution measurements using two different disdrometers (2D video and RD-69) from a number of climatic regimes are analyzed and compared with the radar retrievals in a statistical and functional approach. The composite statistics based on disdrometer and radar retrievals suggest that, on average, the two parameters (generalized intercept and median volume diameter) for stratiform rain distributions lie on a straight line with negative slope, which appears to be consistent with variations in the microphysics of stratiform precipitation (melting of larger, dry snow particles versus smaller, rimed ice particles). In convective rain, 'maritime-like' and 'continental-like' clusters could be identified in the same two-parameter space that are consistent with the different multiplicative coefficients in the Z = aR^1.5 relations quoted in the literature for maritime and continental regimes.
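The Z = aR^1.5 relations mentioned at the end differ between maritime and continental regimes only through the coefficient a. A minimal sketch, with illustrative coefficient values rather than the paper's fits:

```python
import math

# Reflectivity-rain-rate power law Z = a * R**1.5, as quoted in the abstract.
# The maritime/continental coefficients below are assumed for illustration.
def reflectivity_dbz(rain_rate_mm_h, a):
    z = a * rain_rate_mm_h ** 1.5      # Z in mm^6 m^-3
    return 10.0 * math.log10(z)        # express as dBZ

for regime, a in [("maritime (assumed a=140)", 140.0),
                  ("continental (assumed a=300)", 300.0)]:
    print(regime, round(reflectivity_dbz(30.0, a), 1))
```

At a fixed rain rate, the larger continental coefficient yields a higher reflectivity in dBZ.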
Mahmoud, Hosam M
2011-01-01
A cutting-edge look at the emerging distributional theory of sorting Research on distributions associated with sorting algorithms has grown dramatically over the last few decades, spawning many exact and limiting distributions of complexity measures for many sorting algorithms. Yet much of this information has been scattered in disparate and highly specialized sources throughout the literature. In Sorting: A Distribution Theory, leading authority Hosam Mahmoud compiles, consolidates, and clarifies the large volume of available research, providing a much-needed, comprehensive treatment of the
Sallam, A A
2010-01-01
"Electricity distribution is the penultimate stage in the delivery of electricity to end users. The only book that deals with the key topics of interest to distribution system engineers, Electric Distribution Systems presents a comprehensive treatment of the subject with an emphasis on both the practical and academic points of view. Reviewing traditional and cutting-edge topics, the text is useful to practicing engineers working with utility companies and industry, undergraduate and graduate students, and faculty members who wish to increase their skills in distribution system automation and monitoring."--
Cooling water distribution system
Orr, Richard
1994-01-01
A passive containment cooling system for a nuclear reactor containment vessel. Disclosed is a cooling water distribution system for introducing cooling water by gravity uniformly over the outer surface of a steel containment vessel using an interconnected series of radial guide elements, a plurality of circumferential collector elements and collector boxes to collect and feed the cooling water into distribution channels extending along the curved surface of the steel containment vessel. The cooling water is uniformly distributed over the curved surface by a plurality of weirs in the distribution channels.
Distributed Structure Searchable Toxicity
U.S. Environmental Protection Agency — The Distributed Structure Searchable Toxicity (DSSTox) online resource provides high quality chemical structures and annotations in association with toxicity data....
Distributed Energy Technology Laboratory
Federal Laboratory Consortium — The Distributed Energy Technologies Laboratory (DETL) is an extension of the power electronics testing capabilities of the Photovoltaic System Evaluation Laboratory...
Huang, Ding-wei
2013-03-01
We present a statistical model for the distribution of Chinese names. Both family names and given names are studied on the same basis. With naive expectation, the distribution of family names can be very different from that of given names. One is affected mostly by genealogy, while the other can be dominated by cultural effects. However, we find that both distributions can be well described by the same model. Various scaling behaviors can be understood as a result of stochastic processes. The exponents of different power-law distributions are controlled by a single parameter. We also comment on the significance of full-name repetition in Chinese population.
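The single exponent-controlling parameter of such scaling models can be illustrated with a quick simulation. The sketch below (exponent and sample size chosen arbitrarily, unrelated to the paper's data) draws from a continuous power law and recovers its exponent by maximum likelihood:

```python
import numpy as np

rng = np.random.default_rng(2)

# Draw from a continuous power law p(x) ~ x**(-alpha) for x >= xmin by
# inverse-transform sampling, then recover the exponent with the standard
# maximum-likelihood estimator. alpha and the sample size are arbitrary.
alpha, xmin = 2.5, 1.0
u = rng.random(100_000)
x = xmin * (1.0 - u) ** (-1.0 / (alpha - 1.0))
alpha_hat = 1.0 + x.size / np.sum(np.log(x / xmin))
print(alpha_hat)
```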
Statistical distribution sampling
Johnson, E. S.
1975-01-01
Determining the distribution of statistics by sampling was investigated. Characteristic functions, the quadratic regression problem, and the differential equations for the characteristic functions are analyzed.
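The approach can be sketched in a few lines; the statistic, sample size, and replication count below are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(3)

# Determine the distribution of a statistic by sampling: draw many samples,
# compute the statistic on each, and inspect the empirical distribution.
# Here the statistic is the mean of 30 exponential(1) variables; the CLT
# predicts approximate normality with mean 1 and std 1/sqrt(30).
stats = rng.exponential(1.0, size=(20_000, 30)).mean(axis=1)
print(stats.mean(), stats.std())
```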
Distributed security in closed distributed systems
DEFF Research Database (Denmark)
Hernandez, Alejandro Mario
The goal of the present thesis is to discuss, argue and conclude about ways to provide security to the information travelling around computer systems consisting of several known locations. When developing software systems, security of the information managed by these plays an important role… …properties. This is also restricted to distributed systems in which the set of locations is known a priori. All this follows techniques borrowed from both the model checking and the static analysis communities. In the end, we reach a step towards solving the problem of enforcing security in distributed systems. We achieve the goal of showing how this can be done, though we restrict ourselves to closed systems and with a limited set of enforceable security policies. In this setting, our approach proves to be efficient. Finally, we achieve all this by bringing together several fields of Computer Science…
Distributed intelligence in CAMAC
International Nuclear Information System (INIS)
Kunz, P.F.
1977-01-01
The CAMAC digital interface standard has served us well since 1969. During this time there have been enormous advances in digital electronics. In particular, low-cost microprocessors now make it feasible to consider the use of distributed intelligence even in simple data acquisition systems. This paper describes a simple extension of the CAMAC standard which allows distributed intelligence at the crate level.
Distributed intelligence in CAMAC
International Nuclear Information System (INIS)
Kunz, P.F.
1977-01-01
A simple extension of the CAMAC standard is described which allows distributed intelligence at the crate level. By distributed intelligence is meant that there is more than one source of control in a system. This standard is just now emerging from the NIM Dataway Working Group and its European counterpart. 1 figure
Bastiaans, M.J.; Testorf, M.; Hennelly, B.; Ojeda-Castañeda, J.
2009-01-01
In 1932 Wigner introduced a distribution function in mechanics that permitted a description of mechanical phenomena in a phase space. Such a Wigner distribution was introduced in optics by Dolin and Walther in the sixties, to relate partial coherence to radiometry. A few years later, the Wigner
Cache Oblivious Distribution Sweeping
DEFF Research Database (Denmark)
Brodal, G.S.; Fagerberg, R.
2002-01-01
We adapt the distribution sweeping method to the cache oblivious model. Distribution sweeping is the name used for a general approach for divide-and-conquer algorithms where the combination of solved subproblems can be viewed as a merging process of streams. We demonstrate by a series of algorithms...
Distributed Energy Implementation Options
Energy Technology Data Exchange (ETDEWEB)
Shah, Chandralata N [National Renewable Energy Laboratory (NREL), Golden, CO (United States)
2017-09-13
This presentation covers the options for implementing distributed energy projects. It distinguishes between options available for distributed energy that is government owned versus privately owned, with a focus on the privately owned options including Energy Savings Performance Contract Energy Sales Agreements (ESPC ESAs). The presentation covers the new ESPC ESA Toolkit and other Federal Energy Management Program resources.
Mullender, Sape J.
1987-01-01
In the past five years, distributed operating systems research has gone through a consolidation phase. On a large number of design issues there is now considerable consensus between different research groups. In this paper, an overview of recent research in distributed systems is given. In turn, the
Intelligent distribution network design
Provoost, F.
2009-01-01
Distribution networks (medium voltage and low voltage) are subject to changes caused by re-regulation of the energy supply, economical and environmental constraints more sensitive equipment, power quality requirements and the increasing penetration of distributed generation. The latter is seen as
Directory of Open Access Journals (Sweden)
Yazhou Jiang
2016-04-01
Full Text Available The increasing importance of system reliability and resilience is changing the way distribution systems are planned and operated. To achieve a distribution system self-healing against power outages, emerging technologies and devices, such as remote-controlled switches (RCSs and smart meters, are being deployed. The higher level of automation is transforming traditional distribution systems into the smart distribution systems (SDSs of the future. The availability of data and remote control capability in SDSs provides distribution operators with an opportunity to optimize system operation and control. In this paper, the development of SDSs and resulting benefits of enhanced system capabilities are discussed. A comprehensive survey is conducted on the state-of-the-art applications of RCSs and smart meters in SDSs. Specifically, a new method, called Temporal Causal Diagram (TCD, is used to incorporate outage notifications from smart meters for enhanced outage management. To fully utilize the fast operation of RCSs, the spanning tree search algorithm is used to develop service restoration strategies. Optimal placement of RCSs and the resulting enhancement of system reliability are discussed. Distribution system resilience with respect to extreme events is presented. Test cases are used to demonstrate the benefit of SDSs. Active management of distributed generators (DGs is introduced. Future research in a smart distribution environment is proposed.
The Distributed Criterion Design
McDougall, Dennis
2006-01-01
This article describes and illustrates a novel form of the changing criterion design called the distributed criterion design, which represents perhaps the first advance in the changing criterion design in four decades. The distributed criterion design incorporates elements of the multiple baseline and A-B-A-B designs and is well suited to applied…
Evaluating Distributed Timing Constraints
DEFF Research Database (Denmark)
Kristensen, C.H.; Drejer, N.
1994-01-01
In this paper we describe a solution to the problem of implementing time-optimal evaluation of timing constraints in distributed real-time systems.
Advanced Distribution Management System
Avazov, Artur; Sobinova, Lubov Anatolievna
2016-01-01
This article describes the advisability of using advanced distribution management systems in the electricity distribution networks area and considers premises of implementing ADMS within the Smart Grid era. Also, it gives the big picture of ADMS and discusses the ADMS advantages and functionalities.
Advanced Distribution Management System
Avazov, Artur R.; Sobinova, Liubov A.
2016-02-01
This article describes the advisability of using advanced distribution management systems in the electricity distribution networks area and considers premises of implementing ADMS within the Smart Grid era. Also, it gives the big picture of ADMS and discusses the ADMS advantages and functionalities.
Advanced Distribution Management System
Directory of Open Access Journals (Sweden)
Avazov Artur R.
2016-01-01
Full Text Available This article describes the advisability of using advanced distribution management systems in the electricity distribution networks area and considers premises of implementing ADMS within the Smart Grid era. Also, it gives the big picture of ADMS and discusses the ADMS advantages and functionalities.
Development of distributed target
Yu Hai Jun; Li Qin; Zhou Fu Xin; Shi Jin Shui; Ma Bing; Chen Nan; Jing Xiao Bing
2002-01-01
Linear induction accelerators are expected to generate small-diameter X-ray spots with high intensity. The interaction of the electron beam with plasmas generated at the X-ray converter makes the spot on the target grow with time, degrading the X-ray dose and the imaging resolving power. A distributed target has been developed which has about 24 pieces of thin 0.05 mm tantalum films distributed over 1 cm. With this structure, distributing the target material over a large volume decreases the energy deposition per unit volume and hence reduces the temperature of the target surface, which in turn reduces initial plasma formation and its expansion velocity. A comparison and analysis of the two kinds of target structures are presented using numerical calculation and experiments; the results show that the X-ray dose and normalized angular distribution of the two are basically the same, while the surface of the distributed target is not destroyed like that of the previous block target
Fanfani, Alessandra; Sanches, Jose Afonso; Andreeva, Julia; Bagliesi, Giusepppe; Bauerdick, Lothar; Belforte, Stefano; Bittencourt Sampaio, Patricia; Bloom, Ken; Blumenfeld, Barry; Bonacorsi, Daniele; Brew, Chris; Calloni, Marco; Cesini, Daniele; Cinquilli, Mattia; Codispoti, Giuseppe; D'Hondt, Jorgen; Dong, Liang; Dongiovanni, Danilo; Donvito, Giacinto; Dykstra, David; Edelmann, Erik; Egeland, Ricky; Elmer, Peter; Eulisse, Giulio; Evans, Dave; Fanzago, Federica; Farina, Fabio; Feichtinger, Derek; Fisk, Ian; Flix, Josep; Grandi, Claudio; Guo, Yuyi; Happonen, Kalle; Hernandez, Jose M; Huang, Chih-Hao; Kang, Kejing; Karavakis, Edward; Kasemann, Matthias; Kavka, Carlos; Khan, Akram; Kim, Bockjoo; Klem, Jukka; Koivumaki, Jesper; Kress, Thomas; Kreuzer, Peter; Kurca, Tibor; Kuznetsov, Valentin; Lacaprara, Stefano; Lassila-Perini, Kati; Letts, James; Linden, Tomas; Lueking, Lee; Maes, Joris; Magini, Nicolo; Maier, Gerhild; McBride, Patricia; Metson, Simon; Miccio, Vincenzo; Padhi, Sanjay; Pi, Haifeng; Riahi, Hassen; Riley, Daniel; Rossman, Paul; Saiz, Pablo; Sartirana, Andrea; Sciaba, Andrea; Sekhri, Vijay; Spiga, Daniele; Tuura, Lassi; Vaandering, Eric; Vanelderen, Lukas; Van Mulders, Petra; Vedaee, Aresh; Villella, Ilaria; Wicklund, Eric; Wildish, Tony; Wissing, Christoph; Wurthwein, Frank
2009-01-01
The CMS experiment expects to manage several Pbytes of data each year during the LHC programme, distributing them over many computing sites around the world and enabling data access at those centers for analysis. CMS has identified the distributed sites as the primary location for physics analysis to support a wide community with thousands of potential users. This represents an unprecedented experimental challenge in terms of the scale of distributed computing resources and the number of users. An overview of the computing architecture, the software tools and the distributed infrastructure is reported. Summaries of the experience in establishing efficient and scalable operations to get prepared for CMS distributed analysis are presented, followed by the user experience in their current analysis activities.
Managing Distributed Software Projects
DEFF Research Database (Denmark)
Persson, John Stouby
Increasingly, software projects are becoming geographically distributed, with limited face-to-face interaction between participants. These projects face particular challenges that need careful managerial attention. This PhD study reports on how we can understand and support the management of distributed software projects, based on a literature study and a case study. The main emphasis of the literature study was on how to support the management of distributed software projects, but it also contributed to an understanding of these projects. The main emphasis of the case study was on how to understand the management of distributed software projects, but it also contributed to supporting the management of these projects. The literature study integrates what we know about risks and risk-resolution techniques into a framework for managing risks in distributed contexts. This framework was developed iteratively...
DEFF Research Database (Denmark)
Schultz, Ulrik Pagh
2007-01-01
Programming a modular, self-reconfigurable robot is however a complicated task: the robot is essentially a real-time, distributed embedded system, where control and communication paths often are tightly coupled to the current physical configuration of the robot. To facilitate the task of programming modular, self-reconfigurable robots, we present the concept of distributed control diffusion: distributed queries are used to identify modules that play a specific role in the robot, and behaviors that implement specific control strategies are diffused throughout the robot based on these role assignments. This approach allows the programmer to dynamically distribute behaviors throughout a robot and moreover provides a partial abstraction over the concrete physical shape of the robot. We have implemented a prototype of a distributed control diffusion system for the ATRON modular, self-reconfigurable robot...
Distributed Language and Dialogism
DEFF Research Database (Denmark)
Steffensen, Sune Vork
2015-01-01
This article takes a starting point in Per Linell's (2013) review article on the book Distributed Language (Cowley, 2011a) and other contributions to the field of 'Distributed Language', including Cowley et al. (2010) and Hodges et al. (2012). The Distributed Language approach is a naturalistic and anti-representational approach to language that builds on recent developments in the cognitive sciences. With a starting point in Linell's discussion of the approach, the article aims to clarify four aspects of a distributed view of language vis-à-vis the tradition of Dialogism, as presented by Linell. The article addresses Linell's critique of Distributed Language as rooted in biosemiotics and in theories of organism-environment systems, and it is argued that Linell's sense-based approach entails an individualist view of how conspecific Others acquire their status as prominent parts of the sense-maker's environment...
Pervasive Electricity Distribution System
Directory of Open Access Journals (Sweden)
Muhammad Usman Tahir
2017-06-01
Full Text Available Nowadays a country cannot become economically strong unless it has enough electrical power to fulfil industrial and domestic needs. Electrical power, being the pillar of any country's economy, needs to be used in an efficient way. That step is taken here by proposing a new system for energy distribution from the substation to consumer houses, which also monitors consumer consumption and records the data. Unlike traditional manual electrical systems, the pervasive electricity distribution system (PEDS) introduces a fresh perspective for monitoring feeder-line status at the distribution and consumer levels. The system addresses the issues of electricity theft, manual billing, online monitoring of the electrical distribution system, and automatic control of electrical distribution points. The project is designed using a microcontroller and different sensors, and its GUI is designed in the LabVIEW software.
Distribution Integration | Grid Modernization | NREL
The goal of NREL's distribution integration research is to tackle the challenges facing the widespread integration of distributed energy resources.
Distributed Propulsion Vehicles
Kim, Hyun Dae
2010-01-01
Since the introduction of large jet-powered transport aircraft, the majority of these vehicles have been designed by placing thrust-generating engines either under the wings or on the fuselage to minimize aerodynamic interactions on the vehicle operation. However, advances in computational and experimental tools along with new technologies in materials, structures, and aircraft controls, etc. are enabling a high degree of integration of the airframe and propulsion system in aircraft design. The National Aeronautics and Space Administration (NASA) has been investigating a number of revolutionary distributed propulsion vehicle concepts to increase aircraft performance. The concept of distributed propulsion is to fully integrate a propulsion system within an airframe such that the aircraft takes full synergistic benefits of coupling of airframe aerodynamics and the propulsion thrust stream by distributing thrust using many propulsors on the airframe. Some of the concepts are based on the use of distributed jet flaps, distributed small multiple engines, gas-driven multi-fans, mechanically driven multi-fans, cross-flow fans, and electric fans driven by turboelectric generators. This paper describes some early concepts of the distributed propulsion vehicles and the current turboelectric distributed propulsion (TeDP) vehicle concepts being studied under NASA's Subsonic Fixed Wing (SFW) Project to drastically reduce aircraft-related fuel burn, emissions, and noise by the year 2030 to 2035.
Centralized versus distributed propulsion
Clark, J. P.
1982-01-01
The functions and requirements of auxiliary propulsion systems are reviewed. None of the three major tasks (attitude control, stationkeeping, and shape control) can be performed by a collection of thrusters at a single central location. If a centralized system is defined as a collection of separated clusters, made up of the minimum number of propulsion units, then such a system can provide attitude control and stationkeeping for most vehicles. A distributed propulsion system is characterized by more numerous propulsion units in a regularly distributed arrangement. Various proposed large space systems are reviewed and it is concluded that centralized auxiliary propulsion is best suited to vehicles with a relatively rigid core. These vehicles may carry a number of flexible or movable appendages. A second group, consisting of one or more large flexible flat plates, may need distributed propulsion for shape control. There is a third group, consisting of vehicles built up from multiple shuttle launches, which may be forced into a distributed system because of the need to add additional propulsion units as the vehicles grow. The effects of distributed propulsion on a beam-like structure were examined. The deflection of the structure under both translational and rotational thrusts is shown as a function of the number of equally spaced thrusters. When only two thrusters are used, it is shown that location is an important parameter. The possibility of using distributed propulsion to achieve minimum overall system weight is also examined. Finally, an examination of active damping by distributed propulsion is described.
Technologies for distributed defense
Seiders, Barbara; Rybka, Anthony
2002-07-01
For Americans, the nature of warfare changed on September 11, 2001. Our national security henceforth will require distributed defense. One extreme of distributed defense is represented by fully deployed military troops responding to a threat from a hostile nation state. At the other extreme is a country of 'citizen soldiers', with families and communities securing their common defense through heightened awareness, engagement as good neighbors, and local support of and cooperation with local law enforcement, emergency and health care providers. Technologies - for information exploitation, biological agent detection, health care surveillance, and security - will be critical to ensuring success in distributed defense.
Electric power distribution handbook
Short, Thomas Allen
2014-01-01
Of the "big three" components of electrical infrastructure, distribution typically gets the least attention. In fact, a thorough, up-to-date treatment of the subject hasn't been published in years, yet deregulation and technical changes have increased the need for better information. Filling this void, the Electric Power Distribution Handbook delivers comprehensive, cutting-edge coverage of the electrical aspects of power distribution systems. The first few chapters of this pragmatic guidebook focus on equipment-oriented information and applications such as choosing transformer connections,
International Nuclear Information System (INIS)
Williams, Mike; Egede, Ulrik; Paterson, Stuart
2011-01-01
The distributed analysis experience to date at LHCb has been positive: job success rates are high and wait times for high-priority jobs are low. LHCb users access the grid using the GANGA job-management package, while the LHCb virtual organization manages its resources using the DIRAC package. This clear division of labor has benefitted LHCb and its users greatly; it is a major reason why distributed analysis at LHCb has been so successful. The newly formed LHCb distributed analysis support team has also proved to be a success.
Agile distributed software development
DEFF Research Database (Denmark)
Persson, John Stouby; Mathiassen, Lars; Aaen, Ivan
2012-01-01
While face-to-face interaction is fundamental in agile software development, distributed environments must rely extensively on mediated interactions. Practicing agile principles in distributed environments therefore poses particular control challenges related to balancing fixed vs. evolving quality requirements and people vs. process-based collaboration. To investigate these challenges, we conducted an in-depth case study of a successful agile distributed software project with participants from a Russian firm and a Danish firm. Applying Kirsch's elements of control framework, we offer an analysis of how...
Money distribution with intermediation
Teles, Caio Augusto Colnago
2013-01-01
This paper analyzes the distribution of money holdings in a commodity-money search-based model with intermediation. Introducing heterogeneity of costs into the Kiyotaki and Wright (1989) model, Cavalcanti and Puzzello (2010) give rise to a non-degenerate distribution of money. We extend this model further by introducing intermediation in the trading process. We show that the distribution of money matters for savings decisions. This gives rise to a fixed-point problem for the ...
Learning Networks Distributed Environment
Martens, Harrie; Vogten, Hubert; Koper, Rob; Tattersall, Colin; Van Rosmalen, Peter; Sloep, Peter; Van Bruggen, Jan; Spoelstra, Howard
2005-01-01
Learning Networks Distributed Environment is a prototype of an architecture that allows the sharing and modification of learning materials through a number of transport protocols. The prototype implements a p2p protocol using JXTA.
Sheaves of Schwartz distributions
International Nuclear Information System (INIS)
Damyanov, B.P.
1991-09-01
The theory of sheaves is a relevant mathematical language for describing the localization principle, known to be valid for the Schwartz distributions (generalized functions). After introducing some fundamentals of sheaves and the basic facts about distribution spaces, the distribution sheaf D_Ω of topological C-vector spaces over an open set Ω in R^n is systematically studied. A sheaf D_M of distributions on a C^∞-manifold M is then introduced, following a definition of Hoermander's for its particular elements. Further, a general definition of sheaves on a manifold that are locally isomorphic to (or modelled on) a sheaf on R^n is proposed. The sheaf properties of D_M are studied, and this sheaf is shown to be locally isomorphic to D_Ω as a sheaf of topological vector spaces. (author). 14 refs
International Nuclear Information System (INIS)
Piasecki, E.
2009-01-01
Heavy-ion collisions often produce a fusion barrier distribution with structures displaying a fingerprint of couplings to highly collective excitations [1]. Basically the same distribution can be obtained from large-angle quasi-elastic scattering, though here the role of the many weak direct-reaction channels is unclear. For 20Ne + 90Zr we have observed the barrier structures expected for the highly deformed neon projectile, but for 20Ne + 92Zr we find a completely smooth distribution (see Fig. 1). We find that transfer channels in these systems are of similar strength, but single-particle excitations are significantly stronger in the latter case. They apparently reduce the 'resolving power' of the quasi-elastic channel, which leads to a smeared-out, or 'fuzzy', barrier distribution. This is the first case in which such a phenomenon has been observed. (author)
ATLAS Distributed Computing Automation
Schovancova, J; The ATLAS collaboration; Borrego, C; Campana, S; Di Girolamo, A; Elmsheuser, J; Hejbal, J; Kouba, T; Legger, F; Magradze, E; Medrano Llamas, R; Negri, G; Rinaldi, L; Sciacca, G; Serfon, C; Van Der Ster, D C
2012-01-01
The ATLAS Experiment benefits from computing resources distributed worldwide at more than 100 WLCG sites. The ATLAS Grid sites provide over 100k CPU job slots and over 100 PB of storage space on disk or tape. Monitoring the status of such a complex infrastructure is essential. The ATLAS Grid infrastructure is monitored 24/7 by two teams of shifters distributed world-wide, by the ATLAS Distributed Computing experts, and by site administrators. In this paper we summarize automation efforts performed within the ATLAS Distributed Computing team in order to reduce manpower costs and improve the reliability of the system. Different aspects of the automation process are described: from the ATLAS Grid site topology provided by the ATLAS Grid Information System, via automatic site testing by the HammerCloud, to automatic exclusion from production or analysis activities.
Global Landslide Hazard Distribution
National Aeronautics and Space Administration — Global Landslide Hazard Distribution is a 2.5 minute grid of global landslide and snow avalanche hazards based upon work of the Norwegian Geotechnical Institute...
Financing Distributed Generation
Energy Technology Data Exchange (ETDEWEB)
Walker, A.
2001-06-29
This paper introduces the engineer who is undertaking distributed generation projects to a wide range of financing options. Distributed generation systems (such as internal combustion engines, small gas turbines, fuel cells and photovoltaics) all require an initial investment, which is recovered over time through revenues or savings. An understanding of the cost of capital and financing structures helps the engineer develop realistic expectations and not be offended by the common requirements of financing organizations. This paper discusses several mechanisms for financing distributed generation projects: appropriations; debt (commercial bank loan); mortgage; home equity loan; limited partnership; vendor financing; general obligation bond; revenue bond; lease; Energy Savings Performance Contract; utility programs; chauffage (end-use purchase); and grants. The paper also discusses financial strategies for businesses focusing on distributed generation: venture capital; informal investors ("business angels"); bank and debt financing; and the stock market.
DOLIB: Distributed Object Library
Energy Technology Data Exchange (ETDEWEB)
D'Azevedo, E.F.
1994-01-01
This report describes the use and implementation of DOLIB (Distributed Object Library), a library of routines that emulates global or virtual shared memory on Intel multiprocessor systems. Access to a distributed global array is through explicit calls to gather and scatter. Advantages of using DOLIB include: dynamic allocation and freeing of huge (gigabyte) distributed arrays, both C and FORTRAN callable interfaces, and the ability to mix shared-memory and message-passing programming models for ease of use and optimal performance. DOLIB is independent of language and compiler extensions and requires no special operating system support. DOLIB also supports automatic caching of read-only data for high performance. The virtual shared memory support provided in DOLIB is well suited for implementing Lagrangian particle tracking techniques. We have also used DOLIB to create DONIO (Distributed Object Network I/O Library), which obtains over a 10-fold improvement in disk I/O performance on the Intel Paragon.
DOLIB: Distributed Object Library
Energy Technology Data Exchange (ETDEWEB)
D'Azevedo, E.F.; Romine, C.H.
1994-10-01
This report describes the use and implementation of DOLIB (Distributed Object Library), a library of routines that emulates global or virtual shared memory on Intel multiprocessor systems. Access to a distributed global array is through explicit calls to gather and scatter. Advantages of using DOLIB include: dynamic allocation and freeing of huge (gigabyte) distributed arrays, both C and FORTRAN callable interfaces, and the ability to mix shared-memory and message-passing programming models for ease of use and optimal performance. DOLIB is independent of language and compiler extensions and requires no special operating system support. DOLIB also supports automatic caching of read-only data for high performance. The virtual shared memory support provided in DOLIB is well suited for implementing Lagrangian particle tracking techniques. We have also used DOLIB to create DONIO (Distributed Object Network I/O Library), which obtains over a 10-fold improvement in disk I/O performance on the Intel Paragon.
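The gather/scatter access pattern the report describes can be sketched in a few lines. The class and method names below are illustrative inventions, not the actual DOLIB C/FORTRAN interface, and a block-partitioned in-memory array stands in for data spread across multiprocessor nodes:

```python
import math

# Toy, single-process emulation of a DOLIB-style distributed global array:
# the global index space is block-partitioned across "ranks", and every
# access goes through explicit gather/scatter calls, as the report
# describes. Class and method names are illustrative, not the DOLIB API.

class GlobalArray:
    def __init__(self, n, nranks):
        self.block = math.ceil(n / nranks)           # block size per rank
        self.local = [[0.0] * self.block for _ in range(nranks)]

    def _locate(self, i):
        # map a global index to (owner rank, local offset)
        return divmod(i, self.block)

    def gather(self, indices):
        """Fetch values at arbitrary global indices (remote reads)."""
        return [self.local[r][off] for r, off in map(self._locate, indices)]

    def scatter(self, indices, values):
        """Store values at arbitrary global indices (remote writes)."""
        for i, v in zip(indices, values):
            r, off = self._locate(i)
            self.local[r][off] = v

ga = GlobalArray(n=8, nranks=4)
ga.scatter([0, 5, 7], [1.0, 2.0, 3.0])
print(ga.gather([5, 0, 7]))  # [2.0, 1.0, 3.0]
```

In the real library the gather and scatter calls would translate into message passing between nodes, which is what lets the same code mix shared-memory and message-passing styles.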
Financing Distributed Generation
International Nuclear Information System (INIS)
Walker, A.
2001-01-01
This paper introduces the engineer who is undertaking distributed generation projects to a wide range of financing options. Distributed generation systems (such as internal combustion engines, small gas turbines, fuel cells and photovoltaics) all require an initial investment, which is recovered over time through revenues or savings. An understanding of the cost of capital and financing structures helps the engineer develop realistic expectations and not be offended by the common requirements of financing organizations. This paper discusses several mechanisms for financing distributed generation projects: appropriations; debt (commercial bank loan); mortgage; home equity loan; limited partnership; vendor financing; general obligation bond; revenue bond; lease; Energy Savings Performance Contract; utility programs; chauffage (end-use purchase); and grants. The paper also discusses financial strategies for businesses focusing on distributed generation: venture capital; informal investors ("business angels"); bank and debt financing; and the stock market.
Tradeoffs in distributed databases
Juntunen, R. (Risto)
2016-01-01
Abstract In a distributed database, data is spread throughout the network into separate nodes with different DBMS systems (Date, 2000). According to the CAP theorem, three database properties (consistency, availability and partition tolerance) cannot be achieved simultaneously in distributed database systems. Two of these properties can be achieved, but not all three at the same time (Brewer, 2000). Since this theorem there has b...
Distributed generation hits market
International Nuclear Information System (INIS)
Anon.
1997-01-01
The pace at which vendors are developing and marketing gas turbines and reciprocating engines for small-scale applications may signal the widespread growth of distributed generation. Loosely defined to refer to applications in which power generation equipment is located close to end users who have near-term power capacity needs, distributed generation encompasses a broad range of technologies and load requirements. Disagreement is inevitable, but many industry observers associate distributed generation with applications anywhere from 25 kW to 25 MW. Ten years ago, distributed generation users only represented about 2% of the world market. Today, that figure has increased to about 4 or 5%, and probably could settle in the 20% range within a 3-to-5-year period, according to Michael Jones, San Diego, Calif.-based Solar Turbines Inc. power generation marketing manager. The US Energy Information Administration predicts about 175 GW of generation capacity will be added domestically by 2010. If 20% comes from smaller plants, distributed generation could account for about 35 GW. Even with more competition, it's highly unlikely distributed generation will totally replace current market structures and central stations. Distributed generation may be best suited for making market inroads when and where central systems need upgrading, and should prove its worth when the system can't handle peak demands. Typical applications include small reciprocating engine generators at remote customer sites or larger gas turbines to boost the grid. Additional market opportunities include standby capacity, peak shaving, power quality, cogeneration and capacity rental for immediate demand requirements. Integration of distributed generation systems--using gas-fueled engines, gas-fired combustion engines and fuel cells--can upgrade power quality for customers and reduce operating costs for electric utilities
Camilleri, Mark Anthony
2017-01-01
Distribution channels link customers with businesses. For many years, tourism businesses may have distributed their products and services through intermediaries. However, the latest advances in technology have brought significant changes in this regard. More individual and corporate customers are increasingly benefiting from ubiquitous technologies, including digital media. The development of mobile devices and their applications is offering a wide range of possibilities to t...
Diphoton generalized distribution amplitudes
International Nuclear Information System (INIS)
El Beiyad, M.; Pire, B.; Szymanowski, L.; Wallon, S.
2008-01-01
We calculate the leading-order diphoton generalized distribution amplitudes by calculating the amplitude of the process γ*γ→γγ in the low-energy and high-photon-virtuality region at the Born order and in the leading logarithmic approximation. As in the case of the anomalous photon structure functions, the γγ generalized distribution amplitudes exhibit a characteristic ln Q² behavior and obey inhomogeneous QCD evolution equations.
Crumpacker, John R.
2009-01-01
Approved for public release, distribution unlimited Password cracking requires significant processing power, which in today's world is located at a workstation or home in the form of a desktop computer. Berkeley Open Infrastructure for Network Computing (BOINC) is the conduit to this significant source of processing power and John the Ripper is the key. BOINC is a distributed data processing system that incorporates client-server relationships to generically process data. The BOINC structu...
Quantum dense key distribution
International Nuclear Information System (INIS)
Degiovanni, I.P.; Ruo Berchera, I.; Castelletto, S.; Rastello, M.L.; Bovino, F.A.; Colla, A.M.; Castagnoli, G.
2004-01-01
This paper proposes a protocol for quantum dense key distribution. This protocol embeds the benefits of quantum dense coding and quantum key distribution and is able to generate shared secret keys four times more efficiently than the Bennett-Brassard 1984 (BB84) protocol. We prove the security of this scheme against individual eavesdropping attacks, and we present preliminary experimental results showing its feasibility.
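The Bennett-Brassard 1984 protocol this record benchmarks against loses roughly half of the raw bits in basis sifting, which is where the claimed four-fold efficiency gain has room to exist. A toy simulation of that sifting step (this sketches BB84 itself, not the dense key distribution protocol, and assumes no eavesdropper) illustrates the point:

```python
import random

# Toy BB84 sifting: Alice encodes random bits in random bases, Bob
# measures in random bases; with no eavesdropper, the positions where
# the bases agree yield identical bits, and only those (~50% of the
# raw transmission) survive into the sifted key.

def bb84_sift(n_bits, seed=0):
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n_bits)]
    alice_basis = [rng.choice("+x") for _ in range(n_bits)]
    bob_basis   = [rng.choice("+x") for _ in range(n_bits)]
    # matching bases => Bob's measurement reproduces Alice's bit
    return [b for b, ab, bb in zip(alice_bits, alice_basis, bob_basis)
            if ab == bb]

key = bb84_sift(1000)
print(abs(len(key) / 1000 - 0.5) < 0.1)  # True: roughly half survives
```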
Dillon, Joshua V.; Langmore, Ian; Tran, Dustin; Brevdo, Eugene; Vasudevan, Srinivas; Moore, Dave; Patton, Brian; Alemi, Alex; Hoffman, Matt; Saurous, Rif A.
2017-01-01
The TensorFlow Distributions library implements a vision of probability theory adapted to the modern deep-learning paradigm of end-to-end differentiable computation. Building on two basic abstractions, it offers flexible building blocks for probabilistic computation. Distributions provide fast, numerically stable methods for generating samples and computing statistics, e.g., log density. Bijectors provide composable volume-tracking transformations with automatic caching. Together these enable...
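The two abstractions this abstract names can be rendered in plain Python. The sketch below is conceptual, not the TensorFlow Probability API; all class names and method signatures are invented. A distribution exposes sampling and a numerically explicit log density, and a bijector carries the volume-change term that makes the change-of-variables rule compose:

```python
import math, random

class StdNormal:
    """A 'Distribution': sampling plus explicit log density."""
    def sample(self, rng):
        return rng.gauss(0.0, 1.0)
    def log_prob(self, x):
        return -0.5 * (x * x + math.log(2.0 * math.pi))

class Exp:
    """A 'Bijector': y = exp(x), with volume change tracked."""
    def forward(self, x):
        return math.exp(x)
    def inverse(self, y):
        return math.log(y)
    def inverse_log_det_jacobian(self, y):
        return -math.log(y)  # log |d inverse(y) / dy|

class Transformed:
    """Change of variables: push a distribution through a bijector."""
    def __init__(self, dist, bij):
        self.dist, self.bij = dist, bij
    def sample(self, rng):
        return self.bij.forward(self.dist.sample(rng))
    def log_prob(self, y):
        return (self.dist.log_prob(self.bij.inverse(y))
                + self.bij.inverse_log_det_jacobian(y))

# pushing a standard normal through exp yields a standard log-normal
lognormal = Transformed(StdNormal(), Exp())
print(round(lognormal.log_prob(1.0), 4))  # -0.9189, i.e. N(0,1) at x = 0
```

The design point is that `Transformed` needs nothing from the base distribution beyond `sample`/`log_prob`, and nothing from the bijector beyond the three methods shown, so the pieces compose freely.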
Intelligent distributed computing
Thampi, Sabu
2015-01-01
This book contains a selection of refereed and revised papers of the Intelligent Distributed Computing Track originally presented at the third International Symposium on Intelligent Informatics (ISI-2014), September 24-27, 2014, Delhi, India. The papers selected for this Track cover several Distributed Computing and related topics including Peer-to-Peer Networks, Cloud Computing, Mobile Clouds, Wireless Sensor Networks, and their applications.
Parton Distributions Working Group
International Nuclear Information System (INIS)
Barbaro, L. de; Keller, S. A.; Kuhlmann, S.; Schellman, H.; Tung, W.-K.
2000-01-01
This report summarizes the activities of the Parton Distributions Working Group of the QCD and Weak Boson Physics workshop held in preparation for Run II at the Fermilab Tevatron. The main focus of this working group was to investigate the different issues associated with the development of quantitative tools to estimate parton distribution functions uncertainties. In the conclusion, the authors introduce a Manifesto that describes an optimal method for reporting data
Distribution System Pricing with Distributed Energy Resources
Energy Technology Data Exchange (ETDEWEB)
Hledik, Ryan [The Brattle Group, Cambridge, MA (United States); Lazar, Jim [The Regulatory Assistance Project, Montpelier, VT (United States); Schwartz, Lisa [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)
2017-08-16
Technological changes in the electric utility industry bring tremendous opportunities and significant challenges. Customers are installing clean sources of on-site generation such as rooftop solar photovoltaic (PV) systems. At the same time, smart appliances and control systems that can communicate with the grid are entering the retail market. Among the opportunities these changes create are a cleaner and more diverse power system, the ability to improve system reliability and system resilience, and the potential for lower total costs. Challenges include integrating these new resources in a way that maintains system reliability, provides an equitable sharing of system costs, and avoids unbalanced impacts on different groups of customers, including those who install distributed energy resources (DERs) and low-income households who may be the least able to afford the transition.
Vaginal drug distribution modeling.
Katz, David F; Yuan, Andrew; Gao, Yajing
2015-09-15
This review presents and applies fundamental mass transport theory describing the diffusion and convection driven mass transport of drugs to the vaginal environment. It considers sources of variability in the predictions of the models. It illustrates use of model predictions of microbicide drug concentration distribution (pharmacokinetics) to gain insights about drug effectiveness in preventing HIV infection (pharmacodynamics). The modeling compares vaginal drug distributions after different gel dosage regimens, and it evaluates consequences of changes in gel viscosity due to aging. It compares vaginal mucosal concentration distributions of drugs delivered by gels vs. intravaginal rings. Finally, the modeling approach is used to compare vaginal drug distributions across species with differing vaginal dimensions. Deterministic models of drug mass transport into and throughout the vaginal environment can provide critical insights about the mechanisms and determinants of such transport. This knowledge, and the methodology that obtains it, can be applied and translated to multiple applications, involving the scientific underpinnings of vaginal drug distribution and the performance evaluation and design of products, and their dosage regimens, that achieve it.
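The kind of deterministic transport model the review describes can be sketched as a 1-D diffusion problem solved by explicit finite differences. The geometry, diffusion coefficient and boundary conditions below are illustrative assumptions, not the authors' model (which also includes convection):

```python
# 1-D diffusion of drug from a gel layer into tissue, marched with an
# explicit finite-difference scheme. All parameters are invented for
# illustration.

D  = 6e-6                  # diffusion coefficient, cm^2/s (assumed)
dx = 0.01                  # grid spacing, cm
dt = 0.25 * dx * dx / D    # explicit stability requires dt <= dx^2/(2D)
c  = [1.0 if i < 10 else 0.0 for i in range(100)]  # gel (c=1) vs tissue

for _ in range(2000):      # ~2.3 hours of simulated time
    c = [c[0]] + [
        c[i] + D * dt / (dx * dx) * (c[i + 1] - 2.0 * c[i] + c[i - 1])
        for i in range(1, 99)
    ] + [0.0]              # far-field sink; left end held at gel conc.

print(0.0 < c[20] < 1.0)   # True: drug has reached x = 0.2 cm, diluted
```

A pharmacokinetic prediction then reads the concentration profile `c` at the times and depths of interest, which is how model output connects to the effectiveness questions the review raises.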
Sparse distributed memory overview
Raugh, Mike
1990-01-01
The Sparse Distributed Memory (SDM) project is investigating the theory and applications of a massively parallel computing architecture, called sparse distributed memory, that will support the storage and retrieval of sensory and motor patterns characteristic of autonomous systems. The immediate objectives of the project are centered in studies of the memory itself and in the use of the memory to solve problems in speech, vision, and robotics. Investigation of methods for encoding sensory data is an important part of the research. Examples of NASA missions that may benefit from this work are Space Station, planetary rovers, and solar exploration. Sparse distributed memory offers promising technology for systems that must learn through experience and be capable of adapting to new circumstances, and for operating any large complex system requiring automatic monitoring and control. Sparse distributed memory is a massively parallel architecture motivated by efforts to understand how the human brain works. Sparse distributed memory is an associative memory, able to retrieve information from cues that only partially match patterns stored in the memory. It is able to store long temporal sequences derived from the behavior of a complex system, such as progressive records of the system's sensory data and correlated records of the system's motor controls.
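The associative-recall behavior described above can be sketched with a minimal Kanerva-style memory. The sizes below are toy values chosen for illustration, not the project's configuration:

```python
import random

# Minimal sparse distributed memory: a fixed set of random hard
# locations with bit counters. A write updates every location within
# Hamming radius R of the address; a read majority-votes over the
# counters of the activated locations.

N, M, R = 64, 500, 24        # word length, hard locations, radius
rng = random.Random(1)
hard = [[rng.randint(0, 1) for _ in range(N)] for _ in range(M)]
counters = [[0] * N for _ in range(M)]

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def write(addr, word):
    for loc, cnt in zip(hard, counters):
        if hamming(addr, loc) <= R:          # location is activated
            for i, bit in enumerate(word):
                cnt[i] += 1 if bit else -1

def read(addr):
    sums = [0] * N
    for loc, cnt in zip(hard, counters):
        if hamming(addr, loc) <= R:
            sums = [s + c for s, c in zip(sums, cnt)]
    return [1 if s > 0 else 0 for s in sums]

word = [rng.randint(0, 1) for _ in range(N)]
write(word, word)                # autoassociative store: address = data
print(read(word) == word)        # True: majority vote recovers the word

noisy = word[:]
for i in rng.sample(range(N), 5):
    noisy[i] ^= 1                # recall from a partially matching cue
print(hamming(read(noisy), word))
```

Because each word is smeared across many locations and each read pools many locations, recall degrades gracefully as the cue moves away from the stored address, which is the property the abstract highlights.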
International Nuclear Information System (INIS)
Zacharov, B.
1976-01-01
In recent years, there has been a growing tendency in high-energy physics and in other fields to solve computational problems by distributing tasks among the resources of inter-coupled processing devices and associated system elements. This trend has gained further momentum more recently with the increased availability of low-cost processors and with the development of the means of data distribution. In two lectures, the broad question of distributed computing systems is examined and the historical development of such systems reviewed. An attempt is made to examine the reasons for the existence of these systems and to discern the main trends for the future. The components of distributed systems are discussed in some detail and particular emphasis is placed on the importance of standards and conventions in certain key system components. The ideas and principles of distributed systems are discussed in general terms, but these are illustrated by a number of concrete examples drawn from the context of the high-energy physics environment. (Auth.)
A New Distribution-Random Limit Normal Distribution
Gong, Xiaolin; Yang, Shuzhen
2013-01-01
This paper introduces a new distribution to improve tail risk modeling. Based on the classical normal distribution, we define a new distribution by a series of heat equations. Then, we use market data to verify our model.
Energy Technology Data Exchange (ETDEWEB)
Fairbairn, R.J.; Maunder, D.; Kenyon, P.
1999-07-01
This report summarises the findings of a study reviewing the distribution network in England, Scotland and Wales to evaluate its ability to accommodate more embedded generation from both fossil fuel and renewable energy sources. The background to the study is traced, and descriptions of the existing electricity supply system, the licence conditions relating to embedded generation, and the effects of the Review of Electricity Trading Arrangements are given. The ability of the UK distribution networks to accept embedded generation is examined, and technical benefits/drawbacks arising from embedded generation, and the potential for uptake of embedded generation technologies are considered. The distribution network capacity and the potential uptake of embedded generation are compared, and possible solutions to overcome obstacles are suggested. (UK)
International Nuclear Information System (INIS)
Fairbairn, R.J.; Maunder, D.; Kenyon, P.
1999-01-01
This report summarises the findings of a study reviewing the distribution network in England, Scotland and Wales to evaluate its ability to accommodate more embedded generation from both fossil fuel and renewable energy sources. The background to the study is traced, and descriptions of the existing electricity supply system, the licence conditions relating to embedded generation, and the effects of the Review of Electricity Trading Arrangements are given. The ability of the UK distribution networks to accept embedded generation is examined, and technical benefits/drawbacks arising from embedded generation, and the potential for uptake of embedded generation technologies are considered. The distribution network capacity and the potential uptake of embedded generation are compared, and possible solutions to overcome obstacles are suggested. (UK)
Distributed Robotics Education
DEFF Research Database (Denmark)
Lund, Henrik Hautop; Pagliarini, Luigi
2011-01-01
Distributed robotics takes many forms, for instance, multirobots, modular robots, and self-reconfigurable robots. The understanding and development of such advanced robotic systems demand extensive knowledge in engineering and computer science. In this paper, we describe the concept of a distributed ... to be changed, related to multirobot control and human-robot interaction control from virtual to physical representation. The proposed system is valuable for bringing a vast number of issues into education – such as parallel programming, distribution, communication protocols, master dependency, connectivity...
Fehr, Ralph
2016-01-01
In this fully updated version of Industrial Power Distribution, the author addresses key areas of electric power distribution from an end-user perspective for both electrical engineers and students who are training for a career in the electrical power engineering field. Industrial Power Distribution, Second Edition, begins by describing how industrial facilities are supplied from utility sources, which is supported with background information on the components of AC power, voltage drop calculations, and the sizing of conductors and transformers. Important concepts and discussions are featured throughout the book, including those for sequence networks, ladder logic, motor application, fault calculations, and transformer connections. The book concludes with an introduction to power quality, how it affects industrial power systems, and an expansion of the concept of power factor, including a distortion term made necessary by the existence of harmonics.
DEFF Research Database (Denmark)
Melikov, Arsen Krikor
2011-01-01
The aim of total volume air distribution (TVAD) involves achieving uniform temperature and velocity in the occupied zone and an environment designed for an average occupant. The supply of large amounts of clean and cool air is needed to maintain temperature and pollution concentration at acceptable levels in the entire space, leading to increased energy consumption and the use of large and costly HVAC and duct systems. The performance of desk-installed PV combined with background TVAD used for room temperature control has been studied in an office building located in a hot and humid climate. Ventilation in hospitals is essential to decrease the risk of airborne cross-infection. At present, mixing air distribution at a minimum of 12 ach is used in infection wards. Advanced air distribution has the potential to aid in achieving healthy, comfortable and productive indoor environments at levels...
Electricity Distribution Effectiveness
Directory of Open Access Journals (Sweden)
Waldemar Szpyra
2015-12-01
This paper discusses the basic concepts of cost accounting in the power industry and selected ways of assessing the effectiveness of electricity distribution. The results of an effectiveness analysis of MV/LV distribution transformer replacement are presented, and unit costs of energy transmission through various medium-voltage line types are compared. The calculation results confirm the viability of replacing transformers manufactured before 1975. Replacing transformers manufactured after 1975 – only to reduce energy losses – is not economically justified. Increasing use of a PAS-type line for energy transmission in local distribution networks is reasonable. Cabling these networks under the current calculation rules of discounts for excessive power outages is not viable, even in areas particularly exposed to catastrophic wire icing.
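A back-of-the-envelope version of the transformer-replacement calculus assessed in this paper compares the annual cost of losses of an old MV/LV unit and a modern one, then takes a simple undiscounted payback. Every number below is an invented illustration, not the paper's data or its discounted-cost method:

```python
# Simple loss-cost comparison for transformer replacement. No-load
# (core) losses run all year; load (copper) losses scale with the
# square of the load factor. All figures are assumptions.

HOURS = 8760               # hours per year
PRICE = 0.10               # energy price, EUR/kWh (assumed)

def annual_loss_cost(p_noload_kw, p_load_kw, load_factor):
    energy_kwh = (p_noload_kw + p_load_kw * load_factor ** 2) * HOURS
    return energy_kwh * PRICE

old = annual_loss_cost(0.60, 2.5, 0.4)   # pre-1975-era unit (assumed)
new = annual_loss_cost(0.20, 1.8, 0.4)   # modern low-loss unit (assumed)
payback_years = 6000.0 / (old - new)     # assumed installed price, EUR
print(round(old - new, 1), round(payback_years, 1))  # 448.5 13.4
```

A full analysis like the paper's would discount the loss savings over the transformer's remaining life rather than use a raw payback, but the structure of the comparison is the same.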
Schovancova, J; The ATLAS collaboration
2011-01-01
The poster details the different aspects of the ATLAS Distributed Computing experience after the first year of LHC data taking. We describe the performance of the ATLAS distributed computing system and the lessons learned during the 2010 run, pointing out parts of the system which were in a good shape, and also spotting areas which required improvements. Improvements ranged from hardware upgrade on the ATLAS Tier-0 computing pools to improve data distribution rates, tuning of FTS channels between CERN and Tier-1s, and studying data access patterns for Grid analysis to improve the global processing rate. We show recent software development driven by operational needs with emphasis on data management and job execution in the ATLAS production system.
Distributed Wind Market Applications
Energy Technology Data Exchange (ETDEWEB)
Forsyth, T.; Baring-Gould, I.
2007-11-01
Distributed wind energy systems provide clean, renewable power for on-site use and help relieve pressure on the power grid while providing jobs and contributing to energy security for homes, farms, schools, factories, private and public facilities, distribution utilities, and remote locations. America pioneered small wind technology in the 1920s, and it is the only renewable energy industry segment that the United States still dominates in technology, manufacturing, and world market share. The series of analyses covered by this report were conducted to assess some of the most likely ways that advanced wind turbines could be utilized apart from large, central station power systems. Each chapter represents a final report on specific market segments written by leading experts in this field. As such, this document does not speak with one voice but is rather a compendium of different perspectives, documented by a variety of people in the U.S. distributed wind field.
DEFF Research Database (Denmark)
Chemi, Tatiana
2016-01-01
This chapter aims to deconstruct some persistent myths about creativity: the myth of individualism and of the genius. By looking at literature that approaches creativity as a participatory and distributed phenomenon and by bringing empirical evidence from artists’ studios, the author presents a perspective that is relevant to higher education. The focus here is on how artists solve problems in distributed paths, and on the elements of creative collaboration. Creative problem-solving will be looked at as an ongoing dialogue that artists engage with themselves, with others, with recipients... What can educators at higher education learn from the ways creative groups solve problems? How can artists contribute to inspiring higher education?
Distributed Web Service Repository
Directory of Open Access Journals (Sweden)
Piotr Nawrocki
2015-01-01
The increasing availability and popularity of computer systems has resulted in a demand for new, language- and platform-independent ways of data exchange. That demand has in turn led to a significant growth in the importance of systems based on Web services. Alongside the growing number of systems accessible via Web services came the need for specialized data repositories that could offer effective means of searching of available services. The development of mobile systems and wireless data transmission technologies has allowed the use of distributed devices and computer systems on a greater scale. The accelerating growth of distributed systems might be a good reason to consider the development of distributed Web service repositories with built-in mechanisms for data migration and synchronization.
Remote entanglement distribution
International Nuclear Information System (INIS)
Sanders, B.C.; Gour, G.; Meyer, D.A.
2005-01-01
Shared bipartite entanglement is a crucial shared resource for many quantum information tasks such as teleportation, entanglement swapping, and remote state preparation. In general different nodes of a quantum network share an entanglement resource, such as ebits, that is consumed during the task. In practice, generating entangled states is expensive, but here we establish a protocol by which a quantum network requires only a single supplier of entanglement to all nodes who, by judicious measurements and classical communication, provides the nodes with a unique pairwise entangled state independent of the measurement outcome. Furthermore, we extend this result to a chain of suppliers and nodes, which enables an operational interpretation of concurrence. In the special case that the supplier shares bipartite states with two nodes, and such states are pure and maximally entangled, our protocol corresponds to entanglement swapping. However, in the practical case that initial shared entanglement between suppliers and nodes involves partially entangled or mixed states, we show that general local operations and classical communication by all parties (suppliers and nodes) yield distributions of entangled states between nodes. In general a distribution of bipartite entangled states between any two nodes will include states that do not have the same entanglement; thus we name this general process remote entanglement distribution. In our terminology, entanglement swapping with partially entangled states is a particular class of remote entanglement distribution protocols. Here we identify which distributions of states can or cannot be created by remote entanglement distribution. In particular we prove a powerful theorem that establishes an upper bound on the entanglement of formation that can be produced between two qubit nodes. We extend this result to the case of a linear chain of parties that play the roles of suppliers and nodes; this extension provides
International Nuclear Information System (INIS)
Bobrowski, Sebastian; Chen, Hong; Döring, Maik; Jensen, Uwe; Schinköthe, Wolfgang
2015-01-01
In practice manufacturers may have lots of failure data of similar products using the same technology basis under different operating conditions. Thus, one can try to derive predictions for the distribution of the lifetime of newly developed components or new application environments through the existing data using regression models based on covariates. Three categories of such regression models are considered: a parametric, a semiparametric and a nonparametric approach. First, we assume that the lifetime is Weibull distributed, where its parameters are modelled as linear functions of the covariate. Second, the Cox proportional hazards model, well-known in Survival Analysis, is applied. Finally, a kernel estimator is used to interpolate between empirical distribution functions. In particular the last case is new in the context of reliability analysis. We propose a goodness of fit measure (GoF), which can be applied to all three types of regression models. Using this GoF measure we discuss a new model selection procedure. To illustrate this method of reliability prediction, the three classes of regression models are applied to real test data of motor experiments. Further the performance of the approaches is investigated by Monte Carlo simulations. - Highlights: • We estimate the lifetime distribution in the presence of a covariate. • Three types of regression models are considered and compared. • A new nonparametric estimator based on our particular data structure is introduced. • We propose a goodness of fit measure and show a new model selection procedure. • A case study with real data and Monte Carlo simulations are performed
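The first (parametric) approach described above, a Weibull lifetime whose parameters depend linearly on a covariate, can be sketched as a maximum-likelihood fit. This is a minimal illustration under invented assumptions: the covariate, the coefficients, and the choice to let only the scale parameter vary are all hypothetical, not the paper's data or model specification.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

rng = np.random.default_rng(0)
# Hypothetical data: covariate x (e.g. an operating condition), Weibull
# lifetimes whose scale is a linear function of x; shape held constant.
x = rng.uniform(0.0, 1.0, 300)
true_shape, a, b = 2.0, 10.0, -4.0          # scale(x) = a + b*x
t = weibull_min.rvs(true_shape, scale=a + b * x, random_state=0)

def neg_log_lik(theta):
    """Negative Weibull log-likelihood with covariate-dependent scale."""
    shape, a_, b_ = theta
    scale = a_ + b_ * x
    if shape <= 0 or np.any(scale <= 0):
        return np.inf
    return -np.sum(weibull_min.logpdf(t, shape, scale=scale))

fit = minimize(neg_log_lik, x0=[1.0, 5.0, 0.0], method="Nelder-Mead",
               options={"maxiter": 5000, "maxfev": 5000})
print(fit.x)  # estimated (shape, a, b)
```

The Cox and kernel approaches in the abstract drop or relax this parametric assumption; the likelihood above is the piece the GoF measure would score against the empirical distribution.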
Directory of Open Access Journals (Sweden)
Kulagin S. A.
2017-01-01
Full Text Available We review a microscopic model of the nuclear parton distribution functions, which accounts for a number of nuclear effects including Fermi motion and nuclear binding, nuclear meson-exchange currents, off-shell corrections to bound nucleon distributions and nuclear shadowing. We also discuss applications of this model to a number of processes including lepton-nucleus deep inelastic scattering, proton-nucleus Drell-Yan lepton pair production at Fermilab, as well as W± and Z0 boson production in proton-lead collisions at the LHC.
CERN. Geneva
2018-01-01
Global science calls for global infrastructure. A typical large-scale research group will use a suite of international services and involve hundreds of collaborating institutes and users from around the world. How can these users access those services securely? How can their digital identities be established, verified and maintained? We will explore the motivation for distributed authentication and the ways in which research communities are addressing the challenges. We will discuss security incident response in distributed environments - a particular challenge for the operators of these infrastructures. Through this course you should gain an overview of federated identity technologies and protocols, including x509 certificates, SAML and OIDC.
Directory of Open Access Journals (Sweden)
Kareema Abed Al-Kadim
2017-12-01
Full Text Available In this paper the Rayleigh Pareto distribution, denoted R_PD, is introduced. We state some useful functions and give some of its properties, such as the entropy function, mean, mode, median, variance, the r-th moment about the mean, the r-th moment about the origin, the reliability and hazard functions, and the coefficients of variation, skewness and kurtosis. Finally, we estimate the parameters; the aim of this research is to introduce a new distribution.
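The reliability and hazard functions listed above follow a generic pattern, h(t) = f(t)/R(t) with R(t) = 1 − F(t). As a sketch only — using a plain Rayleigh distribution, not the paper's R_PD, whose density is defined in the paper itself — the pattern looks like this:

```python
import numpy as np
from scipy import stats

# Generic reliability/hazard computation, illustrated on a Rayleigh
# distribution (NOT the paper's R_PD).
t = np.linspace(0.1, 5.0, 50)
sigma = 1.0
pdf = stats.rayleigh.pdf(t, scale=sigma)
sf = stats.rayleigh.sf(t, scale=sigma)   # reliability R(t) = 1 - F(t)
hazard = pdf / sf                        # h(t) = f(t) / R(t)
# For the Rayleigh distribution this equals t / sigma**2 exactly:
print(np.allclose(hazard, t / sigma**2))  # → True
```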
DEFF Research Database (Denmark)
Fischer, Lars; Grønager, Michael; Kleist, Josva
2008-01-01
The Tier-1 facility operated by the Nordic DataGrid Facility (NDGF) differs significantly from other Tier-1s in several aspects: firstly, it is not located at one or a few premises, but instead is distributed throughout the Nordic countries; secondly, it is not under the governance of a single organization but instead is a meta-center built of resources under the control of a number of different national organizations. We present some technical implications of these aspects as well as the high-level design of this distributed Tier-1. The focus will be on computing services, storage and monitoring.
Reconfiguration of distribution nets
International Nuclear Information System (INIS)
Latorre Bayona, Gerardo; Angarita Marquez, Jorge Luis
2000-01-01
Starting from the placement of the reconfiguration problem within the context of distribution network operation, the definition of quality indicators, and a presentation of the alternatives most commonly used to reduce technical losses, various reconfiguration methodologies proposed in the technical literature are reviewed and their three principal limitations pointed out. Loss results obtained from simulation studies carried out on distribution circuits of ESSA ESP are also presented, which allow network reconfiguration to be put forward as an excellent alternative for reducing technical losses
Fischer, L.; Grønager, M.; Kleist, J.; Smirnova, O.
2008-07-01
The Tier-1 facility operated by the Nordic DataGrid Facility (NDGF) differs significantly from other Tier-1s in several aspects: firstly, it is not located at one or a few premises, but instead is distributed throughout the Nordic countries; secondly, it is not under the governance of a single organization but instead is a meta-center built of resources under the control of a number of different national organizations. We present some technical implications of these aspects as well as the high-level design of this distributed Tier-1. The focus will be on computing services, storage and monitoring.
Liquidity, welfare and distribution
Directory of Open Access Journals (Sweden)
Martín Gil Samuel
2012-01-01
Full Text Available This work presents a dynamic general equilibrium model where wealth distribution is endogenous. I provide channels of causality that suggest a complex relationship between financial markets and real activity which breaks down the classical dichotomy. As a consequence, the Friedman rule does not hold. In terms of the current events taking place in the world economy, this paper provides a rationale for warning against the perils of an economy satiated with liquidity. Efficiency and distribution cannot thus be considered as separate attributes once we account for the interactions between financial markets and economic performance.
Distributed photovoltaic grid transformers
Shertukde, Hemchandra Madhusudan
2014-01-01
The demand for alternative energy sources fuels the need for electric power and controls engineers to possess a practical understanding of transformers suitable for solar energy. Meeting that need, Distributed Photovoltaic Grid Transformers begins by explaining the basic theory behind transformers in the solar power arena, and then progresses to describe the development, manufacture, and sale of distributed photovoltaic (PV) grid transformers, which help boost the electric DC voltage (generally at 30 volts) harnessed by a PV panel to a higher level (generally at 115 volts or higher) once it is…
Pérez-Sánchez, Julio; Senent-Aparicio, Javier
2017-08-01
Dry spells are an essential concept of drought climatology that clearly defines the semiarid Mediterranean environment, whose consequences are a defining feature for an ecosystem so vulnerable with regard to water. The present study was conducted to characterize rainfall drought in the Segura River basin, located in eastern Spain and marked by the seasonal nature of these latitudes. A daily precipitation set was utilized for 29 weather stations over a period of 20 years (1993-2013). Furthermore, four sets of dry spell length (complete series, monthly maximum, seasonal maximum, and annual maximum) are used and simulated for all the weather stations with the following probability distribution functions: Burr, Dagum, error, generalized extreme value, generalized logistic, generalized Pareto, Gumbel Max, inverse Gaussian, Johnson SB, Log-Logistic, Log-Pearson 3, Triangular, Weibull, and Wakeby. Only the series of annual maximum spells offer a good adjustment for all the weather stations, with Wakeby emerging as the best fit, with a mean p value of 0.9424 for the Kolmogorov-Smirnov test (0.2 significance level). Maps of dry spell duration probability for return periods of 2, 5, 10, and 25 years reveal the northeast-southeast gradient, with increasing periods of annual rainfall of less than 0.1 mm in the eastern third of the basin, in the proximity of the Mediterranean slope.
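The fitting-and-testing step described above, a candidate distribution fitted to annual-maximum spell lengths and then checked with a Kolmogorov-Smirnov test, can be sketched roughly as follows. This is an illustration only: scipy provides no Wakeby distribution, so a two-parameter Weibull (also among the paper's candidates) stands in, and the spell lengths are synthetic rather than the Segura basin data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Hypothetical annual-maximum dry spell lengths (days) for one station;
# 20 years of record, matching the study period length.
spells = rng.weibull(1.8, 20) * 35.0

# Fit a two-parameter Weibull (location fixed at 0), then KS-test the fit
shape, loc, scale = stats.weibull_min.fit(spells, floc=0)
ks_stat, p_value = stats.kstest(spells, "weibull_min", args=(shape, loc, scale))
print(f"shape={shape:.2f}, scale={scale:.1f}, KS p-value={p_value:.3f}")
```

Repeating this over each candidate family and each station, and ranking by KS p value, mirrors how the study arrives at Wakeby as the best overall fit.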
Gap length distributions by PEPR
International Nuclear Information System (INIS)
Warszawer, T.N.
1980-01-01
Conditions guaranteeing exponential gap length distributions are formulated and discussed. Exponential gap length distributions of bubble chamber tracks first obtained on a CRT device are presented. Distributions of resulting average gap lengths and their velocity dependence are discussed. (orig.)
A distributed multimedia toolbox
Scholten, Johan; Jansen, P.G.
1997-01-01
Emphasis of our research lies on the application of realtime multimedia technology: tele-teaching, teleconferencing and collaborative work. To support this research we need a real-time environment that supports rapid prototyping of distributed multimedia applications. Because other systems were not
Distributed Parameter Modelling Applications
DEFF Research Database (Denmark)
Sales-Cruz, Mauricio; Cameron, Ian; Gani, Rafiqul
2011-01-01
and the development of a short-path evaporator. The oil shale processing problem illustrates the interplay amongst particle flows in rotating drums, heat and mass transfer between solid and gas phases. The industrial application considers the dynamics of an Alberta-Taciuk processor, commonly used in shale oil and oil...... the steady state, distributed behaviour of a short-path evaporator....
Distribution center consolidation games
Klijn, F.; Slikker, M.
2005-01-01
We study a location-inventory model to analyze the impact of consolidation of distribution centers on facility and inventory costs. We introduce a cooperative game and show that when demand processes are i.i.d. the core is non-empty, i.e., consolidation allows for a stable division of the minimal
Directory of Open Access Journals (Sweden)
Mario Žagar
2006-06-01
Full Text Available The concept of a distributed digital book (DDB) based on XML is proposed. The author's side as well as the reader's side are analyzed. Different book modules and their frameworks are defined. Module design tools are proposed. A practical solution with appropriate examples is shown.
Tanenbaum, A.S.; van Steen, M.R.
2016-01-01
For this third edition of "Distributed Systems," the material has been thoroughly revised and extended, integrating principles and paradigms into nine chapters: 1. Introduction 2. Architectures 3. Processes 4. Communication 5. Naming 6. Coordination 7. Replication 8. Fault tolerance 9. Security A
Enabling distributed petascale science
International Nuclear Information System (INIS)
Baranovski, Andrew; Bharathi, Shishir; Bresnahan, John
2007-01-01
Petascale science is an end-to-end endeavour, involving not only the creation of massive datasets at supercomputers or experimental facilities, but the subsequent analysis of that data by a user community that may be distributed across many laboratories and universities. The new SciDAC Center for Enabling Distributed Petascale Science (CEDPS) is developing tools to support this end-to-end process. These tools include data placement services for the reliable, high-performance, secure, and policy-driven placement of data within a distributed science environment; tools and techniques for the construction, operation, and provisioning of scalable science services; and tools for the detection and diagnosis of failures in end-to-end data placement and distributed application hosting configurations. In each area, we build on a strong base of existing technology and have made useful progress in the first year of the project. For example, we have recently achieved order-of-magnitude improvements in transfer times (for lots of small files) and implemented asynchronous data staging capabilities; demonstrated dynamic deployment of complex application stacks for the STAR experiment; and designed and deployed end-to-end troubleshooting services. We look forward to working with SciDAC application and technology projects to realize the promise of petascale science
Hyperfinite representation of distributions
Indian Academy of Sciences (India)
A nonstandard treatment of the theory of distributions in terms of a hyperfinite representation … is an (internal) hyperfinite set of hyperreal numbers with internal cardinality … The factor space … is a C-vector space which may be …
Pedersen, J
1999-01-01
The power distribution for the LHC machine and its experiments will be realised making extensive use of the existing infrastructure for LEP. The overall power requirement is approximately the same, about 125 MW. The load distribution will however change. The even points will lose in importance and points 1 and 5 will, due to the installation of ATLAS and CMS, gain. A thorough reorganisation of the 18 kV distribution will thus be necessary. Due to the important cryogenic installations required for the LHC, the 3.3 kV distribution system, supplying mainly cryogenic compressors, will be extended with a number of new substations. The large number of new surface buildings, underground caverns and other underground structures will all receive general service installations: lighting and power. The new injection tunnels will require complete installations: A.C. supplies for the power converters and for general service, and D.C. cabling for the magnets of the beam line. Special safe power installations ar...
Two Photon Distribution Amplitudes
International Nuclear Information System (INIS)
El Beiyad, M.; Pire, B.; Szymanowski, L.; Wallon, S.
2008-01-01
The factorization of the amplitude of the process γ*γ→γγ in the low energy and high photon virtuality region is demonstrated at the Born order and in the leading logarithmic approximation. The leading order two photon (generalized) distribution amplitudes exhibit a characteristic ln Q² behaviour and obey new inhomogeneous evolution equations
Plutonium valence state distributions
International Nuclear Information System (INIS)
Silver, G.L.
1974-01-01
A calculational method for ascertaining equilibrium valence state distributions of plutonium in acid solutions as a function of the plutonium oxidation number and the solution acidity is illustrated with an example. The method may be more practical for manual use than methods based upon polynomial equations. (T.G.)
Distributed Treatment Systems.
Zgonc, David; Plante, Luke
2017-10-01
This section presents a review of the literature published in 2016 on topics relating to distributed treatment systems. This review is divided into the following sections with multiple subsections under each: constituent removal; treatment technologies; and planning and treatment system management.