Reliability Analysis of DOOF for Weibull Distribution
Institute of Scientific and Technical Information of China (English)
陈文华; 崔杰; 樊小燕; 卢献彪; 相平
2003-01-01
A hierarchical Bayesian method for estimating the failure probability under DOOF, taking the quasi-Beta distribution as the prior distribution, is proposed in this paper. The weighted least squares estimate method was used to obtain the formula for computing reliability distribution parameters and estimating the reliability characteristic values under DOOF. Taking one type of aerospace electrical connector as an example, the correctness of the above method was verified through statistical analysis of electrical connector accelerated life test data.
A MULTIVARIATE WEIBULL DISTRIBUTION
Directory of Open Access Journals (Sweden)
Cheng Lee
2010-07-01
Full Text Available A multivariate survival function of the Weibull distribution is developed by extending a theorem by Lu and Bhattacharyya. From the survival function, the probability density function, the cumulative probability function, the determinant of the Jacobian matrix, and the general moment are derived.
Transmuted Complementary Weibull Geometric Distribution
Directory of Open Access Journals (Sweden)
Ahmed Z. Afify
2014-12-01
Full Text Available This paper provides a new generalization of the complementary Weibull geometric distribution introduced by Tojeiro et al. (2014), using the quadratic rank transmutation map studied by Shaw and Buckley (2007). The new distribution is referred to as the transmuted complementary Weibull geometric distribution (TCWGD). The TCWG distribution includes as special cases the complementary Weibull geometric distribution (CWGD), complementary exponential geometric distribution (CEGD), Weibull distribution (WD) and exponential distribution (ED). Various structural properties of the new distribution, including moments, quantiles, moment generating function and Rényi entropy, are derived. We propose the method of maximum likelihood for estimating the model parameters and obtain the observed information matrix. A real data set is used to compare the flexibility of the transmuted version versus the complementary Weibull geometric distribution.
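The quadratic rank transmutation map mentioned above is simple enough to sketch directly. The following is a minimal illustration applying the QRTM to a plain Weibull base CDF (not the complementary Weibull geometric CDF used in the paper); the parameter values are arbitrary choices for demonstration.

```python
import numpy as np

def weibull_cdf(x, shape, scale):
    """CDF of the two-parameter Weibull distribution."""
    return 1.0 - np.exp(-(np.asarray(x, dtype=float) / scale) ** shape)

def transmuted_cdf(x, base_cdf, lam, **kwargs):
    """Quadratic rank transmutation map (Shaw and Buckley):
    G(x) = (1 + lam) * F(x) - lam * F(x)**2, for |lam| <= 1."""
    f = base_cdf(x, **kwargs)
    return (1.0 + lam) * f - lam * f ** 2

x = np.linspace(0.0, 5.0, 501)
g0 = transmuted_cdf(x, weibull_cdf, lam=0.0, shape=2.0, scale=1.0)
g1 = transmuted_cdf(x, weibull_cdf, lam=0.5, shape=2.0, scale=1.0)
```

Setting `lam = 0` recovers the base distribution, which is a quick sanity check on any implementation.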
The Weibull distribution a handbook
Rinne, Horst
2008-01-01
The Most Comprehensive Book on the SubjectChronicles the Development of the Weibull Distribution in Statistical Theory and Applied StatisticsExploring one of the most important distributions in statistics, The Weibull Distribution: A Handbook focuses on its origin, statistical properties, and related distributions. The book also presents various approaches to estimate the parameters of the Weibull distribution under all possible situations of sampling data as well as approaches to parameter and goodness-of-fit testing.Describes the Statistical Methods, Concepts, Theories, and Applications of T
Energy Technology Data Exchange (ETDEWEB)
Jaramillo, O.A.; Borja, M.A.
2004-07-01
The International Standard IEC 61400-12 and other international recommendations suggest the use of the two-parameter Weibull probability distribution function (PDF) to estimate the Annual Energy Production (AEP) of a wind turbine. Most commercial software uses the unimodal Weibull PDF as the default option to carry out estimations of AEP, which in turn are used to optimise wind farm layouts. Furthermore, AEP is essential data to assess the economic feasibility of a wind power project. However, in some regions of the world, the use of these widely adopted and recommended methods leads to incorrect results. This is the case for the region of La Ventosa in Mexico, where the frequency of the wind speed shows a bimodal distribution. In this work, mathematical formulations using a Weibull PDF and a bimodal distribution are established to compare the AEP, the capacity factor and the levelised production cost for a specific wind turbine. By combining one year of wind speed data with the hypothetical power performance of the Vestas V27-225 kW wind turbine, it was found that using the Weibull PDF underestimates AEP (and thus the capacity factor) by about 12%. (author)
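The AEP calculation the abstract describes amounts to integrating a turbine power curve against the wind speed PDF. Below is a hedged sketch of that computation for a unimodal Weibull PDF; the power curve numbers are invented placeholders, not the Vestas V27 data used in the paper.

```python
import numpy as np

def weibull_pdf(v, k, c):
    """Two-parameter Weibull pdf for wind speed v (k: shape, c: scale in m/s)."""
    v = np.asarray(v, dtype=float)
    return (k / c) * (v / c) ** (k - 1) * np.exp(-(v / c) ** k)

def toy_power_curve(v):
    """Illustrative 225 kW power curve (cut-in 3.5 m/s, rated 14 m/s, cut-out
    25 m/s). These numbers are assumptions, not the V27 data from the paper."""
    p = 225.0 * np.clip((v - 3.5) / (14.0 - 3.5), 0.0, 1.0) ** 3
    return np.where((v < 3.5) | (v > 25.0), 0.0, p)

def annual_energy_production(k, c, hours=8760.0):
    """AEP (kWh) = hours * integral over v of P(v) * f(v) dv (trapezoid rule)."""
    v = np.linspace(0.0, 30.0, 3001)
    y = toy_power_curve(v) * weibull_pdf(v, k, c)
    return hours * float(np.sum((y[:-1] + y[1:]) * np.diff(v) / 2.0))

aep = annual_energy_production(k=2.0, c=7.0)
capacity_factor = aep / (225.0 * 8760.0)
```

The same integral evaluated with a bimodal PDF in place of `weibull_pdf` is what exposes the underestimation the paper reports.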
Iskandar, Ismed; Satria Gondokaryono, Yudi
2016-02-01
In reliability theory, the most important problem is to determine the reliability of a complex system from the reliability of its components. The weakness of most reliability theories is that the systems are described and explained as simply functioning or failed. In many real situations, the failures may be from many causes, depending upon the age and the environment of the system and its components. Another problem in reliability theory is that of estimating the parameters of the assumed failure models. The estimation may be based on data collected over censored or uncensored life tests. In many reliability problems, the failure data are simply quantitatively inadequate, especially in engineering design and maintenance systems. Bayesian analyses are more beneficial than classical ones in such cases. Bayesian estimation allows us to combine past knowledge or experience, in the form of an a priori distribution, with life test data to make inferences about the parameter of interest. In this paper, we have investigated the application of Bayesian estimation to competing-risk systems. The cases are limited to models with independent causes of failure, using the Weibull distribution as our model. A simulation is conducted for this distribution with the objectives of verifying the models and the estimators and investigating the performance of the estimators for varying sample sizes. The simulation data are analyzed using both Bayesian and maximum likelihood analyses. The simulation results show that a change in one true parameter value relative to another changes the standard deviation in the opposite direction. For perfect information on the prior distribution, the estimation methods of the Bayesian analyses are better than those of the maximum likelihood. The sensitivity analyses show some amount of sensitivity to shifts of the prior locations. They also show the robustness of the Bayesian analysis within the range
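As a minimal illustration of the Bayesian-versus-MLE comparison described above (not the paper's competing-risk model): if the Weibull shape k is assumed known, the transformed scale θ = c^k admits an inverse-gamma conjugate prior, so the posterior is available in closed form. The prior hyperparameters below are arbitrary choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated Weibull lifetimes with known shape k and scale c (theta = c**k).
k_true, c_true, n = 2.0, 10.0, 200
t = c_true * rng.weibull(k_true, size=n)

# Conjugate update: with known shape k, theta = c**k has an inverse-gamma
# prior IG(a0, b0); the posterior is IG(a0 + n, b0 + sum(t_i**k)).
a0, b0 = 2.0, 50.0          # weak prior (an assumption for illustration)
S = float(np.sum(t ** k_true))
a_post, b_post = a0 + n, b0 + S

theta_mean = b_post / (a_post - 1)      # posterior mean of theta
c_bayes = theta_mean ** (1.0 / k_true)  # implied scale estimate
c_mle = (S / n) ** (1.0 / k_true)       # classical MLE for comparison
```

With a weak prior and this much data the two estimates nearly coincide; the Bayesian advantage the paper discusses appears with informative priors and small samples.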
The Transmuted Generalized Inverse Weibull Distribution
Directory of Open Access Journals (Sweden)
Faton Merovci
2014-05-01
Full Text Available A generalization of the generalized inverse Weibull distribution, the so-called transmuted generalized inverse Weibull distribution, is proposed and studied. We use the quadratic rank transmutation map (QRTM) to generate a flexible family of probability distributions, taking the generalized inverse Weibull distribution as the base distribution and introducing a new parameter that offers more distributional flexibility. Various structural properties, including explicit expressions for the moments, quantiles, and moment generating function of the new distribution, are derived. We propose the method of maximum likelihood for estimating the model parameters and obtain the observed information matrix. A real data set is used to compare the flexibility of the transmuted version versus the generalized inverse Weibull distribution.
Energy Technology Data Exchange (ETDEWEB)
Ramos, A.; Muniz-Calvente, M.; Fernandez, P.; Fernandez Cantel, A.; Lamela, M. J.
2015-10-01
Glass and ceramics present brittle behaviour, so a large scatter in test results is obtained. This dispersion is mainly due to the inevitable presence of micro-cracks on the surface, edge defects or internal defects, which must be taken into account using an appropriate failure criterion that is probabilistic rather than deterministic. Among the existing probability distributions, the two- or three-parameter Weibull distribution is generally used in fitting material resistance results, although the method of use is not always correct. Firstly, in this work, a large experimental programme using annealed glass specimens of different dimensions, based on four-point bending and coaxial double ring tests, was performed. Then, the finite element models made for each type of test, the adjustment of the parameters of the three-parameter Weibull cumulative distribution function (cdf) (λ: location, β: shape, δ: scale) for a certain failure criterion, and the calculation of the effective areas from the cumulative distribution function are presented. Summarizing, this work aims to generalize the use of the three-parameter Weibull function to structural glass elements with stress distributions not analytically described, allowing the proposed probabilistic model to be applied under general loading distributions. (Author)
Censored Weibull Distributed Data in Experimental Design
Støtvig, Jeanett Gunneklev
2014-01-01
This work gives an introduction to experimental design and investigates how four methods handle Weibull distributed censored data: the quick and dirty method, the maximum likelihood method, single imputation and multiple imputation.
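Of the four methods listed, the maximum likelihood method is the easiest to sketch: observed failures contribute the density f(t) to the likelihood and right-censored observations contribute the survival function S(t). A minimal illustration on simulated data (the censoring scheme and parameter values are assumptions, not taken from the thesis):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# Simulate Weibull(shape=1.5, scale=10) lifetimes, right-censored at t = 12.
shape_true, scale_true = 1.5, 10.0
t_full = scale_true * rng.weibull(shape_true, size=500)
censor_time = 12.0
t = np.minimum(t_full, censor_time)
observed = t_full <= censor_time   # True = failure observed, False = censored

def neg_log_likelihood(params):
    """Failures contribute log f(t); censored points contribute log S(t)."""
    k, c = params
    if k <= 0 or c <= 0:
        return np.inf
    z = (t / c) ** k
    log_f = np.log(k / c) + (k - 1) * np.log(t / c) - z
    log_s = -z
    return -(np.sum(log_f[observed]) + np.sum(log_s[~observed]))

res = minimize(neg_log_likelihood, x0=[1.0, np.mean(t)], method="Nelder-Mead")
k_hat, c_hat = res.x
```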
A CLASS OF WEIGHTED WEIBULL DISTRIBUTION
Directory of Open Access Journals (Sweden)
Saman Shahbaz
2010-07-01
Full Text Available The weighted Weibull model is proposed following the method of Azzalini (1985). Basic properties of the distribution, including moments, generating function, hazard rate function and estimation of parameters, have been studied.
Energy Technology Data Exchange (ETDEWEB)
Kantar, Yeliz Mert; Usta, Ilhan [Department of Statistics, Anadolu University, Eskisehir 26470 (Turkey)
2008-05-15
In this study, the minimum cross entropy (MinxEnt) principle is applied for the first time to the wind energy field. This principle allows the inclusion of prior information on a wind speed distribution and includes the maximum entropy (MaxEnt) principle, discussed by Li and Li and by Ramirez in their wind power studies, as a special case. The MinxEnt probability density function (pdf) derived from the MinxEnt principle is used to determine the diurnal, monthly, seasonal and annual wind speed distributions. A comparison between MinxEnt pdfs defined on the basis of the MinxEnt principle and the Weibull pdf on wind speed data, taken from different sources and measured in various regions, is conducted. The wind power densities of the considered regions obtained from Weibull and MinxEnt pdfs are also compared. The results indicate that the pdfs derived from the MinxEnt principle fit a variety of measured wind speed data better than the conventionally applied empirical Weibull pdf. Therefore, it is shown that the MinxEnt principle can be used as an alternative method to estimate both wind distribution and wind power accurately. (author)
Directory of Open Access Journals (Sweden)
Chris Bambey Guure
2012-01-01
Full Text Available The survival function of the Weibull distribution determines the probability that a unit or an individual will survive beyond a certain specified time, while the failure rate is the rate at which a randomly selected individual known to be alive at time t will die in the interval (t, t + Δt). The classical approach for estimating the survival function and the failure rate is the maximum likelihood method. In this study, we strive to determine the best method by comparing the classical maximum likelihood against Bayesian estimators using an informative prior and a proposed data-dependent prior known as a generalised noninformative prior. The Bayesian estimation is considered under three loss functions. Due to the complexity of the integrals involved in the Bayesian estimator, Lindley’s approximation procedure is employed to evaluate the ratio of the integrals. For the purpose of comparison, the mean squared error (MSE) and the absolute bias are obtained. This study is conducted via simulation utilising different sample sizes. We observed from the study that the generalised prior we assumed performed better than the others under the linear exponential loss function with respect to MSE and under the general entropy loss function with respect to absolute bias.
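For reference, the survival function and failure (hazard) rate of the Weibull distribution discussed above have simple closed forms, sketched here with arbitrary parameter values:

```python
import numpy as np

def survival(t, k, c):
    """S(t) = exp(-(t/c)^k): probability of surviving beyond time t."""
    return np.exp(-(np.asarray(t, dtype=float) / c) ** k)

def hazard(t, k, c):
    """h(t) = f(t)/S(t) = (k/c) * (t/c)^(k-1); increasing for k > 1,
    constant for k = 1 (exponential case), decreasing for k < 1."""
    t = np.asarray(t, dtype=float)
    return (k / c) * (t / c) ** (k - 1)

t = np.array([0.5, 1.0, 2.0])
```

It is these two quantities that the maximum likelihood and Bayesian estimators in the paper target.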
Weibull Distributions for the Preterm Delivery
Directory of Open Access Journals (Sweden)
Kavitha, N
2014-06-01
Full Text Available The purpose of this study is to evaluate the levels of CRH during pregnancy by using Weibull distributions. This study also found the rate of change in placental CRH and the level of maternal cortisol in preterm delivery by means of mathematical formulas.
A New Approach for Parameter Estimation of Mixed Weibull Distribution: A Case Study in Spindle
Institute of Scientific and Technical Information of China (English)
Dongwei Gu; Zhiqiong Wang; Guixiang Shen; Yingzhi Zhang; Xilu Zhao
2016-01-01
In order to improve the accuracy and efficiency of the graphical method and maximum likelihood estimation (MLE) in mixed Weibull distribution parameter estimation, Graphical-GA, which combines the advantages of the graphical method and the genetic algorithm (GA), is proposed. Firstly, with the analysis of Weibull probability paper (WPP), a mixed Weibull is identified for data fitting. Secondly, the observed values of the shape and scale parameters are obtained by the graphical method with least squares, and the parameters of the mixed Weibull are then optimized with GA. Thirdly, comparative analysis against the graphical method, piecewise Weibull and two-Weibull shows that the Graphical-GA mixed Weibull is the best. Finally, the spindle MTBF point estimation and interval estimation are obtained based on the mixed Weibull distribution. The results indicate that the graphical method and GA are combined effectively and that the evaluation of the spindle can provide a basis for design and reliability growth.
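The graphical step on Weibull probability paper can be sketched briefly: median-rank plotting positions linearize the CDF as ln(-ln(1-F)) = k ln t - k ln c, and ordinary least squares recovers the parameters. The GA refinement and mixture identification are omitted; this is a single-Weibull illustration on synthetic data, not the paper's full procedure.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.sort(3.0 * rng.weibull(1.8, size=100))   # synthetic single-Weibull sample

# Median-rank plotting positions (Benard's approximation).
n = len(t)
f = (np.arange(1, n + 1) - 0.3) / (n + 0.4)

# Weibull probability paper: y = ln(-ln(1 - F)) is linear in x = ln(t),
# with slope k (shape) and intercept -k * ln(c) (c: scale).
x = np.log(t)
y = np.log(-np.log(1.0 - f))
slope, intercept = np.polyfit(x, y, 1)
k_hat = slope
c_hat = np.exp(-intercept / slope)
```

For a mixture, the WPP plot bends instead of staying linear, which is the identification cue the paper's first step relies on.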
Directory of Open Access Journals (Sweden)
D. Kidmo Kaoga
2014-12-01
Full Text Available In this study, five numerical Weibull distribution methods, namely the maximum likelihood method (MLM), the modified maximum likelihood method, the energy pattern factor method (EPF), the graphical method (GM), and the empirical method (EM), were explored using hourly synoptic data collected from 1985 to 2013 in the district of Maroua in Cameroon. The performance analysis revealed that the MLM was the most accurate model, followed by the EPF and the GM. Furthermore, the comparison between the wind speed standard deviation predicted by the proposed models and the measured data showed that the MLM has a smaller relative error of -3.33% on average, compared to -11.67% on average for the EPF and -8.86% on average for the GM. As a result, the MLM is recommended to estimate the scale and shape parameters for an accurate and efficient wind energy potential evaluation.
Transmuted New Generalized Inverse Weibull Distribution
Directory of Open Access Journals (Sweden)
Muhammad Shuaib Khan
2017-06-01
Full Text Available This paper introduces the transmuted new generalized inverse Weibull distribution by using the quadratic rank transmutation map (QRTM) scheme studied by Shaw et al. (2007). The proposed model contains twenty-three lifetime distributions as special sub-models. Some mathematical properties of the new distribution are formulated, such as the quantile function, Rényi entropy, mean deviations, moments, moment generating function and order statistics. The method of maximum likelihood is used for estimating the model parameters. We illustrate the flexibility and potential usefulness of the new distribution by using reliability data.
Modeling particle size distributions by the Weibull distribution function
Energy Technology Data Exchange (ETDEWEB)
Fang, Zhigang (Rogers Tool Works, Rogers, AR (United States)); Patterson, B.R.; Turner, M.E. Jr (Univ. of Alabama, Birmingham, AL (United States))
1993-10-01
A method is proposed for modeling two- and three-dimensional particle size distributions using the Weibull distribution function. Experimental results show that, for tungsten particles in liquid phase sintered W-14Ni-6Fe, the experimental cumulative section size distributions were well fit by the Weibull probability function, which can also be used to compute the corresponding relative frequency distributions. Modeling the two-dimensional section size distributions facilitates the use of the Saltykov or other methods for unfolding three-dimensional (3-D) size distributions with minimal irregularities. Fitting the unfolded cumulative 3-D particle size distribution with the Weibull function enables computation of the statistical distribution parameters from the parameters of the fit Weibull function.
Determination of Weibull Parameters and Analysis of Wind Power ...
African Journals Online (AJOL)
HOD
Resulting from the analysis, the values of the average wind speed, the average daily wind power, the ... Keywords: Wind power potential, Energy production, Weibull distribution, Wind ... was added globally in 2015 indicating a 23.2% increase.
ZERODUR strength modeling with Weibull statistical distributions
Hartmann, Peter
2016-07-01
The decisive influence on breakage strength of brittle materials such as the low expansion glass ceramic ZERODUR is the surface condition. For polished or etched surfaces it is essential if micro cracks are present and how deep they are. Ground surfaces have many micro cracks caused by the generation process. Here only the depths of the micro cracks are relevant. In any case presence and depths of micro cracks are statistical by nature. The Weibull distribution is the model used traditionally for the representation of such data sets. It is based on the weakest link ansatz. The use of the two or three parameter Weibull distribution for data representation and reliability prediction depends on the underlying crack generation mechanisms. Before choosing the model for a specific evaluation, some checks should be done. Is there only one mechanism present or is it to be expected that an additional mechanism might contribute deviating results? For ground surfaces the main mechanism is the diamond grains' action on the surface. However, grains breaking from their bonding might be moved by the tool across the surface introducing a slightly deeper crack. It is not to be expected that these scratches follow the same statistical distribution as the grinding process. Hence, their description with the same distribution parameters is not adequate. Before including them a dedicated discussion should be performed. If there is additional information available influencing the selection of the model, for example the existence of a maximum crack depth, this should be taken into account also. Micro cracks introduced by small diamond grains on tools working with limited forces cannot be arbitrarily deep. For data obtained with such surfaces the existence of a threshold breakage stress should be part of the hypothesis. This leads to the use of the three parameter Weibull distribution. A differentiation based on the data set alone without preexisting information is possible but requires a
Using the Weibull distribution reliability, modeling and inference
McCool, John I
2012-01-01
Understand and utilize the latest developments in Weibull inferential methods While the Weibull distribution is widely used in science and engineering, most engineers do not have the necessary statistical training to implement the methodology effectively. Using the Weibull Distribution: Reliability, Modeling, and Inference fills a gap in the current literature on the topic, introducing a self-contained presentation of the probabilistic basis for the methodology while providing powerful techniques for extracting information from data. The author explains the use of the Weibull distribution
Packing fraction of particles with a Weibull size distribution
Brouwers, H. J. H.
2016-07-01
This paper addresses the void fraction of polydisperse particles with a Weibull (or Rosin-Rammler) size distribution. It is demonstrated that the governing parameters of this distribution can be uniquely related to those of the lognormal distribution. Hence, an existing closed-form expression that predicts the void fraction of particles with a lognormal size distribution can be transformed into an expression for Weibull distributions. Both expressions contain the contraction coefficient β. Like the monosized void fraction φ1, it is a physical parameter which depends on the particles' shape and their state of compaction only. Based on a consideration of the scaled binary void contraction, a linear relation for (1 - φ1)β as a function of φ1 is proposed, with proportionality constant B, depending on the state of compaction only. This is validated using computational and experimental packing data concerning random close and random loose packing arrangements. Finally, using this β, the closed-form analytical expression governing the void fraction of Weibull distributions is thoroughly compared with empirical data reported in the literature, and good agreement is found. Furthermore, the present analysis yields an algebraic equation relating the void fractions of monosized particles at different compaction states. This expression appears to be in good agreement with a broad collection of random close and random loose packing data.
Distributed Fuzzy CFAR Detection for Weibull Clutter
Zaimbashi, Amir; Taban, Mohammad Reza; Nayebi, Mohammad Mehdi
In distributed detection systems, restricting the output of the local decision to one bit certainly implies a substantial information loss. In this paper, we consider fuzzy detection, which uses a function called a membership function for mapping the observation space of each local detector to a value between 0 and 1, indicating the degree of assurance about the presence or absence of a signal. In this case, we examine the problem of distributed Maximum Likelihood (ML) and Order Statistic (OS) constant false alarm rate (CFAR) detection using fuzzy fusion rules such as “Algebraic Product” (AP), “Algebraic Sum” (AS), “Union” (Un) and “Intersection” (IS) in the fusion centre. For Weibull clutter, the expression of the membership function based on the ML or OS CFAR processors in the local detectors is also obtained. For comparison, we consider a binary distributed detector, which uses the Maximum Likelihood and Algebraic Product (MLAP) or Order Statistic and Algebraic Product (OSAP) CFAR processors as the local detectors. In homogeneous and non-homogeneous situations (multiple targets or clutter edges), the performances of the fuzzy and binary distributed detectors are analyzed and compared. The simulation results indicate the superior and robust performance of the distributed systems using fuzzy detection in both homogeneous and non-homogeneous situations.
Weibull model of multiplicity distribution in hadron-hadron collisions
Dash, Sadhana; Nandi, Basanta K.; Sett, Priyanka
2016-06-01
We introduce the use of the Weibull distribution as a simple parametrization of charged particle multiplicities in hadron-hadron collisions at all available energies, ranging from ISR energies to the most recent LHC energies. In statistics, the Weibull distribution has wide applicability in natural processes that involve fragmentation processes. This provides a natural connection to the available state-of-the-art models for multiparticle production in hadron-hadron collisions, which involve QCD parton fragmentation and hadronization. The Weibull distribution describes the multiplicity data at the most recent LHC energies better than the single negative binomial distribution.
Directory of Open Access Journals (Sweden)
Antonio Colmenar-Santos
2014-10-01
Full Text Available Electric power losses are constantly present during the service life of wind farms and must be considered in the calculation of the income arising from selling the produced electricity. It is typical to estimate the electrical losses in the design stage as those occurring when the wind farm operates at rated power; nevertheless, it is necessary to determine a method for checking whether the actual losses meet the design requirements during the operation period. In this paper, we prove that the electric losses at rated power should not be considered as a reference level, and a simple methodology is developed to analyse and foresee the actual losses in a set period as a function of the wind resource in such period, defined according to the Weibull distribution, and the characteristics of the wind farm electrical infrastructure. This methodology provides a simple way to determine the actual electricity losses in the design phase and to check them during operation.
A Weibull distribution accrual failure detector for cloud computing.
Liu, Jiaxi; Wu, Zhibo; Wu, Jin; Dong, Jian; Zhao, Yao; Wen, Dongxin
2017-01-01
Failure detectors are a fundamental component for building high-availability distributed systems. To meet the requirements of complicated large-scale distributed systems, accrual failure detectors that can adapt to multiple applications have been studied extensively. However, several implementations of accrual failure detectors do not adapt well to the cloud service environment. To solve this problem, a new accrual failure detector based on the Weibull distribution, called the Weibull Distribution Failure Detector, is proposed specifically for cloud computing. It can adapt to the dynamic and unexpected network conditions in cloud computing. The performance of the Weibull Distribution Failure Detector is evaluated and compared based on public classical experiment data and cloud computing experiment data. The results show that the Weibull Distribution Failure Detector has better performance in terms of speed and accuracy in unstable scenarios, especially in cloud computing.
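An accrual detector of the kind described reports a continuous suspicion level rather than a binary verdict. A minimal sketch, not the paper's implementation: fit a Weibull to recent heartbeat inter-arrival times (here by method of moments, an assumed choice) and report the Weibull CDF of the time elapsed since the last heartbeat.

```python
import math

def fit_weibull_moments(samples):
    """Method-of-moments Weibull fit: solve for the shape k matching the
    sample coefficient of variation (bisection; cv^2 is decreasing in k),
    then get the scale c from the mean. A sketch, not the estimator any
    particular detector implementation uses."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((s - mean) ** 2 for s in samples) / (n - 1)
    cv2 = var / mean ** 2

    def cv2_of(k):
        g1 = math.gamma(1.0 + 1.0 / k)
        return math.gamma(1.0 + 2.0 / k) / (g1 * g1) - 1.0

    lo, hi = 0.1, 50.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if cv2_of(mid) > cv2:
            lo = mid
        else:
            hi = mid
    k = 0.5 * (lo + hi)
    c = mean / math.gamma(1.0 + 1.0 / k)
    return k, c

def suspicion(elapsed, k, c):
    """Accrual suspicion level: Weibull CDF of time since last heartbeat."""
    return 1.0 - math.exp(-((elapsed / c) ** k))

# Hypothetical heartbeat inter-arrival times (seconds).
history = [1.0, 1.2, 0.9, 1.1, 1.0, 1.3, 0.8, 1.1]
k, c = fit_weibull_moments(history)
```

The suspicion level rises smoothly from 0 toward 1 as the silence grows, so each application can pick its own threshold, which is the point of the accrual design.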
On Generalized Upper (k) Record Values From Weibull Distribution
Directory of Open Access Journals (Sweden)
Jerin Paul
2015-09-01
Full Text Available In this paper we study the generalized upper (k) record values arising from the Weibull distribution. Expressions for the moments and product moments of these generalized upper (k) record values are derived. Some properties of generalized upper (k) record values which characterize the Weibull distribution have been established. Also, some distributional properties of generalized upper (k) record values arising from the Weibull distribution are considered and used for suggesting an estimator for the shape parameter of the Weibull distribution. The location and scale parameters are estimated using the Best Linear Unbiased Estimation procedure. Prediction of a future record using the Best Linear Unbiased Predictor has been studied. A real-life data set is used to illustrate the results generated in this work.
Comparison of Estimation Methods for Fitting Weibull Distribution to ...
African Journals Online (AJOL)
Tersor
JOURNAL OF RESEARCH IN FORESTRY, WILDLIFE AND ENVIRONMENT VOLUME 7, No.2 SEPTEMBER, 2015. ... method was more accurate in fitting the Weibull distribution to the natural stand. ... appropriate for mixed age group.
Weibull model of Multiplicity Distribution in hadron-hadron collisions
Dash, Sadhana
2014-01-01
We introduce the Weibull distribution as a simple parametrization of charged particle multiplicities in hadron-hadron collisions at all available energies, ranging from ISR energies to the most recent LHC energies. In statistics, the Weibull distribution has wide applicability in natural processes involving fragmentation processes. This gives a natural connection to the available state-of-the-art models for multi-particle production in hadron hadron collisions involving QCD parton fragmentation and hadronization.
Polynomial approximations of the Normal to Weibull distribution transformation
Directory of Open Access Journals (Sweden)
Andrés Feijóo
2014-09-01
Full Text Available Some of the tools that are generally employed in power system analysis need to use approaches based on statistical distributions for simulating the cumulative behavior of the different system devices, for example in probabilistic load flow. The presence of wind farms in power systems has increased the use of Weibull and Rayleigh distributions among them. Not only the distributions themselves, but also the satisfaction of certain constraints, such as correlation between series of data or even autocorrelation, can be of importance in the simulation. Correlated Weibull or Rayleigh distributions can be obtained by transforming correlated Normal distributions, and it can be observed that certain statistical values, such as the means and the standard deviations, tend to be retained when operating such transformations, although why this happens is not evident. The objective of this paper is to analyse the consequences of using such transformations. The methodology consists of comparing the results obtained by means of a direct transformation and those obtained by means of approximations based on the use of first- and second-degree polynomials. Simulations have been carried out with series of data which can be interpreted as wind speeds. The use of polynomial approximations gives accurate results in comparison with direct transformations and provides an approach that helps explain why the statistical values are retained during the transformations.
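The direct Normal-to-Weibull transformation discussed above maps correlated standard Normal samples through the Normal CDF to uniforms, then through the inverse Weibull CDF. A minimal sketch with assumed parameter values (the correlation and Weibull parameters are illustrative, not the paper's):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)

# Two correlated standard Normal series (correlation 0.8).
rho = 0.8
cov = [[1.0, rho], [rho, 1.0]]
z = rng.multivariate_normal([0.0, 0.0], cov, size=20000)

def normal_to_weibull(z, k, c):
    """Direct transform: z -> u = Phi(z) -> x = c * (-ln(1 - u))**(1/k)."""
    u = norm.cdf(z)
    return c * (-np.log1p(-u)) ** (1.0 / k)

v1 = normal_to_weibull(z[:, 0], k=2.0, c=8.0)   # e.g. a wind speed series
v2 = normal_to_weibull(z[:, 1], k=2.0, c=8.0)
r = np.corrcoef(v1, v2)[0, 1]
```

The transform is monotone, so rank correlation is preserved exactly while Pearson correlation is only approximately retained, which is the effect the paper examines.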
Constant-step stress accelerated life test of VFD under Weibull distribution case
Institute of Scientific and Technical Information of China (English)
ZHANG Jian-ping; GENG Xin-min
2005-01-01
A constant-step stress accelerated life test of the Vacuum Fluorescent Display (VFD) was conducted with increased cathode temperature. Statistical analysis was done by applying the Weibull distribution to describe the life and the Least Squares Method (LSM) to estimate the Weibull parameters. Self-designed software was used to predict the VFD life. Numerical results showed that the average life of the VFD is over 30000 h, that the VFD life follows a Weibull distribution, and that the life-stress relationship satisfies the linear Arrhenius equation. Accurate calculation of the key parameter enabled rapid estimation of VFD life.
On the Weibull distribution for wind energy assessment
DEFF Research Database (Denmark)
Batchvarova, Ekaterina; Gryning, Sven-Erik
2014-01-01
The two-parameter Weibull distribution is traditionally used to describe the long term fluctuations in the wind speed as part of the theoretical framework for wind energy assessment of wind farms. The Weibull distribution is described by a shape and a scale parameter. Here, based on recent long-term measurements performed by a wind lidar, the vertical profile of the shape parameter will be discussed for a sub-urban site, a coastal site and a marine site. The profile of the shape parameter was found to be substantially different over land and sea. A parameterization of the vertical behavior of the shape
ASYMPTOTIC PROPERTIES OF MLE FOR WEIBULL DISTRIBUTION WITH GROUPED DATA
Institute of Scientific and Technical Information of China (English)
XUE Hongqi; SONG Lixin
2002-01-01
A grouped data model for the Weibull distribution is considered. Under mild conditions, the maximum likelihood estimators (MLE) are shown to be identifiable, strongly consistent, asymptotically normal, and to satisfy the law of the iterated logarithm. The Newton iteration algorithm is also considered, which converges to the unique solution of the likelihood equation. Moreover, we extend these results to a random case.
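In the grouped-data model, each observation is known only up to the bin it fell in, so each bin contributes F(b_i) - F(a_i) to the likelihood. A sketch of the resulting MLE, maximized numerically; the bin edges and counts below are synthetic, chosen to resemble a Weibull(1.8, 3.0) sample.

```python
import numpy as np
from scipy.optimize import minimize

# Bin edges and observed counts per bin (synthetic grouped data).
edges = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 6.0, np.inf])
counts = np.array([65, 126, 125, 90, 78, 16])   # hypothetical frequencies

def weibull_cdf(x, k, c):
    x = np.asarray(x, dtype=float)
    return np.where(np.isinf(x), 1.0, 1.0 - np.exp(-(x / c) ** k))

def neg_log_likelihood(params):
    """Grouped likelihood: each bin contributes n_i * log(F(b_i) - F(a_i))."""
    k, c = params
    if k <= 0 or c <= 0:
        return np.inf
    p = np.diff(weibull_cdf(edges, k, c))
    if np.any(p <= 0):
        return np.inf
    return -float(np.sum(counts * np.log(p)))

res = minimize(neg_log_likelihood, x0=[1.5, 2.5], method="Nelder-Mead")
k_hat, c_hat = res.x
```

The asymptotic results in the abstract (consistency, normality) describe exactly this estimator as the number of grouped observations grows.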
Investigation of Weibull statistics in fracture analysis of cast aluminum
Holland, F. A., Jr.; Zaretsky, E. V.
1989-01-01
The fracture strengths of two large batches of A357-T6 cast aluminum coupon specimens were compared by using two-parameter Weibull analysis. The minimum number of these specimens necessary to find the fracture strength of the material was determined. The applicability of three-parameter Weibull analysis was also investigated. A design methodology based on the combination of elementary stress analysis and Weibull statistical analysis is advanced and applied to the design of a spherical pressure vessel shell. The results from this design methodology are compared with results from the applicable ASME pressure vessel code.
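The Weibull size effect underlying such design methodologies can be sketched as follows (a minimal illustration with names of our choosing; the paper's stress analysis is more elaborate):

```python
import math

def weibull_failure_prob(sigma, m, sigma0, size_ratio=1.0):
    """Two-parameter Weibull probability of failure at stress sigma for a
    specimen whose stressed size (area or volume) is size_ratio times the
    reference size: Pf = 1 - exp(-size_ratio * (sigma/sigma0)**m)."""
    return 1.0 - math.exp(-size_ratio * (sigma / sigma0) ** m)

def weibull_size_scaling(sigma1, size1, size2, m):
    """Characteristic strength predicted at size2, given characteristic
    strength sigma1 at size1 and Weibull modulus m:
    sigma2 = sigma1 * (size1/size2)**(1/m)."""
    return sigma1 * (size1 / size2) ** (1.0 / m)
```

A smaller stressed size raises the characteristic strength, which is the mechanism by which Weibull statistics links coupon data to component design.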
On modeling of lifetime data using two-parameter Gamma and Weibull distributions
Shanker, Rama; Shukla, Kamlesh Kumar; Shanker, Ravi; Leonida, Tekie Asehun
2016-01-01
The analysis and modeling of lifetime data are crucial in almost all applied sciences including medicine, insurance, engineering, behavioral sciences and finance, amongst others. The main objective of this paper is to present a comparative study of two-parameter gamma and Weibull distributions for modeling lifetime data.
Directory of Open Access Journals (Sweden)
Jain Sanjay
2010-01-01
Full Text Available In this paper an inventory model is developed with ramp-type demand, starting with shortage, and three-parameter Weibull distribution deterioration. A brief analysis of the cost involved is carried out by an example.
Closed form expressions for moments of the beta Weibull distribution
Directory of Open Access Journals (Sweden)
Gauss M Cordeiro
2011-06-01
Full Text Available The beta Weibull distribution was first introduced by Famoye et al. (2005) and studied by these authors and Lee et al. (2007). However, they do not give explicit expressions for the moments. In this article, we derive explicit closed-form expressions for the moments of this distribution, which generalize results available in the literature for some sub-models. We also obtain expansions for the cumulative distribution function and Rényi entropy. Further, we discuss maximum likelihood estimation and provide formulae for the elements of the expected information matrix. We also demonstrate the usefulness of this distribution on a real data set.
Goh, Segun; Kwon, H. W.; Choi, M. Y.
2014-06-01
We consider the Yule-type multiplicative growth and division process, and describe the ubiquitous emergence of Weibull and log-normal distributions in a single framework. With the help of the integral transform and series expansion, we show that both distributions serve as asymptotic solutions of the time evolution equation for the branching process. In particular, the maximum likelihood method is employed to discriminate between the emergence of the Weibull distribution and that of the log-normal distribution. Further, the detailed conditions for the distinguished emergence of the Weibull distribution are probed. It is observed that the emergence depends on the manner of the division process for the two different types of distribution. Numerical simulations are also carried out, confirming the results obtained analytically.
A Study on The Mixture of Exponentiated-Weibull Distribution
Directory of Open Access Journals (Sweden)
Adel Tawfik Elshahat
2016-12-01
Full Text Available Mixtures of measures or distributions occur frequently in the theory and applications of probability and statistics. In the simplest case it may, for example, be reasonable to assume that one is dealing with the mixture in given proportions of a finite number of normal populations with different means or variances. The mixture parameter may also be denumerably infinite, as in the theory of sums of a random number of random variables, or continuous, as in the compound Poisson distribution. The use of finite mixture distributions, to control for unobserved heterogeneity, has become increasingly popular among those estimating dynamic discrete choice models. One of the barriers to using mixture models is that parameters that could previously be estimated in stages must now be estimated jointly: using mixture distributions destroys any additive separability of the log-likelihood function. In this thesis, the maximum likelihood estimators have been obtained for the parameters of the mixture of exponentiated Weibull distributions when the sample is available under a censoring scheme. The maximum likelihood estimators of the parameters and the asymptotic variance-covariance matrix have also been obtained. A numerical illustration for these new results is given.
Analysis of tensile bond strengths using Weibull statistics.
Burrow, Michael F; Thomas, David; Swain, Mike V; Tyas, Martin J
2004-09-01
Tensile strength tests of restorative resins bonded to dentin, and the resultant strengths of interfaces between the two, exhibit wide variability. Many variables can affect test results, including specimen preparation and storage, test rig design and experimental technique. However, the more fundamental source of variability, that associated with the brittle nature of the materials, has received little attention. This paper analyzes results from micro-tensile tests on unfilled resins and adhesive bonds between restorative resin composite and dentin in terms of reliability using the Weibull probability of failure method. Results for the tensile strengths of Scotchbond Multipurpose Adhesive (3M) and Clearfil LB Bond (Kuraray) bonding resins showed Weibull moduli (m) of 6.17 (95% confidence interval, 5.25-7.19) and 5.01 (95% confidence interval, 4.23-5.8). Analysis of results for micro-tensile tests on bond strengths to dentin gave moduli between 1.81 (Clearfil Liner Bond 2V) and 4.99 (Gluma One Bond, Kulzer). Material systems with m in this range do not have a well-defined strength. The Weibull approach also enables the size dependence of the strength to be estimated. An example where the bonding area was changed from 3.1 to 1.1 mm diameter is shown. Weibull analysis provides a method for determining the reliability of strength measurements in the analysis of data from bond strength and tensile tests on dental restorative materials.
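The observation that material systems with low Weibull modulus m lack a well-defined strength can be made concrete through the coefficient of variation, which depends only on m (a small illustrative helper, not part of the paper):

```python
import math

def weibull_strength_cv(m):
    """Coefficient of variation of a two-parameter Weibull strength
    distribution with modulus m; the scale parameter cancels out:
    CV = sqrt(Gamma(1 + 2/m) / Gamma(1 + 1/m)**2 - 1)."""
    g1 = math.gamma(1.0 + 1.0 / m)
    g2 = math.gamma(1.0 + 2.0 / m)
    return math.sqrt(g2 / g1 ** 2 - 1.0)
```

For m between roughly 2 and 5, as reported for the dentin bonds above, the scatter is a large fraction of the mean, whereas high-modulus materials show tightly clustered strengths.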
A spatial scan statistic for survival data based on Weibull distribution.
Bhatt, Vijaya; Tiwari, Neeraj
2014-05-20
The spatial scan statistic has been developed as a geographical cluster detection analysis tool for different types of data sets such as Bernoulli, Poisson, ordinal, normal and exponential. We propose a scan statistic for survival data based on Weibull distribution. It may also be used for other survival distributions, such as exponential, gamma, and log normal. The proposed method is applied on the survival data of tuberculosis patients for the years 2004-2005 in Nainital district of Uttarakhand, India. Simulation studies reveal that the proposed method performs well for different survival distribution functions.
Designing a Repetitive Group Sampling Plan for Weibull Distributed Processes
Directory of Open Access Journals (Sweden)
Aijun Yan
2016-01-01
Full Text Available Acceptance sampling plans are useful tools to determine whether submitted lots should be accepted or rejected. An efficient and economic sampling plan is very desirable for the high quality levels required by production processes. The process capability index CL is an important quality parameter to measure product quality. Utilizing the relationship between the CL index and the nonconforming rate, a repetitive group sampling (RGS) plan based on the CL index is developed in this paper for the case where the quality characteristic follows the Weibull distribution. The optimal parameters of the proposed RGS plan are determined by satisfying the commonly used producer's risk and consumer's risk simultaneously while minimizing the average sample number (ASN), and are then tabulated for different combinations of acceptance quality level (AQL) and limiting quality level (LQL). The results show that the proposed plan has better performance than the single sampling plan in terms of ASN. Finally, the proposed RGS plan is illustrated with an industrial example.
Large-Scale Weibull Analysis of H-451 Nuclear- Grade Graphite Specimen Rupture Data
Nemeth, Noel N.; Walker, Andrew; Baker, Eric H.; Murthy, Pappu L.; Bratton, Robert L.
2012-01-01
A Weibull analysis was performed of the strength distribution and size effects for 2000 specimens of H-451 nuclear-grade graphite. The data, generated elsewhere, measured the tensile and four-point-flexure room-temperature rupture strength of specimens excised from a single extruded graphite log. Strength variation was compared with specimen location, size, and orientation relative to the parent body. In our study, data were progressively and extensively pooled into larger data sets to discriminate overall trends from local variations and to investigate the strength distribution. The CARES/Life and WeibPar codes were used to investigate issues regarding the size effect, Weibull parameter consistency, and nonlinear stress-strain response. Overall, the Weibull distribution described the behavior of the pooled data very well. However, the issue regarding the smaller-than-expected size effect remained. This exercise illustrated that a conservative approach using a two-parameter Weibull distribution is best for designing graphite components with low probability of failure for the in-core structures in the proposed Generation IV (Gen IV) high-temperature gas-cooled nuclear reactors. This exercise also demonstrated the continuing need to better understand the mechanisms driving stochastic strength response. Extensive appendixes are provided with this report to show all aspects of the rupture data and analytical results.
Directory of Open Access Journals (Sweden)
Lianxia Zhao
2016-01-01
Full Text Available An inventory model for Weibull-distributed deteriorating items is considered in this paper so as to minimize the total cost per unit time. The model starts with shortage, allows partial backlogging, and assumes a trapezoidal demand rate. By analyzing the model, an efficient solution procedure is proposed to determine the optimal replenishment policy and the optimal order quantity; the average total costs are also obtained. Finally, numerical examples are provided to illustrate the theoretical results, and a sensitivity analysis of the major parameters with respect to the stability of the optimal solution is also carried out.
Sazuka, Naoya; Inoue, Jun-Ichi
2007-03-01
A Weibull distribution with power-law tails is confirmed as a good candidate to describe the first passage time process of foreign currency exchange rates. The Lorentz curve and the corresponding Gini coefficient for a Weibull distribution are derived analytically. We show that the coefficient is in good agreement with the same quantity calculated from the empirical data. We also calculate the average waiting time which is an important measure to estimate the time for customers to wait until the next price change after they login to their computer systems. By assuming that the first passage time distribution might change its shape from the Weibull to the power-law at some critical time, we evaluate the averaged waiting time by means of the renewal-reward theorem. We find that our correction of tails of the distribution makes the averaged waiting time much closer to the value obtained from empirical data analysis. We also discuss the deviation from the estimated average waiting time by deriving the waiting time distribution directly. These results make us conclude that the first passage process of the foreign currency exchange rates is well described by a Weibull distribution with power-law tails.
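For a Weibull distribution the Gini coefficient mentioned above has a closed form, G = 1 − 2^(−1/k), independent of the scale parameter; a short sketch comparing it with the usual sample estimate (helper names are ours, not the paper's notation):

```python
import math

def weibull_gini(k):
    """Closed-form Gini coefficient of a Weibull(k, lambda) distribution;
    the scale parameter cancels: G = 1 - 2**(-1/k)."""
    return 1.0 - 2.0 ** (-1.0 / k)

def empirical_gini(xs):
    """Sample Gini coefficient of nonnegative data (sorted-rank formula)."""
    x = sorted(xs)
    n = len(x)
    return sum((2 * i - n - 1) * xi for i, xi in enumerate(x, 1)) / (n * sum(x))
```

At k = 1 the Weibull reduces to the exponential distribution and G = 0.5, the value the sample estimate should approach for exponentially distributed first passage times.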
Gross, Bernard
1996-01-01
Material characterization parameters obtained from naturally flawed specimens are necessary for reliability evaluation of non-deterministic advanced ceramic structural components. The least squares best fit method is applied to the three parameter uniaxial Weibull model to obtain the material parameters from experimental tests on volume or surface flawed specimens subjected to pure tension, pure bending, four point or three point loading. Several illustrative example problems are provided.
Directory of Open Access Journals (Sweden)
P Bhattacharya
2016-09-01
Full Text Available The wind resource varies with the time of day and the season of the year, and even to some extent from year to year. Wind energy has inherent variability and hence is expressed by distribution functions. In this paper, we present methods for estimating the Weibull parameters for a low-wind-speed characterization, namely the shape parameter (k) and the scale parameter (c), and characterize the discrete wind data sample by the discrete Hilbert transform (DHT). The Weibull distribution is an important distribution, especially for reliability and maintainability analysis. Suitable values for both the shape and scale parameters of the Weibull distribution are important for selecting locations for installing wind turbine generators; the scale parameter is also important to determine whether a wind farm is good or not. The use of the DHT for wind speed characterization opens a new avenue for the DHT beyond its application in digital signal processing. In this paper, the discrete Hilbert transform is applied to characterize wind sample data measured at the College of Engineering and Management, Kolaghat, East Midnapore, India, in January 2011.
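A common moment-based route to the shape parameter k and scale parameter c is the empirical method; the sketch below uses Justus' approximation and is an illustration, not necessarily the exact procedure of this paper:

```python
import math

def weibull_empirical(speeds):
    """Empirical-method estimates of the Weibull shape k and scale c
    from a wind-speed sample: k = (std/mean)**-1.086 (Justus'
    approximation), then c = mean / Gamma(1 + 1/k)."""
    n = len(speeds)
    mean = sum(speeds) / n
    var = sum((v - mean) ** 2 for v in speeds) / (n - 1)
    k = (math.sqrt(var) / mean) ** -1.086
    c = mean / math.gamma(1.0 + 1.0 / k)
    return k, c
```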
Directory of Open Access Journals (Sweden)
B.B. Sagar
2016-09-01
Full Text Available The aim of this paper is to estimate the number of defects in software and remove them successfully. The paper incorporates a Weibull distribution approach along with an inflection S-shaped Software Reliability Growth Model (SRGM); in this combination, the two-parameter Weibull distribution methodology is used. The Relative Prediction Error (RPE) is calculated to assess the validity of the developed model. Experimental results on actual data from five data sets are compared with two other existing models, showing that the proposed software reliability growth model provides better estimates for removing defects. The paper thus presents a software reliability growth model combining the features of both the Weibull distribution and the inflection S-shaped SRGM to estimate the defects of a software system, and provides help to researchers and software industries in developing highly reliable software products.
Directory of Open Access Journals (Sweden)
Aliashim Albani
2014-02-01
Full Text Available The demand for electricity in Malaysia is growing in tandem with its Gross Domestic Product (GDP) growth. Malaysia will need even more energy as it strives to grow towards a high-income economy, and it has taken steps to explore renewable energy (RE), including wind energy, as an alternative source for generating electricity. In the present study, the wind energy potential of each site is statistically analyzed based on one year of measured hourly time-series wind speed data. Wind data were obtained from the Malaysian Meteorological Department (MMD) weather stations at nine selected sites in Malaysia. The data were processed in MATLAB to determine and generate the Weibull and Rayleigh distribution functions. Both Weibull and Rayleigh models were fitted and compared to the field data probability distributions of year 2011. The analysis showed that the Weibull distribution fits the field data better than the Rayleigh distribution for the whole year 2011. The wind power density of every site has been studied based on the Weibull and Rayleigh functions. The Weibull distribution shows a good approximation for estimation of wind power density in Malaysia.
Directory of Open Access Journals (Sweden)
Manna S.K.
2008-01-01
Full Text Available In this paper, we consider the problem of simultaneous determination of retail price and lot-size (RPLS) under the assumption that the supplier offers a fixed credit period to the retailer. It is assumed that the item in stock deteriorates over time at a rate that follows a two-parameter Weibull distribution and that the price-dependent demand is represented by a constant-price-elasticity function of retail price. The RPLS decision model is developed and solved analytically. Results are illustrated with the help of a base example. Computational results show that the supplier earns more profits when the credit period is greater than the replenishment cycle length. Sensitivity analysis of the solution to changes in the value of input parameters of the base example is also discussed.
An EOQ Model with Two-Parameter Weibull Distribution Deterioration and Price-Dependent Demand
Mukhopadhyay, Sushanta; Mukherjee, R. N.; Chaudhuri, K. S.
2005-01-01
An inventory replenishment policy is developed for a deteriorating item and price-dependent demand. The rate of deterioration is taken to be time-proportional and the time to deterioration is assumed to follow a two-parameter Weibull distribution. A power law form of the price dependence of demand is considered. The model is solved analytically…
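A Weibull time-to-deterioration implies an instantaneous deterioration rate θ(t) = αβt^(β−1). The minimal numeric sketch below integrates the resulting inventory balance equation; constant demand is assumed here for simplicity, unlike the paper's price-dependent form, and the function name is ours:

```python
def inventory_level(t, i0, alpha, beta, demand, dt=1e-4):
    """Euler integration of dI/dt = -alpha*beta*s**(beta-1)*I - demand,
    the inventory balance when time to deterioration follows a
    two-parameter Weibull distribution (illustrative sketch)."""
    level, s = i0, 0.0
    while s < t:
        # deterioration rate theta(s); guarded at s = 0 for beta < 1
        theta = alpha * beta * s ** (beta - 1.0) if s > 0.0 else 0.0
        level -= (theta * level + demand) * dt
        s += dt
    return level
```

With α = 0 the stock declines linearly at the demand rate; a positive deterioration rate depletes it faster, which is what drives the optimal replenishment timing in such models.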
EFFECT OF NANOPOWDER ADDITION ON THE FLEXURAL STRENGTH OF ALUMINA CERAMIC - A WEIBULL MODEL ANALYSIS
Directory of Open Access Journals (Sweden)
Daidong Guo
2016-05-01
Full Text Available Alumina ceramics were prepared either with micrometer-sized alumina powder (MAP) or with the addition of nanometer-sized alumina powder (NAP). The density, crystalline phase, flexural strength and fracture surface of the two ceramics were measured and compared. Emphasis has been put on the influence of nanopowder addition on the flexural strength of Al₂O₃ ceramic. The analysis based on the Weibull distribution model suggests that the distribution of the flexural strength of the NAP ceramic is more concentrated than that of the MAP ceramic. Therefore, the NAP ceramics will be more stable and reliable in real applications.
Modified Weibull Distribution for Analyzing the Tensile Strength of Bamboo Fibers
Directory of Open Access Journals (Sweden)
Fang Wang
2014-12-01
Full Text Available There is growing evidence that the standard Weibull strength distribution is not always accurate for the description of variability in tensile strength and its dependence on the gauge size of brittle fibers. In this work, a modified Weibull model by incorporating the diameter variation of bamboo fiber is proposed to investigate the effect of fiber length and diameter on the tensile strength. Fiber strengths are obtained for lengths ranging from 20 to 60 mm and diameters ranging from 196.6 to 584.3 μm through tensile tests. It is shown that as the within-fiber diameter variation increases, the fracture strength of the bamboo fiber decreases. In addition, the accuracy of using weak-link scaling predictions based on the standard and modified Weibull distribution are assessed, which indicates that the use of the modified distribution provides better correlation with the experimental data than the standard model. The result highlights the accuracy of the modified Weibull model for characterizing the strength and predicting the size dependence of bamboo fiber.
Scaling Analysis of the Tensile Strength of Bamboo Fibers Using Weibull Statistics
Directory of Open Access Journals (Sweden)
Jiaxing Shao
2013-01-01
Full Text Available This study demonstrates the effect of weak-link scaling on the tensile strength of bamboo fibers. The proposed model considers the random nature of fiber strength, which is reflected by using a two-parameter Weibull distribution function. Tension tests were performed on samples that could be scaled in length. The size effects in fiber length on the strength were analyzed based on Weibull statistics. The results verify the use of Weibull parameters from specimen testing for predicting the strength distributions of fibers of longer gauge lengths.
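Weak-link scaling of a two-parameter Weibull strength distribution amounts to raising the survival probability to the power of the gauge-length ratio, which is equivalent to rescaling the characteristic strength by the ratio to the power −1/m. A short sketch (function names are illustrative):

```python
import math

def weibull_cdf(sigma, m, sigma0):
    """Failure probability at stress sigma for Weibull modulus m and
    characteristic strength sigma0 at the reference gauge length."""
    return 1.0 - math.exp(-((sigma / sigma0) ** m))

def scaled_failure_prob(sigma, m, sigma0, length_ratio):
    """Weak-link scaling: failure probability of a fiber length_ratio
    times the reference gauge length,
    F_L(sigma) = 1 - (1 - F_L0(sigma))**length_ratio."""
    return 1.0 - (1.0 - weibull_cdf(sigma, m, sigma0)) ** length_ratio
```

This equivalence is what allows Weibull parameters fitted at one gauge length to predict the strength distribution of longer fibers.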
Transverse Momentum Distribution in Heavy Ion Collision using q-Weibull Formalism
Dash, Sadhana
2016-01-01
We have implemented the Tsallis q-statistics in the Weibull model of particle production, known as the q-Weibull distribution, to describe the transverse-momentum (pT) distribution of charged hadrons at mid-rapidity measured at RHIC and LHC energies. The model describes the data remarkably well for the entire pT range measured in nucleus-nucleus and nucleon-nucleon collisions. The proposed distribution is based on the non-extensive Tsallis q-statistics, which replaces the usual thermal equilibrium assumption of hydrodynamical models. The parameters of the distribution can be related to various aspects of the complex dynamics associated with such collision processes.
Pasari, S.
2013-05-01
Earthquake recurrence interval is one of the important ingredients of probabilistic seismic hazard assessment (PSHA) for any location. The Weibull, gamma, generalized exponential and lognormal distributions are well-established probability models for recurrence interval estimation, and they share many important characteristics. In this paper, we compare the effectiveness of these models in recurrence interval estimation and eventually in hazard analysis. To assess the appropriateness of these models, we use a complete and homogeneous earthquake catalogue of 20 events (M ≥ 7.0) spanning the period 1846 to 1995 from the North-East Himalayan region (20°-32° N and 87°-100° E). The model parameters are estimated using the modified maximum likelihood estimator (MMLE). No geological or geophysical evidence has been considered in this calculation. The estimated conditional probability becomes quite high after about a decade for an elapsed time of 17 years (i.e., 2012). Moreover, this study shows that the generalized exponential distribution fits the data more closely than the conventional models, and hence it is tentatively concluded that the generalized exponential distribution can be effectively considered in earthquake recurrence studies.
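The conditional probability quoted above is computed from the survival function of the fitted recurrence model; a minimal Weibull sketch (parameter values in the test are placeholders, not the paper's MMLE estimates):

```python
import math

def weibull_sf(t, k, lam):
    """Survival function of a Weibull recurrence-interval model."""
    return math.exp(-((t / lam) ** k))

def conditional_prob(elapsed, horizon, k, lam):
    """Probability that the next event occurs within `horizon` years,
    given `elapsed` years since the last event:
    P(T <= elapsed + horizon | T > elapsed)."""
    return 1.0 - weibull_sf(elapsed + horizon, k, lam) / weibull_sf(elapsed, k, lam)
```

For k = 1 the model is memoryless (exponential), while for k > 1 the conditional probability grows with elapsed time, which is why it "becomes quite high" as the quiescent period lengthens.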
Directory of Open Access Journals (Sweden)
Sghaier T
2016-10-01
Full Text Available The objective of this study was to evaluate the effectiveness of both the Normal and the two-parameter Weibull distributions in describing the diameter distribution of Tetraclinis articulata stands in north-east Tunisia. The parameters of the Weibull function were estimated using the moments method and the maximum likelihood approach. The data used in this study came from temporary plots. The three diameter distribution models were compared firstly by estimating the parameters of the distribution directly from individual tree measurements taken in each plot (parameter estimation method), and secondly by predicting the same parameters from stand variables (parameter prediction method). The comparison was based on bias, mean absolute error, mean square error and the Reynolds' index error (as a percentage). On the basis of the parameter estimation method, the Normal distribution gave slightly better results, whereas the Weibull distribution with the maximum likelihood approach gave the best results for the parameter prediction method. Hence, in the latter case, the Weibull distribution with the maximum likelihood approach appears to be the most suitable for estimating the parameters, reducing the different comparison criteria for the distribution of trees by diameter class in Tetraclinis articulata forests in Tunisia.
Le, Cui; Wanxi, Peng; Zhengjun, Sun; Lili, Shang; Guoning, Chen
2014-07-01
Bamboo is a composite material with radial gradient variation, but the vascular bundles in the inner layer are evenly distributed. The objective is to determine the regular size pattern and to perform a Weibull statistical analysis of the vascular bundle tensile strength in the inner layer of Moso bamboo. The size and shape of vascular bundles in the inner layer are similar, with an average area of about 0.1550 mm². A statistical evaluation of the tensile strength of the vascular bundles was conducted by means of Weibull statistics; the results show that the Weibull modulus m is 6.1121, from which an accurate reliability assessment of the vascular bundle is determined.
Huang, Zhi-Hui; Tang, Ying-Chun; Dai, Kai
2016-05-01
Semiconductor material and product qualified rates are directly related to manufacturing costs and the survival of the enterprise. A dynamic reliability growth analysis method is applied to study manufacturing execution system reliability growth and thereby improve product quality. Referring to the classical Duane model assumptions and the tracking growth forecast TGP programming model, a Weibull distribution model was established from the failure data. Combining the median-rank and average-rank methods, Weibull information fusion reliability growth curves were fitted by linear regression and least squares estimation. This model overcomes a weakness of the Duane model, namely the low accuracy of its MTBF point estimation; analysis of the failure data shows that the method and the test-and-evaluation modeling process of an instance are basically identical. The median rank is used in statistics to determine the distribution function of a random variable and is a good way to handle complex systems with limited sample sizes. The method therefore has great engineering application value.
Directory of Open Access Journals (Sweden)
Islam Khandaker Dahirul
2016-01-01
Full Text Available This paper explores wind speed distribution using the Weibull probability distribution and the Rayleigh distribution, methods that are proven to provide accurate and efficient estimation of energy output for wind energy conversion systems. The two parameters of the Weibull distribution (shape and scale parameters k and c, respectively) and the scale parameter of the Rayleigh distribution have been determined based on hourly time-series wind speed data recorded from October 2014 to October 2015 at Saint Martin's island, Bangladesh. This research examines three numerical methods, namely the Graphical Method (GM), the Empirical Method (EM) and the Energy Pattern Factor method (EPF), for estimating the Weibull parameters; the Rayleigh distribution method has also been analyzed throughout the study. The results revealed that the Graphical method, followed by the Empirical method and the Energy Pattern Factor method, was the most accurate and efficient way to determine the values of k and c for approximating the wind speed distribution in terms of estimated power error; the Rayleigh distribution yields the largest power error. The potential for wind energy development in Saint Martin's island, Bangladesh, as found from the data analysis, is also explained in this paper.
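Of the three estimators named above, the Energy Pattern Factor method has the most compact form; a minimal sketch (the function name is ours, and the 3.69/EPF² approximation is the commonly used one, assumed rather than taken from this paper):

```python
import math

def weibull_epf(speeds):
    """Energy-pattern-factor estimates of Weibull k and c from wind data:
    EPF = <v^3>/<v>^3, k = 1 + 3.69/EPF**2, c = <v>/Gamma(1 + 1/k)."""
    n = len(speeds)
    mean = sum(speeds) / n
    epf = (sum(v ** 3 for v in speeds) / n) / mean ** 3
    k = 1.0 + 3.69 / epf ** 2
    c = mean / math.gamma(1.0 + 1.0 / k)
    return k, c
```

Because the EPF is the ratio driving wind power density, this estimator ties the fitted parameters directly to the energy content of the wind record.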
On the Performance Analysis of Digital Communications over Weibull-Gamma Channels
Ansari, Imran Shafique
2015-05-01
In this work, the performance analysis of digital communications over a composite Weibull-Gamma (WG) multipath-fading and shadowing channel is presented, wherein the WG distribution is appropriate for modeling fading environments when multipath is superimposed on shadowing. More specifically, in this work, exact closed-form expressions are derived for the probability density function, the cumulative distribution function, the moment generating function, and the moments of a composite WG channel. Capitalizing on these results, new exact closed-form expressions are offered for the outage probability, the higher-order amount of fading, the average error rate for binary and M-ary modulation schemes, and the ergodic capacity under various types of transmission policies, mostly in terms of Meijer's G-functions. These new analytical results were also verified via computer-based Monte-Carlo simulation results. © 2015 IEEE.
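The Monte-Carlo verification mentioned above can be sketched in a few lines. The composite model below, a Weibull variate whose scale is set by a gamma shadowing draw, is our assumption for illustration, not the paper's exact closed-form construction, and all names are ours:

```python
import random

def wg_outage_prob(threshold, k, gamma_shape, gamma_scale, n=50000, seed=1):
    """Monte Carlo outage probability P(SNR < threshold) for a composite
    Weibull-Gamma channel, modeled here (an illustrative assumption) as
    a Weibull variate whose scale is a gamma shadowing draw."""
    rng = random.Random(seed)
    below = 0
    for _ in range(n):
        shadow = rng.gammavariate(gamma_shape, gamma_scale)  # shadowing
        snr = rng.weibullvariate(shadow, k)                  # multipath
        if snr < threshold:
            below += 1
    return below / n
```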
Comparison of Estimators for Exponentiated Inverted Weibull Distribution Based on Grouped Data Amal
Directory of Open Access Journals (Sweden)
S. Hassan
2014-04-01
Full Text Available In many situations, instead of a complete sample, data are available only in grouped form. This paper presents estimation of population parameters for the exponentiated inverted Weibull distribution based on grouped data with equi- and unequi-spaced grouping. Several alternative estimation schemes, such as the method of maximum likelihood, least lines, least squares, minimum chi-square, and modified minimum chi-square, are considered. Since the different methods of estimation do not provide closed-form solutions, numerical procedures are applied. The root mean squared error of the resulting estimators is used as the comparison criterion to measure both the accuracy and the precision for each parameter.
Directory of Open Access Journals (Sweden)
Wu Kun-Shan
2002-01-01
Full Text Available In this paper, an EOQ inventory model is considered in which the inventory is depleted not only by time-varying demand but also by Weibull-distributed deterioration, and in which the inventory is permitted to start with shortages and end without shortages. A theory is developed to obtain the optimal solution of the problem; it is then illustrated with the aid of several numerical examples. Moreover, we also assume that the holding cost is a continuous, non-negative and non-decreasing function of time in order to extend the EOQ model. Finally, the sensitivity of the optimal solution to changes in the values of different system parameters is also studied.
Directory of Open Access Journals (Sweden)
Navid Feroze
2016-03-01
Full Text Available Families of mixture distributions have a wide range of applications in fields such as fisheries, agriculture, botany, economics, medicine, psychology, electrophoresis, finance, communication theory, geology and zoology. They provide the flexibility needed to model failure distributions of components with multiple failure modes. Mostly, the Bayesian procedure for estimating the parameters of a mixture model has been described under Type-I censoring. In particular, Bayesian analysis of mixture models under doubly censored samples has not yet been considered in the literature. The main objective of this paper is to develop Bayes estimation for inverse Weibull mixture distributions under doubly censoring. The posterior estimation has been conducted under the assumption of gamma and inverse Levy priors using the precautionary loss function and the weighted squared error loss function. Comparisons among the different estimators have been made based on analysis of simulated and real-life data sets.
The Transmuted Geometric-Weibull distribution: Properties, Characterizations and Regression Models
Directory of Open Access Journals (Sweden)
Zohdy M Nofal
2017-06-01
Full Text Available We propose a new lifetime model called the transmuted geometric-Weibull distribution. Some of its structural properties, including ordinary and incomplete moments, quantile and generating functions, probability weighted moments, Rényi and q-entropies and order statistics, are derived. The maximum likelihood method is discussed for estimating the model parameters, and its performance is assessed by means of a Monte Carlo simulation study. A new location-scale regression model is introduced based on the proposed distribution. The new distribution is applied to two real data sets to illustrate its flexibility. Empirical results indicate that the proposed distribution can be an alternative to other lifetime models available in the literature for modeling real data in many areas.
Behera, Nirbhay K; Naik, Bharati; Nandi, Basanta K; Pani, Tanmay
2016-01-01
The charged-particle multiplicity distribution and the transverse energy distribution measured in heavy-ion collisions at top RHIC and LHC energies are described using a two-component model based on the convolution of the Monte Carlo Glauber model with the Weibull model for particle production. The model successfully describes the multiplicity and transverse energy distributions of minimum-bias collision data for a wide range of energies. We also propose that the Weibull-Glauber model can be used to determine centrality classes in heavy-ion collisions, as an alternative to the conventional negative binomial distribution for particle production.
Santi, D. N.; Purnaba, I. G. P.; Mangku, I. W.
2016-01-01
A Bonus-Malus system is said to be optimal if it is financially balanced for insurance companies and fair for policyholders. Previous research on Bonus-Malus systems has concerned the determination of the risk premium applied to all of the severity guaranteed by the insurance company. In practice, not all of the severity claimed by the policyholder may be covered by the insurance company. When the insurance company sets a maximum bound on the severity incurred, it becomes necessary to modify the severity distribution into a bounded severity distribution. In this paper, an optimal Bonus-Malus system is discussed whose claim frequency component has a geometric distribution and whose severity component has a truncated Weibull distribution. The number of claims is assumed to follow a Poisson distribution whose expected number λ is exponentially distributed, so that the number of claims has a geometric distribution. The severity with a given parameter θ is assumed to have a truncated exponential distribution, and θ is modelled using the Levy distribution, so that the severity has a truncated Weibull distribution.
Directory of Open Access Journals (Sweden)
Abeer Abd-Alla EL-Helbawy
2016-09-01
Full Text Available Accelerated life tests provide quick information on lifetime distributions by testing materials or products at higher-than-normal levels of stress, such as pressure, temperature, vibration, voltage or load, to induce failures. In this paper, the acceleration model assumed is the log-linear model. Constant-stress tests are discussed based on Type I and Type II censoring. The Kumaraswamy Weibull distribution is used. Estimators of the parameters, the reliability, the hazard rate function and the p-th percentile at normal conditions, low stress and high stress are obtained. In addition, credible intervals for the parameters of the models are constructed. Optimum test plans are designed. Numerical methods, such as Laplace approximation and Markov chain Monte Carlo, are used to evaluate the complicated integrals.
Yusuf, Madaki Umar; Bakar, Mohd. Rizam B. Abu
2016-06-01
Models for survival data that include a proportion of individuals who are not subject to the event under study are known as cure fraction models, or simply long-term survival models. The two most common models used to estimate the cure fraction are the mixture model and the non-mixture model. In this work, we present mixture and non-mixture cure fraction models for survival data based on the beta-Weibull distribution. This four-parameter distribution has been proposed as an extension of the Weibull distribution for the analysis of lifetime data. This approach allows the inclusion of covariates in the models, with the estimation of the parameters obtained under a Bayesian approach using Gibbs sampling methods.
1981-12-01
… the variance of point estimators is given by Mendenhall and Scheaffer (Ref 17:269), for both biased and unbiased estimation. In addition to this … Weibull Distribution. Thesis, Wright-Patterson AFB, Ohio: Air Force Institute of Technology, December 1980. 17. Mendenhall, W. and R. L. Scheaffer
Weibull analyses of bacterial interaction forces measured using AFM
van der Mei, Henderina; de Vries, Jacob; Busscher, Hendrik
2010-01-01
Statistically significant conclusions from interaction forces obtained by AFM are difficult to draw because of large data spreads. Weibull analysis, common in macroscopic bond-strength analyses, takes advantage of this spread to derive a Weibull distribution, yielding the probability of occurrence o
Comparison of Bayesian and Classical Analysis of Weibull Regression Model: A Simulation Study
Directory of Open Access Journals (Sweden)
İmran KURT ÖMÜRLÜ
2011-01-01
Full Text Available Objective: The purpose of this study was to compare the performance of the classical Weibull regression model (WRM) and the Bayesian WRM under varying conditions using Monte Carlo simulations. Material and Methods: Generated data were fitted with both the classical WRM and the Bayesian WRM under varying informative priors and sample sizes using our simulation algorithm. In the simulation studies, sample sizes of n=50, 100 and 250 were used, and informative priors based on a normal prior distribution were selected for b1. For each situation, 1000 simulations were performed. Results: The Bayesian WRM with a proper informative prior showed good performance with very little bias. The bias of the Bayesian WRM increased as the priors moved away from reliable values, for all sample sizes. Furthermore, given proper priors, the Bayesian WRM obtained predictions with smaller standard errors than the classical WRM in both small and large samples. Conclusion: In this simulation study, the Bayesian WRM showed better performance than the classical method when subjective data analysis was performed taking account of expert opinion and historical knowledge about the parameters. Consequently, the Bayesian WRM should be preferred when reliable informative priors exist; otherwise, the classical WRM should be preferred.
Directory of Open Access Journals (Sweden)
H Akbari
2016-02-01
Full Text Available Introduction: Temperature and water potential are two of the most important environmental factors regulating seed germination. The germination response of a population of seeds to temperature and water potential can be described with a hydrothermal time (HTT) model. Despite the wide use of HTT models to simulate germination, little research has critically examined the assumption that the base water potential within these models is normally distributed. An alternative to the normal distribution that can fit a range of distribution types is the Weibull distribution. Using germination data of castor bean (Ricinus communis L.) over a range of water potentials and sub-optimal temperatures, we compared the utility of the normal and Weibull distributions in estimating the base water potential (Ψb). The accuracy of their respective HTT models in predicting germination percentage across the sub-optimal temperature range was also examined. Materials and Methods: Castor bean seed germination was tested across a range of water potentials (0, -0.3, -0.6 and -0.9 MPa) at sub-optimal temperatures (ranging from 10 to 35 ˚C, at 5 ˚C intervals). Osmotic solutions were prepared by dissolving polyethylene glycol 8000 in distilled water according to the Michel (1983) equation for a given temperature. Seed germination was tested on 4 replicates of 50 seeds in moist paper towels in an incubator. The HTT models based on the normal and Weibull distributions were fitted to data from all combinations of temperatures and water potentials using the PROC NLMIXED procedure in SAS. Results and Discussion: Based on both the normal and Weibull distribution functions, the hydrotime constant and base water potential for castor bean seed germination declined with increasing temperature. The reduced values of base water potential showed the greater water uptake needed for germination at lower temperatures, and the reduced hydrotime constant indicated an increase
Cheng, Mingjian; Zhang, Yixin; Gao, Jie; Wang, Fei; Zhao, Fengsheng
2014-06-20
We model the average channel capacity of optical wireless communication systems for weak to strong turbulence channels using the exponentiated Weibull distribution model. The joint effects of beam wander and spread, pointing errors, atmospheric attenuation, and the spectral index of non-Kolmogorov turbulence on system performance are included. Our results show that the average capacity decreases steeply as the propagation length L increases from 0 to 200 m, and decreases slowly or tends to a stable value when the propagation length L is greater than 200 m. In the weak turbulence region, increasing the detection aperture improves the average channel capacity, and atmospheric visibility is an important factor affecting the average channel capacity. In the strong turbulence region, increasing the radius of the detection aperture cannot reduce the effects of atmospheric turbulence on the average channel capacity, and the effect of atmospheric visibility on the channel information capacity can be ignored. The effect of the spectral power exponent on the average channel capacity is higher in the strong turbulence region than in the weak turbulence region. Irrespective of the details determining the turbulent channel, pointing errors have a significant effect on the average channel capacity of optical wireless communication systems in turbulence channels.
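The exponentiated Weibull model used above raises the ordinary Weibull CDF to a power α. A minimal sketch of its CDF and inverse-transform sampling (illustrative parameter names; not the authors' code):

```python
import numpy as np

def exp_weibull_cdf(x, alpha, beta, eta):
    """Exponentiated Weibull CDF: ordinary Weibull CDF raised to the power alpha."""
    return (1.0 - np.exp(-(x / eta) ** beta)) ** alpha

def exp_weibull_sample(n, alpha, beta, eta, rng=None):
    """Inverse-transform sampling: solve F(x) = u for x."""
    rng = np.random.default_rng() if rng is None else rng
    u = rng.uniform(size=n)
    return eta * (-np.log(1.0 - u ** (1.0 / alpha))) ** (1.0 / beta)
```

With alpha = 1 the model reduces to the ordinary Weibull distribution, which is one way to sanity-check an implementation.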
Compressed data separation via dual frames based split-analysis with Weibull matrices
Institute of Scientific and Technical Information of China (English)
CAI Yun; LI Song
2013-01-01
In this paper, we consider the data separation problem, where the original signal is composed of two distinct subcomponents, via a dual-frames-based split-analysis approach. We show that the two distinct subcomponents, which are sparse in two different general frames respectively, can be exactly recovered with high probability when the measurement matrix is a Weibull random matrix (not Gaussian) and the two frames satisfy a mutual coherence property. Our result may be significant for analysing the split-analysis model for data separation.
Institute of Scientific and Technical Information of China (English)
WANG Ronghua; FEI Heliang
2004-01-01
In this note, the tampered failure rate model is generalized from the step-stress accelerated life testing setting to progressive stress accelerated life testing for the first time. For the parametric setting where the lifetime distribution is Weibull with scale parameter satisfying the inverse power law, maximum likelihood estimation is investigated.
Directory of Open Access Journals (Sweden)
Jose Javier Gorgoso-Varela
2016-04-01
Full Text Available Aim of study: In this study we compare the accuracy of three bivariate distributions: Johnson's SBB, Weibull-2P and LL-2P functions for characterizing the joint distribution of tree diameters and heights. Area of study: North-West of Spain. Material and methods: Diameter and height measurements of 128 plots of pure and even-aged Tasmanian blue gum (Eucalyptus globulus Labill.) stands located in the North-west of Spain were considered in the present study. The SBB bivariate distribution was obtained from SB marginal distributions using a Normal Copula based on a four-parameter logistic transformation. The Plackett Copula was used to obtain the bivariate models from the Weibull and Logit-logistic univariate marginal distributions. The negative logarithm of the maximum likelihood function was used to compare the results, and the Wilcoxon signed-rank test was used to compare the related samples of these logarithms calculated for each sample plot and each distribution. Main results: The best results were obtained by using the Plackett copula, and the best marginal distribution was the Logit-logistic. Research highlights: The copulas used in this study have shown good performance for modeling the joint distribution of tree diameters and heights. They could easily be extended for modelling multivariate distributions involving other tree variables, such as tree volume or biomass.
Survival Analysis of Patients with Breast Cancer using Weibull Parametric Model.
Baghestani, Ahmad Reza; Moghaddam, Sahar Saeedi; Majd, Hamid Alavi; Akbari, Mohammad Esmaeil; Nafissi, Nahid; Gohari, Kimiya
2015-01-01
The Cox model is known as one of the most frequently used methods for analyzing survival data. However, in some situations parametric methods may provide better estimates. In this study, a Weibull parametric model was employed to assess possible prognostic factors that may affect the survival of patients with breast cancer. We studied 438 patients with breast cancer who visited and were treated at the Cancer Research Center in Shahid Beheshti University of Medical Sciences during 1992 to 2012; the patients were followed up until October 2014. Patients or family members were contacted via telephone calls to confirm whether they were still alive. Clinical, pathological, and biological variables as potential prognostic factors were entered in univariate and multivariate analyses. The log-rank test and the Weibull parametric model with a forward approach, respectively, were used for univariate and multivariate analyses. All analyses were performed using STATA version 11. A P-value lower than 0.05 was defined as significant. On univariate analysis, age at diagnosis, level of education, type of surgery, lymph node status, tumor size, stage, histologic grade, estrogen receptor, progesterone receptor, and lymphovascular invasion had a statistically significant effect on survival time. On multivariate analysis, lymph node status, stage, histologic grade, and lymphovascular invasion were statistically significant. The one-year overall survival rate was 98%. Based on these data and using the Weibull parametric model with a forward approach, we found that patients with lymphovascular invasion were at 2.13 times greater risk of death due to breast cancer.
Weibull analysis and flexural strength of hot-pressed core and veneered ceramic structures.
Bona, Alvaro Della; Anusavice, Kenneth J; DeHoff, Paul H
2003-11-01
To test the hypothesis that the Weibull moduli of single- and multilayer ceramics are controlled primarily by the structural reliability of the core ceramic. Methods: Seven groups of 20 bar specimens (25 x 4 x 1.2 mm) were made from the following materials: (1) IPS Empress, a hot-pressed (HP) leucite-based core ceramic; (2) IPS Empress2, a HP lithia-based core ceramic; (3 and 7) Evision, a HP lithia-based core ceramic (ES); (4) IPS Empress2 body, a glass veneer; (5) ES (1.1 mm thick) plus a glaze layer (0.1 mm); and (6) ES (0.8 mm thick) plus veneer (0.3 mm) and glaze (0.1 mm). Each specimen was subjected to four-point flexure loading at a cross-head speed of 0.5 mm/min while immersed in distilled water at 37 degrees C, except for Group 7, which was tested in a dry environment. Failure loads were recorded and the fracture surfaces were examined using SEM. ANOVA and Duncan's multiple range test were used for statistical analysis. No significant differences were found between the mean flexural strength values of Groups 2, 3, 5, and 6 or between Groups 1 and 4 (p>0.05). However, significant differences were found between dry (Group 7) and wet (Groups 1-6) conditions. Glazing had no significant effect on the flexural strength or Weibull modulus. The strength and Weibull modulus of the ES ceramic were similar to those of Groups 5 and 6. The structural reliability of a veneered core ceramic is controlled primarily by that of the core ceramic.
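The Weibull modulus reported in strength studies like this one is commonly obtained from a linearized Weibull plot of the fracture strengths. A minimal sketch (illustrative, not the authors' analysis code), using median-rank probability estimates:

```python
import numpy as np

def weibull_modulus(strengths):
    """Estimate the Weibull modulus m and characteristic strength sigma_0
    from fracture strengths via the linearized plot ln(-ln(1-P)) vs ln(sigma)."""
    s = np.sort(np.asarray(strengths, dtype=float))
    n = s.size
    p = (np.arange(1, n + 1) - 0.5) / n    # median-rank failure probabilities
    y = np.log(-np.log(1.0 - p))
    x = np.log(s)
    m, intercept = np.polyfit(x, y, 1)     # the slope is the Weibull modulus
    sigma0 = np.exp(-intercept / m)        # y = 0 at sigma = sigma_0
    return m, sigma0
```

A higher m indicates a narrower strength distribution, i.e. better structural reliability, which is the quantity compared across the specimen groups above.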
Heo, Jun-Haeng; Boes, D. C.; Salas, J. D.
2001-02-01
Parameter estimation in a regional flood frequency setting, based on a Weibull model, is revisited. A two parameter Weibull distribution at each site, with common shape parameter over sites that is rationalized by a flood index assumption, and with independence in space and time, is assumed. The estimation techniques of method of moments and method of probability weighted moments are studied by proposing a family of estimators for each technique and deriving the asymptotic variance of each estimator. Then a single estimator and its asymptotic variance for each technique, suggested by trying to minimize the asymptotic variance over the family of estimators, is obtained. These asymptotic variances are compared to the Cramer-Rao Lower Bound, which is known to be the asymptotic variance of the maximum likelihood estimator. A companion paper considers the application of this model and these estimation techniques to a real data set. It includes a simulation study designed to indicate the sample size required for compatibility of the asymptotic results to fixed sample sizes.
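The method-of-moments technique studied above matches sample moments to their theoretical Weibull counterparts; since the coefficient of variation depends only on the shape parameter, the shape can be solved for first. A plain single-site sketch (one member of the family of moment estimators, not the regional estimator proposed in the paper):

```python
import numpy as np
from math import gamma
from scipy.optimize import brentq

def weibull_mom(sample):
    """Method-of-moments fit of a two-parameter Weibull:
    match the sample mean and coefficient of variation."""
    x = np.asarray(sample, dtype=float)
    mean, std = x.mean(), x.std(ddof=1)
    cv = std / mean

    def cv_gap(k):  # theoretical CV minus sample CV
        g1, g2 = gamma(1 + 1 / k), gamma(1 + 2 / k)
        return np.sqrt(g2 - g1 ** 2) / g1 - cv

    k = brentq(cv_gap, 0.1, 50.0)      # shape parameter (CV is decreasing in k)
    lam = mean / gamma(1 + 1 / k)      # scale parameter from the mean
    return k, lam
```

The asymptotic-variance comparisons in the paper can then be explored by repeating such fits over simulated samples.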
An Approach to Determine the Weibull Parameters for Wind Energy Analysis: The Case of Galicia (Spain)
Directory of Open Access Journals (Sweden)
Camilo Carrillo
2014-04-01
Full Text Available The Weibull probability density function (PDF) has mostly been used to fit wind speed distributions for wind energy applications. The goodness of fit of the results depends on the estimation method used and the wind type of the analyzed area. In this paper, a study of a particular area (Galicia) was performed to test the performance of several fitting methods. The goodness of fit was evaluated by well-known indicators that use the wind speed or the available wind power density. However, energy production must be a critical parameter in wind energy applications. Hence, a fitting method that accounts for the power density distribution is proposed. To highlight the usefulness of this method, indicators that use energy production values are also presented.
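Once Weibull parameters k and c are fitted to wind speeds, the mean wind power density follows in closed form from E[v³] = c³ Γ(1 + 3/k). A minimal MLE-based sketch (using scipy's `weibull_min`; this is not the power-density-oriented fitting method the paper proposes):

```python
import numpy as np
from math import gamma
from scipy.stats import weibull_min

def wind_weibull_power(speeds, rho=1.225):
    """Fit a Weibull PDF to wind speeds (MLE, location fixed at 0) and
    return the shape k, scale c, and mean wind power density in W/m^2."""
    k, _, c = weibull_min.fit(speeds, floc=0)
    power_density = 0.5 * rho * c ** 3 * gamma(1 + 3 / k)  # E[0.5 * rho * v^3]
    return k, c, power_density
```

Comparing this closed-form power density against the empirical mean of 0.5·rho·v³ is one of the goodness-of-fit indicators the abstract alludes to.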
A class of generalized beta distributions, Pareto power series and Weibull power series
Lemos de Morais, Alice
2009-01-01
In this dissertation we work with three classes of probability distributions: one already known in the literature, the class of generalized beta distributions (Beta-G), and two new classes introduced in this thesis, based on the composition of the Pareto and Weibull distributions with the class of discrete power series distributions. We present a general review of the Beta-G class and introduce a special case, the beta generalized logistic distribution of type IV (BGL(IV)).
Energy Technology Data Exchange (ETDEWEB)
Gabriel Filho, Luis Roberto Almeida [Universidade Estadual Paulista (CE/UNESP), Tupa, SP (Brazil). Coordenacao de Estagio; Cremasco, Camila Pires [Faculdade de Tecnologia de Presidente Prudente, SP (Brazil); Seraphim, Odivaldo Jose [Universidade Estadual Paulista (FCA/UNESP), Botucatu, SP (Brazil). Fac. de Ciencias Agronomicas; Cagnon, Jose Angelo [Universidade Estadual Paulista (FEB/UNESP), Bauru, SP (Brazil). Faculdade de Engenharia
2008-07-01
The wind behavior of a region can be described by frequency distributions that provide the information and characteristics needed for a possible deployment of wind energy harvesting in the region. These characteristics, such as the annual average speed, the variance and standard deviation of the recorded speeds, and the hourly average wind power density, can be obtained from the frequency of occurrence of a given speed, which in turn should be studied through analytical expressions. The analytical function best suited to wind distributions is the Weibull density function, which can be determined by numerical methods and linear regressions. Once this function has been determined, all the wind characteristics mentioned above may be determined accurately. The objective of this work is to characterize the wind behavior in the region of Botucatu-SP and to determine the energy potential for the installation of wind turbines. For the development of the present research, a Young Wind Monitor anemometer from the Campbell company was installed at a height of 10 meters. The experiment was developed at the Nucleus of Alternative and Renewable Energies - NEAR of the Laboratory of Agricultural Energy of the Department of Agricultural Engineering of UNESP, Agronomy Sciences Faculty, Lageado Experimental Farm, located in the city of Botucatu - SP. The geographic location is defined by the coordinates 22 deg 51' South latitude (S) and 48 deg 26' West longitude (W) and an average altitude of 786 meters above sea level. The analysis was carried out using records of wind speed during the period from September 2004 to September 2005. After the frequency distribution of the hourly average wind speed was determined, the associated Weibull function was fitted, thus making possible the determination of the annual average wind speed (2.77 m/s), the standard deviation of the recorded speeds (0.55 m/s), and the
Directory of Open Access Journals (Sweden)
Ruben M. Mouangue
2014-05-01
Full Text Available The modeling of the wind speed distribution is of great importance for the assessment of wind energy potential and the performance of wind energy conversion systems. In this paper, the choice between two methods of determining the Weibull parameters shows their influence on the performance of the Weibull distribution. Because of the high frequency of calm winds at the site of Ngaoundere airport, we characterize the wind potential using the Weibull distribution with parameters determined by the modified maximum likelihood method. This approach is compared to the Weibull distribution with parameters determined by the maximum likelihood method, and to the hybrid distribution, which is recommended for wind potential assessment of sites having a nonzero probability of calm. Using data provided by the ASECNA Weather Service (Agency for the Safety of Air Navigation in Africa and Madagascar), we evaluate the goodness of fit of the various fitted distributions to the wind speed data using Q-Q plots, Pearson's coefficient of correlation, the mean wind speed, the mean square error, the energy density and its relative error. The results show that the accuracy of the Weibull distribution with parameters determined by the modified maximum likelihood method is higher than that of the others. This approach is then used to estimate the monthly and annual energy production of the Ngaoundere airport site. The largest energy contribution is made in March, with 255.7 MWh. The results also show that a wind turbine generator installed on this particular site could not work for at least half of the time because of the high frequency of calm. For this kind of site, the modified maximum likelihood method proposed by Seguro and Lambert in 2000 is one of the best methods for determining the Weibull parameters.
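The modified maximum likelihood method of Seguro and Lambert (2000) iterates on the shape parameter using wind data in frequency-distribution form, with calms excluded from the bins. A sketch under that reading (bin centres and frequencies are illustrative inputs, not the paper's data):

```python
import numpy as np

def modified_mle_weibull(v, f, tol=1e-6, max_iter=200):
    """Iterative modified maximum-likelihood estimate of the Weibull k and c
    from binned wind data: v holds bin-centre speeds (> 0), f their observed
    frequencies; calms are excluded, so f may sum to less than 1."""
    v, f = np.asarray(v, float), np.asarray(f, float)
    fnz = f.sum()                    # probability of non-zero wind
    k = 2.0                          # common starting guess for the shape
    for _ in range(max_iter):
        vk = v ** k
        k_new = (np.sum(vk * np.log(v) * f) / np.sum(vk * f)
                 - np.sum(np.log(v) * f) / fnz) ** -1
        if abs(k_new - k) < tol:
            k = k_new
            break
        k = k_new
    c = (np.sum(v ** k * f) / fnz) ** (1.0 / k)
    return k, c
```

Because the sums are weighted by observed frequencies rather than raw observations, the same update works directly on anemometer data already reduced to a speed histogram.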
Roy, Aparna; Chakraborty, Sumit; Kundu, Sarada Prasad; Basak, Ratan Kumar; Majumder, Subhasish Basu; Adhikari, Basudam
2012-03-01
Chemically modified jute fibres are potentially useful as natural reinforcement in composite materials. Jute fibres were treated with 0.25%-1.0% sodium hydroxide (NaOH) solution for 0.5-48 h. The hydrophilicity, surface morphology, crystallinity index, and thermal and mechanical characteristics of untreated and alkali-treated fibres were studied. The two-parameter Weibull distribution model was applied to deal with the variation in mechanical properties of the natural fibres. Alkali treatment enhanced the tensile strength and elongation at break by 82% and 45%, respectively, but decreased the hydrophilicity by 50.5% and the diameter of the fibres by 37%. Copyright © 2011 Elsevier Ltd. All rights reserved.
Klein, Claude A.; Miller, Richard P.
2001-09-01
For the purpose of assessing the strength of engineering ceramics, it is common practice to interpret the measured stresses at fracture in the light of a semi-empirical expression derived from Weibull's theory of brittle fracture, i.e., ln[-ln(1-P)] = -m ln(σ_N) + m ln(σ), where P is the cumulative failure probability, σ is the applied tensile stress, m is the Weibull modulus, and σ_N is the nominal strength. The strength σ_N, however, does not represent a true measure because it depends not only on the test method but also on the size of the volume or surface subjected to tensile stresses. In this paper we intend to first clarify issues relating to the application of Weibull's theory of fracture and then make use of the theory to assess the results of equibiaxial flexure testing that was carried out on polycrystalline infrared-transmitting materials. These materials are brittle ceramics, which most frequently fail as a consequence of tensile stresses acting on surface flaws. Since equibiaxial flexure testing is the preferred method of measuring the strength of optical ceramics, we propose to formulate the failure-probability equation in terms of a characteristic strength, σ_C, for biaxial loadings, i.e., P = 1 - exp{-π (r_0/cm)^2 [Γ(1+1/m)]^m (σ/σ_C)^m}, where r_0 is the radius of the loading ring (in centimeters) and Γ(z) designates the gamma function. A Weibull statistical analysis of equibiaxial strength data thus amounts to obtaining the parameters m and σ_C, which is best done by directly fitting estimated P_i vs. i data to the failure-probability equation; this procedure avoids distorting the distribution through logarithmic linearization and can be implemented by performing a non-linear bivariate regression. Concentric-ring fracture testing performed on five sets of Raytran materials validates the procedure in the sense that the two-parameter model appears to describe the experimental failure
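The direct non-linear fit advocated above can be sketched as follows (the ring radius R0 and simulated parameters are illustrative; this is not the authors' code). Estimated failure probabilities P_i are paired with the ordered strengths and fitted to the failure-probability equation by non-linear least squares, avoiding the distortion of logarithmic linearization:

```python
import numpy as np
from math import gamma, pi
from scipy.optimize import curve_fit

R0 = 0.5   # loading-ring radius in cm (illustrative value)

def failure_prob(sigma, m, sigma_c):
    """Biaxial-flexure failure probability for a ring of radius R0 (cm)."""
    return 1.0 - np.exp(-pi * R0 ** 2 * gamma(1 + 1 / m) ** m
                        * (sigma / sigma_c) ** m)

def fit_biaxial(strengths):
    """Fit (m, sigma_C) by non-linear regression of estimated P_i vs sigma_i."""
    s = np.sort(np.asarray(strengths, dtype=float))
    n = s.size
    p = (np.arange(1, n + 1) - 0.5) / n   # estimated failure probabilities
    (m, sigma_c), _ = curve_fit(failure_prob, s, p, p0=(10.0, s.mean()),
                                bounds=([0.5, 1.0], [100.0, 1e4]))
    return m, sigma_c
```

The starting guess and parameter bounds are arbitrary but keep the non-linear solver in a physically sensible region.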
Directory of Open Access Journals (Sweden)
Asoke Kumar Bhunia
2014-06-01
Full Text Available In this paper, an attempt is made to develop two inventory models for deteriorating items with variable demand dependent on the selling price and frequency of advertisement of items. In the first model, shortages are not allowed, whereas in the second they are allowed and partially backlogged with a variable rate dependent on the duration of waiting time up to the arrival of the next lot. In both models, the deterioration rate follows a three-parameter Weibull distribution and the transportation cost is considered explicitly for replenishing the order quantity. This cost depends on the lot size as well as on the distance from the source to the destination. The corresponding models have been formulated and solved. Two numerical examples are considered to illustrate the results, and the significant features of the results are discussed. Finally, based on these examples, the effects of different parameters on the initial stock level, the shortage level (in the case of the second model only) and the cycle length, along with the optimal profit, have been studied by sensitivity analyses taking one parameter at a time and keeping the other parameters unchanged.
Transformation and Self-Similarity Properties of Gamma and Weibull Fragment Size Distributions
2015-12-01
Monte Carlo Estimates of the Distributions of the Random Polygons of the Voronoi Tessellation with Respect to a Poisson Process, Journal of...
Dependence of Weibull distribution parameters on the CNR threshold in wind lidar data
DEFF Research Database (Denmark)
Gryning, Sven-Erik; Batchvarova, Ekaterina; Floors, Rogier Ralph
2015-01-01
The increase in height and area swept by the blades of wind turbines that harvest energy from the air flow in the lower atmosphere have raised a need for better understanding of the structure of the profiles of the wind, its gusts and the monthly to annual long-term, statistical distribution...
Rodrigues, Sinval A; Ferracane, Jack L; Della Bona, Alvaro
2008-03-01
The aim of the present study was to evaluate the flexural strength and the Weibull modulus of a microhybrid and a nanofill composite by means of 3- and 4-point bending tests. Thirty specimens of Filtek Z250 (3M/ESPE) and Filtek Supreme (3M/ESPE) were prepared for each test according to the ISO 4049/2000 specification. After 24h in distilled water at 37 degrees C the specimens were submitted to 3- and 4-point bending tests using a universal testing machine DL2000 (EMIC) with a crosshead speed of 1 mm/min. Flexural strength data were calculated and submitted to Student's t-test (alpha=0.05) and Weibull statistics. The fractured surfaces were analyzed based on fractographic principles. The two composites had equivalent strength in both test methods. However, the test designs significantly affected the flexural strength of the microhybrid and the nanofill composites. Weibull modulus (m) of Supreme was similar with both tests, while for Z250, a higher m was observed with the 3-point bending test. Critical flaws were most often associated with the specimen's surface (up to 90%) and were characterized as surface scratches/grooves, non-uniform distribution of phases, inclusions and voids. Flexural strength as measured by the 3-point bending test is higher than by the 4-point bending test, due to the smaller flaw containing area involved in the former. Despite the large difference in average filler size between the composites, the volume fraction of the filler in both materials is similar, which was probably the reason for similar mean flexural strength values and fracture behavior.
An EOQ Model for Items with Weibull Distribution Deterioration Rate
Institute of Scientific and Technical Information of China (English)
王道平; 于俊娣; 李向阳
2011-01-01
For deteriorating items, the deterioration rate can be described by a Weibull distribution. Based on this assumption, a new economic order quantity (EOQ) model with time-varying demand and purchase prices is developed to analyze the effect of deteriorating items on inventory management. With this model, numerical analysis and parameter sensitivity analysis are performed. The results show that an optimal solution for this problem exists and that different parameters have differing degrees of effect on the optimal inventory control policy.
Directory of Open Access Journals (Sweden)
Jinping Liu
2016-06-01
Full Text Available The topic of online product quality inspection (OPQI) with smart visual sensors is attracting increasing interest in both the academic and industrial communities on account of the natural connection between the visual appearance of products and their underlying qualities. Visual images captured from granulated products (GPs), e.g., cereal products and fabric textiles, are composed of a large number of independent particles or stochastically stacked, locally homogeneous fragments, whose analysis and understanding remain challenging. A method of image statistical modeling-based OPQI for GP quality grading and monitoring by a Weibull distribution (WD) model with a semi-supervised learning classifier is presented. WD-model parameters (WD-MPs) of GP images' spatial structures, obtained with omnidirectional Gaussian derivative filtering (OGDF) and demonstrated theoretically to obey a specific WD model of integral form, were extracted as the visual features. Then a co-training-style semi-supervised classifier algorithm, named COSC-Boosting, was exploited for semi-supervised GP quality grading, integrating two independent classifiers of complementary nature in the face of scarce labeled samples. The effectiveness of the proposed OPQI method was verified in the field of automated rice quality grading, where it showed performance superior to commonly used methods, laying a foundation for the quality control of GPs on assembly lines.
Liu, Jinping; Tang, Zhaohui; Xu, Pengfei; Liu, Wenzhong; Zhang, Jin; Zhu, Jianyong
2016-01-01
The topic of online product quality inspection (OPQI) with smart visual sensors is attracting increasing interest in both the academic and industrial communities on account of the natural connection between the visual appearance of products with their underlying qualities. Visual images captured from granulated products (GPs), e.g., cereal products, fabric textiles, are comprised of a large number of independent particles or stochastically stacking locally homogeneous fragments, whose analysis and understanding remains challenging. A method of image statistical modeling-based OPQI for GP quality grading and monitoring by a Weibull distribution (WD) model with a semi-supervised learning classifier is presented. WD-model parameters (WD-MPs) of GP images’ spatial structures, obtained with omnidirectional Gaussian derivative filtering (OGDF), which were demonstrated theoretically to obey a specific WD model of integral form, were extracted as the visual features. Then, a co-training-style semi-supervised classifier algorithm, named COSC-Boosting, was exploited for semi-supervised GP quality grading, by integrating two independent classifiers with complementary nature in the face of scarce labeled samples. Effectiveness of the proposed OPQI method was verified and compared in the field of automated rice quality grading with commonly-used methods and showed superior performance, which lays a foundation for the quality control of GP on assembly lines. PMID:27367703
Sanford, W. E.
2015-12-01
Age distributions of base flow to streams are important to estimate for predicting the timing of water-quality responses to changes in distributed inputs of nutrients or pollutants at the land surface. Simple models of shallow aquifers will predict exponential age distributions, but more realistic 3-D stream-aquifer geometries will cause deviations from an exponential curve. In addition, in fractured rock terrains the dual nature of the effective and total porosity of the system complicates the age distribution further. In this study shallow groundwater flow and advective transport were simulated in two regions in the Eastern United States—the Delmarva Peninsula and the upper Potomac River basin. The former is underlain by layers of unconsolidated sediment, while the latter consists of folded and fractured sedimentary rocks. Transport of groundwater to streams was simulated using the USGS code MODPATH within 175 and 275 watersheds, respectively. For the fractured rock terrain, calculations were also performed along flow pathlines to account for exchange between mobile and immobile flow zones. Porosities at both sites were calibrated using environmental tracer data (3H, 3He, CFCs and SF6) in wells and springs, and with a 30-year tritium record from the Potomac River. Carbonate and siliciclastic rocks were calibrated to have mobile porosity values of one and six percent, and immobile porosity values of 18 and 12 percent, respectively. The age distributions were fitted to Weibull functions. Whereas an exponential function has one parameter that controls the median age of the distribution, a Weibull function has an extra parameter that controls the slope of the curve. A weighted Weibull function was also developed that potentially allows for four parameters, two that control the median age and two that control the slope, one of each weighted toward early or late arrival times. For both systems the two-parameter Weibull function nearly always produced a substantially
Modeling the reliability and maintenance costs of wind turbines using Weibull analysis
Energy Technology Data Exchange (ETDEWEB)
Vachon, W.A. [W.A. Vachon & Associates, Inc., Manchester, MA (United States)
1996-12-31
A general description is provided of the basic mathematics and use of Weibull statistical models for modeling component failures and maintenance costs as a function of time. The applicability of the model to wind turbine components and subsystems is discussed with illustrative examples of typical component reliabilities drawn from actual field experiences. Example results indicate the dominant role of key subsystems based on a combination of their failure frequency and repair/replacement costs. The value of the model is discussed as a means of defining (1) maintenance practices, (2) areas in which to focus product improvements, (3) spare parts inventory, and (4) long-term trends in maintenance costs as an important element in project cash flow projections used by developers, investors, and lenders. 6 refs., 8 figs., 3 tabs.
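As a minimal sketch of the Weibull machinery such component models rest on (the report's actual parameter values are not given; the shape and characteristic life below are hypothetical):

```python
import math

def weibull_reliability(t, beta, eta):
    """R(t) = exp(-(t/eta)**beta): probability a component survives past time t."""
    return math.exp(-((t / eta) ** beta))

def b_life(p, beta, eta):
    """Age by which a fraction p of the population has failed (e.g. B10: p = 0.1)."""
    return eta * (-math.log(1.0 - p)) ** (1.0 / beta)

# Hypothetical gearbox: beta > 1 indicates wear-out; eta is characteristic life in hours.
beta, eta = 2.2, 40000.0
r_20k = weibull_reliability(20000.0, beta, eta)  # survival probability at 20,000 h
b10 = b_life(0.10, beta, eta)                    # 10% of units have failed by this age
```

By definition, R(eta) = exp(-1) ≈ 0.368 for any shape parameter, which makes eta a convenient anchoring point when comparing components.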
Institute of Scientific and Technical Information of China (English)
A.Suresh Babu; V.Jayabalan
2009-01-01
In recent times, conventional materials have been replaced by metal matrix composites (MMCs) due to their high specific strength and modulus. Strength reliability, one of the key factors restricting wider use of composite materials in various applications, is commonly characterized by the Weibull strength distribution function. In the present work, statistical analysis of the strength data of an aluminum alloy (1101 grade) reinforced with 15 vol.% alumina particles (mean size 15 μm), fabricated by the stir casting method, was carried out using the Weibull probability model. Twelve tension tests were performed according to ASTM B577 standards and, from the test data, the corresponding Weibull distribution was obtained. Finally, the reliability of the composite in terms of its fracture strength was presented, to ensure the reliability of the composites for suitable applications. An important implication of the present study is that the Weibull distribution describes the experimentally measured strength data appropriately.
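The abstract does not state how the Weibull parameters were fitted; a common route for small strength samples like this one is median-rank regression on the linearized Weibull CDF, sketched here with synthetic data (the modulus m = 8 and characteristic strength 300 MPa are assumed for illustration, not the paper's values):

```python
import math

def weibull_lsq(strengths):
    """Estimate the Weibull modulus m and characteristic strength sigma0 by
    median-rank regression: ln(-ln(1 - F_i)) = m*ln(sigma_i) - m*ln(sigma0)."""
    xs = sorted(strengths)
    n = len(xs)
    X, Y = [], []
    for i, s in enumerate(xs, start=1):
        F = (i - 0.3) / (n + 0.4)   # Bernard's median-rank estimate of the CDF
        X.append(math.log(s))
        Y.append(math.log(-math.log(1.0 - F)))
    mx, my = sum(X) / n, sum(Y) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(X, Y))
             / sum((x - mx) ** 2 for x in X))
    sigma0 = math.exp(mx - my / slope)
    return slope, sigma0

# Synthetic check: 12 strengths placed exactly at the Weibull quantiles of the
# median ranks, so the regression should recover the true parameters exactly.
m_true, s0_true, n = 8.0, 300.0, 12
data = [s0_true * (-math.log(1.0 - (i - 0.3) / (n + 0.4))) ** (1.0 / m_true)
        for i in range(1, n + 1)]
m_hat, s0_hat = weibull_lsq(data)
```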
Directory of Open Access Journals (Sweden)
S. Lakshmi
2016-12-01
Full Text Available In this paper, we introduce the probability density function of a four-variate Weibull distribution. A multivariate survival function of the Weibull distribution is used for four variables. From the survival function, the probability density function and cumulative probability function are derived. Ghrelin may affect reproductive function in animals and humans. In the application part, under the experimental conditions of an acute injection of ghrelin (1 g/kg) to normal women, basal and GnRH-induced LH and FSH secretion were not affected, suggesting that ghrelin does not play a major physiological role in gonadotrophin secretion in women. In the mathematical part, we have found that the survival function of the curve decreases suddenly in the mid-luteal phase compared with the other phases. The pdf of the curve is suppressed in the late follicular phase and increases from the time of 7 min. The pdf for the early follicular phase of the control cycle increases from 4 min. The pdf curves for the early follicular phase with ghrelin administration and for the mid-luteal phase with ghrelin and GnRH also increase at 5 and 3 minutes, respectively.
Energy Technology Data Exchange (ETDEWEB)
Lienkamp, M. (Technische Hochschule Darmstadt, Fachgebiet Physikalische Metallkunde, Fachbereich Materialwissenschaft (Germany)); Exner, H.E. (Technische Hochschule Darmstadt, Fachgebiet Physikalische Metallkunde, Fachbereich Materialwissenschaft (Germany))
1993-04-01
Present test methods used to determine the strength distribution of high performance fibres are either time consuming or not very reliable. A method is used which enables the derivation of the strength distribution function from one single tensile test. The load/elongation diagram of a bundle of fibres is taken from an elongation-controlled tensile test. From the ratio of the measured load to a fictive load, necessary to obtain an identical elongation in the bundle assuming all fibres are intact, the fraction of broken fibres for each point of the load/elongation diagram is determined. From this, the strength distribution function and the Weibull parameters of the fibres can be calculated. The application of this simple but very effective method is demonstrated for a schematic example and for three fibre materials (carbon, aramid and ceramic fibres). (orig.)
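A minimal simulation of the load-ratio idea described above (all fibre parameters are hypothetical, and equal load sharing among intact fibres is assumed):

```python
import math
import random

random.seed(42)

# Simulate a bundle of n fibres with Weibull-distributed strengths (assumed
# shape/scale), loaded under controlled elongation. At strain eps, each intact
# fibre carries stress E*eps; a fibre is broken once E*eps exceeds its strength.
n, E = 2000, 230e3                  # fibre count, modulus in MPa (carbon-like)
shape, scale = 5.0, 3500.0          # assumed Weibull strength parameters (MPa)
strengths = [scale * (-math.log(1.0 - random.random())) ** (1.0 / shape)
             for _ in range(n)]

def broken_fraction(eps):
    """1 - (measured bundle load)/(fictive all-intact load) at strain eps.
    With equal load sharing this ratio is the empirical strength CDF at E*eps."""
    stress = E * eps
    measured = sum(stress for s in strengths if s > stress)  # intact fibres only
    fictive = n * stress                                     # all fibres intact
    return 1.0 - measured / fictive

# The recovered fraction should track the underlying Weibull CDF:
eps = 0.01                          # 1% strain -> fibre stress 2300 MPa
cdf = 1.0 - math.exp(-((E * eps / scale) ** shape))
```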
The Weibull - log Weibull transition of interoccurrence times for synthetic and natural earthquakes
Hasumi, Tomohiro; Akimoto, Takuma; Aizawa, Yoji
2008-01-01
We have studied interoccurrence time distributions by analyzing a synthetic catalog and three natural catalogs of the Japan Meteorological Agency (JMA), the Southern California Earthquake Data Center (SCEDC), and the Taiwan Central Weather Bureau (TCWB), and revealed a universal feature of interoccurrence time statistics: the Weibull - log Weibull transition. This transition reinforces the view that the interoccurrence time statistics possess both Weibull statistics and log-Weibull statistics. We show that the crossover magnitude from the superposition regime to the Weibull regime, $m_c^2$, is proportional to the plate velocity. In addition, we have found the region-independent relation $m_c^2/m_{max} = 0.54 \pm 0.004$.
Polyzois, Gregory L; Lagouvardos, Panagiotis E; Frangou, Maria J
2012-06-01
The aim of this study was to (1) investigate the flexural strengths of three denture resins, i.e. heat-polymerised, photopolymerised and microwave-polymerised, and how they were affected by relining with auto- and visible-light-polymerised hard reliners, (2) investigate the bond strengths between denture resins and hard reliners and (3) interpret the results of both tests by utilising Weibull analysis. Specimens (65 × 10 × 2.5 mm) from denture resins and relined, bonded combinations were tested using a four-point bending test in a universal testing machine at a crosshead speed of 5 mm/min. Ten specimens for each bulk resin and denture resin-reliner combination, for a total of 150, were tested. Statistical analysis indicated significant differences between bulk materials (p < 0.001) and between reliners (p < 0.001) for the flexural and bond strength tests. It was concluded that (1) the four-point flexural strength differed between the denture base materials, (2) flexural strength differed between bulk and relined bases and between bases relined with autopolymerised and photopolymerised reliners, (3) flexural strength differed among relined denture bases and (4) bond strengths differed among relined denture bases. © 2011 The Gerodontology Society and John Wiley & Sons A/S.
Abaidoo-Ayin, Harold K; Boakye, Prince G; Jones, Kerby C; Wyatt, Victor T; Besong, Samuel A; Lumor, Stephen E
2017-08-01
This study investigated the compositional characteristics and shelf-life of Njangsa seed oil (NSO). Oil from Njangsa had a high polyunsaturated fatty acid (PUFA) content of which alpha eleostearic acid (α-ESA), an unusual conjugated linoleic acid was the most prevalent (about 52%). Linoleic acid was also present in appreciable amounts (approximately 34%). Our investigations also indicated that the acid-catalyzed transesterification of NSO resulted in lower yields of α-ESA methyl esters, due to isomerization, a phenomenon which was not observed under basic conditions. The triacylglycerol (TAG) profile analysis showed the presence of at least 1 α-ESA fatty acid chain in more than 95% of the oil's TAGs. Shelf-life was determined by the Weibull Hazard Sensory Method, where the end of shelf-life was defined as the time at which 50% of panelists found the flavor of NSO to be unacceptable. This was determined as 21 wk. Our findings therefore support the potential commercial viability of NSO as an important source of physiologically beneficial PUFAs. © 2017 Institute of Food Technologists®.
Directory of Open Access Journals (Sweden)
Janine Treter
2010-01-01
Full Text Available Saponins are natural soap-like foam-forming compounds widely used in foods and in cosmetic and pharmaceutical preparations. In this work, the foamability and foam lifetime of foams obtained from Ilex paraguariensis unripe fruits were analyzed. Polysorbate 80 and sodium dodecyl sulfate were used as reference surfactants. Aiming at a better understanding of the data, a linearized 4-parameter Weibull function was proposed. The mate hydroethanolic extract (ME) and a mate saponin enriched fraction (MSF) afforded foamability and foam lifetime comparable to the synthetic surfactants. The linearization of the Weibull equation allowed the statistical comparison of foam decay curves, improving on former mathematical approaches.
Statistical Analysis of a Weibull Extension with Bathtub-Shaped Failure Rate Function
Directory of Open Access Journals (Sweden)
Ronghua Wang
2014-01-01
Full Text Available We consider parameter inference for a two-parameter life distribution with a bathtub-shaped or increasing failure rate function. We present point and interval estimations for the parameter of interest based on type-II censored samples. Through intensive Monte Carlo simulations, we assess the performance of the proposed estimation methods by comparing their precision. Example applications demonstrate the efficiency of the methods.
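The abstract does not spell out the distribution's form; one representative two-parameter Weibull extension with a bathtub-shaped failure rate is Chen's (2000) model, sketched here as an illustration of the kind of life distribution being discussed:

```python
import math

# Chen's two-parameter lifetime model (a representative "Weibull extension"
# with bathtub shape; not necessarily the paper's exact distribution):
#   F(t) = 1 - exp(lam*(1 - exp(t**beta)))
#   h(t) = lam * beta * t**(beta-1) * exp(t**beta)
# The failure rate is bathtub-shaped for beta < 1 and increasing for beta >= 1.

def hazard(t, lam=1.0, beta=0.5):
    """Failure rate function h(t) of Chen's model."""
    return lam * beta * t ** (beta - 1.0) * math.exp(t ** beta)

def cdf(t, lam=1.0, beta=0.5):
    """Cumulative distribution function F(t) of Chen's model."""
    return 1.0 - math.exp(lam * (1.0 - math.exp(t ** beta)))
```

With beta = 0.5 the hazard falls steeply at small t, bottoms out, then rises again, which is exactly the bathtub behavior the estimation methods target.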
Aragao, Glaucia M F; Corradini, Maria G; Normand, Mark D; Peleg, Micha
2007-11-01
Published survival curves of Escherichia coli in two growth media, with and without the presence of salt, at various temperatures and in a Greek eggplant salad having various levels of essential oil all had a characteristic downward concavity when plotted on semi-logarithmic coordinates. Some also exhibited what appeared to be a 'shoulder' of considerable length. Regardless of whether a shoulder was noticed, the survival pattern could be considered a manifestation of an underlying unimodal distribution of the cells' death times. Mathematically, the data could be described equally well by the Weibull and lognormal distribution functions, which had similar modes, means, standard deviations and coefficients of skewness. When plotted in their probability density function (PDF) form, the curves also appeared very similar visually. This enabled us to quantify and compare the effect of temperature or essential oil concentration on the organism's survival in terms of these temporal distributions' characteristics. Increased lethality was generally expressed in a shorter mean and mode, a smaller standard deviation and increased overall symmetry as judged by the distributions' degree of skewness. The 'shoulder', as expected, simply indicated that the distribution's standard deviation was much smaller than its mode. Rate models based on the two distribution functions could be used to predict non-isothermal survival patterns. They were derived on the assumption that the momentary inactivation rate is the isothermal rate at the momentary temperature, at a time that corresponds to the momentary survival ratio. In this application, however, the Weibullian model with a fixed power was not only simpler and more convenient mathematically than the one based on the lognormal distribution, but it also provided more accurate estimates of the dynamic inactivation patterns.
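The Weibullian ("power law") survival model referred to above is commonly written log10 S(t) = -b*t**n; a minimal sketch with hypothetical parameter values (b and n below are invented, not fitted to the paper's data):

```python
import math

def log_survival(t, b, n):
    """Weibullian survival model: log10 S(t) = -b * t**n.
    n > 1 gives the downward concavity on semi-logarithmic coordinates
    described above; n well above 1 produces a pronounced 'shoulder'."""
    return -b * t ** n

# Hypothetical isothermal parameters at two temperatures; a higher
# temperature is expressed as a larger rate parameter b.
S_60 = 10.0 ** log_survival(10.0, b=0.01, n=1.8)  # survival ratio after 10 min
S_65 = 10.0 ** log_survival(10.0, b=0.05, n=1.8)  # hotter -> fewer survivors
```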
Institute of Scientific and Technical Information of China (English)
蔡改贫; 郭进山; 夏刘洋
2016-01-01
To investigate the particle size distribution of limestone after impact crushing, single limestone particles of different sizes were crushed at different pendulum impact angles on a Bond impact crushing test machine. The results showed that the particle size of limestone after Bond impact crushing follows a Weibull distribution model. The cumulative mass probability of the broken particles increases with the impact energy, and the peak of the mass cumulative probability density function decreases with increasing feed size. Once the impact energy has increased beyond a certain value, further increases in impact energy produce a mass-increase effect on each particle size fraction that weakens gradually as the feed size grows. For a given feed size, the increase in fine particles diminishes as the impact energy increases, while the peak of the mass cumulative probability density function rises with impact energy; the width of the mass cumulative probability density function curve widens with increasing feed size.
Institute of Scientific and Technical Information of China (English)
王永泉; 陈花玲; 赵建平; 朱子才
2013-01-01
A probabilistic approach to modeling and predicting the reliability of MEMS (micro-electro-mechanical systems) devices against mechanical failure is proposed. Starting from the size effect on the mechanical properties of materials, the uncertainty of the fracture strength of brittle materials and its Weibull probability distribution are presented first. Then, for a typical MEMS surface micromachining process, characterized by chemical vapor deposition and sacrificial layer technology, an expression for the residual thermal stress of deposited thin films is derived. On this basis, the fracture failure of a cantilevered polysilicon MEMS device under shock load is studied as an example: a reliability model of the device is established which incorporates its scale, process and load characteristics, and the quantitative shock reliability of the device is calculated using test data on the mechanical properties of polysilicon provided in the relevant literature. The analysis shows that typical polysilicon MEMS structures can withstand shocks on the order of 10^3 g to 10^4 g (g being the gravitational acceleration). It can also be seen that MEMS reliability is affected by a combination of interrelated factors, and that accurate reliability modeling and design depend to a large degree on extensive basic experimental data on material properties and behavior at the micro scale.
Energy Technology Data Exchange (ETDEWEB)
Toure, S. [Cocody Univ. (Ivory Coast). Lab. d' Energie Solaire
2005-04-01
The 2-parameter Weibull distribution is the hypothesis that is widely used in studies fitting random series of wind speeds. Several procedures are used to find the set of the two fitting parameters k and c. From an experimental study, the fitting parameters were first determined by the regression method. The basic ideas of the Eigen-coordinates method were reported in previous work for the case of the 4-parameter Stauffer distribution. In the present paper, the new method is applied to identify the 2-parameter Weibull distribution. The differential equation was identified, and the study disclosed a linear relationship between two Eigen-coordinates. Two complementary errors, ε_j and e_j, were introduced as criteria to assess the goodness-of-fit of the distribution: ε_j was linked to the linear relationship, while e_j was used to test the goodness-of-fit between the observed and Weibull cumulative distribution functions. The fitting parameters were then determined using the Eigen-coordinates method. The results showed better reliability. (Author)
Lin, Wei-Shao; Ercoli, Carlo; Feng, Changyong; Morton, Dean
2012-07-01
The objective of this study was to compare the effect of veneering porcelain (monolithic or bilayer specimens) and core fabrication technique (heat-pressed or CAD/CAM) on the biaxial flexural strength and Weibull modulus of leucite-reinforced and lithium-disilicate glass ceramics. In addition, the effect of veneering technique (heat-pressed or powder/liquid layering) for zirconia ceramics on the biaxial flexural strength and Weibull modulus was studied. Five ceramic core materials (IPS Empress Esthetic, IPS Empress CAD, IPS e.max Press, IPS e.max CAD, IPS e.max ZirCAD) and three corresponding veneering porcelains (IPS Empress Esthetic Veneer, IPS e.max Ceram, IPS e.max ZirPress) were selected for this study. Each core material group contained three subgroups based on the core material thickness and the presence of corresponding veneering porcelain as follows: 1.5 mm core material only (subgroup 1.5C), 0.8 mm core material only (subgroup 0.8C), and 1.5 mm core/veneer group: 0.8 mm core with 0.7 mm corresponding veneering porcelain with a powder/liquid layering technique (subgroup 0.8C-0.7VL). The ZirCAD group had one additional 1.5 mm core/veneer subgroup with 0.7 mm heat-pressed veneering porcelain (subgroup 0.8C-0.7VP). The biaxial flexural strengths were compared for each subgroup (n = 10) according to ISO standard 6872:2008 with ANOVA and Tukey's post hoc multiple comparison test (p≤ 0.05). The reliability of strength was analyzed with the Weibull distribution. For all core materials, the 1.5 mm core/veneer subgroups (0.8C-0.7VL, 0.8C-0.7VP) had significantly lower mean biaxial flexural strengths (p strength (p= 0.004) than subgroup 0.8C-0.7VP. Nonetheless, both veneered ZirCAD groups showed greater flexural strength than the monolithic Empress and e.max groups, regardless of core thickness and fabrication techniques. Comparing fabrication techniques, Empress Esthetic/CAD, e.max Press/CAD had similar biaxial flexural strength (p= 0.28 for Empress pair; p= 0
Directory of Open Access Journals (Sweden)
Wahyu Widiyanto
2013-06-01
Full Text Available Wind characteristics, especially event probabilities, have mostly been studied in relation to wind energy availability in an area. In relation to coastal structures, however, they are still rarely addressed, particularly in Indonesia. This article therefore studies the probability distributions commonly used in wind energy analysis, i.e. the Weibull and Rayleigh distributions. The distributions are applied to analyze wind data for the Cilacap coast. The wind data analyzed are from the Board of Meteorology, Climatology and Geophysics, Cilacap branch, over two years (2009-2011). The mean, variance and standard deviation are found in order to calculate the shape factor (k) and scale factor (c) needed to construct the Weibull and Rayleigh distribution functions. For this region, the result is that the wind speed probabilities follow the Weibull and Rayleigh functions fairly well. The shape parameter obtained is k = 3.26, while the scale parameters obtained are c = 3.64 for Weibull and Cr = 2.44 for Rayleigh. A value of k ≥ 3 indicates that the region has regular and steady wind. The mean wind speed is 3.3 m/s.
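Shape and scale factors of the kind reported above are often obtained from the mean and standard deviation via the standard empirical (moment-based) relations, sketched below; the paper's exact fitting route is not stated, and the standard deviation used here is a hypothetical value chosen only for illustration:

```python
import math

def weibull_k_c(v_mean, v_std):
    """Empirical-method estimates of the Weibull factors:
    k = (sigma/v_mean)**(-1.086),  c = v_mean / Gamma(1 + 1/k)."""
    k = (v_std / v_mean) ** -1.086
    c = v_mean / math.gamma(1.0 + 1.0 / k)
    return k, c

def weibull_pdf(v, k, c):
    """Two-parameter Weibull wind speed density."""
    return (k / c) * (v / c) ** (k - 1.0) * math.exp(-((v / c) ** k))

def rayleigh_pdf(v, c):
    """Rayleigh density: the Weibull special case with k = 2."""
    return (2.0 * v / c ** 2) * math.exp(-((v / c) ** 2))

# Mean speed from the abstract (3.3 m/s) with an assumed standard deviation:
k, c = weibull_k_c(3.3, 1.1)
```

The Rayleigh curve needs only a scale factor precisely because its shape factor is fixed at 2, which is why the abstract reports a single Cr for it.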
SINGH, S. P.; G.C.Panda
2015-01-01
This paper develops an inventory model for items that deteriorate at a generalized Weibull-distributed rate when demand for the items is dependent on the selling price. Shortages are not allowed, and price inflation is taken into consideration over a finite planning horizon. A brief theoretical analysis of the costs involved is carried out.
Directory of Open Access Journals (Sweden)
S.P.Singh
2015-09-01
Full Text Available This paper develops an inventory model for items that deteriorate at a generalized Weibull-distributed rate when demand for the items is dependent on the selling price. Shortages are not allowed, and price inflation is taken into consideration over a finite planning horizon. A brief theoretical analysis of the costs involved is carried out.
Evaluation of the reliability of Si3N4-Al2O3-CTR2O3 ceramics through Weibull analysis
Directory of Open Access Journals (Sweden)
Santos Claudinei dos
2003-01-01
Full Text Available The objective of this work was to compare the reliability of two Si3N4 ceramics, with Y2O3/Al2O3 or CTR2O3/Al2O3 mixtures as additives, with regard to their 4-point bending strength, and to confirm the potential of the rare earth oxide mixture CTR2O3, produced at FAENQUIL, as an alternative, low-cost sintering additive to pure Y2O3 for the sintering of Si3N4 ceramics. The oxide mixture CTR2O3 is a solid solution formed mainly by Y2O3, Er2O3, Yb2O3 and Dy2O3 with other minor constituents, and is obtained at a cost of only 20% of that of pure Y2O3. Samples were sintered by a gas pressure sintering process at 1900 °C under a nitrogen pressure of 1.5 MPa with an isothermal holding time of 2 h. The obtained materials were characterized by their relative density, phase composition and bending strength. Weibull analysis was used to describe the reliability of these materials. Both materials presented relative densities higher than 99.5% t.d., β-Si3N4 and Y3Al5O12 (YAG) as crystalline phases, and bending strengths higher than 650 MPa, thus demonstrating similar physical, chemical and mechanical characteristics. The statistical analysis of their strength also showed similar results for both materials, with Weibull moduli m of about 15 and characteristic stress values σ0 of about 700 MPa. These results confirm the possibility of using the rare earth oxide mixture CTR2O3 as a sintering additive for high-performance Si3N4 ceramics, without prejudice to the mechanical properties when compared to Si3N4 ceramics sintered with pure Y2O3.
Institute of Scientific and Technical Information of China (English)
王道平; 于俊娣; 李向阳
2011-01-01
For deteriorating items, the deterioration rate can be described by a three-parameter Weibull distribution, which better matches reality. Based on this assumption, an economic order quantity (EOQ) model with time-varying demand and purchase price, partial backlogging of shortages, and the time value of money is developed to analyze inventory management for deteriorating items. The model is simulated with the mathematical software Matlab, including a sensitivity analysis of the main parameters. The results show that an optimal solution exists and that the main parameters affect the optimal inventory control policy to different degrees. The time value of money affects the net present value of the total inventory cost more than partial backlogging does, so it deserves particular attention when formulating a sound inventory policy.
Wind speed analysis in La Ventosa, Mexico: a bimodal probability distribution case
Energy Technology Data Exchange (ETDEWEB)
Jaramillo, O.A.; Borja, M.A. [Energias No Convencionales, Morelos (Mexico). Instituto de Investigaciones Electricas
2004-08-01
The statistical characteristics of the wind speed in La Ventosa, Oaxaca, Mexico, have been analyzed by using wind speed data recorded by the Instituto de Investigaciones Electricas (IIE). By grouping the observations annually, seasonally and by wind direction, we show that the wind speed distribution, with calms included, is not represented by the typical two-parameter Weibull function. A mathematical formulation using a bimodal Weibull and Weibull probability distribution function (PDF) has been developed to analyse the wind speed frequency distribution in that region. The model developed here can be applied to similar regions where the wind speed distribution presents a bimodal PDF. The two-parameter Weibull wind speed distribution must not be generalised, since it does not accurately represent some wind regimes, such as that of La Ventosa, Mexico. The analysis of the wind data shows that computing the capacity factor for wind power plants to be installed in La Ventosa must be carried out by means of a bimodal PDF instead of the typical Weibull PDF; otherwise, the capacity factor will be underestimated. (author)
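A bimodal Weibull-and-Weibull PDF of the kind described is simply a two-component mixture; a sketch with invented parameters for a calm mode and a stronger mode (the weights and factors below are hypothetical, not the La Ventosa fit):

```python
import math

def weibull_pdf(v, k, c):
    """Two-parameter Weibull wind speed density."""
    return (k / c) * (v / c) ** (k - 1.0) * math.exp(-((v / c) ** k))

def bimodal_weibull_pdf(v, p, k1, c1, k2, c2):
    """Weibull-and-Weibull mixture: p*W(k1, c1) + (1-p)*W(k2, c2)."""
    return p * weibull_pdf(v, k1, c1) + (1.0 - p) * weibull_pdf(v, k2, c2)

# Hypothetical parameters: a calm mode around 2-3 m/s and a strong mode
# around 10-11 m/s, weighted 35%/65%.
params = dict(p=0.35, k1=1.8, c1=2.5, k2=4.0, c2=11.0)

# Trapezoidal check that the mixture still integrates to ~1 over 0-40 m/s:
vs = [0.001 + 0.01 * i for i in range(4000)]
fs = [bimodal_weibull_pdf(v, **params) for v in vs]
area = sum(0.01 * 0.5 * (fs[i] + fs[i + 1]) for i in range(len(fs) - 1))
```

Capacity-factor calculations then integrate the turbine power curve against this mixture density rather than against a single Weibull PDF.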
1981-12-01
For the Weibull pdf, the likelihood function can be represented by L = f(x1, x2, ..., xn; K, θ, C). The appendices list critical values (e.g., A² at level 0.01, n = 30) and computer programs to calculate the Cramer-von Mises critical values and to evaluate the endpoints.
Energy Technology Data Exchange (ETDEWEB)
Nuehlen, Jochen
2010-07-15
By means of the legislative framework, the use of wind power is anchored in the economic and ecological development of Germany. Developing the utilization of wind energy at inland locations requires a detailed survey of the potential, with results that are as realistic as possible. As part of the Climate Alliance Bamberg (Federal Republic of Germany), a potential analysis of all renewable energies is to be created. Under this aspect, the author of the contribution under consideration reports on a GIS-based analysis of wind power potential. The results are demonstrated exemplarily for the community of Schesslitz (Bavaria, Federal Republic of Germany). Two different methods for analyzing the wind energy potential are presented.
Weibull-Based Design Methodology for Rotating Structures in Aircraft Engines
Directory of Open Access Journals (Sweden)
Erwin V. Zaretsky
2003-01-01
Full Text Available The NASA Energy-Efficient Engine (E3-Engine) is used as the basis of a Weibull-based life and reliability analysis. Each component's life, and thus the engine's life, is defined by high-cycle fatigue or low-cycle fatigue. Knowing the cumulative life distribution of each of the components making up the engine, as represented by a Weibull slope, is a prerequisite to predicting the life and reliability of the entire engine. As the engine's Weibull slope increases, the predicted life decreases. The predicted engine lives L5 (95% probability of survival) of approximately 17,000 and 32,000 hr do correlate with current engine-maintenance practices without and with refurbishment, respectively. The individual high-pressure turbine (HPT) blade lives necessary to obtain a blade system life L0.1 (99.9% probability of survival) of 9000 hr for Weibull slopes of 3, 6, and 9 are 47,391; 20,652; and 15,658 hr, respectively. For a design life of the HPT disks having probable points of failure equal to or greater than 36,000 hr at a probability of survival of 99.9%, the predicted disk system life L0.1 can vary from 9408 to 24,911 hr.
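The series-system logic behind such engine-level predictions can be sketched as follows (the component Weibull slopes and characteristic lives below are hypothetical, not the E3-Engine values):

```python
import math

# For components in series, system reliability is the product of the
# component reliabilities:
#   R_sys(t) = prod_i exp(-(t/eta_i)**beta_i)
# and a system life such as L5 (95% probability of survival) is the time
# at which R_sys(t) = 0.95.

components = [  # (Weibull slope beta, characteristic life eta in hours) - assumed
    (3.0, 60000.0),    # e.g. turbine blades
    (6.0, 90000.0),    # e.g. turbine disk
    (2.0, 120000.0),   # e.g. bearings
]

def r_sys(t):
    """Series-system reliability at time t."""
    return math.exp(-sum((t / eta) ** beta for beta, eta in components))

def life_at_survival(p, lo=1.0, hi=1e6):
    """Bisect for the time t with R_sys(t) = p (R_sys is strictly decreasing)."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if r_sys(mid) > p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

L5 = life_at_survival(0.95)   # system life at 95% probability of survival
```

Because the exponents add, a single component with a steep Weibull slope can dominate the system life even when its characteristic life is long, which is the effect the abstract describes.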
Pallocchia, G.; Laurenza, M.; Consolini, G.
2017-03-01
Some interplanetary shocks are associated with short-term and sharp particle flux enhancements near the shock front. Such intensity enhancements, known as shock-spike events (SSEs), represent a class of relatively energetic phenomena as they may extend to energies of some tens of MeV or even beyond. Here we present an SSE case study in order to shed light on the nature of the particle acceleration involved in this kind of event. Our observations refer to an SSE registered on 2011 October 3 at 22:23 UT, by STEREO B instrumentation when, at a heliocentric distance of 1.08 au, the spacecraft was swept by a perpendicular shock moving away from the Sun. The main finding from the data analysis is that a Weibull distribution represents a good fitting function to the measured particle spectrum over the energy range from 0.1 to 30 MeV. To interpret such an observational result, we provide a theoretical derivation of the Weibull spectrum in the framework of the acceleration by “killed” stochastic processes exhibiting power-law growth in time of the velocity expectation, such as the classical Fermi process. We find an overall coherence between the experimental values of the Weibull spectrum parameters and their physical meaning within the above scenario. Hence, our approach based on the Weibull distribution proves to be useful for understanding SSEs. With regard to the present event, we also provide an alternative explanation of the Weibull spectrum in terms of shock-surfing acceleration.
A meta-analysis of estimates of the AIDS incubation distribution.
Cooley, P C; Myers, L E; Hamill, D N
1996-06-01
Information from 12 studies is combined to estimate the AIDS incubation distribution with greater precision than is possible from a single study. The analysis uses a hierarchy of parametric models based on a four-parameter generalized F distribution. This general model contains four standard two-parameter distributions as special cases: the Weibull, gamma, log-logistic and lognormal distributions. These four special cases subsume three distinct asymptotic hazard behaviors. As time increases beyond the median of approximately 10 years, the hazard can increase to infinity (Weibull), can plateau at some constant level (gamma), or can decrease to zero (log-logistic and lognormal). The Weibull, gamma and log-logistic distributions, which represent the three distinct asymptotic hazard behaviors, all fit the data as well as the generalized F distribution at the 25 percent significance level. Hence, we conclude that incubation data are still too limited to ascertain the specific hazard assumption that should be utilized in studies of the AIDS epidemic. Accordingly, efforts to model the AIDS epidemic (e.g., back-calculation approaches) should allow the incubation distribution to take several forms to adequately represent this estimation uncertainty. It is recommended that, at a minimum, the specific Weibull, gamma and log-logistic distributions estimated in this meta-analysis should all be used in modeling the AIDS epidemic, to reflect this uncertainty.
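The contrasting asymptotic hazard behaviors above have closed forms for two of the distributions. A small sketch (with illustrative parameters, not the study's fitted values) shows the Weibull hazard growing without bound for shape > 1, while the log-logistic hazard eventually decays toward zero:

```python
import math

def weibull_hazard(t, scale, shape):
    # h(t) = (shape/scale) * (t/scale)**(shape-1); unbounded growth for shape > 1
    return (shape / scale) * (t / scale) ** (shape - 1)

def loglogistic_hazard(t, scale, shape):
    # h(t) = (shape/scale)*(t/scale)**(shape-1) / (1 + (t/scale)**shape)
    # the denominator eventually wins, so h(t) -> 0 as t -> infinity
    x = (t / scale) ** shape
    return (shape / scale) * (t / scale) ** (shape - 1) / (1.0 + x)

# Median near 10 years, as in the text; parameter values are assumptions
ts = (5.0, 10.0, 20.0, 40.0)
w = [weibull_hazard(t, 12.0, 2.3) for t in ts]
ll = [loglogistic_hazard(t, 10.0, 2.3) for t in ts]
```

Evaluating both hazards on the same grid makes the divergence in tail behavior, and hence the modeling uncertainty discussed above, directly visible.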
Directory of Open Access Journals (Sweden)
A. Calzado
2013-04-01
Full Text Available Aim of study: The aim of this work was to model the diameter distributions of Quercus suber stands. The ultimate goal was to construct models enabling the development of more affordable forest inventory methods. This is the first study of this type on cork oak forests in the area. Area of study: The area of study is “Los Alcornocales” Natural Park (Cádiz-Málaga, Spain). Material and methods: The diameter distributions of 100 permanent plots were modelled with the two-parameter Weibull function. Distribution parameters were fitted with the non-linear regression, maximum likelihood, moment and percentile-based methods. Goodness of fit with the different methods was compared in terms of the number of plots rejected by the Kolmogorov-Smirnov test, bias, mean square error and mean absolute error. The scale and shape parameters of the Weibull function were related to stand variables by using the parameter prediction model. Main results: The best fit was obtained with the non-linear regression approach, using as initial values those obtained by the maximum likelihood method; the percentage of rejections by the Kolmogorov-Smirnov test was 2% of the total number of cases. The scale parameter (b) was successfully modelled in terms of the quadratic mean diameter under cork (R2 adj = 0.99). The shape parameter (c) was modelled by using maximum diameter, minimum diameter and plot elevation (R2 adj = 0.40). Research highlights: The proposed diameter distribution model can be a highly useful tool for the inventorying and management of cork oak forests. Key words: maximum likelihood method; moment method; non-linear regression approach; parameter prediction model; percentile method; scale parameter; shape parameter.
Institute of Scientific and Technical Information of China (English)
胡建军; 许洪斌; 高孝旺; 祖世华
2012-01-01
A random-amplitude fatigue load spectrum for experiments was constructed according to the three-parameter Weibull distribution commonly encountered in gear transmission. Gear bending fatigue tests under this random load were carried out on an MTS electro-hydraulic servo fatigue tester using the group testing method, and the S-N curve of gear bending strength under a three-parameter Weibull load spectrum with a specific variation coefficient was obtained. The test results show that the gear's fatigue life under random load is far lower than that under constant load when the load follows a three-parameter Weibull random load spectrum. The theoretical fatigue limit for gear design under random load was predicted and compared with the test results. The theoretical values agree with the test results, so the bending fatigue strength of gears under random load can be deduced from the load ratio coefficient of the random load spectrum.
Strength Distribution Analysis of Typical Staple Fibers
Institute of Scientific and Technical Information of China (English)
无
2005-01-01
The strength of staple fiber is an important property for yarns and fabrics. Usually there are variations in individual fiber strength, and this will affect the final strength of yarns and fabrics. In this study, the Weibull distribution function is used to analyze the strength distribution of various staple fibers. The strengths of wool, silk, cotton, flax, acrylic, polyester, glass, aramid and carbon fiber are tested. It is found that the strengths of cotton, polyester, glass, aramid and carbon fiber fit well with the two-factor Weibull distribution, while those of wool and silk fit the three-factor Weibull distribution. However, the strength distribution of flax cannot be expressed convincingly by either the two- or three-factor Weibull distribution.
Modeling root reinforcement using root-failure Weibull survival function
Directory of Open Access Journals (Sweden)
M. Schwarz
2013-03-01
Full Text Available Root networks contribute to slope stability through complicated interactions that include mechanical compression and tension. Due to the spatial heterogeneity of root distribution and the dynamics of root turnover, the quantification of root reinforcement on steep slopes is challenging, and consequently so is the calculation of slope stability. Despite considerable advances in root reinforcement modeling, some important aspects remain neglected. In this study we address in particular the role of root strength variability in the mechanical behavior of a root bundle. Many factors may contribute to the variability of root mechanical properties, even considering a single diameter class. This work presents a new approach for quantifying root reinforcement that considers the variability of the mechanical properties of each root diameter class. Using data from laboratory tensile tests and field pullout tests, we calibrate the parameters of the Weibull survival function to implement the variability of root strength in a numerical model for the calculation of root reinforcement (RBMw). The results show that, for both the laboratory and field datasets, the parameters of the Weibull distribution may be considered constant, with the exponent equal to 2 and the normalized failure displacement equal to 1. Moreover, the results show that the variability of root strength in each root diameter class has a major influence on the behavior of a root bundle, with important implications when considering different approaches to slope stability calculation. Sensitivity analysis shows that the calibration of the tensile force and the elasticity of the roots are the most important equations, as well as the root distribution. The new model allows the characterization of root reinforcement in terms of maximum pullout force, stiffness, and energy. Moreover, it simplifies the implementation of root reinforcement in slope stability models. The realistic quantification of root
A Weibull characterization for tensile fracture of multicomponent brittle fibers
Barrows, R. G.
1977-01-01
Necessary to the development and understanding of brittle fiber reinforced composites is a means to statistically describe fiber strength and strain-to-failure behavior. A statistical characterization for multicomponent brittle fibers is presented. The method, which is an extension of usual Weibull distribution procedures, statistically considers the components making up a fiber (e.g., substrate, sheath, and surface) as separate entities and taken together as in a fiber. Tensile data for silicon carbide fiber and for an experimental carbon-boron alloy fiber are evaluated in terms of the proposed multicomponent Weibull characterization.
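The multicomponent idea above amounts to weakest-link bookkeeping: each component (substrate, sheath, surface) carries its own Weibull flaw population, and the fiber survives a given stress only if every component does, so the component survival probabilities multiply. A minimal sketch with hypothetical parameters (not the silicon carbide or carbon-boron data):

```python
import math

def multicomponent_survival(stress, components):
    """Weakest-link survival of a fiber whose components each follow
    their own two-parameter Weibull strength law.

    components: list of (scale, shape) pairs; cumulative hazards add,
    which is the same as multiplying the component survivals.
    """
    return math.exp(-sum((stress / s0) ** m for s0, m in components))

# Hypothetical strength populations (scale in GPa, Weibull modulus)
substrate, sheath, surface = (3.0, 4.0), (4.5, 8.0), (5.0, 12.0)
s = multicomponent_survival(2.0, [substrate, sheath, surface])
```

Because the exponents add inside the survival function, the combined distribution is exactly the product of the single-component Weibull survivals, which is the structure the characterization above exploits.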
Chakraborty, Snehasis; Rao, Pavuluri Srinivasa; Mishra, Hari Niwas
2015-10-15
High pressure inactivation of natural microbiota viz. aerobic mesophiles (AM), psychrotrophs (PC), yeasts and molds (YM), total coliforms (TC) and lactic acid bacteria (LAB) in pineapple puree was studied within the experimental domain of 0.1-600 MPa and 30-50 °C with a treatment time up to 20 min. A complete destruction of yeasts and molds was obtained at 500 MPa/50 °C/15 min, whereas no counts were detected for TC and LAB at 300 MPa/30 °C/15 min. A maximum of two log cycle reductions was obtained for YM during pulse pressurization at the severe process intensity of 600 MPa/50 °C/20 min. The Weibull model clearly described the non-linearity of the survival curves during the isobaric period. A tailing effect, confirmed by the shape parameter (β) of the survival curve, was obtained in the case of YM (β < 1), whereas the opposite behavior (β > 1) was observed for the other microbial groups. Analogous to thermal death kinetics, the activation energy (Ea, kJ·mol(-1)) and the activation volume (Va, mL·mol(-1)) values were computed further to describe the temperature and pressure dependencies of the scale parameter (δ, min), respectively. A higher δ value was obtained for each microbe at a lower temperature, and it decreased with an increase in pressure. A secondary kinetic model was developed describing the inactivation rate (k, min(-1)) as a function of pressure (P, MPa) and temperature (T, K), including the dependencies of Ea and Va on P and T, respectively.
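The Weibull survival model referred to above is commonly written as log10(N/N0) = -(t/δ)^β, where δ is the time for the first decimal reduction and β < 1 produces the tailing seen for yeasts and molds. A minimal sketch with hypothetical δ and β values (not the paper's fitted parameters):

```python
def weibull_log_survival(t, delta, beta):
    """log10(N/N0) under the Weibull inactivation model.

    delta: scale parameter, the time (min) for the first log reduction.
    beta:  shape parameter; beta < 1 gives tailing (concave-up curve),
           beta > 1 gives shouldering (concave-down curve).
    """
    return -((t / delta) ** beta)

# Assumed parameters for illustration: delta = 4 min, beta = 0.6 (tailing)
curve = [weibull_log_survival(t, 4.0, 0.6) for t in (1, 5, 10, 20)]
```

By construction, at t = δ the model predicts exactly one log reduction regardless of β, which is why δ tracks the overall process severity while β captures the curvature.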
Comparison of Parameter Estimation Methods for Transformer Weibull Lifetime Modelling
Institute of Scientific and Technical Information of China (English)
ZHOU Dan; LI Chengrong; WANG Zhongdong
2013-01-01
Two-parameter Weibull distribution is the most widely adopted lifetime model for power transformers. An appropriate parameter estimation method is essential to guarantee the accuracy of a derived Weibull lifetime model. Six popular parameter estimation methods (i.e. the maximum likelihood estimation method, two median rank regression methods, one regressing X on Y and the other regressing Y on X, the Kaplan-Meier method, the method based on the cumulative hazard plot, and Li's method) are reviewed and compared in order to find the optimal one that suits transformer Weibull lifetime modelling. The comparison took several different scenarios into consideration: 10,000 sets of lifetime data, each of which had a sampling size of 40 to 1,000 and a censoring rate of 90%, were obtained by Monte Carlo simulations for each scenario. The scale and shape parameters of the Weibull distribution estimated by the six methods, as well as their mean value, median value and 90% confidence band, are obtained. The cross comparison of these results reveals that, among the six methods, the maximum likelihood method is the best one, since it provides the most accurate Weibull parameters, i.e. parameters having the smallest bias in both mean and median values, as well as the shortest 90% confidence band. The maximum likelihood method is therefore recommended over the other methods in transformer Weibull lifetime modelling.
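Of the compared estimators, maximum likelihood for the two-parameter Weibull reduces to a one-dimensional root-find on the profile score for the shape parameter, after which the scale follows in closed form. A sketch assuming complete (uncensored) data, which is simpler than the 90% censoring scenario studied above:

```python
import math, random

def weibull_mle(data):
    """Two-parameter Weibull MLE (shape c, scale b) for complete samples.

    The profile score in c is strictly increasing, so plain bisection
    on a wide bracket is sufficient.
    """
    n = len(data)
    logs = [math.log(t) for t in data]
    mean_log = sum(logs) / n
    def score(c):
        tc = [t ** c for t in data]
        return sum(x * l for x, l in zip(tc, logs)) / sum(tc) - 1.0 / c - mean_log
    lo, hi = 0.01, 50.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if score(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    c = 0.5 * (lo + hi)
    b = (sum(t ** c for t in data) / n) ** (1.0 / c)  # closed-form scale given c
    return c, b

# Simulated lifetimes from Weibull(shape 2.5, scale 100) by inverse transform
random.seed(7)
sample = [100.0 * (-math.log(1.0 - random.random())) ** (1.0 / 2.5) for _ in range(2000)]
c_hat, b_hat = weibull_mle(sample)
```

Running the estimator on data simulated from known parameters, as in the Monte Carlo study above, is the standard way to measure its bias and confidence-band width.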
Directory of Open Access Journals (Sweden)
Abul Kalam Azad
2014-05-01
Full Text Available The best Weibull distribution methods for the assessment of wind energy potential at different altitudes in desired locations are statistically diagnosed in this study. Seven different methods, namely the graphical method (GM), method of moments (MOM), standard deviation method (STDM), maximum likelihood method (MLM), power density method (PDM), modified maximum likelihood method (MMLM) and equivalent energy method (EEM), were used to estimate the Weibull parameters, and six statistical tools, namely relative percentage of error, root mean square error (RMSE), mean percentage of error, mean absolute percentage of error, chi-square error and analysis of variance, were used to precisely rank the methods. The statistical fittings of the measured and calculated wind speed data are assessed for justifying the performance of the methods. The capacity factor and total energy generated by a small model wind turbine are calculated by numerical integration using Trapezoidal sums and Simpson's rules. The results show that MOM and MLM are the most efficient methods for determining the value of k and c to fit Weibull distribution curves.
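Of the seven methods, the method of moments has the shortest statement: a widely used approximation inverts the coefficient of variation for the shape k and then recovers the scale c from the mean. A sketch under that approximation (the exponent -1.086 is the standard empirical value; the wind data below are simulated, not measured):

```python
import math, random

def weibull_mom(speeds):
    """Method-of-moments Weibull fit for wind speeds.

    Uses the common approximation k = (sigma/mean)**-1.086, then
    c = mean / Gamma(1 + 1/k) from the Weibull mean formula.
    """
    n = len(speeds)
    mean = sum(speeds) / n
    var = sum((v - mean) ** 2 for v in speeds) / (n - 1)
    k = (math.sqrt(var) / mean) ** -1.086
    c = mean / math.gamma(1.0 + 1.0 / k)
    return k, c

# Synthetic wind speeds from Weibull(k=2, c=8 m/s) via inverse transform
random.seed(42)
sample = [8.0 * (-math.log(1.0 - random.random())) ** (1.0 / 2.0) for _ in range(5000)]
k_hat, c_hat = weibull_mom(sample)
```

The same simulated-data check can then feed the error statistics listed above (RMSE, chi-square error, and so on) to rank this method against the other six.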
SEMI-COMPETING RISKS ON A TRIVARIATE WEIBULL SURVIVAL MODEL
Directory of Open Access Journals (Sweden)
Jenq-Daw Lee
2008-07-01
Full Text Available A setting of a trivariate survival function using the semi-competing risks concept is proposed, in which a terminal event can only occur after the other events. The Stanford Heart Transplant data is reanalyzed using a trivariate Weibull distribution model with the proposed survival function.
Directory of Open Access Journals (Sweden)
Jae Phil Park
2016-06-01
Full Text Available The typical experimental procedure for testing stress corrosion cracking initiation involves an interval-censored reliability test. Based on these test results, the parameters of a Weibull distribution, which is a widely accepted crack initiation model, can be estimated using maximum likelihood estimation or median rank regression. However, it is difficult to determine the appropriate number of test specimens and censoring intervals required to obtain sufficiently accurate Weibull estimators. In this study, we compare maximum likelihood estimation and median rank regression using a Monte Carlo simulation to examine the effects of the total number of specimens, test duration, censoring interval, and shape parameters of the true Weibull distribution on the estimator uncertainty. Finally, we provide the quantitative uncertainties of both Weibull estimators, compare them with the true Weibull parameters, and suggest proper experimental conditions for developing a probabilistic crack initiation model through crack initiation tests.
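Median rank regression, one of the two estimators compared above, fits a straight line to the linearized Weibull CDF, ln(-ln(1-F)) = c·ln(t) - c·ln(b), using Benard's approximation for the plotting positions. A sketch for complete (non-censored) samples, which is simpler than the interval-censored setting of the study:

```python
import math, random

def median_rank_regression(sample):
    """Estimate Weibull shape and scale by least squares on the
    linearized CDF with Benard's median-rank plotting positions."""
    ts = sorted(sample)
    n = len(ts)
    xs, ys = [], []
    for i, t in enumerate(ts, start=1):
        f = (i - 0.3) / (n + 0.4)              # Benard's approximation
        xs.append(math.log(t))
        ys.append(math.log(-math.log(1.0 - f)))
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    shape = slope
    scale = math.exp(mx - my / slope)          # from intercept = -shape*ln(scale)
    return shape, scale

# Simulated crack-initiation times from Weibull(shape 3, scale 50) -- assumed values
random.seed(3)
sample = [50.0 * (-math.log(1.0 - random.random())) ** (1.0 / 3.0) for _ in range(500)]
shape_hat, scale_hat = median_rank_regression(sample)
```

Repeating this recovery over many simulated samples, and doing the same for maximum likelihood, is the Monte Carlo comparison the abstract describes.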
Directory of Open Access Journals (Sweden)
Luís R. A Gabriel Filho
2011-02-01
Full Text Available The wind regime of a region can be described by frequency distributions that provide information and characteristics extremely necessary for a possible deployment of wind energy capture systems in the region, with consequent applications in rural areas and remote regions. These characteristics, such as the annual average speed, the variance of the recorded speeds and the hourly average wind power density, can be obtained from the frequency of occurrence of a given speed, which in turn should be studied through analytical expressions. The analytical function best suited to wind distributions is the Weibull density function, which can be determined by numerical methods and linear regressions. The objective of this work is to characterize, analytically and geometrically, all the methodological procedures necessary for a complete characterization of the wind regime of a region and its applications in the region of Botucatu - SP, in order to determine the energy potential for the implementation of wind turbines. It was thus possible to establish theorems related to the characterization of the wind regime, setting out a concise analytical methodology for defining the wind parameters of any region to be studied. A CAMPBELL anemometer was used in the development of this research.
An Extension to the Weibull Process Model
1981-11-01
…indicating its importance to applications. 1. INTRODUCTION. Recent papers by Bain and Engelhardt (1980) and Crow
Weibull distribution for modeling drying of grapes and its application
Institute of Scientific and Technical Information of China (English)
白竣文; 王吉亮; 肖红伟; 巨浩羽; 刘嫣红; 高振江
2013-01-01
To explore the factors influencing the parameters of the Weibull distribution function and its application in drying, the drying of grapes under different drying methods (air-impingement jet drying and pulsed vacuum drying), drying temperatures (50, 55, 60 and 65 °C) and blanching pretreatments (30, 60, 90 and 120 s) was taken as the research object, and the drying kinetics curves were fitted and analyzed with the Weibull distribution function. The results show that the Weibull distribution function simulates the drying of grapes under the experimental conditions well; the scale parameter α is related to the drying temperature and decreases as the drying temperature rises, while the shape parameter β is related to the drying method and the state of the material, with drying temperature having little effect on it. The calculated moisture diffusion coefficient Dcal of grapes during drying ranged from 0.2982×10-9 to 2.7700×10-9 m2/s, and the drying activation energies of hot-air drying and pulsed vacuum drying obtained from the Arrhenius equation were 72.87 and 61.43 kJ/mol, respectively. The results provide a reference for the application of the Weibull distribution function to grape drying. Grapes, as a seasonal fruit, have relatively high sugar content and moisture content, and are very sensitive to microbial spoilage during storage. Therefore, grapes once harvested must be consumed or processed into various products within a few weeks in order to reduce economic losses. Drying grapes into raisins is the major processing method in almost all countries where grapes are grown. Knowledge of the drying mechanism is very necessary for heat and moisture transportation efficiency, energy savings and product quality. Several different empirical and semi-empirical drying models have been used for describing and predicting drying curves. Some of these models can give a good fit to the drying curves, but the basic idea of process characterization was to consider the process as a "black box"--the drying materials and drying conditions were difficult to relate to the parameters of the models used. In this study, the Weibull distribution model was applied to the drying process under different
DEFF Research Database (Denmark)
Gryning, Sven-Erik; Batchvarova, Ekaterina; Floors, Rogier Ralph
2014-01-01
…at a flat rural coastal site in western Denmark and at an inland suburban area near Hamburg in Germany. Simulations with the Weather Research and Forecasting numerical model were carried out in both forecast and analysis configurations. The scatter between measured and modelled wind speeds, expressed by the root-mean-square error, was about 10% lower for the analysis than for the forecast simulations. At the rural coastal site, the observed mean wind speeds above 60 m were underestimated by both the analysis and forecast model runs. For the inland suburban area, the mean wind speed is overestimated…
Abul Kalam Azad; Mohammad Golam Rasul; Talal Yusaf
2014-01-01
The best Weibull distribution methods for the assessment of wind energy potential at different altitudes in desired locations are statistically diagnosed in this study. Seven different methods, namely graphical method (GM), method of moments (MOM), standard deviation method (STDM), maximum likelihood method (MLM), power density method (PDM), modified maximum likelihood method (MMLM) and equivalent energy method (EEM) were used to estimate the Weibull parameters and six statistical tools, name...
Institute of Scientific and Technical Information of China (English)
刘方亮; 刘井泉; 刘伟
2011-01-01
The processing of equipment reliability data is the basis of reliability centered maintenance (RCM) and life cycle management (LCM) in nuclear power plants. However, in actual failure data processing, problems arise such as small samples and non-independent data caused by maintenance. To resolve these problems, a processing method combining the two-parameter Weibull distribution as the life model with a Bayesian method for small failure samples is proposed and validated using actual nuclear power plant operating data. The results show that the method has better applicability and accuracy when dealing with small samples and with the repair and aging effects present in nuclear power plants.
Institute of Scientific and Technical Information of China (English)
张建平; 韩熠; 刘宇; 朱群志
2015-01-01
In order to analyze the temperature rise of the heating module for a sodium-sulfur battery, a theoretical model of the heating module and a fitting model of the experimental temperature-rise data were established on the basis of the 3D transient heat conduction equation and the Weibull function, respectively. The temperature rise process and the transient temperature distribution of the heating module were numerically simulated, and the effects of the Weibull parameters on the temperature rise curve were investigated. The results indicate that the Weibull fitting model can accurately describe the temperature rise process of the heating module with high reliability, and that the temperature rise rate inside the module decreases nonlinearly with time as well as with distance from the module center. Furthermore, the shape and scale parameters determine the efficiency of the sectional and overall temperature rise, respectively, which provides a technical reference for the optimal design of heating modules for sodium-sulfur batteries and other heating devices.
Bias in the Weibull Strength Estimation of a SiC Fiber for the Small Gauge Length Case
Morimoto, Tetsuya; Nakagawa, Satoshi; Ogihara, Shinji
It is known that the single-modal Weibull model describes well the size effect of brittle fiber tensile strength. However, for some ceramic fibers it has been reported that the single-modal Weibull model gives biased estimates of the gauge-length dependence. A hypothesis on the bias is that the density of critical defects is very small; thus, the fracture probability of small gauge length samples is distributed in a discrete manner, which makes the Weibull parameters dependent on the gauge length. Tyranno ZMI Si-Zr-C-O fiber was selected as an example fiber. Tensile tests were performed at several gauge lengths, and the derived Weibull parameters showed a dependence on the gauge length. Fracture surfaces were observed with SEM and classified into characteristic fracture patterns. The percentage of each fracture pattern was found to depend on the gauge length, too. This may be an important factor in the dependence of the Weibull parameters on the gauge length.
Institute of Scientific and Technical Information of China (English)
DONG Sheng; LI Fengli; JIAO Guiying
2003-01-01
Hydrologic frequency analysis plays an important role in coastal and ocean engineering for structural design and disaster prevention in coastal areas. This paper proposes a Nonlinear Least Squares Method (NLSM), which estimates the three unknown parameters of the Weibull distribution simultaneously by an iteration method. Statistical test shows that the NLSM fits each data sample well. The effects of different parameter-fitting methods, distribution models, and threshold values are also discussed in the statistical analysis of storm set-down elevation. The best-fitting probability distribution is given and the corresponding return values are estimated for engineering design.
Fanfani, Alessandra; Sanches, Jose Afonso; Andreeva, Julia; Bagliesi, Giusepppe; Bauerdick, Lothar; Belforte, Stefano; Bittencourt Sampaio, Patricia; Bloom, Ken; Blumenfeld, Barry; Bonacorsi, Daniele; Brew, Chris; Calloni, Marco; Cesini, Daniele; Cinquilli, Mattia; Codispoti, Giuseppe; D'Hondt, Jorgen; Dong, Liang; Dongiovanni, Danilo; Donvito, Giacinto; Dykstra, David; Edelmann, Erik; Egeland, Ricky; Elmer, Peter; Eulisse, Giulio; Evans, Dave; Fanzago, Federica; Farina, Fabio; Feichtinger, Derek; Fisk, Ian; Flix, Josep; Grandi, Claudio; Guo, Yuyi; Happonen, Kalle; Hernandez, Jose M; Huang, Chih-Hao; Kang, Kejing; Karavakis, Edward; Kasemann, Matthias; Kavka, Carlos; Khan, Akram; Kim, Bockjoo; Klem, Jukka; Koivumaki, Jesper; Kress, Thomas; Kreuzer, Peter; Kurca, Tibor; Kuznetsov, Valentin; Lacaprara, Stefano; Lassila-Perini, Kati; Letts, James; Linden, Tomas; Lueking, Lee; Maes, Joris; Magini, Nicolo; Maier, Gerhild; McBride, Patricia; Metson, Simon; Miccio, Vincenzo; Padhi, Sanjay; Pi, Haifeng; Riahi, Hassen; Riley, Daniel; Rossman, Paul; Saiz, Pablo; Sartirana, Andrea; Sciaba, Andrea; Sekhri, Vijay; Spiga, Daniele; Tuura, Lassi; Vaandering, Eric; Vanelderen, Lukas; Van Mulders, Petra; Vedaee, Aresh; Villella, Ilaria; Wicklund, Eric; Wildish, Tony; Wissing, Christoph; Wurthwein, Frank
2009-01-01
The CMS experiment expects to manage several Pbytes of data each year during the LHC programme, distributing them over many computing sites around the world and enabling data access at those centers for analysis. CMS has identified the distributed sites as the primary location for physics analysis to support a wide community of thousands of potential users. This represents an unprecedented experimental challenge in terms of the scale of distributed computing resources and the number of users. An overview of the computing architecture, the software tools and the distributed infrastructure is reported. Summaries of the experience in establishing efficient and scalable operations to get prepared for CMS distributed analysis are presented, followed by the user experience in their current analysis activities.
NEW DOCTORAL DEGREE Parameter estimation problem in the Weibull model
Marković, Darija
2009-01-01
In this dissertation we consider the problem of the existence of best parameters in the Weibull model, one of the most widely used statistical models in reliability theory and life data theory. Particular attention is given to a 3-parameter Weibull model. We have listed some of the many applications of this model. We have described some of the classical methods for estimating parameters of the Weibull model, two graphical methods (Weibull probability plot and hazard plot), and two analyt...
Directory of Open Access Journals (Sweden)
Emilio Gómez-Lázaro
2016-02-01
Full Text Available The Weibull probability distribution has been widely applied to characterize wind speeds for wind energy resources. Wind power generation modeling is different, however, due in particular to power curve limitations, wind turbine control methods, and transmission system operation requirements. These differences are even greater for aggregated wind power generation in power systems with high wind penetration. Consequently, models based on a single Weibull component can provide poor characterizations for aggregated wind power generation. With this aim, the present paper focuses on discussing Weibull mixtures to characterize the probability density function (PDF) for aggregated wind power generation. PDFs of wind power data are first classified according to hourly and seasonal patterns. The selection of the number of components in the mixture is analyzed through two well-known criteria: the Akaike information criterion (AIC) and the Bayesian information criterion (BIC). Finally, the optimal number of Weibull components for maximum likelihood is explored for the defined patterns, including the estimated weight, scale, and shape parameters. Results show that multi-Weibull models are more suitable to characterize aggregated wind power data due to the impact of distributed generation, the variety of wind speed values and wind power curtailment.
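The AIC and BIC criteria used above are direct functions of the maximized log-likelihood and the parameter count; for an m-component Weibull mixture the count is 3m - 1 (m - 1 free weights plus a scale and shape per component). A minimal sketch with hypothetical log-likelihood values (not the paper's data):

```python
import math

def aic(loglik, n_params):
    # Akaike information criterion: 2k - 2 ln L (lower is better)
    return 2 * n_params - 2 * loglik

def bic(loglik, n_params, n_obs):
    # Bayesian information criterion: k ln n - 2 ln L (lower is better)
    return n_params * math.log(n_obs) - 2 * loglik

def mixture_n_params(m):
    # m - 1 free weights + (scale, shape) per Weibull component
    return 3 * m - 1

# Hypothetical maximized log-likelihoods for m = 1, 2, 3 components
logliks = {1: -5210.0, 2: -5105.0, 3: -5102.0}
n_obs = 2000
best_m = min(logliks, key=lambda m: bic(logliks[m], mixture_n_params(m), n_obs))
```

With these assumed numbers the jump from one to two components buys far more likelihood than the BIC penalty costs, while the third component does not, illustrating how the criterion guards against over-fitting the mixture.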
Weibull-k Revisited: “Tall” Profiles and Height Variation of Wind Statistics
DEFF Research Database (Denmark)
Kelly, Mark C.; Troen, Ib; Ejsing Jørgensen, Hans
2014-01-01
with height is less understood. Previously we derived a probabilistic model based on similarity theory for calculating the effects of stability and planetary boundary-layer depth upon long-term mean wind profiles. However, some applications (e.g. wind energy estimation) require the Weibull shape parameter (k)… Further, an alternate model for the vertical profile of the Weibull shape parameter is made, improving upon a basis set forth by Wieringa (Boundary-Layer Meteorol, 1989, Vol. 47, 85–110), and connecting with a newly-corrected corollary of the perturbed geostrophic-drag theory of Troen and Petersen (European Wind Atlas, 1989, Risø National Laboratory, Roskilde). Comparing the models for Weibull-k profiles, a new interpretation and explanation is given for the vertical variation of the shape of wind-speed distributions. Results of the modelling are shown for a number of sites, with a discussion…
Legger, Federica; The ATLAS collaboration
2015-01-01
The ATLAS experiment accumulated more than 140 PB of data during the first run of the Large Hadron Collider (LHC) at CERN. The analysis of such an amount of data for the distributed physics community is a challenging task. The Distributed Analysis (DA) system of the ATLAS experiment is an established and stable component of the ATLAS distributed computing operations. About half a million user jobs are daily running on DA resources, submitted by more than 1500 ATLAS physicists. The reliability of the DA system during the first run of the LHC and the following shutdown period has been high thanks to the continuous automatic validation of the distributed analysis sites and the user support provided by a dedicated team of expert shifters. During the LHC shutdown, the ATLAS computing model has undergone several changes to improve the analysis workflows, including the re-design of the production system, a new analysis data format and event model, and the development of common reduction and analysis frameworks. We r...
Directory of Open Access Journals (Sweden)
Brunham Robert C
2004-07-01
Full Text Available Abstract Background We establish that the occurrence of protein folds among genomes can be accurately described with a Weibull function. Systems which exhibit Weibull character can be interpreted with reliability theory commonly used in engineering analysis. For instance, Weibull distributions are widely used in reliability, maintainability and safety work to model time-to-failure of mechanical devices, mechanisms, building constructions and equipment. Results We have found that the Weibull function describes protein fold distribution within and among genomes more accurately than conventional power functions which have been used in a number of structural genomic studies reported to date. It has also been found that the Weibull reliability parameter β for protein fold distributions varies between genomes and may reflect differences in rates of gene duplication in evolutionary history of organisms. Conclusions The results of this work demonstrate that reliability analysis can provide useful insights and testable predictions in the fields of comparative and structural genomics.
Fetisova, Yu. A.; Ermolenko, B. V.; Ermolenko, G. V.; Kiseleva, S. V.
2017-04-01
We studied the information basis for the assessment of wind power potential on the territory of Russia. We described the methodology to determine the parameters of the Weibull function, which reflects the density of distribution of probabilities of wind flow speeds at a defined basic height above the surface of the earth using the available data on the average speed at this height and its repetition by gradations. The application of the least square method for determining these parameters, unlike the use of graphical methods, allows performing a statistical assessment of the results of approximation of empirical histograms by the Weibull formula. On the basis of the computer-aided analysis of the statistical data, it was shown that, at a fixed point where the wind speed changes at different heights, the range of parameter variation of the Weibull distribution curve is relatively small, the sensitivity of the function to parameter changes is quite low, and the influence of changes on the shape of speed distribution curves is negligible. Taking this into consideration, we proposed and mathematically verified the methodology of determining the speed parameters of the Weibull function at other heights using the parameter computations for this function at a basic height, which is known or defined by the average speed of wind flow, or the roughness coefficient of the geological substrate. We gave examples of practical application of the suggested methodology in the development of the Atlas of Renewable Energy Resources in Russia in conditions of deficiency of source meteorological data. The proposed methodology, to some extent, may solve the problem related to the lack of information on the vertical profile of repeatability of the wind flow speeds in the presence of a wide assortment of wind turbines with different ranges of wind-wheel axis heights and various performance characteristics in the global market; as a result, this methodology can become a powerful tool for
Energy Technology Data Exchange (ETDEWEB)
Bartsch, R.R.
1995-09-01
Key elements of the 36 MJ ATLAS capacitor bank have been evaluated for individual probabilities of failure. These have been combined to estimate system reliability, which is required to be greater than 95% on each experimental shot. This analysis utilizes Weibull or Weibull-like distributions with probability of failure increasing with the number of shots. For the transmission line insulation, a minimum thickness is obtained, and for the railgaps, a method for deriving a maintenance interval from forthcoming life tests is suggested.
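Combining individual Weibull failure probabilities into a system estimate can be sketched as a series system whose components must all survive a given number of shots. The component parameters below are hypothetical, not the ATLAS bank's actual values:

```python
import math

def component_reliability(n_shots, eta, beta):
    """Weibull survival probability after n_shots shots; characteristic
    life eta is in shots, and shape beta > 1 gives a failure probability
    that increases with shot count."""
    return math.exp(-(n_shots / eta) ** beta)

def system_reliability(n_shots, components):
    """Series system: every component must survive for the shot to succeed."""
    r = 1.0
    for eta, beta in components:
        r *= component_reliability(n_shots, eta, beta)
    return r

# Hypothetical component lives (e.g. capacitors, railgaps, insulation)
parts = [(2000.0, 1.5), (1500.0, 2.0), (3000.0, 1.2)]
r100 = system_reliability(100, parts)
```

The product form makes explicit how the per-shot reliability requirement constrains each component's characteristic life.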
ATLAS Distributed Analysis Tools
Gonzalez de la Hoz, Santiago; Liko, Dietrich
2008-01-01
The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale. Up to 10000 jobs were processed in one day. The experience gained operating the system on several grid flavours was essential for performing user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase, data were registered in the LHC File Catalog (LFC) and replicated to external sites. For the main test, few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS management computing board decided to integrate the collaboration's efforts in distributed analysis into a single project, GANGA. The goal is to test the reconstruction and analysis software in a large-scale data production using grid flavours at several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting a...
Moment series for the coefficient of variation in Weibull sampling
Energy Technology Data Exchange (ETDEWEB)
Bowman, K.O.; Shenton, L.R.
1981-01-01
For the 2-parameter Weibull distribution function F(t) = 1 - exp(-(t/b)^c), t > 0, with c and b positive, a moment estimator c* for c is the solution of the equation Γ(1 + 2/c*)/Γ²(1 + 1/c*) = 1 + v*², where v* is the coefficient of variation in the form √m₂/m₁′, m₁′ being the sample mean and m₂ the sample second central moment (it is trivial in the present context to replace m₂ by the variance). One approach to the moments of c* (Bowman and Shenton, 1981) is to set up moment series for the scale-free v*. The series are apparently divergent and summation algorithms are essential; we consider methods due to Levin (1973) and one introduced by ourselves (Bowman and Shenton, 1976).
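The moment equation above can be solved numerically for c*. A sketch by bisection follows; the bracket and iteration count are our choices, relying on the fact that the coefficient of variation of a Weibull distribution is strictly decreasing in the shape parameter:

```python
import math

def moment_estimator_c(v):
    """Solve Gamma(1 + 2/c)/Gamma(1 + 1/c)**2 = 1 + v**2 for the Weibull
    shape c by bisection; v is the sample coefficient of variation
    sqrt(m2)/m1'."""
    def g(c):
        return math.gamma(1 + 2 / c) / math.gamma(1 + 1 / c) ** 2 - (1 + v * v)
    lo, hi = 0.1, 50.0   # g is strictly decreasing in c on this bracket
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if g(mid) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Consistency check: for a Weibull with shape c = 2, the theoretical CV is
# sqrt(Gamma(2)/Gamma(1.5)**2 - 1); the solver should recover c = 2.
cv = math.sqrt(math.gamma(2.0) / math.gamma(1.5) ** 2 - 1.0)
c_star = moment_estimator_c(cv)
```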
The effect of ignoring individual heterogeneity in Weibull log-normal sire frailty models.
Damgaard, L H; Korsgaard, I R; Simonsen, J; Dalsgaard, O; Andersen, A H
2006-06-01
The objective of this study was, by means of simulation, to quantify the effect of ignoring individual heterogeneity in Weibull sire frailty models on parameter estimates and to address the consequences for genetic inferences. Three simulation studies were evaluated, which included 3 levels of individual heterogeneity combined with 4 levels of censoring (0, 25, 50, or 75%). Data were simulated according to balanced half-sib designs using Weibull log-normal animal frailty models with a normally distributed residual effect on the log-frailty scale. The 12 data sets were analyzed with 2 models: the sire model, equivalent to the animal model used to generate the data (complete sire model), and a corresponding model in which individual heterogeneity in log-frailty was neglected (incomplete sire model). Parameter estimates were obtained from a Bayesian analysis using Gibbs sampling, and also from the software Survival Kit for the incomplete sire model. For the incomplete sire model, the Monte Carlo and Survival Kit parameter estimates were similar. This study established that when unobserved individual heterogeneity was ignored, the parameter estimates that included sire effects were biased toward zero by an amount that depended in magnitude on the level of censoring and the size of the ignored individual heterogeneity. Despite the biased parameter estimates, the ranking of sires, measured by the rank correlations between true and estimated sire effects, was unaffected. In comparison, parameter estimates obtained using complete sire models were consistent with the true values used to simulate the data. Thus, in this study, several issues of concern were demonstrated for the incomplete sire model.
Distributions of personal VOC exposures: a population-based analysis.
Jia, Chunrong; D'Souza, Jennifer; Batterman, Stuart
2008-10-01
Information regarding the distribution of volatile organic compound (VOC) concentrations and exposures is scarce, and there have been few, if any, studies using population-based samples from which representative estimates can be derived. This study characterizes distributions of personal exposures to ten different VOCs in the U.S. measured in the 1999-2000 National Health and Nutrition Examination Survey (NHANES). Personal VOC exposures were collected for 669 individuals over 2-3 days, and measurements were weighted to derive national-level statistics. Four common exposure sources were identified using factor analyses: gasoline vapor and vehicle exhaust, methyl tert-butyl ether (MTBE) as a gasoline additive, tap water disinfection products, and household cleaning products. Benzene, toluene, ethyl benzene, xylenes, chloroform, and tetrachloroethene were fit to log-normal distributions with reasonably good agreement to observations. 1,4-Dichlorobenzene and trichloroethene were fit to Pareto distributions, and MTBE to a Weibull distribution, but agreement was poor. However, distributions that attempt to match all of the VOC exposure data can lead to incorrect conclusions regarding the level and frequency of the higher exposures. Maximum Gumbel distributions gave generally good fits to the extrema; however, they could not fully represent the highest exposures of the NHANES measurements. The analysis suggests that complete models for the distribution of VOC exposures require an approach that combines standard and extreme value distributions, and that carefully identifies outliers. This is the first study to provide national-level and representative statistics regarding VOC exposures, and its results have important implications for risk assessment and probabilistic analyses.
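The use of maximum Gumbel distributions for the exposure extrema can be illustrated with a method-of-moments fit to simulated block maxima. The lognormal "exposure" data, block size, and sample counts below are invented for illustration, not taken from NHANES:

```python
import math
import random

def gumbel_fit_moments(maxima):
    """Method-of-moments fit of a (maximum) Gumbel distribution:
    scale beta from the sample standard deviation, location mu from
    the sample mean minus Euler-Mascheroni times beta."""
    n = len(maxima)
    mean = sum(maxima) / n
    var = sum((x - mean) ** 2 for x in maxima) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi
    mu = mean - 0.5772156649 * beta   # Euler-Mascheroni constant
    return mu, beta

# Simulated "exposure" maxima: highest of 50 lognormal draws, repeated
random.seed(1)
maxima = [max(random.lognormvariate(0.0, 1.0) for _ in range(50))
          for _ in range(500)]
mu, beta = gumbel_fit_moments(maxima)
```

Fitting only the block maxima, rather than the whole sample, is what lets the extreme-value component describe the upper tail that a single lognormal or Weibull fit misrepresents.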
Bayesian Estimation and Prediction for Flexible Weibull Model under Type-II Censoring Scheme
Directory of Open Access Journals (Sweden)
Sanjay Kumar Singh
2013-01-01
Full Text Available We have developed the Bayesian estimation procedure for the flexible Weibull distribution under a Type-II censoring scheme, assuming Jeffrey's scale-invariant (noninformative) and Gamma (informative) priors for the model parameters. The interval estimation for the model parameters has been performed through normal approximation, bootstrap, and highest posterior density (HPD) procedures. Further, we have also derived the predictive posteriors and the corresponding predictive survival functions for future observations based on Type-II censored data from the flexible Weibull distribution. Since the predictive posteriors are not available in closed form, we propose to use Markov chain Monte Carlo (MCMC) methods to approximate the posteriors of interest. The performance of the Bayes estimators has also been compared with the classical estimators of the model parameters through a Monte Carlo simulation study. A real data set representing the time between failures of secondary reactor pumps has been analysed for illustration purposes.
The distribution of first-passage times and durations in FOREX and future markets
Sazuka, Naoya; Inoue, Jun-ichi; Scalas, Enrico
2009-07-01
Possible distributions are discussed for intertrade durations and first-passage processes in financial markets, from the viewpoint of renewal theory. In order to represent market data with relatively long durations, two types of distributions are used, namely a distribution derived from the Mittag-Leffler survival function and the Weibull distribution. For the Mittag-Leffler type distribution, the average waiting time (residual life time) depends strongly on the choice of a cut-off parameter tmax, whereas the results based on the Weibull distribution do not depend on such a cut-off. Therefore, a Weibull distribution is more convenient than a Mittag-Leffler type if one wishes to evaluate relevant statistics such as the average waiting time in financial markets with long durations. On the other hand, we find that the Gini index is rather independent of the cut-off parameter. Based on the above considerations, we propose a good candidate for describing the distribution of first-passage times in a market: the Weibull distribution with a power-law tail. This distribution closes the gap between theoretical and empirical results more effectively than a simple Weibull distribution. It should be stressed that a Weibull distribution with a power-law tail is more flexible than the Mittag-Leffler distribution, which itself can be approximated by a Weibull distribution and a power law. Indeed, the key point is that in the former case there is freedom of choice for the exponent of the power law attached to the Weibull distribution, which can exceed 1 in order to reproduce decays faster than possible with a Mittag-Leffler distribution. We also give a useful formula to determine an optimal crossover point minimizing the difference between the empirical average waiting time and the one predicted from renewal theory. Moreover, we discuss the limitation of our distributions by applying our distribution to the analysis of the BTP future and calculating the average waiting
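A Weibull survival function with a power-law tail attached at a crossover point can be sketched as follows; the parameter names and values are ours, and the continuity matching below is a simplification of the paper's construction:

```python
import math

def survival_weibull_powerlaw(t, tau, beta, t_c, alpha):
    """Survival function following a Weibull body up to the crossover t_c
    and a power-law tail beyond it, matched so that S is continuous at t_c."""
    if t <= t_c:
        return math.exp(-(t / tau) ** beta)
    s_c = math.exp(-(t_c / tau) ** beta)   # Weibull value at the crossover
    return s_c * (t / t_c) ** (-alpha)     # power-law continuation

# Illustrative evaluation: short and long durations (arbitrary units)
s1 = survival_weibull_powerlaw(0.5, tau=1.0, beta=0.6, t_c=2.0, alpha=1.5)
s2 = survival_weibull_powerlaw(5.0, tau=1.0, beta=0.6, t_c=2.0, alpha=1.5)
```

The free tail exponent alpha is the flexibility the abstract emphasizes: unlike the Mittag-Leffler case, it can be chosen independently of the body to reproduce faster tail decays.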
Biological implications of the Weibull and Gompertz models of aging.
Ricklefs, Robert E; Scheuerlein, Alex
2002-02-01
Gompertz and Weibull functions imply contrasting biological causes of demographic aging. The terms describing increasing mortality with age are multiplicative and additive, respectively, which could result from an increase in the vulnerability of individuals to extrinsic causes in the Gompertz model and the predominance of intrinsic causes at older ages in the Weibull model. Experiments that manipulate extrinsic mortality can distinguish these biological models. To facilitate analyses of experimental data, we defined a single index for the rate of aging (omega) for the Weibull and Gompertz functions. Each function described the increase in aging-related mortality in simulated ages at death reasonably well. However, in contrast to the Weibull omega(W), the Gompertz omega(G) was sensitive to variation in the initial mortality rate independently of aging-related mortality. Comparisons between wild and captive populations appear to support the intrinsic-causes model for birds, but give mixed support for both models in mammals.
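The multiplicative-versus-additive contrast can be made concrete with the two mortality functions; in this sketch (with illustrative parameter values of our choosing), doubling the initial mortality rate m0 rescales the entire Gompertz curve but only shifts the Weibull curve by a constant:

```python
import math

def gompertz_mortality(x, m0, gamma):
    """Gompertz: aging multiplies the initial rate, m(x) = m0*exp(gamma*x)."""
    return m0 * math.exp(gamma * x)

def weibull_mortality(x, m0, a, b):
    """Weibull with an initial mortality term: aging adds an intrinsic
    component, m(x) = m0 + a*x**b."""
    return m0 + a * x ** b

# Effect of doubling m0 at age x = 10 (illustrative parameters)
g1 = gompertz_mortality(10, 0.01, 0.2)
g2 = gompertz_mortality(10, 0.02, 0.2)        # whole curve scales by 2
w1 = weibull_mortality(10, 0.01, 0.001, 2.0)
w2 = weibull_mortality(10, 0.02, 0.001, 2.0)  # curve shifts by 0.01
```

This is why the Gompertz aging index is sensitive to the initial mortality rate while the Weibull index is not, as the abstract reports.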
Indian Academy of Sciences (India)
SHIV KUMAR; ABHAY KUMAR SINGH; MANOJ KUMAR PATEL
2016-09-01
In this study, we discuss the development of an inventory model in which the deterioration rate of the item follows a two-parameter Weibull distribution under selling-price- and time-dependent demand, since not only the selling price but also time is a crucial factor in enhancing market demand and affecting the overall finances. In the present model, shortages are allowed and partially backlogged. The optimum inventory level, the optimal cycle length, and expressions for the profit function under various cost considerations are obtained using differential equations, and are illustrated graphically with the help of numerical examples. A sensitivity analysis of the parameter values has been performed to study their effect on the inventory optimization.
Energy Technology Data Exchange (ETDEWEB)
Lee, Jongk Uk; Lee, Kwan Hee; Kim, Sung Il; Yook, Dae Sik; Ahn, Sang Myeon [KINS, Daejeon (Korea, Republic of)
2016-05-15
Evaluation of the meteorological characteristics at a nuclear power plant and in the surrounding area should be performed when determining site suitability for safe operation of the plant. Under unexpected emergency conditions, knowledge of meteorological information for the site area is important, as it provides the basis for estimating the environmental impacts of radioactive materials released in gaseous effluents during the accident. Among this meteorological information, wind speed and direction are the important factors for the safety analysis of the nuclear power plant area. Wind characteristics were analyzed for the Hanbit NPP area. It was found that the Weibull parameters k and c vary from 2.56 to 4.77 and from 4.53 to 6.79, respectively, for the directional wind speed distributions. The maximum wind frequency was from the NE and the minimum from the NNW.
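Given the reported ranges of k and c, the implied mean wind speed follows from the Weibull relation v_mean = c·Γ(1 + 1/k). A quick check below pairs the extremes of the two ranges, which is our simplification, since in the data k and c vary jointly by wind direction:

```python
import math

def weibull_mean_speed(k, c):
    """Mean wind speed implied by Weibull shape k and scale c (m/s):
    v_mean = c * Gamma(1 + 1/k)."""
    return c * math.gamma(1.0 + 1.0 / k)

# Ranges reported for the Hanbit site: k in [2.56, 4.77], c in [4.53, 6.79]
v_lo = weibull_mean_speed(2.56, 4.53)
v_hi = weibull_mean_speed(4.77, 6.79)
```

Because Γ(1 + 1/k) stays close to 0.89-0.92 for k in this range, the mean speed is dominated by the scale parameter c.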
Hirose, H
1997-01-01
This paper proposes a new treatment of electrical insulation degradation. Some types of insulation that have been used under various circumstances are considered to degrade at various rates according to their stress circumstances. The cross-linked polyethylene (XLPE) insulated cables inspected by major Japanese electric companies clearly exhibit such phenomena. By assuming that each inspected specimen is sampled from one of several clustered groups, a mixed degradation model can be constructed. Since the degradation of insulation under common circumstances is considered to follow a Weibull distribution, a mixture model and a Weibull power law can be combined; this is called the mixture Weibull power-law model. Applying maximum likelihood estimation for the newly proposed model to Japanese 22 and 33 kV insulation-class cables, the cables are clustered into a certain number of groups using the AIC and the generalized likelihood ratio test method, and their reliability at specified years is assessed.
Bayesian analysis of general failure data from an ageing distribution: advances in numerical methods
Energy Technology Data Exchange (ETDEWEB)
Procaccia, H.; Villain, B. [Electricite de France (EDF), 93 - Saint-Denis (France); Clarotti, C.A. [ENEA, Casaccia (Italy)
1996-12-31
EDF and ENEA carried out a joint research program to develop the numerical methods and computer codes needed for Bayesian analysis of component lives in the case of ageing. Early results of this study were presented at ESREL'94. Since then, the following further steps have been taken: input data have been generalized to the case where observed lives are censored both on the right and on the left; allowable life distributions are Weibull and gamma, whose parameters are both unknown and can be statistically dependent; allowable priors are histograms relative to different parametrizations of the life distribution of concern; and first- and second-order moments of the posterior distributions can be computed. In particular, the covariance gives important information about the degree of statistical dependence between the parameters of interest. An application of the code to the appearance of stress corrosion cracking in a tube of the PWR steam generator system is presented. (authors). 10 refs.
Directory of Open Access Journals (Sweden)
J. Szymszal
2007-07-01
Full Text Available The first part of the study describes the methods used to determine the Weibull modulus and the related reliability index of hypereutectic silumins containing about 17% Si, intended for the manufacture of high-duty castings for automotive and aviation applications. The second part of the study discusses the importance of chemical composition, including additions of 3% Cu, 1.5% Ni, and 1.5% Mg, while the third part focuses on the effect of process history, including mould type (sand or metal) as well as the inoculation process and heat treatment (solutioning and ageing) applied to the cast AlSi17Cu3Mg1,5Ni1,5 alloy, on the shape of the Weibull distribution function and the reliability index calculated for the tensile strength Rm of the investigated alloys.
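The link between the Weibull modulus and a reliability index for casting strength can be sketched via the Weibull survival probability at a given service stress. The stress values and moduli below are illustrative assumptions, not the paper's data:

```python
import math

def weibull_reliability(sigma, sigma0, m):
    """Probability that a casting survives stress sigma, given Weibull
    characteristic strength sigma0 and modulus m: R = exp(-(sigma/sigma0)**m).
    A higher modulus m means less scatter in strength, hence higher
    reliability at stresses below sigma0."""
    return math.exp(-(sigma / sigma0) ** m)

# Illustrative numbers: characteristic tensile strength 220 MPa,
# service stress 180 MPa, low versus high Weibull modulus
r_low_m = weibull_reliability(180.0, 220.0, 5.0)    # high scatter
r_high_m = weibull_reliability(180.0, 220.0, 20.0)  # low scatter
```

This is why process changes that raise the Weibull modulus (tighter strength distribution) improve the reliability index even when the characteristic strength is unchanged.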
A Weibull multi-state model for the dependence of progression-free survival and overall survival.
Li, Yimei; Zhang, Qiang
2015-07-30
In oncology clinical trials, overall survival, time to progression, and progression-free survival are three commonly used endpoints. Empirical correlations among them have been published for different cancers, but statistical models describing the dependence structures are limited. Recently, Fleischer et al. proposed a statistical model that is mathematically tractable and shows some flexibility to describe the dependencies in a realistic way, based on the assumption of exponential distributions. This paper aims to extend their model to the more flexible Weibull distribution. We derived theoretical correlations among different survival outcomes, as well as the distribution of overall survival induced by the model. Model parameters were estimated by the maximum likelihood method and the goodness of fit was assessed by plotting estimated versus observed survival curves for overall survival. We applied the method to three cancer clinical trials. In the non-small-cell lung cancer trial, both the exponential and the Weibull models provided an adequate fit to the data, and the estimated correlations were very similar under both models. In the prostate cancer trial and the laryngeal cancer trial, the Weibull model exhibited advantages over the exponential model and yielded larger estimated correlations. Simulations suggested that the proposed Weibull model is robust for data generated from a range of distributions.
An EOQ model with time dependent Weibull deterioration and ramp type demand
Directory of Open Access Journals (Sweden)
Chaitanya Kumar Tripathy
2011-04-01
Full Text Available This paper presents an order level inventory system with time dependent Weibull deterioration and ramp type demand rate where production and demand are time dependent. The proposed model of this paper considers economic order quantity under two different cases. The implementation of the proposed model is illustrated using some numerical examples. Sensitivity analysis is performed to show the effect of changes in the parameters on the optimum solution.
Institute of Scientific and Technical Information of China (English)
韩亮; 毕文广; 李红江; 师相; 李柄成
2011-01-01
Many factors influence the shape of the blasted stockpile in open-pit mining. To identify the main controlling factors, grey correlation theory is introduced. Because the stockpile shape cannot be expressed directly with numerical parameters, it is difficult to apply grey correlation theory to it directly; by introducing the Weibull model, the measured stockpile shape curves were fitted, quantifying the shape parameters of the blasted stockpile. Grey correlation degrees were then calculated for 49 field cases from the Heidaigou open-pit coal mine, yielding a ranked sequence of the factors influencing stockpile shape, and the results were analyzed. The research provides guidance for optimizing the design of blasted stockpile shapes in open-pit mines.
Directory of Open Access Journals (Sweden)
Jae Phil Park
2016-12-01
Full Text Available It is extremely difficult to predict the initiation time of cracking due to the large time spread in most cracking experiments. Thus, probabilistic models, such as the Weibull distribution, are usually employed to model the initiation time of cracking, and the parameters of the Weibull distribution are estimated from data collected in a cracking test. However, although a reliable cracking model could in principle be developed under ideal experimental conditions (e.g., a large number of specimens and narrow censoring intervals), it is not straightforward to quantitatively assess the effects of the experimental conditions on model estimation uncertainty. The present study investigated the effects of key experimental conditions, including the time-dependent effect of the censoring interval length, on the estimation uncertainties of the Weibull parameters through Monte Carlo simulations. The simulation results provide quantified estimation uncertainties of the Weibull parameters under various cracking test conditions. Hence, it is expected that the results of this study can offer some insight for experimenters developing a probabilistic crack initiation model by performing experiments.
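The Monte Carlo approach can be sketched as follows. The midpoint imputation of censoring intervals and the median-rank regression estimator are our simplifications of the paper's interval-censored analysis, and all numerical settings are invented:

```python
import numpy as np

def estimate_weibull(sample):
    """Simple median-rank regression estimator for (shape, scale)."""
    v = np.sort(sample)
    n = len(v)
    F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)
    slope, intercept = np.polyfit(np.log(v), np.log(-np.log(1 - F)), 1)
    return slope, np.exp(-intercept / slope)

def mc_shape_uncertainty(n_specimens, interval, n_rep=300,
                         shape=2.0, scale=100.0):
    """Monte Carlo sketch of estimation uncertainty when initiation times
    are only observed to the nearest inspection interval: each true time
    is replaced by the midpoint of its censoring interval, the shape is
    re-estimated, and the spread of the estimates is reported."""
    rng = np.random.default_rng(42)
    shapes = []
    for _ in range(n_rep):
        t = scale * rng.weibull(shape, n_specimens)
        observed = (np.floor(t / interval) + 0.5) * interval
        shapes.append(estimate_weibull(observed)[0])
    return float(np.std(shapes))

sd_fine = mc_shape_uncertainty(20, interval=5.0)    # frequent inspections
sd_coarse = mc_shape_uncertainty(20, interval=50.0) # sparse inspections
```

Repeating this over a grid of specimen counts and interval lengths gives exactly the kind of quantified uncertainty map the abstract describes.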
Use of MinMaxEnt distributions defined on basis of MaxEnt method in wind power study
Energy Technology Data Exchange (ETDEWEB)
Shamilov, Aladdin; Kantar, Yeliz Mert; Usta, Ilhan [Department of Statistics, Anadolu University, Eskisehir 26470 (Turkey)
2008-04-15
Knowledge of the wind speed distribution is important information needed in the evaluation of wind power potential. Several statistical distributions have been used to study wind data. The Weibull distribution is the most popular due to its ability to fit most accurately the variety of wind speed data measured at different geographical locations throughout the world. Recently, maximum entropy (MaxEnt) distributions based on the maximum entropy method have been widely used to determine wind speed distributions. Li and Li used the MaxEnt distribution for the first time in the wind energy field and proposed a theoretical approach to determine the distribution of wind speed data analytically. Ramirez and Carta discussed the use of wind probability distributions derived from the maximum entropy principle in the analysis of wind energy. In this study, MinMaxEnt distributions defined on the basis of the MaxEnt method are introduced and applied to find the wind distribution and wind power density. A comparison of the MinMaxEnt and Weibull distributions on wind speed data taken from different sources and measured in various regions is conducted. The wind power densities of the considered regions obtained from the Weibull and MinMaxEnt distributions are also compared. The results indicate that the MinMaxEnt distributions perform better than the well-known Weibull distribution for wind speed distributions and wind power density. Therefore, MinMaxEnt distributions can be used to estimate wind distributions and wind power potential. (author)
Distribution system modeling and analysis
Kersting, William H
2002-01-01
For decades, distribution engineers did not have the sophisticated tools developed for analyzing transmission systems; often they had only their instincts. Things have changed, and we now have computer programs that allow engineers to simulate, analyze, and optimize distribution systems. Powerful as these programs are, however, without a real understanding of the operating characteristics of a distribution system, engineers using them can easily make serious errors in their designs and operating procedures. Distribution System Modeling and Analysis helps prevent those errors. It gives re
An EOQ Model for Three parameter Weibull Deteriorating Item with Partial Backlogging
Directory of Open Access Journals (Sweden)
L.M. Pradhan
2013-03-01
Full Text Available Background: Business organisations are facing intense competition these days. To withstand the competition and remain in the front row, an enterprise should have an optimal, profitable plan for its business. In recent years, researchers have developed various inventory models for deteriorating items considering various practical situations. Partial backlogging is a comparatively new concept introduced in developing models for Weibull-deteriorating items. Methodology: In this paper an inventory model has been developed considering three-parameter Weibull deterioration of a single item with partial backlogging. Here the demand rate is considered to be constant and the lead time is zero. During the stock-out period the backlogging rate is variable and depends on the length of the waiting time for the next replenishment. Results and conclusion: The optimal order quantity and the total variable cost during a cycle have been derived for the proposed inventory model considering a three-parameter Weibull-deteriorating item with partial backlogging. The results obtained in this paper are illustrated with the help of a numerical example and sensitivity analysis.
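For a three-parameter Weibull time to deterioration, the instantaneous deterioration rate commonly used in such models is theta(t) = alpha*beta*(t - gamma)**(beta - 1) for t > gamma, where the location parameter gamma is a deterioration-free period. A small sketch with illustrative parameter values:

```python
def deterioration_rate(t, alpha, beta, gamma):
    """Instantaneous deterioration rate for a three-parameter
    Weibull-distributed time to deterioration: zero before the
    deterioration-free time gamma, then alpha*beta*(t-gamma)**(beta-1)."""
    if t <= gamma:
        return 0.0
    return alpha * beta * (t - gamma) ** (beta - 1)

# No item deteriorates before gamma = 2; the rate grows afterwards (beta > 1)
r1 = deterioration_rate(1.0, alpha=0.05, beta=2.0, gamma=2.0)  # -> 0.0
r2 = deterioration_rate(4.0, alpha=0.05, beta=2.0, gamma=2.0)  # -> 0.2
r3 = deterioration_rate(6.0, alpha=0.05, beta=2.0, gamma=2.0)  # -> 0.4
```

The location parameter is what distinguishes the three-parameter formulation from the two-parameter one: items are fresh for an initial period instead of deteriorating from time zero.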
Fissure formation in coke. 3: Coke size distribution and statistical analysis
Energy Technology Data Exchange (ETDEWEB)
D.R. Jenkins; D.E. Shaw; M.R. Mahoney [CSIRO, North Ryde, NSW (Australia). Mathematical and Information Sciences
2010-07-15
A model of coke stabilization, based on a fundamental model of fissuring during carbonisation is used to demonstrate the applicability of the fissuring model to actual coke size distributions. The results indicate that the degree of stabilization is important in determining the size distribution. A modified form of the Weibull distribution is shown to provide a better representation of the whole coke size distribution compared to the Rosin-Rammler distribution, which is generally only fitted to the lump coke. A statistical analysis of a large number of experiments in a pilot scale coke oven shows reasonably good prediction of the coke mean size, based on parameters related to blend rank, amount of low rank coal, fluidity and ash. However, the prediction of measures of the spread of the size distribution is more problematic. The fissuring model, the size distribution representation and the statistical analysis together provide a comprehensive capability for understanding and predicting the mean size and distribution of coke lumps produced during carbonisation. 12 refs., 16 figs., 4 tabs.
Fitting the empirical distribution of intertrade durations
Politi, Mauro; Scalas, Enrico
2008-03-01
Based on the analysis of a tick-by-tick data set used in the previous work by one of the authors (DJIA stocks traded at NYSE in October 1999), in this paper, we reject the hypothesis that tails of the empirical intertrade distribution are described by a power law. We further argue that the Tsallis q-exponentials are a viable tool for fitting and describing the unconditional distribution of empirical intertrade durations and they compare well to the Weibull distribution.
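The Tsallis q-exponential mentioned above has an exponential-like body and a power-law tail, which is what makes it a viable fit for intertrade durations. A minimal sketch of its survival function, with illustrative parameter values of our choosing:

```python
def q_exponential_survival(t, q, tau):
    """Tsallis q-exponential survival function for q > 1:
    S(t) = [1 + (q-1)*t/tau]**(1/(1-q)).
    For t << tau it behaves like exp(-t/tau); for t >> tau it decays
    as a power law with exponent 1/(q-1)."""
    return (1.0 + (q - 1.0) * t / tau) ** (1.0 / (1.0 - q))

# Body versus tail behaviour (arbitrary time units)
s_small = q_exponential_survival(1.0, q=1.3, tau=10.0)
s_large = q_exponential_survival(1000.0, q=1.3, tau=10.0)
```

As q approaches 1 the expression reduces to the ordinary exponential, so the single parameter q interpolates between exponential and heavy-tailed behaviour.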
Harun; Draisma; Frankena; Veeneklaas; Van Kampen M
1999-05-07
In this paper we tested the Weibull function and the beta-binomial distribution for analysing and predicting nest hatchability, using empirical data on hatchability in Muscovy duck (Cairina moschata) eggs under natural incubation (932 successfully incubated nests and 11,822 eggs). The estimated parameters of the Weibull function and the beta-binomial model were compared with logistic regression analysis. Maximum likelihood estimation of the parameters was used to quantify simultaneously the influence of nesting behaviour and the duration of the reproduction cycle on hatchability. The estimated parameters showed that hatchability was not affected in natural dump nests, but in artificial dump nests and in nests with non-term eggs hatchability was reduced by 10 and 25%, respectively. Similar results were obtained using logistic regression. Both models provided a satisfactory description of the observed data set, but the beta-binomial model proved to have more parameters with practical and biologically meaningful interpretations, because this model is able to quantify and incorporate the unexplained variation in a single parameter theta (which is a variance measure). Copyright 1999 Academic Press.
Distributed data analysis in LHCb
Paterson, S K
2008-01-01
The LHCb distributed data analysis system consists of the Ganga job submission front-end and the DIRAC Workload and Data Management System (WMS). Ganga is jointly developed with ATLAS and allows LHCb users to submit jobs on several backends including: several batch systems, LCG and DIRAC. The DIRAC API provides a transparent and secure way for users to run jobs to the Grid and is the default mode of submission for the LHCb Virtual Organisation (VO). This is exploited by Ganga to perform distributed user analysis for LHCb. This system provides LHCb with a consistent, efficient and simple user experience in a variety of heterogeneous environments and facilitates the incremental development of user analysis from local test jobs to the Worldwide LHC Computing Grid. With a steadily increasing number of users, the LHCb distributed analysis system has been tuned and enhanced over the past two years. This paper will describe the recent developments to support distributed data analysis for the LHCb experiment on WLCG.
Nadarajah, Saralees; Kotz, Samuel
2007-04-01
Various q-type distributions have appeared in the physics literature in recent years; see, e.g., L.C. Malacarne, R.S. Mendes, E.K. Lenzi, q-exponential distribution in urban agglomeration, Phys. Rev. E 65 (2002) 017106; S.M.D. Queiros, On a possible dynamical scenario leading to a generalised Gamma distribution, xxx.lanl.gov-physics/0411111; U.M.S. Costa, V.N. Freire, L.C. Malacarne, R.S. Mendes, S. Picoli Jr., E.A. de Vasconcelos, E.F. da Silva Jr., An improved description of the dielectric breakdown in oxides based on a generalized Weibull distribution, Physica A 361 (2006) 215; S. Picoli Jr., R.S. Mendes, L.C. Malacarne, q-exponential, Weibull, and q-Weibull distributions: an empirical analysis, Physica A 324 (2003) 678-688; A.M.C. de Souza, C. Tsallis, Student's t- and r-distributions: unified derivation from an entropic variational principle, Physica A 236 (1997) 52-57. It is pointed out in the paper that many of these are the same as, or particular cases of, what has been known in the statistics literature. Several of these statistical distributions are discussed and references provided. We feel that this paper could be of assistance for modeling problems of the types considered in the works cited above.
Liu, Guoqing; Huang, ShunJi; Torre, Andrea; Rubertone, Franco S.
1995-11-01
This paper deals with the analysis of statistical properties of multi-look processed polarimetric SAR data. Based on the assumption that a multi-look polarimetric measurement is the product of a Gamma-distributed texture variable and a Wishart-distributed polarimetric speckle variable, it is shown that the multi-look polarimetric measurement from a nonhomogeneous region obeys a generalized K-distribution. In order to validate this statistical model, two of its variants, the multi-look intensity and amplitude K-distributions, are compared with histograms of observed multi-look SAR data for three terrain types (ocean, forest-like, and city regions) and with four empirical distribution models: Gaussian, log-normal, gamma, and Weibull. A qualitative relation between the degree of nonhomogeneity of a textured scene and the best-fitting statistical model is then empirically established. Finally, a classifier with adaptive distributions, guided by the order parameter of the texture distribution estimated from local statistics, is introduced to perform terrain classification; experimental results with both multi-look fully polarimetric data and multi-look single-channel intensity/amplitude data indicate its effectiveness.
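The product model can be illustrated for the single-channel intensity case by simulating K-distributed intensity as Gamma texture times Gamma speckle; the unit-mean parameterization and the numerical values are our choices for illustration:

```python
import numpy as np

def k_distributed_intensity(n, nu, looks, rng):
    """Simulate K-distributed multi-look intensity as the product of a
    unit-mean Gamma texture (order nu) and unit-mean Gamma speckle
    (L looks), following the product model."""
    texture = rng.gamma(nu, 1.0 / nu, n)       # scene texture
    speckle = rng.gamma(looks, 1.0 / looks, n) # multi-look speckle
    return texture * speckle

rng = np.random.default_rng(7)
x = k_distributed_intensity(200000, nu=4.0, looks=3.0, rng=rng)
mean = x.mean()             # unit mean by construction
nvar = x.var() / mean**2    # normalized variance
```

For a homogeneous region (nu large) the normalized variance approaches the pure-speckle value 1/L; texture (small nu) inflates it, which is the signature the order parameter captures for classification.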
The ATLAS Distributed Analysis System
Legger, F; The ATLAS collaboration
2014-01-01
In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high and steadily improving; grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters ...
The ATLAS Distributed Analysis System
Legger, F; The ATLAS collaboration; Pacheco Pages, A; Stradling, A
2013-01-01
In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high and steadily improving; grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters ...
Statistical Analysis of Data for Timber Strengths
DEFF Research Database (Denmark)
Sørensen, John Dalsgaard
2003-01-01
Statistical analyses are performed for material strength parameters from a large number of specimens of structural timber. Non-parametric statistical analyses and fits have been investigated for the following distribution types: Normal, Lognormal, 2-parameter Weibull and 3-parameter Weibull. The statistical fits have generally been made using all data and the lower tail of the data. The Maximum Likelihood Method and the Least Square Technique have been used to estimate the statistical parameters in the selected distributions. The results show that the 2-parameter Weibull distribution gives the best fits to the data available, especially if tail fits are used, whereas the Log Normal distribution generally gives a poor fit and larger coefficients of variation, especially if tail fits are used. The implications on the reliability level of typical structural elements and on partial safety factors...
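The Maximum Likelihood fit of a 2-parameter Weibull used above can be sketched as follows. This is a generic sketch on synthetic strength data (the shape/scale values are assumptions, not the timber estimates): the profile-likelihood equation for the shape is monotone, so a bisection root search suffices, and the scale then follows in closed form.

```python
import numpy as np

def weibull_mle(x):
    """Maximum Likelihood fit of a 2-parameter Weibull (shape k, scale lam).
    The profile-likelihood equation for k is monotone in k, so it is solved
    by bisection; the scale then follows in closed form."""
    x = np.asarray(x, dtype=float)
    c = x.max()                 # rescale to avoid overflow in x**k
    xs = x / c
    lx = np.log(xs)
    def g(k):                   # increasing in k; its root is the shape MLE
        xk = xs ** k
        return np.sum(xk * lx) / np.sum(xk) - 1.0 / k - lx.mean()
    lo, hi = 0.01, 50.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if g(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    k = 0.5 * (lo + hi)
    lam = c * np.mean(xs ** k) ** (1.0 / k)
    return k, lam

rng = np.random.default_rng(0)
strengths = 40.0 * rng.weibull(3.5, size=5000)  # synthetic "timber strengths"
k_hat, lam_hat = weibull_mle(strengths)
```

A tail fit, as used in the abstract, would apply the same machinery to the censored likelihood of the lower-tail observations only.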
Performance Improvement in Spatially Multiplexed MIMO Systems over Weibull-Gamma Fading Channel
Tiwari, Keerti; Saini, Davinder S.; Bhooshan, Sunil V.
2016-11-01
In multiple-input multiple-output (MIMO) systems, spatial demultiplexing at the receiver has its own significance. Thus, several detection techniques have been investigated. There is a tradeoff between computational complexity and optimal performance in most of the detection techniques. One detection technique that gives improved performance at an acceptable level of complexity is ordered successive interference cancellation (OSIC) with minimum mean square error (MMSE). Optimal performance can be achieved by maximum likelihood (ML) detection, but at a higher complexity level. Therefore, MMSE-OSIC with candidates (OSIC2) detection is recommended as a solution. In this paper, spatially multiplexed (SM) MIMO systems are considered to evaluate error performance with different detection techniques, namely MMSE-OSIC, ML and MMSE-OSIC2, in a composite fading, i.e. Weibull-gamma (WG), environment. In the WG distribution, the Weibull and gamma distributions represent multipath and shadowing effects, respectively. Simulation results illustrate that the MMSE-OSIC2 detection technique gives improved symbol error rate (SER) performance similar to that of ML, with a complexity level approaching that of MMSE-OSIC.
Institute of Scientific and Technical Information of China (English)
Xiao Hailin; Nie Zaiping; Yang Shiwen
2007-01-01
Novel closed-form expressions are presented for the average channel capacity of dual selection diversity, as well as for the bit-error rate (BER) of several coherent and noncoherent digital modulation schemes, in correlated Weibull fading channels with nonidentical statistics. The results are expressed in terms of Meijer's G-function, which can be easily evaluated numerically. Simulation results are presented to validate the proposed theoretical analysis and to examine the effects of the fading severity on the quantities concerned.
Weibull Effective Area for Hertzian Ring Crack Initiation Stress
Energy Technology Data Exchange (ETDEWEB)
Jadaan, Osama M. [University of Wisconsin, Platteville; Wereszczak, Andrew A [ORNL; Johanns, Kurt E [ORNL
2011-01-01
Spherical or Hertzian indentation is used to characterize and guide the development of engineered ceramics under consideration for diverse applications involving contact, wear, rolling fatigue, and impact. Ring crack initiation can be one important damage mechanism of Hertzian indentation. It is caused by sufficiently high, surface-located, radial tensile stresses in an annular ring located adjacent to and outside of the Hertzian contact circle. While the maximum radial tensile stress is known to depend on the elastic properties of the sphere and target, the diameter of the sphere, the applied compressive force, and the coefficient of friction, the Weibull effective area too will be affected by those parameters. However, estimates of the maximum radial tensile stress and Weibull effective area are difficult to obtain because the coefficient of friction during Hertzian indentation is complex, likely intractable, and not known a priori. Circumventing this, the Weibull effective area expressions are derived here for the two extremes that bracket all coefficients of friction; namely, (1) the classical, frictionless, Hertzian case where only complete slip occurs, and (2) the case where no slip occurs or where the coefficient of friction is infinite.
Analysis of irregularly distributed points
DEFF Research Database (Denmark)
Hartelius, Karsten
1996-01-01
The present thesis is on the analysis of irregularly distributed points. The main part of the thesis is concerned with interpolation and restoration of irregularly distributed points. The least squares methods of kriging and Kalman filtering and the Bayesian restoration method of iterated...... is described as a robust estimator which may be applied straightforwardly to a wide range of point patterns and processes when the correlation structure is known. We give a qualitative and quantitative comparison of kriging, Kalman filtering and iterated conditional modes. The Kalman filter has in a case study...... and represents an interesting contextual classifier. Extended Kalman filtering, on the other hand, seems to be well suited for interpolation in gradually changing environments. Bayesian restoration is applied to a point matching problem, which consists of matching a grid to an image of (irregularly) distributed...
Directory of Open Access Journals (Sweden)
Ewa Wąsik
2016-06-01
Full Text Available The article presents the reliability of a municipal sewage treatment plant in the area of the Niepołomicka Industrial Zone. The analysis is based on five indicators of pollution: BOD5, CODCr, total suspended solids, total nitrogen and total phosphorus. Samples of treated sewage were collected once a month in the period from January 2011 to December 2013. The paper presents an analysis of the removal effectiveness of the individual indicators and identifies their basic statistical characteristics. Studies have shown that the wastewater treatment plant in Niepołomice is characterized by high efficiency of pollutant removal, with mean effectiveness of BOD5 – 98.8%, CODCr – 97.0%, total suspended solids – 97.3%, total nitrogen – 88.6%, and total phosphorus – 97.0%. The calculated forecast reliability of the discussed treatment plant, based on the distribution of indicators in treated wastewater using a Weibull model, showed that the facility meets the removal requirements for 365 days a year in the case of BOD5, CODCr, suspended solids and total phosphorus, while for total nitrogen – 336 days a year.
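The "days per year meeting the requirement" figure above can be reproduced in outline from a fitted Weibull model of an effluent indicator: the expected number of compliant days is the number of days per year times the Weibull CDF evaluated at the permissible limit. The shape, scale and limit below are illustrative placeholders, not the plant's fitted parameters.

```python
import math

def compliant_days(shape, scale, limit, days_per_year=365):
    """Expected number of days per year on which a Weibull(shape, scale)
    effluent indicator stays at or below the permissible limit."""
    cdf = 1.0 - math.exp(-((limit / scale) ** shape))
    return days_per_year * cdf

# Illustrative BOD5-like parameters (mg/l); not the plant's fitted values
days = compliant_days(shape=1.5, scale=4.0, limit=15.0)
```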
Energy Technology Data Exchange (ETDEWEB)
Gorgoseo, J. J.; Rojo, A.; Camara-Obregon, A.; Dieguez-Aranda, U.
2012-07-01
The purpose of this study was to compare the accuracy of the Weibull, Johnson's SB and beta distributions, fitted with some of the most common methods and with different fixed values for the location parameters, for describing diameter distributions in even-aged stands of Pinus pinaster, Pinus radiata and Pinus sylvestris in northwest Spain. A total of 155 permanent plots in Pinus sylvestris stands throughout Galicia, 183 plots in Pinus pinaster stands throughout Galicia and Asturias and 325 plots in Pinus radiata stands in both regions were measured to describe the diameter distributions. Parameters of the Weibull function were estimated by Moments and Maximum Likelihood approaches, those of Johnson's SB function by Conditional Maximum Likelihood and by Knoebel and Burkhart's method, and those of the beta function by the method based on the moments of the distribution. The beta and Johnson's SB functions were slightly superior to the Weibull function for Pinus pinaster stands; the Johnson's SB and beta functions were more accurate in the best fits for Pinus radiata stands; and the best results of the Weibull and Johnson's SB functions were slightly superior to the beta function for Pinus sylvestris stands. However, the three functions are suitable for these stands given an appropriate value of the location parameter and an appropriate parameter-estimation method. (Author) 44 refs.
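Of the approaches compared above, the Moments fit of the Weibull function admits a compact sketch: the squared coefficient of variation depends only on the shape parameter, so the shape is found by a one-dimensional root search and the scale follows from the sample mean. The diameter data below are synthetic placeholders, not the Spanish plot data.

```python
import math
import numpy as np

def weibull_moments_fit(x):
    """Method-of-moments fit of a 2-parameter Weibull: recover the shape k
    from the sample squared coefficient of variation (which depends on k
    alone), then the scale lam from the sample mean."""
    x = np.asarray(x, dtype=float)
    cv2 = x.var() / x.mean() ** 2
    def cv2_of(k):              # theoretical squared CV; decreasing in k
        g1 = math.gamma(1.0 + 1.0 / k)
        g2 = math.gamma(1.0 + 2.0 / k)
        return g2 / g1 ** 2 - 1.0
    lo, hi = 0.1, 50.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if cv2_of(mid) > cv2:   # CV still too large -> shape must grow
            lo = mid
        else:
            hi = mid
    k = 0.5 * (lo + hi)
    lam = x.mean() / math.gamma(1.0 + 1.0 / k)
    return k, lam

rng = np.random.default_rng(2)
diameters = 12.0 * rng.weibull(2.5, size=5000)  # synthetic stand diameters, cm
k_hat, lam_hat = weibull_moments_fit(diameters)
```

A fixed location parameter, as in the study, is handled by subtracting it from the data before fitting.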
Distributed Data Analysis in ATLAS
Nilsson, P
2009-01-01
Data analysis using grid resources is one of the fundamental challenges to be addressed before the start of LHC data taking. The ATLAS detector will produce petabytes of data per year, and roughly one thousand users will need to run physics analyses on this data. Appropriate user interfaces and helper applications have been made available to ensure that the grid resources can be used without requiring expertise in grid technology. These tools enlarge the number of grid users from a few production administrators to potentially all participating physicists. ATLAS makes use of three grid infrastructures for the distributed analysis: the EGEE sites, the Open Science Grid, and NorduGrid. These grids are managed by the gLite workload management system, the PanDA workload management system, and ARC middleware; many sites can be accessed via both the gLite WMS and PanDA. Users can choose between two front-end tools to access the distributed resources. Ganga is a tool co-developed with LHCb to provide a common interfa...
Distributed Data Analysis in ATLAS
Nilsson, P; The ATLAS collaboration
2012-01-01
Data analysis using grid resources is one of the fundamental challenges to be addressed before the start of LHC data taking. The ATLAS detector will produce petabytes of data per year, and roughly one thousand users will need to run physics analyses on this data. Appropriate user interfaces and helper applications have been made available to ensure that the grid resources can be used without requiring expertise in grid technology. These tools enlarge the number of grid users from a few production administrators to potentially all participating physicists. ATLAS makes use of three grid infrastructures for the distributed analysis: the EGEE sites, the Open Science Grid, and NorduGrid. These grids are managed by the gLite workload management system, the PanDA workload management system, and ARC middleware; many sites can be accessed via both the gLite WMS and PanDA. Users can choose between two front-end tools to access the distributed resources. Ganga is a tool co-developed with LHCb to provide a common interfa...
Directory of Open Access Journals (Sweden)
Amany E. Aly
2016-04-01
Full Text Available When a system consists of independent components of the same type, appropriate actions may be taken as soon as a portion of them has failed. It is, therefore, important to be able to predict later failure times from earlier ones. One of the well-known failure distributions commonly used to model component life is the modified Weibull distribution (MWD). In this paper, two pivotal quantities are proposed to construct prediction intervals for future unobservable lifetimes based on generalized order statistics (gos) from the MWD. Moreover, a pivotal quantity is developed to reconstruct missing observations at the beginning of the experiment. Furthermore, Monte Carlo simulation studies are conducted and numerical computations are carried out to investigate the efficiency of the presented results. Finally, two illustrative examples for real data sets are analyzed.
Kim, Kyu Rang; Kim, Mijin; Choe, Ho-Seong; Han, Mae Ja; Lee, Hye-Rim; Oh, Jae-Won; Kim, Baek-Jo
2016-07-01
Pollen is an important cause of respiratory allergic reactions. As individual sanitation has improved, allergy risk has increased, and this trend is expected to continue due to climate change. Atmospheric pollen concentration is highly influenced by weather conditions. Regression analysis and modeling of the relationships between airborne pollen concentrations and weather conditions were performed to analyze and forecast pollen conditions. Traditionally, daily pollen concentration has been estimated using regression models that describe the relationships between observed pollen concentrations and weather conditions. These models were able to forecast daily concentrations at the sites of observation, but lacked broader spatial applicability beyond those sites. To overcome this limitation, an integrated modeling scheme was developed that is designed to represent the underlying processes of pollen production and distribution. A maximum potential for airborne pollen is first determined using the Weibull probability density function. Then, daily pollen concentration is estimated using multiple regression models. Daily risk grade levels are determined based on the risk criteria used in Korea. The mean percentages of agreement between the observed and estimated levels were 81.4-88.2 % and 92.5-98.5 % for oak and Japanese hop pollens, respectively. The new models estimated daily pollen risk more accurately than the original statistical models because of the newly integrated biological response curves. Although they overestimated the seasonal mean concentration, they did not capture all of the peak concentrations. This issue could be resolved by adding more variables that affect the prevalence and internal maturity of pollens.
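The first step of the integrated scheme above, a Weibull "maximum potential" curve over the pollen season, can be sketched as a scaled Weibull density over day-of-season. The shape, scale and season-total values here are illustrative placeholders, not the calibrated model parameters.

```python
import numpy as np

def seasonal_potential(days, shape, scale, season_total):
    """Maximum daily pollen potential: a Weibull density over day-of-season
    (day 0 = season start), scaled by an assumed season total."""
    t = np.asarray(days, dtype=float)
    pdf = (shape / scale) * (t / scale) ** (shape - 1.0) * np.exp(-((t / scale) ** shape))
    return season_total * pdf

days = np.arange(0, 120)
potential = seasonal_potential(days, shape=2.2, scale=30.0, season_total=10_000.0)
```

The daily regression models then modulate this potential downward according to the weather of each day.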
Directory of Open Access Journals (Sweden)
Luiz Claudio Pardini
2002-10-01
Full Text Available Carbon fibres and glass fibres are reinforcements for advanced composites, and the fibre strength is the most influential factor on the strength of the composites. They are essentially brittle and fail with very little reduction in cross section. Composites made with these fibres are characterized by a high strength/density ratio, and their properties are intrinsically related to their microstructure, i.e., amount and orientation of the fibres, surface treatment, among other factors. Processing parameters have an important role in the fibre mechanical behaviour (strength and modulus). Cracks, voids and impurities in the case of glass fibres, and fibrillar misalignments in the case of carbon fibres, are created during processing. Such inhomogeneities give rise to an appreciable scatter in properties. The most used statistical tool that deals with this characteristic variability in properties is the Weibull distribution. The present work investigates the influence of the testing gage length on the strength, Young's modulus and Weibull modulus of carbon fibres and glass fibres. The Young's modulus is calculated by two methods: (i) ASTM D 3379M, and (ii) the interaction between testing equipment and specimen. The first method resulted in a Young's modulus of 183 GPa for carbon fibre and 76 GPa for glass fibre. The second method gave a Young's modulus of 250 GPa for carbon fibre and 50 GPa for glass fibre. These differences revealed how the interaction between specimen and testing machine can interfere with the Young's modulus calculations. The Weibull modulus can be a tool to evaluate the fibre's homogeneity in terms of properties and is a good quality control parameter during processing. In the range of specimen gage lengths tested, the Weibull modulus for carbon fibre is ~3.30 and for glass fibre is ~5.65, which indicates that, for the batch of fibres tested, the glass fibre is more uniform in properties.
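The Weibull modulus quoted for the fibres is typically estimated from a Weibull probability plot: sort the strengths, assign median-rank failure probabilities, and regress ln(-ln(1-F)) on ln(strength); the slope is the modulus m. The data below are synthetic, assuming a modulus near the paper's carbon-fibre value of about 3.3.

```python
import numpy as np

def weibull_modulus(strengths):
    """Weibull modulus (shape) from a Weibull probability plot: regress
    ln(-ln(1-F)) on ln(strength) with median-rank plotting positions."""
    s = np.sort(np.asarray(strengths, dtype=float))
    n = s.size
    f = (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # Bernard's approximation
    y = np.log(-np.log(1.0 - f))
    slope, _intercept = np.polyfit(np.log(s), y, 1)
    return slope

rng = np.random.default_rng(1)
sigma = 3.0 * rng.weibull(3.3, size=500)   # synthetic fibre strengths (GPa)
m_est = weibull_modulus(sigma)
```

A higher slope means a narrower strength distribution, which is why the glass fibre's larger modulus (~5.65) indicates more uniform properties.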
An EOQ Model for Time Dependent Weibull Deterioration with Linear Demand and Shortages
Directory of Open Access Journals (Sweden)
Umakanta Mishra
2012-06-01
Full Text Available Background: The study of control and maintenance of production inventories of deteriorating items, with and without shortages, has grown in importance recently. The effect of deterioration is very important in many inventory systems. Deterioration is defined as decay or damage such that the item cannot be used for its original purpose. Methods: In this article, order-level inventory models have been developed for deteriorating items with linear demand and Weibull deterioration. In developing the model we have assumed that the production rate and the demand rate are time dependent. The unit production cost is inversely proportional to demand. The inventory-production system has two-parameter Weibull deterioration. Results and conclusions: Two models have been developed, one without shortages and one with shortages, where the shortages are completely backlogged. The objective of the model is to develop an optimal policy that minimizes the total average cost. Sensitivity analysis has been carried out to show the effect of changes in the parameters on the optimum total average cost.
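The "two-parameter Weibull deterioration" assumption above means the instantaneous deterioration rate is θ(t) = αβt^(β-1), so the fraction of on-hand stock surviving to time t is exp(-αt^β). A minimal sketch with illustrative α and β (not the paper's values):

```python
import math

def deterioration_rate(t, alpha, beta):
    """Instantaneous two-parameter Weibull deterioration rate theta(t)."""
    return alpha * beta * t ** (beta - 1.0)

def surviving_fraction(t, alpha, beta):
    """Fraction of on-hand stock not yet deteriorated by time t."""
    return math.exp(-alpha * t ** beta)

# Illustrative: alpha = 0.02, beta = 1.5 gives an increasing deterioration rate
frac = surviving_fraction(10.0, 0.02, 1.5)
```

With β > 1 the deterioration rate grows over the cycle, which is what drives the tradeoff the optimal policy balances against holding and shortage costs.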
Directory of Open Access Journals (Sweden)
J. Piątkowski
2012-12-01
Full Text Available Purpose: The main purpose of the study was to determine a methodology for estimating operational reliability based on the statistical results of abrasive wear testing. Design/methodology/approach: For the research, a traditional tribological system, i.e. a friction pair of AlSi17CuNiMg silumin in contact with spheroidal graphite cast iron of grade EN-GJN-200, was chosen. Conditions of dry friction were assumed. This system was chosen based on the mechanical cooperation between the cylinder (silumin) and piston rings (spheroidal graphite cast iron) in conventional internal combustion piston engines with spark ignition. Findings: Using material parameters of the cylinder and piston rings, the nominal losses qualifying the cylinder for repair and the maximum weight losses that can be smothered were determined. Based on the theoretical number of engine revolutions to repair and the stress acting on the cylinder bearing surface, the maximum distance that the motor vehicle can travel before seizure of the cylinder occurs was calculated. These results were the basis for a statistical analysis carried out with the Weibull modulus, the end result of which was the estimation of material reliability (the survival probability of the tribological system) and the determination of a pre-operation warranty period for the tribological system. Research limitations/implications: The analysis of the Weibull distribution modulus used to estimate the reliability of the tribological cylinder-ring system enabled the determination of an approximate theoretical time of failure-free running of the combustion engine. Originality/value: The results are valuable statistical data, and the methodology proposed in this paper can be used to determine a theoretical lifetime of the combustion engine.
Comparative Distributions of Hazard Modeling Analysis
Directory of Open Access Journals (Sweden)
Rana Abdul Wajid
2006-07-01
Full Text Available In this paper we present a comparison among the distributions used in hazard analysis. Simulation techniques have been used to study the behavior of hazard distribution models. The fundamentals of hazard issues are discussed using failure criteria. We demonstrate the flexibility of the hazard modeling distribution, which reduces to different distributions in special cases.
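One concrete instance of the flexibility discussed above: the Weibull hazard h(t) = (β/η)(t/η)^(β-1) is decreasing for β < 1, constant for β = 1 (the exponential special case), and increasing for β > 1. A short sketch with an arbitrary scale η:

```python
import numpy as np

def weibull_hazard(t, beta, eta):
    """Weibull hazard (failure) rate h(t) = (beta/eta) * (t/eta)**(beta-1)."""
    t = np.asarray(t, dtype=float)
    return (beta / eta) * (t / eta) ** (beta - 1.0)

t = np.linspace(0.1, 5.0, 50)
decreasing = weibull_hazard(t, beta=0.5, eta=1.0)   # infant-mortality regime
constant = weibull_hazard(t, beta=1.0, eta=1.0)     # exponential (memoryless)
increasing = weibull_hazard(t, beta=2.0, eta=1.0)   # wear-out regime
```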
Test Population Selection from Weibull-Based, Monte Carlo Simulations of Fatigue Life
Vlcek, Brian L.; Zaretsky, Erwin V.; Hendricks, Robert C.
2012-01-01
Fatigue life is probabilistic and not deterministic. Experimentally establishing the fatigue life of materials, components, and systems is both time consuming and costly. As a result, conclusions regarding fatigue life are often inferred from a statistically insufficient number of physical tests. A proposed methodology for comparing life results as a function of variability due to Weibull parameters, variability between successive trials, and variability due to size of the experimental population is presented. Using Monte Carlo simulation of randomly selected lives from a large Weibull distribution, the variation in the L10 fatigue life of aluminum alloy AL6061 rotating rod fatigue tests was determined as a function of population size. These results were compared to the L10 fatigue lives of small (10 each) populations from AL2024, AL7075 and AL6061. For aluminum alloy AL6061, a simple algebraic relationship was established for the upper and lower L10 fatigue life limits as a function of the number of specimens failed. For most engineering applications where less than 30 percent variability can be tolerated in the maximum and minimum values, at least 30 to 35 test samples are necessary. The variability of test results based on small sample sizes can be greater than the actual differences, if any, that exist between materials and can result in erroneous conclusions. The fatigue life of AL2024 is statistically longer than that of AL6061 and AL7075. However, there is no statistical difference between the fatigue lives of AL6061 and AL7075, even though AL7075 had a fatigue life 30 percent greater than AL6061.
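The population-size effect described above can be sketched with a small Monte Carlo: repeatedly draw fatigue-life samples of size n from a Weibull parent, estimate L10 (the life exceeded by 90 % of specimens) from each sample, and watch the scatter shrink as n grows. The Weibull parameters are illustrative, not the AL6061 fit.

```python
import numpy as np

def l10_scatter(n, trials, shape, scale, rng):
    """Spread (std dev) of the L10 estimate over repeated samples of size n."""
    lives = scale * rng.weibull(shape, size=(trials, n))
    l10 = np.percentile(lives, 10, axis=1)   # 10th-percentile life per trial
    return l10.std()

rng = np.random.default_rng(7)
spread_small = l10_scatter(n=10, trials=2000, shape=2.0, scale=100.0, rng=rng)
spread_large = l10_scatter(n=100, trials=2000, shape=2.0, scale=100.0, rng=rng)
```

The ratio of the two spreads is the quantitative basis for recommendations like the 30-to-35-sample rule cited in the abstract.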
Weibull Parameters Estimation Based on Physics of Failure Model
DEFF Research Database (Denmark)
Kostandyan, Erik; Sørensen, John Dalsgaard
2012-01-01
Reliability estimation procedures are discussed for the example of fatigue development in solder joints using a physics of failure model. The accumulated damage is estimated based on a physics of failure model, the Rainflow counting algorithm and the Miner’s rule. A threshold model is used...... distribution. Methods from structural reliability analysis are used to model the uncertainties and to assess the reliability for fatigue failure. Maximum Likelihood and Least Square estimation techniques are used to estimate fatigue life distribution parameters....
Distributed Analysis with Java and Objectivity
Institute of Scientific and Technical Information of China (English)
MANSJeremiah
2001-01-01
New experiments, including those at the LHC, will require analysis of very large datasets, which are best handled with distributed computation. We present the design and development of a prototype framework using Java and Objectivity. Our framework solves such analysis-specific problems as selecting event samples from large distributed databases, producing variable distributions, and negotiating between multiple analysis service providers. Examples from the successful application of the prototype to the analysis of data from the L3 experiment will also be presented.
Transient stability analysis of a distribution network with distributed generators
Xyngi, I.; Ishchenko, A.; Popov, M.; Van der Sluis, L.
2009-01-01
This letter describes the transient stability analysis of a 10-kV distribution network with wind generators, microturbines, and CHP plants. The network, modeled in Matlab/Simulink, takes into account detailed dynamic models of the generators. Fault simulations at various locations are investigated.
Directory of Open Access Journals (Sweden)
Quintana Alicia Esther
2015-01-01
Full Text Available Manufacturing to optimal quality standards is underpinned by, among other essential pillars, the high reliability of equipment and systems. Maintenance Engineering is responsible for planning, control and continuous improvement of critical equipment under any approach, such as Six Sigma. It draws on numerous statistical tools, notable among them statistical process control charts. While their first applications were in production, other designs have emerged to adapt to new needs, such as monitoring equipment and systems in the manufacturing environment. The time between failures usually fits an exponential or Weibull model. The t chart and the adjusted t chart, with probabilistic control limits, are suitable alternatives for monitoring the mean time between failures. Unfortunately, it is difficult to find publications applying them to Weibull models, which are very useful in contexts such as maintenance. In addition, the literature limits the study of their performance to analysis of the standard metric, average run length, thus giving a partial view. The aim of this paper is to explore the performance of the t chart and the adjusted t chart using three metrics, two of them unconventional. To do this, it incorporates the concept of lateral variability, in its left and right forms. Greater precision about the behavior of these charts makes it possible to understand the conditions under which each is suitable: if the main objective of monitoring is to detect deterioration, the adjusted t chart is recommended; when the priority is to detect improvements, the unadjusted t chart is the better choice. However, the response speed of both charts varies considerably from run to run.
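The "probabilistic control limits" mentioned above are quantiles of the assumed time-between-failures distribution. For a Weibull TBF model they follow directly from the quantile function Q(p) = η(-ln(1-p))^(1/β). A sketch with illustrative β and η, and a false-alarm rate matched to the usual 3-sigma convention (α ≈ 0.0027):

```python
import numpy as np

def t_chart_limits(beta, eta, alpha=0.0027):
    """Probabilistic control limits for a t chart monitoring Weibull(beta, eta)
    times between failures: the alpha/2 and 1-alpha/2 quantiles."""
    q = lambda p: eta * (-np.log(1.0 - p)) ** (1.0 / beta)
    return q(alpha / 2.0), q(1.0 - alpha / 2.0)

lcl, ucl = t_chart_limits(beta=1.5, eta=100.0)

# In-control check: the fraction of TBF values inside the limits should be ~1-alpha
rng = np.random.default_rng(3)
tbf = 100.0 * rng.weibull(1.5, size=100_000)
coverage = np.mean((tbf >= lcl) & (tbf <= ucl))
```

Points below the LCL signal deterioration (failures arriving too quickly); points above the UCL signal improvement.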
Directory of Open Access Journals (Sweden)
A Lakshmana Rao
2015-02-01
Full Text Available Inventory models play an important role in determining optimal ordering and pricing policies. Much work has been reported in the literature regarding inventory models with finite or infinite replenishment. But in many practical situations the replenishment is governed by random factors like procurement, transportation, environmental conditions, availability of raw material, etc. Hence, it is necessary to develop inventory models with random replenishment. In this paper, an EPQ model for deteriorating items is developed and analyzed under the assumption that the replenishment is random and follows a Weibull distribution. It is further assumed that the lifetime of a commodity is random and follows a generalized Pareto distribution, and that demand is a function of on-hand inventory. Using differential equations, the instantaneous state of inventory is derived. With suitable cost considerations, the total cost function is obtained. By minimizing the total cost function, the optimal ordering policies are derived. Through numerical illustrations, sensitivity analysis is carried out. The sensitivity analysis of the model reveals that random replenishment has a significant influence on the ordering and pricing policies of the model. This model also includes some earlier models as particular cases for specific values of the parameters.
Weibull Analysis and Area Scaling for Infrared Window Materials (U)
2016-08-01
...are ground and polished by the same methods used to make the window. Even if machining of coupons is matched as well as possible to that of the
Directory of Open Access Journals (Sweden)
L. M. Pradhan
2012-01-01
Full Text Available This paper deals with the development of an inventory model for Weibull-deteriorating items with constant demand, when delay in payments is allowed to the retailer to settle the account against the purchases made. Shortages are not allowed, and a salvage value is associated with the deteriorated units. We consider two cases: payment within the permissible time, and payment after the expiry of the permissible time with interest. Numerical examples are provided to illustrate our results. Sensitivity analyses are carried out to analyze the effect of changes in the optimal solution with respect to a change in one parameter at a time.
ANALYSIS OF ACIDIC PROPERTIES OF DISTRIBUTION ...
African Journals Online (AJOL)
ANALYSIS OF ACIDIC PROPERTIES OF DISTRIBUTION TRANSFORMER OIL INSULATION: A ... rated above 500 kVA are classed as power transformers. ...
Hirose, Hideo
1998-01-01
TYPES OF THE DISTRIBUTION: Normal distribution (2-parameter); Uniform distribution (2-parameter); Exponential distribution (2-parameter); Weibull distribution (2-parameter); Gumbel distribution (2-parameter); Weibull/Frechet distribution (3-parameter); Generalized extreme-value distribution (3-parameter); Gamma distribution (3-parameter); Extended Gamma distribution (3-parameter); Log-normal distribution (3-parameter); Extended Log-normal distribution (3-parameter); Generalized ...
Balakrishnan, Narayanaswamy; Pal, Suvra
2016-08-01
Recently, a flexible cure rate survival model has been developed by assuming the number of competing causes of the event of interest to follow the Conway-Maxwell-Poisson distribution. This model includes some of the well-known cure rate models discussed in the literature as special cases. Data obtained from cancer clinical trials are often right censored, and the expectation maximization (EM) algorithm can be used in this case to efficiently estimate the model parameters based on right censored data. In this paper, we consider the competing cause scenario and, assuming the time-to-event to follow the Weibull distribution, we derive the necessary steps of the EM algorithm for estimating the parameters of different cure rate survival models. The standard errors of the maximum likelihood estimates are obtained by inverting the observed information matrix. The method of inference developed here is examined by means of an extensive Monte Carlo simulation study. Finally, we illustrate the proposed methodology with real data on cancer recurrence.
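The censored-likelihood machinery behind such EM-style estimation can be illustrated with a direct maximum likelihood fit. The sketch below is illustrative only (synthetic data and invented parameter values, not the authors' method or data): it maximizes the right-censored Weibull log-likelihood, where observed events contribute log f(t) and censored observations contribute log S(t).

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic right-censored Weibull lifetimes; all values are invented.
rng = np.random.default_rng(0)
true_shape, true_scale = 1.5, 10.0
t = true_scale * rng.weibull(true_shape, size=2000)   # latent event times
c = rng.uniform(5.0, 25.0, size=2000)                 # censoring times
obs = np.minimum(t, c)                                # observed times
delta = (t <= c).astype(float)                        # 1 = event, 0 = censored

def neg_loglik(params):
    # log-parametrization keeps shape k and scale lam positive
    k, lam = np.exp(params)
    z = obs / lam
    # events contribute log f(t) = log(k/lam) + (k-1) log z - z**k;
    # censored points contribute log S(t) = -z**k
    ll = delta * (np.log(k / lam) + (k - 1.0) * np.log(z)) - z**k
    return -ll.sum()

res = minimize(neg_loglik, x0=np.log([1.0, obs.mean()]), method="Nelder-Mead")
k_hat, lam_hat = np.exp(res.x)
```

The recovered estimates should sit close to the generating values, up to sampling noise.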
Gobinda Chandra Panda; Pravat Kumar Sukla
2013-01-01
Background: Physical decay or deterioration of goods in stock is an important feature of real inventory systems. Material and methods: In the present paper, we discuss a production inventory model for a Weibull deteriorating item over a finite planning horizon with a linearly time-varying demand rate and a uniform production rate, allowing shortages, which are completely backlogged. Results and conclusions: A production inventory model is developed for a Weibull deteriorating...
Linear vs. piecewise Weibull model for genetic evaluation of sires for longevity in Simmental cattle
Directory of Open Access Journals (Sweden)
Nikola Raguž
2014-09-01
Full Text Available This study was focused on the genetic evaluation of longevity in Croatian Simmental cattle using linear and survival models. The main objective was to create a genetic model that best describes the longevity data. Survival analysis, using a piecewise Weibull proportional hazards model, used all information on the length of productive life, including censored as well as uncensored observations. Linear models considered culled animals only. The relative milk production within herd had the highest impact on cows’ longevity. In a comparison of genetic parameters estimated by the different methods, survival analysis yielded a higher heritability value (0.075) than the linear sire (0.037) and linear animal (0.056) models. When linear models were used, the genetic trend of Simmental bulls for longevity was slightly increasing over the years, in contrast to the decreasing trend obtained with the survival analysis methodology. The average reliability of bulls’ breeding values was higher in the case of survival analysis. The rank correlations between bulls’ breeding values for longevity from survival analysis and from the linear models ranged between 0.44 and 0.46, implying substantial differences in the ranking of sires.
Generalized Analysis of a Distribution Separation Method
Directory of Open Access Journals (Sweden)
Peng Zhang
2016-04-01
Full Text Available Separating two probability distributions from a mixture model that is a combination of the two is essential to a wide range of applications. For example, in information retrieval (IR), there often exists a mixture distribution consisting of a relevance distribution that we need to estimate and an irrelevance distribution that we hope to get rid of. Recently, a distribution separation method (DSM) was proposed to approximate the relevance distribution, by separating a seed irrelevance distribution from the mixture distribution. It was successfully applied to an IR task, namely pseudo-relevance feedback (PRF), where the query expansion model is often a mixture term distribution. Although initially developed in the context of IR, DSM is indeed a general mathematical formulation for probability distribution separation. Thus, it is important to further generalize its basic analysis and to explore its connections to other related methods. In this article, we first extend DSM’s theoretical analysis, which was originally based on the Pearson correlation coefficient, to entropy-related measures, including the KL-divergence (Kullback–Leibler divergence), the symmetrized KL-divergence and the JS-divergence (Jensen–Shannon divergence). Second, we investigate the distribution separation idea in a well-known method, namely the mixture model feedback (MMF) approach. We prove that MMF also complies with the linear combination assumption, and that DSM’s linear separation algorithm can then largely simplify the EM algorithm in MMF. These theoretical analyses, as well as further empirical evaluation results, demonstrate the advantages of our DSM approach.
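The linear combination assumption behind DSM lends itself to a compact illustration. The sketch below is hypothetical (invented toy distributions, not the authors' code): given the observed mixture and a known seed irrelevance distribution, the relevance distribution is recovered by linear separation.

```python
import numpy as np

# Toy setup: the observed mixture m is lam*s + (1-lam)*r, where s is the
# known "seed" irrelevance distribution and r is the hidden relevance
# distribution we want to recover. All values here are invented.
rng = np.random.default_rng(1)
vocab_size = 6
r = rng.dirichlet(np.ones(vocab_size))   # hidden relevance distribution
s = rng.dirichlet(np.ones(vocab_size))   # known seed irrelevance distribution
lam = 0.4                                # assumed mixing weight
m = lam * s + (1.0 - lam) * r            # observed mixture distribution

r_hat = (m - lam * s) / (1.0 - lam)      # DSM-style linear separation step
```

When the mixing weight is known exactly, the separation is algebraically exact; in practice DSM must also estimate or bound lam, which is where the method's analysis comes in.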
Analysis of distribution of critical current of bent-damaged Bi2223 composite tape
Energy Technology Data Exchange (ETDEWEB)
Ochiai, S; Okuda, H; Hojo, M [Graduate School of Engineering, Kyoto University, Yoshida, Sakyo-ku, Kyoto 606-8501 (Japan); Sugano, M [Graduate School of Engineering, Kyoto University, Kyoto-Daigaku Katsura, Nishikyo-ku, Kyoto 615-8530 (Japan); Osamura, K [Research Institute for Applied Sciences, Sakyo-ku, Kyoto 606-8202 (Japan); Kuroda, T; Kumakura, H; Kitaguchi, H; Itoh, K; Wada, H, E-mail: shojiro.ochiai@materials.mbox.media.kyoto-u.ac.jp [National Institute for Materials Science, 1-2-1, Sengen, Tsukuba, Ibaraki 305-0047 (Japan)
2011-10-29
Distributions of critical current of damaged Bi2223 tape specimens bent by 0.6, 0.8 and 1.0% were investigated analytically with a modelling approach based on the correlation of damage evolution to the distribution of critical current. It was revealed that the distribution of critical current is described by a three-parameter Weibull distribution function through the distribution of the tensile damage strain of the Bi2223 filaments, which determines the damage front in the bent composite tape. It was also shown that the measured distribution of critical current values can be reproduced successfully by a Monte Carlo simulation using the distributions of tensile damage strain of the filaments and the original critical current.
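The inverse-transform sampling step underlying such a Monte Carlo simulation can be sketched as follows; all parameter values are invented for illustration and are not those of the paper.

```python
import numpy as np
from math import gamma

# Sample "damage strains" from a three-parameter Weibull distribution,
# F(x) = 1 - exp(-((x - loc)/scale)**shape) for x >= loc, by inverting the
# CDF. The shape/loc/scale values below are purely illustrative.
rng = np.random.default_rng(2)
shape, loc, scale = 5.0, 0.2, 0.3
u = rng.uniform(size=100_000)
strains = loc + scale * (-np.log(1.0 - u)) ** (1.0 / shape)

# Closed-form mean of the three-parameter Weibull, for a sanity check:
# E[X] = loc + scale * Gamma(1 + 1/shape)
mean_theory = loc + scale * gamma(1.0 + 1.0 / shape)
```

A full simulation of the measured critical-current distribution would then map each sampled strain through the damage-front model described in the abstract.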
Distributed Beamforming with Feedback: Convergence Analysis
Lin, C; Meyn, S
2008-01-01
The focus of this work is on the analysis of transmit beamforming schemes with a low-rate feedback link in wireless sensor/relay networks, where nodes in the network need to implement beamforming in a distributed manner. Specifically, the problem of distributed phase alignment is considered, where neither the transmitters nor the receiver has perfect channel state information, but there is a low-rate feedback link from the receiver to the transmitters. In this setting, a framework for systematically analyzing the performance of a general set of distributed beamforming schemes is proposed. To illustrate the advantage of this framework, a simple adaptive distributed beamforming scheme that was recently proposed by Mudumbai et al. is studied. Two important properties for the received signal magnitude function are derived. Using these properties and the systematic framework, it is shown that the adaptive distributed beamforming scheme converges both in probability and in mean. Furthermore, it is established that ...
Models and analysis for distributed systems
Haddad, Serge; Pautet, Laurent; Petrucci, Laure
2013-01-01
Nowadays, distributed systems are increasingly present, for public software applications as well as critical systems. This title and Distributed Systems: Design and Algorithms - from the same editors - introduce the underlying concepts, the associated design techniques and the related security issues. The objective of this book is to describe the state of the art of the formal methods for the analysis of distributed systems. Numerous issues remain open and are the topics of major research projects. One current research trend consists of pro
Distributed Algorithms for Time Optimal Reachability Analysis
DEFF Research Database (Denmark)
Zhang, Zhengkui; Nielsen, Brian; Larsen, Kim Guldstrand
2016-01-01
Time optimal reachability analysis is a novel model based technique for solving scheduling and planning problems. After modeling them as reachability problems using timed automata, a real-time model checker can compute the fastest trace to the goal states, which constitutes a time optimal schedule. We propose distributed computing to accelerate time optimal reachability analysis. We develop five distributed state exploration algorithms and implement them in Uppaal, enabling it to exploit the compute resources of a dedicated model-checking cluster. We experimentally evaluate the implemented algorithms with four models in terms of their ability to compute near- or proven-optimal solutions, their scalability, time and memory consumption and communication overhead. Our results show that distributed algorithms work much faster than sequential algorithms and have good speedup in general.
silicon bipolar distributed oscillator design and analysis
African Journals Online (AJOL)
common emitter (CE) with distributed output and analysis is carried out. The general ... the base and collector lines are not equal, so it is important to have the correct source ... Using the data obtained for the output characteristics, a graph of ( ) ...
The Analysis and Design of Distributed Systems
Aksit, Mehmet
1992-01-01
The design of distributed object-oriented systems involves a number of considerations that rarely arise in sequential object-oriented design or in non-object-oriented languages. The tutorial describes analysis and design techniques for data abstraction, inheritance, delegation, persistence,
Performance optimisations for distributed analysis in ALICE
Betev, L; Gheata, M; Grigoras, C; Hristov, P
2014-01-01
Performance is a critical issue in a production system accommodating hundreds of analysis users. Compared to a local session, distributed analysis is exposed to services and network latencies, remote data access and heterogeneous computing infrastructure, creating a more complex performance and efficiency optimization matrix. During the last 2 years, ALICE analysis shifted from a fast development phase to more mature and stable code. At the same time, the frameworks and tools for deployment, monitoring and management of large productions have evolved considerably too. The ALICE Grid production system is currently used by a fair share of organized and individual user analysis, consuming up to 30% of the available resources and ranging from fully I/O-bound analysis code to CPU-intensive correlation or resonance studies. While the intrinsic analysis performance is unlikely to improve by a large factor during the LHC long shutdown (LS1), the overall efficiency of the system has still to be improved by a...
Shuttle Electrical Power Analysis Program (SEPAP) distribution circuit analysis report
Torina, E. M.
1975-01-01
An analysis and evaluation was made of the operating parameters of the shuttle electrical power distribution circuit under load conditions encountered during a normal Sortie 2 Mission with emphasis on main periods of liftoff and landing.
Directory of Open Access Journals (Sweden)
L. M. Vas
2012-12-01
Full Text Available The short and long term creep behavior is one of the most important properties of polymers used for engineering applications. In order to study this kind of behavior of PP, tensile and short term creep measurements were performed and analyzed using a long term creep behavior estimation method based on short term tensile and creep tests performed at room temperature, viscoelastic behavior, and variable transformations. Applying Weibull distribution based approximations to the measured curves, predictions for the creep strain to failure depending on the creep load were determined, and the parameters were found by fitting the measurements. The upper, mean, and lower estimates, as well as the confidence interval for the means, give designers a basis for calculations at arbitrary creep load levels.
The Weibull functional form for the energetic particle spectrum at interplanetary shock waves
Laurenza, M.; Consolini, G.; Storini, M.; Pallocchia, G.; Damiani, A.
2016-11-01
Transient interplanetary shock waves are often associated with high energy particle enhancements, which are called energetic storm particle (ESP) events. Here we present a case study of an ESP event, recorded by the SEPT, LET and HET instruments onboard the STEREO B spacecraft, on 3 October 2011, in a wide energy range from 0.1 MeV to ∼ 30 MeV. The obtained particle spectrum is found to be reproduced by a Weibull-like shape. Moreover, we show that the Weibull spectrum can be theoretically derived as the asymptotic steady state solution of the diffusion loss equation by assuming anomalous diffusion for particle velocity. The evaluation of the Weibull parameters obtained from particle observations and the power spectral density of the turbulent fluctuations in the shock region support this scenario and suggest that stochastic acceleration can contribute significantly to the acceleration of highly energetic particles at collisionless shock waves.
Vanfleteren, J R; De Vreese, A; Braeckman, B P
1998-11-01
We have fitted Gompertz, Weibull, and two- and three-parameter logistic equations to survival data obtained from 77 cohorts of Caenorhabditis elegans in axenic culture. Statistical analysis showed that the fitting ability was in the order: three-parameter logistic > two-parameter logistic = Weibull > Gompertz. Pooled data were better fit by the logistic equations, which tended to perform equally well as population size increased, suggesting that the third parameter is likely to be biologically irrelevant. Considering restraints imposed by the small population sizes used, we simply conclude that the two-parameter logistic and Weibull mortality models for axenically grown C. elegans generally provided good fits to the data, whereas the Gompertz model was inappropriate in many cases. The survival curves of several short- and long-lived mutant strains could be predicted by adjusting only the logistic curve parameter that defines mean life span. We conclude that life expectancy is genetically determined; the life span-altering mutations reported in this study define a novel mean life span, but do not appear to fundamentally alter the aging process.
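The model-comparison idea above can be sketched with synthetic data (not the C. elegans cohorts): fit competing survival curves by least squares and compare residual errors, here for a cohort whose true law is Weibull.

```python
import numpy as np
from scipy.optimize import curve_fit

# Candidate survival functions; parameter names are illustrative.
def weibull_surv(t, lam, k):
    return np.exp(-(t / lam) ** k)

def gompertz_surv(t, a, b):
    return np.exp(-(a / b) * (np.exp(b * t) - 1.0))

# Synthetic cohort survival fractions generated from a Weibull law.
t = np.linspace(1.0, 30.0, 30)
s_obs = weibull_surv(t, 15.0, 2.5)

pw, _ = curve_fit(weibull_surv, t, s_obs, p0=[10.0, 1.0])
pg, _ = curve_fit(gompertz_surv, t, s_obs, p0=[0.01, 0.1],
                  bounds=([1e-6, 1e-4], [1.0, 1.0]))

# Sum of squared residuals: the correctly specified family should win.
sse_w = np.sum((weibull_surv(t, *pw) - s_obs) ** 2)
sse_g = np.sum((gompertz_surv(t, *pg) - s_obs) ** 2)
```

In a real analysis one would fit actual cohort survival fractions (or use maximum likelihood on the individual lifetimes) and compare fits across Gompertz, Weibull and logistic families as the authors did.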
Complexity Analysis of Peat Soil Density Distribution
Sampurno, Joko; Diah Faryuni, Irfana; Dzar Eljabbar Latief, Fourier; Srigutomo, Wahyu
2016-08-01
The distributions of peat soil density have been identified using a fractal analysis method. The study was conducted on 5 peat soil samples taken from a ground field in Pontianak, West Kalimantan, at the coordinates (0°4'2.27"S, 109°18'48.59"E). In this study, we used micro computerized tomography (µCT scanner) at 9.41 µm per pixel resolution on the peat soil samples to provide 2-D high-resolution images L1-L5 (200 × 200 pixels) that were used to detect the distribution of peat soil density. The method for determining the fractal dimension and intercept was the 2-D Fourier analysis method, which was used to obtain the log-log plot of magnitude versus frequency. The fractal dimension was obtained from the straight regression line that interpolated the points in the interval with the largest coefficient of determination, and the intercept was defined by the point of intersection on the vertical axis. The conclusion was that the distributions of peat soil density show fractal behaviour, with the heterogeneity of the samples ranked from highest to lowest as L5, L1, L4, L3 and L2. Meanwhile, the range of density values of the samples from highest to lowest was L3, L2, L4, L5 and L1. The study also concluded that the distribution of peat soil density was weakly anisotropic.
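The spectral route to a fractal exponent can be sketched as follows. This is a hedged illustration, not the authors' code: a 2-D field with a known power-law spectrum is synthesized, and the exponent is recovered from a log-log regression of power versus frequency, mirroring the regression-line step described above.

```python
import numpy as np

# Synthesize a 2-D field whose power spectrum follows P(f) ~ f**(-beta).
rng = np.random.default_rng(3)
n, beta_true = 256, 3.0

fy = np.fft.fftfreq(n)[:, None]
fx = np.fft.fftfreq(n)[None, :]
f = np.hypot(fx, fy)
f[0, 0] = 1.0                                  # avoid division by zero at DC
amp = f ** (-beta_true / 2.0)                  # amplitude ~ f^(-beta/2) => power ~ f^(-beta)
amp[0, 0] = 0.0                                # zero-mean field
spectrum = amp * np.exp(2j * np.pi * rng.uniform(size=(n, n)))
field = np.fft.ifft2(spectrum).real

# Recover the spectral slope from the field's 2-D power spectrum.
power = np.abs(np.fft.fft2(field)) ** 2
mask = (f > 0.01) & (f < 0.4)                  # regression band, away from DC and corners
slope, intercept = np.polyfit(np.log(f[mask]), np.log(power[mask]), 1)
beta_hat = -slope
```

The fractal dimension then follows from the recovered slope via the convention adopted in the paper (different conventions exist for surfaces versus profiles, so the exact mapping is not reproduced here).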
Weibull approximation of LiDAR waveforms for estimating the beam attenuation coefficient.
Montes-Hugo, Martin A; Vuorenkoski, Anni K; Dalgleish, Fraser R; Ouyang, Bing
2016-10-03
Tank experiments were performed at different water turbidities to examine relationships between the beam attenuation coefficient (c) and Weibull shape parameters derived from LiDAR waveforms measured with the Fine Structure Underwater LiDAR (FSUIL). Optical inversions were made at 532 nm, within a c range of 0.045-1.52 m⁻¹, and based on a LiDAR system having two fields of view (15 and 75.7 mrad) and two linear polarizations. Consistently, the Weibull scale parameter, or P2, showed the strongest covariation with c and was a more accurate proxy with respect to the LiDAR attenuation coefficient.
Analysis of SAW distributed feedback resonators
Vandewege, J.; Lagasse, P. E.
1981-01-01
The main characteristics and advantages of the surface acoustic wave (SAW) distributed feedback resonator are discussed. A coupled mode analysis provides physical insight and simple formulas for the resonant frequency, the quality factor, and the input impedance. Those results are verified by means of a transmission line computer model and by a number of measurements in the frequency range 30-250 MHz. On YZ LiNbO3 substrates, quality factors of the order of 5,000 are routinely obtained.
Analysis and Modelling of Extreme Wind Speed Distributions in Complex Mountainous Regions
Laib, Mohamed; Kanevski, Mikhail
2016-04-01
Modelling of wind speed distributions in complex mountainous regions is an important and challenging problem which interests many scientists from several fields. In the present research, high frequency (10 min) Swiss wind speed monitoring data (IDAWEB service, Meteosuisse) are analysed and modelled with different parametric distributions (Weibull, GEV, Gamma, etc.) using the maximum likelihood method. In total, 111 stations placed in different geomorphological units and at different altitudes (from 203 to 3580 meters) are studied. Then, this information is used for training machine learning algorithms (Extreme Learning Machines, Support Vector Machines) to predict the distribution at new places, which is potentially useful for aeolian energy generation. An important part of the research deals with the construction and application of a high dimensional input feature space generated from a digital elevation model. A comprehensive study was carried out using a feature selection approach to get the best model for the prediction. The main results are presented as spatial patterns of the distributions' parameters.
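A minimal version of the distribution-fitting step reads as follows; the data here are synthetic (not the IDAWEB records), and the fit uses plain maximum likelihood as in the paper.

```python
import numpy as np
from scipy import stats

# Synthetic wind speeds drawn from a 2-parameter Weibull law
# (shape k = 2, scale lam = 8 m/s); values are invented for illustration.
rng = np.random.default_rng(4)
speeds = 8.0 * rng.weibull(2.0, size=5000)

# floc=0 fixes the location parameter, giving the usual 2-parameter Weibull.
k_hat, loc, lam_hat = stats.weibull_min.fit(speeds, floc=0.0)
```

The same call pattern extends to the other candidate families (e.g. `stats.genextreme.fit`, `stats.gamma.fit`), after which the fitted parameters become the prediction targets for the machine learning stage described above.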
Distributed analysis in ATLAS using GANGA
Elmsheuser, Johannes; Brochu, Frederic; Cowan, Greig; Egede, Ulrik; Gaidioz, Benjamin; Lee, Hurng-Chun; Maier, Andrew; Móscicki, Jakub; Pajchel, Katarina; Reece, Will; Samset, Bjorn; Slater, Mark; Soroko, Alexander; Vanderster, Daniel; Williams, Michael
2010-04-01
Distributed data analysis using Grid resources is one of the fundamental applications in high energy physics to be addressed and realized before the start of LHC data taking. The needs to manage the resources are very high. In every experiment up to a thousand physicists will be submitting analysis jobs to the Grid. Appropriate user interfaces and helper applications have to be made available to assure that all users can use the Grid without expertise in Grid technology. These tools enlarge the number of Grid users from a few production administrators to potentially all participating physicists. The GANGA job management system (http://cern.ch/ganga), developed as a common project between the ATLAS and LHCb experiments, provides and integrates these kinds of tools. GANGA provides a simple and consistent way of preparing, organizing and executing analysis tasks within the experiment analysis framework, implemented through a plug-in system. It allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid, hiding Grid technicalities. We will be reporting on the plug-ins and our experiences of distributed data analysis using GANGA within the ATLAS experiment. Support for all Grids presently used by ATLAS, namely the LCG/EGEE, NDGF/NorduGrid, and OSG/PanDA is provided. The integration and interaction with the ATLAS data management system DQ2 into GANGA is a key functionality. An intelligent job brokering is set up by using the job splitting mechanism together with data-set and file location knowledge. The brokering is aided by an automated system that regularly processes test analysis jobs at all ATLAS DQ2 supported sites. Large numbers of analysis jobs can be sent to the locations of data following the ATLAS computing model. GANGA supports amongst other things tasks of user analysis with reconstructed data and small scale production of Monte Carlo data.
Objective Bayesian Analysis of Skew-t Distributions
BRANCO, MARCIA D'ELIA
2012-02-27
We study the Jeffreys prior and its properties for the shape parameter of univariate skew-t distributions with linear and nonlinear Student's t skewing functions. In both cases, we show that the resulting priors for the shape parameter are symmetric around zero and proper. Moreover, we propose a Student's t approximation of the Jeffreys prior that makes an objective Bayesian analysis easy to perform. We carry out a Monte Carlo simulation study that demonstrates an overall better behaviour of the maximum a posteriori estimator compared with the maximum likelihood estimator. We also compare the frequentist coverage of the credible intervals based on the Jeffreys prior and its approximation and show that they are similar. We further discuss location-scale models under scale mixtures of skew-normal distributions and show some conditions for the existence of the posterior distribution and its moments. Finally, we present three numerical examples to illustrate the implications of our results on inference for skew-t distributions. © 2012 Board of the Foundation of the Scandinavian Journal of Statistics.
Optimization of a small passive wind turbine based on mixed Weibull-turbulence statistics of wind
2008-01-01
A low-cost, fully passive wind turbine structure is proposed. The efficiency of such a device can be obtained only if the design parameters are mutually adapted through an optimization design approach. An original wind profile generation process mixing Weibull and turbulence statistics is presented. The optimization results are compared with those obtained from a particular but typical time cycle of wind speed.
Weibull statistics effective area and volume in the ball-on-ring testing method
DEFF Research Database (Denmark)
Frandsen, Henrik Lund
2014-01-01
The ball-on-ring method is, together with other biaxial bending methods, often used for measuring the strength of plates of brittle materials, because machining defects are remote from the high stresses causing the failure of the specimens. In order to scale the measured Weibull strength...
Directory of Open Access Journals (Sweden)
Lalit Mohan Pradhan
2014-03-01
Full Text Available Background: In the present competitive business scenario, researchers have developed various inventory models for deteriorating items, considering various practical situations for better inventory control. Permissible delay in payments, with various demands and deteriorations, is a comparatively new concept introduced in developing various inventory models. These models are very useful for both the consumer and the manufacturer. Methods: In the present work, an inventory model has been developed for a three-parameter Weibull deteriorating item with ramp type demand and salvage value under a trade credit system. Here we have considered a single item for developing the model. Results and conclusion: The optimal order quantity, optimal cycle time and total variable cost during a cycle have been derived for the proposed inventory model. The results obtained in this paper are illustrated with the help of numerical examples and a sensitivity analysis.
Pal, Suvra; Balakrishnan, N
2017-05-16
In this paper, we develop likelihood inference based on the expectation maximization (EM) algorithm for the Box-Cox transformation cure rate model, assuming the lifetimes to follow a Weibull distribution. A simulation study is carried out to demonstrate the performance of the proposed estimation method. Through Monte Carlo simulations, we also study the effect of model mis-specification on the estimate of the cure rate. Finally, we analyze a well-known data set on melanoma with the model and the inferential method developed here.
Analysis of Jingdong Mall Logistics Distribution Model
Shao, Kang; Cheng, Feng
In recent years, the development of electronic commerce in China has accelerated. The role of logistics has been highlighted, and more and more electronic commerce enterprises are beginning to realize the importance of logistics to the success or failure of the enterprise. In this paper, the author takes Jingdong Mall as an example, performing a SWOT analysis of the current state of its self-built logistics system to find the problems existing in the current Jingdong Mall logistics distribution and give appropriate recommendations.
Node-based analysis of species distributions
DEFF Research Database (Denmark)
Borregaard, Michael Krabbe; Rahbek, Carsten; Fjeldså, Jon;
2014-01-01
The integration of species distributions and evolutionary relationships is one of the most rapidly moving research fields today and has led to considerable advances in our understanding of the processes underlying biogeographical patterns. Here, we develop a set of metrics, the specific ... with case studies on two groups with well-described biogeographical histories: a local-scale community data set of hummingbirds in the North Andes, and a large-scale data set of the distribution of all species of New World flycatchers. The node-based analysis of these two groups generates a set of intuitively interpretable patterns that are consistent with current biogeographical knowledge. Importantly, the results are statistically tractable, opening many possibilities for their use in analyses of evolutionary, historical and spatial patterns of species diversity. The method is implemented...
Analysis and control of distributed cooperative systems.
Energy Technology Data Exchange (ETDEWEB)
Feddema, John Todd; Parker, Eric Paul; Wagner, John S.; Schoenwald, David Alan
2004-09-01
As part of the DARPA Information Processing Technology Office (IPTO) Software for Distributed Robotics (SDR) Program, Sandia National Laboratories has developed analysis and control software for coordinating tens to thousands of autonomous cooperative robotic agents (primarily unmanned ground vehicles) performing military operations such as reconnaissance, surveillance and target acquisition; countermine and explosive ordnance disposal; force protection and physical security; and logistics support. Due to the nature of these applications, the control techniques must be distributed, and they must not rely on high bandwidth communication between agents. At the same time, a single soldier must easily direct these large-scale systems. Finally, the control techniques must be provably convergent so as not to cause undue harm to civilians. In this project, provably convergent, moderate communication bandwidth, distributed control algorithms have been developed that can be regulated by a single soldier. We have simulated in great detail the control of low numbers of vehicles (up to 20) navigating throughout a building, and we have simulated in lesser detail the control of larger numbers of vehicles (up to 1000) trying to locate several targets in a large outdoor facility. Finally, we have experimentally validated the resulting control algorithms on smaller numbers of autonomous vehicles.
Buffered Communication Analysis in Distributed Multiparty Sessions
Deniélou, Pierre-Malo; Yoshida, Nobuko
Many communication-centred systems today rely on asynchronous messaging among distributed peers to make efficient use of parallel execution and resource access. With such asynchrony, the communication buffers can grow without bound over time. This paper proposes a static verification methodology based on multiparty session types which can efficiently compute upper bounds on buffer sizes. Our analysis relies on a uniform causality audit of the entire collaboration pattern - an examination that is not always possible from each end-point type. We extend this method to design algorithms that allocate communication channels in order to optimise the memory requirements of session executions. From these analyses, we propose two refinement methods which respect buffer bounds: a global protocol refinement that automatically inserts confirmation messages to guarantee stipulated buffer sizes, and a local protocol refinement to optimise asynchronous messaging without buffer overflow. Finally, our work is applied to overcome a buffer overflow problem of the multi-buffering algorithm.
Noise analysis in power distribution systems
Danisor, Alin
2016-12-01
This paper proposes an analysis, especially in the time domain, of the electrical noise present on power distribution lines. This study is important for the use of powerlines as an information transmission channel, for analog as well as digital signals. The main problem addressed in this paper consists in characterizing the background noise and establishing its statistical properties. It is very important to know whether the noise induced in the transmission channel is stationary, or even ergodic. The main parameters, such as the mean value and the mean square value, were determined in this paper, and the approximation of the probability density function of each statistical parameter was studied. The pulses induced in the transmission channel by the transient phenomena of the power electrical systems were considered deterministic signals, and their contributions were not included in this study.
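A first-pass check of weak stationarity of the kind described can be sketched by segmenting the record and comparing segment statistics; the noise below is a synthetic surrogate, not powerline data.

```python
import numpy as np

# Surrogate background noise record; a real analysis would load sampled
# powerline voltage with the transient pulses removed.
rng = np.random.default_rng(5)
noise = rng.normal(0.0, 1.0, size=100_000)

# Split the record into segments and compare per-segment mean and
# mean-square values: for weakly stationary noise these should be stable.
segments = noise.reshape(100, 1000)
seg_mean = segments.mean(axis=1)
seg_msq = (segments ** 2).mean(axis=1)
```

Large drifts in `seg_mean` or `seg_msq` across segments would argue against (weak) stationarity; ergodicity would additionally require time averages to match ensemble averages.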
Psychotherapy and distributive justice: a Rawlsian analysis.
Wilmot, Stephen
2009-03-01
In this paper I outline an approach to the distribution of resources between psychotherapy modalities in the context of the UK's health care system, using recent discussions of Cognitive Behavioural Psychotherapy as a way of highlighting resourcing issues. My main goal is to offer an approach that is just, and that accommodates the diversity of different schools of psychotherapy. In order to do this I draw extensively on the theories of Justice and of Political Liberalism developed by the late John Rawls, and adapt these to the particular requirements of psychotherapy resourcing. I explore some of the implications of this particular analysis, and consider how the principles of Rawlsian justice might translate into ground rules for deliberation and decision-making.
Sazuka, N
2006-01-01
We analyze waiting times for price changes in a foreign currency exchange rate. Recent empirical studies of high-frequency financial data support that trades in financial markets do not follow a Poisson process and the waiting times between trades are not exponentially distributed. Here we show that our data are well approximated by a Weibull distribution rather than an exponential distribution in the non-asymptotic regime. Moreover, we quantitatively evaluate how far the empirical data are from an exponential distribution using a Weibull fit. Finally, we discuss a phase transition between a Weibull law and a power law in the asymptotic long-waiting-time regime.
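The comparison described above can be sketched numerically: the fitted Weibull shape parameter k measures the departure from an exponential (which is the special case k = 1). The estimator below is a least-squares fit on Weibull-plot axes with median ranks applied to synthetic data; it is a minimal illustration, not the authors' method.

```python
import numpy as np

def weibull_plot_fit(samples):
    """Least-squares Weibull fit on the Weibull-plot axes:
    log(-log(1 - F)) = k*log(t) - k*log(lam), with median ranks for F."""
    t = np.sort(np.asarray(samples, dtype=float))
    n = t.size
    F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # median-rank CDF estimate
    k, intercept = np.polyfit(np.log(t), np.log(-np.log(1.0 - F)), 1)
    return k, np.exp(-intercept / k)              # shape, scale

rng = np.random.default_rng(0)
# Weibull waiting times with shape 0.6: sub-exponential, as reported for FX data
k_w, _ = weibull_plot_fit(rng.weibull(0.6, 20000))
# exponential waiting times are Weibull with shape exactly 1
k_e, _ = weibull_plot_fit(rng.exponential(1.0, 20000))
print(k_w, k_e)
```

A fitted shape well below 1 (as for `k_w`) indicates waiting times far from Poisson behaviour.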
ATLAS Distributed Data Analysis: challenges and performance
Fassi, Farida; The ATLAS collaboration
2015-01-01
In the LHC operations era the key goal is to analyse the results of the collisions of high-energy particles as a way of probing the fundamental forces of nature. The ATLAS experiment at the LHC at CERN is recording and simulating several 10's of PetaBytes of data per year. The ATLAS Computing Model was designed around the concepts of Grid Computing. Large data volumes from the detectors and simulations require a large number of CPUs and storage space for data processing. To cope with this challenge a global network known as the Worldwide LHC Computing Grid (WLCG) was built. This is the most sophisticated data taking and analysis system ever built. ATLAS accumulated more than 140 PB of data between 2009 and 2014. To analyse these data ATLAS developed, deployed and now operates a mature and stable distributed analysis (DA) service on the WLCG. The service is actively used: more than half a million user jobs run daily on DA resources, submitted by more than 1500 ATLAS physicists. A significant reliability of the...
Institute of Scientific and Technical Information of China (English)
SHA Yun-dong; GUO Xiao-peng; LIAO Lian-fang; XIE Li-juan
2011-01-01
Regarding the sonic fatigue problem of an aero-engine combustor liner structure under random acoustic loadings, an effective method for predicting the fatigue life of a structure under random loadings was studied. Firstly, the probability distribution of the Von Mises stress of a thin-walled structure under random loadings was studied; the analysis suggested that the probability density function of the Von Mises stress process accords approximately with a two-parameter Weibull distribution. The formulas for calculating the Weibull parameters were given. Based on Miner's linear theory, a method for predicting the random sonic fatigue life based on the stress probability density was developed, and a model for fatigue life prediction was constructed. As an example, an aero-engine combustor liner structure was considered. The power spectral density (PSD) of the vibrational stress response was calculated by using a coupled FEM/BEM (finite element method/boundary element method) model, and the fatigue life was estimated by using the constructed model. Considering the influence of the wide frequency band, the calculated results were modified. Comparative analysis shows that the sonic fatigue estimates for the combustor liner structure obtained using the Weibull distribution of Von Mises stress are more conservative than those using the Dirlik distribution to some extent. The results show that the methods presented in this paper are practical for the random fatigue life analysis of aeronautical thin-walled structures.
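The core of a Miner's-rule life prediction with a Weibull stress-amplitude density can be sketched in closed form: with an S-N curve N(S) = C/S^m, the expected damage per cycle is E[S^m]/C, and for a two-parameter Weibull E[S^m] = scale^m Γ(1 + m/shape). The sketch below uses entirely illustrative parameters (not the paper's combustor liner data) and cross-checks the moment by numerical integration.

```python
import math
import numpy as np

def fatigue_life_weibull(nu0, C, m, scale, shape):
    """Miner's-rule life (seconds) when the stress amplitude S follows a
    two-parameter Weibull(shape, scale) and the S-N curve is N(S) = C/S**m.
    Damage rate = nu0 * E[S**m] / C, with E[S**m] = scale**m * Gamma(1 + m/shape)."""
    ES_m = scale**m * math.gamma(1.0 + m / shape)
    return C / (nu0 * ES_m)

# illustrative (hypothetical) values: cycle rate, S-N constants, Weibull parameters
nu0, C, m = 50.0, 1e12, 3.0
scale, shape = 40.0, 2.0
T = fatigue_life_weibull(nu0, C, m, scale, shape)

# cross-check E[S**m] by numerically integrating the Weibull pdf
s = np.linspace(1e-6, 400.0, 400000)
pdf = (shape / scale) * (s / scale)**(shape - 1) * np.exp(-(s / scale)**shape)
ES_m_num = np.trapz(s**m * pdf, s)
print(T, ES_m_num)
```

The closed-form moment and the numerical integral agree, which is the property the life formula relies on.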
Global resilience analysis of water distribution systems.
Diao, Kegong; Sweetapple, Chris; Farmani, Raziyeh; Fu, Guangtao; Ward, Sarah; Butler, David
2016-12-01
Evaluating and enhancing resilience in water infrastructure is a crucial step towards more sustainable urban water management. As a prerequisite to enhancing resilience, a detailed understanding is required of the inherent resilience of the underlying system. Differing from traditional risk analysis, here we propose a global resilience analysis (GRA) approach that shifts the objective from analysing multiple and unknown threats to analysing the more identifiable and measurable system responses to extreme conditions, i.e. potential failure modes. GRA aims to evaluate a system's resilience to a possible failure mode regardless of the causal threat(s) (known or unknown, external or internal). The method is applied to test the resilience of four water distribution systems (WDSs) with various features to three typical failure modes (pipe failure, excess demand, and substance intrusion). The study reveals GRA provides an overview of a water system's resilience to various failure modes. For each failure mode, it identifies the range of corresponding failure impacts and reveals extreme scenarios (e.g. the complete loss of water supply with only 5% pipe failure, or still meeting 80% of demand despite over 70% of pipes failing). GRA also reveals that increased resilience to one failure mode may decrease resilience to another and increasing system capacity may delay the system's recovery in some situations. It is also shown that selecting an appropriate level of detail for hydraulic models is of great importance in resilience analysis. The method can be used as a comprehensive diagnostic framework to evaluate a range of interventions for improving system resilience in future studies.
Directory of Open Access Journals (Sweden)
Soontorn Boonta
2013-01-01
Full Text Available In this study, we applied Randomized Neighborhood Search (RNS) to estimate the Weibull parameters to determine the severity of fire accidents; the data were provided by the Thai Reinsurance Public Co., Ltd. We compared this technique with other frequently used techniques: the Maximum Likelihood Estimator (MLE), the Method of Moments (MOM), the Least Squares Method (LSM) and the Weighted Least Squares Method (WLSM), and found that RNS estimates the parameters more accurately than do MLE, MOM, LSM or WLSM.
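One of the baseline techniques compared above, the Method of Moments, can be sketched directly: the Weibull shape solves the coefficient-of-variation equation, and the scale then follows from the mean. The data and parameters below are synthetic, not the reinsurance data of the study.

```python
import math
import numpy as np

def weibull_mom(samples):
    """Method-of-moments Weibull fit: solve
    CV^2 = Gamma(1 + 2/k) / Gamma(1 + 1/k)**2 - 1 for the shape k by
    bisection (the left side is decreasing in k), then recover the scale."""
    x = np.asarray(samples, dtype=float)
    cv2 = x.var() / x.mean()**2

    def f(k):
        g1 = math.gamma(1.0 + 1.0 / k)
        g2 = math.gamma(1.0 + 2.0 / k)
        return g2 / g1**2 - 1.0 - cv2

    lo, hi = 0.05, 50.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if f(mid) > 0:   # CV still too large -> shape must grow
            lo = mid
        else:
            hi = mid
    k = 0.5 * (lo + hi)
    scale = x.mean() / math.gamma(1.0 + 1.0 / k)
    return k, scale

rng = np.random.default_rng(1)
data = 2.5 * rng.weibull(1.8, 50000)   # true shape 1.8, true scale 2.5
k_hat, scale_hat = weibull_mom(data)
print(k_hat, scale_hat)
```

With a large synthetic sample, both estimates land close to the true values; the study's point is how the methods rank on small real losses.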
Valoración de derivados europeos con mixtura de distribuciones Weibull
Directory of Open Access Journals (Sweden)
Andrés Mauricio Molina
2015-07-01
Full Text Available The Black-Scholes model for pricing European options is widely used in the market because of its easy implementation. However, it becomes rather imprecise for assets whose dynamics do not follow a lognormal distribution, so new distributions are needed to price options written on different underlying assets. Several researchers have worked on new derivative-pricing formulas assuming different distributions, either for the price of the underlying asset or for its return. This article presents two pricing formulas: one modifies the formula based on the two-parameter Weibull distribution proposed by Savickas (2002) by adding two new parameters (scale and location), and the other assumes that the asset distribution is a mixture of Weibull distributions. Comparisons of these models with existing ones, such as Black-Scholes and the Savickas model with a simple Weibull distribution, are also presented.
Distribution entropy analysis of epileptic EEG signals.
Li, Peng; Yan, Chang; Karmakar, Chandan; Liu, Changchun
2015-01-01
It is an open-ended challenge to accurately detect epileptic seizures from electroencephalogram (EEG) signals. Recently published studies have made elaborate attempts to distinguish between normal and epileptic EEG signals by advanced nonlinear entropy methods, such as approximate entropy, sample entropy, fuzzy entropy, and permutation entropy. Most recently, a novel distribution entropy (DistEn) has been reported to have superior performance compared with the conventional entropy methods, especially for short-length data. We thus aimed, in the present study, to show the potential of DistEn in the analysis of epileptic EEG signals. The publicly accessible Bonn database, which consists of normal, interictal, and ictal EEG signals, was used in this study. Three different measurement protocols were set for a better understanding of the performance of DistEn: i) calculate the DistEn of a specific EEG signal using the full recording; ii) calculate the DistEn by averaging the results for all its possible non-overlapped 5-second segments; and iii) calculate it by averaging the DistEn values for all the possible non-overlapped segments of 1-second length. Results for all three protocols indicated a statistically significantly increased DistEn for the ictal class compared with both the normal and interictal classes. Besides, the results obtained under the third protocol, which only used very short segments (1 s) of EEG recordings, showed a significantly (p entropy algorithm. The capability of discriminating between normal and interictal EEG signals is of great clinical relevance, since it may provide helpful tools for the detection of seizure onset. Therefore, our study suggests that DistEn analysis of EEG signals is very promising for clinical and even portable EEG monitoring.
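The DistEn measure itself is simple to sketch: embed the series, take all pairwise Chebyshev distances between embedded vectors, histogram them, and return the normalized Shannon entropy of that histogram. The embedding dimension m and bin count M below are illustrative choices, not necessarily those used in the study.

```python
import numpy as np

def dist_en(x, m=2, M=64):
    """Distribution entropy sketch: normalized Shannon entropy of the
    histogram of pairwise Chebyshev distances between delay-embedded vectors."""
    x = np.asarray(x, dtype=float)
    N = x.size - m + 1
    X = np.column_stack([x[i:i + N] for i in range(m)])      # delay embedding
    d = np.abs(X[:, None, :] - X[None, :, :]).max(axis=2)    # Chebyshev distances
    dist = d[np.triu_indices(N, k=1)]                        # each pair once
    p, _ = np.histogram(dist, bins=M)
    p = p / p.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum() / np.log2(M))       # in [0, 1]

rng = np.random.default_rng(2)
en_noise = dist_en(rng.standard_normal(500))       # irregular signal
en_sine = dist_en(np.sin(0.1 * np.arange(500)))    # regular signal
print(en_noise, en_sine)
```

Because the log2(M) normalization bounds the value in [0, 1], DistEn values from recordings of different lengths remain comparable, which is what makes the 1-second-segment protocol workable.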
Sensitivity analysis of distributed volcanic source inversion
Cannavo', Flavio; Camacho, Antonio G.; González, Pablo J.; Puglisi, Giuseppe; Fernández, José
2016-04-01
A recently proposed algorithm (Camacho et al., 2011) claims to rapidly estimate magmatic sources from surface geodetic data without any a priori assumption about source geometry. The algorithm takes advantage of the fast calculation of analytical models and adds the capability to model free-shape distributed sources. Assuming homogeneous elastic conditions, the approach can determine general geometrical configurations of pressured and/or density sources and/or sliding structures corresponding to prescribed values of anomalous density, pressure and slip. These source bodies are described as aggregations of elemental point sources for pressure, density and slip, and they fit the whole dataset (keeping some 3D regularity conditions). Although some examples and applications have already been presented to demonstrate the ability of the algorithm in reconstructing a magma pressure source (e.g. Camacho et al., 2011; Cannavò et al., 2015), a systematic analysis of the sensitivity and reliability of the algorithm is still lacking. In this explorative work we present results from a large statistical test designed to evaluate the advantages and limitations of the methodology by assessing its sensitivity to the free and constrained parameters involved in inversions. In particular, besides the source parameters, we focused on the ground deformation network topology and on noise in the measurements. The proposed analysis can be used for a better interpretation of the algorithm's results in real-case applications. Camacho, A. G., González, P. J., Fernández, J. & Berrino, G. (2011) Simultaneous inversion of surface deformation and gravity changes by means of extended bodies with a free geometry: Application to deforming calderas. J. Geophys. Res. 116. Cannavò F., Camacho A.G., González P.J., Mattia M., Puglisi G., Fernández J. (2015) Real Time Tracking of Magmatic Intrusions by means of Ground Deformation Modeling during Volcanic Crises, Scientific Reports, 5 (10970) doi:10.1038/srep
Institute of Scientific and Technical Information of China (English)
FAN Xiaoyi; QIAO Jianping
2006-01-01
Landslide data from the Three Gorges Area of northeast Chongqing were analyzed. The results showed that the landslide frequency distributions over slope gradients accord with the Weibull probability density function. The landslide hazard ratios of the gradients were acquired from the Weibull cumulative probability distribution function in the different geological units. The landslide hazard ratios of the different geological units did not agree with the variance of the landslide gradients, although they were approximately homologous in the Jurassic strata. The results indicate that the Weibull distribution can quantitatively evaluate the landslide hazard ratios of the gradients of the different strata in the Three Gorges Area.
Corroded scale analysis from water distribution pipes
Directory of Open Access Journals (Sweden)
Rajaković-Ognjanović Vladana N.
2011-01-01
Full Text Available The subject of this study was the steel pipes that are part of Belgrade's drinking water supply network. In order to investigate the mutual effects of corrosion and water quality, the corrosion scales on the pipes were analyzed. The idea was to improve control of corrosion processes and to prevent the impact of corrosion on water quality degradation. The instrumental methods used for corrosion scale characterization were: scanning electron microscopy (SEM), for the investigation of the surfaces and microstructure of the corrosion scales; X-ray diffraction (XRD), for the analysis of the solid phases present inside the scales; and the BET adsorption isotherm, for surface area determination. Depending on the composition of the water next to the pipe surface, corrosion of iron results in the formation of different compounds and solid phases. The composition and structure of the iron scales in drinking water distribution pipes depend on the type of the metal and the composition of the aqueous phase. Their formation is probably governed by several factors that include water quality parameters such as pH, alkalinity, buffer intensity, natural organic matter (NOM) concentration, and dissolved oxygen (DO) concentration. Factors such as water flow patterns, seasonal fluctuations in temperature, and microbiological activity, as well as water treatment practices such as the application of corrosion inhibitors, can also influence corrosion scale formation and growth. Therefore, the corrosion scales found in iron and steel pipes are expected to have unique features for each site. Compounds found in iron corrosion scales often include goethite, lepidocrocite, magnetite, hematite, ferrous oxide, siderite, ferrous hydroxide, ferric hydroxide, ferrihydrite, calcium carbonate and green rusts. Iron scales have characteristic features that include: corroded floor, porous core that contains
RELIABILITY ANALYSIS OF POWER DISTRIBUTION SYSTEMS
Directory of Open Access Journals (Sweden)
Popescu V.S.
2012-04-01
Full Text Available Power distribution systems are basic parts of power systems, and the reliability of these systems is at present a key issue for power engineering development and requires special attention. The operation of distribution systems is accompanied by a number of factors that produce a large number of random, unplanned interruptions. Research has shown that the predominant factors that have a significant influence on the reliability of distribution systems are: weather conditions (39.7%), defects in equipment (25%) and unknown random factors (20.1%). The article studies the influence of this random behaviour and presents reliability estimates for predominantly rural electrical distribution systems.
Shark: Fast Data Analysis Using Coarse-grained Distributed Memory
2013-05-01
Shark: Fast Data Analysis Using Coarse-grained Distributed Memory. Clifford Engle, Electrical Engineering and Computer Sciences, University of... Approved for public release; distribution unlimited. Shark is a
Weakest-Link Scaling and Finite Size Effects on Recurrence Times Distribution
Hristopulos, Dionissios T; Kaniadakis, Giorgio
2013-01-01
Tectonic earthquakes result from the fracturing of the Earth's crust due to the loading induced by the motion of the tectonic plates. Hence, the statistical laws of earthquakes must be intimately connected to the statistical laws of fracture. The Weibull distribution is a commonly used model of earthquake recurrence times (ERT). Nevertheless, deviations from Weibull scaling have been observed in ERT data and in fracture experiments on quasi-brittle materials. We propose that the weakest-link-scaling theory for finite-size systems leads to the kappa-Weibull function, which implies a power-law tail for the ERT distribution. We show that the ERT hazard rate function decreases linearly after a waiting time which is proportional to the system size (in terms of representative volume elements) raised to the inverse of the Weibull modulus. We also demonstrate that the kappa-Weibull can be applied to strongly correlated systems by means of simulations of a fiber bundle model.
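The key property claimed above, a Weibull-like body with a power-law tail, can be illustrated with the Kaniadakis kappa-exponential, assuming the standard form exp_kappa(u) = (sqrt(1 + kappa^2 u^2) + kappa u)^(1/kappa); the kappa-Weibull survival function then behaves like a Weibull for short times and like t^(-alpha/kappa) for long times. Parameters below are illustrative.

```python
import numpy as np

def exp_kappa(u, kappa):
    """Kaniadakis kappa-exponential; reduces to exp(u) as kappa -> 0."""
    return (np.sqrt(1.0 + kappa**2 * u**2) + kappa * u)**(1.0 / kappa)

def kappa_weibull_survival(t, alpha, beta, kappa):
    """kappa-Weibull survival: Weibull-like at short times,
    power-law tail ~ t**(-alpha/kappa) at long times (a sketch)."""
    return exp_kappa(-(t / beta)**alpha, kappa)

alpha, beta, kappa = 1.5, 1.0, 0.5
t = np.array([1e3, 1e4])                       # deep in the tail
S = kappa_weibull_survival(t, alpha, beta, kappa)
# the log-log slope of the tail should approach -alpha/kappa = -3
slope = np.log(S[1] / S[0]) / np.log(t[1] / t[0])
print(slope)
```

The power-law tail is what distinguishes the kappa-Weibull from a plain Weibull, whose survival function decays much faster at long recurrence times.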
Scaling analysis of meteorite shower mass distributions
DEFF Research Database (Denmark)
Oddershede, Lene; Meibom, A.; Bohr, Jakob
1998-01-01
Meteorite showers are the remains of extraterrestrial objects captured by the gravitational field of the Earth. We have analyzed the mass distributions of fragments from 16 meteorite showers for scaling. The distributions exhibit distinct scaling behavior over several orders of magnitude...
Web Based Distributed Coastal Image Analysis System Project
National Aeronautics and Space Administration — This project develops Web based distributed image analysis system processing the Moderate Resolution Imaging Spectroradiometer (MODIS) data to provide decision...
Mixture Distribution Approach In Financial Risk Analysis
Kocak, Keziban; Calis, Nazif; Unal, Deniz
2014-01-01
In recent years, major changes in stock exchange prices have made it necessary to measure financial risk. Nowadays, Value-at-Risk (VaR) is often used to calculate financial risk. Parametric methods, which require normality, are mostly used in the calculation of VaR. If the financial data do not fit the normal distribution, mixture-of-normals models can be fitted to the data. In this study, the financial risk is calculated by using a normal mixture distribution ...
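Once a normal mixture has been fitted, VaR at a given level is just a quantile of the mixture, which has no closed form but is easy to solve numerically. The sketch below (illustrative two-regime parameters, not from the study) finds the 1% quantile by bisection on the mixture CDF.

```python
import math

def mixture_cdf(x, comps):
    """CDF of a normal mixture; comps = [(weight, mu, sigma), ...]."""
    return sum(w * 0.5 * (1.0 + math.erf((x - mu) / (s * math.sqrt(2.0))))
               for w, mu, s in comps)

def mixture_var(comps, level=0.01):
    """Value-at-Risk as the lower-tail quantile of the return distribution,
    solved by bisection on the (monotone) mixture CDF."""
    lo, hi = -50.0, 50.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mixture_cdf(mid, comps) < level:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# hypothetical daily-return model: a calm regime and a turbulent regime
comps = [(0.9, 0.0005, 0.01), (0.1, -0.002, 0.03)]
var_99 = mixture_var(comps, 0.01)   # 1% return quantile (99% VaR)
print(var_99)
```

The heavy-tailed turbulent component dominates the 1% quantile, which is exactly why a single normal would understate the risk here.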
A Distributional Analysis of Green Tax Reforms
Gilbert E. Metcalf
1999-01-01
I measure the distributional impact of a shift toward greater reliance on environmental taxes (a green tax reform) using both annual and lifetime income measures to rank households. An environmental tax reform can be designed that has a negligible impact on the income distribution when the funds are rebated to households through reductions in the payroll tax and personal income tax. I also analyze trade-offs among competing goals of efficiency, equity, and ease of administration in the design...
Microbubble Size Distributions Data Collection and Analysis
2016-06-13
A technique for determining the size distribution of micron-size bubbles from underway measurements at sea is described. A camera... Properties of micron-sized bubble aggregates in sea water were investigated to determine their influence on the... problem during this study. This paper will discuss bubble size and size distribution measurements in sea water while underway. A technique to detect
Survival Analysis of Patients with End Stage Renal Disease
Urrutia, J. D.; Gayo, W. S.; Bautista, L. A.; Baccay, E. B.
2015-06-01
This paper provides a survival analysis of End Stage Renal Disease (ESRD) under Kaplan-Meier estimates and the Weibull distribution. The data were obtained from the records of V. L. Makabali Memorial Hospital with respect to time t (patient's age), covariates such as developed secondary disease (Pulmonary Congestion and Cardiovascular Disease), gender, and the event of interest: the death of ESRD patients. Survival and hazard rates were estimated using NCSS for the Weibull distribution and SPSS for the Kaplan-Meier estimates. These lead to the same conclusion: the hazard rate increases and the survival rate decreases over time for ESRD patients diagnosed with Pulmonary Congestion, Cardiovascular Disease, or both diseases. It also shows that female patients have a greater risk of death compared to males. The probability of risk was given by the equation R = 1 − e^(−H(t)), where e^(−H(t)) is the survival function and H(t) is the cumulative hazard function, which was created using Cox regression.
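The relation R = 1 − e^(−H(t)) used above can be made concrete for a Weibull model, where the cumulative hazard is H(t) = (t/scale)^shape. The parameters below are hypothetical, chosen only to show a hazard that rises with age; they are not the paper's fitted values.

```python
import math

def weibull_hazard(t, shape, scale):
    """Weibull hazard h(t) and cumulative hazard H(t) = (t/scale)**shape."""
    h = (shape / scale) * (t / scale)**(shape - 1)
    H = (t / scale)**shape
    return h, H

def risk(t, shape, scale):
    """Probability of the event by time t: R = 1 - exp(-H(t)),
    where exp(-H(t)) is the survival function S(t)."""
    _, H = weibull_hazard(t, shape, scale)
    return 1.0 - math.exp(-H)

# hypothetical parameters (shape > 1 means the hazard increases with age)
shape, scale = 1.8, 70.0
r40, r60, r80 = (risk(a, shape, scale) for a in (40, 60, 80))
print(r40, r60, r80)
```

A shape parameter above 1 reproduces the paper's qualitative finding: risk accumulates faster at higher ages.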
Some challenges of wind modelling for modern wind turbines: The Weibull distribution
DEFF Research Database (Denmark)
Gryning, Sven-Erik; Batchvarova, Ekatarina; Floors, Rogier;
2012-01-01
Wind power assessments, as well as forecasts of wind energy production, are key issues in wind energy and grid-related studies. However, the hub height of today's wind turbines is well above the surface layer. Wind profile studies based on mast data show that the wind profile above the surface layer depends on the planetary boundary layer (PBL) structure and height, thus on parameters that are not accounted for in today's traditionally applied flow simulation models and parameterizations. Here we report on one year of measurements of the wind profile performed by use of a long-range wind lidar (WSL 70) up to a height of 600 meters with 50-meter resolution. The lidar is located at a flat coastal site. The applicability of the WRF model to predict some of the important parameters for wind energy has been investigated. In this presentation, some general results on the ability of WRF to predict the wind profile...
Some challenges of wind modelling for modern wind turbines: The Weibull distribution
Gryning, Sven-Erik; Batchvarova, Ekatarina; Floors, Rogier; Pena Diaz, Alfredo
2012-01-01
Wind power assessments, as well as forecast of wind energy production, are key issues in wind energy and grid related studies. However the hub height of today’s wind turbines is well above the surface layer. Wind profiles studies based on mast data show that the wind profile above the surface layer depends on the planetary boundary layer (PBL) structure and height, thus parameters that are not accounted for in today’s traditional applied flow simulation models and parameterizations. Here we r...
Application of Small Sample Analysis in Life Estimation of Aeroengine Components
Institute of Scientific and Technical Information of China (English)
NIE Ting
2010-01-01
The samples of fatigue life tests for aeroengine components usually number fewer than 5, so the evaluation of these samples belongs to small sample analysis. The Weibull distribution is known to describe fatigue life data accurately, and the Weibayes method (developed from the Bayesian method) expands on experiential data in the small sample analysis of fatigue life in aeroengines. Based on Weibull analysis, a program was developed to improve the efficiency of the reliability analysis of aeroengine components. The complete cycle fatigue life of a component was evaluated by this program. From the results, the following conclusions were drawn: (a) this program can be used for engineering applications, and (b) while a lack of prior test data lowers the validity of the evaluation results, the Weibayes method ensures that the results of the small sample analysis do not deviate far from the truth.
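The textbook Weibayes estimate assumes the Weibull shape beta is known from prior experience with similar parts and estimates only the characteristic life eta; this makes it usable with the tiny samples described above. The numbers below are illustrative, not the paper's test data.

```python
def weibayes_eta(times, failures, beta):
    """Weibayes estimate of the Weibull characteristic life:
    eta = (sum(t_i**beta) / r)**(1/beta), where the t_i are the run times
    of all units (failed or suspended) and r is the number of failures
    (taken as at least 1 here, the usual convention when none occurred)."""
    r = max(failures, 1)
    return (sum(t**beta for t in times) / r)**(1.0 / beta)

# small-sample example: 4 units run to the listed hours, 1 failure observed,
# shape beta = 3.0 assumed from experience with similar components
times = [1500.0, 1800.0, 2000.0, 2200.0]
eta = weibayes_eta(times, failures=1, beta=3.0)
print(eta)
```

Because beta is fixed in advance, a single observed failure (or even none) yields a usable, if conservative, life estimate.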
Short circuit analysis of distribution system with integration of DG
DEFF Research Database (Denmark)
Su, Chi; Liu, Zhou; Chen, Zhe
2014-01-01
Integration of distributed generation (DG) such as wind turbines into distribution systems is increasing all around the world, because of its flexible and environmentally friendly characteristics. However, DG integration may change the pattern of the fault currents in the distribution system as well. The results in this paper are based on mathematical analysis and a simulation study in DIgSILENT PowerFactory.
Analysis of Temperature Distributions in Nighttime Inversions
Telyak, Oksana; Krasouski, Aliaksandr; Svetashev, Alexander; Turishev, Leonid; Barodka, Siarhei
2015-04-01
Adequate prediction of temperature inversions in the atmospheric boundary layer is one of the prerequisites for successful forecasting of meteorological parameters and severe weather events. Examples include surface air temperature and precipitation forecasting, as well as prediction of fog, frosts and smog with hazardous levels of atmospheric pollution. At the same time, reliable forecasting of temperature inversions remains an unsolved problem. For the prediction of nighttime inversions over a specific territory, it is important to study the characteristic features of local circulation cell formation and to properly take local factors into account to develop custom modeling techniques for operational use. The present study aims to investigate and analyze vertical temperature distributions in tropospheric inversions (isotherms) over the territory of Belarus. We study several specific cases of formation, evolution and decay of deep nighttime temperature inversions in Belarus by means of mesoscale numerical simulations with the WRF model, considering basic mechanisms of isothermal and inverse temperature layer formation in the troposphere and the impact of these layers on local circulation cells. Our primary goal is to assess the feasibility of advance prediction of inversion formation with WRF. Modeling results reveal that all cases under consideration have characteristic features of radiative inversions (e.g., their formation times, development phases, inversion intensities, etc.). Regions of "blocking"-layer formation are extensive and often spread over the entire territory of Belarus. Inversion decay starts from the lowermost (near-surface) layer (altitudes of 5 to 50 m). In all cases, one can observe the formation of temperature gradients that substantially differ from the basic inversion gradient, i.e. the layer splits into smaller layers, each having a different temperature stratification (isothermal, adiabatic, etc.). As opposed to various empirical techniques as well as
Field distribution analysis in deflecting structures
Energy Technology Data Exchange (ETDEWEB)
Paramonov, V.V. [Joint Inst. for Nuclear Research, Moscow (Russian Federation)
2013-02-15
Deflecting structures are now used mainly for bunch rotation in emittance exchange concepts, for bunch diagnostics and to increase the luminosity. The bunch rotation is a transformation of a particle distribution in the six-dimensional phase space. Together with the expected transformations, deflecting structures introduce distortions due to particularities - aberrations - in the deflecting field distribution. The distributions of deflecting fields are considered with respect to non-linear contributions, which cause emittance deterioration during a transformation. The deflecting field is treated as a combination of the hybrid waves HE1 and HM1. Criteria for the selection and formation of deflecting structures with a minimized level of aberrations are formulated and applied to known structures. Results of the study are confirmed by comparison with results of numerical simulations.
Hammou Elotmany; M'Hamed Eddahbi
2015-01-01
Hammou El-otmany, M'hamed Eddahbi, Faculté des Sciences et Techniques, Marrakech, Maroc; Laboratoire de méthodes stochastiques appliquées à la finance et actuariat (LaMsaFA). Abstract. In the present paper we propose a new stochastic diffusion process with drift proportional to the Weibull density function, defined as X_0 = x, dX_t = (γ t (1 − t^(γ+1)) − t^γ X_t) dt + σ X_t dB_t, t > 0, with parameters γ > 0 and σ...
The effect of ignoring individual heterogeneity in Weibull log-normal sire frailty models
DEFF Research Database (Denmark)
Damgaard, Lars Holm; Korsgaard, Inge Riis; Simonsen, J;
2006-01-01
The objective of this study was, by means of simulation, to quantify the effect of ignoring individual heterogeneity in Weibull sire frailty models on parameter estimates and to address the consequences for genetic inferences. Three simulation studies were evaluated, which included 3 levels … the software Survival Kit for the incomplete sire model. For the incomplete sire model, the Monte Carlo and Survival Kit parameter estimates were similar. This study established that when unobserved individual heterogeneity was ignored, the parameter estimates that included sire effects were biased toward zero...
On the Fourier Spectra of Distributions in Clifford Analysis
Institute of Scientific and Technical Information of China (English)
Fred BRACKX; Bram De KNOCK; Hennie De SCHEPPER
2006-01-01
In recent papers by Brackx, Delanghe and Sommen, some fundamental higher dimensional distributions have been reconsidered in the framework of Clifford analysis,eventually leading to the introduction of four broad classes of new distributions in Euclidean space. In the current paper we continue the in-depth study of these distributions, more specifically the study of their behaviour in frequency space, thus extending classical results of harmonic analysis.
Response Time Analysis of Distributed Web Systems Using QPNs
Directory of Open Access Journals (Sweden)
Tomasz Rak
2015-01-01
Full Text Available A performance model is used for studying distributed Web systems. Performance evaluation is done by obtaining load test measurements. The Queueing Petri Nets formalism supports modeling and performance analysis of distributed World Wide Web environments. The proposed distributed Web system modeling and design methodology has been applied in the evaluation of several system architectures under different external loads. Furthermore, performance analysis is done to determine the system response time.
Analysis of refrigerant mal-distribution
DEFF Research Database (Denmark)
Kærn, Martin Ryhl; Elmegaard, Brian
2009-01-01
to be two straight tubes. The refrigerant maldistribution is then induced to the evaporator by varying the vapor quality at the inlet to each tube and the air-flow across each tube. Finally it is shown that mal-distribution can be compensated by an intelligent distributor, that ensures equal superheat...
Analysis of refrigerant mal-distribution
DEFF Research Database (Denmark)
Kærn, Martin Ryhl; Elmegaard, Brian
2009-01-01
is developed in the object-oriented modeling language Modelica. The evaporator model is a dynamic distributed one-dimensional homogeneous model, but will be used here to present results in steady state. Fin-and-tube evaporators usually have a complex circuitry, however the evaporator will be simplified...
Integer sparse distributed memory: analysis and results.
Snaider, Javier; Franklin, Stan; Strain, Steve; George, E Olusegun
2013-10-01
Sparse distributed memory is an auto-associative memory system that stores high dimensional Boolean vectors. Here we present an extension of the original SDM, the Integer SDM that uses modular arithmetic integer vectors rather than binary vectors. This extension preserves many of the desirable properties of the original SDM: auto-associativity, content addressability, distributed storage, and robustness over noisy inputs. In addition, it improves the representation capabilities of the memory and is more robust over normalization. It can also be extended to support forgetting and reliable sequence storage. We performed several simulations that test the noise robustness property and capacity of the memory. Theoretical analyses of the memory's fidelity and capacity are also presented.
Nonlinear Progressive Collapse Analysis Including Distributed Plasticity
Mohamed Osama Ahmed; Imam Zubair Syed; Khattab Rania
2016-01-01
This paper demonstrates the effect of incorporating distributed plasticity in nonlinear analytical models used to assess the potential for progressive collapse of steel framed regular building structures. Emphasis in this paper is on the deformation response under the notionally removed column, in a typical Alternate Path (AP) method. The AP method employed in this paper is based on the provisions of the Unified Facilities Criteria – Design of Buildings to Resist Progressive Collapse, develop...
Economic analysis of efficient distribution transformer trends
Energy Technology Data Exchange (ETDEWEB)
Downing, D.J.; McConnell, B.W.; Barnes, P.R.; Hadley, S.W.; Van Dyke, J.W.
1998-03-01
This report outlines an approach that will account for uncertainty in the development of evaluation factors used to identify transformer designs with the lowest total owning cost (TOC). The TOC methodology is described and the most highly variable parameters are discussed. The model is developed to account for uncertainties as well as statistical distributions for the important parameters. Sample calculations are presented. The TOC methodology is applied to data provided by two utilities in order to test its validity.
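The TOC evaluation in its standard two-factor form, TOC = bid price + A × no-load loss + B × load loss (with A and B in $/W), can be combined with Monte Carlo sampling of the uncertain evaluation factors. The candidate designs and the distributions below are illustrative assumptions, not the report's data:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Two candidate 75 kVA designs: (bid price $, no-load loss W, load loss W) -- invented numbers.
designs = {"low-loss": (2800.0, 120.0, 700.0),
           "standard": (2400.0, 180.0, 1000.0)}

# Uncertain evaluation factors, $/W (assumed lognormal spreads).
A = rng.lognormal(mean=np.log(3.0), sigma=0.25, size=N)
B = rng.lognormal(mean=np.log(1.0), sigma=0.25, size=N)

# TOC = bid price + A * no-load loss + B * load loss, sampled over (A, B).
toc = {name: bp + A * nll + B * ll for name, (bp, nll, ll) in designs.items()}

for name, t in toc.items():
    print(f"{name:9s} mean TOC ${t.mean():8.0f}  90th pct ${np.percentile(t, 90):8.0f}")

# Probability that the low-loss design actually has the lower TOC:
p = (toc["low-loss"] < toc["standard"]).mean()
print(f"P(low-loss wins) = {p:.2f}")
```

Reporting the full distribution of the TOC difference, rather than a single point estimate, is exactly the kind of uncertainty accounting the report argues for.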
Nonlinear Progressive Collapse Analysis Including Distributed Plasticity
Directory of Open Access Journals (Sweden)
Mohamed Osama Ahmed
2016-01-01
Full Text Available This paper demonstrates the effect of incorporating distributed plasticity in nonlinear analytical models used to assess the potential for progressive collapse of steel framed regular building structures. Emphasis in this paper is on the deformation response under the notionally removed column, in a typical Alternate Path (AP) method. The AP method employed in this paper is based on the provisions of the Unified Facilities Criteria – Design of Buildings to Resist Progressive Collapse, developed and updated by the U.S. Department of Defense [1]. The AP method is often used to assess the potential for progressive collapse of building structures that fall under Occupancy Category III or IV. A case study steel building is used to examine the effect of incorporating distributed plasticity, where moment frames were used on the perimeter as well as the interior of the three-dimensional structural system. It is concluded that the use of moment resisting frames within the structural system will enhance resistance to progressive collapse through ductile deformation response and that it is conservative to ignore the effects of distributed plasticity in determining peak displacement response under the notionally removed column.
Cortellini, Davide; Canale, Angelo; Souza, Rodrigo O A; Campos, Fernanda; Lima, Julia C; Özcan, Mutlu
2015-12-01
The aim of this study was to evaluate the durability of lithium disilicate crowns bonded on abutments prepared with two types of finish lines after long-term cyclic loading. Pressed lithium disilicate all-ceramic molar crowns were bonded (Variolink II) to epoxy abutments (height: 5.5 mm, Ø: 7.5 mm, conicity: 6°) (N = 20) with either knife-edge (KE) or large chamfer (LC) finish lines. Each assembly was submitted to cyclic loading (1,200,000×; 200 N; 1 Hz) in water and then tested until fracture in a universal testing machine (1 mm/min). Failure types were classified and further evaluated under stereomicroscope and SEM. The data (N) were analyzed using one-way ANOVA. Weibull distribution values including the Weibull modulus (m), characteristic strength (σ0), probability of failure at 5% (σ0.05) and 1% (σ0.01), and correlation coefficient were calculated. Type of finish line did not significantly influence the mean fracture strength of pressed ceramic crowns (KE: 1655 ± 353 N; LC: 1618 ± 263 N) (p = 0.7898). The Weibull distribution presented a lower shape value (m) for KE (m = 5.48; CI: 3.5 to 8.6) compared to LC (m = 7.68; CI: 5.2 to 11.3). Characteristic strengths (σ0) (KE: 1784.9 N; LC: 1712.1 N) were higher than the loads at 5% probability of failure (σ0.05) (KE: 1038.1 N; LC: 1163.4 N), followed by those at 1% (σ0.01) (KE: 771 N; LC: 941.1 N), with a correlation coefficient of 0.966 for KE and 0.924 for LC. Type V failures (severe fracture of the crown and/or tooth) were more common in both groups. SEM findings showed that fractures occurred mainly from the cement/ceramic interface at the occlusal side of the crowns. Lithium disilicate ceramic crowns bonded onto abutment teeth with KE preparation resulted in similar fracture strength to those bonded on abutments with LC finish line. Pressed lithium disilicate ceramic crowns may not require invasive finish line preparations since finish line type did not impair the strength after aging conditions.
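The Weibull quantities reported above (modulus m, characteristic strength σ0, and the 5% and 1% failure loads) can be estimated from a sample of fracture loads. A hedged sketch using median-rank regression on synthetic loads, not the study's raw data:

```python
import numpy as np

rng = np.random.default_rng(1)
loads = np.sort(rng.weibull(7.0, size=20) * 1700.0)   # N, synthetic stand-in sample

n = len(loads)
i = np.arange(1, n + 1)
F = (i - 0.3) / (n + 0.4)            # Bernard's median-rank estimator

# Linearize F(x) = 1 - exp(-(x/s0)^m):  ln(-ln(1-F)) = m*ln(x) - m*ln(s0)
y = np.log(-np.log(1.0 - F))
x = np.log(loads)
m, c = np.polyfit(x, y, 1)           # slope = m, intercept = -m*ln(s0)
s0 = np.exp(-c / m)

# Load at which failure probability reaches p: x_p = s0 * (-ln(1-p))^(1/m)
for p in (0.05, 0.01):
    print(f"load at {p:.0%} failure probability: {s0 * (-np.log(1.0 - p)) ** (1.0 / m):.0f} N")
print(f"m = {m:.2f}, characteristic strength = {s0:.0f} N")
```

A steeper m (as for the LC group in the study) means a tighter strength distribution, which is why the 5% and 1% loads sit closer to σ0 for high-m samples.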
Empirical analysis for Distributed Energy Resources' impact on future distribution network
DEFF Research Database (Denmark)
Han, Xue; Sandels, Claes; Zhu, Kun;
2012-01-01
operation will be alternated. In this paper, quantitative results in terms of how the future distribution grid will be changed by the deployment of distributed generation, active demand and electric vehicles, are presented. The analysis is based on the conditions for both a radial and a meshed distribution...... network. The input parameters are based on the current and envisioned DER deployment scenarios proposed for Sweden....
A Comparison of Distribution Free and Non-Distribution Free Factor Analysis Methods
Ritter, Nicola L.
2012-01-01
Many researchers recognize that factor analysis can be conducted on both correlation matrices and variance-covariance matrices. Although most researchers extract factors from non-distribution free or parametric methods, researchers can also extract factors from distribution free or non-parametric methods. The nature of the data dictates the method…
Transferability of Charpy Absorbed Energy to Fracture Toughness Based on Weibull Stress Criterion
Institute of Scientific and Technical Information of China (English)
Hongyang JING; Lianyong XU; Lixing HUO; Fumiyoshi Minami
2005-01-01
The relationship between Charpy absorbed energy and fracture toughness by the crack tip opening displacement (CTOD) method was analyzed based on the Weibull stress criterion. The Charpy absorbed energy and the fracture toughness were measured for SN490B steel in the ductile-brittle transition temperature region. For the instrumented Charpy impact test, the curves of loading-point displacement and load against time were recorded. The critical Weibull stress was taken as the fracture-controlling parameter; based on the local approach, it is not affected by the specimen configuration or the loading pattern. The parameters controlling brittle fracture are obtained from the Charpy absorbed energy results, and the fracture toughness for the compact tension (CT) specimen is then predicted. The predicted results are found to be in good agreement with the experimental ones. The fracture toughness can thus be evaluated from the Charpy absorbed energy, because the local approach gives a good description of brittle fracture whether the Charpy impact specimen or the CT specimen is used for the given material.
Institute of Scientific and Technical Information of China (English)
潘青松; 彭刚; 胡伟华; 徐鑫
2015-01-01
In order to understand the mechanical properties of concrete under different loading rates, we conducted uniaxial compression tests on cubic concrete specimens (strength grade C15, side length 150 mm) at loading rates of 10⁻⁵/s, 10⁻⁴/s, 10⁻³/s, and 5×10⁻³/s. The tests were carried out on a micro-computer-controlled electro-hydraulic servo static and dynamic multifunctional triaxial apparatus. The compressive strength, the deformation, and the parameters of the complete stress-strain curve model based on the modified Weibull statistical theory were studied and analyzed for concrete under uniaxial compression at the different loading rates. The results show that the modified Weibull statistical model fits the complete stress-strain curves of the concrete specimens well at the different loading rates; the strength-hardening behavior of the material can be characterized by the parameters m and E in the Weibull constitutive model, and the strain-softening behavior can be characterized by the parameter c in the Weibull constitutive model.
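A common Weibull statistical damage constitutive form, with parameters playing the roles of the m, E and c mentioned in the abstract, can be sketched as follows. This is an assumed textbook form for illustration; the paper's modified model may differ in detail:

```python
import numpy as np

# Assumed Weibull damage constitutive law (illustrative form):
#   sigma(eps) = E * eps * exp(-(1/m) * (eps/c)^m)
# E is the initial modulus, m the Weibull shape (hardening branch), and c the
# scale strain: for this form the peak stress sits exactly at eps = c, with
# peak value E * c * exp(-1/m).

E, m, c = 24e3, 2.0, 1.5e-3          # MPa, dimensionless, strain; invented values

eps = np.linspace(1e-8, 6e-3, 6000)
sigma = E * eps * np.exp(-((eps / c) ** m) / m)

i_peak = sigma.argmax()
print(f"peak stress {sigma[i_peak]:.1f} MPa at strain {eps[i_peak]:.4f}")
```

Increasing m sharpens the pre-peak branch (hardening), while c shifts the peak strain and hence controls the softening tail, matching the parameter roles described in the abstract.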
Reliability Distribution of Numerical Control Lathe Based on Correlation Analysis
Institute of Scientific and Technical Information of China (English)
Xiaoyan Qi; Guixiang Shen; Yingzhi Zhang; Shuguang Sun; Bingkun Chen
2016-01-01
Combining reliability distribution with correlation analysis, a new reliability distribution method is proposed that considers the structural correlation and failure correlation of subsystems. Firstly, the subsystems are ranked by means of TOPSIS, which integrates the traditional considerations of reliability allocation; then a copula connecting function is introduced to set up a distribution model based on structural correlation, failure correlation and target correlation, and the reliability target region of every subsystem is obtained with Matlab. In this method, not only the traditional distribution considerations are taken into account, but correlation influences are also involved, thereby supplementing the available information and optimizing the distribution.
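The TOPSIS ranking step can be sketched generically; the criteria, weights, and subsystem scores below are invented placeholders, not the paper's data:

```python
import numpy as np

# Rows: subsystems; columns: criteria (e.g. complexity, criticality,
# failure-rate share, maintainability). All are treated as "cost" criteria
# except the last, purely for illustration.
X = np.array([[7.0, 9.0, 0.35, 6.0],
              [4.0, 6.0, 0.25, 7.0],
              [8.0, 5.0, 0.30, 4.0],
              [3.0, 4.0, 0.10, 8.0]])
w = np.array([0.3, 0.3, 0.25, 0.15])          # assumed criteria weights
benefit = np.array([False, False, False, True])

V = w * X / np.linalg.norm(X, axis=0)          # weighted normalized decision matrix
ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))
d_pos = np.linalg.norm(V - ideal, axis=1)      # distance to ideal solution
d_neg = np.linalg.norm(V - anti, axis=1)       # distance to anti-ideal solution
closeness = d_neg / (d_pos + d_neg)            # 1 = closest to the ideal
order = np.argsort(-closeness)
print("ranking (best first):", order)
```

In the allocation context, the closeness scores would feed the subsequent copula-based distribution model as the ordering over subsystems.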
Energy Technology Data Exchange (ETDEWEB)
Sun, Huarui, E-mail: huarui.sun@bristol.ac.uk; Bajo, Miguel Montes; Uren, Michael J.; Kuball, Martin [Center for Device Thermography and Reliability (CDTR), H. H. Wills Physics Laboratory, University of Bristol, Tyndall Avenue, Bristol BS8 1TL (United Kingdom)
2015-01-26
Gate leakage degradation of AlGaN/GaN high electron mobility transistors under OFF-state stress is investigated using a combination of electrical, optical, and surface morphology characterizations. The generation of leakage “hot spots” at the edge of the gate is found to be strongly temperature accelerated. The time for the formation of each failure site follows a Weibull distribution with a shape parameter in the range of 0.7–0.9 from room temperature up to 120 °C. The average leakage per failure site is only weakly temperature dependent. The stress-induced structural degradation at the leakage sites exhibits a temperature dependence in the surface morphology, which is consistent with a surface defect generation process involving temperature-associated changes in the breakdown sites.
Vulnerability Analysis in Web Distributed Applications
Directory of Open Access Journals (Sweden)
Ion Ivan
2011-03-01
Full Text Available The paper analyzes vulnerabilities found in web-based distributed applications from different perspectives. Classes of vulnerability types are identified in order to address the distinct characteristics that each one exhibits. Methods for analyzing the vulnerabilities of an authentication process are developed and solutions are proposed. A model for vulnerability minimization is discussed, based on an indicator built on the amount of sensitive data revealed to the end users. Risks are analyzed together with the vulnerabilities they exploit, and measures to combat these pairs are identified.
Statistical wind analysis for near-space applications
Roney, Jason A.
2007-09-01
Statistical wind models were developed based on the existing observational wind data for near-space altitudes between 60 000 and 100 000 ft (18–30 km) above ground level (AGL) at two locations, Akron, OH, USA, and White Sands, NM, USA. These two sites are envisioned as playing a crucial role in the first flights of high-altitude airships. The analysis shown in this paper has not been previously applied to this region of the stratosphere for such an application. Standard statistics were compiled for these data such as mean, median, maximum wind speed, and standard deviation, and the data were modeled with Weibull distributions. These statistics indicated that, on a yearly average, there is a lull or a “knee” in the wind between 65 000 and 72 000 ft AGL (20–22 km). From the standard statistics, trends at both locations indicated substantial seasonal variation in the mean wind speed at these heights. The yearly and monthly statistical modeling indicated that Weibull distributions were a reasonable model for the data. Forecasts and hindcasts were done by using a Weibull model based on 2004 data and comparing the model with the 2003 and 2005 data. The 2004 distribution was also a reasonable model for these years. Lastly, the Weibull distribution and cumulative function were used to predict the 50%, 95%, and 99% winds, which are directly related to the expected power requirements of a near-space station-keeping airship. These values indicated that using only the standard deviation of the mean may underestimate the operational conditions.
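The percentile winds quoted above follow directly from the Weibull inverse CDF: for F(v) = 1 - exp(-(v/c)^k), the p-quantile is v_p = c·(-ln(1-p))^(1/k). A minimal sketch with invented shape and scale parameters for one altitude/month bin:

```python
import numpy as np
from math import gamma

k, c = 1.8, 12.0   # Weibull shape (-) and scale (m/s); illustrative only

def weibull_quantile(p, k, c):
    """Wind speed not exceeded with probability p under a Weibull(k, c) model."""
    return c * (-np.log(1.0 - p)) ** (1.0 / k)

for p in (0.50, 0.95, 0.99):
    print(f"{int(p * 100)}% wind: {weibull_quantile(p, k, c):.1f} m/s")

# Mean wind from the same parameters, for comparison with the quantiles:
print(f"mean: {c * gamma(1.0 + 1.0 / k):.1f} m/s")
```

The gap between the mean and the 95%/99% quantiles is the paper's point: sizing a station-keeping airship from the mean plus one standard deviation can understate the winds it must actually fight.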
Comparison of methods for analysis of selective genotyping survival data
Directory of Open Access Journals (Sweden)
Dekkers Jack CM
2006-11-01
Full Text Available Abstract Survival traits and selective genotyping datasets are typically not normally distributed, thus common models used to identify QTL may not be statistically appropriate for their analysis. The objective of the present study was to compare models for identification of QTL associated with survival traits, in particular when combined with selective genotyping. Data were simulated to model the survival distribution of a population of chickens challenged with Marek's disease virus. Cox proportional hazards (CPH), linear regression (LR), and Weibull models were compared for their appropriateness to analyze the data, ability to identify associations of marker alleles with survival, and estimation of effects when all individuals were genotyped (full genotyping) and when selective genotyping was used. Little difference in power was found between the CPH and the LR model for low censoring cases for both full and selective genotyping. The simulated data were not transformed to follow a Weibull distribution and, as a result, the Weibull model generally resulted in less power than the other two models and overestimated effects. Effect estimates from LR and CPH were unbiased when all individuals were genotyped, but overestimated when selective genotyping was used. Thus, LR is preferred for analyzing survival data when the amount of censoring is low because of ease of implementation and interpretation. Including phenotypic data of non-genotyped individuals in selective genotyping analysis increased power, but resulted in LR having an inflated false positive rate, and therefore the CPH model is preferred for this scenario, although transformation of the data may also make the Weibull model appropriate for this case. The results from the research presented herein are directly applicable to interval mapping analyses.
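The Weibull alternative in the comparison can be sketched as a censored maximum-likelihood fit. The simulation below is a generic stand-in for a challenge trial with censoring at a fixed end point, not the paper's simulation design:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(7)
n, k_true, lam_true, t_end = 500, 1.5, 60.0, 80.0
t = lam_true * rng.weibull(k_true, size=n)      # true event times (days)
event = t <= t_end                              # False = right-censored at t_end
obs = np.minimum(t, t_end)

def negloglik(params):
    k, lam = np.exp(params)                     # optimize on the log scale
    z = obs / lam
    # Events contribute the log-density, censored points the log-survival:
    #   log f = log(k/lam) + (k-1) log z - z^k,   log S = -z^k
    ll = np.where(event,
                  np.log(k / lam) + (k - 1.0) * np.log(z) - z ** k,
                  -z ** k)
    return -ll.sum()

x0 = np.log([1.0, np.median(obs)])
res = minimize(negloglik, x0=x0, method="Nelder-Mead")
k_hat, lam_hat = np.exp(res.x)
print(f"shape {k_hat:.2f} (true {k_true}), scale {lam_hat:.1f} (true {lam_true})")
```

When the data really are Weibull, as here, this fit is efficient; the paper's point is that on untransformed non-Weibull survival data the same machinery loses power relative to CPH and LR.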
Spatial distribution analysis on climatic variables in northeast China
Institute of Scientific and Technical Information of China (English)
无
2001-01-01
Information ecology is a new research area of modern ecology. This paper describes spatial distribution analysis methods for four climatic variables, i.e. temperature, precipitation, relative humidity and sunshine fraction, in Northeast China. First, digital terrain models were built from large-scale maps and vector data. Then trend surface analysis and interpolation methods were used to analyze the spatial distribution of these four climatic variables at three temporal scales: (1) monthly data; (2) mean monthly data over thirty years; and (3) mean annual data over thirty years. An ecological information system was used for graphical analysis of the spatial distribution of these climatic variables.
Likelihood analysis of earthquake focal mechanism distributions
Kagan, Y Y
2014-01-01
In our earlier paper we discussed forecasts of earthquake focal mechanisms and ways to test forecast efficiency. Several verification methods were proposed, but they were based on ad hoc, empirical assumptions, so their performance is questionable. In this work we apply a conventional likelihood method to measure the skill of a forecast. The advantage of such an approach is that earthquake rate prediction can in principle be adequately combined with focal mechanism forecasts, if both are based on likelihood scores, resulting in a general forecast optimization. To calculate the likelihood score we need to compare actual forecasts or occurrences of predicted events with the null hypothesis that the mechanism's 3-D orientation is random. For double-couple source orientation the random probability distribution function is not uniform, which complicates the calculation of the likelihood value. To better understand the resulting complexities we calculate the information (likelihood) score for two rota...
Analysis of urea distribution volume in hemodialysis.
Maduell, F; Sigüenza, F; Caridad, A; Miralles, F; Serrato, F
1994-01-01
According to the urea kinetic model it is considered that the urea distribution volume (V) is that of body water, and that it is distributed in only one compartment. Since the V value is difficult to measure, it is normal to use 58% of body weight, in spite of the fact that it may range from 35 to 75%. In this study, we have calculated the value of V by using an accurate method based on the total elimination of urea from the dialysate. We have studied the V, and also whether different dialysis characteristics modify it. Thirty-five patients were included in this study, 19 men and 16 women, under a chronic hemodialysis programme. The dialysate was collected in a graduated tank, and the concentrations of urea in plasma and in dialysate were determined every hour. Every patient received six dialysis sessions, changing the blood flow (250 or 350 ml/min), the ultrafiltration (0.5 or 1.5 l/h), the membrane (cuprophane or polyacrylonitrile) and/or the buffer (bicarbonate or acetate). At the end of the hemodialysis session, the V value ranged from 43 to 72% of body weight; nevertheless, this value was practically constant in every patient. The V value gradually increased throughout the dialysis session: 42.1 +/- 6.9% of body weight in the first hour, 50.7 +/- 7.5% in the second hour and 55.7 +/- 7.9% at the end of the dialysis session. The change of blood flow, ultrafiltration, membrane or buffer did not alter the results. The V value was significantly higher in men than in women, 60.0 +/- 6.6% vs. 50.5 +/- 5.9% of body weight (p < 0.001).
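The tank-collection method amounts to a simple mass balance. A minimal sketch with illustrative numbers (not patient data), neglecting urea generation during the session:

```python
# Mass balance for the single-compartment urea model:
#   removed = C_pre * V_pre - C_post * V_post,  with V_post = V_pre - UF
# Solving for the pre-dialysis distribution volume V_pre:

c_pre = 30.0     # mmol/L, plasma urea before dialysis (illustrative)
c_post = 10.0    # mmol/L, plasma urea after dialysis (illustrative)
removed = 800.0  # mmol, urea recovered from the graduated dialysate tank
uf = 3.0         # L, ultrafiltration volume

v_pre = (removed - c_post * uf) / (c_pre - c_post)
weight = 70.0    # kg
print(f"V = {v_pre:.1f} L = {100.0 * v_pre / weight:.0f}% of body weight")
```

With these numbers V comes out at 38.5 L, i.e. 55% of a 70 kg body weight, squarely inside the 43-72% range the study reports at end of session.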
SPHERICAL MEANS, DISTRIBUTIONS AND CONVOLUTION OPERATORS IN CLIFFORD ANALYSIS
Institute of Scientific and Technical Information of China (English)
无
2003-01-01
New higher dimensional distributions are introduced in the framework of Clifford analysis. They complete the picture already established in previous work, offering unity and structural clarity. Amongst them are the building blocks of the principal value distribution, involving spherical harmonics, considered by Horvath and Stein.
Analysis and Comparison of Typical Models within Distribution Network Design
DEFF Research Database (Denmark)
Jørgensen, Hans Jacob; Larsen, Allan; Madsen, Oli B.G.
a number of important issues which have been identified when addressing the Distribution Network Design problem from a modelling angle. More specifically, we present an analysis of the research which has been performed in utilizing operational research in developing and optimising distribution systems....
Asynchronous Distributed Estimation of Topic Models for Document Analysis
2010-03-01
document retrieval and classification. An alternative approximate inference technique for LDA is variational Bayesian inference (VB) [1], where the true posterior distribution is approximated by a fully factorized posterior distribution in order to simplify inference. In VB, the negative marginal log...
Directory of Open Access Journals (Sweden)
Nkongho Ayuketang Arreyndip
2016-01-01
Full Text Available The method of the generalized extreme value family of distributions (Weibull, Gumbel, and Frechet) is employed for the first time to assess the wind energy potential of Debuncha, South-West Cameroon, and to study the variation of energy over the seasons at this site. The 29-year (1983–2013) average daily wind speed data over Debuncha (with missing values in the years 1992 and 1994) were obtained from NASA satellite data through the RETScreen software tool provided by CANMET Canada. The data are partitioned into min-monthly, mean-monthly, and max-monthly series and fitted using the maximum likelihood method to the two-parameter Weibull, Gumbel, and Frechet distributions for the purpose of determining the best fit to be used for assessing the wind energy potential at this site. The respective shape and scale parameters are estimated. By making use of the p-values of the Kolmogorov-Smirnov (K-S) statistic and the standard error (s.e.) analysis, the results show that the Frechet distribution best fits the min-monthly, mean-monthly, and max-monthly data compared to the Weibull and Gumbel distributions. Wind speed distributions and wind power densities of both the wet and dry seasons are compared. The results show that the wind power density of the wet season is higher than that of the dry season. The wind speeds at this site are quite low; maximum wind speeds lie between 3.1 and 4.2 m/s, which is below the cut-in wind speed of many modern turbines (6–10 m/s). However, we recommend the installation of low cut-in wind turbines like the Savonius or Aircon (10 kW) for stand-alone low-energy needs.
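The fit-and-compare step can be sketched with scipy, where the Frechet is available as `invweibull`. The sample below is synthetic (drawn from a Frechet), so it stands in for, and does not reproduce, the Debuncha data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Synthetic wind-speed sample, Frechet-distributed by construction.
speeds = stats.invweibull.rvs(c=6.0, scale=3.0, size=300, random_state=rng)

fits = {}
for name, dist in [("Weibull", stats.weibull_min),
                   ("Gumbel", stats.gumbel_r),
                   ("Frechet", stats.invweibull)]:
    params = dist.fit(speeds)                    # MLE; location is also fitted here
    ks = stats.kstest(speeds, dist.cdf, args=params)
    fits[name] = ks.pvalue
    print(f"{name:8s} K-S p-value = {ks.pvalue:.3f}")

best = max(fits, key=fits.get)
print("best fit:", best)
```

The candidate with the largest K-S p-value (equivalently, the smallest K-S statistic) is retained, which is how the paper selects the Frechet over the Weibull and Gumbel for each monthly series.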
PERFORMANCE EVALUATION OF OSSO-CFAR WITH BINARY INTEGRATION IN WEIBULL BACKGROUND
Institute of Scientific and Technical Information of China (English)
Meng Xiangwei
2013-01-01
The performance of the Ordered-Statistic Smallest Of (OSSO) Constant False Alarm Rate (CFAR) detector with binary integration in a Weibull background with known shape parameter is analyzed, for the cases in which the processor operates in a homogeneous background and in non-homogeneous situations caused by multiple targets and clutter edges. Analytical models of this scheme for the performance evaluation are given. It is shown that the OSSO-CFAR with binary integration can greatly improve the detection performance with respect to the single-pulse processing case. As the clutter background becomes spiky, a high binary integration threshold S (in the S-out-of-M rule) is required in order to obtain good detection performance in a homogeneous background. Moreover, the false alarm performance of the OSSO-CFAR with binary integration is more sensitive to changes in the shape parameter or power level of the clutter background.
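The scheme can be sketched by Monte Carlo: an ordered-statistic threshold is taken from each half-window, the smaller of the two is scaled, and detections are combined S-out-of-M across pulses. Window sizes, the OS rank k, and the multiplier alpha below are illustrative choices, not the paper's design values (a real design derives alpha from the specified false-alarm rate):

```python
import numpy as np
from math import comb

rng = np.random.default_rng(5)
shape = 2.0          # Weibull shape of the clutter amplitude (assumed known)
NHALF, k = 8, 6      # cells per half-window, OS rank within each half
alpha = 2.0          # threshold multiplier (illustrative assumption)
M, S = 4, 3          # binary integration: S detections out of M pulses

def per_pulse_pfa(trials=200_000):
    """Estimate the single-pulse false-alarm probability in homogeneous clutter."""
    lead = np.sort(rng.weibull(shape, (trials, NHALF)), axis=1)[:, k - 1]
    lag = np.sort(rng.weibull(shape, (trials, NHALF)), axis=1)[:, k - 1]
    thr = alpha * np.minimum(lead, lag)          # "smallest of" the two OS estimates
    cut = rng.weibull(shape, trials)             # cell under test, clutter only
    return (cut > thr).mean()

p1 = per_pulse_pfa()
# Assuming pulse-to-pulse independent clutter, S-of-M combining is binomial:
p_bi = sum(comb(M, s) * p1**s * (1 - p1)**(M - s) for s in range(S, M + 1))
print(f"per-pulse Pfa = {p1:.4f}; after {S}-of-{M} binary integration = {p_bi:.2e}")
```

Binary integration drives the post-integration false-alarm rate far below the per-pulse rate, which is what allows a lower per-pulse threshold and hence the detection gain the abstract describes.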
A study of optimization problem for amplify-and-forward relaying over weibull fading channels
Ikki, Salama Said
2010-09-01
This paper addresses the power allocation and relay positioning problems in amplify-and-forward cooperative networks operating in Weibull fading environments. We study adaptive power allocation (PA) with fixed relay location, optimal relay location with fixed power allocation, and joint optimization of the PA and relay location under a total transmit power constraint, in order to minimize the outage probability and average error probability at high signal-to-noise ratios (SNR). Analytical results are validated by numerical simulations, and comparisons between the different optimization schemes and their performance are provided. Results show that optimum PA brings only a coding gain, while optimum relay location yields diversity gains in addition to coding gains. Joint optimization improves both the diversity gain and the coding gain. Furthermore, the results illustrate that the analyzed adaptive algorithms outperform uniform schemes. ©2010 IEEE.
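The fixed-location power-allocation subproblem can be sketched numerically. The channel model below (Weibull-faded amplitudes, standard AF end-to-end SNR, MRC with the direct path, invented gains and threshold) is an assumption for illustration, not the paper's exact analytical setup:

```python
import numpy as np

rng = np.random.default_rng(11)
n = 200_000
k = 2.5                    # Weibull fading shape for all links (assumed)
P, gamma_th = 10.0, 1.0    # total SNR budget and outage threshold
g_sd, g_sr, g_rd = 0.2, 1.0, 0.5   # mean link gains (relay nearer the source)

def outage(rho):
    """Outage probability when the source gets rho*P and the relay (1-rho)*P."""
    ps, pr = rho * P, (1.0 - rho) * P
    g1 = ps * g_sr * rng.weibull(k, n) ** 2   # source-relay instantaneous SNR
    g2 = pr * g_rd * rng.weibull(k, n) ** 2   # relay-destination instantaneous SNR
    gd = ps * g_sd * rng.weibull(k, n) ** 2   # direct-link instantaneous SNR
    g_af = g1 * g2 / (g1 + g2 + 1.0)          # end-to-end AF SNR
    return np.mean(gd + g_af < gamma_th)      # MRC of direct and relayed paths

rhos = np.linspace(0.1, 0.9, 17)
outs = np.array([outage(r) for r in rhos])
best = rhos[outs.argmin()]
print(f"best power split rho = {best:.2f}, outage = {outs.min():.4f}")
```

Grid search over the split rho makes the paper's qualitative claim visible: reallocating power shifts the outage curve down (a coding gain) without changing its high-SNR slope (the diversity order).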
CRAB: the CMS distributed analysis tool development and design
Energy Technology Data Exchange (ETDEWEB)
Spiga, D. [University and INFN Perugia (Italy); Lacaprara, S. [INFN Legnaro (Italy); Bacchi, W. [University and INFN Bologna (Italy); Cinquilli, M. [University and INFN Perugia (Italy); Codispoti, G. [University and INFN Bologna (Italy); Corvo, M. [CERN (Switzerland); Dorigo, A. [INFN Padova (Italy); Fanfani, A. [University and INFN Bologna (Italy); Fanzago, F. [CERN (Switzerland); Farina, F. [INFN Milano-Bicocca (Italy); Gutsche, O. [FNAL (United States); Kavka, C. [INFN Trieste (Italy); Merlo, M. [INFN Milano-Bicocca (Italy); Servoli, L. [University and INFN Perugia (Italy)
2008-03-15
Starting from 2007 the CMS experiment will produce several Pbytes of data each year, to be distributed over many computing centers located in many different countries. The CMS computing model defines how the data are to be distributed such that CMS physicists can access them in an efficient manner in order to perform their physics analysis. CRAB (CMS Remote Analysis Builder) is a specific tool, designed and developed by the CMS collaboration, that facilitates access to the distributed data in a very transparent way. The tool's main feature is the possibility of distributing and parallelizing the local CMS batch data analysis processes over different Grid environments without any specific knowledge of the underlying computational infrastructures. More specifically CRAB allows the transparent usage of WLCG, gLite and OSG middleware. CRAB interacts with both the local user environment, with CMS Data Management services and with the Grid middleware.
DEFF Research Database (Denmark)
Han, Xue; Sandels, Claes; Zhu, Kun;
2013-01-01
, comprising distributed generation, active demand and electric vehicles. Subsequently, quantitative analysis was made on the basis of the current and envisioned DER deployment scenarios proposed for Sweden. Simulations are performed in two typical distribution network models for four seasons. The simulation...... results show that in general the DER deployment brings in the possibilities to reduce the power losses and voltage drops by compensating power from the local generation and optimizing the local load profiles....
Modeling and analysis of solar distributed generation
Ortiz Rivera, Eduardo Ivan
power point tracking algorithms. Finally, several PV power applications will be presented, such as circuit analysis for a load connected to two different PV arrays, speed control for a dc motor connected to a PVM, and a novel single-phase photovoltaic inverter system using the Z-source converter.
Subjective Probability Distribution Elicitation in Cost Risk Analysis: A Review
2007-01-01
DeGroot, Morris H., Optimal Statistical Decisions, New York: McGraw-Hill, 1970. Dewar, James A., Assumption-Based Planning: A Tool for Reducing... formal decision-analysis point of view. See DeGroot (1970) for a clear exposition of utility in decision analysis. 2 For the triangle distribution, the
System analysis and planning of a gas distribution network
Energy Technology Data Exchange (ETDEWEB)
Salas, Edwin F.M.; Farias, Helio Monteiro [AUTOMIND, Rio de Janeiro, RJ (Brazil); Costa, Carla V.R. [Universidade Salvador (UNIFACS), BA (Brazil)
2009-07-01
The increase in demand from gas consumers requires that projects or improvements in gas distribution networks be made carefully and safely to ensure a continuous, efficient and economical supply. Gas distribution companies must ensure that the networks and equipment involved are defined and designed at the appropriate time to attend to the demands of the market. To do that, a gas distribution network analysis and planning tool should use distribution and transmission network models for the current situation and for the future changes to be implemented. These models are used to evaluate project options and help in making appropriate decisions in order to minimize the capital investment in new components or simple changes in operational procedures. Gas demands are increasing, and it is important that gas distributors design new distribution systems to support this growth, considering the financial constraints of the company as well as local legislation and regulation. In this study some steps in the development of a flexible system that attends to those needs are described. The analysis of distribution requires geographically referenced data for the models as well as accurate connectivity and equipment attributes. GIS systems are often used as a central repository that holds the majority of this information, and they are constantly updated as distribution network equipment is modified. The distribution network model built from this system thus represents the current network condition. The benefits of this architecture drastically reduce the creation and maintenance costs of the network models, because network component data are conveniently made available to populate the distribution network model. This architecture ensures that the models continually reflect the reality of the distribution network. (author)
Energy Technology Data Exchange (ETDEWEB)
Bass, B.R.; Williams, P.T.; McAfee, W.J.; Pugh, C.E. [Oak Ridge National Lab., Heavy-Section Steel Technology Program, Oak Ridge, TN (United States)
2001-07-01
A primary objective of the United States Nuclear Regulatory Commission (USNRC) -sponsored Heavy-Section Steel Technology (HSST) Program is to develop and validate technology applicable to quantitative assessments of fracture prevention margins in nuclear reactor pressure vessels (RPVs) containing flaws and subjected to service-induced material toughness degradation. This paper describes an experimental/analytical program for the development of a Weibull statistical model of cleavage fracture toughness for applications to shallow surface-breaking and embedded flaws in RPV materials subjected to multi-axial loading conditions. The experimental part includes both material characterization testing and larger fracture toughness experiments conducted using a special-purpose cruciform beam specimen developed by Oak Ridge National Laboratory for applying biaxial loads to shallow cracks. Test materials (pressure vessel steels) included plate product forms (conforming to ASTM A533 Grade B Class 1 specifications) and shell segments procured from a pressurized-water reactor vessel intended for a nuclear power plant. Results from tests performed on cruciform specimens demonstrated that biaxial loading can have a pronounced effect on shallow-flaw fracture toughness in the lower-transition temperature region. A local approach methodology based on a three-parameter Weibull model was developed to correlate these experimentally-observed biaxial effects on fracture toughness. The Weibull model, combined with a new hydrostatic stress criterion in place of the more commonly used maximum principal stress in the kernel of the Weibull stress integral definition, is shown to provide a scaling mechanism between uniaxial and biaxial loading states for 2-dimensional flaws located in the A533-B plate material. The Weibull stress density was introduced as a metric for identifying regions along a semi-elliptical flaw front that have a higher probability of cleavage initiation. Cumulative
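The three-parameter Weibull model referred to above is commonly written as a cumulative cleavage probability in the Weibull stress. A minimal sketch with invented parameter values (not the program's calibrated constants):

```python
import numpy as np

# Three-parameter Weibull model of cleavage probability:
#   P_f = 1 - exp(-((sigma_w - sigma_th) / sigma_u)^m),  for sigma_w > sigma_th
# where sigma_w is the Weibull stress obtained from the stress-field integral,
# sigma_th a threshold stress, sigma_u a scaling stress, and m the modulus.

m, sigma_th, sigma_u = 12.0, 1500.0, 2600.0   # MPa where applicable; illustrative

def p_cleavage(sigma_w):
    s = np.maximum(np.asarray(sigma_w, dtype=float) - sigma_th, 0.0)
    return 1.0 - np.exp(-(s / sigma_u) ** m)

for sw in (1600.0, 2500.0, 3500.0, 4200.0):
    print(f"sigma_w = {sw:6.0f} MPa -> P_f = {p_cleavage(sw):.3f}")
```

In a local-approach calculation, sigma_w itself would come from integrating the near-tip stress field over the fracture process zone, which is where the biaxial loading effect and the hydrostatic stress criterion enter.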
An integrated economic and distributional analysis of energy policies
Energy Technology Data Exchange (ETDEWEB)
Labandeira, Xavier [Facultade de CC. Economicas, University of Vigo, 36310 Vigo (Spain); Labeaga, Jose M. [Instituto de Estudios Fiscales, Avda. Cardenal Herrera Oria 378, 28035 Madrid (Spain); Rodriguez, Miguel [Facultade de CC. Empresariais e Turismo, University of Vigo, 32004 Ourense (Spain)
2009-12-15
Most public policies, particularly those in the energy sphere, have not only efficiency but also distributional effects. However, there is a trade-off between modelling approaches suitable for calculating those impacts on the economy. For the former most of the studies have been conducted with general equilibrium models, whereas partial equilibrium models represent the main approach for distributional analysis. This paper proposes a methodology to simultaneously carry out an analysis of the distributional and efficiency consequences of changes in energy taxation. In order to do so, we have integrated a microeconomic household demand model and a computable general equilibrium model for the Spanish economy. We illustrate the advantages of this approach by simulating a revenue-neutral reform in Spanish indirect taxation, with a large increase of energy taxes that serve an environmental purpose. The results show that the reforms bring about significant efficiency and distributional effects, in some cases counterintuitive, and demonstrate the academic and social utility of this approximation. (author)
Seismic reliability analysis of urban water distribution network
Institute of Scientific and Technical Information of China (English)
Li Jie; Wei Shulin; Liu Wei
2006-01-01
An approach to analyze the seismic reliability of water distribution networks by combining a hydraulic analysis with a first-order reliability method (FORM) is proposed in this paper. The hydraulic analysis method for normal conditions is modified to accommodate the special conditions necessary to perform a seismic hydraulic analysis. In order to calculate the leakage area and leaking flow of the pipelines in the hydraulic analysis method, a new leakage model established from the seismic response analysis of buried pipelines is presented. To validate the proposed approach, a network with 17 nodes and 24 pipelines is investigated in detail. The approach is also applied to an actual project consisting of 463 nodes and 767 pipelines. The results show that the proposed approach achieves satisfactory results in analyzing the seismic reliability of large-scale water distribution networks.
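The FORM idea behind such reliability analyses can be sketched with the standard HL-RF iteration on a simple linear limit state (capacity minus demand, independent normal variables); the parameter values here are illustrative, not the paper's network model.

```python
import numpy as np
from scipy import stats

# Limit state g = R - S: capacity R and demand S, independent normals.
mu = np.array([5.0, 3.0])     # means of R and S
sigma = np.array([0.8, 0.6])  # standard deviations of R and S

def g_u(u):
    """Limit state evaluated in standard normal (u) space."""
    x = mu + sigma * u
    return x[0] - x[1]

def grad_g_u(u):
    """Gradient of g with respect to u (chain rule through x = mu + sigma*u)."""
    return sigma * np.array([1.0, -1.0])

# HL-RF iteration: converge to the most probable failure point.
u = np.zeros(2)
for _ in range(20):
    grad = grad_g_u(u)
    u = (grad @ u - g_u(u)) / (grad @ grad) * grad

beta = np.linalg.norm(u)      # reliability index
p_f = stats.norm.cdf(-beta)   # first-order failure probability
```

For this linear limit state the iteration converges in one step to the analytic index beta = (5 - 3) / sqrt(0.8**2 + 0.6**2) = 2.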
Directory of Open Access Journals (Sweden)
Rodrigo Geroni Mendes Nascimento
2012-06-01
Full Text Available
In 1979, the technique of modeling diameter distributions with probability functions was applied for the first time by Hyink & Moser to forecast the growth and yield of uneven-aged, heterogeneous forests. Today, however, few studies use it in production planning for such forests, because the operational feasibility of the technique is not widely known. This paper therefore presents a review of the characteristics that make growth and yield modeling by diameter class possible, highlighting the importance of the dynamics of recruitment, mortality and survival, as well as the population attributes correlated with Weibull distribution modeling, and presenting the specific statistical tools used in yield modeling by this method.
doi: 10.4336/2012.pfb.32.70.93
In 1979 the technique of modeling diameter distributions by probabilistic functions was first applied by Hyink & Moser in forecasting growth and production of uneven-aged and heterogeneous forests. However, few studies today use this method for planning production in these forests, because the operational feasibility of the technique is not well known. Therefore this paper presents a review of the characteristics that allow the modeling of growth and yield by diameter class, highlighting the importance of the dynamics of recruitment, mortality and survival, as well as the population attributes related to the modeling of the Weibull distribution, together with the specific statistics used in modeling yield by this method.
First Experiences with LHC Grid Computing and Distributed Analysis
Energy Technology Data Exchange (ETDEWEB)
Fisk, Ian
2010-12-01
This presentation describes the experiences of the LHC experiments with grid computing, with a focus on distributed analysis. After many years of development, preparation, exercises, and validation, the LHC (Large Hadron Collider) experiments are in operation. The computing infrastructure was heavily utilized in the first 6 months of data collection. The general experience of exploiting the grid infrastructure for organized processing and preparation is described, as well as the successes in employing the infrastructure for distributed analysis. Finally, the expected evolution and future plans are outlined.
Discriminating topology in galaxy distributions using network analysis
Hong, Sungryong; Coutinho, Bruno C.; Dey, Arjun; Barabási, Albert-L.; Vogelsberger, Mark; Hernquist, Lars; Gebhardt, Karl
2016-07-01
The large-scale distribution of galaxies is generally analysed using the two-point correlation function. However, this statistic does not capture the topology of the distribution, and it is necessary to resort to higher order correlations to break degeneracies. We demonstrate that an alternate approach using network analysis can discriminate between topologically different distributions that have similar two-point correlations. We investigate two galaxy point distributions, one produced by a cosmological simulation and the other by a Lévy walk. For the cosmological simulation, we adopt the redshift z = 0.58 slice from Illustris and select galaxies with stellar masses greater than 10^8 M⊙. The two-point correlation function of these simulated galaxies follows a single power law, ξ(r) ˜ r^{-1.5}. Then, we generate Lévy walks matching the correlation function and abundance of the simulated galaxies. We find that, while the two simulated galaxy point distributions have the same abundance and two-point correlation function, their spatial distributions are very different; most prominently, the filamentary structures of the simulation are absent in the Lévy fractals. To quantify these missing topologies, we adopt network analysis tools and measure the diameter, giant component, and transitivity of networks built by a conventional friends-of-friends recipe with various linking lengths. Unlike the abundance and two-point correlation function, these network quantities reveal a clear separation between the two simulated distributions; therefore, the galaxy distribution simulated by Illustris is quantitatively not a Lévy fractal. We find that the described network quantities offer an efficient tool for discriminating topologies and for comparing observed and theoretical distributions.
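The friends-of-friends network construction and the quoted network measures can be sketched as follows on a toy point set; the linking length and point count are assumptions for illustration, not the study's values.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components

rng = np.random.default_rng(0)
points = rng.uniform(0.0, 1.0, size=(300, 3))   # toy 3-D point distribution

def fof_adjacency(pts, linking_length):
    """Dense adjacency matrix of the friends-of-friends graph."""
    tree = cKDTree(pts)
    n = len(pts)
    a = np.zeros((n, n), dtype=int)
    for i, j in tree.query_pairs(r=linking_length):
        a[i, j] = a[j, i] = 1
    return a

def transitivity(a):
    """Fraction of connected triplets that close into triangles."""
    deg = a.sum(axis=1)
    triplets = np.sum(deg * (deg - 1))
    return float(np.trace(a @ a @ a)) / triplets if triplets else 0.0

def giant_component_fraction(a):
    """Share of points belonging to the largest connected component."""
    _, labels = connected_components(csr_matrix(a), directed=False)
    return np.bincount(labels).max() / len(labels)

adj = fof_adjacency(points, linking_length=0.15)
t = transitivity(adj)
gcf = giant_component_fraction(adj)
```

Sweeping the linking length and comparing these quantities between two point sets is what separates topologically different distributions with identical two-point statistics.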
Rapid Analysis of Mass Distribution of Radiation Shielding
Zapp, Edward
2007-01-01
Radiation Shielding Evaluation Toolset (RADSET) is a computer program that rapidly calculates the spatial distribution of mass of an arbitrary structure for use in ray-tracing analysis of the radiation-shielding properties of the structure. RADSET was written to be used in conjunction with unmodified commercial computer-aided design (CAD) software that provides access to data on the structure and generates selected three-dimensional-appearing views of the structure. RADSET obtains raw geometric, material, and mass data on the structure from the CAD software. From these data, RADSET calculates the distribution(s) of the masses of specific materials about any user-specified point(s). The results of these mass-distribution calculations are imported back into the CAD computing environment, wherein the radiation-shielding calculations are performed.
Analyzing Distributed Functions in an Integrated Hazard Analysis
Morris, A. Terry; Massie, Michael J.
2010-01-01
Large scale integration of today's aerospace systems is achievable through the use of distributed systems. Validating the safety of distributed systems is significantly more difficult as compared to centralized systems because of the complexity of the interactions between simultaneously active components. Integrated hazard analysis (IHA), a process used to identify unacceptable risks and to provide a means of controlling them, can be applied to either centralized or distributed systems. IHA, though, must be tailored to fit the particular system being analyzed. Distributed systems, for instance, must be analyzed for hazards in terms of the functions that rely on them. This paper will describe systems-oriented IHA techniques (as opposed to traditional failure-event or reliability techniques) that should be employed for distributed systems in aerospace environments. Special considerations will be addressed when dealing with specific distributed systems such as active thermal control, electrical power, command and data handling, and software systems (including the interaction with fault management systems). Because of the significance of second-order effects in large scale distributed systems, the paper will also describe how to analyze secondary functions to secondary functions through the use of channelization.
Location Analysis of Freight Distribution Terminal of Jakarta City, Indonesia
Directory of Open Access Journals (Sweden)
Nahry Nahry
2016-03-01
Full Text Available Currently Jakarta has two freight terminals, namely Pulo Gebang and Tanah Merdeka. However, both terminals function only as parking areas and have not yet been utilized properly, e.g. for consolidation. Consolidation of goods, which is usually performed in a distribution terminal, may reduce the volume of freight flow within the city. This paper aims to determine the best location for a distribution terminal in Jakarta among those two terminals and two additional alternative sites, namely Lodan and Rawa Buaya. The study begins with the identification of the important factors that affect location selection, carried out by Likert analysis of questionnaires distributed to logistics firms. The best location is determined by applying Overlay Analysis using ArcGIS 9.2. Four grid maps are produced to represent the accessibility, cost, time, and environment factors as the important location factors. The result shows the ranking from the best as: Lodan, Tanah Merdeka, Pulo Gebang, and Rawa Buaya.
GIS-based poverty and population distribution analysis in China
Cui, Jing; Wang, Yingjie; Yan, Hong
2009-07-01
Geographically, poverty status is related not only to socio-economic factors but is also strongly affected by the geographical environment. In this paper, a GIS-based poverty and population distribution analysis method is introduced for revealing their regional differences. More than 100000 poor villages and 592 national key poor counties are chosen for the analysis. The results show that poverty distribution tends to concentrate in most of west China and in the mountainous rural areas of mid China. Furthermore, the fifth census data are overlaid onto those poor areas in order to capture the internal diversity of their socio-economic characteristics. By overlaying poverty-related socio-economic parameters, such as sex ratio, illiteracy, education level, percentage of ethnic minorities and family composition, the findings show that poverty distribution is strongly correlated with a high illiteracy rate, a high percentage of ethnic minorities, and larger families.
Environment for Test and Analysis of Distributed Software (ETADS)
1994-09-27
Final report for the Environment for Test and Analysis of Distributed Software (ETADS) project. Only fragments survive: a transmittal letter and part of a supported-platform table (Sequent Balance; BBN Butterfly TC2000; 80[34]86 systems running BSDI, 386BSD, NetBSD, or FreeBSD; Thinking Machines CM-2 with Sun front end).
Analysis of acidic properties of distribution transformer oil insulation ...
African Journals Online (AJOL)
Analysis of acidic properties of distribution transformer oil insulation: a case study of ... Then 0.1mol/litre of Potassium hydroxide (KOH) was added as titre with ... With the acidic content beyond the prescribed minimum value present in the ...
Bi-Modal and Mixture Distributions in Circular Data Analysis
Directory of Open Access Journals (Sweden)
Muhammet Burak KILIC
2016-10-01
Full Text Available Objective: Circular statistics is a special area that analyzes angular data observed on the unit circle. In many fields of study, such as the environment, biology or medicine, circular (angular) data form an important part of the research. Determining the secondary structure of proteins from dihedral angles, assessing physical disorders such as gait disturbances from the geometric morphology of bones, or studying the orientation of sea turtles leaving the beach after laying their eggs are important applications of this area. The use of linear statistical methods in this area leads to misleading results because of the geometric nature of circular data. Therefore, when analyzing such angular data, circular statistical methods should be used. The objective of this study is to compare bi-modal and mixture distributions in circular data analysis. Material and Methods: The bi-modal mixture von Mises, wrapped normal, wrapped Cauchy and the generalized von Mises distributions were used, and iterative methods were applied to obtain maximum likelihood estimators. These iterative methods were implemented in R, and the R code for parameter estimation of the circular distributions is given. The distributions were examined for analyzing dihedral angles in proteins and turtle rotations, and model selection was performed using the Akaike and Bayesian information criteria. Results: For the dihedral angles in proteins, the two-component mixture of wrapped Cauchy distributions gave the better fit. For the turtle rotations, the generalized von Mises distribution and the two-component mixture of von Mises distributions gave the better fit. Conclusion: If an excessive concentration is observed in one or more modes when analyzing circular data, the bi-modal mixture von Mises and the generalized von Mises distributions may not be preferred for modeling. If an excessive concentration is not observed in
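Maximum likelihood estimation for a single von Mises component can be sketched in closed form (mean direction exactly, concentration via Fisher's standard approximation); this is a one-component sketch on synthetic angles, not the paper's full bimodal-mixture EM.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
# Synthetic angular data from a von Mises with mean direction 1.0 rad, kappa = 4.
theta = stats.vonmises.rvs(kappa=4.0, loc=1.0, size=1000, random_state=rng)

# Closed-form MLE of the mean direction and mean resultant length.
s, c = np.sin(theta).sum(), np.cos(theta).sum()
mu_hat = np.arctan2(s, c)              # mean direction
r_bar = np.hypot(s, c) / len(theta)    # mean resultant length in (0, 1)

def kappa_approx(r):
    """Fisher's piecewise approximation to the ML concentration estimate."""
    if r < 0.53:
        return 2 * r + r**3 + 5 * r**5 / 6
    if r < 0.85:
        return -0.4 + 1.39 * r + 0.43 / (1 - r)
    return 1 / (r**3 - 4 * r**2 + 3 * r)

kappa_hat = kappa_approx(r_bar)
```

Mixture fitting proceeds by iterating this kind of estimate inside an EM loop, one set of (mu, kappa) per component.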
Statistical analysis of absorptive laser damage in dielectric thin films
Energy Technology Data Exchange (ETDEWEB)
Budgor, A.B.; Luria-Budgor, K.F.
1978-09-11
The Weibull distribution arises as an example of the theory of extreme events. It is commonly used to fit statistical data arising in the failure analysis of electrical components and in DC breakdown of materials. This distribution is employed to analyze time-to-damage and intensity-to-damage statistics obtained when irradiating thin-film-coated samples of SiO2, ZrO2, and Al2O3 with tightly focused laser beams. The data used are furnished by Milam. The fit to the data is excellent, and least-squares correlation coefficients greater than 0.9 are often obtained.
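A least-squares Weibull fit of this kind is usually done on a probability plot, where the CDF F(t) = 1 - exp(-(t/eta)^m) linearizes to ln(-ln(1 - F)) = m ln t - m ln eta. The sketch below uses synthetic time-to-damage data with assumed parameters, not Milam's measurements.

```python
import numpy as np

rng = np.random.default_rng(7)
true_shape, true_scale = 2.5, 100.0
# Synthetic time-to-damage data (arbitrary units).
t = true_scale * rng.weibull(true_shape, size=500)

# Weibull probability plot: median-rank plotting positions, then a line fit.
t_sorted = np.sort(t)
n = len(t_sorted)
f = (np.arange(1, n + 1) - 0.5) / n          # median-rank estimate of F
x = np.log(t_sorted)
y = np.log(-np.log(1.0 - f))

slope, intercept = np.polyfit(x, y, 1)
shape_hat = slope                             # Weibull shape = slope
scale_hat = np.exp(-intercept / slope)        # Weibull scale from the intercept
r = np.corrcoef(x, y)[0, 1]                   # linear correlation of the plot
```

A squared correlation above 0.9, as reported in the abstract, indicates the data sit close to the fitted Weibull line.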
Bayesian analysis of a disability model for lung cancer survival.
Armero, C; Cabras, S; Castellanos, M E; Perra, S; Quirós, A; Oruezábal, M J; Sánchez-Rubio, J
2016-02-01
Bayesian reasoning, survival analysis and multi-state models are used to assess survival times for Stage IV non-small-cell lung cancer patients and the evolution of the disease over time. Bayesian estimation is done using minimum informative priors for the Weibull regression survival model, leading to an automatic inferential procedure. Markov chain Monte Carlo methods have been used for approximating posterior distributions and the Bayesian information criterion has been considered for covariate selection. In particular, the posterior distribution of the transition probabilities, resulting from the multi-state model, constitutes a very interesting tool which could be useful to help oncologists and patients make efficient and effective decisions.
Area and Flux Distributions of Active Regions, Sunspot Groups, and Sunspots: A Multi-Database Study
Muñoz-Jaramillo, Andrés; Windmueller, John C; Amouzou, Ernest C; Longcope, Dana W; Tlatov, Andrey G; Nagovitsyn, Yury A; Pevtsov, Alexei A; Chapman, Gary A; Cookson, Angela M; Yeates, Anthony R; Watson, Fraser T; Balmaceda, Laura A; DeLuca, Edward E; Martens, Petrus C H
2014-01-01
In this work we take advantage of eleven different sunspot group, sunspot, and active region databases to characterize the area and flux distributions of photospheric magnetic structures. We find that, when taken separately, different databases are better fitted by different distributions (as has been reported previously in the literature). However, we find that all our databases can be reconciled by the simple application of a proportionality constant, and that, in reality, different databases are sampling different parts of a composite distribution. This composite distribution is made up of a linear combination of Weibull and log-normal distributions -- where a pure Weibull (log-normal) characterizes the distribution of structures with fluxes below (above) $10^{21}$ Mx ($10^{22}$ Mx). Additionally, we demonstrate that the Weibull distribution shows the expected linear behaviour of a power-law distribution (when extended into smaller fluxes), making our results compatible with the results of Parnell et al. (200...
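A linear combination of Weibull and log-normal components can be sketched as below; the mixing weight and all distribution parameters here are assumptions chosen so that the Weibull part dominates at small fluxes and the log-normal at large fluxes, not the paper's fitted values.

```python
import numpy as np
from scipy import stats

w = 0.6                                        # assumed mixing weight of the Weibull part
weib = stats.weibull_min(c=0.7, scale=1e21)    # dominates the small-flux regime
lgn = stats.lognorm(s=0.8, scale=1e22)         # dominates the large-flux regime

def composite_pdf(flux):
    """Linear combination of the two component densities."""
    return w * weib.pdf(flux) + (1.0 - w) * lgn.pdf(flux)

def composite_cdf(flux):
    """Corresponding cumulative distribution."""
    return w * weib.cdf(flux) + (1.0 - w) * lgn.cdf(flux)

# Below ~1e21 (in Mx) the Weibull term carries the density; above ~1e22 the log-normal does.
density_low = composite_pdf(1e20)
density_high = composite_pdf(1e23)
```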
Optimal power flow for distribution networks with distributed generation
Directory of Open Access Journals (Sweden)
Radosavljević Jordan
2015-01-01
Full Text Available This paper presents a genetic algorithm (GA) based approach for the solution of the optimal power flow (OPF) in distribution networks with distributed generation (DG) units, including fuel cells, micro turbines, diesel generators, photovoltaic systems and wind turbines. The OPF is formulated as a nonlinear multi-objective optimization problem with equality and inequality constraints. Due to the stochastic nature of energy produced from renewable sources, i.e. wind turbines and photovoltaic systems, as well as load uncertainties, a probabilistic algorithm is introduced in the OPF analysis. The Weibull and normal distributions are employed to model the input random variables, namely the wind speed, solar irradiance and load power. The 2m+1 point estimate method and the Gram-Charlier expansion theory are used to obtain the statistical moments and the probability density functions (PDFs) of the OPF results. The proposed approach is examined and tested on a modified IEEE 34 node test feeder with five different integrated DG units. The obtained results prove the efficiency of the proposed approach in solving both deterministic and probabilistic OPF problems for different forms of the multi-objective function. As such, it can serve as a useful decision-making support tool for distribution network operators. [Projekat Ministarstva nauke Republike Srbije, br. TR33046]
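The Weibull wind-speed model feeding such a probabilistic OPF can be sketched with a simple Monte Carlo draw pushed through an idealized turbine power curve; the shape/scale values and the cut-in, rated, and cut-out speeds below are illustrative assumptions, and this replaces the paper's 2m+1 point estimate method with plain sampling.

```python
import numpy as np

rng = np.random.default_rng(1)

# Weibull wind-speed model: shape k, scale c (m/s); values illustrative.
k, c = 2.0, 8.0
v = c * rng.weibull(k, size=100_000)

def turbine_power(v, v_in=3.0, v_rated=12.0, v_out=25.0, p_rated=2.0):
    """Idealized wind-turbine power curve (output in MW)."""
    p = np.zeros_like(v)
    ramp = (v_in <= v) & (v < v_rated)
    p[ramp] = p_rated * (v[ramp] ** 3 - v_in ** 3) / (v_rated ** 3 - v_in ** 3)
    p[(v_rated <= v) & (v < v_out)] = p_rated   # rated output plateau
    return p                                     # zero below cut-in / above cut-out

p = turbine_power(v)
mean_p, std_p = p.mean(), p.std()   # first two moments of the injected power
```

The resulting moments are the kind of statistics a point estimate method would produce deterministically from the same Weibull input model.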
Directory of Open Access Journals (Sweden)
Yiannoutsos Constantin T
2009-06-01
Full Text Available Abstract Background Mortality of HIV-infected patients initiating antiretroviral therapy in the developing world is very high immediately after the start of therapy and drops sharply thereafter. It is necessary to use models of survival time that reflect this change. Methods In this endeavor, parametric models with changepoints such as Weibull models can be useful in order to explicitly model the underlying failure process, even in the case where abrupt changes in the mortality rate are present. Estimation of the temporal location of possible mortality changepoints has important implications on the effective management of these patients. We briefly describe these models and apply them to the case of estimating survival among HIV-infected patients who are initiating antiretroviral therapy in a care and treatment programme in sub-Saharan Africa. Results As a first reported data-driven estimate of the existence and location of early mortality changepoints after antiretroviral therapy initiation, we show that there is an early change in risk of death at three months, followed by an intermediate risk period lasting up to 10 months after therapy. Conclusion By explicitly modelling the underlying abrupt changes in mortality risk after initiation of antiretroviral therapy we are able to estimate their number and location in a rigorous, data-driven manner. The existence of a high early risk of death after initiation of antiretroviral therapy and the determination of its duration has direct implications for the optimal management of patients initiating therapy in this setting.
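A survival model with changepoints at 3 and 10 months can be sketched with a piecewise-constant hazard, which is the simplest member of the changepoint family the abstract describes; the hazard rates below are illustrative assumptions, not the study's estimates.

```python
import numpy as np

# Changepoints at 3 and 10 months; one hazard rate per interval (illustrative).
cuts = np.array([0.0, 3.0, 10.0])      # interval start times (months)
rates = np.array([0.10, 0.03, 0.01])   # hazard per month in each interval

def survival(t):
    """S(t) = exp(-cumulative hazard) for a piecewise-constant hazard."""
    t = np.atleast_1d(np.asarray(t, dtype=float))
    edges = np.append(cuts, np.inf)
    cum_hazard = np.zeros_like(t)
    for lo, hi, r in zip(edges[:-1], edges[1:], rates):
        # Time spent in [lo, hi) contributes rate r per month.
        cum_hazard += r * np.clip(t - lo, 0.0, hi - lo)
    return np.exp(-cum_hazard)

s3 = survival(3.0)[0]    # survival through the high-risk early period
s12 = survival(12.0)[0]  # survival past the intermediate-risk period
```

Fitting such a model amounts to estimating both the rates and the changepoint locations from the data, which is what the authors do in a likelihood framework.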
Generalized Exponential Distribution in Flood Frequency Analysis for Polish Rivers
Markiewicz, Iwona; Strupczewski, Witold G.; Bogdanowicz, Ewa; Kochanek, Krzysztof
2015-01-01
Many distributions have been used in flood frequency analysis (FFA) for fitting the flood extremes data. However, as shown in the paper, the scatter of Polish data plotted on the moment ratio diagram shows that there is still room for a new model. In the paper, we study the usefulness of the generalized exponential (GE) distribution in flood frequency analysis for Polish Rivers. We investigate the fit of GE distribution to the Polish data of the maximum flows in comparison with the inverse Gaussian (IG) distribution, which in our previous studies showed the best fitting among several models commonly used in FFA. Since the use of a discrimination procedure without the knowledge of its performance for the considered probability density functions may lead to erroneous conclusions, we compare the probability of correct selection for the GE and IG distributions along with the analysis of the asymptotic model error in respect to the upper quantile values. As an application, both GE and IG distributions are alternatively assumed for describing the annual peak flows for several gauging stations of Polish Rivers. To find the best fitting model, four discrimination procedures are used. In turn, they are based on the maximized logarithm of the likelihood function (K procedure), on the density function of the scale transformation maximal invariant (QK procedure), on the Kolmogorov-Smirnov statistics (KS procedure) and the fourth procedure based on the differences between the ML estimate of 1% quantile and its value assessed by the method of moments and linear moments, in sequence (R procedure). Due to the uncertainty of choosing the best model, the method of aggregation is applied to estimate of the maximum flow quantiles. PMID:26657239
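The maximum-likelihood discrimination procedure (the K procedure) paired with a Kolmogorov-Smirnov check can be sketched as follows. SciPy does not ship a generalized exponential distribution, so the candidate set below uses stand-in models (inverse Gaussian, gamma, log-normal) on synthetic peak flows; all names and values are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Synthetic annual peak flows drawn from an inverse Gaussian model.
flows = stats.invgauss.rvs(mu=0.5, scale=800.0, size=300, random_state=rng)

candidates = {
    "invgauss": stats.invgauss,
    "gamma": stats.gamma,
    "lognorm": stats.lognorm,
}

results = {}
for name, dist in candidates.items():
    params = dist.fit(flows, floc=0.0)          # fit with location fixed at zero
    loglik = np.sum(dist.logpdf(flows, *params))  # maximized log-likelihood
    ks = stats.kstest(flows, dist.cdf, args=params).statistic
    results[name] = {"loglik": loglik, "ks": ks}

# K-procedure analogue: prefer the model with the largest log-likelihood.
best_by_ll = max(results, key=lambda n: results[n]["loglik"])
```

The same loop extends naturally to quantile-based criteria by comparing the fitted upper quantiles of each candidate, as the R procedure in the abstract does.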
Integrating software architectures for distributed simulations and simulation analysis communities.
Energy Technology Data Exchange (ETDEWEB)
Goldsby, Michael E.; Fellig, Daniel; Linebarger, John Michael; Moore, Patrick Curtis; Sa, Timothy J.; Hawley, Marilyn F.
2005-10-01
The one-year Software Architecture LDRD (No. 79819) was a cross-site effort between Sandia California and Sandia New Mexico. The purpose of this research was to further develop and demonstrate integrating software architecture frameworks for distributed simulation and distributed collaboration in the homeland security domain. The integrated frameworks were initially developed through the Weapons of Mass Destruction Decision Analysis Center (WMD-DAC), sited at SNL/CA, and the National Infrastructure Simulation & Analysis Center (NISAC), sited at SNL/NM. The primary deliverable was a demonstration of both a federation of distributed simulations and a federation of distributed collaborative simulation analysis communities in the context of the same integrated scenario, which was the release of smallpox in San Diego, California. To our knowledge this was the first time such a combination of federations under a single scenario has ever been demonstrated. A secondary deliverable was the creation of the standalone GroupMeld™ collaboration client, which uses the GroupMeld™ synchronous collaboration framework. In addition, a small pilot experiment that used both integrating frameworks allowed a greater range of crisis management options to be performed and evaluated than would have been possible without the use of the frameworks.
Pseudodifferential Analysis, Automorphic Distributions in the Plane and Modular Forms
Unterberger, Andre
2011-01-01
Pseudodifferential analysis, introduced in this book in a way adapted to the needs of number theorists, relates automorphic function theory in the hyperbolic half-plane I to automorphic distribution theory in the plane. Spectral-theoretic questions are discussed in one or the other environment: in the latter one, the problem of decomposing automorphic functions in I according to the spectral decomposition of the modular Laplacian gives way to the simpler one of decomposing automorphic distributions in R2 into homogeneous components. The Poincare summation process, which consists in building au
Influence Of Lateral Load Distributions On Pushover Analysis Effectiveness
Colajanni, P.; Potenzone, B.
2008-07-01
The effectiveness of two simple load distributions for pushover analysis recently proposed by the authors is investigated through a comparative study, involving static and dynamic analyses of seismic response of eccentrically braced frames. It is shown that in the upper floors only multimodal pushover procedures provide results close to the dynamic profile, while the proposed load patterns are always conservative in the lower floors. They over-estimate the seismic response less than the uniform distribution, representing a reliable alternative to the uniform or more sophisticated adaptive procedures proposed by seismic codes.
Employing wavelet for neutron tracks distribution analysis in PADC detectors
Ferrari, Paolo; Campani, Lorenzo; Mariotti, Francesca
2017-08-01
PADC nuclear track dosemeters are used for fast neutron monitoring. A system based on one-shot image acquisition is employed, and a simple image analysis based on track counting is performed in a series of image regions. When this procedure fails, a different approach is needed. In the present paper we tested a wavelet-transform-based algorithm to detect possible "patterns" in track distributions that could be associated with dosemeter anomalies, assuming that a neutron exposure should produce a homogeneous distribution. The algorithm, tested with samples from our dosimetric service, showed its potential effectiveness and capabilities.
Directory of Open Access Journals (Sweden)
Jianhua Ni
2016-08-01
Full Text Available The spatial distribution of urban service facilities is largely constrained by the road network. In this study, network point pattern analysis and correlation analysis were used to analyze the relationship between road network and healthcare facility distribution. The weighted network kernel density estimation method proposed in this study identifies significant differences between the outside and inside areas of the Ming city wall. The results of network K-function analysis show that private hospitals are more evenly distributed than public hospitals, and pharmacy stores tend to cluster around hospitals along the road network. After computing the correlation analysis between different categorized hospitals and street centrality, we find that the distribution of these hospitals correlates highly with the street centralities, and that the correlations are higher with private and small hospitals than with public and large hospitals. The comprehensive analysis results could help examine the reasonability of existing urban healthcare facility distribution and optimize the location of new healthcare facilities.
Directory of Open Access Journals (Sweden)
Gary Black
2011-01-01
Full Text Available Many real-world processes generate autocorrelated and/or Weibull data. In such cases, the independence and/or normality assumptions underlying the Shewhart and EWMA control charts are invalid. Although data transformations exist, such tools would not normally be understood or employed by naive practitioners. Thus, the question arises, “What are the effects on robustness whenever these charts are used in such applications?” Consequently, this paper examines and compares the performance of these two control charts when the process model is subjected to autocorrelated and/or Weibull data. A variety of conditions are investigated, covering the magnitudes of the process shift, the autocorrelation coefficient and the Weibull shape parameter. Results indicate that the EWMA chart outperforms the Shewhart chart in 62% of the cases, particularly those with low to moderate autocorrelation effects. The Shewhart chart outperforms the EWMA chart in 35% of the cases, particularly those with high autocorrelation and zero or high process shift effects.
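As a sketch of the setting this abstract describes, the following generates skewed Weibull observations and applies the standard EWMA recursion. The shape/scale values and the smoothing weight λ = 0.2 are illustrative assumptions, not parameters taken from the paper.

```python
import math
import random

def weibull_sample(shape, scale, n, rng):
    """Draw n Weibull(shape, scale) variates by inverting the CDF
    F(x) = 1 - exp(-(x/scale)**shape)."""
    return [scale * (-math.log(1.0 - rng.random())) ** (1.0 / shape)
            for _ in range(n)]

def ewma(series, lam=0.2):
    """EWMA statistic z_t = lam*x_t + (1-lam)*z_{t-1}, seeded with x_0."""
    z = series[0]
    out = [z]
    for x in series[1:]:
        z = lam * x + (1.0 - lam) * z
        out.append(z)
    return out

rng = random.Random(7)
data = weibull_sample(1.5, 1.0, 500, rng)  # skewed, non-normal observations
smoothed = ewma(data)                      # damped statistic an EWMA chart would plot
```

Plotting `smoothed` against control limits is what distinguishes the two charts: the Shewhart chart reacts to each raw `data` point, while the EWMA pools recent history.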
Numerical analysis of the dynamics of distributed vortex configurations
Govorukhin, V. N.
2016-08-01
A numerical algorithm is proposed for analyzing the dynamics of distributed plane vortex configurations in an inviscid incompressible fluid. At every time step, the algorithm involves the computation of unsteady vortex flows, an analysis of the configuration structure with the help of heuristic criteria, the visualization of the distribution of marked particles and vorticity, the construction of streamlines of fluid particles, and the computation of the field of local Lyapunov exponents. The inviscid incompressible fluid dynamic equations are solved by applying a meshless vortex method. The algorithm is used to investigate the interaction of two and three identical distributed vortices with various initial positions in the flow region with and without the Coriolis force.
Production analysis of functionally distributed machines for underground mining
Institute of Scientific and Technical Information of China (English)
Fukui Rui; Kusaka Kouhei; Nakao Masayuki; Kodama Yuichi; Uetake Masaaki; Kawai Kazunari
2016-01-01
In recent years, underground mining methods have become popular because of their potentially high productivity and efficiency. In these methods, a mining machine, the load haul dump (LHD), is used as both an excavator and a transporter of ore. This paper proposes a distributed system that realizes the excavation and transport functions with separate vehicles, an excavator and a transporter. In addition, this research proposes a mining map and configurations suitable for the proposed distributed system. To evaluate the productivity of the proposed system, a simulation environment has been developed. Analysis using the simulator reveals which performance factors of the excavator and the transporter have the largest impact on productivity. Simulation results also demonstrate the difference in potential between the LHD system and the distributed system, which can be explained by their allocation of functions.
Empirical Analysis on Factors Influencing Distribution of Vegetal Production
Institute of Scientific and Technical Information of China (English)
Wenjie WU
2015-01-01
Since the reform and opening-up, there has been a great change in the spatial pattern of China’s vegetable production. This paper studied vegetable production in the provinces of China from 1978 to 2013. In terms of sequential characteristics, China’s vegetable production area is constantly growing and develops in distinct stages. In terms of spatial distribution, China’s vegetable production shows a trend of "going down the south" and "marching to the west". In order to grasp the rules governing changes in vegetable production and their influencing factors, this paper made a theoretical and empirical analysis of the factors possibly influencing the distribution of vegetable production. Results show that the major factors influencing the distribution of China’s vegetable production include irrigation conditions, non-agricultural employment, market demand, knowledge spillover, comparative effectiveness, rural roads and government policies.
ANALYSIS OF BRANCHING DISTRIBUTION IN POLYETHYLENES BY DIFFERENTIAL SCANNING CALORIMETRY
Institute of Scientific and Technical Information of China (English)
Robert Shanks; Fei Chen; Gandara Amarasinghe
2003-01-01
Short chain branching has been characterized using thermal fractionation, a stepwise isothermal crystallization technique, followed by a melting analysis scan using differential scanning calorimetry. Short chain branching distribution was also characterized by a continuous slow cooling crystallization, followed by a melting analysis scan. Four different polyethylenes were studied: Ziegler-Natta gas phase, Ziegler-Natta solution, metallocene, constrained-geometry single site catalyzed polyethylenes. The branching distribution was calculated from a calibration of branch content with melting temperature. The lamellar thickness was calculated based on the thermodynamic melting temperature of each polyethylene and the surface free energy of the crystal face. The branching distribution and lamellar thickness distribution were used to calculate weight average branch content, mean lamellar thickness, and a branch dispersity index. The results for the branch content were in good agreement with the known comonomer content of the polyethylenes. A limitation was that high branch content polyethylenes did not reach their potential crystallization at ambient temperatures. Cooling to sub-ambient was necessary to equilibrate the crystallization, but melting temperature versus branch content was not applicable after cooling to below ambient because the calibration data were not performed in this way.
Probability Measure of Navigation Pattern Prediction Using Poisson Distribution Analysis
Directory of Open Access Journals (Sweden)
Dr. V. Valli Mayil
2012-06-01
Full Text Available The World Wide Web has become one of the most important media to store, share and distribute information. The rapid expansion of the web has provided a great opportunity to study user and system behavior by exploring web access logs. Web Usage Mining is the application of data mining techniques to large web data repositories in order to extract usage patterns. Every web server keeps a log of all transactions between the server and the clients. The log data collected by web servers contain information about every click of a user on the web documents of the site. The useful log information needs to be analyzed and interpreted in order to obtain knowledge about actual user preferences in accessing web pages. In recent years several methods have been proposed for mining web log data. This paper addresses the statistical method of Poisson distribution analysis to find the higher-probability session sequences, which are then used to test the web application performance. The analysis of large volumes of clickstream data demands the employment of data mining methods. Conducting data mining on logs of web servers involves the determination of frequently occurring access sequences. A Poisson distribution gives the frequency probability of specific events when the average rate of a single occurrence is known. The Poisson distribution is a discrete function which is used in this paper to find the probability that a particular page is visited by the user.
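The Poisson calculation the abstract relies on is simple to state. A minimal sketch follows; the mean rate of 3 visits per session window is a made-up illustration, not a figure from the paper.

```python
import math

def poisson_pmf(k, lam):
    """P(X = k) for a Poisson variable with mean lam: lam**k * e**(-lam) / k!."""
    return lam ** k * math.exp(-lam) / math.factorial(k)

# If a page is visited on average 3 times per session window, the probability
# of observing exactly 5 visits in a window:
p5 = poisson_pmf(5, 3.0)  # ≈ 0.1008
```

Summing the pmf over a range of k gives the probability of "at least" or "at most" so many visits, which is how such an analysis would flag unusually frequent access sequences.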
Calculation and Analysis of Temperature Distribution in Hot Rolling Strip
Directory of Open Access Journals (Sweden)
Kaixiang Peng
2013-07-01
Full Text Available Modern steel grades require constant and reproducible production conditions both in the hot strip mill and in the cooling section to achieve constant material properties along the entire strip length and from strip to strip. Calculation of the temperature in the final rolling process always involves factors such as the workpiece's inner organizational structure, plastic deformation, and variations of its properties, as well as physical parameters such as gauge, shape, etc. In this paper, a finite element model is constructed for the temperature field in a rolling process. The temperature field of the strip steel is modeled with a 3-D finite element analysis (FEA) structure, simultaneously considering the distribution of the work roll temperature. The distribution of the field is then simulated numerically. From the model, the temperature contours can be obtained by analysis of the temperature distribution of the contact area. At the same time, the distribution of temperature at any position and at any time can be acquired. These efforts provide reliable parameters for subsequent finishing temperature and shape control.
Distribution System Reliability Analysis for Smart Grid Applications
Aljohani, Tawfiq Masad
Reliability of power systems is a key aspect in modern power system planning, design, and operation. The ascendance of the smart grid concept has provided high hopes of developing an intelligent network that is capable of being a self-healing grid, offering the ability to overcome the interruption problems that face the utility and cost it tens of millions in repair and loss. To address these reliability concerns, the power utilities and interested parties have spent an extensive amount of time and effort to analyze and study the reliability of the generation and transmission sectors of the power grid. Only recently has attention shifted to improving the reliability of the distribution network, the connection joint between the power providers and the consumers where most of the electricity problems occur. In this work, we examine the effect of smart grid applications in improving the reliability of power distribution networks. The test system used in this thesis is the IEEE 34-node test feeder, released in 2003 by the Distribution System Analysis Subcommittee of the IEEE Power Engineering Society. The objective is to analyze the feeder for the optimal placement of automatic switching devices and quantify their proper installation based on the performance of the distribution system. The measures are the changes in the system reliability indices, including SAIDI, SAIFI, and EUE. The goal is to design and simulate the effect of the installation of Distributed Generators (DGs) on the utility's distribution system and measure the potential improvement of its reliability. The software used in this work is DISREL, an intelligent power distribution package developed by General Reliability Co.
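The indices named above have standard IEEE 1366-style definitions. A hedged sketch with invented outage data (not results for the IEEE 34-node feeder):

```python
def saifi(events, customers_served):
    """SAIFI: total customer interruptions / total customers served."""
    return sum(affected for affected, _ in events) / customers_served

def saidi(events, customers_served):
    """SAIDI: total customer interruption minutes / total customers served."""
    return sum(affected * minutes for affected, minutes in events) / customers_served

# Hypothetical year of outage events: (customers affected, duration in minutes)
events = [(100, 90), (250, 30), (50, 240)]
print(saifi(events, 1000))  # 0.4 interruptions per customer
print(saidi(events, 1000))  # 28.5 minutes per customer
```

Placing switches or DGs changes which customers each fault reaches and for how long, which is exactly what shifts these two averages.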
Power Quality Analysis Using Bilinear Time-Frequency Distributions
Directory of Open Access Journals (Sweden)
Sha'ameri Ahmad Zuri
2010-01-01
Full Text Available Abstract Bilinear time-frequency distributions (TFDs) are powerful techniques that offer good time and frequency resolution of the time-frequency representation (TFR). They are very appropriate for analyzing power quality signals, which consist of nonstationary and multi-frequency components. However, TFDs suffer from interference because of cross-terms. Many TFDs have been implemented, and there is no fixed window or kernel that can remove the cross-terms for all types of signals. In this paper, bilinear TFDs are implemented to analyze power quality signals, namely the smooth-windowed Wigner-Ville distribution (SWWVD), Choi-Williams distribution (CWD), B-distribution (BD), and modified B-distribution (MBD). The power quality signals considered are swell, sag, interruption, harmonic, interharmonic, and transient, based on IEEE Std. 1159-1995. A set of performance measures is defined and used to compare the TFRs. It shows that the SWWVD presents the best performance and is selected for power quality signal analysis. Thus, an adaptive optimal-kernel SWWVD is designed to determine the separable kernel automatically from the input signal.
A Grouping Method of Distribution Substations Using Cluster Analysis
Ohtaka, Toshiya; Iwamoto, Shinichi
Recently, it has been considered to group distribution substations together for evaluating the reinforcement planning of distribution systems. However, the grouping is carried out by the knowledge and experience of an expert who is in charge of distribution systems, and the subjective judgment of a human being causes ambiguous grouping. Therefore, a method for imitating the grouping by the expert has been desired in order to carry out a systematic grouping with numerical corroboration. In this paper, we propose a grouping method of distribution substations using cluster analysis based on the interconnected power between the distribution substations. Moreover, we consider geographical constraints such as rivers, roads, business office boundaries and branch boundaries, and also examine a method for adjusting the interconnected power. Simulations are carried out to verify the validity of the proposed method using an example system. From the simulation results, we find that imitation of the grouping by the expert becomes possible by considering the geographical constraints and adjusting the interconnected power, and also that the calculation time and iterations can be greatly reduced by introducing the local and tabu search methods.
DIRAC - The Distributed MC Production and Analysis for LHCb
Tsaregorodtsev, A
2004-01-01
DIRAC is the LHCb distributed computing grid infrastructure for MC production and analysis. Its architecture is based on a set of distributed collaborating services. The service decomposition broadly follows the ARDA project proposal, allowing for the possibility of interchanging the EGEE/ARDA and DIRAC components in the future. Some components developed outside the DIRAC project are already in use as services, for example the File Catalog developed by the AliEn project. An overview of the DIRAC architecture will be given, in particular the recent developments to support user analysis. The main design choices will be presented. One of the main design goals of DIRAC is simplicity of installation, configuration and operation of the various services. This allows all the DIRAC resources to be easily managed by a single Production Manager. The modular design of the DIRAC components allows their functionality to be easily extended to include new computing and storage elements or to handle new tasks. The DIRAC system al...
Remnant lipoprotein size distribution profiling via dynamic light scattering analysis.
Chandra, Richa; Mellis, Birgit; Garza, Kyana; Hameed, Samee A; Jurica, James M; Hernandez, Ana V; Nguyen, Mia N; Mittal, Chandra K
2016-11-01
Remnant lipoproteins (RLP) are a metabolically derived subpopulation of triglyceride-rich lipoproteins (TRL) in human blood that are involved in the metabolism of dietary fats or triglycerides. RLP, the smaller and denser variants of TRL particles, are strongly correlated with cardiovascular disease (CVD) and were listed as an emerging atherogenic risk factor by the AHA in 2001. Varying analytical techniques used in clinical studies in the size determination of RLP contribute to conflicting hypotheses in regard to whether larger or smaller RLP particles contribute to CVD progression, though multiple pathways may exist. We demonstrated a unique combinatorial bioanalytical approach involving the preparative immunoseparation of RLP, and dynamic light scattering for size distribution analysis. This is a new facile and robust methodology for the size distribution analysis of RLP that in conjunction with clinical studies may reveal the mechanisms by which RLP cause CVD progression. Copyright © 2016 Elsevier B.V. All rights reserved.
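The abstract does not give its sizing equation, but dynamic light scattering conventionally converts a measured diffusion coefficient to a hydrodynamic diameter via the Stokes-Einstein relation. A sketch under that assumption; the diffusion coefficient, temperature, and water viscosity below are illustrative values, not data from this study.

```python
import math

def hydrodynamic_diameter(D, temperature=298.15, viscosity=8.9e-4):
    """Stokes-Einstein relation underlying DLS sizing:
    d_H = k_B * T / (3 * pi * eta * D), all quantities in SI units."""
    k_B = 1.380649e-23  # Boltzmann constant, J/K
    return k_B * temperature / (3.0 * math.pi * viscosity * D)

# An illustrative diffusion coefficient of 4.3e-12 m^2/s in water at 25 °C:
d = hydrodynamic_diameter(4.3e-12)  # ≈ 1.14e-7 m (about 114 nm)
```

A DLS instrument measures the decay of the scattered-light autocorrelation to obtain D (or a distribution of D values), and this relation maps that distribution onto particle sizes.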
Comparing Distributions of Environmental Outcomes for Regulatory Environmental Justice Analysis
Directory of Open Access Journals (Sweden)
Glenn Sheriff
2011-05-01
Full Text Available Economists have long been interested in measuring distributional impacts of policy interventions. As environmental justice (EJ) emerged as an ethical issue in the 1970s, the academic literature has provided statistical analyses of the incidence and causes of various environmental outcomes as they relate to race, income, and other demographic variables. In the context of regulatory impacts, however, there is a lack of consensus regarding what information is relevant for EJ analysis, and how best to present it. This paper helps frame the discussion by suggesting a set of questions fundamental to regulatory EJ analysis, reviewing past approaches to quantifying distributional equity, and discussing the potential for adapting existing tools to the regulatory context.
Size distribution measurements and chemical analysis of aerosol components
Energy Technology Data Exchange (ETDEWEB)
Pakkanen, T.A.
1995-12-31
The principal aims of this work were to improve the existing methods for size distribution measurements and to draw conclusions about atmospheric and in-stack aerosol chemistry and physics by utilizing size distributions of various aerosol components measured. A sample dissolution with dilute nitric acid in an ultrasonic bath and subsequent graphite furnace atomic absorption spectrometric analysis was found to result in low blank values and good recoveries for several elements in atmospheric fine particle size fractions below 2 µm of equivalent aerodynamic particle diameter (EAD). Furthermore, it turned out that a substantial amount of analyses associated with insoluble material could be recovered since suspensions were formed. The size distribution measurements of in-stack combustion aerosols indicated two modal size distributions for most components measured. The existence of the fine particle mode suggests that a substantial fraction of such elements with two modal size distributions may vaporize and nucleate during the combustion process. In southern Norway, size distributions of atmospheric aerosol components usually exhibited one or two fine particle modes and one or two coarse particle modes. Atmospheric relative humidity values higher than 80% resulted in significant increase of the mass median diameters of the droplet mode. Important local and/or regional sources of As, Br, I, K, Mn, Pb, Sb, Si and Zn were found to exist in southern Norway. The existence of these sources was reflected in the corresponding size distributions determined, and was utilized in the development of a source identification method based on size distribution data. On the Finnish south coast, atmospheric coarse particle nitrate was found to be formed mostly through an atmospheric reaction of nitric acid with existing coarse particle sea salt but reactions and/or adsorption of nitric acid with soil derived particles also occurred. Chloride was depleted when acidic species reacted
A comprehensive study of distribution laws for the fragments of Košice meteorite
Gritsevich, Maria; Kohout, Tomáš; Tóth, Juraj; Peltoniemi, Jouni; Turchak, Leonid; Virtanen, Jenni
2014-01-01
In this study, we conduct a detailed analysis of the Košice meteorite fall (February 28, 2010), in order to derive a reliable law describing the mass distribution among the recovered fragments. In total, 218 fragments of the Košice meteorite, with a total mass of 11.285 kg, were analyzed. Bimodal Weibull, bimodal Grady and bimodal lognormal distributions are found to be the most appropriate for describing the Košice fragmentation process. Based on the assumption of bimodal lognormal, bimodal Grady, bimodal sequential and bimodal Weibull fragmentation distributions, we suggest that, prior to further extensive fragmentation in the lower atmosphere, the Košice meteoroid was initially represented by two independent pieces with cumulative residual masses of approximately 2 kg and 9 kg respectively. The smaller piece produced about 2 kg of multiple lightweight meteorite fragments with the mean around 12 g. The larger one resulted in 9 kg of meteorite fragments, recovered on the ground, including the...
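A bimodal Weibull mass distribution of the kind fitted above can be sketched as a two-component mixture CDF. The weight, shape, and scale parameters below are placeholders for illustration, not the fitted Košice values.

```python
import math

def weibull_cdf(m, shape, scale):
    """Single-mode Weibull CDF: F(m) = 1 - exp(-(m/scale)**shape), m >= 0."""
    return 1.0 - math.exp(-((m / scale) ** shape))

def bimodal_weibull_cdf(m, weight, shape1, scale1, shape2, scale2):
    """Two-component Weibull mixture; weight is the mass fraction of the
    first (light-fragment) mode."""
    return (weight * weibull_cdf(m, shape1, scale1)
            + (1.0 - weight) * weibull_cdf(m, shape2, scale2))

# Fraction of fragments below 100 g under illustrative parameters: a light
# mode with scale ~15 g and a heavier mode with scale ~900 g.
F_100g = bimodal_weibull_cdf(100.0, 0.7, 1.2, 15.0, 1.5, 900.0)
```

Fitting such a mixture to the 218 recovered fragment masses (e.g. by maximum likelihood on the corresponding density) is what allows discriminating bimodal Weibull from bimodal Grady or lognormal alternatives.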
Analysis and Synthesis of Distributed Real-Time Embedded Systems
DEFF Research Database (Denmark)
Pop, Paul; Eles, Petru; Peng, Zebo
Embedded computer systems are now everywhere: from alarm clocks to PDAs, from mobile phones to cars, almost all the devices we use are controlled by embedded computers. An important class of embedded computer systems is that of hard real-time systems, which have to fulfill strict timing...... in important reductions of design costs. Analysis and Synthesis of Distributed Real-Time Embedded Systems will be of interest to advanced undergraduates, graduate students, researchers and designers involved in the field of embedded systems....
Electrical Power Distribution and Control Modeling and Analysis
Fu, Johnny S.; Liffring, Mark; Mehdi, Ishaque S.
2001-01-01
This slide presentation reviews the use of Electrical Power Distribution and Control (EPD&C) Modeling and how modeling can support analysis. The presentation discusses using the EASY5 model to simulate and analyze the Space Shuttle Electric Auxiliary Power Unit. Diagrams of the model schematics are included, as well as graphs of the battery cell impedance, hydraulic load dynamics, and EPD&C response to hydraulic load variations.
Synchronization analysis on cascaded multilevel converters with distributed control
Institute of Scientific and Technical Information of China (English)
(no author listed)
2008-01-01
Cascaded multilevel converters built with integrated modules have many advantages, such as increased power density, flexible distributed control, multi-functionality, increased reliability and short design cycles. However, the system performance will be affected by the synchronization errors among the integrated modules. This paper analyzes the impact of three kinds of synchronization errors on the whole system performance, as well as the detailed synchronization implementation. Some valuable conclusions are derived from the theoretical analysis, simulations and experimental results.
Power distribution and performance analysis for wireless communication networks
Zhao, Dongmei
2012-01-01
This book provides an analysis of transmission power and network performance in different wireless communication networks. It presents the latest research and techniques for power and interference control and performance modeling in wireless communication networks with different network topologies, air interfaces, and transmission techniques. While studying the power distributions and resource management, the reader will also learn basic methodology and skills for problem formulation, and can ascertain the complexity of designing radio resource management strategies in modern wireless communicat
Distributional Analysis for Model Predictive Deferrable Load Control
Chen, Niangjun; Gan, Lingwen; Low, Steven H.; Wierman, Adam
2014-01-01
Deferrable load control is essential for handling the uncertainties associated with the increasing penetration of renewable generation. Model predictive control has emerged as an effective approach for deferrable load control, and has received considerable attention. In particular, previous work has analyzed the average-case performance of model predictive deferrable load control. However, to this point, distributional analysis of model predictive deferrable load control has been elusive. In ...
Performance analysis of distributed beamforming in a spectrum sharing system
Yang, Liang
2012-09-01
In this paper, we consider a distributed beamforming scheme (DBF) in a spectrum sharing system where multiple secondary users share the spectrum with the licensed primary users under an interference temperature constraint. We assume that DBF is applied at the secondary users. We first consider optimal beamforming and compare it with the user selection scheme in terms of the outage probability and bit-error rate performance. Since perfect feedback is difficult to obtain, we then investigate a limited feedback DBF scheme and develop an outage probability analysis for a random vector quantization (RVQ) design algorithm. Numerical results are provided to illustrate our mathematical formalism and verify our analysis. © 2012 IEEE.
Automatic analysis of attack data from distributed honeypot network
Safarik, Jakub; Voznak, Miroslav; Rezac, Filip; Partila, Pavol; Tomala, Karel
2013-05-01
There are many ways of getting real data about malicious activity in a network. One of them relies on masquerading monitoring servers as production ones. These servers are called honeypots, and data about attacks on them bring us valuable information about actual attacks and techniques used by hackers. The article describes a distributed topology of honeypots, which was developed with a strong orientation toward monitoring of IP telephony traffic. IP telephony servers can be easily exposed to various types of attacks, and without protection, this situation can lead to loss of money and other unpleasant consequences. Using a distributed topology with honeypots placed in different geographical locations and networks provides more valuable and independent results. With an automatic system for gathering information from all honeypots, it is possible to work with all information at one centralized point. Communication between the honeypots and the centralized data store uses secure SSH tunnels, and the server communicates only with authorized honeypots. The centralized server also automatically analyses the data from each honeypot. The results of this analysis, together with other statistical data about malicious activity, are easily accessible through a built-in web server. All statistical and analysis reports serve as the information basis for an algorithm which classifies the different types of VoIP attacks used. The web interface then provides a tool for quick comparison and evaluation of actual attacks in all monitored networks. The article describes both the honeypot nodes in the distributed architecture, which monitor suspicious activity, and the methods and algorithms used on the server side for analysis of the gathered data.
Human leptospirosis distribution pattern analysis in Hulu Langat, Selangor
Zulkifli, Zuhafiza; Shariff, Abdul Rashid Mohamed; Tarmidi, Zakri M.
2016-06-01
This paper discusses the distribution pattern of human leptospirosis in the Hulu Langat District, Selangor, Malaysia. The data used in this study are leptospirosis case reports and spatial boundaries. Leptospirosis case data were collected from the Health Office of Hulu Langat, and spatial boundaries, including lot and district boundaries, were collected from the Department of Survey and Mapping Malaysia (JUPEM). A total of 599 leptospirosis cases were reported in 2013, and these data were mapped based on the addresses provided in the case reports. This study uses three statistical methods to analyze the distribution pattern: Moran's I, average nearest neighbor (ANN) and kernel density estimation. The analysis was used to determine the spatial distribution and the average distance of leptospirosis cases and to locate the hotspot locations. The Moran's I analysis, with a value of -0.202816, indicates that negative spatial autocorrelation exists among leptospirosis cases. The ANN analysis indicates the cases are in a cluster pattern, with an average nearest neighbor statistic of -21.80. The results also show that the hotspots have been identified and mapped in the Hulu Langat District.
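The average nearest neighbor statistic used above compares the observed mean nearest-neighbor distance to the value expected under complete spatial randomness. A brute-force sketch, assuming a study area of known size; the coordinates are toy data, not the Hulu Langat cases.

```python
import math

def mean_nn_distance(points):
    """Average distance from each point to its nearest neighbour (O(n^2) scan)."""
    total = 0.0
    for i, (xi, yi) in enumerate(points):
        total += min(math.hypot(xi - xj, yi - yj)
                     for j, (xj, yj) in enumerate(points) if j != i)
    return total / len(points)

def ann_ratio(points, area):
    """ANN ratio: observed / expected mean NN distance, where the expected
    value under randomness is 0.5 / sqrt(n / area).
    Ratios below 1 suggest clustering, above 1 dispersion."""
    expected = 0.5 / math.sqrt(len(points) / area)
    return mean_nn_distance(points) / expected

clustered = [(0.0, 0.0), (0.01, 0.0), (0.0, 0.01), (0.01, 0.01)]
dispersed = [(0.0, 0.0), (0.0, 1.0), (1.0, 0.0), (1.0, 1.0)]
```

GIS packages additionally report a z-score for the ratio, which is how a strongly negative test value accompanies a clustered pattern.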
Distributed and interactive visual analysis of omics data.
Farag, Yehia; Berven, Frode S; Jonassen, Inge; Petersen, Kjell; Barsnes, Harald
2015-11-01
The amount of publicly shared proteomics data has grown exponentially over the last decade as the solutions for sharing and storing the data have improved. However, the use of the data is often limited by the manner in which it is made available. There are two main approaches: download and inspect the proteomics data locally, or interact with the data via one or more web pages. The first is limited by having to download the data and thus requires local computational skills and resources, while the latter is most often limited in terms of interactivity and the analysis options available. A solution is to develop web-based systems supporting distributed and fully interactive visual analysis of proteomics data. The use of a distributed architecture makes it possible to perform the computational analysis at the server, while the results of the analysis can be displayed via a web browser without the need to download the whole dataset. Here the challenges related to developing such systems for omics data are discussed, especially how this allows for multiple connected interactive visual displays of omics datasets in a web-based setting, and the benefits this provides for computational analysis of proteomics data. This article is part of a Special Issue entitled: Computational Proteomics.
Distributed Robustness Analysis of Interconnected Uncertain Systems Using Chordal Decomposition
DEFF Research Database (Denmark)
Pakazad, Sina Khoshfetrat; Hansson, Anders; Andersen, Martin Skovgaard
2014-01-01
Large-scale interconnected uncertain systems commonly have large state and uncertainty dimensions. Aside from the heavy computational cost of performing robust stability analysis in a centralized manner, privacy requirements in the network can also introduce further issues. In this paper, we...... utilize IQC analysis for analyzing large-scale interconnected uncertain systems and we evade these issues by describing a decomposition scheme that is based on the interconnection structure of the system. This scheme is based on the so-called chordal decomposition and does not add any conservativeness...... to the analysis approach. The decomposed problem can be solved using distributed computational algorithms without the need for a centralized computational unit. We further discuss the merits of the proposed analysis approach using a numerical experiment....
Weibull-Based Parts Failure Analysis Computer Program User’s Manual
1989-01-25
Input data include: the number of failures BETA and ETA are based on (required); 4) estimated value of time to first failure (this value has no effect on the calculations) (optional). Output data: 1) confidence interval (range) for time to first failure; 2) estimated value of time to first failure (same as item 4 above). A quick estimate of time to first failure appears in the sample run (FIGURE 27: CNFIN INTERVAL CALCULATION): NUMBER OF FAILURES BETA AND ETA ARE BASED ON IS: 4; ESTIMATED (CALCULATED) VALUE OF TIME TO FIRST FAILURE IS: 6000.
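The manual's "time to first failure" output can be related to textbook Weibull theory: the minimum of n independent Weibull(β, η) lifetimes is itself Weibull with the same shape β and scale η·n^(-1/β). A sketch under that assumption; this is standard Weibull algebra, not the program's actual algorithm, and the β, η, and unit count below are invented.

```python
import math

def first_failure_quantile(beta, eta, n_units, p=0.5):
    """Quantile of the minimum of n_units iid Weibull(beta, eta) lifetimes.
    The minimum is Weibull with shape beta and scale eta * n_units**(-1/beta)."""
    scale_min = eta * n_units ** (-1.0 / beta)
    return scale_min * (-math.log(1.0 - p)) ** (1.0 / beta)

# Median time to first failure for 10 units with beta = 2, eta = 6000 hours:
t_med = first_failure_quantile(2.0, 6000.0, 10)  # ≈ 1580 hours
```

Evaluating the quantile at p = 0.05 and p = 0.95 brackets a two-sided interval for the first failure time, analogous in spirit to the program's confidence-interval output.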
Discriminating Topology in Galaxy Distributions using Network Analysis
Hong, Sungryong; Dey, Arjun; Barabási, Albert-L.; Vogelsberger, Mark; Hernquist, Lars; Gebhardt, Karl
2016-01-01
(abridged) The large-scale distribution of galaxies is generally analyzed using the two-point correlation function. However, this statistic does not capture the topology of the distribution, and it is necessary to resort to higher-order correlations to break degeneracies. We demonstrate that an alternate approach using network analysis can discriminate between topologically different distributions that have similar two-point correlations. We investigate two galaxy point distributions, one produced by a cosmological simulation and the other by a L\'evy walk. For the cosmological simulation, we adopt the redshift $z = 0.58$ slice from Illustris (Vogelsberger et al. 2014A) and select galaxies with stellar masses greater than $10^8 M_\odot$. The two-point correlation function of these simulated galaxies follows a single power law, $\xi(r) \sim r^{-1.5}$. We then generate L\'evy walks matching the correlation function and abundance of the simulated galaxies. We find that, while the two simulated galaxy point d...
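As a hedged illustration of the network idea in this abstract (not the authors' pipeline), the sketch below builds graphs over two toy 3D point sets using a fixed linking length and compares their degree statistics; the point sets, linking length, and `degree_distribution` helper are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def degree_distribution(points, linking_length):
    """Connect every pair of points closer than the linking length;
    return the degree (neighbor count) of each node."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    adj = (d < linking_length) & (d > 0)  # exclude self-pairs
    return adj.sum(axis=1)

# Two toy point sets: one clustered (two Gaussian clumps), one uniform.
clustered = rng.normal(0.0, 0.1, size=(50, 3)) + rng.integers(0, 2, (50, 1))
uniform = rng.uniform(0.0, 1.0, size=(50, 3))

print("mean degree, clustered:", degree_distribution(clustered, 0.2).mean())
print("mean degree, uniform:  ", degree_distribution(uniform, 0.2).mean())
```

In this toy setup the clustered set has a much higher mean degree; the paper's point is subtler, namely that degree and other network statistics can separate distributions even when their two-point correlations match.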
Mechanical network in titin immunoglobulin from force distribution analysis.
Directory of Open Access Journals (Sweden)
Wolfram Stacklies
2009-03-01
Full Text Available The role of mechanical force in cellular processes is increasingly revealed by single molecule experiments and simulations of force-induced transitions in proteins. How the applied force propagates within proteins determines their mechanical behavior yet remains largely unknown. We present a new method based on molecular dynamics simulations to disclose the distribution of strain in protein structures, here for the newly determined high-resolution crystal structure of I27, a titin immunoglobulin (IG domain. We obtain a sparse, spatially connected, and highly anisotropic mechanical network. This allows us to detect load-bearing motifs composed of interstrand hydrogen bonds and hydrophobic core interactions, including parts distal to the site to which force was applied. The role of the force distribution pattern for mechanical stability is tested by in silico unfolding of I27 mutants. We then compare the observed force pattern to the sparse network of coevolved residues found in this family. We find a remarkable overlap, suggesting the force distribution to reflect constraints for the evolutionary design of mechanical resistance in the IG family. The force distribution analysis provides a molecular interpretation of coevolution and opens the road to the study of the mechanism of signal propagation in proteins in general.
Development of a site analysis tool for distributed wind projects
Energy Technology Data Exchange (ETDEWEB)
Shaw, Shawn [The Cadmus Group, Inc., Waltham MA (United States)
2012-02-28
The Cadmus Group, Inc., in collaboration with the National Renewable Energy Laboratory (NREL) and Encraft, was awarded a grant from the Department of Energy (DOE) to develop a site analysis tool for distributed wind technologies. As the principal investigator for this project, Mr. Shawn Shaw was responsible for overall project management, direction, and technical approach. The product resulting from this project is the Distributed Wind Site Analysis Tool (DSAT), a software tool for analyzing proposed sites for distributed wind technology (DWT) systems. This user-friendly tool supports the long-term growth and stability of the DWT market by providing reliable, realistic estimates of site and system energy output and feasibility. DSAT, which is accessible online and requires no purchase or download of software, is available in two account types. Standard: this free account allows the user to analyze a limited number of sites and to produce a system performance report for each. Professional: for a small annual fee, users can analyze an unlimited number of sites, produce system performance reports, and generate other customizable reports containing key information such as visual influence and wind resources. The tool's interactive maps allow users to create site models that incorporate the obstructions and terrain types present. Users can generate site reports immediately after entering the requisite site information. Ideally, this tool also educates users regarding good site selection and effective evaluation practices.
A New Lifetime Distribution with Bathtub and Unimodal Hazard Function
Barriga, Gladys D. C.; Louzada-Neto, Francisco; Cancho, Vicente G.
2008-11-01
In this paper we propose a new lifetime distribution which accommodates bathtub-shaped, unimodal, increasing, and decreasing hazard functions. Some particular special cases are derived, including the standard Weibull distribution. Maximum likelihood estimation is considered for estimating the three parameters present in the model. The methodology is illustrated with a real data set on industrial devices on a life test.
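Since the proposed distribution includes the standard Weibull as a special case, a minimal sketch of the maximum likelihood step for that special case may help (assuming SciPy is available; the simulated data and parameter values are illustrative, not from the paper):

```python
import numpy as np
from scipy.stats import weibull_min

# Simulate failure times from a two-parameter Weibull (shape k=1.5, scale 100)
rng = np.random.default_rng(42)
data = weibull_min.rvs(1.5, scale=100.0, size=500, random_state=rng)

# Maximum likelihood fit with location fixed at 0 (two-parameter Weibull)
shape, loc, scale = weibull_min.fit(data, floc=0)
print(f"shape k ~ {shape:.2f}, scale ~ {scale:.1f}")
```

With 500 observations the fitted shape and scale land close to the generating values; the paper's three-parameter model would replace this likelihood with its generalized form.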
FIBER ORIENTATION DISTRIBUTION OF PAPER SURFACE CALCULATED BY IMAGE ANALYSIS
Institute of Scientific and Technical Information of China (English)
Toshiharu Enomae; Yoon-Hee Han; Akira Isogai
2004-01-01
Anisotropy is an important parameter of paper structure. An image analysis technique was improved for accurate measurement of fiber orientation in paper surfaces. Image analysis using the Fast Fourier Transform was demonstrated to be an effective means of determining the fiber orientation angle and its intensity. A binarization process for micrograph images of the paper surface and a precise calculation of average Fourier coefficients as an angular distribution by interpolation were found to improve the accuracy. This analysis method was applied to digital optical micrographs and scanning electron micrographs of paper. A laboratory handsheet showed a large deviation in the average value of fiber orientation angle, but some kinds of machine-made paper showed an orientation angle of about 90 degrees with very small deviations, as expected. Korean and Japanese paper made in the traditional ways showed characteristics depending on the hand-making process.
Analysis and Comparison of Typical Models within Distribution Network Design
DEFF Research Database (Denmark)
Jørgensen, Hans Jacob; Larsen, Allan; Madsen, Oli B.G.
This paper investigates the characteristics of typical optimisation models within Distribution Network Design. During the paper fourteen models known from the literature will be thoroughly analysed. Through this analysis a schematic approach to categorisation of distribution network design models...... are covered in the categorisation include fixed vs. general networks, specialised vs. general nodes, linear vs. nonlinear costs, single vs. multi commodity, uncapacitated vs. capacitated activities, single vs. multi modal and static vs. dynamic. The models examined address both strategic and tactical planning...... for educational purposes. Furthermore, the paper can be seen as a practical introduction to network design modelling as well as being an art manual or recipe when constructing such a model.
Distributed Recursive Least-Squares: Stability and Performance Analysis
Mateos, Gonzalo
2011-01-01
The recursive least-squares (RLS) algorithm has well-documented merits for reducing complexity and storage requirements, when it comes to online estimation of stationary signals as well as for tracking slowly-varying nonstationary processes. In this paper, a distributed recursive least-squares (D-RLS) algorithm is developed for cooperative estimation using ad hoc wireless sensor networks. Distributed iterations are obtained by minimizing a separable reformulation of the exponentially-weighted least-squares cost, using the alternating-minimization algorithm. Sensors carry out reduced-complexity tasks locally, and exchange messages with one-hop neighbors to consent on the network-wide estimates adaptively. A steady-state mean-square error (MSE) performance analysis of D-RLS is conducted, by studying a stochastically-driven `averaged' system that approximates the D-RLS dynamics asymptotically in time. For sensor observations that are linearly related to the time-invariant parameter vector sought, the simplifying...
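The single-agent form of the exponentially-weighted RLS recursion that D-RLS decomposes across sensors can be sketched as follows (a textbook sketch, not the paper's distributed algorithm; the forgetting factor and test signal are illustrative):

```python
import numpy as np

def rls(X, y, lam=0.98, delta=100.0):
    """Exponentially-weighted recursive least squares (single-agent form).

    lam: forgetting factor in (0, 1]; delta: initial inverse-covariance scale."""
    n = X.shape[1]
    w = np.zeros(n)
    P = delta * np.eye(n)           # running inverse of the weighted covariance
    for x, d in zip(X, y):
        Px = P @ x
        k = Px / (lam + x @ Px)     # gain vector
        w = w + k * (d - x @ w)     # correct with the a priori error
        P = (P - np.outer(k, Px)) / lam
    return w

rng = np.random.default_rng(1)
w_true = np.array([1.0, -2.0, 0.5])
X = rng.normal(size=(2000, 3))
y = X @ w_true + 0.01 * rng.normal(size=2000)
print(rls(X, y))  # close to w_true
```

The distributed version in the paper replaces this centralized recursion with reduced-complexity local updates plus one-hop message exchanges that drive all sensors toward the same estimate.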
Water hammer analysis in a water distribution system
Directory of Open Access Journals (Sweden)
John Twyman
2017-04-01
Full Text Available The solution of water hammer in a water distribution system (WDS) is shown by applying three hybrid methods (HM) based on the Box scheme, McCormack's method, and the Diffusive Scheme. Each HM formulation is reviewed, together with its relative advantages and disadvantages. The analyzed WDS has pipes with different lengths, diameters, and wave speeds, so the Courant number differs in each pipe according to the adopted discretization. The HM results are compared with the results obtained by the Method of Characteristics (MOC). In terms of numerical attenuation, the second-order schemes based on Box and McCormack are more conservative from a numerical point of view, making their application recommendable in the analysis of water hammer in water distribution systems.
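The per-pipe Courant number mentioned above, Cn = a*dt/dx with dx = L/N, can be checked with a few lines (the pipe lengths, wave speeds, and reach counts below are illustrative, not from the paper):

```python
# Courant number per pipe for MOC-style water hammer analysis.
# Cn = a * dt / dx, with dx = L / N reaches; Cn <= 1 is required for
# stability of the explicit schemes, and Cn = 1 avoids interpolation error.
pipes = [
    # (length m, wave speed m/s, number of reaches) -- illustrative values
    (600.0, 1200.0, 12),
    (450.0, 1000.0, 10),
    (300.0,  900.0,  8),
]
dt = 0.04  # common time step shared by all pipes, s

for L, a, N in pipes:
    dx = L / N
    Cn = a * dt / dx
    print(f"L={L:6.0f} m  a={a:5.0f} m/s  dx={dx:5.1f} m  Cn={Cn:.2f}")
```

Because the time step is shared network-wide while dx and a vary per pipe, the Courant numbers differ from pipe to pipe, which is exactly the situation the hybrid methods are designed to handle.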
DIRAC - The Distributed MC Production and Analysis for LHCb
Tsaregorodtsev, A; Closier, J; Frank, M; Garonne, V; Witek, M; Romanovski, V; Egede, U; Vagnoni, V; Korolko, I; Blouw, J; Kuznetsov, G; Patrick, G; Gandelman, M; Graciani-Diaz, R; Bernet, R; Brook, N; Pickford, A; Tobin, M; Saroka, A; Stokes-Rees, I; Saborido-Silva, J; Sanchez-Garcia, M
2004-09-30
DIRAC is the LHCb distributed computing grid infrastructure for Monte Carlo (MC) production and analysis. Its architecture is based on a set of distributed collaborating services. The service decomposition broadly follows the CERN/ARDA-RTAG proposal, which should allow for the interchange of the EGEE/gLite and DIRAC components. In this paper we give an overview of the DIRAC architecture, as well as the main design choices in its implementation. The light nature and modular design of the DIRAC components allows its functionality to be easily extended to include new computing and storage elements or to handle new types of tasks. The DIRAC system already uses different types of computing resources - from single PC's to a variety of batch systems and to the Grid environment. In particular, the DIRAC interface to the LCG2 grid will be presented.
DIRAC The Distributed MC Production and Analysis for LHCb
Bernet, R; Brook, N; Charpentier, P; Closier, J; Egede, U; Frank, M; Gandelman, M; Garonne, V; Graciani-Díaz, R; Korolko, I; Kuznetsov, G; Patrick, G; Pickford, A; Romanovski, V G; Saborido-Silva, J J; Sánchez-García, M; Saroka, A; Stokes-Rees, I; Tobin, M; Tsaregorodtsev, A Yu; Vagnoni, V; Witek, M
2005-01-01
DIRAC is the LHCb distributed computing grid infrastructure for MC production and analysis. Its architecture is based on a set of distributed collaborating services. The service decomposition broadly follows the CERN/ARDA-RTAG proposal, which can eventually make possible the interchange of the EGEE/gLite and DIRAC components. In this paper we give an overview of the DIRAC architecture, as well as the main design choices in its implementation. The light nature and modular design of the DIRAC components allows its functionality to be easily extended to include new computing and storage elements or to handle new types of tasks. The DIRAC system already uses different types of computing resources - from single PC's to a variety of batch systems and to the Grid environment. In particular, the DIRAC interface to the LCG2 grid will be presented.
Time series power flow analysis for distribution connected PV generation.
Energy Technology Data Exchange (ETDEWEB)
Broderick, Robert Joseph; Quiroz, Jimmy Edward; Ellis, Abraham; Reno, Matthew J.; Smith, Jeff; Dugan, Roger
2013-01-01
Distributed photovoltaic (PV) projects must go through an interconnection study process before connecting to the distribution grid. These studies are intended to identify the likely impacts and mitigation alternatives. In the majority of the cases, system impacts can be ruled out or mitigation can be identified without an involved study, through a screening process or a simple supplemental review study. For some proposed projects, expensive and time-consuming interconnection studies are required. The challenges to performing the studies are twofold. First, every study scenario is potentially unique, as the studies are often highly specific to the amount of PV generation capacity that varies greatly from feeder to feeder and is often unevenly distributed along the same feeder. This can cause location-specific impacts and mitigations. The second challenge is the inherent variability in PV power output which can interact with feeder operation in complex ways, by affecting the operation of voltage regulation and protection devices. The typical simulation tools and methods in use today for distribution system planning are often not adequate to accurately assess these potential impacts. This report demonstrates how quasi-static time series (QSTS) simulation and high time-resolution data can be used to assess the potential impacts in a more comprehensive manner. The QSTS simulations are applied to a set of sample feeders with high PV deployment to illustrate the usefulness of the approach. The report describes methods that can help determine how PV affects distribution system operations. The simulation results are focused on enhancing the understanding of the underlying technical issues. The examples also highlight the steps needed to perform QSTS simulation and describe the data needed to drive the simulations. The goal of this report is to make the methodology of time series power flow analysis readily accessible to utilities and others responsible for evaluating
The CMS Analysis Chain In a Distributed Environment
De Filippis, N; Barrass, T; Bonacorsi, D; Corvo, M; Ciraolo, G; Innocente, V; Donvito, G; Maggi, M; Pierro, A; Silvestris, L; Faina, L; Spiga, D; Fanfani, A; Fanzago, F; Grandi, C; Lacaprara, S; Taylor, L; Tuura, L; Wildish, T
2006-01-01
The CMS collaboration is undertaking a major effort to define the analysis model and to develop software tools with the purpose of analyzing several million simulated and real data events by a large number of people in many geographically distributed sites. From the computing point of view, one of the most complex issues in remote analysis is data discovery and access. Software tools were developed in order to move data, make them available to the full international community, and validate them for the subsequent analysis. The batch analysis processing is performed with purpose-built workload management tools, which are mainly responsible for job preparation and job submission. Job monitoring and output management are implemented as the last part of the analysis chain. Grid tools provided by the LCG project are evaluated to gain access to the data and the resources, providing a user-friendly interface to the physicists submitting the analysis jobs. An overview of the ...
Spatial analysis of snail distribution in Jiangning county
Institute of Scientific and Technical Information of China (English)
ZHANG Zhi-ying; ZHOU Yun; XU De-zhong; SUN Zhi-dong; ZHOU Xiao-nong; GONG Zi-li
2002-01-01
Objective: To explore the spatial distribution of Oncomelania snails in Jiangning County. Methods: Cluster analysis and spatial scan statistics were performed based on the density of living snails in habitats and the rate of infection by S. japonicum. Results: Although the areas of snail habitats and the density of living snails in marshland in 2000 were significantly higher than those in mountainous areas of Jiangning County, there were more habitats in the mountains than in the marshland, and they were distributed sporadically. The snail habitats were classified into 4 classes in marshlands and 3 classes in mountainous areas by cluster analysis. Although most habitats had a low density of living and infected snails, it should be noted that there were also some habitats with high snail density and infection rates, which is important for the transmission of schistosomiasis. The spatial scan statistics detected 2 significant spatial aggregations of living snails in marshland and 4 in mountainous areas, with p-values less than 0.01. There were also 2 significant spatial aggregations of infected snails in marshland. Conclusion: The significant spatial aggregations of living and infected snails indicate that some factors in these habitats are suitable for the survival of snails and the transmission of schistosomiasis.
DIANE - Distributed analysis environment for GRID-enabled simulation and analysis of physics data
Moscicki, Jakub T
2003-01-01
Distributed ANalysis Environment (DIANE) is the result of R&D in the CERN IT Division focused on interfacing semi-interactive parallel applications with distributed GRID technology. DIANE provides a master-worker workflow management layer above low-level GRID services. DIANE is application- and language-neutral. The component-container architecture and component adapters provide the flexibility necessary to fulfill the diverse requirements of distributed applications. The Physical Transport Layer assures interoperability with existing middleware frameworks based on Web Services. Several distributed simulations based on Geant4 were deployed and tested in real-life scenarios with DIANE.
Numerical analysis of decoy state quantum key distribution protocols
Energy Technology Data Exchange (ETDEWEB)
Harrington, Jim W [Los Alamos National Laboratory; Rice, Patrick R [Los Alamos National Laboratory
2008-01-01
Decoy state protocols are a useful tool for many quantum key distribution systems implemented with weak coherent pulses, allowing significantly better secret bit rates and longer maximum distances. In this paper we present a method to numerically find optimal three-level protocols, and we examine how the secret bit rate and the optimized parameters are dependent on various system properties, such as session length, transmission loss, and visibility. Additionally, we show how to modify the decoy state analysis to handle partially distinguishable decoy states as well as uncertainty in the prepared intensities.
EST analysis pipeline: use of distributed computing resources.
González, Francisco Javier; Vizcaíno, Juan Antonio
2011-01-01
This chapter describes how a pipeline for the analysis of expressed sequence tag (EST) data can be implemented, based on our previous experience generating ESTs from Trichoderma spp. We focus on key steps in the workflow, such as the processing of raw data from the sequencers, the clustering of ESTs, and the functional annotation of the sequences using BLAST, InterProScan, and BLAST2GO. Some of the steps require intensive computing power. Since these resources are not available to small research groups or institutes without bioinformatics support, an alternative is described: the use of distributed computing resources (local grids and Amazon EC2).
The analysis and distribution of mescaline in postmortem tissues.
Henry, Joni L; Epley, Jahna; Rohrig, Timothy P
2003-09-01
Mescaline (3,4,5-trimethoxyphenethylamine) is a hallucinogenic alkaloid found in the peyote cactus. This report documents mescaline distribution in a death caused by multiple gunshot wounds. Mescaline was extracted with a butyl chloride liquid-liquid method and identified by mass spectrometry. Quantitative analysis was performed by gas chromatography using a nitrogen-phosphorus detector. Concentrations of the drug were 2.95 mg/L, 2.36 mg/L, 8.2 mg/kg, and 2.2 mg/kg in blood, vitreous, liver, and brain, respectively.
Cloud for Distributed Data Analysis Based on the Actor Model
Directory of Open Access Journals (Sweden)
Ivan Kholod
2016-01-01
Full Text Available This paper describes the construction of a Cloud for Distributed Data Analysis (CDDA) based on the actor model. The design maps data mining algorithms onto decomposed functional blocks, which are assigned to actors. Using actors allows users to move the computation close to the stored data. The process does not require loading data sets into the cloud and allows users to analyze confidential information locally. The results of experiments show that the proposed approach outperforms established solutions in efficiency.
Aeroelastic Analysis of a Distributed Electric Propulsion Wing
Massey, Steven J.; Stanford, Bret K.; Wieseman, Carol D.; Heeg, Jennifer
2017-01-01
An aeroelastic analysis of a prototype distributed electric propulsion wing is presented. Results using MSC Nastran (Registered Trademark) doublet lattice aerodynamics are compared to those based on FUN3D Reynolds Averaged Navier- Stokes aerodynamics. Four levels of grid refinement were examined for the FUN3D solutions and solutions were seen to be well converged. It was found that no oscillatory instability existed, only that of divergence, which occurred in the first bending mode at a dynamic pressure of over three times the flutter clearance condition.
Govoni, A.; Lomax, A.; Michelini, A.
The Web now provides a single, universal infrastructure for developing client/server data access applications, and the seismology community can greatly benefit from this situation, both for routine observatory data analysis and for research purposes. The Web has reduced the myriad of platforms and technologies used to handle and exchange data to a single user interface (HTML), a single client platform (the Web browser), a single network protocol (HTTP), and a single server platform (the Web server). In this context we have designed a system that, taking advantage of the latest developments in client/server data access technologies based on JAVA, JAVA RMI and XML, may solve the most common problems in data access and manipulation commonly experienced in the seismological community. Key concepts in this design are a thin-client approach, minimum standards for data exchange, and distributed computing. Thin client means that any PC with a JAVA-enabled Web browser can interact with a set of remote data servers distributed in the world computer network, querying for data and for services. Minimum standards relates to the language needed for client/server interaction, which must be abstract enough that not everybody needs to know all the details of the transaction; this is solved by XML. Distribution means that a set of servers is able to provide to the client not only a data object (the actual data and the methods to work on it) but also the computing power to perform a particular task (a remote method in the JAVA RMI context), limiting the exchange of data to the results. This allows for client interaction even in situations of very limited communication bandwidth. We also describe in detail the implementation of the main modules of the toolkit: a data eater module that gathers/archives seismological data from a variety of sources, ranging from portable digitizer data to real-time network data, and a picking/location server that allows for multi-user Web-based analysis of
Distribution of Deformation on Cyprus, Inferences from Morphotectonic Analysis
Altinbas, Cevza; Yildirim, Cengiz; Tuysuz, Okan; Melnick, Daniel
2016-04-01
Cyprus is located on the subduction zone between the African and Anatolian Plates. The topography of the island is a result of distributed deformation associated with subduction-related processes south of the Central Anatolian Plateau. The Troodos and Kyrenia mountains are major morphotectonic units integrally tied to plate boundary deformation. To elucidate the mode and pattern of active deformation and the possible effects of subduction-related processes on topography, we integrated morphometric and topographical analyses across the island. Our regional morphometric analysis relies on topographical swath profiles and topographic residuals to identify regional topographic anomalies, as well as on steepness and concavity values of longitudinal river profiles that may reflect ongoing uplift. Accordingly, our swath profiles indicate an asymmetric topography across the Troodos Massif and Kyrenia Range. The south of the Troodos Massif shows relatively less dissected surfaces that are partly associated with Quaternary marine terraces. Our topographic residual analysis also indicates strong relief asymmetry on the Troodos Massif that might be related to the Arakapas Fault and the lithological contact between Neogene and Pre-Neogene rocks. In the north of the island the Kyrenia Range is characterized by a narrow, steep and long range that is delimited by the Ovgos Fault in the south. Our swath profiles across the range also display strong southward asymmetry: the southern flank is steeper than the northern flank. The steepness index values of the rivers on the southern flank of the Kyrenia Range do not give a strong signal along the Ovgos Fault. Nevertheless, longitudinal profiles of rivers reveal evident deviations from degraded river profiles on the northern flank. Together with the presence of uplifted marine terraces along the northern flank, this might indicate the presence of onshore structure(s) responsible for coastal uplift or regional uplift of the island because of
Cartographic system for spatial distribution analysis of corneal endothelial cells.
Corkidi, G; Márquez, J; García-Ruiz, M; Díaz-Cintra, S; Graue, E
1994-07-01
A combined cartographic and morphometric endothelium analyser has been developed by integrating the HISTO 2000 histological imaging and analysis system with a prototype human corneal endothelium analyser. The complete system allows the elaboration and analysis of cartographies of corneal endothelial tissue, and hence the in vitro study of the spatial distribution of corneal endothelial cells, according to their regional morphometric characteristics (cell size and polygonality). The global cartographic reconstruction is obtained by sequential integration of the data analysed for each microscopic field. Subsequently, the location of each microscopically analysed field is referred to its real position on the histologic preparation by means of X-Y co-ordinates; both are provided by micrometric optoelectronic sensors installed on the optical microscope stage. Some cartographies of an excised human corneal keratoconus button in vitro are also presented. These cartographic images allow a macroscopic view of endothelial cells analysed microscopically. Parametric colour images show the spatial distribution of endothelial cells, according to their specific morphometric parameters, and exhibit the variability in size and cellular shape which depend on the analysed area.
Data Intensive High Energy Physics Analysis in a Distributed Cloud
Sobie, R J; Anderson, M; Armstrong, P; Fransham, K; Gable, I; Harris, D; Leavett-Brown, C; Paterson, M; Penfold-Brown, D; Vliet, M; Charbonneau, A; Impey, R; Podaima, W
2011-01-01
We show that distributed Infrastructure-as-a-Service (IaaS) compute clouds can be effectively used for the analysis of high energy physics data. We have designed a distributed cloud system that works with any application using large input data sets requiring a high throughput computing environment. The system uses IaaS-enabled science and commercial clusters in Canada and the United States. We describe the process in which a user prepares an analysis virtual machine (VM) and submits batch jobs to a central scheduler. The system boots the user-specific VM on one of the IaaS clouds, runs the jobs and returns the output to the user. The user application accesses a central database for calibration data during the execution of the application. Similarly, the data is located in a central location and streamed by the running application. The system can easily run one hundred simultaneous jobs in an efficient manner and should scale to many hundreds and possibly thousands of user jobs.
Implementation of force distribution analysis for molecular dynamics simulations
Directory of Open Access Journals (Sweden)
Seifert Christian
2011-04-01
Full Text Available Abstract Background The way mechanical stress is distributed inside and propagated by proteins and other biopolymers largely defines their function. Yet, determining the network of interactions propagating internal strain remains a challenge for both experiment and theory. Based on molecular dynamics simulations, we developed force distribution analysis (FDA), a method that allows visualizing strain propagation in macromolecules. Results To be immediately applicable to a wide range of systems, FDA was implemented as an extension to Gromacs, a commonly used package for molecular simulations. The FDA code comes with an easy-to-use command line interface and can directly be applied to every system built using Gromacs. We provide an additional R-package providing functions for advanced statistical analysis and presentation of the FDA data. Conclusions Using FDA, we were able to explain the origin of mechanical robustness in immunoglobulin domains and silk fibers. By elucidating propagation of internal strain upon ligand binding, we previously also successfully revealed the functionality of a stiff allosteric protein. FDA thus has the potential to be a valuable tool in the investigation and rational design of mechanical properties in proteins and nano-materials.
Specimen type and size effects on lithium hydride tensile strength distributions
Energy Technology Data Exchange (ETDEWEB)
Oakes, Jr, R E
1991-12-01
Weibull's two-parameter statistical distribution function is used to account for the effects of specimen size and loading differences on the strength distributions of lithium hydride. Three distinctly differing uniaxial specimen types (an elliptical-transition pure tensile specimen, an internally pressurized ring tensile specimen, and two sizes of four-point-flexure specimens) are shown to provide different strength distributions, as expected because of their differing sizes and modes of loading. After separation of strengths into volumetric- and surface-initiated failure distributions, the Weibull characteristic strength parameters for the higher-strength tests associated with internal fracture initiation are shown to vary as predicted by the effective-volume Weibull relationship. Lower-strength results correlate with the effective area to a much lesser degree, probably because of the limited number of surface-related failures and the different machining methods used to prepare the specimens. The strength distribution from the fourth specimen type, the predominantly equibiaxially stressed disk-flexure specimen, is well below that predicted by the two-parameter Weibull-derived effective volume or surface area relations; the two-parameter Weibull model cannot account for the increased failure probability associated with multiaxial stress fields. Derivations of effective volume and area relationships for those specimens for which none were found in the literature (the elliptical-transition tensile, the ring tensile, and the disk flexure, including the outer region) are also included.
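The effective-volume Weibull relationship used above implies, for equal failure probability, a characteristic strength scaling between two specimen volumes. A minimal sketch, with an assumed Weibull modulus and illustrative strengths rather than the paper's data:

```python
# Two-parameter Weibull size effect: equal failure probability for
# effective volumes V1 and V2 implies
#   sigma2 = sigma1 * (V1 / V2) ** (1 / m)
# where m is the Weibull modulus. All values below are illustrative.
def scaled_strength(sigma1, V1, V2, m):
    """Characteristic strength of a specimen with effective volume V2,
    given strength sigma1 at effective volume V1."""
    return sigma1 * (V1 / V2) ** (1.0 / m)

m = 10.0        # assumed Weibull modulus
sigma1 = 120.0  # assumed characteristic strength of the small specimen (MPa)
print(scaled_strength(sigma1, V1=1.0, V2=8.0, m=m))  # larger specimen is weaker
```

The same form with effective surface area in place of volume covers the surface-initiated failures discussed in the abstract; neither form captures the multiaxial-stress effect noted for the disk-flexure specimen.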
Evaluation of Distribution Analysis Software for DER Applications
Energy Technology Data Exchange (ETDEWEB)
Staunton, RH
2003-01-23
unstoppable. In response, energy providers will be forced to both fully acknowledge the trend and plan for accommodating DER [3]. With bureaucratic barriers [4], lack of time/resources, tariffs, etc. still seen in certain regions of the country, changes still need to be made. Given continued technical advances in DER, the time is fast approaching when the industry, nation-wide, must not only accept DER freely but also provide or review in-depth technical assessments of how DER should be integrated into and managed throughout the distribution system. Characterization studies are needed to fully understand how both the utility system and DER devices themselves will respond to all reasonable events (e.g., grid disturbances, faults, rapid growth, diverse and multiple DER systems, large reactive loads). Some of this work has already begun as it relates to operation and control of DER [5] and microturbine performance characterization [6,7]. One of the most urgently needed tools that can provide these types of analyses is a distribution network analysis program in combination with models for various DER. Together, they can be used for (1) analyzing DER placement in distribution networks and (2) helping to ensure that adequate transmission reliability is maintained. Surveys of the market show products that represent a partial match to these needs; specifically, software that has been developed to plan electrical distribution systems and analyze reliability (in a near total absence of DER). The first part of this study (Sections 2 and 3 of the report) looks at a number of these software programs and provides both summary descriptions and comparisons. The second part of this study (Section 4 of the report) considers the suitability of these analysis tools for DER studies. It considers steady state modeling and assessment work performed by ORNL using one commercially available tool on feeder data provided by a southern utility. 
Appendix A provides a technical report on the results of
DEFF Research Database (Denmark)
Mihet-Popa, Lucian; Groza, Voicu; Isleifsson, Fridrik Rafn
2012-01-01
Experimental Testing for Stability Analysis of Distributed Energy Resources Components with Storage Devices and Loads
Rafal Podlaski; Francis A. Roesch
2013-01-01
This study assessed the usefulness of various methods for choosing the initial values for the numerical procedures that estimate the parameters of mixture distributions, and analysed a variety of mixture models for approximating empirical diameter at breast height (dbh) distributions. Two-component mixtures of either the Weibull distribution or the gamma distribution were...
Directory of Open Access Journals (Sweden)
Justyna Kobryń
2017-01-01
Full Text Available Escin, a triterpenoid saponin complex of biological origin, exhibits significant clinical activity in chronic venous insufficiency, skin inflammation, epidermal abrasions, allergic dermatitis, and acute impact injuries, especially in topical application. The aim of the study is the comparison of various hydrogel formulations as carriers for a horse chestnut seed extract (EH). Methylcellulose (MC), two polyacrylic acid derivatives (PA1 and PA2), and polyacrylate crosspolymer 11 (PC-11) were employed. The release rates of EH were examined and a comparison with the Weibull model equation was performed. Application of MC as the carrier in the hydrogel preparation resulted in a fast release rate of EH, whereas in the case of the hydrogel composed with PC-11 the release was rather prolonged. The applied Weibull function adhered best to the experimental data. Based on the evaluated shape parameter β in the Weibull equation, the systems under study released the active compound according to Fickian diffusion.
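As a sketch of how a release profile is compared against the Weibull equation F(t) = 1 − exp(−(t/τ)^β), the fragment below fits the two parameters to a hypothetical release curve; the data points are illustrative, not taken from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_release(t, beta, tau):
    """Cumulative fraction of drug released at time t (Weibull model)."""
    return 1.0 - np.exp(-(t / tau) ** beta)

# Hypothetical release profile: time (h) vs. fraction of EH released
t = np.array([0.5, 1.0, 2.0, 4.0, 6.0, 8.0])
f = np.array([0.17, 0.28, 0.44, 0.63, 0.75, 0.83])

(beta, tau), _ = curve_fit(weibull_release, t, f, p0=(1.0, 3.0))
print(f"shape beta = {beta:.2f}, scale tau = {tau:.2f} h")
```

A fitted β below roughly 0.75 is conventionally interpreted as Fickian-diffusion-controlled release, which is the reading the study applies.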
Lacunarity and multifractal analysis of the large DLA mass distribution
Rodriguez-Romo, Suemi; Sosa-Herrera, Antonio
2013-08-01
We show the methodology used to analyze fractal and mass-multifractal properties of very large Diffusion-Limited Aggregation (DLA) clusters with a maximum of 10^9 particles for 2D aggregates and 10^8 particles for 3D clusters, to support our main result: the scaling behavior obtained in our experimental results corresponds to the expected behavior of monofractal objects. In order to estimate lacunarity measures for large DLA clusters, we develop a variant of the gliding-box algorithm which reduces the computer time needed to obtain experimental results. We show how our mass-multifractal data tend to present monofractal behavior for the mass distribution of the cases presented in this paper in the limit of very large clusters. Lacunarity analysis of small cluster mass distributions yields data which might be interpreted as two different values of the fractal dimension while the cluster grows; however, this effect tends to vanish as the cluster size increases further, in such a way that monofractality is achieved. The outcomes of this paper lead us to conclude that the previously reported mass-multifractal behavior (Vicsek et al., 1990 [13]) detected for DLA clusters is a consequence of finite-size effects and floating-point precision limitations and not an intrinsic feature of the phenomenon, since the scaling behavior of our DLA clusters corresponds to monofractal objects, a situation remarkably noticeable in the limit of very large clusters.
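The gliding-box lacunarity estimate described above can be sketched as follows. This is a minimal illustration on a random toy grid, not the authors' optimized variant; it uses a summed-area table so each box mass is obtained in constant time, which is one common way to cut the cost of the gliding window.

```python
import numpy as np

def gliding_box_lacunarity(grid, r):
    """Lacunarity of a 2D mass distribution for box size r, via the
    gliding-box algorithm: slide an r x r window over the grid, collect
    the mass M in every position, and return <M^2> / <M>^2."""
    # 2D summed-area table: each box mass is then four lookups
    s = grid.cumsum(axis=0).cumsum(axis=1)
    s = np.pad(s, ((1, 0), (1, 0)))
    masses = s[r:, r:] - s[:-r, r:] - s[r:, :-r] + s[:-r, :-r]
    m1 = masses.mean()
    m2 = (masses ** 2).mean()
    return m2 / (m1 ** 2)

rng = np.random.default_rng(0)
cluster = (rng.random((256, 256)) < 0.1).astype(float)  # toy aggregate
for r in (2, 8, 32):
    print(r, gliding_box_lacunarity(cluster, r))
```

For a homogeneous random field like this toy grid, lacunarity decays toward 1 as the box size grows; fractal sets such as DLA clusters decay more slowly, which is what the lacunarity curves diagnose.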
A meta-analysis of parton distribution functions
Gao, Jun; Nadolsky, Pavel
2014-07-01
A "meta-analysis" is a method for comparison and combination of nonperturbative parton distribution functions (PDFs) in a nucleon obtained with heterogeneous procedures and assumptions. Each input parton distribution set is converted into a "meta-parametrization" based on a common functional form. By analyzing parameters of the meta-parametrizations from all input PDF ensembles, a combined PDF ensemble can be produced that has a smaller total number of PDF member sets than the original ensembles. The meta-parametrizations simplify the computation of the PDF uncertainty in theoretical predictions and provide an alternative to the 2010 PDF4LHC convention for combination of PDF uncertainties. As a practical example, we construct a META ensemble for computation of QCD observables at the Large Hadron Collider using the next-to-next-to-leading order PDF sets from CTEQ, MSTW, and NNPDF groups as the input. The META ensemble includes a central set that reproduces the average of LHC predictions based on the three input PDF ensembles and Hessian eigenvector sets for computing the combined PDF+α s uncertainty at a common QCD coupling strength of 0.118.
A meta-analysis of parton distribution functions
Gao, Jun
2014-01-01
A "meta-analysis" is a method for comparison and combination of nonperturbative parton distribution functions (PDFs) in a nucleon obtained with heterogeneous procedures and assumptions. Each input parton distribution set is converted into a "meta-parametrization" based on a common functional form. By analyzing parameters of the meta-parametrizations from all input PDF ensembles, a combined PDF ensemble can be produced that has a smaller total number of PDF member sets than the original ensembles. The meta-parametrizations simplify the computation of the PDF uncertainty in theoretical predictions and provide an alternative to the 2010 PDF4LHC convention for combination of PDF uncertainties. As a practical example, we construct a META ensemble for computation of QCD observables at the Large Hadron Collider using the next-to-next-to-leading order PDF sets from CTEQ, MSTW, and NNPDF groups as the input. The META ensemble includes a central set that reproduces the average of LHC predictions based on the three inpu...
Modelling and analysis of solar cell efficiency distributions
Wasmer, Sven; Greulich, Johannes
2017-08-01
We present an approach to model the distribution of solar cell efficiencies achieved in production lines based on numerical simulations, metamodeling and Monte Carlo simulations. We validate our methodology using the example of an industrially feasible p-type multicrystalline silicon “passivated emitter and rear cell” process. Applying the metamodel, we investigate the impact of each input parameter on the distribution of cell efficiencies in a variance-based sensitivity analysis, identifying the parameters and processes that need to be improved and controlled most accurately. We show that if these could be optimized, the mean cell efficiency of our examined cell process would increase from 17.62% ± 0.41% to 18.48% ± 0.09%. As the method relies on advanced characterization and simulation techniques, we furthermore introduce a simplification that enhances applicability by only requiring two common measurements of finished cells. The presented approaches can be especially helpful for ramping up production, but can also be applied to enhance established manufacturing.
RELIABILITY ANALYSIS OF RING, AGENT AND CLUSTER BASED DISTRIBUTED SYSTEMS
Directory of Open Access Journals (Sweden)
R.SEETHALAKSHMI
2011-08-01
Full Text Available The introduction of pervasive and mobile devices has led to immense growth of real-time distributed processing. In such a context, the reliability of the computing environment is very important. Reliability is the probability that the devices, links, processes, programs and files work efficiently for the specified period of time and under the specified conditions. Distributed systems are available as conventional ring networks, clusters and agent-based systems; this work focuses on the reliability of such systems. These networks are heterogeneous and scalable in nature. Several factors must be considered for reliability estimation. These include application-related factors such as algorithms, data-set sizes, memory usage patterns, input-output, communication patterns, task granularity and load balancing. They also include hardware-related factors such as processor architecture, memory hierarchy, input-output configuration and network, and software-related factors such as operating systems, compilers, communication protocols, libraries and preprocessor performance. In estimating the reliability of a system, performance estimation is an important aspect. Reliability analysis is approached using probability.
A Distributed Flocking Approach for Information Stream Clustering Analysis
Energy Technology Data Exchange (ETDEWEB)
Cui, Xiaohui [ORNL; Potok, Thomas E [ORNL
2006-01-01
Intelligence analysts are currently overwhelmed with the amount of information streams generated every day, and there is a lack of comprehensive tools that can analyze information streams in real time. Document clustering analysis plays an important role in improving the accuracy of information retrieval. However, most clustering technologies can only be applied to static document collections because they normally require a large amount of computational resources and a long time to obtain accurate results. It is very difficult to cluster a dynamically changing text information stream on an individual computer. Our early research resulted in a dynamic reactive flock clustering algorithm which can continually refine the clustering result and quickly react to changes in document contents. This characteristic makes the algorithm suitable for cluster analysis of dynamically changing document information, such as text information streams. Because of the decentralized character of this algorithm, a distributed approach is a natural way to increase its clustering speed. In this paper, we present a distributed multi-agent flocking approach to text information stream clustering and discuss the decentralized architectures and communication schemes for load balancing and status-information synchronization in this approach.
Single and Joint Multifractal Analysis of Soil Particle Size Distributions
Institute of Scientific and Technical Information of China (English)
LI Yi; LI Min; R.HORTON
2011-01-01
It is noted that there has been little research to compare volume-based and number-based soil particle size distributions (PSDs). Our objectives were to characterize the scaling properties of, and the possible connections between, volume-based and number-based PSDs by applying single and joint multifractal analysis. Twelve soil samples were taken from selected sites in Northwest China and their PSDs were analyzed using laser diffractometry. The results indicated that the volume-based PSDs of all 12 samples and the number-based PSDs of 4 samples had multifractal scaling for moment orders -6 < q < 6. Some empirical relationships were identified between the extreme probability values, maximum probability (Pmax), minimum probability (Pmin), and Pmax/Pmin, and the multifractal indices: the difference and the ratio of generalized dimensions at q=0 and 1 (D0-D1 and D1/D0), maximum and minimum singularity strength (αmax and αmin) and their difference (αmax - αmin, the spectrum width), and the asymmetry index (RD). An increase in Pmax generally resulted in corresponding increases of D0-D1, αmax, αmax - αmin, and RD, which indicated that a large Pmax increased the multifractality of a distribution. Joint multifractal analysis showed that there was significant correlation between the scaling indices of volume-based and number-based PSDs. The multifractality indices indicated that for a given soil, the volume-based PSD was more homogeneous than the number-based PSD, and more likely to display monofractal rather than multifractal scaling.
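The generalized dimensions D_q underlying indices such as D0-D1 and D1/D0 can be estimated from box probabilities measured at several box sizes. A minimal sketch, with the function and the sanity-check data purely illustrative:

```python
import numpy as np

def generalized_dimension(p, eps, q):
    """Estimate D_q from lists of box probabilities p (one array per box
    size) and box sizes eps, as the least-squares slope of the partition
    function versus log(eps); D_1 uses the entropy (l'Hopital) limit."""
    x = np.log(eps)
    if q == 1:
        y = [np.sum(pi * np.log(pi)) for pi in p]        # negative entropy
    else:
        y = [np.log(np.sum(pi ** q)) / (q - 1) for pi in p]
    slope, _ = np.polyfit(x, y, 1)
    return slope

# Sanity check: a uniform measure on [0, 1] has D_q = 1 for every q
ns = [10, 20, 40, 80]
eps = [1.0 / n for n in ns]
ps = [np.full(n, 1.0 / n) for n in ns]
d0, d1, d2 = (generalized_dimension(ps, eps, q) for q in (0, 1, 2))
print(round(float(d0), 3), round(float(d1), 3), round(float(d2), 3))
```

For a multifractal measure the estimates decrease with q (D0 > D1 > D2), and the spread D0-D1 is exactly the index the study correlates with Pmax.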
Directory of Open Access Journals (Sweden)
Abrahams Mwasha
2012-07-01
Full Text Available The exploitation of the wind energy resource is expected to have a key role in climate change mitigation in the Caribbean region. However, wind energy is also affected by climate change: the availability and reliability of wind energy depend on the climate conditions. An evaluation based on the Weibull distribution model is performed on the average monthly wind speed variation at a standard height of 10 m over a ten-year period in the island of Trinidad. The variations in the power density per unit area of the wind are examined in relation to temperature changes over the last ten-year period. Temperature is examined since increased temperatures provide significant evidence of global warming. It was found that in Piarco, Trinidad, the highest average annual power density per unit area was 14.42 W/m², occurring in 2004 at an annual average temperature of 32.1 °C. Comparing these values with the power densities per unit area and temperatures for other years, temperature did not have any significant impact on the distribution of wind energy. As such, the wind power potential in Trinidad is not jeopardized by climate change.
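A common way to carry out such an evaluation is to estimate the Weibull shape k and scale c from wind-speed data, here via the Justus method-of-moments approximation, and then compute the mean power density per unit area as P/A = ½ρc³Γ(1+3/k). The speeds below are hypothetical, not the Piarco record:

```python
import numpy as np
from math import gamma

def weibull_params(speeds):
    """Method-of-moments (Justus) estimate of the Weibull shape k and
    scale c (m/s) from a sample of wind speeds."""
    v = np.asarray(speeds, dtype=float)
    k = (v.std(ddof=1) / v.mean()) ** -1.086
    c = v.mean() / gamma(1 + 1 / k)
    return k, c

def power_density(k, c, rho=1.225):
    """Mean wind power per unit area: P/A = 0.5*rho*c**3*Gamma(1 + 3/k)."""
    return 0.5 * rho * c ** 3 * gamma(1 + 3 / k)

# Hypothetical monthly mean wind speeds (m/s) at 10 m height
speeds = [3.1, 3.4, 3.8, 3.5, 3.0, 2.8, 2.9, 3.2, 3.3, 3.6, 3.4, 3.0]
k, c = weibull_params(speeds)
print(f"k = {k:.2f}, c = {c:.2f} m/s, P/A = {power_density(k, c):.1f} W/m^2")
```

Note that estimating k from monthly means rather than raw observations understates the variability and inflates k; with a raw time series the same code applies unchanged.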
Distributional patterns of cecropia (Cecropiaceae: a panbiogeographic analysis
Directory of Open Access Journals (Sweden)
Franco Rosselli Pilar
1997-06-01
Full Text Available A panbiogeographic analysis of the distributional patterns of 60 species of Cecropia was carried out. Based on the distributional ranges of 36 species, we found eight generalized tracks for Cecropia species, whereas the distributional patterns of 24 species were uninformative for the analysis. The major concentration of species of Cecropia is in the Neotropical Andean region, where there are three generalized tracks and two nodes. The northern Andes in Colombia and Ecuador are richer than the central Andes in Peru; they contain two generalized tracks, one to the west and another to the east, each formed by the individual tracks of eight species. There are four generalized tracks outside the Andean region: two in the Amazonian region, in Guayana-Pará and in Manaus, one in Roraima, one in Serra do Mar in the Atlantic forest of Brazil, and one in Central America. Speciation in Cecropia may be related to the first Andean uplift.
Garten, Justin; Hoover, Joe; Johnson, Kate M; Boghrati, Reihane; Iskiwitch, Carol; Dehghani, Morteza
2017-03-31
Theory-driven text analysis has made extensive use of psychological concept dictionaries, leading to a wide range of important results. These dictionaries have generally been applied through word count methods which have proven to be both simple and effective. In this paper, we introduce Distributed Dictionary Representations (DDR), a method that applies psychological dictionaries using semantic similarity rather than word counts. This allows for the measurement of the similarity between dictionaries and spans of text ranging from complete documents to individual words. We show how DDR enables dictionary authors to place greater emphasis on construct validity without sacrificing linguistic coverage. We further demonstrate the benefits of DDR on two real-world tasks and finally conduct an extensive study of the interaction between dictionary size and task performance. These studies allow us to examine how DDR and word count methods complement one another as tools for applying concept dictionaries and where each is best applied. Finally, we provide references to tools and resources to make this method both available and accessible to a broad psychological audience.
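The core of DDR, measuring similarity between a concept dictionary and a span of text by comparing averaged word vectors rather than counting matches, can be sketched as below. The embedding table and word lists are toy stand-ins for pretrained vectors and a real dictionary:

```python
import numpy as np

# Toy embedding table standing in for pretrained vectors (word2vec,
# GloVe, ...); every name here is illustrative
rng = np.random.default_rng(42)
vocab = ["honest", "fair", "loyal", "cheat", "steal", "table", "walk"]
emb = {w: rng.standard_normal(50) for w in vocab}

def mean_vector(words):
    """The DDR representation: the average embedding of the words."""
    return np.mean([emb[w] for w in words if w in emb], axis=0)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

fairness_dict = ["honest", "fair", "loyal"]   # a tiny concept dictionary
doc = ["honest", "fair", "walk"]              # a span of text to score
sim = cosine(mean_vector(fairness_dict), mean_vector(doc))
print(round(sim, 3))
```

Because the score is a continuous similarity rather than a count, even a single word can be scored against the dictionary, which is the property that lets DDR trade dictionary size for construct validity.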
An Effective Distributed Model for Power System Transient Stability Analysis
Directory of Open Access Journals (Sweden)
MUTHU, B. M.
2011-08-01
Full Text Available Modern power systems consist of many interconnected synchronous generators having different inertia constants, connected with a large transmission network and an ever-increasing demand for power exchange. The size of the power system grows exponentially due to increases in power demand. The data required for various power system applications have been stored in different formats in a heterogeneous environment, and the power system applications themselves have been developed and deployed on different platforms and language paradigms. Interoperability between power system applications becomes a major issue because of this heterogeneous nature. The main aim of the paper is to develop a generalized distributed model for carrying out power system stability analysis. A flexible and loosely coupled JAX-RPC model has been developed for representing transient stability analysis in large interconnected power systems. The proposed model includes Pre-Fault, During-Fault, Post-Fault and Swing Curve services which are accessible to remote power system clients when the system is subjected to large disturbances. A generalized XML-based model for data representation has also been proposed for exchanging data in order to enhance the interoperability between legacy power system applications. The performance measure, Round Trip Time (RTT), is estimated for different power systems using the proposed JAX-RPC model and compared with the results obtained using traditional client-server and Java RMI models.
Performance analysis of distributed beamforming in a spectrum sharing system
Yang, Liang
2013-05-01
In this paper, we consider a distributed beamforming scheme (DBF) in a spectrum sharing system where multiple secondary users share the spectrum with some licensed primary users under an interference temperature constraint. We assume that the DBF is applied at the secondary users. We first consider optimal beamforming and compare it with the user selection scheme in terms of the outage probability and bit error rate performance metrics. Since perfect feedback is difficult to obtain, we then investigate a limited feedback DBF scheme and develop an analysis for a random vector quantization design algorithm. Specifically, the approximate statistics functions of the squared inner product between the optimal and quantized vectors are derived. With these statistics, we analyze the outage performance. Furthermore, the effects of channel estimation error and number of primary users on the system performance are investigated. Finally, optimal power adaptation and cochannel interference are considered and analyzed. Numerical and simulation results are provided to illustrate our mathematical formalism and verify our analysis. © 2012 IEEE.
Directional spatial frequency analysis of lipid distribution in atherosclerotic plaque
Korn, Clyde; Reese, Eric; Shi, Lingyan; Alfano, Robert; Russell, Stewart
2016-04-01
Atherosclerosis is characterized by the growth of fibrous plaques due to the retention of cholesterol and lipids within the artery wall, which can lead to vessel occlusion and cardiac events. One way to evaluate arterial disease is to quantify the amount of lipid present in these plaques, since a higher disease burden is characterized by a higher concentration of lipid. Although therapeutic stimulation of reverse cholesterol transport to reduce cholesterol deposits in plaque has not produced significant results, this may be due to current image analysis methods which use averaging techniques to calculate the total amount of lipid in the plaque without regard to spatial distribution, thereby discarding information that may have significance in marking response to therapy. Here we use Directional Fourier Spatial Frequency (DFSF) analysis to generate a characteristic spatial frequency spectrum for atherosclerotic plaques from C57 Black 6 mice both treated and untreated with a cholesterol scavenging nanoparticle. We then use the Cauchy product of these spectra to classify the images with a support vector machine (SVM). Our results indicate that treated plaque can be distinguished from untreated plaque using this method, where no difference is seen using the spatial averaging method. This work has the potential to increase the effectiveness of current in-vivo methods of plaque detection that also use averaging methods, such as laser speckle imaging and Raman spectroscopy.
Preliminary analysis of distributed in situ soil moisture measurements
Directory of Open Access Journals (Sweden)
L. Brocca
2005-01-01
Full Text Available Surface soil moisture content is highly variable in both space and time. Remote sensing can provide an effective methodology for mapping surface moisture content over large areas, but ground-based measurements are required to test its reliability and to calibrate retrieval algorithms. Recently, we had the opportunity to design and perform an experiment aimed at jointly acquiring measurements of surface soil water content at various locations and remotely sensed hyperspectral data. The area selected for the experiment is located in central Umbria and extends over 90 km². For the area, detailed lithological and multi-temporal landslide inventory maps were available. We identified eight plots where measurements of soil water content were made using a Time Domain Reflectometer (TDR). The plots range in size from 100 m² to 600 m², and cover a variety of topographic and morphological settings. The TDR measurements were conducted on four days: 5 April, 15 April, 2 May and 3 May 2004. On 3 May the NERC airborne CASI 2 acquired the hyperspectral data. Preliminary analyses concerning the matching between the landslides and the soil moisture are reported. Statistical and geostatistical analyses investigating the spatial-temporal soil moisture distribution were performed. These results will be compared with the surface temperature data obtained from the remotely sensed hyperspectral sensor.
Use of Grid Tools to Support CMS Distributed Analysis
Fanfani, A; Anjum, A; Barrass, T; Bonacorsi, D; Bunn, J; Corvo, M; Darmenov, N; De Filippis, N; Donno, F; Donvito, G; Eulisse, G; Fanzago, F; Filine, A; Grandi, C; Hernández, J M; Innocente, V; Jan, A; Lacaprara, S; Legrand, I; Metson, S; Newman, H; Silvestris, L; Steenberg, C; Stockinger, H; Taylor, L; Thomas, M; Tuura, L; Van Lingen, F; Wildish, T
2004-01-01
In order to prepare the Physics Technical Design Report, due by the end of 2005, the CMS experiment needs to simulate, reconstruct and analyse about 100 million events, corresponding to more than 200 TB of data. The data will be distributed to several Computing Centres. In order to provide access to the whole data sample to all the world-wide dispersed physicists, CMS is developing a layer of software that uses the grid tools provided by the LCG project to gain access to data and resources, and that aims to provide physicists with a user-friendly interface for submitting analysis jobs. The grid tools used are both those already available in the LCG-2 release and those being developed in the framework of the ARDA project. This work describes the current status and the future development...
Statistical distribution of rainfall in Uttarakhand, India
Kumar, Vikram; Shanu; Jahangeer
2017-07-01
Understanding rainfall is an important issue for Uttarakhand, India, which has varied topography; extreme rainfall causes quick runoff which threatens the structural and functional safety of large structures and other natural resources. In this study, an attempt has been made to determine the best-fit distribution of the annual rainfall series for the period 1991-2002 for 13 districts of Uttarakhand. Candidate distributions, namely Chi-squared, Chi-squared (2P), exponential, exponential (2P), gamma, gamma (3P), generalized extreme value (GEV), log-Pearson 3, Weibull and Weibull (3P), were fitted. Comparisons of the fitted distributions were based on goodness-of-fit tests such as Kolmogorov-Smirnov, Anderson-Darling and Chi-squared. Results showed that the Weibull distribution performed best in 46% of the districts, while the second-best distributions were Chi-squared (2P) and log-Pearson. The results of this study would be useful to water resource engineers, policy makers and planners for agricultural development and the conservation of the natural resources of Uttarakhand.
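This kind of best-fit comparison can be sketched with scipy: fit each candidate distribution by maximum likelihood and rank the fits with a Kolmogorov-Smirnov statistic. The rainfall series below is synthetic, not the Uttarakhand data, and only three of the study's ten candidates are shown:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic annual rainfall series (mm); the study would use the
# 1991-2002 record of each district instead
rain = stats.weibull_min.rvs(2.0, scale=1200.0, size=40, random_state=rng)

candidates = {
    "weibull": stats.weibull_min,
    "gamma": stats.gamma,
    "genextreme": stats.genextreme,
}
results = {}
for name, dist in candidates.items():
    params = dist.fit(rain)                        # maximum-likelihood fit
    ks = stats.kstest(rain, dist.cdf, args=params)
    results[name] = ks.statistic
    print(f"{name:10s} KS D = {ks.statistic:.3f}, p = {ks.pvalue:.3f}")
print("best fit:", min(results, key=results.get))
```

One caveat the study's procedure shares: applying the KS test with parameters estimated from the same sample makes the nominal p-values optimistic, so the statistic is best used for ranking rather than formal hypothesis testing.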
2014-01-01
Distributing development tasks in the context of global software development bears both many risks and many opportunities. Nowadays, distributed development is often driven by only a few factors or even just a single factor such as workforce costs. Risks and other relevant factors such as workforce capabilities, the innovation potential of different regions, or cultural factors are often not recognized sufficiently. This could be improved by using empirically-based multi-criteria distribution...
Comparative analysis of aerosols elemental distribution in some Romanian regions
Amemiya, Susumu; Masuda, Toshio; Popa-Simil, Liviu; Mateescu, Liviu
1996-04-01
The study's main aim is to obtain the elemental distribution of aerosol particulates and map it for some Romanian regions, in order to obtain preliminary information on aerosol particle concentrations and on networking strategy versus local conditions. For this we used a mobile sampling strategy, taking into account all specific local conditions and weather. In July 1993 we took about 8 samples over a rather large territory of SE Romania, which were analysed and mapped. The regions which showed interesting or doubtful behaviour, such as Bucharest and Dobrogea, were examined in more detail in the same period of 1994, to compare the new details with the global picture previously obtained. An attempt was made to infer the minimum number of stations needed in a future monitoring network. A mobile sampler was used, having two polycarbonate filter posts of 8 and 0.4 μm. PIXE elemental analysis was performed on a 2.5 MV Van de Graaff accelerator using a proton beam. More than 15 elements were measured. Illustrative 2D and 3D representations were drawn, as well as histogram charts of the concentration distributions in the specific regions at the specified times. In spite of the qualitatively poor samples, the experiment surprised us by its good agreement with conditions in the terrain long known by other means, and highlighted the power of PIXE methods in terms of money and time. Conclusions were drawn on the link between industry, traffic, vegetation, weather, surface waters, soil composition, power plant exhaust and so on, on the one hand, and surface concentration distribution, on the other. The method's weak points were also highlighted: weather dependencies (especially air mass movement and precipitation), local relief, microclimate and vegetation, and of course the location of the sampling point relative to the pollution sources and their regime. The paper contains a synthesis of the whole
Directory of Open Access Journals (Sweden)
Carlos García Mogollón
2010-07-01
Full Text Available Guava (Psidium guajava L.) is a tropical fruit susceptible to undesirable alterations that affect its shelf life due to inadequate storage and handling conditions. In this work, the shelf life of fresh guava was estimated using the Weibull probabilistic model, and fruit quality was evaluated during storage under different temperature and packaging conditions. The postharvest evaluation was carried out over 15 days with guavas of the red regional variety. A completely randomized design with a factorial arrangement of three factors was used: storage time, with six levels (0, 3, 6, 9, 12 and 15 days); storage temperature, with two levels, ambient (37 °C and 85-90% relative humidity (RH)) and refrigerated (9±2 °C and 85-90% RH); and two types of packaging, polystyrene tray with PVC plastic film and aluminium foil. A structured three-point satisfaction scale was used for sensory evaluation during the storage period. The Weibull model proved adequate for predicting the shelf life of fresh guava, based on the goodness-of-fit criteria and the acceptance and failure confidence limits. During the storage period it was observed that storage time, temperature and packaging type had a statistically significant effect (P
Energy Technology Data Exchange (ETDEWEB)
Santhosh, T.V., E-mail: santutv@barc.gov.in [Reactor Safety Division, Bhabha Atomic Research Centre (India); Gopika, V. [Reactor Safety Division, Bhabha Atomic Research Centre (India); Ghosh, A.K. [Raja Ramanna Fellow, Department of Atomic Energy (India); Fernandes, B.G. [Department of Electrical Engineering, Indian Institute of Technology Bombay (India); Dubey, K.A. [Radiation Technology Development Division, Bhabha Atomic Research Centre (India)
2016-01-15
Highlights: • An approach for time-dependent reliability prediction of I&C cable insulation materials for use in PSA of NPPs has been developed based on OIT and OITp measurements and Weibull theory. • OITs were determined from the measured OITp based on fundamental thermodynamic principles, and the correlations obtained from DSC and FTIR are in good agreement with the EAB. • SEM of thermally aged and irradiated samples of insulation materials was performed to support the degradation behaviour observed from OIT and EAB measurements. • The proposed methodology is illustrated with accelerated thermal and radiation ageing data on low-voltage cables used in NPPs for I&C applications. • The time-dependent reliability predicted from the OIT based on Weibull theory will be useful for incorporating cable ageing into PSA of NPPs. - Abstract: Instrumentation and control (I&C) cables used in nuclear power plants (NPPs) are exposed to various deteriorative environmental effects during their operational lifetime. The factors, consisting of long-term irradiation and enhanced temperature, eventually result in insulation degradation. Monitoring of the actual state of the cable insulation and prediction of its residual service life consist of measuring properties that are directly proportional to the functionality of the cables (usually, elongation at break is used as the critical parameter). Although several condition monitoring (CM) and life estimation techniques are available, there is currently no standard methodology or approach for incorporating cable ageing effects into the probabilistic safety assessment (PSA) of NPPs. In view of this, accelerated thermal and radiation ageing of I&C cable insulation materials has been carried out and the degradation due to thermal and radiation ageing has been assessed using oxidation induction time (OIT) and oxidation induction temperature (OITp) measurements by differential scanning
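The Weibull-based time-dependent reliability step can be sketched as fitting a two-parameter Weibull to lifetimes inferred from the ageing tests and evaluating R(t) = exp(−(t/η)^β). The lifetimes below are illustrative stand-ins, not the measured OIT data:

```python
import numpy as np
from scipy import stats

# Illustrative insulation lifetimes (h) inferred from accelerated ageing
lifetimes = np.array([520.0, 610.0, 700.0, 745.0, 800.0, 860.0, 920.0, 1010.0])

# Two-parameter Weibull maximum-likelihood fit (location fixed at zero)
beta, _, eta = stats.weibull_min.fit(lifetimes, floc=0)

def reliability(t):
    """R(t) = exp(-(t/eta)**beta): probability of surviving to time t."""
    return np.exp(-(t / eta) ** beta)

print(f"shape beta = {beta:.2f}, scale eta = {eta:.0f} h")
print(f"R(500 h) = {reliability(500):.3f}")
```

A shape parameter β > 1, as wear-out degradation typically yields, gives an increasing hazard rate, which is the time-dependent failure behaviour a PSA model would ingest.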
Comparative Analysis of Possible Designs for Flexible Distribution System Operation
DEFF Research Database (Denmark)
Lin, Jeremy; Knezovic, Katarina
2016-01-01
A massive amount of distributed energy resources will be connected to the distribution system in the near future. This emerging phenomenon will pose significant challenges to the traditional operation of distribution systems. This clearly calls for a growing need to develop novel grid designs...
Factory Gate Pricing: An Analysis of the Dutch Retail Distribution
Le Blanc, H.M.; Cruijssen, F.C.A.M.; Fleuren, H.A.; de Koster, M.B.M.
2004-01-01
Factory Gate Pricing (FGP) is a relatively new phenomenon in retail distribution. Under FGP, products are no longer delivered at the retailer distribution center, but collected by the retailer at the factory gates of the suppliers. Owing to both the asymmetry in the distribution networks (the supplier
Bayesian estimation of generalized exponential distribution under noninformative priors
Moala, Fernando Antonio; Achcar, Jorge Alberto; Tomazella, Vera Lúcia Damasceno
2012-10-01
The generalized exponential distribution, proposed by Gupta and Kundu (1999), is a good alternative to standard lifetime distributions such as the exponential, Weibull or gamma. Several authors have considered the problem of Bayesian estimation of the parameters of the generalized exponential distribution, assuming independent gamma priors and other informative priors. In this paper, we consider a Bayesian analysis of the generalized exponential distribution assuming conventional noninformative prior distributions, such as the Jeffreys and reference priors, to estimate the parameters. These priors are compared with independent gamma priors for both parameters. The comparison is carried out by examining the frequentist coverage probabilities of Bayesian credible intervals. We show that the maximal data information prior implies an improper posterior distribution for the parameters of a generalized exponential distribution. It is also shown that the choice of the parameter of interest is very important for the reference prior: different choices lead to different reference priors in this case. Numerical inference is illustrated for the parameters by considering data sets of different sizes and using MCMC (Markov Chain Monte Carlo) methods.
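As a rough illustration of the kind of MCMC-based inference described in this record, the sketch below fits a generalized exponential model with a random-walk Metropolis sampler. It uses a flat prior on the log-parameters as a simple stand-in for the Jeffreys/reference priors discussed in the paper; the data are synthetic and all settings (step size, iteration counts) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate data from a generalized exponential GE(alpha, lam):
# F(x) = (1 - exp(-lam * x)) ** alpha, inverted for sampling
alpha_true, lam_true = 2.0, 1.0
u = rng.uniform(size=500)
data = -np.log(1.0 - u ** (1.0 / alpha_true)) / lam_true

def log_post(theta):
    """Log-posterior with a flat prior on (log alpha, log lam)."""
    alpha, lam = np.exp(theta)
    # GE log-likelihood: n*log(alpha*lam) - lam*sum(x) + (alpha-1)*sum(log(1-exp(-lam*x)))
    return (data.size * np.log(alpha * lam)
            - lam * data.sum()
            + (alpha - 1.0) * np.log1p(-np.exp(-lam * data)).sum())

# Random-walk Metropolis on the log scale
theta = np.zeros(2)          # start at alpha = lam = 1
lp = log_post(theta)
samples = []
for _ in range(20000):
    prop = theta + rng.normal(scale=0.1, size=2)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)

post = np.exp(np.array(samples[5000:]))  # discard burn-in
alpha_hat, lam_hat = post.mean(axis=0)
```

With a fixed seed the posterior means land near the true values (alpha = 2, lam = 1), though a serious analysis would also check mixing and acceptance rates.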
Probabilistic analysis in normal operation of distribution system with distributed generation
DEFF Research Database (Denmark)
Villafafila-Robles, R.; Sumper, A.; Bak-Jensen, B.
2011-01-01
Nowadays, the incorporation of high levels of small-scale non-dispatchable distributed generation is leading to the transition from the traditional 'vertical' power system structure to a 'horizontally-operated' power system, where the distribution networks contain both stochastic generation...
Clustering analysis of seismicity and aftershock identification.
Zaliapin, Ilya; Gabrielov, Andrei; Keilis-Borok, Vladimir; Wong, Henry
2008-07-01
We introduce a statistical methodology for clustering analysis of seismicity in the time-space-energy domain and use it to establish the existence of two statistically distinct populations of earthquakes: clustered and nonclustered. This result can be used, in particular, for nonparametric aftershock identification. The proposed approach expands the analysis of Baiesi and Paczuski [Phys. Rev. E 69, 066106 (2004); doi:10.1103/PhysRevE.69.066106] based on the space-time-magnitude nearest-neighbor distance eta between earthquakes. We show that for a homogeneous Poisson marked point field with exponential marks, the distance eta has the Weibull distribution, which bridges our results with classical correlation analysis for point fields. The joint 2D distribution of spatial and temporal components of eta is used to identify the clustered part of a point field. The proposed technique is applied to several seismicity models and to the observed seismicity of southern California.
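The Weibull limit for nearest-neighbor distances in a homogeneous Poisson field can be checked numerically; for purely spatial distances in two dimensions the fitted shape parameter should come out close to 2 (the Rayleigh case). A minimal sketch with synthetic points (sample sizes and names illustrative):

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.stats import weibull_min

rng = np.random.default_rng(1)

# Homogeneous Poisson field: uniform points in the unit square
pts = rng.uniform(size=(2000, 2))

# Nearest-neighbor distance for each point (k=2: the first hit is the point itself)
dist, _ = cKDTree(pts).query(pts, k=2)
nn = dist[:, 1]

# Fit a Weibull with location fixed at 0; for a 2D Poisson field the
# shape parameter should be close to 2 (edge effects bias it slightly)
shape, loc, scale = weibull_min.fit(nn, floc=0)
```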
Analysis of distribution systems with a high penetration of distributed generation
DEFF Research Database (Denmark)
Lund, Torsten
Wind power alone can at times cover the entire load demand. The objective of the work is to investigate the influence of wind power and distributed combined heat and power production on the operation of distribution systems. Where other projects have focused on the modeling and control of the generators and prime movers, the focus of this project is on the operation of an entire distribution system with several wind farms and CHPs. Firstly, the subject of allocation of power system losses in a distribution system with distributed generation is treated, using a new approach to loss allocation based on current injections. Secondly, the representation of the network during and after a fault as a Thevenin equivalent voltage and impedance is considered; the influence of adjacent synchronous generators, Danish concept wind turbines, SVCs and STATCOMs on the Thevenin parameters has been investigated. Thirdly, the problem of voltage and reactive power control is addressed.
Rod internal pressure quantification and distribution analysis using Frapcon
Energy Technology Data Exchange (ETDEWEB)
Jessee, Matthew Anderson [ORNL; Wieselquist, William A [ORNL; Ivanov, Kostadin [Pennsylvania State University, University Park
2015-09-01
This report documents work performed supporting the Department of Energy (DOE) Office of Nuclear Energy (NE) Fuel Cycle Technologies Used Fuel Disposition Campaign (UFDC) under work breakdown structure element 1.02.08.10, ST Analysis. In particular, this report fulfills the M4 milestone M4FT-15OR0810036, Quantify effects of power uncertainty on fuel assembly characteristics, within work package FT-15OR081003 ST Analysis-ORNL. This research was also supported by the Consortium for Advanced Simulation of Light Water Reactors (http://www.casl.gov), an Energy Innovation Hub (http://www.energy.gov/hubs) for Modeling and Simulation of Nuclear Reactors under U.S. Department of Energy Contract No. DE-AC05-00OR22725. The discharge rod internal pressure (RIP) and cladding hoop stress (CHS) distributions are quantified for Watts Bar Nuclear Unit 1 (WBN1) fuel rods by modeling core cycle design data, operation data (including modeling significant trips and downpowers), and as-built fuel enrichments and densities of each fuel rod in FRAPCON-3.5. A methodology is developed which tracks inter-cycle assembly movements and assembly batch fabrication information to build individual FRAPCON inputs for each evaluated WBN1 fuel rod. An alternate model for the amount of helium released from the zirconium diboride (ZrB2) integral fuel burnable absorber (IFBA) layer is derived and applied to FRAPCON output data to quantify the RIP and CHS for these types of fuel rods. SCALE/Polaris is used to quantify fuel rod-specific spectral quantities and the amount of gaseous fission products produced in the fuel for use in FRAPCON inputs. Fuel rods with ZrB2 IFBA layers (i.e., IFBA rods) are determined to have RIP predictions that are elevated when compared to fuel rods without IFBA layers (i.e., standard rods), despite the fact that IFBA rods often have reduced fill pressures and annular fuel pellets. The primary contributor to elevated RIP predictions at burnups less than and greater than 30 GWd
A distributed analysis of Human impact on global sediment dynamics
Cohen, S.; Kettner, A.; Syvitski, J. P.
2012-12-01
Understanding riverine sediment dynamics is an important undertaking both for socially relevant issues such as agriculture, water security and infrastructure management and for scientific analysis of landscapes, river ecology, oceanography and other disciplines. Providing good quantitative and predictive tools is therefore timely, particularly in light of predicted climate and land-use changes. Ever-increasing human activity during the Anthropocene has affected sediment dynamics in two major ways: (1) an increase in hillslope erosion due to agriculture, deforestation and landscape engineering, and (2) trapping of sediment in dams and other man-made reservoirs. The intensity and dynamics of these man-made factors vary widely across the globe and over time and are therefore hard to predict. Using sophisticated numerical models is therefore warranted. Here we use a distributed global riverine sediment flux and water discharge model (WBMsed) to compare pristine (without human input) and disturbed (with human input) simulations. Using these 50-year simulations we will show and discuss the complex spatial and temporal patterns of human effects on riverine sediment flux and water discharge.
Particle Swarm Optimization for Hydraulic Analysis of Water Distribution Systems
Directory of Open Access Journals (Sweden)
Naser Moosavian
2015-06-01
The analysis of flow in water-distribution networks with several pumps by the Content Model can be cast as a non-convex optimization problem with multiple solutions. Newton-based methods such as GGA are not able to capture a global optimum in these situations. On the other hand, evolutionary methods, which operate on a population of individuals, may find a global solution even for such a problem. In the present paper, the Content Model is minimized using the particle-swarm optimization (PSO) technique. This is a population-based iterative evolutionary algorithm, applied to non-linear and non-convex optimization problems. The penalty-function method is used to convert the constrained problem into an unconstrained one. Both the PSO and GGA algorithms are applied to analyse two example networks. It is revealed that while GGA demonstrates better performance in convex problems, PSO is more successful in non-convex networks. By increasing the penalty-function coefficient, the accuracy of the solution may be improved considerably.
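The penalty-function PSO described here can be sketched generically. The example below is not the paper's Content Model: it minimizes a standard non-convex test function (Himmelblau's) under an assumed linear constraint, with the constraint folded into the objective via a quadratic penalty.

```python
import numpy as np

rng = np.random.default_rng(2)

def objective(x, y):
    # Himmelblau's function: non-convex, four global minima with f = 0
    return (x**2 + y - 11)**2 + (x + y**2 - 7)**2

def penalized(p, mu=1000.0):
    x, y = p[:, 0], p[:, 1]
    # Penalty-function method: constraint x + y <= 5 folded into the objective
    viol = np.maximum(0.0, x + y - 5.0)
    return objective(x, y) + mu * viol**2

# Standard global-best PSO (inertia 0.7, cognitive/social weights 1.5)
n, iters = 40, 300
pos = rng.uniform(-5, 5, size=(n, 2))
vel = np.zeros((n, 2))
pbest = pos.copy()
pbest_val = penalized(pos)
g = pbest[np.argmin(pbest_val)].copy()

for _ in range(iters):
    r1, r2 = rng.uniform(size=(2, n, 1))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (g - pos)
    pos = pos + vel
    val = penalized(pos)
    better = val < pbest_val
    pbest[better], pbest_val[better] = pos[better], val[better]
    g = pbest[np.argmin(pbest_val)].copy()

best_val = float(pbest_val.min())
```

The swarm should reach a feasible point with a near-zero penalized objective, illustrating why population-based methods cope with multiple optima where Newton-type solvers stall.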
Scalable distributed service migration via Complex Networks Analysis
Pantazopoulos, Panagiotis; Stavrakakis, Ioannis
2010-01-01
With social networking sites providing increasingly richer context, User-Centric Service (UCS) creation is expected to explode, following a success path similar to User-Generated Content. One of the major challenges in this emerging, highly user-centric networking paradigm is how to make these services, which are exploding in number yet individually of vanishing demand, available in a cost-effective manner. Of prime importance to the latter (and the focus of this paper) is the determination of the optimal location for hosting a UCS. Taking into account the particular characteristics of UCS, we formulate the problem as a facility location problem and devise a distributed and highly scalable heuristic solution to it. Key to the proposed approach is the introduction of a novel metric drawing on Complex Network Analysis. Given a current location of a UCS, this metric helps to a) identify a small subgraph of nodes with high capacity to act as service demand concentrators; b) project on them a reduced yet accurate view of the globa...
Finite-key security analysis for multilevel quantum key distribution
Brádler, Kamil; Mirhosseini, Mohammad; Fickler, Robert; Broadbent, Anne; Boyd, Robert
2016-07-01
We present a detailed security analysis of a d-dimensional quantum key distribution protocol based on two and three mutually unbiased bases (MUBs) both in an asymptotic and finite-key-length scenario. The finite secret key rates (in bits per detected photon) are calculated as a function of the length of the sifted key by (i) generalizing the uncertainty relation-based insight from BB84 to any d-level 2-MUB QKD protocol and (ii) by adopting recent advances in the second-order asymptotics for finite block length quantum coding (for both d-level 2- and 3-MUB QKD protocols). Since the finite and asymptotic secret key rates increase with d and the number of MUBs (together with the tolerable threshold) such QKD schemes could in principle offer an important advantage over BB84. We discuss the possibility of an experimental realization of the 3-MUB QKD protocol with the orbital angular momentum degrees of freedom of photons.
Failure analysis a practical guide for manufacturers of electronic components and systems
Bâzu, Marius
2011-01-01
Failure analysis is the preferred method to investigate product or process reliability and to ensure optimum performance of electrical components and systems. The physics-of-failure approach is the only internationally accepted solution for continuously improving the reliability of materials, devices and processes. The models have been developed from the physical and chemical phenomena that are responsible for degradation or failure of electronic components and materials and now replace popular distribution models for failure mechanisms such as Weibull or lognormal. Reliability engineers nee
Directory of Open Access Journals (Sweden)
A. Taravat
2013-09-01
As a major aspect of marine pollution, oil release into the sea has serious biological and environmental impacts. Among remote sensing systems, which offer a non-destructive investigation method, synthetic aperture radar (SAR) can provide valuable synoptic information about the position and size of an oil spill due to its wide-area coverage and day/night, all-weather capabilities. In this paper we present a new automated method for oil-spill monitoring, based on the combination of the Weibull Multiplicative Model and machine learning techniques to differentiate between dark spots and the background. First, a filter based on the Weibull Multiplicative Model is applied to each sub-image. Second, the sub-image is segmented by two different neural network techniques (Pulse-Coupled Neural Networks and Multilayer Perceptron Neural Networks). As the last step, a very simple filtering process is used to eliminate false targets. The proposed approaches were tested on 20 ENVISAT and ERS2 images which contained dark spots. The same parameters were used in all tests. For the overall dataset, average accuracies of 94.05 % and 95.20 % were obtained for the PCNN and MLP methods, respectively. The average computational time for dark-spot detection with a 256 × 256 image is about 4 s for PCNN segmentation using IDL software, which is currently the fastest in this field. Our experimental results demonstrate that the proposed approach is fast, robust and effective, and can be applied to future spaceborne SAR images.
Analysis of Statistical Distributions of Energization Overvoltages of EHV Cables
DEFF Research Database (Denmark)
Ohno, Teruo; Ametani, Akihiro; Bak, Claus Leth
Insulation levels of EHV systems have been determined based on the statistical distribution of switching overvoltages since the 1970s, when the statistical distribution was found for overhead lines. Responding to an increase in planned and installed EHV cables, the authors have derived the statistical distribution of energization overvoltages for EHV cables and have made clear their characteristics compared with those of overhead lines. This paper identifies the causes and physical meanings of these characteristics so that it becomes possible to use the obtained statistical distribution for the determination of insulation levels of cable systems.
Modelling of Active Distribution Grids for Stability Analysis
2016-01-01
In the last years the share of distributed generation connected to distribution grids has increased considerably. As their number increases, distribution and transmission network operators are becoming aware of the risks DG can pose to the stable operation of national power systems. To cope with this, grid code requirements are becoming more and more demanding in order to ensure the secure and reliable supply of energy to end users. Early grid code requirement...
Rafal Podlaski; Francis Roesch
2014-01-01
In recent years finite-mixture models have been employed to approximate and model empirical diameter at breast height (DBH) distributions. We used two-component mixtures of either the Weibull distribution or the gamma distribution for describing the DBH distributions of mixed-species, two-cohort forest stands, to analyse the relationships between the DBH components,...
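A two-component Weibull mixture of the kind used here for DBH distributions can be fitted by maximizing the mixture likelihood directly; a minimal sketch on synthetic data (component values and starting points are illustrative, not taken from the paper):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

rng = np.random.default_rng(3)

# Synthetic two-cohort "DBH" sample: mixture of two Weibull components
d = np.concatenate([
    weibull_min.rvs(2.0, scale=10.0, size=600, random_state=rng),
    weibull_min.rvs(4.0, scale=30.0, size=400, random_state=rng),
])

def nll(theta):
    # theta = (logit of weight, log k1, log s1, log k2, log s2)
    w = 1.0 / (1.0 + np.exp(-theta[0]))
    k1, s1, k2, s2 = np.exp(theta[1:])
    pdf = w * weibull_min.pdf(d, k1, scale=s1) + (1 - w) * weibull_min.pdf(d, k2, scale=s2)
    return -np.sum(np.log(pdf + 1e-300))

# Start near plausible values and maximize the mixture likelihood directly
res = minimize(nll, x0=[0.0, np.log(1.5), np.log(8.0), np.log(3.0), np.log(25.0)],
               method="Nelder-Mead", options={"maxiter": 5000})
nll_mixture = res.fun

# Compare against a single-Weibull fit on the same data
c, loc, s = weibull_min.fit(d, floc=0)
nll_single = -np.sum(weibull_min.logpdf(d, c, scale=s))
```

For genuinely two-cohort data the mixture attains a lower negative log-likelihood than a single Weibull, which is the kind of comparison the finite-mixture approach rests on.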
A bivariate limiting distribution of tumor latency time.
Rachev, S T; Wu, C; Yakovlev AYu
1995-06-01
The model of radiation carcinogenesis, proposed earlier by Klebanov, Rachev, and Yakovlev [8] substantiates the employment of limiting forms of the latent time distribution at high dose values. Such distributions arise within the random minima framework, the two-parameter Weibull distribution being a special case. This model, in its present form, does not allow for carcinogenesis at multiple sites. As shown in the present paper, a natural two-dimensional generalization of the model appears in the form of a Weibull-Marshall-Olkin distribution. Similarly, the study of a randomized version of the model based on the negative binomial minima scheme results in a bivariate Pareto-Marshall-Olkin distribution. In the latter case, an estimate for the rate of convergence to the limiting distribution is given.
Distribution analysis of segmented wave sea clutter in littoral environments
CSIR Research Space (South Africa)
Strempel, MD
2015-10-01
Segmented wave sea-clutter data are then fitted against the K-distribution. It is shown that the approach can accurately describe specific sections of the wave with a reduced error between actual and estimated distributions. The improved probability density function (PDF) representation...
Estimation of Nanoparticle Size Distributions by Image Analysis
DEFF Research Database (Denmark)
Fisker, Rune; Carstensen, Jens Michael; Hansen, Mikkel Fougt
2000-01-01
Knowledge of the nanoparticle size distribution is important for the interpretation of experimental results in many studies of nanoparticle properties. An automated method is needed for accurate and robust estimation of particle size distribution from nanoparticle images with thousands of particl...
Analysis Model for Domestic Hot Water Distribution Systems: Preprint
Energy Technology Data Exchange (ETDEWEB)
Maguire, J.; Krarti, M.; Fang, X.
2011-11-01
A thermal model was developed to estimate the energy losses from prototypical domestic hot water (DHW) distribution systems for homes. The developed model, using the TRNSYS simulation software, allows researchers and designers to better evaluate the performance of hot water distribution systems in homes. Modeling results were compared with past experimental study results and showed good agreement.
Bayesian Analysis for Binomial Models with Generalized Beta Prior Distributions.
Chen, James J.; Novick, Melvin R.
1984-01-01
The Libby-Novick class of three-parameter generalized beta distributions is shown to provide a rich class of prior distributions for the binomial model that removes some restrictions of the standard beta class. A numerical example indicates the desirability of using these wider classes of densities for binomial models. (Author/BW)
Analysis of random laser scattering pulse signals with lognormal distribution
Institute of Scientific and Technical Information of China (English)
Yan Zhen-Gang; Bian Bao-Min; Wang Shou-Yu; Lin Ying-Lu; Wang Chun-Yong; Li Zhen-Hua
2013-01-01
The statistical distribution of natural phenomena is of great significance in studying the laws of nature. In order to study the statistical characteristics of a random pulse signal, a random process model is proposed theoretically for better study of the random law of measured results. Moreover, a simple random pulse signal generation and testing system is designed for studying the counting distributions of three typical objects: particles suspended in the air, standard particles, and background noise. Both normal and lognormal distribution fittings are used for analyzing the experimental results, verified by a chi-square goodness-of-fit test and correlation coefficients for comparison. In addition, the statistical laws of the three typical objects and the relations between them are discussed in detail. The relation is also the non-integral dimension fractal relation of statistical distributions of different random laser scattering pulse signal groups.
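Comparing normal and lognormal fits to a positively skewed pulse-signal sample can be sketched as follows; the example uses a Kolmogorov-Smirnov distance as a simple stand-in for the chi-square test and correlation coefficient used in the paper, and all data are synthetic:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Synthetic pulse-amplitude data: positively skewed, lognormal-like
x = rng.lognormal(mean=1.0, sigma=0.6, size=1000)

# Fit both candidate models
mu, sd = stats.norm.fit(x)
shape, loc, scale = stats.lognorm.fit(x, floc=0)

# Kolmogorov-Smirnov distance of each fitted model to the data
ks_norm = stats.kstest(x, "norm", args=(mu, sd)).statistic
ks_lognorm = stats.kstest(x, "lognorm", args=(shape, loc, scale)).statistic
```

For skewed data of this kind the lognormal fit yields the smaller distance, mirroring the paper's finding that different objects follow different counting distributions.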
An Analysis of Checkpointing Algorithms for Distributed Mobile Systems
Directory of Open Access Journals (Sweden)
Ajay Khunteta
2010-07-01
Distributed snapshots are an important building block for distributed systems, and are useful for constructing efficient checkpointing protocols, among other uses. Direct application of these algorithms to mobile systems is not feasible, however, due to differences in the environment in which mobile systems operate relative to general distributed systems. The mobile computing environment introduces new challenges in the area of fault-tolerant computing. Compared to traditional distributed environments, wireless networks are typically slower, providing lower throughput and higher latency than wireline networks. In addition, mobile hosts have limited computation resources, are often exposed to harsh operating environments that make them more likely to fail, and can roam while operating. Over the past two decades, intensive research work has been carried out on providing efficient checkpointing protocols in traditional distributed computing. Recently, more attention has been paid to providing checkpointing protocols for mobile systems. Some of these protocols have been adapted from the traditional distributed environment; others have been created from scratch for mobile systems. A checkpoint is defined as a designated place in a program at which normal processing is interrupted specifically to preserve the status information necessary to allow resumption of processing at a later time. Checkpointing is the process of saving the status information. This paper surveys the algorithms which have been reported in the literature for checkpointing in mobile distributed systems.
Statistical analysis of solar measurements in Algeria using beta distributions
Energy Technology Data Exchange (ETDEWEB)
Ettoumi, F. Youcef; Adane, A. [Univ. of Sciences and Technology of Algiers (U.S.T.H.B.), Dept. of Telecommunications, Algiers (Algeria); Mefti, A.; Bouroubi, M.Y. [Centre de Developpement des Energies Renouvelables (CDER), Algiers (Algeria)
2002-05-01
A method of smoothing solar data by beta probability distributions is implemented in this paper. In the first step, this method has been used to process daily sunshine duration data recorded at thirty-three meteorological stations in Algeria over periods of eleven years or more. In the second step, it has been applied to hourly global solar irradiation flux measured in Algiers during the 1987/89 period. For each location and each month of the year, beta probability density functions fitting the monthly frequency distributions of the daily sunshine duration measurements are obtained. The two parameters characterising the resulting beta distributions are then mapped, enabling us to build the frequency distributions of sunshine duration for every site in Algeria. In the case of solar radiation for Algiers, the recorded data have been processed in two different ways. The first consists of sorting the hourly global solar irradiation data into eight typical classes of the daily clearness index. The second is based on the repartition of these data per month. The results of the first classification show that for each class of daily clearness index, the hourly data under consideration are modelled by a single beta distribution. When using the second classification, linear combinations of two beta distributions are found to fit the monthly frequency distributions of the hourly solar radiation data. (Author)
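Fitting a beta density to data bounded on [0, 1], as done here for sunshine duration expressed as a fraction, is straightforward once the support is fixed; a minimal sketch on synthetic data (parameter values illustrative):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Synthetic daily relative sunshine durations (fraction of the day, in [0, 1])
s = rng.beta(2.0, 5.0, size=2000)

# Fit a beta density with the support fixed to [0, 1], analogous to
# smoothing the monthly frequency distributions in the paper
a, b, loc, scale = stats.beta.fit(s, floc=0, fscale=1)
```

The two fitted shape parameters (a, b) are exactly the quantities the authors map across stations to reconstruct sunshine-duration distributions for arbitrary sites.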
Pérez-Sánchez, Julio; Senent-Aparicio, Javier
2017-08-01
Dry spells are an essential concept of drought climatology that clearly defines the semiarid Mediterranean environment and whose consequences are a defining feature for an ecosystem so vulnerable with regard to water. The present study was conducted to characterize rainfall drought in the Segura River basin located in eastern Spain, marked by the strongly seasonal nature of these latitudes. A daily precipitation set was utilized for 29 weather stations during a period of 20 years (1993-2013). Furthermore, four sets of dry spell length (complete series, monthly maximum, seasonal maximum, and annual maximum) are used and simulated for all the weather stations with the following probability distribution functions: Burr, Dagum, error, generalized extreme value, generalized logistic, generalized Pareto, Gumbel Max, inverse Gaussian, Johnson SB, Log-Logistic, Log-Pearson 3, Triangular, Weibull, and Wakeby. Only the series of annual maximum spells offers a good adjustment for all the weather stations, with Wakeby emerging as the best result, with a mean p value of 0.9424 for the Kolmogorov-Smirnov test (0.2 significance level). Maps of dry spell duration probability for return periods of 2, 5, 10, and 25 years reveal the northeast-southeast gradient, with increasing periods of annual rainfall of less than 0.1 mm in the eastern third of the basin, in the proximity of the Mediterranean slope.
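Selecting the best-fitting distribution for dry-spell maxima by a Kolmogorov-Smirnov criterion can be sketched as below; only a small subset of the fourteen candidate families in the paper is used, and the data are synthetic:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

# Synthetic annual-maximum dry-spell lengths (days); Weibull-like by construction
spells = stats.weibull_min.rvs(1.5, scale=20.0, size=500, random_state=rng)

# Fit each candidate family and score it by its Kolmogorov-Smirnov distance
candidates = {"weibull_min": stats.weibull_min, "norm": stats.norm, "expon": stats.expon}
ks = {}
for name, dist in candidates.items():
    params = dist.fit(spells)
    ks[name] = stats.kstest(spells, name, args=params).statistic

best = min(ks, key=ks.get)  # family with the smallest KS distance
```

A full study would, as the paper does, also report KS p-values at a chosen significance level rather than relying on the raw statistic alone.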
Analysis of DNS cache effects on query distribution.
Wang, Zheng
2013-01-01
This paper studies the DNS cache effects that occur on query distribution at the CN top-level domain (TLD) server. We first filter out malformed DNS queries, classified into six categories, to remove log data pollution. A model for DNS resolution, more specifically DNS caching, is presented. We demonstrate the presence and magnitude of DNS cache effects and cache sharing effects on the request distribution through an analytic model and simulation. CN TLD log data results are provided and analyzed based on the cache model. The approximate TTL distribution for domain names is inferred quantitatively.
Data analysis and mapping of the mountain permafrost distribution
Deluigi, Nicola; Lambiel, Christophe; Kanevski, Mikhail
2017-04-01
To predict the permafrost occurrence where it is unknown, the supervised learning techniques mentioned inferred a classification function from labelled training data (pixels of permafrost absence and presence). Particular attention was given to the pre-processing of the dataset, with a study of its complexity and of the relation between permafrost data and the employed environmental variables. The application of feature selection techniques completed this analysis and informed about redundant or valueless predictors. Classification performances were assessed with AUROC on independent validation sets (0.81 for LR, 0.85 for SVM and 0.88 for RF). At the micro scale the obtained permafrost maps illustrate results consistent with the field reality thanks to the high resolution of the dataset (10 meters). Moreover, compared to classical models, the permafrost prediction is computed without resorting to altitude thresholds (above which permafrost may be found). Finally, as machine learning is a non-deterministic approach, mountain permafrost distribution maps are presented and discussed with corresponding uncertainty maps, which provide information on the quality of the results.
Determination analysis of energy conservation standards for distribution transformers
Energy Technology Data Exchange (ETDEWEB)
Barnes, P.R.; Van Dyke, J.W.; McConnell, B.W.; Das, S.
1996-07-01
This report contains information for US DOE to use in making a determination on proposing energy conservation standards for distribution transformers as required by the Energy Policy Act of 1992. Potential for saving energy with more efficient liquid-immersed and dry-type distribution transformers could be significant because these transformers account for an estimated 140 billion kWh of the annual energy lost in the delivery of electricity. Objective was to determine whether energy conservation standards for distribution transformers would have the potential for significant energy savings, be technically feasible, and be economically justified from a national perspective. It was found that energy conservation for distribution transformers would be technically and economically feasible. Based on the energy conservation options analyzed, 3.6-13.7 quads of energy could be saved from 2000 to 2030.
Foundations of distributed multiscale computing: formalization, specification, and analysis
Borgdorff, J.; Falcone, J.-L.; Lorenz, E.; Bona-Casas, C.; Chopard, B.; Hoekstra, A.G.
2013-01-01
Inherently complex problems from many scientific disciplines require a multiscale modeling approach. Yet its practical contents remain unclear and inconsistent. Moreover, multiscale models can be very computationally expensive, and may have potential to be executed on distributed infrastructure. In
Analysis of communication based distributed control of MMC for HVDC
DEFF Research Database (Denmark)
Huang, Shaojun; Teodorescu, Remus; Mathe, Laszlo
2013-01-01
methods and Matlab tools. Finally, the sensitivity of the distributed control system to modulation effects (phase-shifted PWM), communication delay, individual carrier frequency and sampling frequency is studied through simulations made in Matlab Simulink and PLECS.
Income Distribution Dependence of Poverty Measure: A Theoretical Analysis
Chattopadhyay, A K; Chattopadhyay, Amit K; Mallick, Sushanta K
2005-01-01
With a new deprivation (or poverty) function, in this paper we theoretically study the changes in poverty with respect to the `global' mean and variance of the income distribution using Indian survey data. We show that when income obeys a log-normal distribution, a rising mean income generally indicates a reduction in poverty, while an increase in the variance of the income distribution increases poverty. This altruistic view for a developing economy, however, is no longer tenable once the poverty index is found to follow a Pareto distribution. Here, although a rising mean income indicates a reduction in poverty, due to the presence of an inflexion point in the poverty function there is a critical value of the variance below which poverty decreases with increasing variance, while beyond this value poverty undergoes a steep increase followed by a decrease with respect to higher variance. Following these results, we make quantitative predictions to correlate a developing with a developed economy.
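The log-normal result quoted here, that widening the income distribution at a fixed mean raises poverty, can be verified directly from the headcount ratio; a small sketch (poverty line, mean income and sigma values are illustrative, and the headcount is a simpler index than the paper's deprivation function):

```python
import numpy as np
from scipy.stats import norm

def headcount(poverty_line, mean_income, sigma):
    """Poverty headcount ratio P(income < z) under a lognormal income
    distribution with the given arithmetic mean and log-scale sigma."""
    mu = np.log(mean_income) - 0.5 * sigma**2   # keeps E[income] fixed
    return norm.cdf((np.log(poverty_line) - mu) / sigma)

# Hold the mean income fixed and widen the distribution: poverty rises,
# consistent with the log-normal case discussed in the paper
h_low = headcount(1.0, 2.0, sigma=0.5)
h_high = headcount(1.0, 2.0, sigma=1.0)
```

When the poverty line sits below the mean, the derivative of the headcount with respect to sigma is strictly positive, so the monotone increase is not an artifact of the particular numbers chosen.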
THEORETICAL ANALYSIS ON THE VERTICAL DISTRIBUTION OF PARTICLE CONCENTRATION
Institute of Scientific and Technical Information of China (English)
Guangqian WANG; Xudong FU
2001-01-01
In steady, solid-liquid two-phase turbulent flows, there exist two typical patterns of the vertical distribution of particle concentration. Pattern I shows a maximum concentration at an elevation above the bed. Pattern II shows an increase of the particle concentration downward over the whole vertical, with the maximum at the bed. Most theories of particle concentration distribution have addressed pattern II, and a successful theory covering both patterns is lacking. This paper reviews the particle distribution theories, including the diffusion theory, the mixture theory, the energy theory, the similarity theory, the stochastic theory and the kinetic theory. The kinetic theory is also applied to describe the vertical distribution of particle concentration in both dilute and dense flows.
Asymptotic distributions in the projection pursuit based canonical correlation analysis
Institute of Scientific and Technical Information of China (English)
Anonymous
2010-01-01
In this paper, associations between two sets of random variables based on the projection pursuit (PP) method are studied. The asymptotic normal distributions of estimators of the PP based canonical correlations and weighting vectors are derived.
Statistical Analysis of Data for Timber Strengths
DEFF Research Database (Denmark)
Sørensen, John Dalsgaard; Hoffmeyer, P.
Statistical analyses are performed for material strength parameters from approximately 6700 specimens of structural timber. Non-parametric statistical analyses and fits to the following distribution types have been investigated: Normal, Lognormal, 2-parameter Weibull and 3-parameter Weibull. The statistical fits have generally been made using all data (100%) and the lower tail (30%) of the data. The Maximum Likelihood Method and the Least Squares Technique have been used to estimate the statistical parameters in the selected distributions. 8 different databases are analysed. The results show that 2-parameter Weibull (and Normal) distributions give the best fits to the data available, especially if tail fits are used, whereas the Lognormal distribution generally gives poor fits and larger coefficients of variation, especially if tail fits are used.
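The Maximum Likelihood fit of a 2-parameter Weibull described in this abstract can be sketched as follows. This is a minimal illustration on synthetic data; the shape and scale values are assumptions for the example, not estimates from the timber database.

```python
import numpy as np
from math import gamma
from scipy import stats

# Simulate "strength" data from a 2-parameter Weibull (illustrative values,
# not the actual ~6700-specimen timber database).
true_shape, true_scale = 5.0, 40.0
strengths = stats.weibull_min.rvs(true_shape, scale=true_scale,
                                  size=2000, random_state=0)

# Maximum Likelihood fit; floc=0 pins the location at zero (2-parameter form).
shape_mle, loc, scale_mle = stats.weibull_min.fit(strengths, floc=0)

# Coefficient of variation implied by the fitted shape parameter.
cv = np.sqrt(gamma(1 + 2 / shape_mle) / gamma(1 + 1 / shape_mle) ** 2 - 1)
```

A 3-parameter fit would simply drop `floc=0` and let the location float, which is how the lower-tail behaviour noted above can differ between the two forms.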
Complexity Analysis of Pipeline Mapping Problems in Distributed Heterogeneous Networks
Gu, Yi; Wu, Qishi; Zhu, Mengxia; Nageswara S. V. Rao
2009-01-01
Large-scale scientific applications require using various system resources to execute complex computing pipelines in distributed networks to support collaborative research. System resources are typically shared in the Internet or over dedicated connections based on their location, availability, capability, and capacity. Optimizing the network performance of computing pipelines in such distributed environments is critical to the success of these applications. We consider two types of large-scale ...
Furbish, David J.; Schmeeckle, Mark; Schumer, Rina; Fathel, Siobhan L.
2016-01-01
We describe the most likely forms of the probability distributions of bed load particle velocities, accelerations, hop distances, and travel times, in a manner that formally appeals to inferential statistics while honoring mechanical and kinematic constraints imposed by equilibrium transport conditions. The analysis is based on E. Jaynes's elaboration of the implications of the similarity between the Gibbs entropy in statistical mechanics and the Shannon entropy in information theory. By maximizing the information entropy of a distribution subject to known constraints on its moments, our choice of the form of the distribution is unbiased. The analysis suggests that particle velocities and travel times are exponentially distributed and that particle accelerations follow a Laplace distribution with zero mean. Particle hop distances, viewed alone, ought to be distributed exponentially. However, the covariance between hop distances and travel times precludes this result. Instead, the covariance structure suggests that hop distances follow a Weibull distribution. These distributions are consistent with high-resolution measurements obtained from high-speed imaging of bed load particle motions. The analysis brings us closer to choosing distributions based on our mechanical insight.
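The maximum-entropy reasoning in this abstract can be checked numerically: among non-negative distributions with a fixed mean, the exponential maximizes differential entropy. A minimal sketch follows; the gamma competitor and its parameters are arbitrary choices for illustration, not taken from the paper.

```python
from scipy import stats

mean = 1.0
# Exponential with mean 1: the maximum-entropy distribution on [0, inf)
# subject only to a fixed mean (its differential entropy is exactly 1 nat).
h_exp = float(stats.expon(scale=mean).entropy())

# A gamma distribution with the same mean but a different shape has lower entropy.
h_gamma = float(stats.gamma(2.0, scale=mean / 2.0).entropy())
```

Adding further moment constraints (as the paper does for accelerations and hop distances) changes which family maximizes the entropy, which is how the Laplace and Weibull forms arise.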
Flaw strength distributions and statistical parameters for ceramic fibers: The normal distribution
R'Mili, M.; Godin, N.; Lamon, J.
2012-05-01
The present paper investigates large sets of ceramic fibre failure strengths (500 to 1000 data points) produced using tensile tests on tows that contained either 500 or 1000 filaments. The probability density function was determined through acoustic emission monitoring which allowed detection and counting of filament fractures. The statistical distribution of filament strengths was described using the normal distribution. The Weibull equation was then fitted to this normal distribution for estimation of statistical parameters. A perfect agreement between both distributions was obtained, and a quite negligible scatter in statistical parameters was observed, as opposed to the wide variability that is reported in the literature. Thus it was concluded that flaw strengths are distributed normally and that the statistical parameters that were derived are the true ones. In a second step, the conventional method of estimation of Weibull parameters was applied to these sets of data and, then, to subsets selected randomly. The influence of other factors involved in the conventional method of determination of statistical parameters is discussed. It is demonstrated that selection of specimens, sample size, and method of construction of so-called Weibull plots are responsible for statistical parameters variability.
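The "conventional method" referred to above, a least-squares fit on a linearized Weibull plot, can be sketched as follows. The filament strengths are synthetic, and the median-rank probability estimator is one common choice among several; none of the numbers come from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
m_true, s_true = 4.0, 2.5            # illustrative Weibull modulus and scale (GPa)
x = np.sort(s_true * rng.weibull(m_true, size=1000))

# Median-rank estimator of the empirical failure probability F_i.
n = x.size
F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)

# Weibull plot: ln(-ln(1-F)) vs ln(x) is linear with slope m (the modulus)
# and intercept -m*ln(s).
m_hat, intercept = np.polyfit(np.log(x), np.log(-np.log(1.0 - F)), 1)
s_hat = float(np.exp(-intercept / m_hat))
```

The paper's point is that with small subsets the scatter of `m_hat` grows sharply, so the apparent variability of Weibull parameters can be an artifact of sample size and plot construction.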
Performance Analysis of Distributed Object-Oriented Applications
Schoeffler, James D.
1998-01-01
The purpose of this research was to evaluate the efficiency of a distributed simulation architecture which creates individual modules which are made self-scheduling through the use of a message-based communication system used for requesting input data from another module which is the source of that data. To make the architecture as general as possible, the message-based communication architecture was implemented using standard remote object architectures (Common Object Request Broker Architecture (CORBA) and/or Distributed Component Object Model (DCOM)). A series of experiments were run in which different systems are distributed in a variety of ways across multiple computers and the performance evaluated. The experiments were duplicated in each case so that the overhead due to message communication and data transmission can be separated from the time required to actually perform the computational update of a module each iteration. The software used to distribute the modules across multiple computers was developed in the first year of the current grant and was modified considerably to add a message-based communication scheme supported by the DCOM distributed object architecture. The resulting performance was analyzed using a model created during the first year of this grant which predicts the overhead due to CORBA and DCOM remote procedure calls and includes the effects of data passed to and from the remote objects. A report covering the distributed simulation software and the results of the performance experiments has been submitted separately. The above report also discusses possible future work to apply the methodology to dynamically distribute the simulation modules so as to minimize overall computation time.
Campos, Jose Alejandro Gonzalez; Moraga, Paulina Saavedra; Del Pozo, Manuel Freire
2013-01-01
This paper introduces the generalized beta (GB) model as a new modeling tool in the educational assessment area and evaluation analysis, specifically. Unlike the normal model, the GB model allows us to capture some real characteristics of the data, and it is an important tool for understanding the phenomenon of learning. This paper develops a contrast with the…
Methods for Dynamic Analysis of Distribution Feeders with High Penetration of PV Generators
Energy Technology Data Exchange (ETDEWEB)
Nagarajan, Adarsh; Ayyanar, Raja
2016-11-21
An increase in the number of inverter-interfaced photovoltaic (PV) generators on existing distribution feeders affects the design, operation, and control of the distribution systems. Existing distribution system analysis tools are capable of supporting only snapshot and quasi-static analyses. Capturing the dynamic effects of PV generators during the variation in distribution system states is necessary when studying the effects of controller bandwidths, multiple voltage correction devices, and anti-islanding. This work explores the use of dynamic phasors and differential algebraic equations (DAE) for impact analysis of PV generators on the existing distribution feeders.
Application of Support Vector Machine to Reliability Analysis of Engine Systems
Directory of Open Access Journals (Sweden)
Zhang Xinfeng
2013-07-01
Full Text Available Reliability analysis plays a very important role in assessing the performance and making maintenance plans of engine systems. This research presents a comparative study of the predictive performances of support vector machines (SVM), least squares support vector machines (LSSVM) and neural network time series models for forecasting failures and reliability in engine systems. Further, the reliability indexes of engine systems are computed on Weibull probability paper programmed with Matlab. The results show that the probability distribution of the forecasting outcomes is consistent with the distribution of the actual data, both following the Weibull distribution, and that the predictions by SVM and LSSVM provide accurate estimates of the characteristic life. Thus SVM and LSSVM are both viable choices for engine system reliability analysis. Moreover, the predictive precision of the LSSVM-based method is higher than that of SVM. For small samples the LSSVM prediction is preferable, because its computational cost is lower and its precision is more satisfactory.
Income distribution dependence of poverty measure: A theoretical analysis
Chattopadhyay, Amit K.; Mallick, Sushanta K.
2007-04-01
Using a modified deprivation (or poverty) function, in this paper, we theoretically study the changes in poverty with respect to the ‘global’ mean and variance of the income distribution using Indian survey data. We show that when the income obeys a log-normal distribution, a rising mean income generally indicates a reduction in poverty, while an increase in the variance of the income distribution increases poverty. This altruistic view for a developing economy, however, is no longer tenable once the poverty index is found to follow a Pareto distribution. Here, although a rising mean income indicates a reduction in poverty, due to the presence of an inflexion point in the poverty function there is a critical value of the variance below which poverty decreases with increasing variance, while beyond this value poverty undergoes a steep increase followed by a decrease with respect to higher variance. Identifying this inflexion point as the poverty line, we show that the Pareto poverty function satisfies all three standard axioms of a poverty index [N.C. Kakwani, Econometrica 43 (1980) 437; A.K. Sen, Econometrica 44 (1976) 219] whereas the log-normal distribution falls short of this requisite. Following these results, we make quantitative predictions to correlate a developing with a developed economy.
Directory of Open Access Journals (Sweden)
Francis Schaffner
2016-12-01
Full Text Available This is the second of a number of planned data papers presenting modelled vector distributions produced originally during the ECDC-funded VBORNET project. This work continues under the VectorNet project, now jointly funded by ECDC and EFSA. Further data papers will be published after sampling seasons when more field data become available, allowing further species to be modelled or validation and updates of existing models. The data package described here includes those mosquito species first modelled in 2013 & 2014 as part of the VBORNET gap analysis work, which aimed to identify areas of potential species distribution in areas lacking records. It comprises three species models together with suitability masks based on land class and environmental limits. The species included as part of this phase are the mosquitoes 'Aedes vexans', 'Anopheles plumbeus' and 'Culex modestus'. The known distributions of these species within the area covered by the project (Europe, the Mediterranean Basin, North Africa, and Eurasia) are currently incomplete to a greater or lesser degree. The models are designed to fill the gaps with predicted distributions, to provide (a) assistance in targeting surveys to collect distribution data for those areas with no field-validated information, and (b) a first indication of the species distributions within the project areas.
Analysis of magnetic electron lens with secant hyperbolic field distribution
Pany, S S; Dubey, B P
2014-01-01
Electron-optical imaging instruments like the Scanning Electron Microscope (SEM) and Transmission Electron Microscope (TEM) use specially designed solenoid electromagnets for focusing the electron beam probe. Indicators of the imaging performance of these instruments, like spatial resolution, have a strong correlation with the focal characteristics of the magnetic lenses, which in turn have been shown to be functions of the spatial distribution of the axial magnetic field generated by them. Owing to the complicated design of practical lenses, empirical mathematical expressions are deemed convenient for use in physics-based calculations of their focal properties. So the degree of closeness of such models to the actual field distribution determines the accuracy of the calculations. Mathematical models proposed by Glaser[1] and Ramberg[1] have historically been put into extensive use. In this paper the authors discuss one such model with a secant-hyperbolic type magnetic field distribution function, and present a comparison among these models, ...
Performance Analysis of Distributed Neyman-Pearson Detection Systems
Institute of Scientific and Technical Information of China (English)
ZHAO Juan; TAO Ran; WANG Yue; ZHOU Si-yong
2007-01-01
The performance of a distributed Neyman-Pearson detection system is considered with the decision rules of the sensors given and the decisions from different sensors being mutually independent conditioned on both hypotheses. To achieve better performance at the fusion center for a general detection system with an n > 3 sensor configuration, necessary and sufficient conditions are derived by comparing the probability of detection at the fusion center with that of each of the sensors, under the constraint that the probability of false alarm at the fusion center equals that of the sensor. The conditions are related to the performances of the sensors; using these results one can predict the performance at the fusion center of a distributed detection system and choose appropriate sensors to construct efficient distributed detection systems.
Analysis of distribution uniformity of nodes in wireless sensor networks
Institute of Scientific and Technical Information of China (English)
Zhang Zhenjiang
2007-01-01
Wireless sensor networks have several special characteristics that work against network coverage, such as energy shortage and the difficulty of energy supply. In order to prolong the lifetime of wireless sensor networks, it is necessary to balance the load of the whole network. As energy consumption is related to the positions of nodes, the uniformity of the node distribution must be considered. In this paper, a new model is proposed to evaluate node distribution uniformity by considering parameters that include compression discrepancy, sparseness discrepancy, self discrepancy, maximum cavity radius and minimum cavity radius. The simulation results show that the presented model can be helpful for measuring the distribution uniformity of nodes scattered randomly in wireless sensor networks.
Core Flow Distribution from Coupled Supercritical Water Reactor Analysis
Directory of Open Access Journals (Sweden)
Po Hu
2014-01-01
Full Text Available This paper introduces an extended code package PARCS/RELAP5 to analyze steady state of SCWR US reference design. An 8 × 8 quarter core model in PARCS and a reactor core model in RELAP5 are used to study the core flow distribution under various steady state conditions. The possibility of moderator flow reversal is found in some hot moderator channels. Different moderator flow orifice strategies, both uniform across the core and nonuniform based on the power distribution, are explored with the goal of preventing the reversal.
Phylogenetic analysis reveals wide distribution of globin X
Directory of Open Access Journals (Sweden)
Makałowski Wojciech
2011-10-01
Full Text Available Abstract The vertebrate globin gene repertoire consists of seven members that differ in terms of structure, function and phyletic distribution. While hemoglobin, myoglobin, cytoglobin, and neuroglobin are present in almost all gnathostomes examined so far, other globin genes, like globin X, are much more restricted in their phyletic distribution. Until now, globin X had only been found in teleost fish and Xenopus. Here, we report that globin X is also present in the genomes of the sea lamprey, ghost shark and reptiles. Moreover, the identification of orthologs of globin X in crustaceans, insects, platyhelminthes, and hemichordates confirms its ancient origin.
Location Refinement and Power Coverage Analysis Based on Distributed Antenna
Institute of Scientific and Technical Information of China (English)
赵晓楠; 侯春萍; 汪清; 陈华; 浦亮洲
2016-01-01
To establish a wireless channel suitable for the cabin environment, the power coverage was investigated with a distributed antenna system and a centralized antenna system based on actual measurements of the channel impulse response. The results indicated that the distributed antenna system has more uniform power coverage than the centralized antenna system. The average relative errors of the receiving power of both antennas were calculated. The optimal position of the centralized antenna was obtained by Gaussian function refinement, enabling the system to achieve better transmission power with the same coverage effect, and providing a reference for antenna placement in future real communication in the cabin.
Energy efficiency analysis of reconfigured distribution system for practical loads
Directory of Open Access Journals (Sweden)
Pawan Kumar
2016-09-01
Full Text Available In a deregulated rate structure, the performance evaluation of a distribution system for energy efficiency includes loss minimization, improved power quality, loadability limit, and reliability and availability of supply. Energy efficiency changes with the variation in loading pattern and load behaviour. Further, the nature of the load at each node is not explicitly of any one type; rather, its characteristics depend upon the node voltages. In most cases, the load is assumed to be constant power (real and reactive). In this paper, voltage-dependent practical loads are represented with a composite load model, and the energy efficiency performance of the distribution system for practical loads is evaluated in different configurations of a 33-node system.
Dianfeng Liu; Zimei Dong; Yanze Gu; Lingxia Tao
2008-01-01
We studied patterns of distribution and relationships among distributional areas of Tetrigidae insects in China using parsimony analysis of endemism (PAE). We constructed a matrix based on distribution data for Chinese Tetrigidae insects and an area cladogram using the northeastern China area as an outgroup. Exhaustive searches were conducted under the maximum parsimony criterion. Cluster analysis divided eight biogeographic areas into four groups; group 1 was composed of northeast China, group 2 ...
Metagenomic Analysis of Water Distribution System Bacterial Communities
The microbial quality of drinking water is assessed using culture-based methods that are highly selective and that tend to underestimate the densities and diversity of microbial populations inhabiting distribution systems. In order to better understand the effect of different dis...
Analysis and Optimization of Distributed Real-Time Embedded Systems
DEFF Research Database (Denmark)
Pop, Paul; Eles, Petru; Peng, Zebo
2006-01-01
An increasing number of real-time applications are today implemented using distributed heterogeneous architectures composed of interconnected networks of processors. The systems are heterogeneous not only in terms of hardware and software components, but also in terms of communication protocols...
Analysis of a distributed system for lifting trucks
Groote, J.F.; Pang, J.; Wouters, A.G.
2001-01-01
The process-algebraic language muCRL is used to analyse an existing distributed system for lifting trucks. Four errors are found in the original design. We propose solutions for these problems and show by means of model-checking that the modified system meets the requirements.
Analysis of distributed cooled high power millimeter wave windows
Energy Technology Data Exchange (ETDEWEB)
Nelson, S.D.; Caplan, M.; Reitter, T.A.
1995-09-09
The sectional high-frequency (100--170 GHz) distributed cooled window has previously been investigated both electromagnetically and thermally using computational electromagnetics (EM) and thermal codes. Recent data relate the computed behaviour to some experimental data for the window. Results are presented for time-domain CW EM analyses and CW thermal and stress calculations.
High Resolution PV Power Modeling for Distribution Circuit Analysis
Energy Technology Data Exchange (ETDEWEB)
Norris, B. L.; Dise, J. H.
2013-09-01
NREL has contracted with Clean Power Research to provide 1-minute simulation datasets of PV systems located at three high penetration distribution feeders in the service territory of Southern California Edison (SCE): Porterville, Palmdale, and Fontana, California. The resulting PV simulations will be used to separately model the electrical circuits to determine the impacts of PV on circuit operations.
Monte Carlo analysis of skew posterior distributions: an econometric example
H.K. van Dijk (Herman); T. Kloek (Teun)
1983-01-01
textabstractThe posterior distribution of a small-scale illustrative econometric model is used to compare symmetric simple importance sampling with asymmetric simple importance sampling. The numerical results include posterior first and second order moments, numerical error estimates of the first or
Resonance analysis in parallel voltage-controlled Distributed Generation inverters
DEFF Research Database (Denmark)
Wang, Xiongfei; Blaabjerg, Frede; Chen, Zhe
2013-01-01
Thanks to the fast responses of the inner voltage and current control loops, the dynamic behaviors of parallel voltage-controlled Distributed Generation (DG) inverters not only rely on the stability of load sharing among them, but are also subject to the interactions between the voltage control loops o...
Ceramic design concepts based on stress distribution analysis.
Esquivel-Upshaw, J F; Anusavice, K J
2000-08-01
This article discusses general design concepts involved in fabricating ceramic and metal-ceramic restorations based on scientific stress distribution data. These include the effects of ceramic layer thickness, modulus of elasticity of supporting substrates, direction of applied loads, intraoral stress, and crown geometry on the susceptibility of certain restoration designs to fracture.
Analysis of dynamic foot pressure distribution and ground reaction forces
Ong, F. R.; Wong, T. S.
2005-04-01
The purpose of this study was to assess the relationship between forces derived from in-shoe pressure distribution and GRFs during normal gait. The relationship served to demonstrate the accuracy and reliability of the in-shoe pressure sensor. The in-shoe pressure distribution from the Tekscan F-Scan system outputs vertical forces and Centre of Force (COF), while the Kistler force plate gives ground reaction forces (GRFs) in terms of Fz, Fx and Fy, as well as vertical torque, Tz. The two systems were synchronized for pressure and GRF measurements. Data were collected from four volunteers through three trials for both left and right foot under barefoot conditions with the in-shoe sensor. The forces derived from the pressure distribution correlated well with the vertical GRFs, with a correlation coefficient (r²) in the range of 0.93 to 0.99. This is a result of extended calibration, which improves pressure measurement to give better accuracy and reliability. The COF from the in-shoe sensor generally matched the force plate COP well. As for the maximum vertical torque at the forefoot during toe-off, there was no relationship with the pressure distribution. However, the maximum torque was shown to give an indication of the rotational angle of the foot.
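The correlation reported above is the squared Pearson coefficient between the pressure-derived force and the vertical GRF. A minimal numpy sketch with synthetic force traces (the waveforms and noise level are invented for illustration, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic vertical GRF trace (N) and a noisy pressure-derived force that
# tracks it, standing in for the synchronized F-Scan / force-plate signals.
grf = 700.0 + 50.0 * np.sin(np.linspace(0.0, 2.0 * np.pi, 200))
in_shoe = 0.95 * grf + rng.normal(0.0, 5.0, size=grf.size)

r = np.corrcoef(in_shoe, grf)[0, 1]
r_squared = r ** 2
```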
The Elementary and Secondary Education Act: A Distributional Analysis.
Barkin, David; Hettich, Walter
This study analyzes interstate redistribution of Federal tax money under Title One of the Elementary and Secondary Education Act of 1965. First, the consistency of the criteria used to distribute funds is studied to see if people of similar financial positions are treated "equally." Results show that when compared with an alternative--the…
Rafal Podlaski; Francis A. Roesch
2014-01-01
Two-component mixtures of either the Weibull distribution or the gamma distribution and the kernel density estimator were used for describing the diameter at breast height (dbh) empirical distributions of two-cohort stands. The data consisted of study plots from the Świętokrzyski National Park (central Poland) and areas close to and including the North Carolina section...
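A two-component Weibull mixture of the kind used above for bimodal dbh distributions can be written down directly; the weight and component parameters below are illustrative assumptions, not values fitted to the study plots.

```python
from scipy import stats, integrate

# Mixture f(x) = w*f1(x) + (1-w)*f2(x) for a two-cohort dbh distribution;
# component parameters are hypothetical (cm), not estimates from the data.
w = 0.6
f1 = stats.weibull_min(2.0, scale=15.0)   # younger cohort
f2 = stats.weibull_min(6.0, scale=45.0)   # older cohort

def mixture_pdf(x):
    return w * f1.pdf(x) + (1.0 - w) * f2.pdf(x)

# Sanity checks: the mixture integrates to one, and its mean is the
# weighted combination of the component means.
total, _ = integrate.quad(mixture_pdf, 0.0, 150.0)
mix_mean = w * f1.mean() + (1.0 - w) * f2.mean()
```

Fitting such a mixture to empirical dbh data is usually done by maximizing the mixture log-likelihood (e.g. via EM or a numerical optimizer), which is the step the goodness-of-fit discussion in the companion study addresses.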
Rafal Podlaski; Francis A. Roesch
2013-01-01
The goals of this study are (1) to analyse the accuracy of the approximation of empirical distributions of diameter at breast height (dbh) using two-component mixtures of either the Weibull distribution or the gamma distribution in two-cohort stands, and (2) to discuss the procedure of choosing goodness-of-fit tests. The study plots were...
Stress-strength reliability for general bivariate distributions
Directory of Open Access Journals (Sweden)
Alaa H. Abdel-Hamid
2016-10-01
Full Text Available An expression for the stress-strength reliability R = P(X1 < X2) ...
A Fractal Approach to Dynamic Inference and Distribution Analysis
Directory of Open Access Journals (Sweden)
Marieke M.J.W. van Rooij
2013-01-01
Full Text Available Event-distributions inform scientists about the variability and dispersion of repeated measurements. This dispersion can be understood from a complex systems perspective, and quantified in terms of fractal geometry. The key premise is that a distribution’s shape reveals information about the governing dynamics of the system that gave rise to the distribution. Two categories of characteristic dynamics are distinguished: additive systems governed by component-dominant dynamics and multiplicative or interdependent systems governed by interaction-dominant dynamics. A logic by which systems governed by interaction-dominant dynamics are expected to yield mixtures of lognormal and inverse power-law samples is discussed. These mixtures are described by a so-called cocktail model of response times derived from human cognitive performances. The overarching goals of this article are twofold: First, to offer readers an introduction to this theoretical perspective and second, to offer an overview of the related statistical methods.
Analysis of transverse field distributions in Porro prism resonators
CSIR Research Space (South Africa)
Litvin, IA
2007-01-01
Full Text Available at the apexes of the Porro prisms. Experimental work on a particular system showed some interesting correlations between the time-domain behavior of the resonator and the transverse field output. These findings are presented and discussed. Key words: Porro prism resonator, petal (spot) transverse field distribution, second pulse. 1. INTRODUCTION Right-angle prisms, often referred to as Porro prisms, have the useful property that all incident rays on the prism are reflected back parallel to the initial...
Security analysis of continuous-variable quantum key distribution scheme
Institute of Scientific and Technical Information of China (English)
Zhu Jun; He Guang-Qiang; Zeng Gui-Hua
2007-01-01
In this paper the security of the quantum key distribution scheme using correlations of continuous-variable Einstein-Podolsky-Rosen (EPR) pairs is investigated. A new approach for calculating the secret information rate ΔI is proposed by using Shannon information theory. Employing an available parameter F which is associated with the entanglement of the EPR pairs, one can easily detect eavesdropping. Results show that the proposed scheme is secure against the individual beam splitter attack strategy with a proper squeeze parameter.
Guidelines for transient analysis in water transmission and distribution systems
Pothof, Ivo; Karney, Bryan
2012-01-01
All water systems leak, and many supply systems do so considerably, with water losses typically around 20% of water production. The IWA Water Loss Task Force aims for a significant reduction of annual water losses by drafting documents to assist practitioners and others to prevent, monitor and mitigate water losses in water transmission and distribution systems. One of the causes of water losses is transient phenomena, caused by normal and accidental pump and valve operations. ...
Requirements Reasoning for Distributed Requirements Analysis using Semantic Wiki
Liang, Peng; Avgeriou, Paris; Clerc, Viktor
2009-01-01
In large-scale collaborative software projects, thousands of requirements with complex interdependencies and different granularity spreading in different levels are elicited, documented, and evolved during the project lifecycle. Non-technical stakeholders involved in requirements engineering activities rarely apply formal techniques; therefore it is infeasible to automatically detect problems in requirements. This situation becomes even worse in a distributed context when all sites are respon...
Distribution of the residual roots in principal components analysis.
Directory of Open Access Journals (Sweden)
A. M. Kshirsagar
1964-10-01
Full Text Available The distribution of the latent roots of the covariance matrix of normal variables, when a hypothetical linear function of the variables is eliminated, is derived in this paper. The relation between the original roots and the residual roots after elimination is also derived by an analytical method. An exact test for the goodness of fit of a single nonisotropic hypothetical principal component, using the residual roots, is then obtained.
Directory of Open Access Journals (Sweden)
Alireza Taravat
2014-12-01
Full Text Available Oil spills represent a major threat to ocean ecosystems and their environmental status. Previous studies have shown that Synthetic Aperture Radar (SAR), as its recording is independent of clouds and weather, can be effectively used for the detection and classification of oil spills. Dark formation detection is the first and critical stage in oil-spill detection procedures. In this paper, a novel approach for automated dark-spot detection in SAR imagery is presented. A new approach combining an adaptive Weibull Multiplicative Model (WMM) and MultiLayer Perceptron (MLP) neural networks is proposed to differentiate between dark spots and the background. The results have been compared with those of a model combining a non-adaptive WMM and pulse-coupled neural networks. The presented approach overcomes the non-adaptive WMM filter setting parameters by developing an adaptive WMM model, which is a step towards fully automatic dark spot detection. The proposed approach was tested on 60 ENVISAT and ERS2 images which contained dark spots. For the overall dataset, an average accuracy of 94.65% was obtained. Our experimental results demonstrate that the proposed approach is very robust and effective, whereas the non-adaptive WMM & pulse-coupled neural network (PCNN) model generates poor accuracies.
Energy Technology Data Exchange (ETDEWEB)
Neilson, Henry J., E-mail: hjn2@case.edu [Case Western Reserve University, 10900 Euclid Ave, Cleveland, OH (United States); Petersen, Alex S.; Cheung, Andrew M.; Poon, S. Joseph; Shiflet, Gary J. [University of Virginia, 395 McCormick Road, P.O. Box 400745, Charlottesville, VA 22904 (United States); Widom, Mike [Carnegie Mellon University, 5000 Forbes Avenue, Wean Hall 3325, Pittsburgh, PA 15213 (United States); Lewandowski, John J. [Case Western Reserve University, 10900 Euclid Ave, Cleveland, OH (United States)
2015-05-14
In this study, the variations in mechanical properties of Ni-Co-Ta-based metallic glasses have been analyzed. Three different chemistries of metallic glass ribbons were analyzed: Ni₄₅Ta₃₅Co₂₀, Ni₄₀Ta₃₅Co₂₀Nb₅, and Ni₃₀Ta₃₅Co₃₀Nb₅. These alloys possess very high density (approximately 12.5 g/cm³) and very high strength (e.g. >3 GPa). Differential scanning calorimetry (DSC) and X-ray diffraction (XRD) were used to characterize the amorphicity of the ribbons. Mechanical properties were measured via a combination of Vickers hardness, bending strength, and tensile strength for each chemistry. At least 50 tests were conducted for each chemistry and each test technique in order to quantify the variability of properties using both 2- and 3-parameter Weibull statistics. The variability in properties and its source(s) were compared to those of other engineering materials, while the nature of deformation via shear bands as well as fracture surface features were determined using scanning electron microscopy (SEM). Toughness, the role of defects, and volume effects are also discussed.
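The 2- and 3-parameter Weibull analysis mentioned above can be sketched with standard tools. The sketch below, using hypothetical strength values (not the paper's data), fits both forms with `scipy.stats.weibull_min`: fixing the location at zero gives the 2-parameter model, while letting it float estimates a threshold stress.

```python
import numpy as np
from scipy import stats

# Hypothetical fracture-strength data (GPa) for one ribbon chemistry.
strengths = np.array([3.05, 3.12, 3.21, 3.33, 3.08, 3.40, 3.18,
                      3.26, 3.01, 3.37, 3.15, 3.29, 3.10, 3.24])

# 2-parameter Weibull: location (threshold) fixed at zero via floc=0.
shape2, loc2, scale2 = stats.weibull_min.fit(strengths, floc=0)

# 3-parameter Weibull: the threshold is estimated as well.
shape3, loc3, scale3 = stats.weibull_min.fit(strengths)

print(f"2-param: modulus m = {shape2:.1f}, scale = {scale2:.2f} GPa")
print(f"3-param: modulus m = {shape3:.1f}, threshold = {loc3:.2f} GPa")
```

A high Weibull modulus (low scatter) is typical when strengths cluster tightly, as in this toy sample; the 3-parameter fit usually trades a lower modulus for a nonzero threshold.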
Agent-based reasoning for distributed multi-INT analysis
Inchiosa, Mario E.; Parker, Miles T.; Perline, Richard
2006-05-01
Fully exploiting the intelligence community's exponentially growing data resources will require computational approaches differing radically from those currently available. Intelligence data is massive, distributed, and heterogeneous. Conventional approaches requiring highly structured and centralized data will not meet this challenge. We report on a new approach, Agent-Based Reasoning (ABR). In NIST evaluations, the use of ABR software tripled analysts' solution speed, doubled accuracy, and halved perceived difficulty. ABR makes use of populations of fine-grained, locally interacting agents that collectively reason about intelligence scenarios in a self-organizing, "bottom-up" process akin to those found in biological and other complex systems. Reproduction rules allow agents to make inferences from multi-INT data, while movement rules organize information and optimize reasoning. Complementary deterministic and stochastic agent behaviors enhance reasoning power and flexibility. Agent interaction via small-world networks - such as are found in nervous systems, social networks, and power distribution grids - dramatically increases the rate of discovering intelligence fragments that usefully connect to yield new inferences. Small-world networks also support the distributed processing necessary to address intelligence community data challenges. In addition, we have found that ABR pre-processing can boost the performance of commercial text clustering software. Finally, we have demonstrated interoperability with Knowledge Engineering systems and seen that reasoning across diverse data sources can be a rich source of inferences.
Measurement based scenario analysis of short-range distribution system planning
DEFF Research Database (Denmark)
Chen, Peiyuan; Bak-Jensen, Birgitte; Chen, Zhe
2009-01-01
This paper focuses on short-range distribution system planning using a probabilistic approach. Empirical probabilistic distributions of load demand and distributed generations are derived from the historical measurement data and incorporated into the system planning. Simulations with various...... feasible scenarios are performed based on a local distribution system at Støvring in Denmark. Simulation results provide more accurate and insightful information for the decision-maker when using the probabilistic analysis than using the worst-case analysis, so that a better planning can be achieved....
Recurrent frequency-size distribution of characteristic events
Directory of Open Access Journals (Sweden)
S. G. Abaimov
2009-04-01
Full Text Available Statistical frequency-size (frequency-magnitude) properties of earthquake occurrence play an important role in seismic hazard assessments. The behavior of earthquakes is represented by two different statistics: interoccurrent behavior in a region and recurrent behavior at a given point on a fault (or at a given fault). The interoccurrent frequency-size behavior has been investigated by many authors and generally obeys the power-law Gutenberg-Richter distribution to a good approximation. It is expected that the recurrent frequency-size behavior should obey different statistics. However, this problem has received little attention because historic earthquake sequences do not contain enough events to reconstruct the necessary statistics. To overcome this lack of data, this paper investigates the recurrent frequency-size behavior for several problems. First, the sequences of creep events on a creeping section of the San Andreas fault are investigated. The applicability of the Brownian passage-time, lognormal, and Weibull distributions to the recurrent frequency-size statistics of slip events is tested and the Weibull distribution is found to be the best-fit distribution. To verify this result the behaviors of numerical slider-block and sand-pile models are investigated and the Weibull distribution is confirmed as the applicable distribution for these models as well. Exponents β of the best-fit Weibull distributions for the observed creep event sequences and for the slider-block model are found to have similar values, ranging from 1.6 to 2.2, with the corresponding aperiodicities C_V of the applied distribution ranging from 0.47 to 0.64. We also note similarities between recurrent time-interval statistics and recurrent frequency-size statistics.
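The reported link between the Weibull exponent β and the aperiodicity C_V can be checked directly, since for a Weibull distribution the coefficient of variation depends only on the shape parameter: C_V = sqrt(Γ(1+2/β)/Γ(1+1/β)² − 1). A quick sketch:

```python
from math import sqrt
from scipy.special import gamma

def weibull_cv(beta):
    """Coefficient of variation (aperiodicity) of a Weibull distribution
    with shape parameter beta; it is independent of the scale parameter."""
    return sqrt(gamma(1 + 2 / beta) / gamma(1 + 1 / beta) ** 2 - 1)

for beta in (1.6, 2.0, 2.2):
    print(f"beta = {beta}: C_V = {weibull_cv(beta):.2f}")
```

For β = 1.6 this gives C_V ≈ 0.64 and for β = 2.2 about 0.48, consistent with the 0.47 to 0.64 range quoted in the abstract.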
Rank-size distribution and primate city characteristics in India--a temporal analysis.
Das, R J; Dutt, A K
1993-02-01
"This paper is an analysis of the historical change in city size distribution in India....Rank-size distribution at national level and primate city-size distribution at regional levels are examined....The paper also examines, in the Indian context, the relation between rank-size distribution and an integrated urban system, and the normative nature of the latter as a spatial organization of human society. Finally, we have made a modest attempt to locate the research on city-size distribution...." excerpt
Analysis of the Spatial Distribution of Galaxies by Multiscale Methods
Directory of Open Access Journals (Sweden)
E. Saar
2005-09-01
Full Text Available Galaxies are arranged in interconnected walls and filaments forming a cosmic web encompassing huge, nearly empty regions between the structures. Many statistical methods have been proposed in the past in order to describe the galaxy distribution and discriminate between the different cosmological models. We present in this paper multiscale geometric transforms sensitive to clusters, sheets, and walls: the 3D isotropic undecimated wavelet transform, the 3D ridgelet transform, and the 3D beamlet transform. We show that statistical properties of transform coefficients measure, in a coherent and statistically reliable way, the degree of clustering, filamentarity, sheetedness, and voidedness of a data set.
Fault Diagnosis for Electrical Distribution Systems using Structural Analysis
DEFF Research Database (Denmark)
Knüppel, Thyge; Blanke, Mogens; Østergaard, Jacob
2014-01-01
Fault-tolerance in electrical distribution relies on the ability to diagnose possible faults and determine which components or units cause a problem or are close to doing so. Faults include defects in instrumentation, power generation, transformation and transmission. The focus of this paper...... redundancies in large sets of equations only from the structure (topology) of the equations. A salient feature is automated generation of redundancy relations. The method is indeed feasible in electrical networks where circuit theory and network topology together formulate the constraints that define...
Statistical distributions of potential interest in ultrasound speckle analysis
Energy Technology Data Exchange (ETDEWEB)
Nadarajah, Saralees [School of Mathematics, University of Manchester, Manchester M60 1QD (United Kingdom)
2007-05-21
Compound statistical modelling of the uncompressed envelope of the backscattered signal has received much interest recently. In this note, a comprehensive collection of models is derived for the uncompressed envelope of the backscattered signal by compounding the Nakagami distribution with 13 flexible families. The corresponding estimation procedures are derived by the method of moments and the method of maximum likelihood. The sensitivity of the models to their various parameters is examined. It is expected that this work could serve as a useful reference and lead to improved modelling of the uncompressed envelope of the backscattered signal. (note)
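The method-of-moments estimation mentioned for the Nakagami family has a simple closed form: with Ω = E[X²] and m = E[X²]² / Var(X²), both parameters follow from the second and fourth sample moments of the envelope. A sketch on synthetic data (parameter values are illustrative, not from the note):

```python
import numpy as np
from scipy import stats

# Synthetic backscatter envelope: Nakagami with m = 2, Omega = 1.5.
# In scipy's parametrization the shape is m and the scale is sqrt(Omega).
sample = stats.nakagami.rvs(2.0, scale=np.sqrt(1.5),
                            size=20000, random_state=42)

# Method-of-moments estimators:
#   Omega = E[X^2],  m = E[X^2]^2 / Var(X^2)
x2 = sample ** 2
omega_hat = x2.mean()
m_hat = omega_hat ** 2 / x2.var()

print(f"m_hat = {m_hat:.2f}, Omega_hat = {omega_hat:.2f}")
```

Compounding, as in the note, would then place a mixing distribution on Ω; the moment equations above are the building block for the compound estimators.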
Archiving, Distribution and Analysis of Solar-B Data
Shimojo, M.
2007-10-01
The Solar-B Mission Operation and Data Analysis (MODA) working group has been discussing the data analysis system for Solar-B data since 2001. In this paper, based on the Solar-B MODA document and recent work in Japan, we introduce the dataflow from Solar-B to scientists, the data format and data levels of Solar-B data, and the data searching/providing system.
Improving Department of Defense Global Distribution Performance Through Network Analysis
2016-06-01
consideration given the origin. The new method includes the origin information. Using parametric and non-parametric statistical tests and data analysis techniques, we show that the addition of requisition origin information...apply a weighting scale to those elements. Kunadhamrak and Hanaoka constructed a multi-criteria metric using a combination of a fuzzy...
Directory of Open Access Journals (Sweden)
Yeping Chu
2013-12-01
Full Text Available This study aims to investigate the factors influencing the distribution efficiency of agricultural products. Distribution efficiency directly reflects the efficiency of the agricultural distribution system: the output and cost of the agricultural distribution process can be assessed through it. Much work has been done to measure distribution efficiency with a single indicator, but little has been done to investigate multiple indicators and their impact on the distribution efficiency of agricultural products. Hence, for the first time, Data Envelopment Analysis (DEA) has been used to assess the influence of multiple indicators on the distribution efficiency of agricultural products. A case study using historical data has been carried out. The empirical results show that the level of informatization and the logistics infrastructure greatly influence the distribution efficiency of agricultural products, while logistics transportation and the professional level of labor do not promote the efficiency of the agricultural distribution system. Hence, useful suggestions can be proposed to provide a theoretical reference for the construction of a robust and efficient agricultural distribution system.
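The study's DEA efficiency scores can be reproduced in miniature with the classical input-oriented CCR model, solved as one linear program per decision-making unit (DMU). The data below are hypothetical distribution units, not the paper's; only the model structure is the point.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: 5 distribution units, 2 inputs (cost, labour),
# 1 output (delivered volume). Rows = DMUs.
X = np.array([[20.0, 30], [30, 20], [40, 50], [25, 25], [50, 40]])  # inputs
Y = np.array([[100.0], [120], [150], [110], [160]])                 # outputs

def ccr_efficiency(k):
    """Input-oriented CCR efficiency of DMU k: minimise theta subject to
    sum_j lam_j * x_j <= theta * x_k (inputs),
    sum_j lam_j * y_j >= y_k        (outputs), lam >= 0."""
    n = len(X)
    c = np.zeros(n + 1)
    c[0] = 1.0                                   # variables: [theta, lam_1..lam_n]
    # input rows:  sum_j lam_j x_ij - theta x_ik <= 0
    A_in = np.hstack([-X[k].reshape(-1, 1), X.T])
    # output rows: -sum_j lam_j y_rj <= -y_rk
    A_out = np.hstack([np.zeros((Y.shape[1], 1)), -Y.T])
    A = np.vstack([A_in, A_out])
    b = np.concatenate([np.zeros(X.shape[1]), -Y[k]])
    res = linprog(c, A_ub=A, b_ub=b, bounds=[(0, None)] * (n + 1))
    return res.fun

for k in range(len(X)):
    print(f"DMU {k}: efficiency = {ccr_efficiency(k):.3f}")
```

Units with efficiency 1 lie on the empirical frontier; scores below 1 measure how far a unit's inputs could be scaled down while still producing its outputs.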
Measuring the atmospheric organic aerosol volatility distribution: a theoretical analysis
Directory of Open Access Journals (Sweden)
E. Karnezi
2014-01-01
Full Text Available Organic compounds represent a significant fraction of submicrometer atmospheric aerosol mass. Even though most of these compounds are semi-volatile at atmospheric concentrations, the ambient organic aerosol volatility is quite uncertain. The most common volatility measurement method relies on the use of a thermodenuder (TD). The aerosol passes through a heated tube where its more volatile components evaporate, leaving the less volatile behind in the particulate phase. The typical result of a thermodenuder measurement is the mass fraction remaining (MFR), which depends, among other factors, on the organic aerosol (OA) vaporization enthalpy and the accommodation coefficient. We use a new method combining forward modeling, introduction of "experimental" error, and inverse modeling with error minimization for the interpretation of TD measurements. The OA volatility distribution, its effective vaporization enthalpy, the mass accommodation coefficient, and the corresponding uncertainty ranges are calculated. Our results indicate that existing TD-based approaches quite often cannot reliably estimate the OA volatility distribution, leading to large uncertainties, since many different combinations of the three properties can lead to similar thermograms. We propose an improved experimental approach combining TD and isothermal dilution measurements. We evaluate this experimental approach using the same model and show that it is suitable for studies of OA volatility in the lab and the field.
VISUALIZATION AND ANALYSIS OF LPS DISTRIBUTION IN BINARY PHOSPHOLIPID BILAYERS
Florencia, Henning María; Susana, Sanchez; Laura, Bakás
2010-01-01
Lipopolysaccharide (LPS) is an endotoxin released from the outer membrane of Gram-negative bacteria during infections. It has been reported that LPS may play a role in the outer membrane of bacteria similar to that of cholesterol in eukaryotic plasma membranes. In this article we compare the effect of introducing LPS or cholesterol into liposomes made of dipalmitoylphosphatidylcholine/dioleoylphosphatidylcholine on the solubilization process by Triton X-100. The results show that liposomes containing LPS or cholesterol are more resistant to solubilization by Triton X-100 than the binary phospholipid mixtures at 4°C. The LPS distribution was analyzed on GUVs of DPPC:DOPC using FITC-LPS. Solid and liquid-crystalline domains were visualized by labeling the GUVs with LAURDAN, and GP images were acquired using a two-photon microscope. The images show a selective distribution of LPS in gel domains. Our results support the hypothesis that LPS could aggregate and concentrate selectively in biological membranes, providing a mechanism to bring together several components of the LPS-sensing machinery. PMID:19324006
Analysis of an algorithm for distributed recognition and accountability
Energy Technology Data Exchange (ETDEWEB)
Ko, C.; Frincke, D.A.; Goan, T. Jr.; Heberlein, L.T.; Levitt, K.; Mukherjee, B.; Wee, C. [California Univ., Davis, CA (United States). Dept. of Computer Science
1993-08-01
Computer and network systems are vulnerable to attacks. Abandoning the existing huge infrastructure of possibly-insecure computer and network systems is impossible, and replacing them with totally secure systems may not be feasible or cost effective. A common element in many attacks is that a single user will often attempt to intrude upon multiple resources throughout a network. Detecting the attack can become significantly easier by compiling and integrating evidence of such intrusion attempts across the network rather than attempting to assess the situation from the vantage point of only a single host. To solve this problem, we suggest an approach for distributed recognition and accountability (DRA), which consists of algorithms which "process," at a central location, distributed and asynchronous "reports" generated by computers (or a subset thereof) throughout the network. Our highest-priority objectives are to observe the ways in which an individual moves around in a network of computers, including changing user names to possibly hide his/her true identity, and to associate all activities of multiple instances of the same individual with the same network-wide user. We present the DRA algorithm and a sketch of its proof under an initial set of simplifying albeit realistic assumptions. Later, we relax these assumptions to accommodate pragmatic aspects such as missing or delayed "reports," clock skew, tampered "reports," etc. We believe that such algorithms will have widespread applications in the future, particularly in intrusion-detection systems.
Visualization and analysis of lipopolysaccharide distribution in binary phospholipid bilayers
Energy Technology Data Exchange (ETDEWEB)
Henning, Maria Florencia [Instituto de Investigaciones Bioquimicas La Plata (INIBIOLP), CCT-La Plata, CONICET, Facultad de Ciencias Medicas, UNLP, Calles 60 y 120, 1900 La Plata (Argentina); Sanchez, Susana [Laboratory for Fluorescence Dynamics, University of California-Irvine, Irvine, CA (United States); Bakas, Laura, E-mail: lbakas@biol.unlp.edu.ar [Instituto de Investigaciones Bioquimicas La Plata (INIBIOLP), CCT-La Plata, CONICET, Facultad de Ciencias Medicas, UNLP, Calles 60 y 120, 1900 La Plata (Argentina); Departamento de Ciencias Biologicas, Facultad de Ciencias Exactas, UNLP, Calles 47 y 115, 1900 La Plata (Argentina)
2009-05-22
Lipopolysaccharide (LPS) is an endotoxin released from the outer membrane of Gram-negative bacteria during infections. It has been reported that LPS may play a role in the outer membrane of bacteria similar to that of cholesterol in eukaryotic plasma membranes. In this article we compare the effect of introducing LPS or cholesterol into liposomes made of dipalmitoylphosphatidylcholine/dioleoylphosphatidylcholine on the solubilization process by Triton X-100. The results show that liposomes containing LPS or cholesterol are more resistant to solubilization by Triton X-100 than the binary phospholipid mixtures at 4 °C. The LPS distribution was analyzed on GUVs of DPPC:DOPC using FITC-LPS. Solid and liquid-crystalline domains were visualized by labeling the GUVs with LAURDAN, and GP images were acquired using a two-photon microscope. The images show a selective distribution of LPS in gel domains. Our results support the hypothesis that LPS could aggregate and concentrate selectively in biological membranes, providing a mechanism to bring together several components of the LPS-sensing machinery.
Performance analysis of structured pedigree distributed fusion systems
Arambel, Pablo O.
2009-05-01
Structured pedigree is a way to compress pedigree information. When applied to distributed fusion systems, the approach avoids the well-known problem of information double counting that results from ignoring the cross-correlation among fused estimates. Other schemes that attempt to compute optimal fused estimates require the transmission of full pedigree information or raw data. This usually cannot be implemented in practical systems because of the enormous requirements in communications bandwidth. The Structured Pedigree approach achieves data compression by maintaining multiple covariance matrices, one for each uncorrelated source in the network. These covariance matrices are transmitted by each node along with the state estimate. This represents a significant compression when compared to full pedigree schemes. The transmission of these covariance matrices (or a subset of them) allows for an efficient fusion of the estimates, while avoiding information double counting and guaranteeing consistency of the estimates. This is achieved by exploiting the additional partial knowledge of the correlation of the estimates. The approach uses a generalized version of the Split Covariance Intersection algorithm that applies to multiple estimates and multiple uncorrelated sources. In this paper we study the performance of the proposed distributed fusion system by analyzing a simple but instructive example.
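The Covariance Intersection idea underlying the Split CI algorithm above can be sketched in its basic two-estimate form (the split, multi-source generalization in the paper follows the same pattern). The numbers below are illustrative, not from the paper:

```python
import numpy as np

def covariance_intersection(x1, P1, x2, P2, omega):
    """Fuse two estimates with unknown cross-correlation via Covariance
    Intersection: P^-1 = w*P1^-1 + (1-w)*P2^-1. The fused covariance is
    consistent for any true cross-covariance, at the price of conservatism."""
    I1, I2 = np.linalg.inv(P1), np.linalg.inv(P2)
    P = np.linalg.inv(omega * I1 + (1 - omega) * I2)
    x = P @ (omega * I1 @ x1 + (1 - omega) * I2 @ x2)
    return x, P

x1, P1 = np.array([1.0, 0.0]), np.diag([1.0, 4.0])
x2, P2 = np.array([1.2, 0.3]), np.diag([4.0, 1.0])

# Pick omega minimising the trace of the fused covariance (coarse grid search).
omegas = np.linspace(0.01, 0.99, 99)
best = min(omegas,
           key=lambda w: np.trace(covariance_intersection(x1, P1, x2, P2, w)[1]))
x, P = covariance_intersection(x1, P1, x2, P2, best)
print("fused state:", x, "trace(P):", np.trace(P))
```

Because each source is informative in a different direction here, the fused covariance has a smaller trace than either input, without ever claiming more information than is justified.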
Directory of Open Access Journals (Sweden)
Taliat Ola Yusuf
2014-01-01
Full Text Available This paper investigates the influence of blending metakaolin with silica-rich palm oil fuel ash (POFA) on the strength distribution of geopolymer mortar. The broadness of the strength distribution of quasi-brittle to brittle materials depends strongly on the existence of flaws such as voids, microcracks, and impurities in the material. Blending of materials containing alumina and silica with the objective of improving the performance of geopolymer makes comprehensive characterization necessary. The Weibull distribution is used to study the strength distribution and the reliability of geopolymer mortar specimens prepared from 100% metakaolin and from blends with 50% and 70% POFA, cured under ambient conditions. Mortar prisms and cubes were used to test the materials in flexure and compression, respectively, at 28 days, and the results were analyzed using the Weibull distribution. In flexure, the Weibull modulus increased with POFA replacement, indicating a reduced broadness of the strength distribution from an increased homogeneity of the material. The modulus, however, decreased with increasing POFA replacement in the specimens tested under compression. It is concluded that the Weibull distribution is suitable for analyses of the blended geopolymer system. While the porous microstructure is mainly responsible for flexural failure, the heterogeneity of reaction relics is responsible for the compression failure.
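Weibull modulus estimates of the kind reported above are commonly obtained from the linearized Weibull plot: sorting the strengths, assigning failure probabilities F_i = (i − 0.5)/n, and regressing ln(−ln(1−F)) on ln σ, whose slope is the modulus m. A sketch on synthetic data (the specific estimator and values are illustrative, not the paper's):

```python
import numpy as np

def weibull_modulus(strengths):
    """Estimate the Weibull modulus m by least squares on the linearised
    form ln(-ln(1-F)) = m*ln(sigma) - m*ln(sigma0), with the probability
    estimator F_i = (i - 0.5)/n."""
    s = np.sort(np.asarray(strengths, dtype=float))
    n = len(s)
    F = (np.arange(1, n + 1) - 0.5) / n
    y = np.log(-np.log(1 - F))
    m, intercept = np.polyfit(np.log(s), y, 1)
    sigma0 = np.exp(-intercept / m)   # characteristic strength
    return m, sigma0

# Check on synthetic data drawn from a known Weibull (m = 10, sigma0 = 50 MPa).
rng = np.random.default_rng(0)
sample = 50.0 * rng.weibull(10.0, size=500)
m_hat, s0_hat = weibull_modulus(sample)
print(f"m = {m_hat:.1f}, sigma0 = {s0_hat:.1f} MPa")
```

A higher recovered m means a narrower strength distribution, which is exactly the quantity the paper uses to compare the flexural and compressive behavior of the blends.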
A global survey on the seasonal variation of the marginal distribution of daily precipitation
Papalexiou, Simon Michael; Koutsoyiannis, Demetris
2016-08-01
To characterize the seasonal variation of the marginal distribution of daily precipitation, it is important to find which statistical characteristics of daily precipitation actually vary the most from month to month and which can be regarded as invariant. Relevant to the latter issue is the question of whether there is a single model capable of describing effectively the nonzero daily precipitation for every month worldwide. To study these questions we introduce and apply a novel test for seasonal variation (SV-Test) and explore the performance of two flexible distributions in a massive analysis of approximately 170,000 monthly daily-precipitation records at more than 14,000 stations from all over the globe. The analysis indicates that: (a) the shape characteristics of the marginal distribution of daily precipitation generally vary over the months, (b) commonly used distributions such as the Exponential, Gamma, Weibull, Lognormal, and Pareto are incapable of describing "universally" the daily precipitation, (c) exponential-tail distributions like the Exponential, mixed Exponentials, or the Gamma can severely underestimate the magnitude of extreme events and thus may be a wrong choice, and (d) the Burr type XII and the Generalized Gamma distributions are two good models, with the latter performing exceptionally well.
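Point (c), the underestimation of extremes by exponential-tail models, is easy to make concrete. The toy comparison below (parameter values chosen for illustration) matches a unit-mean exponential against a Lomax (Pareto type II) distribution with the same mean and compares their survival probabilities far in the tail:

```python
import math

# A Lomax (Pareto II) with shape a = 3 and scale s = 2 has mean
# s/(a-1) = 1, the same as a unit-mean exponential, but a power-law tail.
a, s = 3.0, 2.0

def pareto_sf(x):          # survival function of the Lomax distribution
    return (1 + x / s) ** (-a)

def expon_sf(x):           # survival function of the unit-mean exponential
    return math.exp(-x)

x = 10.0                   # an event ten times the mean daily amount
print(f"Pareto tail:      P(X > {x}) = {pareto_sf(x):.2e}")
print(f"Exponential tail: P(X > {x}) = {expon_sf(x):.2e}")
print(f"underestimation factor: {pareto_sf(x) / expon_sf(x):.0f}x")
```

Even with identical means, the exponential assigns roughly two orders of magnitude less probability to the ten-times-mean event, which is why the paper flags exponential-tail families as risky for precipitation extremes.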
Analysis of the tropospheric water distribution during FIRE 2
Westphal, Douglas L.
1993-01-01
The Penn State/NCAR mesoscale model, as adapted for use at ARC, was used as a testbed for the development and validation of cloud models for use in General Circulation Models (GCM's). This modeling approach also allows us to intercompare the predictions of the various cloud schemes within the same dynamical framework. The use of the PSU/NCAR mesoscale model also allows us to compare our results with FIRE-II (First International Satellite Cloud Climatology Project Regional Experiment) observations, instead of climate statistics. Though a promising approach, our work to date revealed several difficulties. First, the model by design is limited in spatial coverage and is only run for 12 to 48 hours at a time. Hence the quality of the simulation will depend heavily on the initial conditions. The poor quality of upper-tropospheric measurements of water vapor is well known and the situation is particularly bad for mid-latitude winter since the coupling with the surface is less direct than in summer so that relying on the model to spin-up a reasonable moisture field is not always successful. Though one of the most common atmospheric constituents, water vapor is relatively difficult to measure accurately, especially operationally over large areas. The standard NWS sondes have little sensitivity at the low temperatures where cirrus form and the data from the GOES 6.7 micron channel is difficult to quantify. For this reason, the goals of FIRE Cirrus II included characterizing the three-dimensional distribution of water vapor and clouds. In studying the data from FIRE Cirrus II, it was found that no single special observation technique provides accurate regional distributions of water vapor. The Raman lidar provides accurate measurements, but only at the Hub, for levels up to 10 km, and during nighttime hours. The CLASS sondes are more sensitive to moisture at low temperatures than are the NWS sondes, but the four stations only cover an area of two hundred kilometers on a side
Indian Academy of Sciences (India)
R Saravanan; K S Syed Ali; S Israel
2008-04-01
The local, average and electronic structure of the semiconducting materials Si and Ge has been studied using multipole, maximum entropy method (MEM) and pair distribution function (PDF) analyses, using X-ray powder data. The covalent nature of bonding and the interaction between the atoms are clearly revealed by the two-dimensional MEM maps plotted on (1 0 0) and (1 1 0) planes and one-dimensional density along [1 0 0], [1 1 0] and [1 1 1] directions. The mid-bond electron densities between the atoms are 0.554 e/Å³ and 0.187 e/Å³ for Si and Ge respectively. In this work, the local structural information has also been obtained by analyzing the atomic pair distribution function. An attempt has been made in the present work to utilize the X-ray powder data sets to refine the structure and electron density distribution using the currently available versatile methods, MEM, multipole analysis and determination of pair distribution function for these two systems.
Energy system analysis of fuel cells and distributed generation
DEFF Research Database (Denmark)
Mathiesen, Brian Vad; Lund, Henrik
2007-01-01
on the energy system in which they are used. Consequently, coherent energy systems analyses of specific and complete energy systems must be conducted in order to evaluate the benefits of FC technologies and in order to be able to compare alternative solutions. In relation to distributed generation, FC...... can be used for such analyses. Moreover, the chapter presents the results of evaluating the overall system fuel savings achieved by introducing different FC applications into different energy systems. Natural gas-based and hydrogen-based micro FC-CHP, natural gas local FC-CHP plants for district...... technologies have different strengths and weaknesses in different energy systems, but often they do not have the expected effect. Specific analyses of each individual country must be conducted including scenarios of expansion of e.g. wind power in order to evaluate where and when the best use of FC...
Parallelization of Finite Element Analysis Codes Using Heterogeneous Distributed Computing
Ozguner, Fusun
1996-01-01
Performance gains in computer design are quickly consumed as users seek to analyze larger problems to a higher degree of accuracy. Innovative computational methods, such as parallel and distributed computing, seek to multiply the power of existing hardware technology to satisfy the computational demands of large applications. In the early stages of this project, experiments were performed using two large, coarse-grained applications, CSTEM and METCAN. These applications were parallelized on an Intel iPSC/860 hypercube. It was found that the overall speedup was very low, due to large, inherently sequential code segments present in the applications. The overall execution time T_par of the application is dependent on these sequential segments. If these segments make up a significant fraction of the overall code, the application will have a poor speedup measure.
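The dependence of speedup on the sequential fraction described above is Amdahl's law: with a fraction f of the work inherently serial, the speedup on n processors is bounded by 1/(f + (1−f)/n), and by 1/f as n grows. A quick sketch (the fractions are illustrative, not measured from CSTEM or METCAN):

```python
def amdahl_speedup(serial_fraction, n_processors):
    """Amdahl's law: speedup when a fraction f of the work is inherently
    sequential and the remainder parallelises perfectly."""
    f = serial_fraction
    return 1.0 / (f + (1.0 - f) / n_processors)

# Even a modest sequential fraction caps the achievable speedup hard:
for f in (0.05, 0.20, 0.40):
    print(f"f = {f:.0%}: speedup on 32 CPUs = {amdahl_speedup(f, 32):.1f}, "
          f"asymptotic limit = {1 / f:.0f}")
```

With 20% serial code, even 32 processors yield less than a 4.5x speedup, which matches the qualitative finding reported for the hypercube experiments.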
BEANS - a software package for distributed Big Data analysis
Hypki, Arkadiusz
2016-01-01
BEANS is a new web-based software tool, easy to install and maintain, for storing and analysing massive amounts of data in a distributed way. It provides a clear interface for querying, filtering, aggregating, and plotting data from an arbitrary number of datasets. Its main purpose is to simplify the process of storing, examining and finding new relations in so-called Big Data. The creation of the BEANS software is an answer to the growing need of the astronomical community for a versatile tool to store, analyse and compare complex astrophysical numerical simulations with observations (e.g. simulations of the Galaxy or star clusters with the Gaia archive). However, this software was built in a general form and is ready to use in any other research field.
Distributed Detection over Time Varying Networks: Large Deviations Analysis
Bajovic, Dragana; Xavier, Joao; Sinopoli, Bruno; Moura, Jose M F
2010-01-01
We apply large deviations theory to study the asymptotic performance of running consensus distributed detection in sensor networks. Running consensus is a recently proposed stochastic-approximation-type algorithm. At each time step k, the state at each sensor is updated by a local averaging of the sensor's own state and the states of its neighbors (consensus) and by accounting for the new observations (innovation). We assume Gaussian, spatially correlated observations. We allow the underlying network to be time varying, provided that the graph that collects the union of links that are online at least once over a finite time window is connected. This paper shows through large deviations that, under the stated assumptions on the network connectivity and sensors' observations, running consensus detection asymptotically approaches in performance the optimal centralized detection. That is, the Bayes probability of detection error (with the running consensus detector) decays exponentially to zero as k goes to infinity at...
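The consensus half of the update above can be sketched in a few lines: with a doubly stochastic weight matrix on a connected graph, repeated local averaging drives every sensor's state to the network-wide mean of the local detection statistics, which is what the centralized detector would compute. The sketch below uses a fixed ring network and made-up statistics; the paper's setting adds innovations and time-varying links on top of this mechanism.

```python
import numpy as np

# Equal weights on a fixed ring of 5 sensors: each node averages itself
# and its two neighbors. The resulting W is doubly stochastic.
n = 5
W = np.zeros((n, n))
for i in range(n):
    for j in ((i - 1) % n, (i + 1) % n):
        W[i, j] = 1.0 / 3.0
    W[i, i] = 1.0 / 3.0

# Local detection statistics (e.g. log-likelihood ratios at each sensor).
y = np.array([0.8, -0.2, 1.5, 0.3, 0.6])

# Iterated local averaging: every state converges to y.mean() = 0.6,
# the statistic the centralized detector would use.
x = y.copy()
for _ in range(200):
    x = W @ x
print(x)
```

Convergence is geometric at a rate set by the second-largest eigenvalue modulus of W; the large deviations analysis quantifies how fast the resulting detector's error probability then decays in k.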
Legendre analysis of differential distributions in hadronic reactions
Azimov, Yakov I.; Strakovsky, Igor I.; Briscoe, William J.; Workman, Ron L.
2017-02-01
Modern experimental facilities have provided a tremendous volume of reaction data, often with wide energy and angular coverage, and with increasing precision. For reactions with two hadrons in the final state, these data are often presented as multiple sets of panels, with angular distributions at numerous specific energies. Such presentations have limited visual appeal, and their physical content is typically extracted through some model-dependent treatment. Instead, we explore the use of a Legendre series expansion with a relatively small number of essential coefficients. This approach has been applied in several recent experimental investigations. We present some general properties of the Legendre coefficients in the helicity framework and consider what physical information can be extracted without any model-dependent assumptions.
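The Legendre-series representation advocated above is straightforward to apply with standard tools: an angular distribution measured on cosθ bins is fitted to a short series whose coefficients carry the physics. The sketch below uses a noiseless synthetic distribution with made-up coefficients to show that the fit recovers them:

```python
import numpy as np
from numpy.polynomial import legendre

# Synthetic differential cross section dσ/dΩ(cosθ) built from a short
# Legendre series with known (hypothetical) coefficients.
true_coeffs = [1.0, 0.4, -0.25, 0.1]
cos_theta = np.linspace(-1, 1, 41)          # measured angular bins
data = legendre.legval(cos_theta, true_coeffs)

# Fit a degree-3 Legendre series to the angular distribution; for real
# data one would weight by the experimental uncertainties.
fitted = legendre.legfit(cos_theta, data, deg=3)
print(np.round(fitted, 6))
```

With noisy data the essential question, which the paper addresses, becomes how many coefficients are statistically significant; truncating the series too early or too late distorts the extracted physics.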
Statistical Analysis of Data for Timber Strengths
DEFF Research Database (Denmark)
Sørensen, John Dalsgaard
2003-01-01
... The statistical fits have generally been made using all data and the lower tail of the data. The Maximum Likelihood Method and the Least Squares Technique have been used to estimate the statistical parameters of the selected distributions. The results show that the 2-parameter Weibull distribution gives the best...
Distributed Parallel Computing in Data Analysis of Osteoporosis.
Waleska Simões, Priscyla; Venson, Ramon; Comunello, Eros; Casagrande, Rogério Antônio; Bigaton, Everson; da Silva Carlessi, Lucas; da Rosa, Maria Inês; Martins, Paulo João
2015-01-01
This research aimed to compare the performance of two load-balancing models (the Proportional and Autotuned algorithms) of the JPPF platform in data-mining processing of a database of osteoporosis and osteopenia cases. The analysis of execution times showed that the Proportional algorithm performed better in all cases.