On modeling of lifetime data using two-parameter Gamma and Weibull distributions
Shanker, Rama; Shukla, Kamlesh Kumar; Shanker, Ravi; Leonida, Tekie Asehun
2016-01-01
The analysis and modeling of lifetime data are crucial in almost all applied sciences, including medicine, insurance, engineering, behavioral sciences and finance, amongst others. The main objective of this paper is to present a comparative study of the two-parameter gamma and Weibull distributions for mode…
An EOQ Model with Two-Parameter Weibull Distribution Deterioration and Price-Dependent Demand
Mukhopadhyay, Sushanta; Mukherjee, R. N.; Chaudhuri, K. S.
2005-01-01
An inventory replenishment policy is developed for a deteriorating item and price-dependent demand. The rate of deterioration is taken to be time-proportional and the time to deterioration is assumed to follow a two-parameter Weibull distribution. A power law form of the price dependence of demand is considered. The model is solved analytically…
Weibull analyses of bacterial interaction forces measured using AFM
van der Mei, Henderina; de Vries, Jacob; Busscher, Hendrik
2010-01-01
Statistically significant conclusions from interaction forces obtained by AFM are difficult to draw because of large data spreads. Weibull analysis, common in macroscopic bond-strength analyses, takes advantage of this spread to derive a Weibull distribution, yielding the probability of occurrence o
1981-12-01
the variance of point estimators are given by Mendenhall and Scheaffer (Ref 17:269), for both biased and unbiased estimations. In addition to this...Weibull Distribution. Thesis, Wright-Patterson AFB, Ohio: Air Force Institute of Technology, December 1980. 17. Mendenhall, W. and R. L. Scheaffer
Vanfleteren, J R; De Vreese, A; Braeckman, B P
1998-11-01
We have fitted Gompertz, Weibull, and two- and three-parameter logistic equations to survival data obtained from 77 cohorts of Caenorhabditis elegans in axenic culture. Statistical analysis showed that the fitting ability was in the order: three-parameter logistic > two-parameter logistic = Weibull > Gompertz. Pooled data were better fit by the logistic equations, which tended to perform equally well as population size increased, suggesting that the third parameter is likely to be biologically irrelevant. Considering restraints imposed by the small population sizes used, we simply conclude that the two-parameter logistic and Weibull mortality models for axenically grown C. elegans generally provided good fits to the data, whereas the Gompertz model was inappropriate in many cases. The survival curves of several short- and long-lived mutant strains could be predicted by adjusting only the logistic curve parameter that defines mean life span. We conclude that life expectancy is genetically determined; the life span-altering mutations reported in this study define a novel mean life span, but do not appear to fundamentally alter the aging process.
Directory of Open Access Journals (Sweden)
A. Calzado
2013-04-01
Aim of study: The aim of this work was to model diameter distributions of Quercus suber stands. The ultimate goal was to construct models enabling the development of more affordable forest inventory methods. This is the first study of this type on cork oak forests in the area.
Area of study: "Los Alcornocales" Natural Park (Cádiz-Málaga, Spain).
Material and methods: The diameter distributions of 100 permanent plots were modelled with the two-parameter Weibull function. Distribution parameters were fitted with the non-linear regression, maximum likelihood, moment and percentile-based methods. Goodness of fit with the different methods was compared in terms of the number of plots rejected by the Kolmogorov-Smirnov test, bias, mean square error and mean absolute error. The scale and shape parameters of the Weibull function were related to the stand variables by using the parameter prediction model.
Main results: The best fit was obtained with the non-linear regression approach, using as initial values those obtained by the maximum likelihood method; the percentage of rejections by the Kolmogorov-Smirnov test was 2% of the total number of cases. The scale parameter (b) was successfully modelled in terms of the quadratic mean diameter under cork (adjusted R² = 0.99). The shape parameter (c) was modelled by using maximum diameter, minimum diameter and plot elevation (adjusted R² = 0.40).
Research highlights: The proposed diameter distribution model can be a highly useful tool for the inventorying and management of cork oak forests.
Key words: maximum likelihood method; moment method; non-linear regression approach; parameter prediction model; percentile method; scale parameter; shape parameter.
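The fitting workflow used in the study above (maximum-likelihood estimation of the two-parameter Weibull, then a Kolmogorov-Smirnov check) can be sketched with SciPy. The synthetic "diameters" and parameter values below are illustrative, not the study's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic diameters drawn from a known two-parameter Weibull
# (shape c=2.5, scale b=20 cm) stand in for real plot measurements.
diameters = stats.weibull_min.rvs(2.5, scale=20.0, size=500, random_state=rng)

# Maximum-likelihood fit of the two-parameter Weibull (location fixed at 0)
c_hat, loc, b_hat = stats.weibull_min.fit(diameters, floc=0)
print(f"shape c = {c_hat:.2f}, scale b = {b_hat:.2f}")

# Kolmogorov-Smirnov goodness-of-fit test against the fitted distribution
ks_stat, p_value = stats.kstest(diameters, "weibull_min", args=(c_hat, 0, b_hat))
print(f"KS statistic = {ks_stat:.3f}, p = {p_value:.3f}")
```

In a real inventory application the KS step would be repeated per plot, counting rejections as in the paper.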
Institute of Scientific and Technical Information of China (English)
刘方亮; 刘井泉; 刘伟
2011-01-01
The processing of equipment reliability data is the basis of reliability-centered maintenance (RCM) and life cycle management (LCM) in nuclear power plants. In actual failure data processing, however, problems arise such as small samples and non-independent data caused by maintenance. To resolve these problems, a processing method combining the two-parameter Weibull distribution as the life model with a Bayesian method for small samples was proposed, and was validated using actual nuclear power plant operating data. The results show that the method has better applicability and accuracy when dealing with small samples and with repair and aging effects in nuclear power plants.
Abaidoo-Ayin, Harold K; Boakye, Prince G; Jones, Kerby C; Wyatt, Victor T; Besong, Samuel A; Lumor, Stephen E
2017-08-01
This study investigated the compositional characteristics and shelf-life of Njangsa seed oil (NSO). Oil from Njangsa had a high polyunsaturated fatty acid (PUFA) content of which alpha eleostearic acid (α-ESA), an unusual conjugated linoleic acid, was the most prevalent (about 52%). Linoleic acid was also present in appreciable amounts (approximately 34%). Our investigations also indicated that the acid-catalyzed transesterification of NSO resulted in lower yields of α-ESA methyl esters, due to isomerization, a phenomenon which was not observed under basic conditions. The triacylglycerol (TAG) profile analysis showed the presence of at least 1 α-ESA fatty acid chain in more than 95% of the oil's TAGs. Shelf-life was determined by the Weibull Hazard Sensory Method, where the end of shelf-life was defined as the time at which 50% of panelists found the flavor of NSO to be unacceptable. This was determined as 21 wk. Our findings therefore support the potential commercial viability of NSO as an important source of physiologically beneficial PUFAs.
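Under a two-parameter Weibull model of panelist rejection, the shelf-life criterion above (time at which 50% reject) has a closed form: t50 = η(ln 2)^(1/β). A minimal sketch, with β and η chosen purely for illustration (so that t50 lands near the reported 21 wk), not taken from the paper:

```python
import math

# Hypothetical Weibull parameters for the fraction of panelists rejecting
# the flavor by time t (weeks); beta and eta are illustrative, not fitted.
beta = 3.0   # shape: rejection accelerates with storage time
eta = 23.9   # scale (weeks): time by which ~63.2% of panelists reject

def rejection_fraction(t):
    """Weibull CDF: fraction of panelists finding the flavor unacceptable by t."""
    return 1.0 - math.exp(-((t / eta) ** beta))

# End of shelf-life: time at which 50% of panelists reject the flavor
t50 = eta * math.log(2) ** (1.0 / beta)
print(f"shelf-life ~ {t50:.1f} weeks")
```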
Investigation of Weibull statistics in fracture analysis of cast aluminum
Holland, F. A., Jr.; Zaretsky, E. V.
1989-01-01
The fracture strengths of two large batches of A357-T6 cast aluminum coupon specimens were compared by using two-parameter Weibull analysis. The minimum number of these specimens necessary to find the fracture strength of the material was determined. The applicability of three-parameter Weibull analysis was also investigated. A design methodology based on the combination of elementary stress analysis and Weibull statistical analysis is advanced and applied to the design of a spherical pressure vessel shell. The results from this design methodology are compared with results from the applicable ASME pressure vessel code.
A MULTIVARIATE WEIBULL DISTRIBUTION
Directory of Open Access Journals (Sweden)
Cheng Lee
2010-07-01
A multivariate survival function of the Weibull distribution is developed by expanding the theorem by Lu and Bhattacharyya. From the survival function, the probability density function, the cumulative probability function, the determinant of the Jacobian matrix, and the general moment are derived.
Transmuted Complementary Weibull Geometric Distribution
Directory of Open Access Journals (Sweden)
Ahmed Z. Afify
2014-12-01
This paper provides a new generalization of the complementary Weibull geometric distribution introduced by Tojeiro et al. (2014), using the quadratic rank transmutation map studied by Shaw and Buckley (2007). The new distribution is referred to as the transmuted complementary Weibull geometric distribution (TCWGD). The TCWGD includes as special cases the complementary Weibull geometric distribution (CWGD), the complementary exponential geometric distribution (CEGD), the Weibull distribution (WD) and the exponential distribution (ED). Various structural properties of the new distribution, including moments, quantiles, the moment generating function and the Rényi entropy, are derived. We propose the method of maximum likelihood for estimating the model parameters and obtain the observed information matrix. A real data set is used to compare the flexibility of the transmuted version versus the complementary Weibull geometric distribution.
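The quadratic rank transmutation map used above takes a base CDF F(x) to G(x) = (1 + λ)F(x) - λF(x)², with |λ| ≤ 1. A small sketch with a plain Weibull base (parameter values illustrative), numerically checking that the transmuted density still integrates to one:

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

lam = 0.5                                   # transmutation parameter, |lam| <= 1
base = stats.weibull_min(c=1.5, scale=2.0)  # base distribution: plain Weibull

def transmuted_cdf(x):
    """Quadratic rank transmutation map: G(x) = (1+lam)*F(x) - lam*F(x)**2."""
    F = base.cdf(x)
    return (1 + lam) * F - lam * F**2

def transmuted_pdf(x):
    """Density obtained by differentiating G: g(x) = f(x)*(1 + lam - 2*lam*F(x))."""
    return base.pdf(x) * (1 + lam - 2 * lam * base.cdf(x))

area, _ = quad(transmuted_pdf, 0, np.inf)
print(f"pdf integrates to {area:.6f}")
```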
On the Weibull distribution for wind energy assessment
DEFF Research Database (Denmark)
Batchvarova, Ekaterina; Gryning, Sven-Erik
2014-01-01
The two-parameter Weibull distribution is traditionally used to describe the long-term fluctuations in the wind speed as part of the theoretical framework for wind energy assessment of wind farms. The Weibull distribution is described by a shape and a scale parameter. Here, based on recent long-term measurements performed by a wind lidar, the vertical profile of the shape parameter is discussed for a sub-urban site, a coastal site and a marine site. The profile of the shape parameter was found to be substantially different over land and sea. A parameterization of the vertical behavior of the shape…
The Weibull distribution a handbook
Rinne, Horst
2008-01-01
The Most Comprehensive Book on the Subject. Chronicles the development of the Weibull distribution in statistical theory and applied statistics. Exploring one of the most important distributions in statistics, The Weibull Distribution: A Handbook focuses on its origin, statistical properties, and related distributions. The book also presents various approaches to estimating the parameters of the Weibull distribution under all possible situations of sampling data, as well as approaches to parameter and goodness-of-fit testing. Describes the Statistical Methods, Concepts, Theories, and Applications of T…
Biological implications of the Weibull and Gompertz models of aging.
Ricklefs, Robert E; Scheuerlein, Alex
2002-02-01
Gompertz and Weibull functions imply contrasting biological causes of demographic aging. The terms describing increasing mortality with age are multiplicative and additive, respectively, which could result from an increase in the vulnerability of individuals to extrinsic causes in the Gompertz model and the predominance of intrinsic causes at older ages in the Weibull model. Experiments that manipulate extrinsic mortality can distinguish these biological models. To facilitate analyses of experimental data, we defined a single index for the rate of aging (ω) for the Weibull and Gompertz functions. Each function described the increase in aging-related mortality in simulated ages at death reasonably well. However, in contrast to the Weibull ω_W, the Gompertz ω_G was sensitive to variation in the initial mortality rate independently of aging-related mortality. Comparisons between wild and captive populations appear to support the intrinsic-causes model for birds, but give mixed support for both models in mammals.
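The multiplicative-versus-additive contrast described above can be made concrete with the usual hazard forms. A hedged sketch, with all parameter values illustrative rather than fitted to any cohort:

```python
import math

def gompertz_hazard(x, m0=0.01, gamma=0.1):
    """Gompertz: initial mortality m0 multiplied by an exponential term in age x."""
    return m0 * math.exp(gamma * x)

def weibull_hazard(x, m0=0.01, alpha=1e-4, beta=2.0):
    """Weibull: an aging-related power-law term added to initial mortality m0."""
    return m0 + alpha * x ** beta

# At age 0 both hazards equal the initial mortality; they diverge at older ages,
# and only the Gompertz hazard scales with m0 at all ages.
for age in (0, 20, 40):
    print(age, round(gompertz_hazard(age), 4), round(weibull_hazard(age), 4))
```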
ZERODUR strength modeling with Weibull statistical distributions
Hartmann, Peter
2016-07-01
…large data set. With only 20 specimens per sample, such differentiation is not possible; it requires 100 specimens per set, the more the better. The validity of the statistical evaluation methods is discussed with several examples. These considerations are of special importance because of their consequences for the prognosis methods and results. In particular, the use of the two-parameter Weibull distribution for high-strength surfaces has led to unrealistic results. Extrapolation down to a low acceptable probability of failure covers a wide range without existing data points and is mainly influenced by the slope determined by the high-strength specimens. In the past this misconception has prevented the use of brittle materials for stress loads which they could have endured easily.
Scaling Analysis of the Tensile Strength of Bamboo Fibers Using Weibull Statistics
Directory of Open Access Journals (Sweden)
Jiaxing Shao
2013-01-01
This study demonstrates the effect of weak-link scaling on the tensile strength of bamboo fibers. The proposed model considers the random nature of fiber strength, which is reflected by using a two-parameter Weibull distribution function. Tension tests were performed on samples that could be scaled in length. The size effect of fiber length on strength was analyzed based on Weibull statistics. The results verify the use of Weibull parameters from specimen testing for predicting the strength distributions of fibers of longer gauge lengths.
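Under weak-link scaling with a two-parameter Weibull strength model, the failure probability at stress σ for gauge length L is 1 - exp(-(L/L0)(σ/σ0)^m), so median strength falls as L^(-1/m). A sketch with illustrative parameters, not measured bamboo-fiber data:

```python
import math

# Weak-link (size-effect) scaling under a two-parameter Weibull strength model.
# All parameter values below are illustrative.
m = 3.0          # Weibull modulus (shape)
sigma0 = 600.0   # scale strength (MPa) at reference gauge length L0
L0 = 10.0        # reference gauge length (mm)

def median_strength(L):
    """Stress at 50% failure probability for gauge length L:
    (L/L0)*(sigma/sigma0)**m = ln 2  =>  sigma = sigma0*((L0/L)*ln 2)**(1/m)."""
    return sigma0 * (L0 / L * math.log(2)) ** (1.0 / m)

# Longer fibers are weaker: median strength scales as L**(-1/m)
s10, s40 = median_strength(10.0), median_strength(40.0)
print(f"{s10:.1f} MPa at 10 mm vs {s40:.1f} MPa at 40 mm")
```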
Comparison of Parameter Estimation Methods for Transformer Weibull Lifetime Modelling
Institute of Scientific and Technical Information of China (English)
ZHOU Dan; LI Chengrong; WANG Zhongdong
2013-01-01
The two-parameter Weibull distribution is the most widely adopted lifetime model for power transformers. An appropriate parameter estimation method is essential to guarantee the accuracy of a derived Weibull lifetime model. Six popular parameter estimation methods (the maximum likelihood estimation method, two median rank regression methods, one regressing X on Y and the other regressing Y on X, the Kaplan-Meier method, the method based on the cumulative hazard plot, and Li's method) are reviewed and compared in order to find the optimal one for transformer Weibull lifetime modelling. The comparison took several different scenarios into consideration: 10 000 sets of lifetime data, each with a sampling size of 40 to 1 000 and a censoring rate of 90%, were obtained by Monte Carlo simulation for each scenario. The scale and shape parameters of the Weibull distribution estimated by the six methods, as well as their mean values, median values and 90% confidence bands, are obtained. The cross-comparison of these results reveals that, among the six methods, the maximum likelihood method is the best one, since it provides the most accurate Weibull parameters, i.e. parameters having the smallest bias in both mean and median values, as well as the shortest length of the 90% confidence band. The maximum likelihood method is therefore recommended over the other methods in transformer Weibull lifetime modelling.
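Two of the six estimators, maximum likelihood and median rank regression (regressing Y on X, with Bernard's approximation for the median ranks), can be compared in a scaled-down Monte Carlo sketch. The replicate count, sample size and absence of censoring here are simplifications relative to the paper's setup:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
true_shape, true_scale = 2.0, 100.0

def median_rank_regression(x):
    """Estimate Weibull parameters by regressing Y = ln(-ln(1-F)) on X = ln(x),
    using Bernard's approximation for the median ranks F."""
    x = np.sort(x)
    n = len(x)
    F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)
    X, Y = np.log(x), np.log(-np.log(1.0 - F))
    slope, intercept = np.polyfit(X, Y, 1)
    return slope, np.exp(-intercept / slope)   # shape, scale

# Small Monte Carlo comparison (far fewer replicates than the paper's 10 000)
mle_shapes, mrr_shapes = [], []
for _ in range(200):
    sample = stats.weibull_min.rvs(true_shape, scale=true_scale,
                                   size=40, random_state=rng)
    c_mle, _, _ = stats.weibull_min.fit(sample, floc=0)
    c_mrr, _ = median_rank_regression(sample)
    mle_shapes.append(c_mle)
    mrr_shapes.append(c_mrr)

print(f"MLE mean shape: {np.mean(mle_shapes):.3f}, "
      f"MRR mean shape: {np.mean(mrr_shapes):.3f}")
```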
Fitting Ranked Linguistic Data with Two-Parameter Functions
Directory of Open Access Journals (Sweden)
Wentian Li
2010-07-01
It is well known that many ranked linguistic data sets fit well with one-parameter models such as Zipf's law for ranked word frequencies. However, in cases where discrepancies from the one-parameter model occur (these will come at the two extremes of the rank), it is natural to use one more parameter in the fitting model. In this paper, we compare several two-parameter models, including the Beta function, the Yule function and the Weibull function (all can be framed as a multiple regression in the logarithmic scale) in their fitting performance on several ranked linguistic data sets, such as letter frequencies, word-spacings, and word frequencies. We observed that the Beta function fits the ranked letter frequency best, the Yule function fits the ranked word-spacing distribution best, and the Altmann, Beta and Yule functions all slightly outperform Zipf's power-law function for the word ranked-frequency distribution.
An Extension to the Weibull Process Model
1981-11-01
AN EXTENSION TO THE WEIBULL PROCESS MODEL. 1. INTRODUCTION. Recent papers by Bain and Engelhardt (1980) and Crow…
Directory of Open Access Journals (Sweden)
B.B. Sagar
2016-09-01
The aim of this paper is to estimate the number of defects in software and remove them successfully. The paper incorporates a Weibull distribution approach along with inflection S-shaped Software Reliability Growth Models (SRGM). In this combination, a two-parameter Weibull distribution methodology is used. The Relative Prediction Error (RPE) is calculated as the validity criterion of the developed model. Experimental results on actual data from five data sets are compared with two other existing models, showing that the proposed software reliability growth model gives better estimates for defect removal. The paper presents a software reliability growth model that includes features of both the Weibull distribution and the inflection S-shaped SRGM to estimate the defects of a software system, and provides help to researchers and software industries in developing highly reliable software products.
Censored Weibull Distributed Data in Experimental Design
Støtvig, Jeanett Gunneklev
2014-01-01
This thesis gives an introduction to experimental design and investigates how four methods handle Weibull distributed censored data: the quick-and-dirty method, the maximum likelihood method, single imputation and multiple imputation.
The Transmuted Generalized Inverse Weibull Distribution
Directory of Open Access Journals (Sweden)
Faton Merovci
2014-05-01
A generalization of the generalized inverse Weibull distribution, the so-called transmuted generalized inverse Weibull distribution, is proposed and studied. We use the quadratic rank transmutation map (QRTM) to generate a flexible family of probability distributions, taking the generalized inverse Weibull distribution as the base distribution and introducing a new parameter that offers more distributional flexibility. Various structural properties, including explicit expressions for the moments, quantiles, and moment generating function of the new distribution, are derived. We propose the method of maximum likelihood for estimating the model parameters and obtain the observed information matrix. A real data set is used to compare the flexibility of the transmuted version versus the generalized inverse Weibull distribution.
A CLASS OF WEIGHTED WEIBULL DISTRIBUTION
Directory of Open Access Journals (Sweden)
Saman Shahbaz
2010-07-01
The weighted Weibull model is proposed following the method of Azzalini (1985). Basic properties of the distribution, including moments, the generating function, the hazard rate function and estimation of parameters, have been studied.
The Weibull - log Weibull transition of interoccurrence times for synthetic and natural earthquakes
Hasumi, Tomohiro; Akimoto, Takuma; Aizawa, Yoji
2008-01-01
We have studied interoccurrence time distributions by analyzing the synthetic and three natural catalogs of the Japan Meteorological Agency (JMA), the Southern California Earthquake Data Center (SCEDC), and the Taiwan Central Weather Bureau (TCWB), and revealed a universal feature of the interoccurrence time statistics: the Weibull - log Weibull transition. This transition reinforces the view that the interoccurrence time statistics possess both Weibull statistics and log-Weibull statistics. The crossover magnitude from the superposition regime to the Weibull regime, $m_c^2$, is proportional to the plate velocity. In addition, we have found the region-independent relation $m_c^2/m_{max} = 0.54 \pm 0.004$.
Energy Technology Data Exchange (ETDEWEB)
Jaramillo, O.A.; Borja, M.A.
2004-07-01
The International Standard IEC 61400-12 and other international recommendations suggest the use of the two-parameter Weibull probability distribution function (PDF) to estimate the Annual Energy Production (AEP) of a wind turbine. Most commercial software uses the unimodal Weibull PDF as the default option to carry out estimations of AEP, which in turn are used to optimise wind farm layouts. Furthermore, AEP is essential data for assessing the economic feasibility of a wind power project. However, in some regions of the world, the use of these widely adopted and recommended methods leads to incorrect results. This is the case for the region of La Ventosa in Mexico, where the frequency of the wind speed shows a bimodal distribution. In this work, mathematical formulations using a Weibull PDF and a bimodal distribution are established to compare the AEP, the capacity factor and the levelised production cost for a specific wind turbine. By combining one year of wind speed data with the hypothetical power performance of the Vestas V27-225 kW wind turbine, it was found that using the Weibull PDF underestimates AEP (and thus the capacity factor) by about 12%. (author)
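The AEP estimate in question is AEP = 8760 · ∫ P(v) f(v) dv, with f the wind-speed PDF and P the turbine power curve. A sketch with an invented piecewise-linear power curve standing in for the real Vestas V27-225 kW curve, and illustrative Weibull parameters:

```python
import numpy as np

def power_kw(v):
    """Hypothetical piecewise-linear power curve for a 225 kW class turbine
    (cut-in 3.5 m/s, rated 14 m/s, cut-out 25 m/s); illustrative, not Vestas data."""
    if v < 3.5 or v > 25.0:
        return 0.0
    if v < 14.0:
        return 225.0 * (v - 3.5) / (14.0 - 3.5)
    return 225.0

def weibull_pdf(v, k, c):
    """Two-parameter Weibull probability density function."""
    return (k / c) * (v / c) ** (k - 1) * np.exp(-((v / c) ** k))

k, c = 2.0, 8.0                       # illustrative shape and scale (m/s)
v = np.linspace(0.0, 30.0, 3001)
f = weibull_pdf(v, k, c)
p = np.array([power_kw(x) for x in v])

# AEP = 8760 h * integral of P(v) f(v) dv, by the trapezoidal rule
integrand = p * f
aep_kwh = 8760.0 * float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(v)))
capacity_factor = aep_kwh / (8760.0 * 225.0)
print(f"AEP = {aep_kwh:.0f} kWh, capacity factor = {capacity_factor:.3f}")
```

Replacing the unimodal `weibull_pdf` with a two-component (bimodal) mixture is the comparison the paper carries out.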
Large-Scale Weibull Analysis of H-451 Nuclear-Grade Graphite Specimen Rupture Data
Nemeth, Noel N.; Walker, Andrew; Baker, Eric H.; Murthy, Pappu L.; Bratton, Robert L.
2012-01-01
A Weibull analysis was performed of the strength distribution and size effects for 2000 specimens of H-451 nuclear-grade graphite. The data, generated elsewhere, measured the tensile and four-point-flexure room-temperature rupture strength of specimens excised from a single extruded graphite log. Strength variation was compared with specimen location, size, and orientation relative to the parent body. In our study, data were progressively and extensively pooled into larger data sets to discriminate overall trends from local variations and to investigate the strength distribution. The CARES/Life and WeibPar codes were used to investigate issues regarding the size effect, Weibull parameter consistency, and nonlinear stress-strain response. Overall, the Weibull distribution described the behavior of the pooled data very well. However, the issue regarding the smaller-than-expected size effect remained. This exercise illustrated that a conservative approach using a two-parameter Weibull distribution is best for designing graphite components with low probability of failure for the in-core structures in the proposed Generation IV (Gen IV) high-temperature gas-cooled nuclear reactors. This exercise also demonstrated the continuing need to better understand the mechanisms driving stochastic strength response. Extensive appendixes are provided with this report to show all aspects of the rupture data and analytical results.
Weibull Distributions for the Preterm Delivery
Directory of Open Access Journals (Sweden)
Kavitha, N
2014-06-01
The purpose of this study is to evaluate the levels of CRH in pregnancy by using Weibull distributions. The study also found the rate of change in placental CRH and the level of maternal cortisol in preterm delivery by means of mathematical formulas.
Renormalizable two-parameter piecewise isometries.
Lowenstein, J H; Vivaldi, F
2016-06-01
We exhibit two distinct renormalization scenarios for two-parameter piecewise isometries, based on 2π/5 rotations of a rhombus and parameter-dependent translations. Both scenarios rely on the recently established renormalizability of a one-parameter triangle map, which takes place if and only if the parameter belongs to the algebraic number field K=Q(√5) associated with the rotation matrix. With two parameters, features emerge which have no counterpart in the single-parameter model. In the first scenario, we show that renormalizability is no longer rigid: whereas one of the two parameters is restricted to K, the second parameter can vary continuously over a real interval without destroying self-similarity. The mechanism involves neighbouring atoms which recombine after traversing distinct return paths. We show that this phenomenon also occurs in the simpler context of Rauzy-Veech renormalization of interval exchange transformations, here regarded as parametric piecewise isometries on a real interval. We explore this analogy in some detail. In the second scenario, which involves two-parameter deformations of a three-parameter rhombus map, we exhibit a weak form of rigidity. The phase space splits into several (non-convex) invariant components, on each of which the renormalization still has a free parameter. However, the foliations of the different components are transversal in parameter space; as a result, simultaneous self-similarity of the component maps requires that both of the original parameters belong to the field K.
NEW DOCTORAL DEGREE Parameter estimation problem in the Weibull model
Marković, Darija
2009-01-01
In this dissertation we consider the problem of the existence of best parameters in the Weibull model, one of the most widely used statistical models in reliability theory and life data theory. Particular attention is given to a 3-parameter Weibull model. We have listed some of the many applications of this model. We have described some of the classical methods for estimating parameters of the Weibull model, two graphical methods (Weibull probability plot and hazard plot), and two analyt...
General collision branching processes with two parameters
Institute of Scientific and Technical Information of China (English)
CHEN AnYue; LI JunPing
2009-01-01
A new class of branching models, the general collision branching processes with two parameters, is considered in this paper. For such models, it is necessary to evaluate the absorbing probabilities and mean extinction times for both absorbing states. Regularity and uniqueness criteria are firstly established. Explicit expressions are then obtained for the extinction probability vector, the mean extinction times and the conditional mean extinction times. The explosion behavior of these models is investigated and an explicit expression for mean explosion time is established. The mean global holding time is also obtained. It is revealed that these properties are substantially different between the super-explosive and sub-explosive cases.
Polynomial approximations of the Normal to Weibull Distribution transformation
Directory of Open Access Journals (Sweden)
Andrés Feijóo
2014-09-01
Some of the tools generally employed in power system analysis need approaches based on statistical distributions for simulating the cumulative behavior of the different system devices, for example the probabilistic load flow. The presence of wind farms in power systems has increased the use of the Weibull and Rayleigh distributions among them. Not only the distributions themselves, but also satisfying certain constraints such as correlation between series of data, or even autocorrelation, can be of importance in the simulation. Correlated Weibull or Rayleigh distributions can be obtained by transforming correlated Normal distributions, and it can be observed that certain statistical values such as the means and the standard deviations tend to be retained under such transformations, although why this happens is not evident. The objective of this paper is to analyse the consequences of using such transformations. The methodology consists of comparing the results obtained by means of a direct transformation with those obtained by means of approximations based on first- and second-degree polynomials. Simulations have been carried out with series of data which can be interpreted as wind speeds. The use of polynomial approximations gives accurate results in comparison with direct transformations and provides an approach that helps explain why the statistical values are retained during the transformations.
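The direct transformation mentioned above maps correlated Normal series to Weibull marginals through the CDFs, w = F_W⁻¹(Φ(z)). A sketch with illustrative parameters, showing that the correlation is largely retained through the transformation:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Two correlated standard Normal series (target correlation 0.8)
rho = 0.8
cov = [[1.0, rho], [rho, 1.0]]
z = rng.multivariate_normal([0.0, 0.0], cov, size=20000)

# Direct transformation Normal -> Weibull: w = F_W^{-1}(Phi(z))
k, c = 2.0, 8.0                             # illustrative shape and scale (m/s)
u = stats.norm.cdf(z)                       # uniform marginals
w = stats.weibull_min.ppf(u, k, scale=c)    # Weibull marginals

# Correlation survives the transformation almost unchanged
print(f"Normal corr: {np.corrcoef(z.T)[0, 1]:.3f}, "
      f"Weibull corr: {np.corrcoef(w.T)[0, 1]:.3f}")
```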
Integral points in two-parameter orbits
Corvaja, Pietro; Tucker, Thomas J; Zannier, Umberto
2012-01-01
Let K be a number field, let f: P_1 --> P_1 be a nonconstant rational map of degree greater than 1, let S be a finite set of places of K, and suppose that u, w in P_1(K) are not preperiodic under f. We prove that the set of (m,n) in N^2 such that f^m(u) is S-integral relative to f^n(w) is finite and effectively computable. This may be thought of as a two-parameter analog of a result of Silverman on integral points in orbits of rational maps. This issue can be translated in terms of integral points on an open subset of P_1^2; then one can apply a modern version of the method of Runge, after increasing the number of components at infinity by iterating the rational map. Alternatively, an ineffective result comes from a well-known theorem of Vojta.
Transmuted New Generalized Inverse Weibull Distribution
Directory of Open Access Journals (Sweden)
Muhammad Shuaib Khan
2017-06-01
This paper introduces the transmuted new generalized inverse Weibull distribution by using the quadratic rank transmutation map (QRTM) scheme studied by Shaw et al. (2007). The proposed model contains twenty-three lifetime distributions as special sub-models. Some mathematical properties of the new distribution are formulated, such as the quantile function, Rényi entropy, mean deviations, moments, moment generating function and order statistics. The method of maximum likelihood is used for estimating the model parameters. We illustrate the flexibility and potential usefulness of the new distribution by using reliability data.
Beam Elements on Linear Variable Two-Parameter Elastic Foundation
Directory of Open Access Journals (Sweden)
Iancu-Bogdan Teodoru
2008-01-01
The traditional way to overcome the shortcomings of the Winkler foundation model is to incorporate spring coupling by assemblages of mechanical elements such as springs, flexural elements (beams in one dimension, 1-D; plates in 2-D), shear-only layers and deformed, pretensioned membranes. This is the class of two-parameter foundations, named like this because they have a second parameter which introduces interactions between adjacent springs, in addition to the first parameter from the ordinary Winkler model. This class of models includes the Wieghardt, Filonenko-Borodich, Hetényi and Pasternak foundations. Mathematically, the equations describing the reaction of the two-parameter foundations are equilibrium equations, and the only difference is the definition of the parameters. In order to analyse the bending behavior of a Euler-Bernoulli beam resting on a linearly variable two-parameter elastic foundation, a displacement-based Finite Element (FE) formulation, based on the cubic displacement function of the governing differential equation, is introduced.
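For a Pasternak-type two-parameter foundation with constant parameters, the governing differential equation referred to above has the standard form below; this is a sketch of the relation, not the paper's derivation (in the linearly variable case, k_w and k_s become functions of x):

```latex
% Euler-Bernoulli beam on a two-parameter (Pasternak-type) elastic foundation.
% k_w : first (Winkler) parameter; k_s : second (shear) parameter coupling
% adjacent springs; q(x) : distributed load.
EI \frac{\mathrm{d}^4 w}{\mathrm{d}x^4} - k_s \frac{\mathrm{d}^2 w}{\mathrm{d}x^2} + k_w\, w = q(x)
```

Setting k_s = 0 recovers the ordinary Winkler model, which is why the second parameter is described as introducing interaction between adjacent springs.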
Using the Weibull distribution reliability, modeling and inference
McCool, John I
2012-01-01
Understand and utilize the latest developments in Weibull inferential methods While the Weibull distribution is widely used in science and engineering, most engineers do not have the necessary statistical training to implement the methodology effectively. Using the Weibull Distribution: Reliability, Modeling, and Inference fills a gap in the current literature on the topic, introducing a self-contained presentation of the probabilistic basis for the methodology while providing powerful techniques for extracting information from data. The author explains the use of the Weibull distribution
Directory of Open Access Journals (Sweden)
Sghaier T
2016-10-01
Full Text Available The objective of this study was to evaluate the effectiveness of both the Normal and the two-parameter Weibull distribution in describing the diameter distribution of Tetraclinis articulata stands in north-east Tunisia. The parameters of the Weibull function were estimated using the moments method and the maximum likelihood approach. The data used in this study came from temporary plots. The three diameter distribution models were compared firstly by estimating the parameters of the distribution directly from individual tree measurements taken in each plot (parameter estimation method), and secondly by predicting the same parameters from stand variables (parameter prediction method). The comparison was based on bias, mean absolute error, mean square error and the Reynolds index error (as a percentage). On the basis of the parameter estimation method, the Normal distribution gave slightly better results, whereas the Weibull distribution with the maximum likelihood approach gave the best results for the parameter prediction method. Hence, in the latter case, the Weibull distribution with the maximum likelihood approach appears to be the most suitable for estimating the parameters that reduce the different comparison criteria for the distribution of trees by diameter class in Tetraclinis articulata forests in Tunisia.
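The moments and maximum likelihood approaches mentioned above are the standard estimators for the two-parameter Weibull. As a minimal, dependency-free sketch (the function name and bisection bracket are illustrative choices, not taken from the paper), the ML shape estimate can be found by solving the usual score equation:

```python
import math

def weibull_mle(x, tol=1e-9):
    """Fit a two-parameter Weibull(k, c) to positive data x by maximum
    likelihood. The shape k solves the standard score equation
        sum(x^k ln x)/sum(x^k) - 1/k - mean(ln x) = 0,
    which crosses zero once on (0, inf), so bisection suffices."""
    logs = [math.log(v) for v in x]
    mean_log = sum(logs) / len(x)

    def g(k):
        xk = [v ** k for v in x]
        return sum(p * l for p, l in zip(xk, logs)) / sum(xk) - 1.0 / k - mean_log

    lo, hi = 1e-3, 100.0  # bracket chosen to cover practical shape values
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(mid) < 0:
            lo = mid
        else:
            hi = mid
    k = 0.5 * (lo + hi)
    c = (sum(v ** k for v in x) / len(x)) ** (1.0 / k)  # scale given shape
    return k, c
```

In practice a library routine (e.g. a SciPy fit) would replace this, but the sketch makes the estimating equation explicit.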
Weibull model of multiplicity distribution in hadron-hadron collisions
Dash, Sadhana; Nandi, Basanta K.; Sett, Priyanka
2016-06-01
We introduce the use of the Weibull distribution as a simple parametrization of charged particle multiplicities in hadron-hadron collisions at all available energies, ranging from ISR energies to the most recent LHC energies. In statistics, the Weibull distribution has wide applicability in natural processes that involve fragmentation processes. This provides a natural connection to the available state-of-the-art models for multiparticle production in hadron-hadron collisions, which involve QCD parton fragmentation and hadronization. The Weibull distribution describes the multiplicity data at the most recent LHC energies better than the single negative binomial distribution.
Distributed Fuzzy CFAR Detection for Weibull Clutter
Zaimbashi, Amir; Taban, Mohammad Reza; Nayebi, Mohammad Mehdi
In distributed detection systems, restricting the output of the local decision to one bit certainly implies a substantial information loss. In this paper, we consider fuzzy detection, which uses a function called a membership function to map the observation space of each local detector to a value between 0 and 1, indicating the degree of assurance about the presence or absence of a signal. In this setting, we examine the problem of distributed Maximum Likelihood (ML) and Order Statistic (OS) constant false alarm rate (CFAR) detection using fuzzy fusion rules such as “Algebraic Product” (AP), “Algebraic Sum” (AS), “Union” (Un) and “Intersection” (IS) in the fusion centre. For Weibull clutter, the expression of the membership function based on the ML or OS CFAR processors in the local detectors is also obtained. For comparison, we consider a binary distributed detector, which uses the Maximum Likelihood and Algebraic Product (MLAP) or Order Statistic and Algebraic Product (OSAP) CFAR processors as the local detectors. In homogeneous and non-homogeneous situations (multiple targets or clutter edge), the performances of the fuzzy and binary distributed detectors are analyzed and compared. The simulation results indicate the superior and robust performance of the distributed systems using fuzzy detection in both homogeneous and non-homogeneous situations.
comparison of estimation methods for fitting weibull distribution to ...
African Journals Online (AJOL)
Tersor
JOURNAL OF RESEARCH IN FORESTRY, WILDLIFE AND ENVIRONMENT VOLUME 7, No.2 SEPTEMBER, 2015. ... method was more accurate in fitting the Weibull distribution to the natural stand. ... appropriate for mixed age group.
Modeling particle size distributions by the Weibull distribution function
Energy Technology Data Exchange (ETDEWEB)
Fang, Zhigang (Rogers Tool Works, Rogers, AR (United States)); Patterson, B.R.; Turner, M.E. Jr (Univ. of Alabama, Birmingham, AL (United States))
1993-10-01
A method is proposed for modeling two- and three-dimensional particle size distributions using the Weibull distribution function. Experimental results show that, for tungsten particles in liquid phase sintered W-14Ni-6Fe, the experimental cumulative section size distributions were well fit by the Weibull probability function, which can also be used to compute the corresponding relative frequency distributions. Modeling the two-dimensional section size distributions facilitates the use of the Saltykov or other methods for unfolding three-dimensional (3-D) size distributions with minimal irregularities. Fitting the unfolded cumulative 3-D particle size distribution with the Weibull function enables computation of the statistical distribution parameters from the parameters of the fit Weibull function.
determination of weibull parameters and analysis of wind power ...
African Journals Online (AJOL)
HOD
Resulting from the analysis, the values of the average wind speed, the average daily wind power, the ... Keywords: Wind power potential, Energy production, Weibull distribution, Wind ... was added globally in 2015 indicating a 23.2% increase.
A Weibull distribution accrual failure detector for cloud computing.
Liu, Jiaxi; Wu, Zhibo; Wu, Jin; Dong, Jian; Zhao, Yao; Wen, Dongxin
2017-01-01
Failure detectors are used to build high availability distributed systems as the fundamental component. To meet the requirement of a complicated large-scale distributed system, accrual failure detectors that can adapt to multiple applications have been studied extensively. However, several implementations of accrual failure detectors do not adapt well to the cloud service environment. To solve this problem, a new accrual failure detector based on Weibull Distribution, called the Weibull Distribution Failure Detector, has been proposed specifically for cloud computing. It can adapt to the dynamic and unexpected network conditions in cloud computing. The performance of the Weibull Distribution Failure Detector is evaluated and compared based on public classical experiment data and cloud computing experiment data. The results show that the Weibull Distribution Failure Detector has better performance in terms of speed and accuracy in unstable scenarios, especially in cloud computing.
On Generalized Upper (k) Record Values from Weibull Distribution
Directory of Open Access Journals (Sweden)
Jerin Paul
2015-09-01
Full Text Available In this paper we study the generalized upper (k) record values arising from the Weibull distribution. Expressions for the moments and product moments of these generalized upper (k) record values are derived. Some properties of generalized upper (k) record values which characterize the Weibull distribution have been established. Also, some distributional properties of generalized upper (k) record values arising from the Weibull distribution are considered and used to suggest an estimator for the shape parameter of the Weibull distribution. The location and scale parameters are estimated using the Best Linear Unbiased Estimation procedure. Prediction of a future record using the Best Linear Unbiased Predictor has been studied. A real-life data set is used to illustrate the results generated in this work.
Bayesian Estimation and Prediction for Flexible Weibull Model under Type-II Censoring Scheme
Directory of Open Access Journals (Sweden)
Sanjay Kumar Singh
2013-01-01
Full Text Available We have developed the Bayesian estimation procedure for the flexible Weibull distribution under a Type-II censoring scheme, assuming Jeffrey's scale-invariant (noninformative) and Gamma (informative) priors for the model parameters. The interval estimation for the model parameters has been performed through normal approximation, bootstrap, and highest posterior density (HPD) procedures. Further, we have also derived the predictive posteriors and the corresponding predictive survival functions for future observations based on Type-II censored data from the flexible Weibull distribution. Since the predictive posteriors are not in closed form, we propose to use Markov chain Monte Carlo (MCMC) methods to approximate the posteriors of interest. The performance of the Bayes estimators has also been compared with the classical estimators of the model parameters through a Monte Carlo simulation study. A real data set representing the time between failures of secondary reactor pumps has been analysed for illustration purposes.
Weibull model of Multiplicity Distribution in hadron-hadron collisions
Dash, Sadhana
2014-01-01
We introduce the Weibull distribution as a simple parametrization of charged particle multiplicities in hadron-hadron collisions at all available energies, ranging from ISR energies to the most recent LHC energies. In statistics, the Weibull distribution has wide applicability in natural processes involving fragmentation. This gives a natural connection to the available state-of-the-art models for multi-particle production in hadron-hadron collisions involving QCD parton fragmentation and hadronization.
Heo, Jun-Haeng; Boes, D. C.; Salas, J. D.
2001-02-01
Parameter estimation in a regional flood frequency setting, based on a Weibull model, is revisited. A two-parameter Weibull distribution is assumed at each site, with a common shape parameter over sites rationalized by a flood index assumption, and with independence in space and time. The estimation techniques of the method of moments and the method of probability weighted moments are studied by proposing a family of estimators for each technique and deriving the asymptotic variance of each estimator. Then a single estimator and its asymptotic variance for each technique, suggested by trying to minimize the asymptotic variance over the family of estimators, is obtained. These asymptotic variances are compared to the Cramér-Rao lower bound, which is known to be the asymptotic variance of the maximum likelihood estimator. A companion paper considers the application of this model and these estimation techniques to a real data set. It includes a simulation study designed to indicate the sample size required for compatibility of the asymptotic results with fixed sample sizes.
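A probability weighted moment (PWM) estimator of the kind referred to above can be sketched from the textbook identity E[X (1-F(X))^s] = c·Γ(1+1/k)/(s+1)^(1+1/k) for a Weibull(k, c) variable, so the ratio of the first two such moments identifies the shape. This is a generic single-site construction, not the specific family of estimators studied in the paper:

```python
import math

def weibull_pwm(x):
    """Single-site PWM fit of Weibull(k, c).
    With a_s = E[X (1-F(X))^s] = c*Gamma(1+1/k)/(s+1)^(1+1/k),
    the ratio a0/a1 = 2^(1+1/k) identifies the shape k."""
    xs = sorted(x)
    n = len(xs)
    a0 = sum(xs) / n  # ordinary mean
    # unbiased sample estimate of E[X(1-F(X))]: weight the i-th order
    # statistic (ascending) by (n - i)/(n - 1)
    a1 = sum(v * (n - i) for i, v in enumerate(xs, start=1)) / (n * (n - 1))
    k = 1.0 / (math.log2(a0 / a1) - 1.0)
    c = a0 / math.gamma(1 + 1 / k)
    return k, c
```

The regional version in the paper pools information across sites via the common shape parameter; the moment identity it rests on is the one above.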
An EOQ Model for Time Dependent Weibull Deterioration with Linear Demand and Shortages
Directory of Open Access Journals (Sweden)
Umakanta Mishra
2012-06-01
Full Text Available Background: The study of control and maintenance of production inventories of deteriorating items, with and without shortages, has grown in importance recently. The effect of deterioration is very important in many inventory systems. Deterioration is defined as decay or damage such that the item cannot be used for its original purpose. Methods: In this article, order-level inventory models have been developed for deteriorating items with linear demand and Weibull deterioration. In developing the model we have assumed that the production rate and the demand rate are time-dependent. The unit production cost is inversely proportional to demand. The inventory-production system has two-parameter Weibull deterioration. Results and conclusions: Two models have been developed, one without shortages and one with shortages, where the shortages are completely backlogged. The objective of the model is to develop an optimal policy that minimizes the total average cost. Sensitivity analysis has been carried out to show the effect of changes in the parameters on the optimum total average cost.
Directory of Open Access Journals (Sweden)
Manna S.K.
2008-01-01
Full Text Available In this paper, we consider the problem of simultaneous determination of retail price and lot-size (RPLS) under the assumption that the supplier offers a fixed credit period to the retailer. It is assumed that the item in stock deteriorates over time at a rate that follows a two-parameter Weibull distribution and that the price-dependent demand is represented by a constant-price-elasticity function of retail price. The RPLS decision model is developed and solved analytically. Results are illustrated with the help of a base example. Computational results show that the supplier earns more profit when the credit period is greater than the replenishment cycle length. Sensitivity analysis of the solution to changes in the values of the input parameters of the base example is also discussed.
Directory of Open Access Journals (Sweden)
Islam Khandaker Dahirul
2016-01-01
Full Text Available This paper explores wind speed distribution using the Weibull probability distribution and the Rayleigh distribution, methods that are proven to provide accurate and efficient estimates of energy output for wind energy conversion systems. The two Weibull parameters (shape k and scale c) and the scale parameter of the Rayleigh distribution have been determined based on hourly time-series wind speed data recorded from October 2014 to October 2015 at Saint Martin's island, Bangladesh. This research examines three numerical methods, namely the Graphical Method (GM), the Empirical Method (EM) and the Energy Pattern Factor method (EPF), for estimating the Weibull parameters. The Rayleigh distribution method has also been analyzed throughout the study. The results revealed that the Graphical method, followed by the Empirical method and the Energy Pattern Factor method, was the most accurate and efficient way to determine the values of k and c for approximating the wind speed distribution in terms of estimated power error. The Rayleigh distribution gives the largest power error in this research. The potential for wind energy development in Saint Martin's island, Bangladesh, as found from the data analysis, is also explained in this paper.
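Two of the methods compared above have standard closed-form expressions in the wind-energy literature: the Empirical Method uses Justus' approximation from the coefficient of variation, and the EPF method uses the energy pattern factor <v³>/<v>³. A minimal sketch, with illustrative function names (the specific data handling in the paper is not reproduced here):

```python
import math

def weibull_empirical(v):
    """Empirical Method (EM): Justus' approximation
    k = (sigma / v_mean)^-1.086, then c = v_mean / Gamma(1 + 1/k)."""
    n = len(v)
    mean = sum(v) / n
    std = (sum((x - mean) ** 2 for x in v) / (n - 1)) ** 0.5
    k = (std / mean) ** -1.086
    c = mean / math.gamma(1 + 1 / k)
    return k, c

def weibull_epf(v):
    """Energy Pattern Factor method (EPF): Epf = <v^3>/<v>^3,
    then k = 1 + 3.69 / Epf^2 and c as above."""
    n = len(v)
    mean = sum(v) / n
    epf = (sum(x ** 3 for x in v) / n) / mean ** 3
    k = 1 + 3.69 / epf ** 2
    c = mean / math.gamma(1 + 1 / k)
    return k, c
```

Both are moment-based shortcuts; the Graphical Method favoured in the paper instead fits a straight line on a ln-ln plot of the empirical cumulative distribution.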
Transformation of state space for two-parameter Markov processes
Institute of Scientific and Technical Information of China (English)
周健伟
1996-01-01
Let X=(X_r) be a two-parameter *-Markov process with a transition function (p1, p2, p), where X_r takes values in the state space (E_r, ℰ_r), and T=[0,∞)^2. For each r ∈ T, let f_r be a measurable transformation of (E_r, ℰ_r) into the state space (E'_r, ℰ'_r). Set Y_r = f_r(X_r), r ∈ T. A sufficient condition is given for the process Y=(Y_r) to still be a two-parameter *-Markov process with a transition function, in terms of the transition function (p1, p2, p) and f_r. For *-Markov families of two-parameter processes with a transition function, a similar problem is also discussed.
A Weibull characterization for tensile fracture of multicomponent brittle fibers
Barrows, R. G.
1977-01-01
Necessary to the development and understanding of brittle fiber reinforced composites is a means to statistically describe fiber strength and strain-to-failure behavior. A statistical characterization for multicomponent brittle fibers is presented. The method, which is an extension of usual Weibull distribution procedures, statistically considers the components making up a fiber (e.g., substrate, sheath, and surface) as separate entities and taken together as in a fiber. Tensile data for silicon carbide fiber and for an experimental carbon-boron alloy fiber are evaluated in terms of the proposed multicomponent Weibull characterization.
Survival Analysis of Patients with Breast Cancer using Weibull Parametric Model.
Baghestani, Ahmad Reza; Moghaddam, Sahar Saeedi; Majd, Hamid Alavi; Akbari, Mohammad Esmaeil; Nafissi, Nahid; Gohari, Kimiya
2015-01-01
The Cox model is known as one of the most frequently used methods for analyzing survival data. However, in some situations parametric methods may provide better estimates. In this study, a Weibull parametric model was employed to assess possible prognostic factors that may affect the survival of patients with breast cancer. We studied 438 patients with breast cancer who visited and were treated at the Cancer Research Center of Shahid Beheshti University of Medical Sciences from 1992 to 2012; the patients were followed up until October 2014. Patients or family members were contacted via telephone calls to confirm whether they were still alive. Clinical, pathological, and biological variables were entered as potential prognostic factors in univariate and multivariate analyses. The log-rank test and the Weibull parametric model with a forward approach, respectively, were used for univariate and multivariate analyses. All analyses were performed using STATA version 11. A P-value lower than 0.05 was defined as significant. On univariate analysis, age at diagnosis, level of education, type of surgery, lymph node status, tumor size, stage, histologic grade, estrogen receptor, progesterone receptor, and lymphovascular invasion had a statistically significant effect on survival time. On multivariate analysis, lymph node status, stage, histologic grade, and lymphovascular invasion were statistically significant. The one-year overall survival rate was 98%. Based on these data and using the Weibull parametric model with a forward approach, we found that patients with lymphovascular invasion were at 2.13 times greater risk of death due to breast cancer.
Compressed data separation via dual frames based split-analysis with Weibull matrices
Institute of Scientific and Technical Information of China (English)
CAI Yun; LI Song
2013-01-01
In this paper, we consider the data separation problem, where the original signal is composed of two distinct subcomponents, via a dual-frames-based split-analysis approach. We show that the two distinct subcomponents, which are sparse in two different general frames respectively, can be exactly recovered with high probability when the measurement matrix is a Weibull random matrix (not Gaussian) and the two frames satisfy a mutual coherence property. Our result may be significant for analysing the split-analysis model for data separation.
A New Approach for Parameter Estimation of Mixed Weibull Distribution:A Case Study in Spindle
Institute of Scientific and Technical Information of China (English)
Dongwei Gu; Zhiqiong Wang; Guixiang Shen; Yingzhi Zhang; Xilu Zhao
2016-01-01
In order to improve the accuracy and efficiency of the graphical method and maximum likelihood estimation (MLE) in mixed-Weibull parameter estimation, Graphical-GA, which combines the advantages of the graphical method and a genetic algorithm (GA), is proposed. Firstly, through analysis of the Weibull probability paper (WPP), a mixed Weibull model is identified for data fitting. Secondly, initial values of the shape and scale parameters are obtained by the graphical method with least squares, and the parameters of the mixed Weibull are then optimized with the GA. Thirdly, a comparative analysis against the graphical method, piecewise Weibull and two-Weibull models shows that the Graphical-GA mixed Weibull is the best. Finally, point and interval estimates of the spindle MTBF are obtained based on the mixed Weibull distribution. The results indicate that Graphical-GA improves the estimation effectively and that the evaluation of the spindle can provide a basis for design and reliability growth.
Analysis of tensile bond strengths using Weibull statistics.
Burrow, Michael F; Thomas, David; Swain, Mike V; Tyas, Martin J
2004-09-01
Tensile strength tests of restorative resins bonded to dentin, and the resultant strengths of interfaces between the two, exhibit wide variability. Many variables can affect test results, including specimen preparation and storage, test rig design and experimental technique. However, the more fundamental source of variability, that associated with the brittle nature of the materials, has received little attention. This paper analyzes results from micro-tensile tests on unfilled resins and adhesive bonds between restorative resin composite and dentin in terms of reliability using the Weibull probability of failure method. Results for the tensile strengths of Scotchbond Multipurpose Adhesive (3M) and Clearfil LB Bond (Kuraray) bonding resins showed Weibull moduli (m) of 6.17 (95% confidence interval, 5.25-7.19) and 5.01 (95% confidence interval, 4.23-5.8). Analysis of results for micro-tensile tests on bond strengths to dentin gave moduli between 1.81 (Clearfil Liner Bond 2V) and 4.99 (Gluma One Bond, Kulzer). Material systems with m in this range do not have a well-defined strength. The Weibull approach also enables the size dependence of the strength to be estimated. An example where the bonding area was changed from 3.1 to 1.1 mm diameter is shown. Weibull analysis provides a method for determining the reliability of strength measurements in the analysis of data from bond strength and tensile tests on dental restorative materials.
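The Weibull modulus m reported above is conventionally estimated by linearizing the two-parameter Weibull CDF and regressing ln(-ln(1-F)) on ln(strength) with median-rank plotting positions. A minimal sketch (the plotting-position formula and function name are conventional choices, not details taken from this paper):

```python
import math

def weibull_modulus(strengths):
    """Estimate the Weibull modulus m and characteristic strength s0 by
    least-squares regression of y = ln(-ln(1 - F_i)) on x = ln(s_i),
    using median-rank plotting positions F_i = (i - 0.3)/(n + 0.4)."""
    s = sorted(strengths)
    n = len(s)
    xs = [math.log(v) for v in s]
    ys = [math.log(-math.log(1 - (i - 0.3) / (n + 0.4)))
          for i in range(1, n + 1)]
    mx = sum(xs) / n
    my = sum(ys) / n
    m = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))   # slope = Weibull modulus
    s0 = math.exp(mx - my / m)               # y = m*ln(s) - m*ln(s0)
    return m, s0
```

A low m (as found here for dentin bonds, m around 2-5) means the strength distribution is broad and a single "mean bond strength" is not well defined.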
SEMI-COMPETING RISKS ON A TRIVARIATE WEIBULL SURVIVAL MODEL
Directory of Open Access Journals (Sweden)
Jenq-Daw Lee
2008-07-01
Full Text Available A setting of a trivariate survival function using the semi-competing risks concept is proposed, in which a terminal event can only occur after other events. The Stanford Heart Transplant data is reanalyzed using a trivariate Weibull distribution model with the proposed survival function.
ASYMPTOTIC PROPERTIES OF MLE FOR WEIBULL DISTRIBUTION WITH GROUPED DATA
Institute of Scientific and Technical Information of China (English)
XUE Hongqi; SONG Lixin
2002-01-01
A grouped data model for the Weibull distribution is considered. Under mild conditions, the maximum likelihood estimators (MLE) are shown to be identifiable, strongly consistent, asymptotically normal, and to satisfy the law of the iterated logarithm. A Newton iteration algorithm is also considered, which converges to the unique solution of the likelihood equation. Moreover, we extend these results to a random case.
Bubbling and bistability in two parameter discrete systems
Indian Academy of Sciences (India)
G Ambika; N V Sujatha
2000-05-01
We present a graphical analysis of the mechanisms underlying the occurrence of bubbling sequences and bistability regions in the bifurcation scenario of a special class of one-dimensional two-parameter maps. The main result of the analysis is that whether bubbling or bistability occurs is decided by the sign of the third derivative at the inflection point of the map function.
Optimal Two Parameter Bounds for the Seiffert Mean
Directory of Open Access Journals (Sweden)
Hui Sun
2013-01-01
Full Text Available We obtain sharp bounds for the Seiffert mean in terms of a two-parameter family of means. Our results generalize and extend the recent bounds presented in the Journal of Inequalities and Applications (2012) and Abstract and Applied Analysis (2012).
Packing fraction of particles with a Weibull size distribution
Brouwers, H. J. H.
2016-07-01
This paper addresses the void fraction of polydisperse particles with a Weibull (or Rosin-Rammler) size distribution. It is demonstrated that the governing parameters of this distribution can be uniquely related to those of the lognormal distribution. Hence, an existing closed-form expression that predicts the void fraction of particles with a lognormal size distribution can be transformed into an expression for Weibull distributions. Both expressions contain the contraction coefficient β. Like the monosized void fraction φ1, it is a physical parameter which depends only on the particles' shape and their state of compaction. Based on a consideration of the scaled binary void contraction, a linear relation for (1 - φ1)β as a function of φ1 is proposed, with proportionality constant B depending on the state of compaction only. This is validated using computational and experimental packing data concerning random close and random loose packing arrangements. Finally, using this β, the closed-form analytical expression governing the void fraction of Weibull distributions is thoroughly compared with empirical data reported in the literature, and good agreement is found. Furthermore, the present analysis yields an algebraic equation relating the void fractions of monosized particles at different compaction states. This expression appears to be in good agreement with a broad collection of random close and random loose packing data.
Weibull Effective Area for Hertzian Ring Crack Initiation Stress
Energy Technology Data Exchange (ETDEWEB)
Jadaan, Osama M. [University of Wisconsin, Platteville; Wereszczak, Andrew A [ORNL; Johanns, Kurt E [ORNL
2011-01-01
Spherical or Hertzian indentation is used to characterize and guide the development of engineered ceramics under consideration for diverse applications involving contact, wear, rolling fatigue, and impact. Ring crack initiation can be one important damage mechanism of Hertzian indentation. It is caused by sufficiently high, surface-located, radial tensile stresses in an annular ring located adjacent to and outside of the Hertzian contact circle. While the maximum radial tensile stress is known to depend on the elastic properties of the sphere and target, the diameter of the sphere, the applied compressive force, and the coefficient of friction, the Weibull effective area will be affected by those parameters as well. However, estimates of the maximum radial tensile stress and Weibull effective area are difficult to obtain because the coefficient of friction during Hertzian indentation is complex, likely intractable, and not known a priori. Circumventing this, the Weibull effective area expressions are derived here for the two extremes that bracket all coefficients of friction; namely, (1) the classical, frictionless, Hertzian case where only complete slip occurs, and (2) the case where no slip occurs or where the coefficient of friction is infinite.
Interval Estimations of the Two-Parameter Exponential Distribution
Directory of Open Access Journals (Sweden)
Lai Jiang
2012-01-01
Full Text Available In applied work, the two-parameter exponential distribution gives useful representations of many physical situations. Confidence intervals for the scale parameter and predictive intervals for a future independent observation have been studied by many, including Petropoulos (2011) and Lawless (1977), respectively. However, interval estimates for the threshold parameter have not been widely examined in the statistical literature. The aim of this paper is to, first, obtain the exact significance function of the scale parameter by renormalizing the p∗-formula. Then the approximate Studentization method is applied to obtain the significance function of the threshold parameter. Finally, a predictive density function of the two-parameter exponential distribution is derived. A real-life data set is used to show the implementation of the method. Simulation studies are then carried out to illustrate the accuracy of the proposed methods.
Two-parameter Levy processes along decreasing paths
Covo, Shai
2010-01-01
Let {X_{t_1,t_2}: t_1,t_2 >= 0} be a two-parameter Lévy process on R^d. We study basic properties of the one-parameter process {X_{x(t),y(t)}: t in T} where x and y are, respectively, nondecreasing and nonincreasing nonnegative continuous functions on the interval T. We focus on and characterize the case where the process has stationary increments.
Indian Academy of Sciences (India)
SHIV KUMAR; ABHAY KUMAR SINGH; MANOJ KUMAR PATEL
2016-09-01
In this study, we have discussed the development of an inventory model in which the deterioration rate of the item follows a two-parameter Weibull distribution under the effect of selling-price- and time-dependent demand, since not only the selling price but also time is a crucial factor in enhancing market demand as well as affecting the overall finances. In the present model, shortages are allowed and partially backlogged. The optimum inventory level, the optimal length of a cycle and the expressions for the profit function under various cost considerations are obtained using differential equations. These are illustrated graphically with the help of numerical examples. A sensitivity analysis of the parameter values has been performed to study their effect on the inventory optimization.
Roy, Aparna; Chakraborty, Sumit; Kundu, Sarada Prasad; Basak, Ratan Kumar; Majumder, Subhasish Basu; Adhikari, Basudam
2012-03-01
Chemically modified jute fibres are potentially useful as natural reinforcement in composite materials. Jute fibres were treated with 0.25%-1.0% sodium hydroxide (NaOH) solution for 0.5-48 h. The hydrophilicity, surface morphology, crystallinity index, and thermal and mechanical characteristics of untreated and alkali-treated fibres were studied. The two-parameter Weibull distribution model was applied to deal with the variation in mechanical properties of the natural fibres. Alkali treatment enhanced the tensile strength and elongation at break by 82% and 45%, respectively, but decreased the hydrophilicity by 50.5% and the diameter of the fibres by 37%.
Two-parameter asymptotics in magnetic Weyl calculus
Lein, Max
2010-12-01
This paper is concerned with small parameter asymptotics of magnetic quantum systems. In addition to a semiclassical parameter ɛ, the case of small coupling λ to the magnetic vector potential naturally occurs in this context. Magnetic Weyl calculus is adapted to incorporate both parameters, at least one of which needs to be small. Of particular interest is the expansion of the Weyl product which can be used to expand the product of operators in a small parameter, a technique which is prominent to obtain perturbation expansions. Three asymptotic expansions for the magnetic Weyl product of two Hörmander class symbols are proven as (i) ɛ ≪ 1 and λ ≪ 1, (ii) ɛ ≪ 1 and λ = 1, as well as (iii) ɛ = 1 and λ ≪ 1. Expansions (i) and (iii) are impossible to obtain with ordinary Weyl calculus. Furthermore, I relate the results derived by ordinary Weyl calculus with those obtained with magnetic Weyl calculus by one- and two-parameter expansions. To show the power and versatility of magnetic Weyl calculus, I derive the semirelativistic Pauli equation as a scaling limit from the Dirac equation up to errors of fourth order in 1/c.
Cosmology on all scales: a two-parameter perturbation expansion
Goldberg, Sophia R; Malik, Karim A
2016-01-01
We propose and construct a two-parameter perturbative expansion around a Friedmann-Lemaître-Robertson-Walker geometry that can be used to model high-order gravitational effects in the presence of non-linear structure. This framework reduces to the weak-field and slow-motion post-Newtonian treatment of gravity in the appropriate limits, but also includes the low-amplitude large-scale fluctuations that are important for cosmological modelling. We derive a set of field equations that can be applied to the late Universe, where non-linear structure exists on supercluster scales, and perform a detailed investigation of the associated gauge problem. This allows us to identify a consistent set of perturbed quantities in both the gravitational and matter sectors, and to construct a set of gauge-invariant quantities that correspond to each of them. The field equations, written in terms of these quantities, take on a relatively simple form, and allow the effects of small-scale structure on the large-scale properties...
Mirror symmetry for two-parameter models, 1
Candelas, Philip; Font, A; Katz, S; Morrison, Douglas Robert Ogston; Candelas, Philip; Ossa, Xenia de la; Font, Anamaria; Katz, Sheldon; Morrison, David R.
1994-01-01
We study, by means of mirror symmetry, the quantum geometry of the Kähler-class parameters of a number of Calabi-Yau manifolds that have $b_{11}=2$. Our main interest lies in the structure of the moduli space and in the loci corresponding to singular models. This structure is considerably richer when there are two parameters than in the various one-parameter models that have been studied hitherto. We describe the intrinsic structure of the point in the (compactification of the) moduli space that corresponds to the large complex structure or classical limit. The instanton expansions are of interest owing to the fact that some of the instantons belong to families with continuous parameters. We compute the Yukawa couplings and their expansions in terms of instantons of genus zero. By making use of recent results of Bershadsky et al. we compute also the instanton numbers for instantons of genus one. For particular values of the parameters the models become birational to certain models with one parameter. The co...
Directory of Open Access Journals (Sweden)
Jae Phil Park
2016-06-01
The typical experimental procedure for testing stress corrosion cracking initiation involves an interval-censored reliability test. Based on these test results, the parameters of a Weibull distribution, which is a widely accepted crack initiation model, can be estimated using maximum likelihood estimation or median rank regression. However, it is difficult to determine the appropriate number of test specimens and censoring intervals required to obtain sufficiently accurate Weibull estimators. In this study, we compare maximum likelihood estimation and median rank regression using a Monte Carlo simulation to examine the effects of the total number of specimens, test duration, censoring interval, and shape parameters of the true Weibull distribution on the estimator uncertainty. Finally, we provide the quantitative uncertainties of both Weibull estimators, compare them with the true Weibull parameters, and suggest proper experimental conditions for developing a probabilistic crack initiation model through crack initiation tests.
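As a concrete illustration of median rank regression, one of the two estimators compared above, the following sketch fits a straight line to ln(-ln(1-F_i)) versus ln(t_i). This is a generic textbook version for complete (uncensored) samples using Bernard's approximation for the median ranks, not the interval-censored procedure of the study.

```python
import math

def median_rank_regression(times):
    """Estimate Weibull shape c and scale b from complete failure times by
    median-rank regression: fit ln(-ln(1 - F_i)) = c*ln(t_i) - c*ln(b),
    with Bernard's approximation F_i ~ (i - 0.3)/(n + 0.4) for the ranks."""
    n = len(times)
    xs, ys = [], []
    for i, t in enumerate(sorted(times), start=1):
        f = (i - 0.3) / (n + 0.4)
        xs.append(math.log(t))
        ys.append(math.log(-math.log(1.0 - f)))
    # ordinary least squares for slope and intercept
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    shape = slope
    scale = math.exp(-intercept / slope)  # since intercept = -c*ln(b)
    return shape, scale
```

For data lying exactly on the Weibull plotting positions, the regression recovers the shape and scale parameters exactly; real samples scatter around the line, which is what drives the estimator uncertainty examined in the study.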
Moment series for the coefficient of variation in Weibull sampling
Energy Technology Data Exchange (ETDEWEB)
Bowman, K.O.; Shenton, L.R.
1981-01-01
For the two-parameter Weibull distribution function F(t) = 1 - exp[-(t/b)^c], t > 0, with c and b positive, a moment estimator c* for c is the solution of the equation Γ(1 + 2/c*)/Γ^2(1 + 1/c*) = 1 + v*^2, where v* is the coefficient of variation in the form √(m_2)/m_1', m_1' being the sample mean and m_2 the sample second central moment (it is trivial in the present context to replace m_2 by the variance). One approach to the moments of c* (Bowman and Shenton, 1981) is to set up moment series for the scale-free v*. The series are apparently divergent and summation algorithms are essential; we consider methods due to Levin (1973) and one introduced by ourselves (Bowman and Shenton, 1976).
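The moment equation above can be solved numerically for c* since its left side is monotone in c; a minimal sketch using bisection follows, with the bracket and tolerance chosen as illustrative assumptions.

```python
import math

def cv_squared(c):
    """Gamma(1 + 2/c)/Gamma^2(1 + 1/c) - 1, the squared coefficient of
    variation of a Weibull distribution with shape c (decreasing in c)."""
    return math.gamma(1 + 2 / c) / math.gamma(1 + 1 / c) ** 2 - 1

def moment_estimate_c(v, lo=0.05, hi=50.0, tol=1e-10):
    """Solve Gamma(1 + 2/c)/Gamma^2(1 + 1/c) = 1 + v^2 for c by bisection,
    where v is the sample coefficient of variation."""
    target = v * v
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if cv_squared(mid) > target:
            lo = mid   # CV too large means c too small: move lower bound up
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)
```

Given c*, the scale estimate follows from the sample mean as b* = m_1'/Γ(1 + 1/c*).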
Closed form expressions for moments of the beta Weibull distribution
Directory of Open Access Journals (Sweden)
Gauss M Cordeiro
2011-06-01
The beta Weibull distribution was first introduced by Famoye et al. (2005) and studied by these authors and Lee et al. (2007). However, they do not give explicit expressions for the moments. In this article, we derive explicit closed-form expressions for the moments of this distribution, which generalize results available in the literature for some sub-models. We also obtain expansions for the cumulative distribution function and Rényi entropy. Further, we discuss maximum likelihood estimation and provide formulae for the elements of the expected information matrix. We also demonstrate the usefulness of this distribution on a real data set.
Modeling root reinforcement using root-failure Weibull survival function
Directory of Open Access Journals (Sweden)
M. Schwarz
2013-03-01
Root networks contribute to slope stability through complicated interactions that include mechanical compression and tension. Due to the spatial heterogeneity of root distribution and the dynamics of root turnover, quantifying root reinforcement on steep slopes is challenging, and consequently so is the calculation of slope stability. Despite considerable advances in root reinforcement modeling, some important aspects remain neglected. In this study we address in particular the role of root strength variability in the mechanical behavior of a root bundle. Many factors may contribute to the variability of root mechanical properties, even within a single diameter class. This work presents a new approach for quantifying root reinforcement that considers the variability of mechanical properties of each root diameter class. Using data from laboratory tensile tests and field pullout tests, we calibrate the parameters of the Weibull survival function to implement the variability of root strength in a numerical model for the calculation of root reinforcement (RBMw). The results show that, for both laboratory and field datasets, the parameters of the Weibull distribution may be considered constant, with the exponent equal to 2 and the normalized failure displacement equal to 1. Moreover, the results show that the variability of root strength in each root diameter class has a major influence on the behavior of a root bundle, with important implications when considering different approaches to slope stability calculation. Sensitivity analysis shows that the calibration of the tensile force and the elasticity of the roots are the most important equations, as well as the root distribution. The new model allows the characterization of root reinforcement in terms of maximum pullout force, stiffness, and energy. Moreover, it simplifies the implementation of root reinforcement in slope stability models. The realistic quantification of root
Klein, Claude A.; Miller, Richard P.
2001-09-01
For the purpose of assessing the strength of engineering ceramics, it is common practice to interpret the measured stresses at fracture in the light of a semi-empirical expression derived from Weibull's theory of brittle fracture, i.e., ln[-ln(1-P)] = -m ln(σ_N) + m ln(σ), where P is the cumulative failure probability, σ is the applied tensile stress, m is the Weibull modulus, and σ_N is the nominal strength. The strength σ_N, however, does not represent a true measure because it depends not only on the test method but also on the size of the volume or the surface subjected to tensile stresses. In this paper we intend to first clarify issues relating to the application of Weibull's theory of fracture and then make use of the theory to assess the results of equibiaxial flexure testing that was carried out on polycrystalline infrared-transmitting materials. These materials are brittle ceramics, which most frequently fail as a consequence of tensile stresses acting on surface flaws. Since equibiaxial flexure testing is the preferred method of measuring the strength of optical ceramics, we propose to formulate the failure-probability equation in terms of a characteristic strength, σ_C, for biaxial loadings, i.e., P = 1 - exp{-π (r_0/cm)^2 [Γ(1+1/m)]^m (σ/σ_C)^m}, where r_0 is the radius of the loading ring (in centimeters) and Γ(z) designates the gamma function. A Weibull statistical analysis of equibiaxial strength data thus amounts to obtaining the parameters m and σ_C, which is best done by directly fitting estimated P_i vs. i data to the failure-probability equation; this procedure avoids distorting the distribution through logarithmic linearization and can be implemented by performing a non-linear bivariate regression. Concentric-ring fracture testing performed on five sets of Raytran materials validates the procedure in the sense that the two-parameter model appears to describe the experimental failure
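The failure-probability equation and the direct (non-linearized) fit it calls for can be sketched as follows. A coarse grid search stands in here for the paper's non-linear bivariate regression, and all parameter values in the usage are illustrative assumptions.

```python
import math

def biaxial_failure_prob(sigma, m, sigma_c, r0_cm):
    """Cumulative failure probability for equibiaxial (concentric-ring)
    loading: P = 1 - exp{-pi*(r0/cm)^2 * [Gamma(1+1/m)]^m * (sigma/sigma_c)^m}."""
    k = math.pi * r0_cm ** 2 * math.gamma(1 + 1 / m) ** m
    return 1.0 - math.exp(-k * (sigma / sigma_c) ** m)

def fit_m_sigma_c(sigmas, probs, r0_cm, m_grid, sc_grid):
    """Coarse grid search minimizing the sum of squared residuals of the
    estimated P_i, i.e. fitting the failure-probability curve directly
    rather than its log-log linearization."""
    best = (None, None, float("inf"))
    for m in m_grid:
        for sc in sc_grid:
            sse = sum((biaxial_failure_prob(s, m, sc, r0_cm) - p) ** 2
                      for s, p in zip(sigmas, probs))
            if sse < best[2]:
                best = (m, sc, sse)
    return best[0], best[1]
```

In practice one would refine the grid or hand the residual function to a proper non-linear least-squares routine; the point of the sketch is that the fit acts on P directly, avoiding the distortion introduced by logarithmic linearization.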
Constant-step stress accelerated life test of VFD under Weibull distribution case
Institute of Scientific and Technical Information of China (English)
ZHANG Jian-ping; GENG Xin-min
2005-01-01
A constant-step stress accelerated life test of the Vacuum Fluorescent Display (VFD) was conducted with increased cathode temperature. Statistical analysis was done by applying the Weibull distribution to describe the life and the Least Squares Method (LSM) to estimate the Weibull parameters. Self-designed special software was used to predict the VFD life. Numerical results showed that the average life of the VFD is over 30,000 h, that the VFD life follows a Weibull distribution, and that the life-stress relationship follows a linear Arrhenius equation. Accurate calculation of the key parameter enabled rapid estimation of VFD life.
Iskandar, Ismed; Satria Gondokaryono, Yudi
2016-02-01
In reliability theory, the most important problem is to determine the reliability of a complex system from the reliability of its components. The weakness of most reliability theories is that the systems are described and explained as simply functioning or failed. In many real situations, the failures may be from many causes depending upon the age and the environment of the system and its components. Another problem in reliability theory is that of estimating the parameters of the assumed failure models. The estimation may be based on data collected over censored or uncensored life tests. In many reliability problems, the failure data are simply quantitatively inadequate, especially in engineering design and maintenance systems. Bayesian analyses are more beneficial than classical ones in such cases. Bayesian estimation analyses allow us to combine past knowledge or experience in the form of a prior distribution with life test data to make inferences about the parameter of interest. In this paper, we have investigated the application of Bayesian estimation analyses to competing risk systems. The cases are limited to models with independent causes of failure, using the Weibull distribution as our model. A simulation is conducted for this distribution with the objectives of verifying the models and the estimators and investigating the performance of the estimators for varying sample sizes. The simulation data are analyzed using Bayesian and maximum likelihood analyses. The simulation results show that a change in one true parameter value relative to another changes the standard deviation in the opposite direction. For perfect information on the prior distribution, the Bayesian estimation methods are better than those of maximum likelihood. The sensitivity analyses show some amount of sensitivity to shifts of the prior locations. They also show the robustness of the Bayesian analysis within the range
On two-parameter equations of state and the limitations of a hard sphere Peng-Robinson equation
Harmens, A.; Jeremiah, Dawn E.
Simple two-parameter equations of state are exceptionally effective for calculations on systems of small, uncomplicated molecules. They are therefore extremely useful for vapour-liquid equilibrium calculations in cryogenic and light hydrocarbon process design. In a search for further improvement three two-parameter equations of state with a co-volume repulsion term and three with a hard sphere repulsion term have been investigated. Their characteristic constants at the critical point have been compared. The procedure for fitting the two parameters to empirical data in the subcritical region was analysed. A perturbed hard sphere equation with a Peng-Robinson attraction term was shown to be unsuitable for application over a wide range of p, T conditions. A similar equation with a Redlich-Kwong attraction term gives good service in the cryogenic range.
A Study on The Mixture of Exponentiated-Weibull Distribution
Directory of Open Access Journals (Sweden)
Adel Tawfik Elshahat
2016-12-01
Mixtures of measures or distributions occur frequently in the theory and applications of probability and statistics. In the simplest case it may, for example, be reasonable to assume that one is dealing with a mixture, in given proportions, of a finite number of normal populations with different means or variances. The mixture parameter may also be denumerably infinite, as in the theory of sums of a random number of random variables, or continuous, as in the compound Poisson distribution. The use of finite mixture distributions, to control for unobserved heterogeneity, has become increasingly popular among those estimating dynamic discrete choice models. One of the barriers to using mixture models is that parameters that could previously be estimated in stages must now be estimated jointly: using mixture distributions destroys any additive separability of the log likelihood function. In this thesis, the maximum likelihood estimators have been obtained for the parameters of the mixture of exponentiated Weibull distributions when a sample is available from a censoring scheme. The maximum likelihood estimators of the parameters and the asymptotic variance-covariance matrix have also been obtained. A numerical illustration of these new results is given.
Designing a Repetitive Group Sampling Plan for Weibull Distributed Processes
Directory of Open Access Journals (Sweden)
Aijun Yan
2016-01-01
Acceptance sampling plans are useful tools to determine whether submitted lots should be accepted or rejected. An efficient and economic sampling plan is very desirable for the high quality levels required by production processes. The process capability index CL is an important quality parameter for measuring product quality. Utilizing the relationship between the CL index and the nonconforming rate, a repetitive group sampling (RGS) plan based on the CL index is developed in this paper for the case where the quality characteristic follows the Weibull distribution. The optimal plan parameters of the proposed RGS plan are determined by simultaneously satisfying the commonly used producer's risk and consumer's risk while minimizing the average sample number (ASN), and are then tabulated for different combinations of acceptance quality level (AQL) and limiting quality level (LQL). The results show that the proposed plan has better performance than the single sampling plan in terms of ASN. Finally, the proposed RGS plan is illustrated with an industrial example.
Institute of Scientific and Technical Information of China (English)
A.Suresh Babu; V.Jayabalan
2009-01-01
In recent times, conventional materials are replaced by metal matrix composites (MMCs) due to their high specific strength and modulus. Strength reliability, one of the key factors restricting wider use of composite materials in various applications, is commonly characterized by the Weibull strength distribution function. In the present work, statistical analysis of the strength data of 15% volume alumina particles (mean size 15 μm) reinforced in aluminum alloy (1101 grade) fabricated by the stir casting method was carried out using the Weibull probability model. Twelve tension tests were performed according to ASTM B577 standards, and from the test data the corresponding Weibull distribution was obtained. Finally, the reliability of the composite behavior in terms of its fracture strength was presented to ensure the reliability of composites for suitable applications. An important implication of the present study is that the Weibull distribution describes the experimentally measured strength data more appropriately.
Unification of the Two-Parameter Equation of State and the Principle of Corresponding States
DEFF Research Database (Denmark)
Mollerup, Jørgen
1998-01-01
A two-parameter equation of state is a two-parameter corresponding states model. A two-parameter corresponding states model is composed of two scale factor correlations and a reference fluid equation of state. In a two-parameter equation of state the reference equation of state is the two-parameter equation of state itself. If we retain the scale factor correlations derived from a two-parameter equation of state, but replace the two-parameter equation of state with a more accurate pure component equation of state for the reference fluid, we can improve the existing models of equilibrium properties without refitting any model parameters, and without imposing other restrictions as regards species and mixing rules beyond those already imposed by the two-parameter equation of state. The theory and procedure are outlined in the paper.
Gobinda Chandra Panda; Pravat Kumar Sukla
2013-01-01
Background: Physical decay or deterioration of goods in stock is an important feature of real inventory systems. Material and methods: In the present paper, we discuss a production inventory model for a Weibull deteriorating item over a finite planning horizon with a linearly time-varying demand rate and a uniform production rate, allowing shortages, which are completely backlogged. Results and conclusions: A production inventory model is developed for a Weibull deteriorating...
Abul Kalam Azad; Mohammad Golam Rasul; Talal Yusaf
2014-01-01
The best Weibull distribution methods for the assessment of wind energy potential at different altitudes in desired locations are statistically diagnosed in this study. Seven different methods, namely graphical method (GM), method of moments (MOM), standard deviation method (STDM), maximum likelihood method (MLM), power density method (PDM), modified maximum likelihood method (MMLM) and equivalent energy method (EEM) were used to estimate the Weibull parameters and six statistical tools, name...
Two-parameter non-linear spacetime perturbations gauge transformations and gauge invariance
Bruni, M; Sopuerta, C F; Bruni, Marco; Gualtieri, Leonardo; Sopuerta, Carlos F.
2003-01-01
An implicit fundamental assumption in relativistic perturbation theory is that there exists a parametric family of spacetimes that can be Taylor expanded around a background. The choice of the latter is crucial to obtain a manageable theory, so that it is sometimes convenient to construct a perturbative formalism based on two (or more) parameters. The study of perturbations of rotating stars is a good example: in this case one can treat the stationary axisymmetric star using a slow rotation approximation (expansion in the angular velocity Omega), so that the background is spherical. Generic perturbations of the rotating star (say, parametrized by lambda) are then built on top of the axisymmetric perturbations in Omega. Clearly, any interesting physics requires non-linear perturbations, as at least terms of order lambda Omega need to be considered. In this paper we analyse the gauge dependence of non-linear perturbations depending on two parameters, derive explicit higher order gauge transformation rules, and define gaug...
Sanford, W. E.
2015-12-01
Age distributions of base flow to streams are important to estimate for predicting the timing of water-quality responses to changes in distributed inputs of nutrients or pollutants at the land surface. Simple models of shallow aquifers will predict exponential age distributions, but more realistic 3-D stream-aquifer geometries will cause deviations from an exponential curve. In addition, in fractured rock terrains the dual nature of the effective and total porosity of the system complicates the age distribution further. In this study shallow groundwater flow and advective transport were simulated in two regions in the Eastern United States—the Delmarva Peninsula and the upper Potomac River basin. The former is underlain by layers of unconsolidated sediment, while the latter consists of folded and fractured sedimentary rocks. Transport of groundwater to streams was simulated using the USGS code MODPATH within 175 and 275 watersheds, respectively. For the fractured rock terrain, calculations were also performed along flow pathlines to account for exchange between mobile and immobile flow zones. Porosities at both sites were calibrated using environmental tracer data (3H, 3He, CFCs and SF6) in wells and springs, and with a 30-year tritium record from the Potomac River. Carbonate and siliciclastic rocks were calibrated to have mobile porosity values of one and six percent, and immobile porosity values of 18 and 12 percent, respectively. The age distributions were fitted to Weibull functions. Whereas an exponential function has one parameter that controls the median age of the distribution, a Weibull function has an extra parameter that controls the slope of the curve. A weighted Weibull function was also developed that potentially allows for four parameters, two that control the median age and two that control the slope, one of each weighted toward early or late arrival times. For both systems the two-parameter Weibull function nearly always produced a substantially
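The extra flexibility the study exploits can be seen directly from the Weibull CDF: where the exponential age distribution is fixed once its median is chosen, the Weibull adds a shape parameter that controls the slope of the curve. A minimal sketch (the parameterization by median age is my own convenience, not the study's notation):

```python
import math

def weibull_age_cdf(t, median, k):
    """Fraction of base flow younger than age t for a Weibull age
    distribution given by its median age and shape k; k = 1 recovers the
    exponential special case, while k controls the slope of the curve."""
    lam = median / math.log(2.0) ** (1.0 / k)  # scale factor from the median
    return 1.0 - math.exp(-((t / lam) ** k))
```

By construction half the flow is younger than the median for any k, and with k = 1 the expression reduces to the exponential CDF 1 - exp(-t ln 2 / median) that simple shallow-aquifer models predict.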
Directory of Open Access Journals (Sweden)
Emilio Gómez-Lázaro
2016-02-01
The Weibull probability distribution has been widely applied to characterize wind speeds for wind energy resources. Wind power generation modeling is different, however, due in particular to power curve limitations, wind turbine control methods, and transmission system operation requirements. These differences are even greater for aggregated wind power generation in power systems with high wind penetration. Consequently, models based on one Weibull component can provide poor characterizations of aggregated wind power generation. With this aim, the present paper focuses on discussing Weibull mixtures to characterize the probability density function (PDF) for aggregated wind power generation. PDFs of wind power data are first classified according to hourly and seasonal patterns. The selection of the number of components in the mixture is analyzed through two well-known criteria: the Akaike information criterion (AIC) and the Bayesian information criterion (BIC). Finally, the optimal number of Weibull components for maximum likelihood is explored for the defined patterns, including the estimated weight, scale, and shape parameters. Results show that multi-Weibull models are more suitable for characterizing aggregated wind power data due to the impact of distributed generation, the variety of wind speed values, and wind power curtailment.
Directory of Open Access Journals (Sweden)
Antonio Colmenar-Santos
2014-10-01
Electric power losses are constantly present during the service life of wind farms and must be considered in the calculation of the income arising from selling the produced electricity. It is typical to estimate the electrical losses in the design stage as those occurring when the wind farm operates at rated power; nevertheless, it is necessary to determine a method for checking whether the actual losses meet the design requirements during the operation period. In this paper, we prove that the electric losses at rated power should not be considered as a reference level, and a simple methodology is developed to analyse and forecast the actual losses in a set period as a function of the wind resource in that period, defined according to the Weibull distribution, and the characteristics of the wind farm electrical infrastructure. This methodology provides a simple way to determine the actual electricity losses in the design phase and to check them during operation.
Asymptotic Results for the Two-parameter Poisson-Dirichlet Distribution
Feng, Shui
2009-01-01
The two-parameter Poisson-Dirichlet distribution is the law of a sequence of decreasing nonnegative random variables with total sum one. It can be constructed from stable and Gamma subordinators with the two parameters, $\alpha$ and $\theta$, corresponding to the stable component and the Gamma component, respectively. The moderate deviation principles are established for the two-parameter Poisson-Dirichlet distribution and the corresponding homozygosity when $\theta$ approaches infinity, and the large deviation principle is established for the two-parameter Poisson-Dirichlet distribution when both $\alpha$ and $\theta$ approach zero.
Goh, Segun; Kwon, H. W.; Choi, M. Y.
2014-06-01
We consider the Yule-type multiplicative growth and division process, and describe the ubiquitous emergence of Weibull and log-normal distributions in a single framework. With the help of the integral transform and series expansion, we show that both distributions serve as asymptotic solutions of the time evolution equation for the branching process. In particular, the maximum likelihood method is employed to discriminate between the emergence of the Weibull distribution and that of the log-normal distribution. Further, the detailed conditions for the distinguished emergence of the Weibull distribution are probed. It is observed that the emergence depends on the manner of the division process for the two different types of distribution. Numerical simulations are also carried out, confirming the results obtained analytically.
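The maximum-likelihood discrimination step described above can be sketched as follows: fit both candidate distributions by maximum likelihood and compare the maximized log-likelihoods. This is a generic version, with the profile-likelihood equation for the Weibull shape solved by bisection; it is not the authors' code.

```python
import math

def weibull_mle(data):
    """Weibull MLE: solve the profile equation
    sum(t^c ln t)/sum(t^c) - 1/c = mean(ln t) for the shape c (the left
    side is increasing in c), then scale b = (mean(t^c))^(1/c)."""
    n = len(data)
    logs = [math.log(t) for t in data]
    mlog = sum(logs) / n
    def g(c):
        tc = [t ** c for t in data]
        return sum(x * l for x, l in zip(tc, logs)) / sum(tc) - 1.0 / c - mlog
    lo, hi = 0.05, 50.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if g(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    c = 0.5 * (lo + hi)
    b = (sum(t ** c for t in data) / n) ** (1.0 / c)
    return c, b

def log_likelihoods(data):
    """Maximized log-likelihoods under the Weibull and log-normal models;
    the larger value indicates the better-supported distribution."""
    n = len(data)
    logs = [math.log(t) for t in data]
    c, b = weibull_mle(data)
    ll_w = (n * math.log(c) - n * c * math.log(b)
            + (c - 1) * sum(logs) - sum((t / b) ** c for t in data))
    mu = sum(logs) / n           # log-normal MLE is closed-form:
    s2 = sum((l - mu) ** 2 for l in logs) / n
    ll_ln = -0.5 * n * math.log(2 * math.pi * s2) - 0.5 * n - sum(logs)
    return ll_w, ll_ln
```

Since both models have two parameters, comparing the maximized log-likelihoods directly is equivalent to comparing AIC values.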
The Weibull functional form for the energetic particle spectrum at interplanetary shock waves
Laurenza, M.; Consolini, G.; Storini, M.; Pallocchia, G.; Damiani, A.
2016-11-01
Transient interplanetary shock waves are often associated with high energy particle enhancements, which are called energetic storm particle (ESP) events. Here we present a case study of an ESP event, recorded by the SEPT, LET and HET instruments onboard the STEREO B spacecraft on 3 October 2011, in a wide energy range from 0.1 MeV to ∼ 30 MeV. The obtained particle spectrum is found to be reproduced by a Weibull-like shape. Moreover, we show that the Weibull spectrum can be theoretically derived as the asymptotic steady state solution of the diffusion loss equation by assuming anomalous diffusion for particle velocity. The evaluation of the Weibull parameters obtained from particle observations and the power spectral density of the turbulent fluctuations in the shock region support this scenario and suggest that stochastic acceleration can contribute significantly to the acceleration of highly energetic particles at collisionless shock waves.
Weibull-k Revisited: “Tall” Profiles and Height Variation of Wind Statistics
DEFF Research Database (Denmark)
Kelly, Mark C.; Troen, Ib; Ejsing Jørgensen, Hans
2014-01-01
…with height is less understood. Previously we derived a probabilistic model based on similarity theory for calculating the effects of stability and planetary boundary-layer depth upon long-term mean wind profiles. However, some applications (e.g. wind energy estimation) require the Weibull shape parameter (k) … shape parameter. Further, an alternate model for the vertical profile of the Weibull shape parameter is made, improving upon a basis set forth by Wieringa (Boundary-Layer Meteorol, 1989, Vol. 47, 85–110), and connecting with a newly-corrected corollary of the perturbed geostrophic-drag theory of Troen and Petersen (European Wind Atlas, 1989, Risø National Laboratory, Roskilde). Comparing the models for Weibull-k profiles, a new interpretation and explanation is given for the vertical variation of the shape of wind-speed distributions. Results of the modelling are shown for a number of sites, with a discussion…
Two-parameter deformed supersymmetric oscillators with SU_{q1/q2}(n|m) covariance
Algin, A; Arikan, A S; Algin, Abdullah; Arik, Metin; Arikan, Ali Serdar
2003-01-01
A two-parameter deformed superoscillator system with SU_{q1/q2}(n|m) covariance is presented and used to construct a two-parameter deformed N = 2 SUSY algebra. The Fock space representation of the algebra is discussed, and the deformed Hamiltonian for such generalized superoscillators is obtained.
Weibull approximation of LiDAR waveforms for estimating the beam attenuation coefficient.
Montes-Hugo, Martin A; Vuorenkoski, Anni K; Dalgleish, Fraser R; Ouyang, Bing
2016-10-03
Tank experiments were performed at different water turbidities to examine relationships between the beam attenuation coefficient (c) and Weibull shape parameters derived from LiDAR waveforms measured with the Fine Structure Underwater LiDAR (FSUIL). Optical inversions were made at 532 nm, within a c range of 0.045-1.52 m^-1, and based on a LiDAR system having two fields of view (15 and 75.7 mrad) and two linear polarizations. Consistently, the Weibull scale parameter, or P2, showed the strongest covariation with c and was a more accurate proxy than the LiDAR attenuation coefficient.
Energy Technology Data Exchange (ETDEWEB)
Ramos, A.; Muniz-Calvente, M.; Fernandez, P.; Fernandez Cantel, A.; Lamela, M. J.
2015-10-01
Glass and ceramics present brittle behaviour, so a large scatter in the test results is obtained. This dispersion is mainly due to the inevitable presence of micro-cracks on the surface, edge defects, or internal defects, which must be taken into account using an appropriate failure criterion that is probabilistic rather than deterministic. Among the existing probability distributions, the two- or three-parameter Weibull distribution is generally used in fitting material resistance results, although the method of its use is not always correct. First, in this work, a large experimental programme was carried out using annealed glass specimens of different dimensions, based on four-point bending and coaxial double-ring tests. Then, the finite element models made for each type of test, the adjustment of the parameters of the three-parameter Weibull cumulative distribution function (cdf) (λ: location, β: shape, d: scale) for a certain failure criterion, and the calculation of the effective areas from the cumulative distribution function are presented. In summary, this work aims to generalize the use of the three-parameter Weibull function in structural glass elements with stress distributions not analytically described, allowing the proposed probabilistic model to be applied to general loading distributions.
Bias in the Weibull Strength Estimation of a SiC Fiber for the Small Gauge Length Case
Morimoto, Tetsuya; Nakagawa, Satoshi; Ogihara, Shinji
It is known that the single-modal Weibull model describes well the size effect of brittle fiber tensile strength. However, for some ceramic fibers it has been reported that the single-modal Weibull model provides biased estimates of the gauge length dependence. A hypothesis for the bias is that the density of critical defects is very small; thus, the fracture probability of small gauge length samples is distributed in a discrete manner, which makes the Weibull parameters dependent on the gauge length. Tyranno ZMI Si-Zr-C-O fiber was selected as an example fiber. Tensile tests were performed at several gauge lengths. The derived Weibull parameters showed a dependence on the gauge length. Fracture surfaces were observed with SEM. We then classified the fracture surfaces into characteristic fracture patterns. The percentage of each fracture pattern was also found to depend on the gauge length. This may be an important factor in the dependence of the Weibull parameters on the gauge length.
Institute of Scientific and Technical Information of China (English)
WANG Ronghua; FEI Heliang
2004-01-01
In this note, the tampered failure rate model is generalized from the step-stress accelerated life testing setting to progressive stress accelerated life testing for the first time. For the parametric setting in which the life distribution is Weibull, with scale parameter satisfying the inverse power law, maximum likelihood estimation is investigated.
Optimization of a small passive wind turbine based on mixed Weibull-turbulence statistics of wind
2008-01-01
A "low cost full passive structure" of wind turbine system is proposed. The efficiency of such device can be obtained only if the design parameters are mutually adapted through an optimization design approach. An original wind profile generation process mixing Weibull and turbulence statistics is presented. The optimization results are compared with those obtained from a particular but typical time cycle of wind speed.
Weibull statistics effective area and volume in the ball-on-ring testing method
DEFF Research Database (Denmark)
Frandsen, Henrik Lund
2014-01-01
The ball-on-ring method is together with other biaxial bending methods often used for measuring the strength of plates of brittle materials, because machining defects are remote from the high stresses causing the failure of the specimens. In order to scale the measured Weibull strength...
Directory of Open Access Journals (Sweden)
Wahyu Widiyanto
2013-06-01
Wind characteristics, especially event probabilities, have mostly been studied in relation to the wind energy available in an area; their relation to coastal structures is still rarely addressed in the literature, particularly in Indonesia. This article therefore studies the probability distributions commonly used in wind energy analysis, namely the Weibull and Rayleigh distributions, applied to wind data from the Cilacap coast. The data analyzed come from the Board of Meteorology, Climatology and Geophysics, Cilacap branch, covering two years (2009-2011). The mean, variance and standard deviation are computed to obtain the shape factor (k) and scale factor (c) required to construct the Weibull and Rayleigh distribution functions. For this region, the wind speed probabilities follow the Weibull and Rayleigh functions fairly well. The shape parameter obtained is k = 3.26, while the scale parameters are c = 3.64 for Weibull and Cr = 2.44 for Rayleigh. A value of k ≥ 3 indicates that the region has regular and steady wind. The mean wind speed is 3.3 m/s.
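The mean-and-variance route to k and c described above can be sketched with the common Justus power-law approximation for the shape factor; the sample statistics below are hypothetical stand-ins, not the Cilacap data themselves:

```python
import math

def weibull_moments(mean, std):
    """Approximate Weibull shape k and scale c from the sample mean and
    standard deviation (Justus power-law approximation for k)."""
    k = (std / mean) ** (-1.086)
    c = mean / math.gamma(1.0 + 1.0 / k)
    return k, c

def rayleigh_scale(mean):
    """Rayleigh scale from the mean (Rayleigh is Weibull with k = 2)."""
    return mean * math.sqrt(2.0 / math.pi)

# Hypothetical wind statistics in m/s, chosen only for illustration.
k, c = weibull_moments(3.3, 1.1)
print(round(k, 2), round(c, 2), round(rayleigh_scale(3.3), 2))
```

A coefficient of variation around 1/3, as in this example, lands k near 3, consistent with the "regular and steady wind" reading in the abstract.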
Weibull-Based Design Methodology for Rotating Structures in Aircraft Engines
Directory of Open Access Journals (Sweden)
Erwin V. Zaretsky
2003-01-01
The NASA Energy-Efficient Engine (E3-Engine) is used as the basis of a Weibull-based life and reliability analysis. Each component's life, and thus the engine's life, is defined by high-cycle fatigue or low-cycle fatigue. Knowing the cumulative life distribution of each of the components making up the engine, as represented by a Weibull slope, is a prerequisite to predicting the life and reliability of the entire engine. As the engine's Weibull slope increases, the predicted life decreases. The predicted engine lives L5 (95% probability of survival) of approximately 17,000 and 32,000 hr correlate with current engine-maintenance practices without and with refurbishment, respectively. The individual high-pressure turbine (HPT) blade lives necessary to obtain a blade system life L0.1 (99.9% probability of survival) of 9000 hr for Weibull slopes of 3, 6, and 9 are 47,391; 20,652; and 15,658 hr, respectively. For a design life of the HPT disks having probable points of failure equal to or greater than 36,000 hr at a probability of survival of 99.9%, the predicted disk system life L0.1 can vary from 9408 to 24,911 hr.
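The blade-to-system relation quoted above follows weakest-link logic: for n identical Weibull components in strict series, the component life needed at a fixed reliability is the system life scaled by n^(1/β). The sketch below assumes a blade count of 100 purely for illustration; the paper's exact figures come from its own analysis and will not match precisely:

```python
def component_life_for_system_life(l_system, n_components, weibull_slope):
    """Weakest-link (strict series) scaling: if n identical Weibull components
    must jointly reach life l_system at a fixed reliability, each one needs a
    life of l_system * n ** (1 / beta)."""
    return l_system * n_components ** (1.0 / weibull_slope)

# Assumed blade count of 100 (hypothetical, not taken from the paper).
for beta in (3, 6, 9):
    print(beta, round(component_life_for_system_life(9000.0, 100, beta)))
```

The steeper the Weibull slope, the less extra component life the series system demands, matching the trend in the quoted blade lives.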
Modified Weibull Distribution for Analyzing the Tensile Strength of Bamboo Fibers
Directory of Open Access Journals (Sweden)
Fang Wang
2014-12-01
There is growing evidence that the standard Weibull strength distribution is not always accurate for describing the variability in tensile strength and its dependence on the gauge size of brittle fibers. In this work, a modified Weibull model incorporating the diameter variation of bamboo fiber is proposed to investigate the effect of fiber length and diameter on the tensile strength. Fiber strengths are obtained for lengths ranging from 20 to 60 mm and diameters ranging from 196.6 to 584.3 μm through tensile tests. It is shown that as the within-fiber diameter variation increases, the fracture strength of the bamboo fiber decreases. In addition, the accuracy of weak-link scaling predictions based on the standard and modified Weibull distributions is assessed, which indicates that the modified distribution correlates better with the experimental data than the standard model. The result highlights the accuracy of the modified Weibull model for characterizing the strength and predicting the size dependence of bamboo fiber.
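The weak-link scaling that the modified model refines is, in its standard Weibull form, a simple power law in gauge length. A sketch with hypothetical numbers (the modified model of the paper additionally corrects for diameter variation, which is not reproduced here):

```python
def weak_link_strength(sigma_ref, l_ref, l_new, m):
    """Standard Weibull weak-link scaling of characteristic strength with
    gauge length: sigma(L) = sigma_ref * (l_ref / L) ** (1 / m)."""
    return sigma_ref * (l_ref / l_new) ** (1.0 / m)

# Hypothetical values: 600 MPa at a 20 mm gauge, Weibull modulus m = 4.
print(round(weak_link_strength(600.0, 20.0, 60.0, 4.0), 1))
```

Tripling the gauge length with m = 4 drops the predicted strength by about a quarter, which is the kind of size effect the tensile tests in the abstract probe.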
Pallocchia, G.; Laurenza, M.; Consolini, G.
2017-03-01
Some interplanetary shocks are associated with short-term and sharp particle flux enhancements near the shock front. Such intensity enhancements, known as shock-spike events (SSEs), represent a class of relatively energetic phenomena as they may extend to energies of some tens of MeV or even beyond. Here we present an SSE case study in order to shed light on the nature of the particle acceleration involved in this kind of event. Our observations refer to an SSE registered on 2011 October 3 at 22:23 UT, by STEREO B instrumentation when, at a heliocentric distance of 1.08 au, the spacecraft was swept by a perpendicular shock moving away from the Sun. The main finding from the data analysis is that a Weibull distribution represents a good fitting function to the measured particle spectrum over the energy range from 0.1 to 30 MeV. To interpret such an observational result, we provide a theoretical derivation of the Weibull spectrum in the framework of the acceleration by “killed” stochastic processes exhibiting power-law growth in time of the velocity expectation, such as the classical Fermi process. We find an overall coherence between the experimental values of the Weibull spectrum parameters and their physical meaning within the above scenario. Hence, our approach based on the Weibull distribution proves to be useful for understanding SSEs. With regard to the present event, we also provide an alternative explanation of the Weibull spectrum in terms of shock-surfing acceleration.
QUASI-DIAGONALIZATION FOR A SINGULARLY PERTURBED DIFFERENTIAL SYSTEM WITH TWO PARAMETERS
Institute of Scientific and Technical Information of China (English)
Anonymous
2011-01-01
By two successive linear transformations, a singularly perturbed differential system with two parameters is quasi-diagonalized. The method of variation of constants and the contraction mapping principle are used to prove the existence of the transformations.
Quasi-sure Product Variation of Two-parameter Smooth Martingales on the Wiener Space
Institute of Scientific and Technical Information of China (English)
Ji Cheng LIU; Jia Gang REN
2006-01-01
In this paper, we prove that the process of product variation of a two-parameter smooth martingale admits an ∞-modification, which can be constructed as the quasi-sure limit of sums of the corresponding product variations.
Directory of Open Access Journals (Sweden)
Yu. Mishura
1996-01-01
Full Text Available We study two-parameter coordinate-wise C0-semigroups and their generators, as well as two-parameter evolutions and differential equations up to the second order for them. These results are applied to obtain the Hille-Yosida theorem for homogeneous Markov fields of the Feller type and to establish forward, backward, and mixed Kolmogorov equations for nonhomogeneous diffusion fields on the plane.
Theory of two-parameter Markov chain with an application in warranty study
Calvache, Álvaro
2012-01-01
In this paper we extend the classical results of Kolmogorov's backward and forward equations to the case of a two-parameter Markov process. These equations relate the infinitesimal transition matrix of the two-parameter Markov process to its transition probabilities. However, solving these equations directly is not possible, and a numerical procedure is required. We give an alternative method using the double Laplace transform of the transition probability matrix and of the infinitesimal transition matrix of the process. An illustrative example is presented, in which we consider a two-parameter warranty model where a system can be in either of two states, working or failed. We calculate the transition density matrix of these states and also the cost of the warranty for the proposed model.
Transverse Momentum Distribution in Heavy Ion Collision using q-Weibull Formalism
Dash, Sadhana
2016-01-01
We have implemented the Tsallis q-statistics in the Weibull model of particle production, known as the q-Weibull distribution, to describe the transverse-momentum (pT) distribution of charged hadrons at mid-rapidity measured at RHIC and LHC energies. The model describes the data remarkably well over the entire measured pT range in nucleus-nucleus and nucleon-nucleon collisions. The proposed distribution is based on the non-extensive Tsallis q-statistics, which replaces the usual thermal equilibrium assumption of hydrodynamical models. The parameters of the distribution can be related to various aspects of the complex dynamics associated with such collision processes.
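One common parameterization of the q-Weibull density replaces the exponential in the Weibull pdf with the Tsallis q-exponential; a sketch follows (the parameter values are hypothetical, not the fitted RHIC/LHC values):

```python
import math

def q_exp(u, q):
    """Tsallis q-exponential; reduces to exp(u) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(u)
    base = 1.0 + (1.0 - q) * u
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0

def q_weibull_pdf(x, q, k, lam):
    """q-Weibull density (one common form, assumed here):
    (2 - q) * (k / lam) * (x / lam)**(k - 1) * e_q(-(x / lam)**k)."""
    return (2.0 - q) * (k / lam) * (x / lam) ** (k - 1.0) * q_exp(-((x / lam) ** k), q)

# q slightly above 1 produces a heavier, power-law-like tail than plain Weibull.
print(q_weibull_pdf(2.0, 1.1, 1.2, 0.5))
```

At q = 1 the expression collapses to the ordinary Weibull pdf; for q > 1 the tail decays as a power law, which is what lets a single curve track both the low-pT thermal region and the high-pT tail.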
Directory of Open Access Journals (Sweden)
J. Szymszal
2007-07-01
The first part of the study describes the methods used to determine the Weibull modulus and the related reliability index of hypereutectic silumins containing about 17% Si, assigned for the manufacture of high-duty castings for automotive and aviation applications. The second part discusses the importance of chemical composition, including additions of 3% Cu, 1.5% Ni and 1.5% Mg, while the third part focuses on the effect of process history, including mould type (sand or metal) as well as the inoculation process and heat treatment (solutioning and ageing) applied to the cast AlSi17Cu3Mg1.5Ni1.5 alloy, on the run of the Weibull distribution function and the reliability index calculated for the tensile strength Rm of the investigated alloys.
An EOQ model with time dependent Weibull deterioration and ramp type demand
Directory of Open Access Journals (Sweden)
Chaitanya Kumar Tripathy
2011-04-01
Full Text Available This paper presents an order level inventory system with time dependent Weibull deterioration and ramp type demand rate where production and demand are time dependent. The proposed model of this paper considers economic order quantity under two different cases. The implementation of the proposed model is illustrated using some numerical examples. Sensitivity analysis is performed to show the effect of changes in the parameters on the optimum solution.
Directory of Open Access Journals (Sweden)
Soontorn Boonta
2013-01-01
In this study, we applied Randomized Neighborhood Search (RNS) to estimate the Weibull parameters used to determine the severity of fire accidents; the data were provided by the Thai Reinsurance Public Co., Ltd. We compared this technique with other frequently used techniques, namely the maximum likelihood estimator (MLE), the method of moments (MOM), the least squares method (LSM) and the weighted least squares method (WLSM), and found that RNS estimates the parameters more accurately than do MLE, MOM, LSM or WLSM.
Directory of Open Access Journals (Sweden)
Jain Sanjay
2010-01-01
In this paper an inventory model is developed with ramp-type demand, starting with shortage, and three-parameter Weibull distribution deterioration. A brief analysis of the costs involved is carried out with an example.
Directory of Open Access Journals (Sweden)
Abul Kalam Azad
2014-05-01
The best Weibull distribution methods for assessing wind energy potential at different altitudes in desired locations are statistically diagnosed in this study. Seven different methods, namely the graphical method (GM), method of moments (MOM), standard deviation method (STDM), maximum likelihood method (MLM), power density method (PDM), modified maximum likelihood method (MMLM) and equivalent energy method (EEM), were used to estimate the Weibull parameters, and six statistical tools, namely relative percentage of error, root mean square error (RMSE), mean percentage of error, mean absolute percentage of error, chi-square error and analysis of variance, were used to rank the methods precisely. The statistical fits of the measured and calculated wind speed data are assessed to judge the performance of the methods. The capacity factor and total energy generated by a small model wind turbine are calculated by numerical integration using the trapezoidal and Simpson's rules. The results show that MOM and MLM are the most efficient methods for determining the values of k and c to fit Weibull distribution curves.
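The two winning methods in the abstract, MOM and MLM, can be compared on synthetic Weibull data; this is a sketch with an assumed true shape and scale, using bisection on the standard profile likelihood equation for the MLE (not the paper's implementation):

```python
import math, random

def weibull_mle(xs, iters=60):
    """Weibull MLE: solve the profile equation for k by bisection, then c."""
    logs = [math.log(x) for x in xs]
    mean_log = sum(logs) / len(xs)
    def g(k):  # increasing in k; root is the MLE of the shape
        num = sum(x ** k * math.log(x) for x in xs)
        den = sum(x ** k for x in xs)
        return num / den - 1.0 / k - mean_log
    lo, hi = 0.05, 50.0
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if g(mid) > 0:
            hi = mid
        else:
            lo = mid
    k = 0.5 * (lo + hi)
    c = (sum(x ** k for x in xs) / len(xs)) ** (1.0 / k)
    return k, c

def weibull_mom(xs):
    """Method of moments via the Justus approximation for the shape."""
    n = len(xs)
    mean = sum(xs) / n
    std = math.sqrt(sum((x - mean) ** 2 for x in xs) / (n - 1))
    k = (std / mean) ** (-1.086)
    return k, mean / math.gamma(1.0 + 1.0 / k)

# Synthetic sample with assumed true parameters k = 2, c = 6 (m/s).
random.seed(1)
true_k, true_c = 2.0, 6.0
xs = [true_c * (-math.log(1.0 - max(random.random(), 1e-12))) ** (1.0 / true_k)
      for _ in range(3000)]
k_hat, c_hat = weibull_mle(xs)
k_m, c_m = weibull_mom(xs)
print(round(k_hat, 2), round(c_hat, 2), round(k_m, 2), round(c_m, 2))
```

On clean Weibull samples both estimators land close to the truth; the ranking in the paper comes from how they behave on real, imperfect wind records.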
An EOQ Model for Three parameter Weibull Deteriorating Item with Partial Backlogging
Directory of Open Access Journals (Sweden)
L.M. Pradhan
2013-03-01
Background: Business organisations face intense competition these days. To withstand the competition and remain in the front row, an enterprise should have an optimal, profitable business plan. Researchers in recent years have developed various inventory models for deteriorating items covering a range of practical situations. Partial backlogging is a relatively new concept introduced in developing models for Weibull-deteriorating items. Methodology: In this paper an inventory model is developed for a single item with three-parameter Weibull deterioration and partial backlogging. The demand rate is constant and the lead time is zero. During the stock-out period the backlogging rate is variable and depends on the length of the waiting time until the next replenishment. Results and conclusion: The optimal order quantity and the total variable cost during a cycle are derived for the proposed inventory model. The results are illustrated with a numerical example and a sensitivity analysis.
Valuation of European derivatives with a mixture of Weibull distributions
Directory of Open Access Journals (Sweden)
Andrés Mauricio Molina
2015-07-01
The Black-Scholes model for valuing European options is widely used in the market because it is easy to implement. However, it becomes inaccurate for assets whose dynamics do not follow a lognormal distribution, so new distributions are needed to value options written on such underlying assets. Several researchers have worked on new derivative valuation formulas assuming different distributions, either for the price of the underlying asset or for its return. This article presents two valuation formulas: one modifies the formula based on the two-parameter Weibull distribution proposed by Savickas (2002) by adding two new parameters (scale and location), and the other assumes that the asset distribution is a mixture of Weibull distributions. Comparisons of these models with existing ones, such as Black-Scholes and the Savickas model with a simple Weibull distribution, are also presented.
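The mixture idea can be illustrated without Savickas's closed form by brute-force quadrature: price a European call as the discounted expected payoff under an assumed two-component Weibull mixture for the terminal price. All parameters below are hypothetical, and the density is simply assumed to be the risk-neutral one:

```python
import math

def weibull_pdf(s, k, lam):
    """Two-parameter Weibull density."""
    if s <= 0:
        return 0.0
    return (k / lam) * (s / lam) ** (k - 1.0) * math.exp(-((s / lam) ** k))

def mixture_call_price(strike, w, k1, lam1, k2, lam2,
                       r=0.05, t=1.0, s_max=400.0, n=20000):
    """Discounted expected call payoff when the terminal price follows a
    two-component Weibull mixture (trapezoidal quadrature)."""
    h = s_max / n
    total = 0.0
    for i in range(n + 1):
        s = i * h
        f = w * weibull_pdf(s, k1, lam1) + (1.0 - w) * weibull_pdf(s, k2, lam2)
        weight = 0.5 if i in (0, n) else 1.0
        total += weight * max(s - strike, 0.0) * f
    return math.exp(-r * t) * total * h

# Hypothetical mixture: a peaked component near 105 and a fatter one near 90.
price = mixture_call_price(100.0, 0.6, 8.0, 105.0, 3.0, 90.0)
print(round(price, 2))
```

The mixture's appeal is exactly what this sketch exposes: two Weibull components can produce skew and excess kurtosis that a single lognormal cannot.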
A Finite Element Study of the Bending Behavior of Beams Resting on Two-Parameter Elastic Foundation
Directory of Open Access Journals (Sweden)
Iancu-Bogdan Teodoru
2006-01-01
Although Winkler's model is a poor representation of many practical subgrade or subbase materials, it has been widely used in soil-structure problems for almost a century and a half. Foundations represented by the Winkler model cannot sustain shear stresses, and hence discontinuities in adjacent spring displacements can occur. This is the prime shortcoming of the model, and in practical applications it may result in significant inaccuracies in the evaluated structural response. To overcome this problem, many researchers have proposed various mechanical foundation models that consider interaction with the surroundings. Among them is the class of two-parameter foundations, so named because a second parameter introduces interaction between adjacent springs, in addition to the first parameter of the ordinary Winkler model. This class includes the Filonenko-Borodich, Pasternak, generalized, and Vlasov foundations. Mathematically, the equations describing the reaction of two-parameter foundations are equilibrium equations, and the only difference is the definition of the parameters. For convenience of discussion, the Pasternak foundation is adopted in the present paper. To analyse the bending behavior of a Euler-Bernoulli beam resting on a two-parameter elastic foundation, a displacement-based finite element (FE) formulation, built on the cubic displacement function of the governing differential equation, is introduced. The effects of the shear stiffness of the Pasternak model on the mechanical quantities are discussed in comparison with those of the Winkler model. Several numerical case studies illustrate the accuracy of the formulation and the importance of the soil shearing effect in the vertical direction associated with a continuous elastic foundation.
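The governing equation EI·w'''' − k_G·w'' + k·w = q can be discretized quite directly; the sketch below uses finite differences rather than the paper's finite elements (a deliberate simplification), with hypothetical stiffness values and simply supported ends:

```python
def solve_beam_on_pasternak(ei, k_w, k_g, q, length, n):
    """Finite-difference solution of EI*w'''' - k_g*w'' + k_w*w = q for a
    simply supported beam on a two-parameter (Pasternak) foundation.
    Returns deflections at the n-1 interior nodes (w = 0 at both ends)."""
    h = length / n
    m = n - 1
    a = [[0.0] * m for _ in range(m)]
    b = [q] * m
    c4, c2 = ei / h ** 4, k_g / h ** 2
    for i in range(m):
        # five-point stencil for w'''' and three-point stencil for -w''
        stencil = {i - 2: c4, i - 1: -4 * c4 - c2, i: 6 * c4 + 2 * c2 + k_w,
                   i + 1: -4 * c4 - c2, i + 2: c4}
        for j, v in stencil.items():
            if 0 <= j < m:
                a[i][j] += v
            elif j == -2:        # ghost node: w'' = 0 at the left support
                a[i][0] += -v
            elif j == m + 1:     # ghost node: w'' = 0 at the right support
                a[i][m - 1] += -v
            # j == -1 or j == m hit the zero boundary deflections: drop
    # naive Gaussian elimination with partial pivoting
    for col in range(m):
        piv = max(range(col, m), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, m):
            f = a[r][col] / a[col][col]
            for cc in range(col, m):
                a[r][cc] -= f * a[col][cc]
            b[r] -= f * b[col]
    w = [0.0] * m
    for r in range(m - 1, -1, -1):
        s = sum(a[r][cc] * w[cc] for cc in range(r + 1, m))
        w[r] = (b[r] - s) / a[r][r]
    return w

# Hypothetical values: adding Pasternak shear stiffness reduces compliance.
w_winkler = solve_beam_on_pasternak(1.0, 100.0, 0.0, 1.0, 1.0, 16)
w_pasternak = solve_beam_on_pasternak(1.0, 100.0, 50.0, 1.0, 1.0, 16)
print(round(sum(w_winkler), 6), round(sum(w_pasternak), 6))
```

Because the shear term adds a positive-definite contribution to the stiffness matrix, total compliance under the same load must drop, which is the stiffening effect of the second parameter that the abstract discusses.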
Harun; Draisma; Frankena; Veeneklaas; Van Kampen M
1999-05-07
In this paper we tested the Weibull function and beta-binomial distribution to analyse and predict nest hatchability, using empirical data on hatchability in Muscovy duck (Cairina moschata) eggs under natural incubation (932 successfully incubated nests and 11 822 eggs). The estimated parameters of the Weibull function and beta-binomial model were compared with the logistic regression analysis. The maximum likelihood estimation of the parameters was used to quantify simultaneously the influence of the nesting behaviour and the duration of the reproduction cycle on hatchability. The estimated parameters showed that the hatchability was not affected in natural dump nests, but in artificial dump nests and in nests with non-term eggs the hatchability was reduced by 10 and 25%, respectively. Similar results were obtained using logistic regression. Both models provided a satisfactory description of the observed data set, but the beta-binomial model proved to have more parameters with practical and biological meaningful interpretations, because this model is able to quantify and incorporate the unexplained variation in a single parameter theta (which is a variance measure). Copyright 1999 Academic Press.
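The beta-binomial model favoured above handles overdispersion by letting the hatch probability itself vary. Its pmf is standard; the clutch size and shape parameters below are hypothetical, not the Muscovy duck estimates:

```python
import math

def log_beta(a, b):
    """log of the Beta function via lgamma, to stay numerically stable."""
    return math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)

def beta_binomial_pmf(k, n, alpha, beta):
    """P(K = k) when the binomial success probability is Beta(alpha, beta):
    C(n, k) * B(k + alpha, n - k + beta) / B(alpha, beta)."""
    log_choose = math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
    return math.exp(log_choose + log_beta(k + alpha, n - k + beta) - log_beta(alpha, beta))

# Hypothetical clutch of 12 eggs with overdispersed hatch probability.
probs = [beta_binomial_pmf(k, 12, 8.0, 2.0) for k in range(13)]
print(round(sum(probs), 6))  # pmf sums to 1
```

The variance-controlling role that the abstract attributes to the extra parameter shows up here as alpha + beta: smaller totals mean more between-nest spread for the same mean hatchability.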
A Jacobi-Davidson type method for a right definite two-parameter eigenvalue problem
Hochstenbach, M.; Plestenjak, B.
2001-01-01
We present a new numerical iterative method for computing selected eigenpairs of a right definite two-parameter eigenvalue problem. The method works even without good initial approximations and is able to tackle large problems that are too expensive for existing methods. The new method is similar
Accuracy of Parameter Estimation in Gibbs Sampling under the Two-Parameter Logistic Model.
Kim, Seock-Ho; Cohen, Allan S.
The accuracy of Gibbs sampling, a Markov chain Monte Carlo procedure, was considered for estimation of item and ability parameters under the two-parameter logistic model. Memory test data were analyzed to illustrate the Gibbs sampling procedure. Simulated data sets were analyzed using Gibbs sampling and the marginal Bayesian method. The marginal…
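The two-parameter logistic (2PL) model under study, and one Metropolis-within-Gibbs update of an ability parameter, can be sketched as follows; the item parameters are hypothetical, and a full Gibbs sampler would alternate such updates with draws of the item parameters a and b:

```python
import math, random

def p_correct(theta, a, b):
    """Two-parameter logistic IRT model: P(correct) = 1 / (1 + exp(-a*(theta - b)))."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def log_post_theta(theta, responses, items):
    """Log posterior of ability theta (standard normal prior) given 0/1
    responses and (a, b) item parameters."""
    lp = -0.5 * theta * theta
    for u, (a, b) in zip(responses, items):
        p = p_correct(theta, a, b)
        lp += math.log(p) if u == 1 else math.log(1.0 - p)
    return lp

def metropolis_theta(theta, responses, items, step=0.5):
    """One Metropolis-within-Gibbs update of theta, as used inside a sampler
    that also updates the item parameters (not shown)."""
    prop = theta + random.gauss(0.0, step)
    delta = log_post_theta(prop, responses, items) - log_post_theta(theta, responses, items)
    if random.random() < math.exp(min(0.0, delta)):
        return prop
    return theta

items = [(1.2, -0.5), (0.8, 0.0), (1.5, 0.7)]  # hypothetical (a, b) pairs
print(p_correct(0.0, 1.0, 0.0))  # 0.5 when theta equals the difficulty b
```

The discrimination a controls the slope at theta = b, which is exactly the second parameter whose recovery the simulation study compares against the marginal Bayesian method.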
Codimension-Two Bifurcation Analysis in Hindmarsh-Rose Model with Two Parameters
Institute of Scientific and Technical Information of China (English)
DUAN Li-Xia; LU Qi-Shao
2005-01-01
Bifurcation phenomena in a Hindmarsh-Rose neuron model are investigated. Special attention is paid to the bifurcation structures of two parameters, where codimension-two generalized-Hopf and fold-Hopf bifurcations occur. The classification of firing patterns as well as the transition mechanisms in different regions of the parameter plane are obtained.
EQUIDISTANT TOOTH GENERATION ON NONCYLINDRICAL SURFACES FOR TWO-PARAMETER GEARING
Directory of Open Access Journals (Sweden)
Yuriy GUTSALENKO
2011-11-01
The design of noncylindrical tooth surfaces for two-parameter gearing is investigated using the example of bevel gears with constant normal pitch for gearing variators. The engineering is based on a special applied development of the mathematical theory of multiparametric mappings of space.
SINGULARLY PERTURBED SOLUTION FOR THIRD ORDER NONLINEAR EQUATIONS WITH TWO PARAMETERS
Institute of Scientific and Technical Information of China (English)
Anonymous
2007-01-01
A class of singularly perturbed boundary value problems for a nonlinear equation of the third order with two parameters is considered. Under suitable conditions, using the theory of differential inequalities, the existence and asymptotic behavior of the solution of the boundary value problem are studied.
Application of a two-parameter quantum algebra to rotational spectroscopy of nuclei
Barbier, R.; Kibler, M.
1996-10-01
A two-parameter quantum algebra Uqp(u2) is briefly investigated in this paper. The basic ingredients of a model based on the Uqp(u2) symmetry, the qp-rotator model, are presented in detail. Some general tendencies arising from the application of this model to the description of rotational bands of various atomic nuclei are summarized.
Directory of Open Access Journals (Sweden)
Asoke Kumar Bhunia
2014-06-01
In this paper, an attempt is made to develop two inventory models for deteriorating items with variable demand dependent on the selling price and the frequency of advertisement. In the first model, shortages are not allowed; in the second, they are allowed and partially backlogged at a variable rate dependent on the waiting time until the arrival of the next lot. In both models, the deterioration rate follows a three-parameter Weibull distribution, and the transportation cost for replenishing the order quantity is considered explicitly; this cost depends on the lot size as well as the distance from source to destination. The corresponding models have been formulated and solved, and two numerical examples illustrate the results, whose significant features are discussed. Finally, based on these examples, the effects of different parameters on the initial stock level, the shortage level (in the case of the second model only) and the cycle length, along with the optimal profit, have been studied by sensitivity analyses, varying one parameter at a time while keeping the others fixed.
A SINGULARLY PERTURBED PROBLEM OF THIRD ORDER EQUATION WITH TWO PARAMETERS
Institute of Scientific and Technical Information of China (English)
Anonymous
2009-01-01
A singularly perturbed problem for a third order equation with two parameters is studied. Using the singular perturbation method, the structure of asymptotic solutions to the problem is discussed under three possible cases of the two related small parameters. The results reveal the different structures and limit behaviors of the solutions in the three cases, and comparison with the exact solutions of the autonomous equation shows the asymptotic solutions to be quite accurate.
Institute of Scientific and Technical Information of China (English)
WU Xiao-Bo; MO Juan; YANG Ming-Hao; ZHENG Qiao-Hua; GU Hua-Guang; REN Wei
2008-01-01
Two different bifurcation scenarios, one novel and the other relatively simple, in the transition procedures of neural firing patterns are studied in biological experiments on a neural pacemaker by adjusting two parameters. The experimental observations are simulated with a relevant theoretical model neuron. The deterministic non-periodic firing pattern lying within the novel bifurcation scenario is suggested to be a new case of chaos, which has not been observed in previous neurodynamical experiments.
Hardy-Hilbert's Inequalities with Two Parameters
Institute of Scientific and Technical Information of China (English)
徐景实
2007-01-01
In this paper, the author discusses Hardy-Hilbert's inequalities with two parameters and their variants. These generalize the one-parameter Hardy-Hilbert inequalities found in the recent literature.
FINITE SINGULARITIES OF TOTAL MULTIPLICITY FOUR FOR A PARTICULAR SYSTEM WITH TWO PARAMETERS
Directory of Open Access Journals (Sweden)
Simona Cristina Nartea
2011-07-01
A particular Lotka-Volterra system with two parameters describing the dynamics of two competing species is analyzed from the algebraic viewpoint. This study involves the invariants and comitants of the system determined by applying the affine transformation group. First, the conditions for the existence of four (distinct or equal) finite singularities of the general system are proved; then the particular case is studied.
Directory of Open Access Journals (Sweden)
Anupam Pathak
2014-11-01
Problem statement: The two-parameter exponentiated Rayleigh distribution has been widely used, especially in the modelling of lifetime event data. It provides a statistical model with a wide variety of applications in many areas, and its main advantage is its suitability for lifetime data relative to other distributions. The uniformly minimum variance unbiased (UMVU) and maximum likelihood estimation methods are standard ways to estimate the parameters of the distribution. In this study we explore and compare the performance of the UMVU and maximum likelihood estimators of the reliability functions R(t) = P(X > t) and P = P(X > Y) for the two-parameter exponentiated Rayleigh distribution. Approach: A new technique for obtaining these parametric functions is introduced, in which a major role is played by the powers of the parameter(s), and the functional forms of the parametric functions to be estimated are not needed. We explore the performance of these estimators numerically under varying conditions, comparing bias, mean square error (MSE), 95% confidence length and the corresponding coverage percentage through a simulation study. Conclusion: Based on the simulation results, the UMVUEs of R(t) and P for the two-parameter exponentiated Rayleigh distribution are found to be superior to the MLEs of R(t) and P.
Weibull analysis and flexural strength of hot-pressed core and veneered ceramic structures.
Bona, Alvaro Della; Anusavice, Kenneth J; DeHoff, Paul H
2003-11-01
To test the hypothesis that the Weibull moduli of single- and multilayer ceramics are controlled primarily by the structural reliability of the core ceramic. Methods: Seven groups of 20 bar specimens (25 x 4 x 1.2 mm) were made from the following materials: (1) IPS Empress, a hot-pressed (HP) leucite-based core ceramic; (2) IPS Empress2, a HP lithia-based core ceramic; (3 and 7) Evision, a HP lithia-based core ceramic (ES); (4) IPS Empress2 body, a glass veneer; (5) ES (1.1 mm thick) plus a glaze layer (0.1 mm); and (6) ES (0.8 mm thick) plus veneer (0.3 mm) and glaze (0.1 mm). Each specimen was subjected to four-point flexure loading at a cross-head speed of 0.5 mm/min while immersed in distilled water at 37 °C, except for Group 7, which was tested in a dry environment. Failure loads were recorded and the fracture surfaces were examined using SEM. ANOVA and Duncan's multiple range test were used for statistical analysis. No significant differences were found between the mean flexural strength values of Groups 2, 3, 5, and 6 or between Groups 1 and 4 (p > 0.05). However, significant differences were found between dry (Group 7) and wet (Groups 1-6) conditions. Glazing had no significant effect on the flexural strength or Weibull modulus. The strength and Weibull modulus of the ES ceramic were similar to those of Groups 5 and 6. The structural reliability of a veneered core ceramic is controlled primarily by that of the core ceramic.
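A Weibull modulus like the one compared between these groups is commonly estimated by median-rank linear regression of ln(-ln(1 - F)) against ln(strength); a sketch on synthetic strengths (generated from an assumed modulus of 10, not the dental ceramic data):

```python
import math

def weibull_modulus(strengths):
    """Weibull modulus m from median-rank regression:
    ln(-ln(1 - F_i)) = m * ln(sigma_i) - m * ln(sigma_0), slope = m."""
    xs = sorted(strengths)
    n = len(xs)
    pts = [(math.log(s), math.log(-math.log(1.0 - (i + 0.5) / n)))
           for i, s in enumerate(xs)]
    mx = sum(p[0] for p in pts) / n
    my = sum(p[1] for p in pts) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in pts)
    sxx = sum((p[0] - mx) ** 2 for p in pts)
    return sxy / sxx

# Synthetic strengths from m = 10, sigma_0 = 400 MPa, generated at the same
# plotting positions used by the regression, so m is recovered almost exactly.
n = 20
data = [400.0 * (-math.log(1.0 - (i + 0.5) / n)) ** (1.0 / 10.0) for i in range(n)]
print(round(weibull_modulus(data), 3))  # → 10.0
```

A higher modulus means a narrower strength distribution, which is why it serves as the structural-reliability measure in comparisons like the one above.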
A spatial scan statistic for survival data based on Weibull distribution.
Bhatt, Vijaya; Tiwari, Neeraj
2014-05-20
The spatial scan statistic has been developed as a geographical cluster detection analysis tool for different types of data sets such as Bernoulli, Poisson, ordinal, normal and exponential. We propose a scan statistic for survival data based on Weibull distribution. It may also be used for other survival distributions, such as exponential, gamma, and log normal. The proposed method is applied on the survival data of tuberculosis patients for the years 2004-2005 in Nainital district of Uttarakhand, India. Simulation studies reveal that the proposed method performs well for different survival distribution functions.
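The zone comparisons inside such a scan statistic rest on Weibull log-likelihoods for censored survival data. This is a sketch of that likelihood, not the authors' scan implementation, with hypothetical follow-up times:

```python
import math

def weibull_loglik(times, events, shape, scale):
    """Right-censored Weibull log-likelihood: observed failures contribute
    the log density, censored observations the log survival function."""
    ll = 0.0
    for t, d in zip(times, events):
        z = (t / scale) ** shape
        if d:  # observed failure: log f(t)
            ll += math.log(shape / scale) + (shape - 1.0) * math.log(t / scale) - z
        else:  # censored at t: log S(t)
            ll += -z
    return ll

# Hypothetical follow-up times (years) and event indicators (1 = failure).
times = [2.1, 3.5, 0.9, 4.2, 1.7, 5.0]
events = [1, 1, 1, 0, 1, 0]
# With shape fixed at 1 (the exponential special case) the scale MLE is
# total time at risk divided by the number of events.
mle_scale = sum(times) / sum(events)
print(round(weibull_loglik(times, events, 1.0, mle_scale), 4))
```

In the scan setting, this quantity is maximized separately inside and outside each candidate zone, and the likelihood ratio flags zones where survival differs.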
A class of generalized beta distributions, Pareto power series and Weibull power series
Lemos de Morais, Alice
2009-01-01
In this dissertation we work with three classes of probability distributions: one already known in the literature, the class of beta generalized distributions (Beta-G), and two new classes introduced in this thesis, based on composing the Pareto and Weibull distributions with the class of discrete power series distributions. We give a general review of the Beta-G class and introduce a special case, the beta generalized logistic distribution of type IV (BGL(IV)). In...
Directory of Open Access Journals (Sweden)
Lianxia Zhao
2016-01-01
An inventory model for Weibull-distributed deteriorating items is considered so as to minimize the total cost per unit time. The model starts with shortages, allows partial backlogging, and assumes a trapezoidal demand rate. By analyzing the model, an efficient solution procedure is proposed to determine the optimal replenishment; the optimal order quantity and the average total cost are also obtained. Finally, numerical examples illustrate the theoretical results, and a sensitivity analysis of the major parameters with respect to the stability of the optimal solution is carried out.
Institute of Scientific and Technical Information of China (English)
Xiao Hailin; Nie Zaiping; Yang Shiwen
2007-01-01
Novel closed-form expressions are presented for the average channel capacity of dual selection diversity, as well as for the bit-error rate (BER) of several coherent and noncoherent digital modulation schemes, in correlated Weibull fading channels with nonidentical statistics. The results are expressed in terms of Meijer's G-function, which can easily be evaluated numerically. Simulation results are presented to validate the proposed theoretical analysis and to examine the effects of fading severity on the quantities concerned.
Comparison of Estimators for Exponentiated Inverted Weibull Distribution Based on Grouped Data
Directory of Open Access Journals (Sweden)
Amal S. Hassan
2014-04-01
Full Text Available In many situations, data is available only in grouped form instead of as a complete sample. This paper presents estimation of the population parameters of the exponentiated inverted Weibull distribution based on grouped data with equi- and unequi-spaced grouping. Several alternative estimation schemes are considered, such as the method of maximum likelihood, least lines, least squares, minimum chi-square, and modified minimum chi-square. Since these methods of estimation do not provide closed-form solutions, numerical procedures are applied. The root mean squared error of the resulting estimators is used as the comparison criterion, measuring both the accuracy and the precision of each parameter estimate.
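The grouped-data likelihood described in this abstract can be sketched in a few lines: bin counts are scored against the probability mass the CDF assigns to each bin, and the maximizing parameters are located numerically. The parametrization F(x) = exp(-θ·x^(-β)) and the grid-search optimizer below are illustrative assumptions, not the paper's exact scheme.

```python
import math
import random

random.seed(42)

# Assumed parametrization of the exponentiated inverted Weibull:
# CDF F(x) = exp(-theta * x**(-beta)); not necessarily the paper's exact form.
theta_true, beta_true = 1.0, 2.0

# Simulate a complete sample by inverse transform, then reduce it to bin counts.
n = 5000
xs = [(-math.log(1.0 - random.random()) / theta_true) ** (-1.0 / beta_true)
      for _ in range(n)]

edges = [0.0, 0.5, 0.8, 1.1, 1.5, 2.0, 3.0, float("inf")]
counts = [sum(1 for x in xs if lo < x <= hi)
          for lo, hi in zip(edges, edges[1:])]

def cdf(x, theta, beta):
    return 0.0 if x <= 0.0 else math.exp(-theta * x ** (-beta))

def grouped_loglik(theta, beta):
    # Each bin contributes count * log(probability mass in the bin).
    ll = 0.0
    for lo, hi, c in zip(edges, edges[1:], counts):
        p = cdf(hi, theta, beta) - cdf(lo, theta, beta)
        if p <= 0.0:
            return -math.inf
        ll += c * math.log(p)
    return ll

# A crude grid search stands in for a proper numerical optimizer.
_, theta_hat, beta_hat = max(
    (grouped_loglik(t / 100.0, b / 100.0), t / 100.0, b / 100.0)
    for t in range(50, 201, 2) for b in range(100, 301, 2))
print(theta_hat, beta_hat)
```

With 5000 observations in seven bins, the grid maximizer lands close to the generating parameters, illustrating that binning loses surprisingly little information here.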
Directory of Open Access Journals (Sweden)
Wu Kun-Shan
2002-01-01
Full Text Available In this paper, an EOQ inventory model is developed for items depleted not only by time-varying demand but also by Weibull-distributed deterioration, in which the inventory is permitted to start with shortages and end without shortages. A theory is developed to obtain the optimal solution of the problem; it is then illustrated with the aid of several numerical examples. Moreover, the holding cost is assumed to be a continuous, non-negative and non-decreasing function of time in order to extend the EOQ model. Finally, the sensitivity of the optimal solution to changes in the values of different system parameters is also studied.
Hammou Elotmany; M'Hamed Eddahbi
2015-01-01
Hammou El-otmany, M'hamed Eddahbi, Faculté des Sciences et Techniques, Marrakech, Maroc, Laboratoire de méthodes stochastiques appliquées à la finance et actuariat (LaMsaFA). Abstract. In the present paper we propose a new stochastic diffusion process with drift proportional to the Weibull density function, defined as $X_\epsilon = x$, $dX_t = \left(\frac{\gamma}{t}\left(1 - t^{\gamma+1}\right) - t^{\gamma}\right) X_t\,dt + \sigma X_t\,dB_t$, $t > 0$, with parameters $\gamma > 0$ and $\sigma$...
The effect of ignoring individual heterogeneity in Weibull log-normal sire frailty models
DEFF Research Database (Denmark)
Damgaard, Lars Holm; Korsgaard, Inge Riis; Simonsen, J;
2006-01-01
The objective of this study was, by means of simulation, to quantify the effect of ignoring individual heterogeneity in Weibull sire frailty models on parameter estimates and to address the consequences for genetic inferences. Three simulation studies were evaluated, which included 3 levels...... the software Survival Kit for the incomplete sire model. For the incomplete sire model, the Monte Carlo and Survival Kit parameter estimates were similar. This study established that when unobserved individual heterogeneity was ignored, the parameter estimates that included sire effects were biased toward zero...
EFFECT OF NANOPOWDER ADDITION ON THE FLEXURAL STRENGTH OF ALUMINA CERAMIC - A WEIBULL MODEL ANALYSIS
Directory of Open Access Journals (Sweden)
Daidong Guo
2016-05-01
Full Text Available Alumina ceramics were prepared either with micrometer-sized alumina powder (MAP or with the addition of nanometer-sized alumina powder (NAP. The density, crystalline phase, flexural strength and the fracture surface of the two ceramics were measured and compared. Emphasis has been put on the influence of nanopowder addition on the flexural strength of Al₂O₃ ceramic. The analysis based on the Weibull distribution model suggests the distribution of the flexural strength of the NAP ceramic is more concentrated than that of the MAP ceramic. Therefore, the NAP ceramics will be more stable and reliable in real applications.
Directory of Open Access Journals (Sweden)
Aliashim Albani
2014-02-01
Full Text Available The demand for electricity in Malaysia is growing in tandem with its Gross Domestic Product (GDP) growth. Malaysia will need even more energy as it strives to grow towards a high-income economy, and it has taken steps towards exploring renewable energy (RE), including wind energy, as an alternative source for generating electricity. In the present study, the wind energy potential of the site is statistically analyzed based on one year of measured hourly time-series wind speed data. Wind data were obtained from the Malaysian Meteorological Department (MMD) weather stations at nine selected sites in Malaysia. The Weibull and Rayleigh distribution functions were determined and generated using MATLAB, and both models were fitted and compared to the field data probability distributions for the year 2011. The analysis showed that the Weibull distribution fits the field data better than the Rayleigh distribution for the whole of 2011. The wind power density of every site was studied based on the Weibull and Rayleigh functions; the Weibull distribution shows a good approximation for the estimation of wind power density in Malaysia.
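A minimal sketch of the Weibull-versus-Rayleigh comparison described above, on synthetic wind speeds rather than the MMD station data: both models are fitted by maximum likelihood and compared by log-likelihood. Since the Rayleigh is a Weibull with shape fixed at 2, the Weibull fit can never score worse.

```python
import math
import random

random.seed(7)

# Synthetic hourly wind speeds from a Weibull with shape k=2.2, scale c=7 m/s;
# illustrative values only, not the Malaysian station data.
k_true, c_true = 2.2, 7.0
n = 5000
v = [c_true * (-math.log(1.0 - random.random())) ** (1.0 / k_true)
     for _ in range(n)]

logs = [math.log(x) for x in v]
mean_log = sum(logs) / n

def score(k):
    # Profile-likelihood score equation for the Weibull shape parameter.
    swk = sum(x ** k for x in v)
    swkl = sum(x ** k * lx for x, lx in zip(v, logs))
    return swkl / swk - 1.0 / k - mean_log

# Bisection on the monotone score function, then scale in closed form.
lo, hi = 0.5, 10.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if score(mid) > 0.0:
        hi = mid
    else:
        lo = mid
k_hat = 0.5 * (lo + hi)
c_hat = (sum(x ** k_hat for x in v) / n) ** (1.0 / k_hat)

# Rayleigh MLE (a Weibull constrained to shape 2): sigma^2 = sum(v^2)/(2n).
sigma2 = sum(x * x for x in v) / (2.0 * n)

ll_weibull = (n * math.log(k_hat) - n * k_hat * math.log(c_hat)
              + (k_hat - 1.0) * sum(logs)
              - sum((x / c_hat) ** k_hat for x in v))
ll_rayleigh = sum(logs) - n * math.log(sigma2) - sum(x * x for x in v) / (2.0 * sigma2)
print(k_hat, c_hat, ll_weibull - ll_rayleigh)
```

The log-likelihood gap quantifies how much the extra shape parameter buys, which is the essence of the goodness-of-fit comparison in the study.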
Prestack migration velocity analysis based on simplified two-parameter moveout equation
Institute of Scientific and Technical Information of China (English)
Chen Hai-Feng; Li Xiang-Yang; Qian Zhong-Ping; Song Jian-Jun; Zhao Gui-Ling
2016-01-01
Stacking velocity VC2, vertical velocity ratio γ0, effective velocity ratio γef, and anisotropic parameter χef are correlated in the PS-converted-wave (PS-wave) anisotropic prestack Kirchhoff time migration (PKTM) velocity model and are thus difficult to determine independently. We extended the simplified two-parameter (stacking velocity VC2 and anisotropic parameter χef) moveout equation from stacking velocity analysis to PKTM velocity model updating, and formed a new four-parameter (stacking velocity VC2, vertical velocity ratio γ0, effective velocity ratio γef, and anisotropic parameter χef) PS-wave anisotropic PKTM velocity model updating and processing flow based on the simplified two-parameter moveout equation. In the proposed method, first, the PS-wave two-parameter stacking velocity is analyzed to obtain the anisotropic PKTM initial velocity and anisotropic parameters; then, the velocity and anisotropic parameters are corrected by analyzing the residual moveout on common imaging point gathers after prestack time migration. The vertical velocity ratio γ0 of the prestack time migration velocity model is obtained with an appropriate method utilizing the P- and PS-wave stacked sections after level calibration. The initial effective velocity ratio γef is calculated using the Thomsen (1999) equation in combination with the P-wave velocity analysis; ultimately, the final velocity model of the effective velocity ratio γef is obtained by percentage scanning migration. This method simplifies the PS-wave parameter estimation in high-quality imaging, reduces the uncertainty of multiparameter estimations, and obtains good imaging results in practice.
A two parameter ratio-product-ratio estimator using auxiliary information
Chami, Peter S; Thomas, Doneal
2012-01-01
We propose a two parameter ratio-product-ratio estimator for a finite population mean in a simple random sample without replacement following the methodology in Ray and Sahai (1980), Sahai and Ray (1980), Sahai and Sahai (1985) and Singh and Ruiz Espejo (2003). The bias and mean square error of our proposed estimator are obtained to the first degree of approximation. We derive conditions for the parameters under which the proposed estimator has smaller mean square error than the sample mean, ratio and product estimators. We carry out an application showing that the proposed estimator outperforms the traditional estimators using groundwater data taken from a geological site in the state of Florida.
Two-parameters quasi-filled function algorithm for nonlinear integer programming
Institute of Scientific and Technical Information of China (English)
WANG Wei-xiang; SHANG You-lin; ZHANG Lian-sheng
2006-01-01
A quasi-filled function for the nonlinear integer programming problem is given in this paper. This function contains two parameters which are easy to choose. Theoretical properties of the proposed quasi-filled function are investigated. Moreover, we also propose a new solution algorithm using this quasi-filled function to solve nonlinear integer programming problems. Examples with 2 to 6 variables are tested, and the computational results indicate the efficiency and reliability of the proposed quasi-filled function algorithm.
Two-Parameter Radon Transformation of the Wigner Operator and Its Inverse
Institute of Scientific and Technical Information of China (English)
范洪义; 程海凌
2001-01-01
Using the technique of integration within an ordered product of operators, we reveal that a new quantum mechanical representation |x, μ, ν⟩ exists, the eigenvector of the operator μQ + νP (a linear combination of coordinate Q and momentum P) with eigenvalue x, which is inherent to the two-parameter (μ, ν) Radon transformation of the Wigner operator. It turns out that the projection operator |x, μ, ν⟩...
A Two-parameter bicovariant differential calculus on the (1 + 2)-dimensional q-superspace
Yasar, Ergün
2016-01-01
We construct a two-parameter bicovariant differential calculus on ℛq1/2 from the covariance point of view, using the Hopf algebra structure of ℛq1/2. To achieve this, we first use the consistency of the calculus and the R-matrix approach, which satisfies both the ungraded and graded Yang-Baxter equations. In particular, based on this differential calculus, we investigate the Cartan-Maurer forms for this q-superspace. Finally, we obtain the quantum Lie superalgebra corresponding to the Cartan-Maurer forms.
Huang, Zhi-Hui; Tang, Ying-Chun; Dai, Kai
2016-05-01
Semiconductor materials and product qualification rates are directly related to manufacturing costs and the survival of the enterprise. A dynamic reliability growth analysis method is applied to study manufacturing execution system reliability growth and thereby improve product quality. Referring to the classical Duane model assumptions and the TGP tracking-growth-forecast programming model, a Weibull distribution model is established from the failure data. Combining the median rank with the average rank method, Weibull information-fusion reliability growth curves are fitted through linear regression and least squares estimation. The assumed model overcomes a weakness of the Duane model, namely that the accuracy of its MTBF point estimate is not high; analysis of the failure data shows that the method is essentially consistent with the test and evaluation modeling process of the example. The median rank is a statistical method for determining the distribution function of a random variable, and is well suited to problems of complex systems with limited sample sizes. The method therefore has great engineering application value.
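The median-rank plus least-squares fitting step mentioned in this abstract is a standard Weibull probability-plot regression. The sketch below uses Benard's median-rank approximation and simulated failure times (not the authors' data):

```python
import math
import random

random.seed(1)

# Simulated failure times from a Weibull with shape 1.8 and scale 100 h;
# illustrative numbers, not data from the paper.
beta_true, eta_true = 1.8, 100.0
n = 60
t = sorted(eta_true * (-math.log(1.0 - random.random())) ** (1.0 / beta_true)
           for _ in range(n))

# Benard's median-rank approximation for the i-th ordered failure.
F = [(i - 0.3) / (n + 0.4) for i in range(1, n + 1)]

# Weibull plot linearization: ln(-ln(1-F)) = beta*ln(t) - beta*ln(eta),
# so an ordinary least-squares line recovers both parameters.
x = [math.log(ti) for ti in t]
y = [math.log(-math.log(1.0 - Fi)) for Fi in F]
xbar, ybar = sum(x) / n, sum(y) / n
beta_hat = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
            / sum((xi - xbar) ** 2 for xi in x))
eta_hat = math.exp(xbar - ybar / beta_hat)
print(beta_hat, eta_hat)
```

Median-rank regression is popular in reliability growth work precisely because it behaves sensibly for the small samples the abstract emphasizes.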
Performance Improvement in Spatially Multiplexed MIMO Systems over Weibull-Gamma Fading Channel
Tiwari, Keerti; Saini, Davinder S.; Bhooshan, Sunil V.
2016-11-01
In multiple-input multiple-output (MIMO) systems, spatial demultiplexing at the receiver has its own significance, and several detection techniques have been investigated. Most detection techniques involve a tradeoff between computational complexity and optimal performance. One technique that gives improved performance at an acceptable level of complexity is ordered successive interference cancellation (OSIC) with minimum mean square error (MMSE). Optimal performance can be achieved by maximum likelihood (ML) detection, but at a higher complexity level. Therefore, MMSE-OSIC with candidates (OSIC2) detection is recommended as a solution. In this paper, spatially multiplexed (SM) MIMO systems are considered to evaluate error performance with different detection techniques, namely MMSE-OSIC, ML and MMSE-OSIC2, in a composite fading environment, i.e., Weibull-gamma (WG) fading. In the WG distribution, the Weibull and gamma components represent multipath and shadowing effects, respectively. Simulation results illustrate that the MMSE-OSIC2 detection technique gives improved symbol error rate (SER) performance, similar to ML, while its complexity approaches that of MMSE-OSIC.
Transferability of Charpy Absorbed Energy to Fracture Toughness Based on Weibull Stress Criterion
Institute of Scientific and Technical Information of China (English)
Hongyang JING; Lianyong XU; Lixing HUO; Fumiyoshi Minami
2005-01-01
The relationship between Charpy absorbed energy and fracture toughness by the crack tip opening displacement (CTOD) method was analyzed based on the Weibull stress criterion. The Charpy absorbed energy and the fracture toughness were measured for SN490B steel in the ductile-brittle transition temperature region. For the instrumented Charpy impact test, curves of load and loading-point displacement against time were recorded. The critical Weibull stress was taken as the fracture-controlling parameter; based on the local approach, it is not affected by the specimen configuration or the loading pattern. The parameters controlling brittle fracture are obtained from the Charpy absorbed energy results, and the fracture toughness for the compact tension (CT) specimen is then predicted. The predicted results are found to be in good agreement with the experimental ones. The fracture toughness can thus be evaluated from the Charpy absorbed energy, because the local approach gives a good description of brittle fracture whether the Charpy impact specimen or the CT specimen is used for a given material.
Institute of Scientific and Technical Information of China (English)
潘青松; 彭刚; 胡伟华; 徐鑫
2015-01-01
In order to understand the mechanical properties of concrete under different loading rates, uniaxial compression tests were conducted on cubic concrete specimens (strength grade C15, side length 150 mm) at loading rates of 10⁻⁵/s, 10⁻⁴/s, 10⁻³/s and 5×10⁻³/s, using a microcomputer-controlled electro-hydraulic servo static and dynamic multifunctional triaxial apparatus. The compressive strength, the deformation, and the parameters of the complete stress-strain curve model based on the modified Weibull statistical theory were studied and analyzed. The results show that the modified Weibull model fits the complete stress-strain curves of the concrete specimens well under different loading rates; the strength hardening properties of the material can be characterized by the values of the parameters m and E in the Weibull constitutive model, and the strain softening behavior by the parameter c.
Hirose, H
1997-01-01
This paper proposes a new treatment of electrical insulation degradation. Some types of insulation that have been used under various circumstances are considered to degrade at various rates in accordance with their stress circumstances. The cross-linked polyethylene (XLPE) insulated cables inspected by major Japanese electric companies clearly indicate such phenomena. By assuming that the inspected specimen is sampled from one of the clustered groups, a mixed degradation model can be constructed. Since the degradation of insulation under common circumstances is considered to follow a Weibull distribution, a mixture model and a Weibull power law can be combined; this is called the mixture Weibull power law model. By applying maximum likelihood estimation of the newly proposed model to Japanese 22 and 33 kV insulation-class cables, they are clustered into a certain number of groups using the AIC and the generalized likelihood ratio test method. The reliability of the cables at specified years is assessed.
Representations of coherent and squeezed states in an extended two-parameter Fock space
Institute of Scientific and Technical Information of China (English)
M. K. Tavassoly; M. H. Lake
2012-01-01
Recently an f-deformed Fock space spanned by |n⟩_λ was introduced. These bases are the eigenstates of a deformed non-Hermitian Hamiltonian. In this contribution, we use rather new nonorthogonal basis vectors for the construction of coherent and squeezed states, which in a special case lead to the earlier known states. For this purpose, we first generalize the previously introduced Fock space spanned by the |n⟩_λ bases to a new one, spanned by the extended two-parameter bases |n⟩_{λ1,λ2}. These bases are now the eigenstates of a non-Hermitian Hamiltonian H_{λ1,λ2} = a†_{λ1,λ2} a + 1/2, where a†_{λ1,λ2} = a† + λ1 a + λ2 and a are, respectively, the deformed creation and ordinary bosonic annihilation operators. The bases |n⟩_{λ1,λ2} are nonorthogonal (squeezed states), but normalizable. Then, we deduce the new representations of coherent and squeezed states in our two-parameter Fock space. Finally, we discuss the quantum statistical properties, as well as the non-classical properties, of the obtained states numerically.
Behera, Nirbhay K; Naik, Bharati; Nandi, Basanta K; Pani, Tanmay
2016-01-01
The charged particle multiplicity distribution and the transverse energy distribution measured in heavy-ion collisions at top RHIC and LHC energies are described using a two-component model approach based on the convolution of the Monte Carlo Glauber model with the Weibull model for particle production. The model successfully describes the multiplicity and transverse energy distributions of minimum bias collision data for a wide range of energies. We also propose that the Weibull-Glauber model can be used to determine centrality classes in heavy-ion collisions, as an alternative to the conventional negative binomial distribution for particle production.
Directory of Open Access Journals (Sweden)
Janine Treter
2010-01-01
Full Text Available Saponins are natural soaplike foam-forming compounds widely used in foods, cosmetic and pharmaceutical preparations. In this work, the foamability and foam lifetime of foams obtained from Ilex paraguariensis unripe fruits were analyzed. Polysorbate 80 and sodium dodecyl sulfate were used as reference surfactants. Aiming at a better understanding of the data, a linearized four-parameter Weibull function is proposed. The mate hydroethanolic extract (ME) and a mate saponin enriched fraction (MSF) afforded foamability and foam lifetime comparable to the synthetic surfactants. The linearization of the Weibull equation allowed the statistical comparison of foam decay curves, improving on former mathematical approaches.
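The linearization mentioned above can be illustrated with a two-parameter Weibull decay core (the paper's four-parameter form adds terms omitted here, so the parametrization below is an assumption): ln(−ln(V/V0)) is linear in ln t, so ordinary least squares recovers the shape and timescale of the foam decay.

```python
import math
import random

random.seed(3)

# Hypothetical foam-volume decay V(t) = V0*exp(-(t/tau)**b) with mild noise;
# V0, tau, b and the time grid are invented for illustration.
V0, tau, b = 100.0, 40.0, 1.5
ts = [5.0 * i for i in range(1, 19)]                       # minutes
Vs = [V0 * math.exp(-(ti / tau) ** b) * random.uniform(0.99, 1.01)
      for ti in ts]

# Linearization: ln(-ln(V/V0)) = b*ln(t) - b*ln(tau); fit by least squares.
x = [math.log(ti) for ti in ts]
y = [math.log(-math.log(Vi / V0)) for Vi in Vs]
m = len(x)
xbar, ybar = sum(x) / m, sum(y) / m
slope = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
         / sum((xi - xbar) ** 2 for xi in x))
intercept = ybar - slope * xbar
b_hat, tau_hat = slope, math.exp(-intercept / slope)
print(b_hat, tau_hat)
```

Because the fit is linear, standard regression statistics apply directly, which is what makes the linearized form convenient for comparing decay curves between surfactants.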
This study investigated the compositional characteristics and shelf-life of Njangsa seed oil (NSO). Oil from Njangsa had a high polyunsaturated fatty acid (PUFA) content, of which alpha-eleostearic acid (alpha-ESA), an unusual conjugated linolenic acid, was the most prevalent (about 52%). Linoleic acid...
Chaos and stability in a two-parameter family of convex billiard tables
Bálint, Péter; Hernández-Tahuilán, Jorge; Sanders, David P
2010-01-01
We study, by numerical simulations and semi-rigorous arguments, a two-parameter family of convex, two-dimensional billiard tables, generalizing the one-parameter class of oval billiards of Benettin--Strelcyn [Phys. Rev. A 17, 773 (1978)]. We observe interesting dynamical phenomena when the billiard tables are continuously deformed from the integrable circular billiard to different versions of completely-chaotic stadia. In particular, we conjecture that a new class of ergodic billiard tables is obtained in certain regions of the two-dimensional parameter space, when the billiards are close to skewed stadia. We provide heuristic arguments supporting this conjecture, and give numerical confirmation using the powerful method of Lyapunov-weighted dynamics.
Thermodynamic geometry, condensation and Debye model of two-parameter deformed statistics
Mohammadzadeh, Hosein; Azizian-Kalandaragh, Yashar; Cheraghpour, Narges; Adli, Fereshteh
2017-08-01
We consider the statistical distribution function of a two-parameter deformed system, namely qp-deformed bosons and fermions. Using a thermodynamic geometry approach, we derive the thermodynamic curvature of an ideal gas of particles obeying qp-boson and qp-fermion statistics. We show that the intrinsic statistical interaction of qp-bosons is attractive in all physical ranges, while it is repulsive for qp-fermions. Also, the thermodynamic curvature of a qp-boson gas is singular at a specified value of the fugacity, and therefore a phase transition such as Bose-Einstein condensation can take place. Finally, we compare the experimental and theoretical results for the temperature-dependent specific heat capacity of some metallic materials in the framework of q- and qp-deformed algebras.
A two-parameter nondiffusive heat conduction model for data analysis in pump-probe experiments
Ma, Yanbao
2014-12-01
Nondiffusive heat transfer has attracted intensive research interest over the last 50 years because of its importance in fundamental physics and engineering applications. It has unique features that cannot be described by the Fourier law. However, current studies of nondiffusive heat transfer still focus on the effective thermal conductivity within the framework of the Fourier law, for lack of a well-accepted replacement. Here, we show that nondiffusive heat conduction can be characterized by two inherent material properties: a diffusive thermal conductivity and a ballistic transport length. We also present a two-parameter heat conduction model and demonstrate its validity in different pump-probe experiments. This model not only offers new insights into nondiffusive heat conduction but also opens up new avenues for the study of nondiffusive heat transfer outside the framework of the Fourier law.
Efficient, uninformative sampling of limb darkening coefficients for two-parameter laws
Kipping, David M
2013-01-01
Stellar limb darkening affects a wide range of astronomical measurements and is frequently modeled with a parametric model using polynomials in the cosine of the angle between the line of sight and the emergent intensity. Two-parameter laws are particularly popular for cases where one wishes to fit freely for the limb darkening coefficients (i.e. with an uninformative prior), due to the compact prior volume and the fact that more complex models rarely obtain unique solutions with present data. In such cases, we show that the two limb darkening coefficients are constrained by three physical boundary conditions, describing a triangular region in the two-dimensional parameter space. We show that uniformly distributed samples may be drawn from this region with optimal efficiency by a technique developed in computer graphics programming: triangular sampling. Alternatively, one can make draws using a uniform, bivariate Dirichlet distribution. We provide simple expressions for these parametrizations for both techniques ap...
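The triangular-sampling idea in this abstract can be sketched directly: fold a unit square onto a triangle and map it barycentrically onto the allowed region. The vertices below follow from the three constraints u1 ≥ 0, u1 + u2 ≤ 1, u1 + 2u2 ≥ 0 commonly quoted for the quadratic law; treat them as an illustrative reading rather than the paper's exact prescription.

```python
import random

random.seed(11)

# Triangle vertices obtained by intersecting the three assumed constraints
# u1 >= 0, u1 + u2 <= 1, u1 + 2*u2 >= 0 pairwise.
A, B, C = (0.0, 0.0), (0.0, 1.0), (2.0, -1.0)

def sample_triangle():
    # Fold the unit square onto the unit triangle, then map the point
    # barycentrically onto (A, B, C); this preserves uniformity.
    p, q = random.random(), random.random()
    if p + q > 1.0:
        p, q = 1.0 - p, 1.0 - q
    u1 = A[0] + p * (B[0] - A[0]) + q * (C[0] - A[0])
    u2 = A[1] + p * (B[1] - A[1]) + q * (C[1] - A[1])
    return u1, u2

pts = [sample_triangle() for _ in range(20000)]
mean_u1 = sum(p[0] for p in pts) / len(pts)
mean_u2 = sum(p[1] for p in pts) / len(pts)
print(mean_u1, mean_u2)
```

Every accepted draw satisfies the physical constraints by construction, so no rejection step is needed, which is the efficiency gain the abstract highlights.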
Quantum transport and two-parameter scaling at the surface of a weak topological insulator.
Mong, Roger S K; Bardarson, Jens H; Moore, Joel E
2012-02-17
Weak topological insulators have an even number of Dirac cones in their surface spectrum and are thought to be unstable to disorder, which leads to an insulating surface. Here we argue that the presence of disorder alone will not localize the surface states; rather, the presence of a time-reversal symmetric mass term is required for localization. Through numerical simulations, we show that in the absence of the mass term the surface always flows to a stable metallic phase and the conductivity obeys a one-parameter scaling relation, just as in the case of a strong topological insulator surface. With the inclusion of the mass, the transport properties of the surface of a weak topological insulator follow a two-parameter scaling form.
On Calculating the Hougaard Measure of Skewness in a Nonlinear Regression Model with Two Parameters
Directory of Open Access Journals (Sweden)
S. A. EL-Shehawy
2009-01-01
Full Text Available Problem statement: This study presented an alternative computational algorithm for determining the values of the Hougaard measure of skewness as a nonlinearity measure in a nonlinear regression (NLR) model with two parameters. Approach: These values indicate the degree of nonlinear behavior in the estimator of the parameter in an NLR-model. Results: We applied the suggested algorithm to an example of an NLR-model containing a conditionally linear parameter. The algorithm is mainly based on several earlier studies of measures of nonlinearity, and it is suited for implementation using computer algebra systems such as MAPLE, MATLAB and MATHEMATICA. Conclusion/Recommendations: The results, with the corresponding output for the same considered example, are compared with the results of some earlier studies.
Directory of Open Access Journals (Sweden)
Abeer Abd-Alla EL-Helbawy
2016-09-01
Full Text Available Accelerated life tests provide quick information on lifetime distributions by testing materials or products at higher-than-basic conditional levels of stress, such as pressure, high temperature, vibration, voltage or load, to induce failures. In this paper, the acceleration model assumed is the log-linear model. Constant stress tests are discussed based on Type I and Type II censoring. The Kumaraswamy Weibull distribution is used. The estimators of the parameters, reliability, hazard rate functions and p-th percentile at normal condition, low stress, and high stress are obtained. In addition, credible intervals for the parameters of the models are constructed. Optimum test plans are designed. Some numerical techniques, such as Laplace and Markov Chain Monte Carlo methods, are used to solve the complicated integrals.
An Approach to Determine the Weibull Parameters for Wind Energy Analysis: The Case of Galicia (Spain)
Directory of Open Access Journals (Sweden)
Camilo Carrillo
2014-04-01
Full Text Available The Weibull probability density function (PDF) has mostly been used to fit wind speed distributions for wind energy applications. The goodness of fit of the results depends on the estimation method used and on the wind regime of the analyzed area. In this paper, a study of a particular area (Galicia) was performed to test the performance of several fitting methods. The goodness of fit was evaluated by well-known indicators that use the wind speed or the available wind power density. However, energy production is a critical parameter in wind energy applications. Hence, a fitting method that accounts for the power density distribution is proposed. To highlight the usefulness of this method, indicators that use energy production values are also presented.
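One fitting method that accounts for the power density, in the spirit of the abstract, matches the sample mean speed and mean cubed speed (the energy-pattern-factor idea); the exact method proposed in the paper may differ, so treat this as an illustrative sketch:

```python
import math
import random

random.seed(5)

# Synthetic one-year hourly wind record (not the Galicia data):
# Weibull shape k=2.0, scale c=8 m/s.
k_true, c_true = 2.0, 8.0
v = [c_true * (-math.log(1.0 - random.random())) ** (1.0 / k_true)
     for _ in range(8760)]

m1 = sum(v) / len(v)                   # mean wind speed
m3 = sum(x ** 3 for x in v) / len(v)   # mean cubed speed, drives power density
P = 0.5 * 1.225 * m3                   # wind power density (W/m^2), rho = 1.225

# E[v^3]/E[v]^3 = Gamma(1+3/k)/Gamma(1+1/k)**3 depends on k only;
# invert this monotone relation by bisection, then recover c.
target = m3 / m1 ** 3

def ratio(k):
    return math.gamma(1.0 + 3.0 / k) / math.gamma(1.0 + 1.0 / k) ** 3

lo, hi = 0.5, 10.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if ratio(mid) > target:   # ratio decreases in k, so k must be larger
        lo = mid
    else:
        hi = mid
k_hat = 0.5 * (lo + hi)
c_hat = m1 / math.gamma(1.0 + 1.0 / k_hat)
print(k_hat, c_hat, P)
```

Because the third moment enters the fit directly, the estimated PDF reproduces the available power density by construction, which is the property the proposed method targets.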
PERFORMANCE EVALUATION OF OSSO-CFAR WITH BINARY INTEGRATION IN WEIBULL BACKGROUND
Institute of Scientific and Technical Information of China (English)
Meng Xiangwei
2013-01-01
The performance of the Ordered-Statistic Smallest Of (OSSO) Constant False Alarm Rate (CFAR) detector with binary integration in a Weibull background with known shape parameter is analyzed, for cases where the processor operates in a homogeneous background and in non-homogeneous situations caused by multiple targets and clutter edges. Analytical models of this scheme for performance evaluation are given. It is shown that OSSO-CFAR with binary integration can greatly improve detection performance with respect to single-pulse processing. As the clutter background becomes spiky, a high binary-integration threshold S (S out of M) is required in order to obtain good detection performance in a homogeneous background. Moreover, the false alarm performance of OSSO-CFAR with binary integration is more sensitive to changes in the shape parameter or power level of the clutter background.
The Transmuted Geometric-Weibull distribution: Properties, Characterizations and Regression Models
Directory of Open Access Journals (Sweden)
Zohdy M Nofal
2017-06-01
Full Text Available We propose a new lifetime model called the transmuted geometric-Weibull distribution. Some of its structural properties, including ordinary and incomplete moments, quantile and generating functions, probability weighted moments, Rényi and q-entropies and order statistics, are derived. The maximum likelihood method is discussed for estimating the model parameters, by means of a Monte Carlo simulation study. A new location-scale regression model is introduced based on the proposed distribution. The new distribution is applied to two real data sets to illustrate its flexibility. Empirical results indicate that the proposed distribution can be an alternative to other lifetime models available in the literature for modeling real data in many areas.
Directory of Open Access Journals (Sweden)
Amany E. Aly
2016-04-01
Full Text Available In a system consisting of independent components of the same type, appropriate actions may be taken as soon as a portion of the components have failed. It is, therefore, important to be able to predict later failure times from earlier ones. One of the well-known failure distributions commonly used to model component life is the modified Weibull distribution (MWD). In this paper, two pivotal quantities are proposed to construct prediction intervals for future unobservable lifetimes based on generalized order statistics (gos) from the MWD. Moreover, a pivotal quantity is developed to reconstruct missing observations at the beginning of the experiment. Furthermore, Monte Carlo simulation studies are conducted and numerical computations are carried out to investigate the efficiency of the presented results. Finally, two illustrative examples for real data sets are analyzed.
On the Performance Analysis of Digital Communications over Weibull-Gamma Channels
Ansari, Imran Shafique
2015-05-01
In this work, the performance analysis of digital communications over a composite Weibull-Gamma (WG) multipath-fading and shadowing channel is presented, wherein the WG distribution is appropriate for modeling fading environments when multipath is superimposed on shadowing. More specifically, exact closed-form expressions are derived for the probability density function, the cumulative distribution function, the moment generating function, and the moments of a composite WG channel. Capitalizing on these results, new exact closed-form expressions are offered for the outage probability, the higher-order amount of fading, the average error rate for binary and M-ary modulation schemes, and the ergodic capacity under various types of transmission policies, mostly in terms of Meijer's G-functions. These new analytical results are verified via computer-based Monte-Carlo simulation results. © 2015 IEEE.
Modeling the reliability and maintenance costs of wind turbines using Weibull analysis
Energy Technology Data Exchange (ETDEWEB)
Vachon, W.A. [W.A. Vachon & Associates, Inc., Manchester, MA (United States)
1996-12-31
A general description is provided of the basic mathematics and use of Weibull statistical models for modeling component failures and maintenance costs as a function of time. The applicability of the model to wind turbine components and subsystems is discussed with illustrative examples of typical component reliabilities drawn from actual field experiences. Example results indicate the dominant role of key subsystems based on a combination of their failure frequency and repair/replacement costs. The value of the model is discussed as a means of defining (1) maintenance practices, (2) areas in which to focus product improvements, (3) spare parts inventory, and (4) long-term trends in maintenance costs as an important element in project cash flow projections used by developers, investors, and lenders. 6 refs., 8 figs., 3 tabs.
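A minimal sketch of the kind of Weibull bookkeeping the abstract describes, with hypothetical parameters (the shape β, characteristic life η, fleet size, and repair cost below are invented for illustration):

```python
import math

# Hypothetical Weibull parameters for one wind-turbine subsystem:
# shape beta > 1 implies wear-out; eta is the characteristic life in years.
beta, eta = 2.0, 12.0
N_fleet = 50                  # turbines in the project (assumed)
repair_cost = 40_000.0        # currency units per failure (assumed)

def F(t):
    """Probability that a component has failed by time t."""
    return 1.0 - math.exp(-((t / eta) ** beta))

# B10 life: the time by which 10% of units are expected to have failed.
B10 = eta * (-math.log(0.9)) ** (1.0 / beta)

# Expected failures and cost over the first 5 years, ignoring renewals;
# a renewal-process treatment would refine this first-order estimate.
t = 5.0
expected_failures = N_fleet * F(t)
expected_cost = expected_failures * repair_cost
print(B10, expected_failures, expected_cost)
```

Feeding such per-subsystem failure fractions and repair costs into a cash-flow projection is exactly the use of the model the abstract describes for developers, investors, and lenders.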
A study of optimization problem for amplify-and-forward relaying over Weibull fading channels
Ikki, Salama Said
2010-09-01
This paper addresses the power allocation and relay positioning problems in amplify-and-forward cooperative networks operating in Weibull fading environments. We study adaptive power allocation (PA) with fixed relay location, optimal relay location with fixed power allocation, and joint optimization of the PA and relay location under a total transmit power constraint, in order to minimize the outage probability and average error probability at high signal-to-noise ratios (SNR). Analytical results are validated by numerical simulations, and comparisons between the different optimization schemes and their performance are provided. Results show that optimum PA brings only a coding gain, while optimum relay location yields diversity gains in addition to a coding gain. Joint optimization improves both the diversity and coding gains. Furthermore, results illustrate that the analyzed adaptive algorithms outperform uniform schemes. ©2010 IEEE.
Test Population Selection from Weibull-Based, Monte Carlo Simulations of Fatigue Life
Vlcek, Brian L.; Zaretsky, Erwin V.; Hendricks, Robert C.
2012-01-01
Fatigue life is probabilistic and not deterministic. Experimentally establishing the fatigue life of materials, components, and systems is both time consuming and costly. As a result, conclusions regarding fatigue life are often inferred from a statistically insufficient number of physical tests. A proposed methodology for comparing life results as a function of variability due to Weibull parameters, variability between successive trials, and variability due to the size of the experimental population is presented. Using Monte Carlo simulation of randomly selected lives from a large Weibull distribution, the variation in the L10 fatigue life of aluminum alloy AL6061 rotating rod fatigue tests was determined as a function of population size. These results were compared to the L10 fatigue lives of small (10 each) populations of AL2024, AL7075 and AL6061. For aluminum alloy AL6061, a simple algebraic relationship was established for the upper and lower L10 fatigue life limits as a function of the number of specimens failed. For most engineering applications, where less than 30 percent variability can be tolerated in the maximum and minimum values, at least 30 to 35 test samples are necessary. The variability of test results based on small sample sizes can be greater than the actual differences, if any, that exist between materials and can result in erroneous conclusions. The fatigue life of AL2024 is statistically longer than that of AL6061 and AL7075. However, there is no statistical difference between the fatigue lives of AL6061 and AL7075, even though AL7075 had a fatigue life 30 percent greater than AL6061.
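The sampling approach described above is easy to reproduce. Below is a minimal, illustrative sketch (the Weibull parameters are assumptions, not the paper's AL6061 fit): it draws many small test populations from a known Weibull parent and shows how widely the empirical L10 life scatters around the analytic value.

```python
import math
import random

def weibull_b10(shape, scale):
    """Analytic L10 (B10) life: the time by which 10% of the population fails."""
    return scale * (-math.log(0.9)) ** (1.0 / shape)

def empirical_l10(sample):
    """Empirical 10th-percentile life from a finite test population."""
    s = sorted(sample)
    return s[max(0, math.ceil(0.10 * len(s)) - 1)]

# Illustrative Weibull parameters (assumed, not the paper's values).
shape, scale = 2.0, 1.0
rng = random.Random(7)

# Spread of L10 estimates across many small (10-specimen) test populations.
l10s = [empirical_l10([rng.weibullvariate(scale, shape) for _ in range(10)])
        for _ in range(2000)]
print("L10 spread over 2000 ten-specimen trials:", min(l10s), "to", max(l10s))
print("analytic L10 of the parent population:", weibull_b10(shape, scale))
```

With only 10 specimens per trial, the empirical L10 estimates straddle the analytic value over a wide range, which is exactly the small-sample hazard the study warns about.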
Directory of Open Access Journals (Sweden)
Quintana Alicia Esther
2015-01-01
Manufacturing with optimal quality standards is underpinned by the high reliability of equipment and systems, among other essential pillars. Maintenance Engineering is responsible for the planning, control and continuous improvement of critical equipment under any approach, such as Six Sigma, and draws on numerous statistical tools, notably statistical process control charts. While their first applications were in production, other chart designs have emerged to meet new needs, such as monitoring equipment and systems in the manufacturing environment. The time between failures usually fits an exponential or Weibull model. The t chart and the adjusted t chart, with probabilistic control limits, are suitable alternatives for monitoring the mean time between failures. Unfortunately, few published applications of these charts to Weibull models, which are very useful in contexts such as maintenance, can be found. In addition, the literature limits the study of their performance to the standard metric, average run length, giving only a partial view. The aim of this paper is to explore the performance of the t chart and the adjusted t chart using three metrics, two of them unconventional. To do this, it incorporates the concept of lateral variability, in its left and right forms. A more precise description of the behavior of these charts makes it possible to understand the conditions under which each is suitable: if the main objective of monitoring is to detect deterioration, the adjusted t chart is recommended; when the priority is to detect improvements, the unadjusted t chart is the better choice. However, the response speed of both charts varies considerably from run to run.
The effect of ignoring individual heterogeneity in Weibull log-normal sire frailty models.
Damgaard, L H; Korsgaard, I R; Simonsen, J; Dalsgaard, O; Andersen, A H
2006-06-01
The objective of this study was, by means of simulation, to quantify the effect of ignoring individual heterogeneity in Weibull sire frailty models on parameter estimates and to address the consequences for genetic inferences. Three simulation studies were evaluated, which included 3 levels of individual heterogeneity combined with 4 levels of censoring (0, 25, 50, or 75%). Data were simulated according to balanced half-sib designs using Weibull log-normal animal frailty models with a normally distributed residual effect on the log-frailty scale. The 12 data sets were analyzed with 2 models: the sire model, equivalent to the animal model used to generate the data (complete sire model), and a corresponding model in which individual heterogeneity in log-frailty was neglected (incomplete sire model). Parameter estimates were obtained from a Bayesian analysis using Gibbs sampling, and also from the software Survival Kit for the incomplete sire model. For the incomplete sire model, the Monte Carlo and Survival Kit parameter estimates were similar. This study established that when unobserved individual heterogeneity was ignored, the parameter estimates that included sire effects were biased toward zero by an amount that depended on the level of censoring and the size of the ignored individual heterogeneity. Despite the biased parameter estimates, the ranking of sires, measured by the rank correlations between true and estimated sire effects, was unaffected. In comparison, parameter estimates obtained using complete sire models were consistent with the true values used to simulate the data. Thus, in this study, several issues of concern were demonstrated for the incomplete sire model.
Singh, S. P.; Panda, G. C.
2015-01-01
This paper develops an inventory model for items that deteriorate at a generalized Weibull-distributed rate when demand for the items depends on the selling price. Shortages are not allowed, and price inflation is taken into consideration over a finite planning horizon. A brief analysis of the costs involved is carried out theoretically.
Directory of Open Access Journals (Sweden)
P Bhattacharya
2016-09-01
The wind resource varies with the time of day and the season of the year, and even to some extent from year to year. Wind speed is inherently variable and is therefore described by distribution functions. In this paper, we present methods for estimating the Weibull parameters for low-wind-speed characterization, namely the shape parameter (k) and the scale parameter (c), and we characterize the discrete wind data sample by the discrete Hilbert transform (DHT). The Weibull distribution is an important distribution, especially for reliability and maintainability analysis. Suitable values of both the shape and scale parameters of the Weibull distribution are important for selecting locations for installing wind turbine generators, and the scale parameter also helps determine whether a wind farm site is good or not. The use of the DHT for wind speed characterization opens a new avenue for the transform beyond its applications in digital signal processing. In this paper, the DHT is applied to characterize wind sample data measured at the College of Engineering and Management, Kolaghat, East Midnapore, India, in January 2011.
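As a rough illustration of shape/scale estimation, the sketch below uses the common standard-deviation (moment) method, k ≈ (σ/μ)^(-1.086) and c = μ/Γ(1 + 1/k), on synthetic wind speeds. The parameter values are assumptions for the check, not the Kolaghat measurements, and this is one of several estimation methods rather than necessarily the paper's.

```python
import math
import random

def weibull_params_moment(speeds):
    """Estimate Weibull shape k and scale c from wind-speed samples using the
    standard-deviation (moment) method: k ~ (sigma/mu)**-1.086, c = mu/Gamma(1+1/k)."""
    n = len(speeds)
    mu = sum(speeds) / n
    var = sum((v - mu) ** 2 for v in speeds) / (n - 1)
    k = (math.sqrt(var) / mu) ** -1.086
    c = mu / math.gamma(1.0 + 1.0 / k)
    return k, c

# Self-check on synthetic data with known parameters (illustrative values).
rng = random.Random(3)
sample = [rng.weibullvariate(6.0, 2.0) for _ in range(50000)]  # c = 6 m/s, k = 2
k_hat, c_hat = weibull_params_moment(sample)
print(round(k_hat, 2), round(c_hat, 2))  # should be near k = 2 and c = 6
```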
A Weibull multi-state model for the dependence of progression-free survival and overall survival.
Li, Yimei; Zhang, Qiang
2015-07-30
In oncology clinical trials, overall survival, time to progression, and progression-free survival are three commonly used endpoints. Empirical correlations among them have been published for different cancers, but statistical models describing the dependence structures are limited. Recently, Fleischer et al. proposed a statistical model that is mathematically tractable and shows some flexibility to describe the dependencies in a realistic way, based on the assumption of exponential distributions. This paper aims to extend their model to the more flexible Weibull distribution. We derived theoretical correlations among different survival outcomes, as well as the distribution of overall survival induced by the model. Model parameters were estimated by the maximum likelihood method and the goodness of fit was assessed by plotting estimated versus observed survival curves for overall survival. We applied the method to three cancer clinical trials. In the non-small-cell lung cancer trial, both the exponential and the Weibull models provided an adequate fit to the data, and the estimated correlations were very similar under both models. In the prostate cancer trial and the laryngeal cancer trial, the Weibull model exhibited advantages over the exponential model and yielded larger estimated correlations. Simulations suggested that the proposed Weibull model is robust for data generated from a range of distributions.
Sazuka, Naoya; Inoue, Jun-Ichi
2007-03-01
A Weibull distribution with power-law tails is confirmed as a good candidate to describe the first passage time process of foreign currency exchange rates. The Lorenz curve and the corresponding Gini coefficient for a Weibull distribution are derived analytically. We show that the coefficient is in good agreement with the same quantity calculated from the empirical data. We also calculate the average waiting time, an important measure for estimating how long customers must wait for the next price change after they log in to their computer systems. By assuming that the first passage time distribution changes its shape from Weibull to power-law at some critical time, we evaluate the average waiting time by means of the renewal-reward theorem. We find that our correction for the tails of the distribution brings the average waiting time much closer to the value obtained from empirical data analysis. We also discuss the deviation from the estimated average waiting time by deriving the waiting time distribution directly. These results lead us to conclude that the first passage process of foreign currency exchange rates is well described by a Weibull distribution with power-law tails.
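The Gini coefficient of a Weibull distribution has the well-known closed form G = 1 - 2^(-1/m), where m is the shape parameter (it does not depend on the scale). A quick numerical check against a sample estimate, using an illustrative shape value rather than the paper's fitted one, can be sketched as:

```python
import random

def weibull_gini(shape):
    """Analytic Gini coefficient of a Weibull distribution: G = 1 - 2**(-1/shape)."""
    return 1.0 - 2.0 ** (-1.0 / shape)

def empirical_gini(xs):
    """Sample Gini coefficient via the sorted-index formula (O(n log n))."""
    xs = sorted(xs)
    n = len(xs)
    return sum((2 * i - n - 1) * x for i, x in enumerate(xs, start=1)) / (n * sum(xs))

# Illustrative shape < 1 (heavier-tailed than exponential); an assumed value,
# not one reported in the abstract.
m = 0.6
rng = random.Random(11)
sample = [rng.weibullvariate(1.0, m) for _ in range(200000)]
print(weibull_gini(m), empirical_gini(sample))  # the two should agree closely
```

For the exponential case (m = 1) the formula gives G = 0.5, a useful sanity check.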
Directory of Open Access Journals (Sweden)
Jae Phil Park
2016-12-01
It is extremely difficult to predict the initiation time of cracking because of the large time spread in most cracking experiments. Thus, probabilistic models such as the Weibull distribution are usually employed to model the initiation time of cracking, and the parameters of the Weibull distribution are estimated from data collected in a cracking test. However, although a reliable cracking model could in principle be developed under ideal experimental conditions (e.g., a large number of specimens and narrow censoring intervals), it is not straightforward to quantitatively assess the effects of the experimental conditions on model estimation uncertainty. The present study investigated the effects of key experimental conditions, including the time-dependent effect of the censoring interval length, on the estimation uncertainties of the Weibull parameters through Monte Carlo simulations. The simulation results provide quantified estimation uncertainties of the Weibull parameters under various cracking test conditions. Hence, the results of this study can offer insight for experimenters developing a probabilistic crack initiation model.
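A stripped-down version of such a Monte Carlo study, using complete data with no censoring intervals and median-rank regression rather than the paper's estimator, might look like the following. All parameter values are assumptions for illustration.

```python
import math
import random

def fit_weibull_rank_regression(times):
    """Median-rank regression: regress y = ln(-ln(1 - F_i)) on x = ln(t_(i)),
    with Benard's approximation F_i = (i - 0.3)/(n + 0.4)."""
    ts = sorted(times)
    n = len(ts)
    xs = [math.log(t) for t in ts]
    ys = [math.log(-math.log(1.0 - (i - 0.3) / (n + 0.4))) for i in range(1, n + 1)]
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, math.exp(-(my - slope * mx) / slope)  # (shape, scale)

def shape_interval(n_specimens, n_trials=500, true_shape=3.0, true_scale=100.0, seed=5):
    """5th-95th percentile band of the estimated shape over repeated experiments."""
    rng = random.Random(seed)
    est = sorted(fit_weibull_rank_regression(
                     [rng.weibullvariate(true_scale, true_shape)
                      for _ in range(n_specimens)])[0]
                 for _ in range(n_trials))
    return est[n_trials // 20], est[-n_trials // 20]

band10 = shape_interval(10)
band50 = shape_interval(50)
print("n=10 shape 5th-95th pct band:", band10)
print("n=50 shape 5th-95th pct band:", band50)  # more specimens -> narrower band
```

The shrinking percentile band with more specimens is the qualitative effect the study quantifies under more realistic (interval-censored) conditions.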
Energy Technology Data Exchange (ETDEWEB)
Bass, B.R.; Williams, P.T.; McAfee, W.J.; Pugh, C.E. [Oak Ridge National Lab., Heavy-Section Steel Technology Program, Oak Ridge, TN (United States)
2001-07-01
A primary objective of the United States Nuclear Regulatory Commission (USNRC)-sponsored Heavy-Section Steel Technology (HSST) Program is to develop and validate technology applicable to quantitative assessments of fracture prevention margins in nuclear reactor pressure vessels (RPVs) containing flaws and subjected to service-induced material toughness degradation. This paper describes an experimental/analytical program for the development of a Weibull statistical model of cleavage fracture toughness for applications to shallow surface-breaking and embedded flaws in RPV materials subjected to multi-axial loading conditions. The experimental part includes both material characterization testing and larger fracture toughness experiments conducted using a special-purpose cruciform beam specimen developed by Oak Ridge National Laboratory for applying biaxial loads to shallow cracks. Test materials (pressure vessel steels) included plate product forms (conforming to ASTM A533 Grade B Class 1 specifications) and shell segments procured from a pressurized-water reactor vessel intended for a nuclear power plant. Results from tests performed on cruciform specimens demonstrated that biaxial loading can have a pronounced effect on shallow-flaw fracture toughness in the lower-transition temperature region. A local approach methodology based on a three-parameter Weibull model was developed to correlate these experimentally observed biaxial effects on fracture toughness. The Weibull model, combined with a new hydrostatic stress criterion in place of the more commonly used maximum principal stress in the kernel of the Weibull stress integral definition, is shown to provide a scaling mechanism between uniaxial and biaxial loading states for 2-dimensional flaws located in the A533-B plate material. The Weibull stress density was introduced as a metric for identifying regions along a semi-elliptical flaw front that have a higher probability of cleavage initiation. Cumulative…
Sub-tangentially loaded and damped Beck's columns on two-parameter elastic foundation
Lee, Jun-Seok; Kim, Nam-Il; Kim, Moon-Young
2007-10-01
The dynamic stability of the damped Beck's column on a two-parameter elastic foundation is investigated using Hermitian beam elements. For this purpose, based on the extended Hamilton's principle, the dimensionless finite element (FE) formulation using the Hermitian interpolation function is presented. First, the mass matrix, the external and internal damping matrices, the elastic and geometric stiffness matrices, the Winkler and Pasternak foundation matrices, and the load correction stiffness matrix due to the sub-tangential follower force are obtained. Then, the evaluation procedure for the flutter and divergence loads of the non-conservative system and the time history analysis using the Newmark-β method are briefly described. Finally, the influences of various parameters on the dynamic stability of non-conservative systems are newly addressed: (1) variation of the second flutter load due to sub-tangentiality, (2) influences of the external and internal damping on flutter loads by analysis of complex natural frequencies, (3) the effect of the growth rate of motion in a finite time interval using time history analysis, and (4) fluctuation of divergence and flutter loads due to Winkler and Pasternak foundations.
Explicit formula for the Holevo bound for two-parameter qubit-state estimation problem
Suzuki, Jun
2016-04-01
The main contribution of this paper is to derive an explicit expression for the fundamental precision bound, the Holevo bound, for estimating any two-parameter family of qubit mixed states in terms of quantum versions of Fisher information. The obtained formula depends solely on the symmetric logarithmic derivative (SLD) Fisher information, the right logarithmic derivative (RLD) Fisher information, and a given weight matrix. This result immediately provides necessary and sufficient conditions for the following two important classes of quantum statistical models: those for which the Holevo bound coincides with the SLD Cramér-Rao bound, and those for which it coincides with the RLD Cramér-Rao bound. One of the important results of this paper is that a general model other than these two special cases exhibits an unexpected property: the structure of the Holevo bound changes smoothly as the weight matrix varies. In particular, it always coincides with the RLD Cramér-Rao bound for a certain choice of the weight matrix. Several examples illustrate these findings.
Representations of Coherent and Squeezed States in an Extended Two-parameters Fock Space
Tavassoly, M K
2012-01-01
Recently an $f$-deformed Fock space spanned by $|n>_{\lambda}$ has been introduced. These bases are the eigenstates of a deformed non-Hermitian Hamiltonian. In this contribution, we use a rather new set of non-orthogonal basis vectors for the construction of coherent and squeezed states, which in special cases lead to the earlier known states. For this purpose, we first generalize the previously introduced Fock space spanned by the $|n>_{\lambda}$ bases to a new one, spanned by the extended two-parameter bases $|n>_{\lambda_{1},\lambda_{2}}$. These bases are the eigenstates of a non-Hermitian Hamiltonian $H_{\lambda_{1},\lambda_{2}}=a^{\dagger}_{\lambda_{1},\lambda_{2}}a+1/2$, where $a^{\dagger}_{\lambda_{1},\lambda_{2}}=a^{\dagger}+\lambda_{1}a+\lambda_{2}$ and $a$ are, respectively, the deformed creation and ordinary bosonic annihilation operators. The bases $|n>_{\lambda_{1},\lambda_{2}}$ are non-orthogonal (squeezed states), but normalizable. Then, we deduce the new representations of cohe...
Energy Technology Data Exchange (ETDEWEB)
Gao, Yun, E-mail: ygao@yorku.ca [Department of Mathematics and Statistics, York University, 4700 Keele Street, Toronto, Ontario M3J 1P3 (Canada); Hu, Naihong, E-mail: nhhu@math.ecnu.edu.cn [Department of Mathematics, East China Normal University, Shanghai 200241 (China); Zhang, Honglian, E-mail: hlzhangmath@shu.edu.cn [Department of Mathematics, Shanghai University, Shanghai 200444 (China)
2015-01-15
In this paper, we define the two-parameter quantum affine algebra for type G{sub 2}{sup (1)} and give the (r, s)-Drinfeld realization of U{sub r,s}(G{sub 2}{sup (1)}), as well as establish and prove its Drinfeld isomorphism. We construct and verify explicitly the level-one vertex representation of the two-parameter quantum affine algebra U{sub r,s}(G{sub 2}{sup (1)}), which also provides evidence, in the nontwisted type G{sub 2}{sup (1)} case, for the uniform defining approach via the two-parameter τ-invariant generating functions proposed in Hu and Zhang [Generating functions with τ-invariance and vertex representations of two-parameter quantum affine algebras U{sub r,s}(g{sup ^}): Simply laced cases, e-print http://arxiv.org/abs/1401.4925].
On two-parameter models of photon cross sections: application to dual-energy CT imaging.
Williamson, Jeffrey F; Li, Sicong; Devic, Slobodan; Whiting, Bruce R; Lerma, Fritz A
2006-11-01
The goal of this study is to evaluate the theoretically achievable accuracy in estimating photon cross sections at low energies (20-1000 keV) from idealized dual-energy x-ray computed tomography (CT) images. Cross-section estimation from dual-energy measurements requires a model that can accurately represent photon cross sections of any biological material as a function of energy by specifying only two characteristic parameters of the underlying material, e.g., effective atomic number and density. This paper evaluates the accuracy of two commonly used two-parameter cross-section models for postprocessing idealized measurements derived from dual-energy CT images. The parametric fit model (PFM) accounts for electron-binding effects and photoelectric absorption by power functions in atomic number and energy, and for scattering by the Klein-Nishina cross section. The basis-vector model (BVM) assumes that the attenuation coefficients of any biological substance can be approximated by a linear combination of the mass attenuation coefficients of two dissimilar basis substances. Both PFM and BVM were fit to a modern cross-section library for a range of elements and mixtures representative of naturally occurring biological materials (Z = 2-20). The PFM, in conjunction with the effective atomic number approximation, yields total linear cross-section estimates with mean absolute and maximum error ranges of 0.6%-2.2% and 1%-6%, respectively. The corresponding error ranges for BVM estimates were 0.02%-0.15% and 0.1%-0.5%. However, for the photoelectric absorption component, the PFM absolute mean and maximum errors were 10.8%-22.4% and 29%-50%, compared with corresponding BVM errors of 0.4%-11.3% and 0.5%-17.0%, respectively. Both models were found to exhibit similar sensitivities to image-intensity measurement uncertainties. Of the two models, BVM is the more promising approach for realizing dual-energy CT cross-section measurement.
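The BVM step itself reduces to a 2x2 linear solve: given attenuation measured at two energies and the basis materials' coefficients at those energies, recover the two mixing coefficients. A sketch with made-up basis values (the numbers are assumptions, not measured cross sections):

```python
def bvm_coefficients(mu_low, mu_high, basis_low, basis_high):
    """Solve mu(E) = c1*b1(E) + c2*b2(E) at two energies for (c1, c2).
    basis_low/basis_high are (b1, b2) attenuation pairs of the two basis
    materials at the low and high energy, respectively."""
    (b1l, b2l), (b1h, b2h) = basis_low, basis_high
    det = b1l * b2h - b2l * b1h  # must be nonzero: basis materials dissimilar
    c1 = (mu_low * b2h - b2l * mu_high) / det
    c2 = (b1l * mu_high - b1h * mu_low) / det
    return c1, c2

# Round-trip check with assumed basis values and known mixing coefficients.
basis_low, basis_high = (0.80, 0.35), (0.25, 0.18)
c1_true, c2_true = 0.7, 0.3
mu_low = c1_true * basis_low[0] + c2_true * basis_low[1]
mu_high = c1_true * basis_high[0] + c2_true * basis_high[1]
c1_hat, c2_hat = bvm_coefficients(mu_low, mu_high, basis_low, basis_high)
print(c1_hat, c2_hat)  # recovers c1_true and c2_true
```

In practice the measured attenuations are noisy, which is why the paper also examines the models' sensitivity to image-intensity uncertainties.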
Le, Cui; Wanxi, Peng; Zhengjun, Sun; Lili, Shang; Guoning, Chen
2014-07-01
Bamboo is a composite material with a radial gradient structure, but the vascular bundles in its inner layer are evenly distributed. The objective is to determine the regular size pattern of the vascular bundles in the inner layer of Moso bamboo and to analyze their tensile strength by Weibull statistics. The size and shape of the vascular bundles in the inner layer are similar, with an average area of about 0.1550 mm². A statistical evaluation of the tensile strength of the vascular bundles was conducted by means of Weibull statistics; the results show that the Weibull modulus m is 6.1121, providing a reliability assessment of the vascular bundles.
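With a Weibull modulus in hand, reliability statements follow directly from the two-parameter strength model P_f(σ) = 1 - exp(-(σ/σ0)^m). The sketch below uses the reported modulus, but the characteristic strength σ0 is an assumed value for illustration; the abstract does not report it.

```python
import math

def failure_probability(stress, m, sigma0):
    """Two-parameter Weibull strength model: P_f(sigma) = 1 - exp(-(sigma/sigma0)**m)."""
    return 1.0 - math.exp(-((stress / sigma0) ** m))

def stress_for_reliability(reliability, m, sigma0):
    """Invert the model: the stress at which survival probability equals `reliability`."""
    return sigma0 * (-math.log(reliability)) ** (1.0 / m)

# m from the abstract; sigma0 (MPa) is an assumed characteristic strength.
m, sigma0 = 6.1121, 350.0
print(failure_probability(sigma0, m, sigma0))   # 1 - 1/e ~ 0.632 at sigma = sigma0
print(stress_for_reliability(0.99, m, sigma0))  # allowable stress for 99% survival
```

A higher modulus m means less scatter, so the allowable stress for a given reliability sits closer to the characteristic strength.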
Etude expérimentale et analyse probabiliste du comportement à la ...
African Journals Online (AJOL)
DK
Fragmentary abstract: the behaviour is characterized quantitatively through application of the two-parameter Weibull probabilistic model, and differences have been evidenced through analyses of the mechanical properties of the composite laminates.
Directory of Open Access Journals (Sweden)
Chris Bambey Guure
2012-01-01
The survival function of the Weibull distribution gives the probability that a unit or an individual will survive beyond a certain specified time, while the failure rate is the rate at which a randomly selected individual known to be alive at time t will fail in the next instant. The classical approach for estimating the survival function and the failure rate is the maximum likelihood method. In this study, we seek to determine the best method by comparing classical maximum likelihood against Bayesian estimators using an informative prior and a proposed data-dependent prior known as the generalised noninformative prior. The Bayesian estimation is considered under three loss functions. Because of the complexity of the integrals arising in the Bayesian estimators, Lindley's approximation procedure is employed to evaluate the ratio of integrals. For the purpose of comparison, the mean squared error (MSE) and the absolute bias are obtained. The study is conducted via simulation using different sample sizes. We observe that the proposed generalised prior performs better than the others under the linear exponential loss function with respect to MSE and under the general entropy loss function with respect to absolute bias.
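For complete (uncensored) data, the classical maximum-likelihood step mentioned above reduces to one-dimensional root finding for the shape parameter, after which the scale follows in closed form. A self-contained sketch using bisection, with illustrative parameter values (the Bayesian estimators and Lindley's approximation are beyond this snippet):

```python
import math
import random

def weibull_mle(times, tol=1e-9):
    """Maximum-likelihood (shape, scale) of a Weibull fit to complete data.
    The shape k solves sum(t^k ln t)/sum(t^k) - 1/k - mean(ln t) = 0,
    which is monotone increasing in k, so bisection suffices."""
    logs = [math.log(t) for t in times]
    mean_log = sum(logs) / len(logs)
    def g(k):
        num = sum(t ** k * lt for t, lt in zip(times, logs))
        den = sum(t ** k for t in times)
        return num / den - 1.0 / k - mean_log
    lo, hi = 1e-3, 100.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(mid) > 0:
            hi = mid
        else:
            lo = mid
    k = 0.5 * (lo + hi)
    scale = (sum(t ** k for t in times) / len(times)) ** (1.0 / k)
    return k, scale

def survival(t, k, c):
    """Weibull survival function S(t) = exp(-(t/c)**k)."""
    return math.exp(-((t / c) ** k))

def hazard(t, k, c):
    """Weibull failure rate h(t) = (k/c) * (t/c)**(k-1)."""
    return (k / c) * (t / c) ** (k - 1)

# Recover assumed parameters (shape 1.5, scale 10) from synthetic lifetimes.
rng = random.Random(9)
sample = [rng.weibullvariate(10.0, 1.5) for _ in range(5000)]
k_hat, c_hat = weibull_mle(sample)
print(round(k_hat, 2), round(c_hat, 2))  # should be close to 1.5 and 10
```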
Balakrishnan, Narayanaswamy; Pal, Suvra
2016-08-01
Recently, a flexible cure rate survival model was developed by assuming the number of competing causes of the event of interest to follow the Conway-Maxwell-Poisson distribution. This model includes some of the well-known cure rate models discussed in the literature as special cases. Data obtained from cancer clinical trials are often right censored, and the expectation maximization algorithm can be used in this case to efficiently estimate the model parameters based on right-censored data. In this paper, we consider the competing cause scenario and, assuming the time-to-event to follow the Weibull distribution, derive the necessary steps of the expectation maximization algorithm for estimating the parameters of different cure rate survival models. The standard errors of the maximum likelihood estimates are obtained by inverting the observed information matrix. The method of inference developed here is examined by means of an extensive Monte Carlo simulation study. Finally, we illustrate the proposed methodology with real data on cancer recurrence.
Linear vs. piecewise Weibull model for genetic evaluation of sires for longevity in Simmental cattle
Directory of Open Access Journals (Sweden)
Nikola Raguž
2014-09-01
This study focused on the genetic evaluation of longevity in Croatian Simmental cattle using linear and survival models. The main objective was to create the genetic model most appropriate for describing the longevity data. Survival analysis, using a piecewise Weibull proportional hazards model, used all information on the length of productive life, including censored as well as uncensored observations. Linear models considered culled animals only. The relative milk production within herd had the highest impact on cows' longevity. Comparing estimated genetic parameters among methods, survival analysis yielded a higher heritability (0.075) than the linear sire (0.037) and linear animal (0.056) models. When linear models were used, the genetic trend of Simmental bulls for longevity was slightly increasing over the years, unlike the decreasing trend obtained with the survival analysis methodology. The average reliability of bulls' breeding values was higher for survival analysis. The rank correlations between survival analysis and linear model breeding values of bulls for longevity ranged between 0.44 and 0.46, implying large differences in the ranking of sires.
Pasari, S.
2013-05-01
Earthquake recurrence interval is one of the important ingredients of probabilistic seismic hazard assessment (PSHA) for any location. The Weibull, gamma, generalized exponential and lognormal distributions are well-established probability models for recurrence interval estimation, and they share many important characteristics. In this paper, we aim to compare the effectiveness of these models in recurrence interval estimation and eventually in hazard analysis. To assess the appropriateness of these models, we use a complete and homogeneous earthquake catalogue of 20 events (M ≥ 7.0) spanning the period 1846 to 1995 from the North-East Himalayan region (20°-32° N and 87°-100° E). The model parameters are estimated using the modified maximum likelihood estimator (MMLE). No geological or geophysical evidence has been considered in this calculation. The estimated conditional probability becomes quite high after about a decade for an elapsed time of 17 years (i.e., 2012). Moreover, this study shows that the generalized exponential distribution fits the data more closely than the conventional models, and hence it is tentatively concluded that the generalized exponential distribution can be effectively considered in earthquake recurrence studies.
Comparison of Bayesian and Classical Analysis of Weibull Regression Model: A Simulation Study
Directory of Open Access Journals (Sweden)
İmran KURT ÖMÜRLÜ
2011-01-01
Objective: The purpose of this study was to compare the performance of the classical Weibull Regression Model (WRM) and the Bayesian WRM under varying conditions using Monte Carlo simulations. Material and Methods: Data were generated and analyzed with both classical WRM and Bayesian WRM under varying informative priors and sample sizes using our simulation algorithm. In the simulation studies, sample sizes of n = 50, 100 and 250 were used, and informative priors for b1 were specified using a normal prior distribution. For each situation, 1000 simulations were performed. Results: The Bayesian WRM with a proper informative prior showed good performance with very little bias. The bias of the Bayesian WRM increased as the priors became more distant from the true values, in all sample sizes. Furthermore, given proper priors, the Bayesian WRM obtained predictions with smaller standard errors than the classical WRM in both small and large samples. Conclusion: In this simulation study, the Bayesian WRM performed better than the classical method when subjective data analysis was performed by taking expert opinion and historical knowledge about parameters into account. Consequently, the Bayesian WRM should be preferred when reliable informative priors exist; otherwise, the classical WRM should be preferred.
Cheng, Mingjian; Zhang, Yixin; Gao, Jie; Wang, Fei; Zhao, Fengsheng
2014-06-20
We model the average channel capacity of optical wireless communication systems for weak to strong turbulence channels using the exponentiated Weibull distribution model. The joint effects of beam wander and spread, pointing errors, atmospheric attenuation, and the spectral index of non-Kolmogorov turbulence on system performance are included. Our results show that the average capacity decreases steeply as the propagation length L increases from 0 to 200 m, and then decreases slowly or tends to a stable value as L grows beyond 200 m. In the weak turbulence region, increasing the detection aperture improves the average channel capacity, and atmospheric visibility is an important issue affecting the average channel capacity. In the strong turbulence region, increasing the radius of the detection aperture cannot reduce the effects of atmospheric turbulence on the average channel capacity, and the effect of atmospheric visibility on the channel information capacity can be ignored. The effect of the spectral power-law exponent on the average channel capacity is greater in the strong turbulence region than in the weak turbulence region. Irrespective of the details determining the turbulent channel, we can say that pointing errors have a significant effect on the average channel capacity of optical wireless communication systems in turbulence channels.
Directory of Open Access Journals (Sweden)
Yiannoutsos Constantin T
2009-06-01
Background: Mortality of HIV-infected patients initiating antiretroviral therapy in the developing world is very high immediately after the start of ART therapy and drops sharply thereafter. It is necessary to use models of survival time that reflect this change. Methods: In this endeavor, parametric models with changepoints, such as Weibull models, can be useful in order to explicitly model the underlying failure process, even when abrupt changes in the mortality rate are present. Estimation of the temporal location of possible mortality changepoints has important implications for the effective management of these patients. We briefly describe these models and apply them to the estimation of survival among HIV-infected patients initiating antiretroviral therapy in a care and treatment programme in sub-Saharan Africa. Results: As a first reported data-driven estimate of the existence and location of early mortality changepoints after antiretroviral therapy initiation, we show that there is an early change in the risk of death at three months, followed by an intermediate-risk period lasting up to 10 months after therapy. Conclusion: By explicitly modelling the underlying abrupt changes in mortality risk after initiation of antiretroviral therapy, we are able to estimate their number and location in a rigorous, data-driven manner. The existence of a high early risk of death after initiation of antiretroviral therapy, and the determination of its duration, has direct implications for the optimal management of patients initiating therapy in this setting.
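A changepoint survival model of this kind can be sketched as a piecewise Weibull hazard whose cumulative hazard is stitched together continuously at the changepoint. The parameter values below are assumptions chosen to mimic a high, declining early risk, not fitted values from the paper.

```python
import math

def cumulative_hazard(t, cut, k1, c1, k2, c2):
    """Cumulative hazard of a two-phase Weibull model with changepoint `cut`:
    phase 1 uses (k1, c1) for t < cut, phase 2 uses (k2, c2) afterwards,
    joined so that H(t) is continuous at the changepoint."""
    if t < cut:
        return (t / c1) ** k1
    return (cut / c1) ** k1 + (t / c2) ** k2 - (cut / c2) ** k2

def survival(t, cut, k1, c1, k2, c2):
    """S(t) = exp(-H(t))."""
    return math.exp(-cumulative_hazard(t, cut, k1, c1, k2, c2))

def hazard(t, cut, k1, c1, k2, c2):
    """Instantaneous failure rate within the phase containing t."""
    k, c = (k1, c1) if t < cut else (k2, c2)
    return (k / c) * (t / c) ** (k - 1)

# Illustrative parameters (time in months): decreasing early risk (k1 < 1) up
# to a changepoint at 3 months, then a flatter constant-rate later phase.
args = dict(cut=3.0, k1=0.5, c1=20.0, k2=1.0, c2=60.0)
print(hazard(1.0, **args), hazard(12.0, **args))  # early risk exceeds later risk
print(survival(3.0, **args))
```

Fitting would then estimate the changepoint location and phase parameters from the data, which is the data-driven step the paper formalizes.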
Directory of Open Access Journals (Sweden)
Luiz Claudio Pardini
2002-10-01
Carbon fibres and glass fibres are reinforcements for advanced composites, and fibre strength is the most influential factor in the strength of the composites. They are essentially brittle and fail with very little reduction in cross-section. Composites made with these fibres are characterized by a high strength/density ratio, and their properties are intrinsically related to their microstructure, i.e., the amount and orientation of the fibres and the surface treatment, among other factors. Processing parameters play an important role in fibre mechanical behaviour (strength and modulus). Cracks, voids and impurities in the case of glass fibres, and fibrillar misalignments in the case of carbon fibres, are created during processing. Such inhomogeneities give rise to appreciable scatter in properties. The most widely used statistical tool for dealing with this characteristic variability in properties is the Weibull distribution. The present work investigates the influence of the testing gauge length on the strength, Young's modulus and Weibull modulus of carbon fibres and glass fibres. The Young's modulus is calculated by two methods: (i) ASTM D 3379M, and (ii) accounting for the interaction between the testing equipment and the specimen. The first method resulted in a Young's modulus of 183 GPa for carbon fibre and 76 GPa for glass fibre; the second gave 250 GPa for carbon fibre and 50 GPa for glass fibre. These differences reveal how the specimen/testing-machine interaction can affect the Young's modulus calculation. The Weibull modulus can be used to evaluate the homogeneity of fibre properties and is a good quality-control parameter during processing. In the range of specimen gauge lengths tested, the Weibull modulus is ~3.30 for carbon fibre and ~5.65 for glass fibre, which indicates that, for the batch of fibres tested, the glass fibre is more uniform in properties.
Energy Technology Data Exchange (ETDEWEB)
Kantar, Yeliz Mert; Usta, Ilhan [Department of Statistics, Anadolu University, Eskisehir 26470 (Turkey)
2008-05-15
In this study, the minimum cross entropy (MinxEnt) principle is applied for the first time to the wind energy field. This principle allows the inclusion of prior information about a wind speed distribution and covers the maximum entropy (MaxEnt) principle, discussed by Li and Li and by Ramirez as a special case in their wind power studies. The MinxEnt probability density function (pdf) derived from the MinxEnt principle is used to determine the diurnal, monthly, seasonal and annual wind speed distributions. A comparison between MinxEnt pdfs and the Weibull pdf is conducted on wind speed data taken from different sources and measured in various regions. The wind power densities of the considered regions obtained from the Weibull and MinxEnt pdfs are also compared. The results indicate that the pdfs derived from the MinxEnt principle fit a variety of measured wind speed data better than the conventionally applied empirical Weibull pdf. Therefore, the MinxEnt principle can be used as an alternative method to estimate both the wind distribution and the wind power accurately. (author)
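The Weibull pdf and the wind power density it implies recur throughout the wind energy records in this listing. As an illustrative sketch (not code from any of the papers; the function names and the default air density of 1.225 kg/m³ are our own assumptions), the two-parameter Weibull density and the mean power density formula P/A = 0.5·ρ·c³·Γ(1 + 3/k) can be written as:

```python
import math

def weibull_pdf(v, k, c):
    """Two-parameter Weibull density for wind speed v >= 0 (k: shape, c: scale)."""
    return (k / c) * (v / c) ** (k - 1) * math.exp(-((v / c) ** k))

def mean_wind_power_density(k, c, rho=1.225):
    """Mean wind power density (W/m^2) implied by Weibull(k, c):
    P/A = 0.5 * rho * c^3 * Gamma(1 + 3/k)."""
    return 0.5 * rho * c ** 3 * math.gamma(1.0 + 3.0 / k)
```

For example, k = 2 and c = 6 m/s give roughly 176 W/m² at the default air density.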
Directory of Open Access Journals (Sweden)
H Akbari
2016-02-01
Introduction: Temperature and water potential are two of the most important environmental factors regulating seed germination. The germination response of a population of seeds to temperature and water potential can be described with a hydrothermal time (HTT) model. Despite the wide use of HTT models to simulate germination, little research has critically examined the assumption that the base water potential within these models is normally distributed. An alternative to the normal distribution that can fit a range of distribution types is the Weibull distribution. Using germination data of castor bean (Ricinus communis L.) over a range of water potentials and sub-optimal temperatures, we compared the utility of the normal and Weibull distributions in estimating the base water potential (ψb). The accuracy of their respective HTT models in predicting germination percentage across the sub-optimal temperature range was also examined. Materials and Methods: Castor bean seed germination was tested across a range of water potentials (0, -0.3, -0.6 and -0.9 MPa) at sub-optimal temperatures (10 to 35 °C, at 5 °C intervals). Osmotic solutions were prepared by dissolving polyethylene glycol 8000 in distilled water according to the Michel (1983) equation for a given temperature. Seed germination was tested on 4 replicates of 50 seeds in moist paper towels in an incubator. The HTT models based on the normal and Weibull distributions were fitted to data from all combinations of temperatures and water potentials using the PROC NLMIXED procedure in SAS. Results and Discussion: Under both the normal and Weibull distribution functions, the hydrotime constant and base water potential for castor bean seed germination declined with increasing temperature. The lower values of base water potential showed the greater need for water uptake for germination at lower temperatures, and the reduced hydrotime constant indicated an increase
Directory of Open Access Journals (Sweden)
Gary Black
2011-01-01
Many real-world processes generate autocorrelated and/or Weibull data. In such cases, the independence and/or normality assumptions underlying the Shewhart and EWMA control charts are invalid. Although data transformations exist, such tools would not normally be understood or employed by naive practitioners. Thus the question arises: what are the effects on robustness whenever these charts are used in such applications? Consequently, this paper examines and compares the performance of these two control charts when the process is subjected to autocorrelated and/or Weibull data. A variety of conditions is investigated, covering the magnitudes of the process shift, the autocorrelation coefficient and the Weibull shape parameter. Results indicate that the EWMA chart outperforms the Shewhart chart in 62% of the cases, particularly those with low to moderate autocorrelation effects. The Shewhart chart outperforms the EWMA chart in 35% of the cases, particularly those with high autocorrelation and zero or high process shift effects.
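For readers unfamiliar with the EWMA statistic underlying the chart comparison above, a minimal illustrative sketch follows (our own code, not the paper's; the smoothing weight λ and the starting value are assumptions to be chosen per application):

```python
def ewma(series, lam=0.2, start=None):
    """Exponentially weighted moving average: z_t = lam * x_t + (1 - lam) * z_{t-1}.

    `lam` is the smoothing weight (0 < lam <= 1); `start` is the chart's
    starting value (often the process target) and defaults to the first
    observation when not given.
    """
    z = series[0] if start is None else start
    out = []
    for x in series:
        z = lam * x + (1.0 - lam) * z
        out.append(z)
    return out
```

Smaller λ gives more smoothing and hence more sensitivity to small sustained shifts, which is consistent with the EWMA chart's advantage at low to moderate autocorrelation reported above.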
Entanglement for a two-parameter class of states in a high-dimension bipartite quantum system
Institute of Scientific and Technical Information of China (English)
[Anonymous]
2010-01-01
The entanglement of a two-parameter class of states in a high-dimensional (m ⊗ n, n ≥ m ≥ 3) bipartite quantum system is discussed. The negativity (N) and the relative entropy of entanglement (Er) are calculated, and analytic expressions are obtained. Finally, the relation between the negativity and the relative entropy of entanglement is discussed. The result demonstrates that all PPT states of this two-parameter class are separable, and all entangled states are NPT states. Unlike in the 2 ⊗ n quantum system, the negativity for this two-parameter class in high dimension is not always greater than or equal to the relative entropy of entanglement; the more general relation is mN/2 ≥ Er.
Statistical Analysis of a Weibull Extension with Bathtub-Shaped Failure Rate Function
Directory of Open Access Journals (Sweden)
Ronghua Wang
2014-01-01
We consider parameter inference for a two-parameter life distribution with a bathtub-shaped or increasing failure rate function. We present point and interval estimation for the parameter of interest based on Type-II censored samples. Through intensive Monte Carlo simulations, we assess the performance of the proposed estimation methods by comparing their precision. Example applications demonstrate the efficiency of the methods.
Directory of Open Access Journals (Sweden)
Shi Jack J
2012-10-01
the spatial and temporal evolution of the LV wall motion using a two-parameter formulation in conjunction with tMRI-based visualization of the LV wall in the transverse planes of the apex, mid-ventricle and base. In healthy hearts, the analytical model will potentially allow deriving biomechanical entities, such as strain, strain rate or torsion, which are typically used as diagnostic, prognostic or predictive markers of cardiovascular diseases including diabetes.
A two-parameter family of exact asymptotically flat solutions to the Einstein-scalar field equations
Energy Technology Data Exchange (ETDEWEB)
Nikonov, V V; Tchemarina, Ju V; Tsirulev, A N [Department of Mathematics, Tver State University, Sadovyi per. 35, Tver 170002 (Russian Federation)], E-mail: tsirulev@tversu.ru
2008-07-07
We consider a static spherically symmetric real scalar field, minimally coupled to Einstein gravity. A two-parameter family of exact asymptotically flat solutions is obtained by using the inverse problem method. This family includes non-singular solutions, black holes and naked singularities. For each of these solutions the respective potential is partially negative but positive near spatial infinity. (comments, replies and notes)
On the Sharp Markov Property of Two-Parameter Markov Processes
Institute of Scientific and Technical Information of China (English)
Chen Xiong (陈雄)
2011-01-01
It is proven that two-parameter Markov processes exhibit the sharp Markov property only on regions that are finite or countable unions of rectangles.
Rodrigues, Sinval A; Ferracane, Jack L; Della Bona, Alvaro
2008-03-01
The aim of the present study was to evaluate the flexural strength and the Weibull modulus of a microhybrid and a nanofill composite by means of 3- and 4-point bending tests. Thirty specimens of Filtek Z250 (3M/ESPE) and Filtek Supreme (3M/ESPE) were prepared for each test according to the ISO 4049/2000 specification. After 24 h in distilled water at 37 °C, the specimens were submitted to 3- and 4-point bending tests using a universal testing machine DL2000 (EMIC) at a crosshead speed of 1 mm/min. Flexural strength data were calculated and submitted to Student's t-test (α = 0.05) and Weibull statistics. The fractured surfaces were analyzed based on fractographic principles. The two composites had equivalent strength in both test methods. However, the test designs significantly affected the flexural strength of the microhybrid and the nanofill composites. The Weibull modulus (m) of Supreme was similar for both tests, while for Z250 a higher m was observed with the 3-point bending test. Critical flaws were most often associated with the specimen's surface (up to 90%) and were characterized as surface scratches/grooves, non-uniform distribution of phases, inclusions and voids. Flexural strength as measured by the 3-point bending test is higher than by the 4-point bending test, due to the smaller flaw-containing area involved in the former. Despite the large difference in average filler size between the composites, the volume fraction of filler in both materials is similar, which was probably the reason for the similar mean flexural strength values and fracture behavior.
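The Weibull modulus m reported for such strength data is commonly estimated by median-rank regression on a Weibull probability plot. The following sketch is illustrative only (our own code; the plotting position (i − 0.3)/(n + 0.4) is a common convention and not necessarily the one used in this study):

```python
import math

def weibull_modulus(strengths):
    """Estimate the Weibull modulus m by median-rank linear regression:
    plot ln(-ln(1 - F_i)) against ln(sigma_i); the slope is m."""
    s = sorted(strengths)
    n = len(s)
    xs, ys = [], []
    for i, sigma in enumerate(s, start=1):
        F = (i - 0.3) / (n + 0.4)  # median-rank plotting position
        xs.append(math.log(sigma))
        ys.append(math.log(-math.log(1.0 - F)))
    mx, my = sum(xs) / n, sum(ys) / n
    # ordinary least-squares slope of y on x
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
```

A higher modulus means less scatter, matching the abstract's reading of m as a measure of uniformity of fibre or composite strength.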
Fetisova, Yu. A.; Ermolenko, B. V.; Ermolenko, G. V.; Kiseleva, S. V.
2017-04-01
We studied the information basis for assessing wind power potential on the territory of Russia. We describe a methodology for determining the parameters of the Weibull function, which describes the probability density of wind speeds at a given reference height above the ground, using available data on the average speed at this height and its frequency by gradations. Applying the least-squares method to determine these parameters, unlike graphical methods, allows a statistical assessment of how well the Weibull formula approximates the empirical histograms. Computer-aided analysis of the statistical data showed that, at a fixed point where the wind speed changes with height, the range of variation of the Weibull distribution parameters is relatively small, the sensitivity of the function to parameter changes is quite low, and the influence of those changes on the shape of the speed distribution curves is negligible. Taking this into consideration, we propose and mathematically verify a methodology for determining the Weibull speed parameters at other heights from the parameters computed at a reference height, which is known or defined by the average wind speed or the roughness coefficient of the geological substrate. We give examples of practical application of the suggested methodology in the development of the Atlas of Renewable Energy Resources in Russia under conditions of scarce source meteorological data. The proposed methodology may, to some extent, address the lack of information on the vertical profile of wind speed frequency, given the wide assortment of wind turbines on the global market with different wind-wheel axis heights and various performance characteristics; as a result, this methodology can become a powerful tool for
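The least-squares determination of Weibull parameters mentioned above typically linearises the cumulative distribution: ln(−ln(1 − F(v))) = k·ln v − k·ln c. A hedged sketch of this fit (our own code, not the authors'; it assumes the empirical cumulative frequencies lie strictly between 0 and 1):

```python
import math

def fit_weibull_least_squares(speeds, cum_freq):
    """Least-squares Weibull fit via the linearised CDF:
    ln(-ln(1 - F(v))) = k * ln(v) - k * ln(c).

    `speeds` are gradation upper bounds; `cum_freq` are the corresponding
    empirical cumulative frequencies, each strictly between 0 and 1.
    """
    xs = [math.log(v) for v in speeds]
    ys = [math.log(-math.log(1.0 - F)) for F in cum_freq]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # slope of the regression line is the shape parameter k
    k = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    c = math.exp(mx - my / k)  # intercept = -k * ln(c)
    return k, c
```

Because the fit is a plain linear regression, its residuals give exactly the kind of statistical goodness-of-fit assessment that the abstract contrasts with graphical methods.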
Directory of Open Access Journals (Sweden)
L. M. Pradhan
2012-01-01
This paper deals with the development of an inventory model for Weibull-deteriorating items with constant demand, when a delay in payments is allowed to the retailer to settle the account against the purchases made. Shortages are not allowed, and a salvage value is associated with the deteriorated units. We consider two cases: payment within the permissible time, and payment, with interest, after the expiry of the permissible time. Numerical examples are provided to illustrate the results, and sensitivity analyses are carried out to examine the effect on the optimal solution of changing one parameter at a time.
Energy Technology Data Exchange (ETDEWEB)
Arcos-Olalla, Rafael, E-mail: olalla@fisica.ugto.mx [Departamento de Física, DCI Campus León, Universidad de Guanajuato, Apdo. Postal E143, 37150 León, Gto. (Mexico); Reyes, Marco A., E-mail: marco@fisica.ugto.mx [Departamento de Física, DCI Campus León, Universidad de Guanajuato, Apdo. Postal E143, 37150 León, Gto. (Mexico); Rosu, Haret C., E-mail: hcr@ipicyt.edu.mx [IPICYT, Instituto Potosino de Investigacion Cientifica y Tecnologica, Apdo. Postal 3-74 Tangamanga, 78231 San Luis Potosí, S.L.P. (Mexico)
2012-10-01
We introduce an alternative factorization of the Hamiltonian of the quantum harmonic oscillator which leads to a two-parameter self-adjoint operator from which the standard harmonic oscillator, the one-parameter oscillators introduced by Mielnik, and the Hermite operator are obtained in certain limits of the parameters. In addition, a single Bernoulli-type parameter factorization, which is different from the one introduced by M.A. Reyes, H.C. Rosu, and M.R. Gutiérrez [Phys. Lett. A 375 (2011) 2145], is briefly discussed in the final part of this work. -- Highlights: ► Factorizations with operators which are not mutually adjoint are presented. ► New two-parameter and one-parameter self-adjoint oscillator operators are introduced. ► Their eigenfunctions are two- and one-parameter deformed Hermite functions.
A New Hilbert-type Integral Inequality with Two Parameters
Institute of Scientific and Technical Information of China (English)
Fu Xianghong (付向红)
2011-01-01
By introducing a weight function and using Hölder's inequality with weights, we obtain a new Hilbert-type integral inequality with two parameters, together with its equivalent form, involving a best constant factor.
Two-Parameter Stochastic Resonance in a Model of Electrodissolution of Fe in H2SO4
Institute of Scientific and Technical Information of China (English)
[Anonymous]
2001-01-01
Stochastic resonance (SR) is shown in a two-parameter system, a model of the electrodissolution of Fe in H2SO4. Modulating two different parameters, with a periodic signal in one parameter and noise in the other, is found to give rise to SR. The result indicates that noise can amplify a weak periodic signal and lead the system to order. The scenario and novel aspects of SR in this system are discussed.
Directory of Open Access Journals (Sweden)
Jose Javier Gorgoso-Varela
2016-04-01
Aim of study: In this study we compare the accuracy of three bivariate distributions: Johnson's SBB, Weibull-2P and LL-2P functions for characterizing the joint distribution of tree diameters and heights. Area of study: North-West of Spain. Material and methods: Diameter and height measurements from 128 plots of pure and even-aged Tasmanian blue gum (Eucalyptus globulus Labill.) stands located in the North-west of Spain were considered in the present study. The SBB bivariate distribution was obtained from SB marginal distributions using a normal copula based on a four-parameter logistic transformation. The Plackett copula was used to obtain the bivariate models from the Weibull and Logit-logistic univariate marginal distributions. The negative logarithm of the maximum likelihood function was used to compare the results, and the Wilcoxon signed-rank test was used to compare the related samples of these logarithms calculated for each sample plot and each distribution. Main results: The best results were obtained using the Plackett copula, and the best marginal distribution was the Logit-logistic. Research highlights: The copulas used in this study have shown good performance for modeling the joint distribution of tree diameters and heights. They could easily be extended to model multivariate distributions involving other tree variables, such as tree volume or biomass.
Santi, D. N.; Purnaba, I. G. P.; Mangku, I. W.
2016-01-01
A Bonus-Malus system is said to be optimal if it is financially balanced for the insurance company and fair for policyholders. Previous research on Bonus-Malus systems concerned the determination of the risk premium applied to all of the severity guaranteed by the insurance company. In fact, not all of the severity claimed by the policyholder may be covered by the insurance company. When the insurance company sets a maximum bound on the severity incurred, it becomes necessary to modify the severity distribution into a bounded severity distribution. In this paper, an optimal Bonus-Malus system is discussed whose claim-frequency component follows a geometric distribution and whose severity component follows a truncated Weibull distribution. The number of claims is considered to follow a Poisson distribution whose expected number λ is exponentially distributed, so the number of claims has a geometric distribution. The severity with a given parameter θ is considered to follow a truncated exponential distribution, with θ modelled by a Lévy distribution, so that the severity has a truncated Weibull distribution.
Directory of Open Access Journals (Sweden)
Justyna Kobryń
2017-01-01
Escin, a triterpenoid saponin complex of biological origin, exhibits significant clinical activity in chronic venous insufficiency, skin inflammation, epidermal abrasions, allergic dermatitis and acute impact injuries, especially in topical application. The aim of the study is the comparison of various hydrogel formulations as carriers for a horse chestnut seed extract (EH). Methylcellulose (MC), two polyacrylic acid derivatives (PA1 and PA2), and polyacrylate crosspolymer 11 (PC-11) were employed. The release rates of EH were examined and compared using the Weibull model equation. Using MC as the carrier in the hydrogel preparation resulted in a fast release rate of EH, whereas the hydrogel composed with PC-11 gave a rather prolonged release. The Weibull function adhered best to the experimental data. Based on the evaluated shape parameter β in the Weibull equation, the systems under study released the active compound according to Fickian diffusion.
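The Weibull release model referred to above expresses the cumulative fraction released as F(t) = 1 − exp(−(t/τ)^β), where the shape parameter β is used to infer the release mechanism (values of β at or below roughly 0.75 are commonly read as Fickian diffusion). A minimal sketch (our own; the parameter names τ and β follow the usual convention, and any concrete values are placeholders):

```python
import math

def weibull_release(t, tau, beta):
    """Cumulative fraction of active compound released at time t under the
    Weibull release model: F(t) = 1 - exp(-(t / tau)^beta)."""
    return 1.0 - math.exp(-((t / tau) ** beta))
```

At t = τ the model predicts a released fraction of 1 − e⁻¹ ≈ 0.632 regardless of β, which is why τ is often described as the time to ~63.2% release.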
Directory of Open Access Journals (Sweden)
Ruben M. Mouangue
2014-05-01
The modeling of the wind speed distribution is of great importance for the assessment of wind energy potential and the performance of wind energy conversion systems. In this paper, the choice between two methods of determining the Weibull parameters shows their influence on the performance of the Weibull distribution. Because of the substantial proportion of calm winds at the site of Ngaoundere airport, we characterize the wind potential using the Weibull distribution with parameters determined by the modified maximum likelihood method. This approach is compared to the Weibull distribution with parameters determined by the maximum likelihood method, and to the hybrid distribution, which is recommended for wind potential assessment at sites with a non-zero probability of calm. Using data provided by the ASECNA Weather Service (Agency for the Safety of Air Navigation in Africa and Madagascar), we evaluate the goodness of fit of the various fitted distributions to the wind speed data using Q–Q plots, Pearson's correlation coefficient, the mean wind speed, the mean square error, the energy density and its relative error. The results show that the accuracy of the Weibull distribution with parameters determined by the modified maximum likelihood method is higher than that of the others. This approach is then used to estimate the monthly and annual energy production of the Ngaoundere airport site. The largest energy contribution is made in March, with 255.7 MWh. The results also show that a wind turbine generator installed on this particular site would not operate for at least half of the time because of the high frequency of calms. For this kind of site, the modified maximum likelihood method proposed by Seguro and Lambert in 2000 is one of the best methods for determining the Weibull parameters.
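The (unmodified) maximum likelihood method referred to above solves a fixed-point equation for the shape parameter k. The following is an illustrative sketch of that baseline method only (our own code; the authors' modified version additionally reweights the likelihood for binned data and calms, which is not reproduced here):

```python
import math

def weibull_mle(speeds, tol=1e-9, max_iter=500):
    """Maximum-likelihood Weibull fit (shape k, scale c) for positive speeds.

    The shape k solves  sum(v^k ln v)/sum(v^k) - 1/k = mean(ln v);
    it is found here by fixed-point iteration, after which the scale c
    follows in closed form.
    """
    logs = [math.log(v) for v in speeds]
    mean_log = sum(logs) / len(logs)
    k = 2.0  # typical starting guess for wind data
    for _ in range(max_iter):
        vk = [v ** k for v in speeds]
        k_new = 1.0 / (sum(w * l for w, l in zip(vk, logs)) / sum(vk) - mean_log)
        if abs(k_new - k) < tol:
            k = k_new
            break
        k = k_new
    c = (sum(v ** k for v in speeds) / len(speeds)) ** (1.0 / k)
    return k, c
```

Note that the estimator requires all speeds to be strictly positive, which is precisely why calm-heavy sites such as the one above motivate the modified or hybrid approaches.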
Cook, Paul P
2016-01-01
We investigate two-parameter solutions of sigma-models on two dimensional symmetric spaces contained in E11. Embedding such sigma-model solutions in space-time gives solutions of M* and M'-theory where the metric depends on general travelling wave functions, as opposed to harmonic functions typical in general relativity, supergravity and M-theory. Weyl reflection allows such solutions to be mapped to M-theory solutions where the wave functions depend explicitly on extra coordinates contained in the fundamental representation of E11.
Wong, Allan C. L.; Childs, Paul A.; Peng, Gang-Ding
2007-11-01
A multiplexing technique using amplitude-modulated chirped fibre Bragg gratings (AMCFBGs) is presented. This technique realises the multiplexing of spectrally overlapped AMCFBGs with identical centre Bragg wavelength and bandwidth. Since it is fully compatible with the wavelength division multiplexing scheme, the number of gratings that can be multiplexed can be increased by several times. The discrete wavelet transform is used to demodulate such multiplexed signal. A wavelet denoising technique is applied to the multiplexed signal in conjunction with the demodulation. Strain measurements are performed to experimentally demonstrate the feasibility of this multiplexing technique. The absolute error and crosstalk are measured. An application to simultaneous two-parameter sensing is also demonstrated.
Directory of Open Access Journals (Sweden)
Carlos García Mogollón
2010-07-01
Guava (Psidium guajava L.) is a tropical fruit susceptible to undesirable alterations that affect its shelf life, owing to inadequate storage and handling conditions. In this work, the shelf life of fresh guava was estimated using the Weibull probabilistic model, and the quality of the fruit was evaluated during storage under different temperature and packaging conditions. The postharvest evaluation was carried out over 15 days with guavas of the regional red variety. A completely randomized design with a factorial arrangement was used, consisting of three factors: storage time with six levels (0, 3, 6, 9, 12 and 15 days); storage temperature with two levels, ambient (37 °C and 85-90% relative humidity) and refrigeration (9 ± 2 °C and 85-90% relative humidity); and two types of packaging, a polystyrene tray with PVC plastic film, and aluminium foil. A three-point structured degree-of-satisfaction scale was used for the sensory evaluation during the storage period. The Weibull model proved adequate for predicting the shelf life of fresh guava based on the fit criteria and the acceptance and failure confidence limits. During the storage period it was observed that storage time, temperature and type of packaging have a statistically significant effect (P
Energy Technology Data Exchange (ETDEWEB)
Santhosh, T.V., E-mail: santutv@barc.gov.in [Reactor Safety Division, Bhabha Atomic Research Centre (India); Gopika, V. [Reactor Safety Division, Bhabha Atomic Research Centre (India); Ghosh, A.K. [Raja Ramanna Fellow, Department of Atomic Energy (India); Fernandes, B.G. [Department of Electrical Engineering, Indian Institute of Technology Bombay (India); Dubey, K.A. [Radiation Technology Development Division, Bhabha Atomic Research Centre (India)
2016-01-15
Highlights: • An approach for time-dependent reliability prediction of I&C cable insulation materials for use in the PSA of NPPs has been developed based on OIT and OITp measurements and Weibull theory. • OITs were determined from the measured OITp based on fundamental thermodynamic principles, and the correlations obtained from DSC and FTIR are in good agreement with the EAB. • SEM of thermally and radiation-aged samples of the insulation materials was performed to support the degradation behaviour observed from OIT and EAB measurements. • The proposed methodology is illustrated with accelerated thermal and radiation ageing data on low-voltage cables used in NPPs for I&C applications. • The time-dependent reliability predicted from the OIT based on Weibull theory will be useful in incorporating cable ageing into the PSA of NPPs. - Abstract: Instrumentation and control (I&C) cables used in nuclear power plants (NPPs) are exposed to various deteriorative environmental effects during their operational lifetime. Long-term irradiation and elevated temperature eventually result in insulation degradation. Monitoring the actual state of the cable insulation and predicting its residual service life consist of measuring properties that are directly proportional to the functionality of the cables (usually, elongation at break is used as the critical parameter). Although several condition monitoring (CM) and life estimation techniques are available, there is currently no standard methodology or approach for incorporating cable ageing effects into the probabilistic safety assessment (PSA) of NPPs. In view of this, accelerated thermal and radiation ageing of I&C cable insulation materials has been carried out, and the degradation due to thermal and radiation ageing has been assessed using oxidation induction time (OIT) and oxidation induction temperature (OITp) measurements by differential scanning
Ertl, T; Woosley, S E; Sukhbold, T; Ugliano, M
2015-01-01
Thus far, judging the fate of a massive star (a neutron star (NS) or a black hole) solely by its structure prior to core collapse has been ambiguous. Our work and previous attempts find a non-monotonic variation of successful and failed supernovae with zero-age main-sequence mass, for which no single structural parameter can serve as a good predictive measure. However, we identify two parameters computed from the pre-collapse structure of the progenitor which in combination allow a clear separation of exploding and non-exploding cases, with only a few exceptions (~1-2.5%) in our set of 621 investigated stellar models. One parameter is M4, the enclosed mass at a dimensionless entropy per nucleon of s = 4; the other is mu4 = dm/dr|_{s=4}, the mass derivative at this location. The two parameters mu4 and M4*mu4 can be directly linked to the mass-infall rate, Mdot, of the collapsing star and the electron-type neutrino luminosity of the accreting proto-NS, L_nue ~ M_ns*Mdot, which play...
Institute of Scientific and Technical Information of China (English)
Li Xiao-Ming; Chen Shuang-Quan; Li Xiang-Yang
2013-01-01
Several parameters are needed to describe the converted-wave (C-wave) moveout in processing multi-component seismic data, because of asymmetric raypaths and anisotropy. As the number of parameters increases, converted-wave data processing and analysis become more complex. This paper develops a new moveout equation with two parameters for C-waves in vertical transverse isotropy (VTI) media. The two parameters are the C-wave stacking velocity (VC2) and the squared velocity ratio (γvti) between the horizontal P-wave velocity and the C-wave stacking velocity. The new equation has fewer parameters but retains the same applicability as previous ones. The applicability of the new equation and the accuracy of the parameter estimation are checked using model and real data. The form of the new equation is the same as that for layered isotropic media. The new equation can simplify the procedure for C-wave processing and parameter estimation in VTI media, and can be applied to real C-wave processing and interpretation. Accurate VC2 and γvti can be deduced from C-wave data alone using the double-scanning method, and the velocity-ratio model is suitable for event matching between P- and C-wave data.
Directory of Open Access Journals (Sweden)
Lalit Mohan Pradhan
2014-03-01
Background: In the present competitive business scenario, researchers have developed various inventory models for deteriorating items, considering various practical situations for better inventory control. Permissible delay in payments, combined with various demand and deterioration patterns, is a relatively new concept in the development of such inventory models, which are useful for both consumers and manufacturers. Methods: In the present work, an inventory model has been developed for a three-parameter Weibull deteriorating item with ramp-type demand and salvage value under a trade credit system. A single item is considered in developing the model. Results and conclusion: The optimal order quantity, optimal cycle time and total variable cost during a cycle have been derived for the proposed inventory model. The results are illustrated with numerical examples and sensitivity analysis.
Directory of Open Access Journals (Sweden)
D. Kidmo Kaoga
2014-12-01
In this study, five numerical Weibull distribution methods, namely the maximum likelihood method (MLM), the modified maximum likelihood method, the energy pattern factor method (EPF), the graphical method (GM) and the empirical method (EM), were explored using hourly synoptic data collected from 1985 to 2013 in the district of Maroua in Cameroon. The performance analysis revealed that the MLM was the most accurate model, followed by the EPF and the GM. Furthermore, a comparison between the wind speed standard deviation predicted by the proposed models and the measured data showed that the MLM has a smaller relative error, of -3.33% on average, compared with -11.67% on average for the EPF and -8.86% on average for the GM. As a result, the MLM is recommended for estimating the scale and shape parameters for an accurate and efficient wind energy potential evaluation.
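Of the methods compared above, the empirical method (EM) is the simplest, requiring only the sample mean and standard deviation of the wind speeds. A hedged sketch (our own code, not the paper's; the exponent −1.086 is the standard EM approximation for the shape parameter):

```python
import math

def weibull_empirical(mean_speed, std_speed):
    """Empirical-method (EM) Weibull estimates from the sample mean and
    standard deviation:
        k = (sigma / v_bar)^(-1.086)
        c = v_bar / Gamma(1 + 1/k)
    """
    k = (std_speed / mean_speed) ** (-1.086)
    c = mean_speed / math.gamma(1.0 + 1.0 / k)
    return k, c
```

Its closed form explains both its popularity and its lower accuracy relative to likelihood-based fits on real data, as the comparison above reports.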
Energy Technology Data Exchange (ETDEWEB)
Lee, Jongk Uk; Lee, Kwan Hee; Kim, Sung Il; Yook, Dae Sik; Ahn, Sang Myeon [KINS, Daejeon (Korea, Republic of)
2016-05-15
Evaluation of the meteorological characteristics of a nuclear power plant site and its surrounding area should be performed when determining site suitability for safe operation of the plant. Under unexpected emergency conditions, knowledge of the site's meteorology provides the basis for estimating the environmental impacts of radioactive materials released in gaseous effluents during an accident. Among the meteorological data, wind speed and direction are the most important factors in the safety analysis of a nuclear power plant area. Wind characteristics were analyzed for the Hanbit NPP area. It was found that the Weibull parameters k and c vary from 2.56 to 4.77 and from 4.53 to 6.79, respectively, across the directional wind speed distributions. The maximum wind frequency was from the NE and the minimum from the NNW.
Directory of Open Access Journals (Sweden)
Navid Feroze
2016-03-01
The families of mixture distributions have a wide range of applications in fields such as fisheries, agriculture, botany, economics, medicine, psychology, electrophoresis, finance, communication theory, geology and zoology. They provide the flexibility needed to model failure distributions of components with multiple failure modes. Mostly, the Bayesian procedure for estimating the parameters of a mixture model is described under Type-I censoring; in particular, Bayesian analysis of mixture models under doubly censored samples has not yet been considered in the literature. The main objective of this paper is to develop Bayes estimation of inverse Weibull mixture distributions under double censoring. The posterior estimation has been conducted under gamma and inverse Lévy priors using the precautionary loss function and the weighted squared error loss function. Comparisons among the different estimators have been made based on analyses of simulated and real-life data sets.
Directory of Open Access Journals (Sweden)
L. M. Vas
2012-12-01
Short- and long-term creep behavior is one of the most important properties of polymers used in engineering applications. To study this behavior in polypropylene (PP), tensile and short-term creep measurements were performed and analyzed using a method that estimates long-term creep behavior from short-term tensile and creep tests at room temperature, viscoelastic behavior, and variable transformations. Applying Weibull-distribution-based approximations to the measured curves, predictions of the creep strain to failure as a function of creep load were determined, and the parameters were found by fitting the measurements. The upper, mean, and lower estimates, as well as the confidence interval for the means, enable designers' calculations at arbitrary creep load levels.
Directory of Open Access Journals (Sweden)
A. Taravat
2013-09-01
As a major aspect of marine pollution, oil release into the sea has serious biological and environmental impacts. Among remote sensing systems, which offer non-destructive investigation methods, synthetic aperture radar (SAR) can provide valuable synoptic information about the position and size of an oil spill owing to its wide area coverage and day/night, all-weather capabilities. In this paper we present a new automated method for oil-spill monitoring, based on the combination of a Weibull Multiplicative Model and machine learning techniques to differentiate between dark spots and the background. First, a filter based on the Weibull Multiplicative Model is applied to each sub-image. Second, the sub-image is segmented by two different neural network techniques (Pulse-Coupled Neural Networks and Multilayer Perceptron Neural Networks). As the last step, a very simple filtering process is used to eliminate false targets. The proposed approaches were tested on 20 ENVISAT and ERS2 images containing dark spots, with the same parameters used in all tests. For the overall dataset, average accuracies of 94.05% and 95.20% were obtained for the PCNN and MLP methods, respectively. The average computational time for dark-spot detection in a 256 × 256 image is about 4 s for PCNN segmentation using IDL software, currently the fastest in this field. Our experimental results demonstrate that the proposed approach is fast, robust and effective, and can be applied to future spaceborne SAR images.
Energy Technology Data Exchange (ETDEWEB)
Gorgoseo, J. J.; Rojo, A.; Camara-Obregon, A.; Dieguez-Aranda, U.
2012-07-01
The purpose of this study was to compare the accuracy of the Weibull, Johnson's SB and beta distributions, fitted with some of the most usual methods and with different fixed values for the location parameters, for describing diameter distributions in even-aged stands of Pinus pinaster, Pinus radiata and Pinus sylvestris in northwest Spain. A total of 155 permanent plots in Pinus sylvestris stands throughout Galicia, 183 plots in Pinus pinaster stands throughout Galicia and Asturias and 325 plots in Pinus radiata stands in both regions were measured to describe the diameter distributions. Parameters of the Weibull function were estimated by the Moments and Maximum Likelihood approaches, those of Johnson's SB function by Conditional Maximum Likelihood and by Knoebel and Burkhart's method, and those of the beta function by the method based on the moments of the distribution. The beta and Johnson's SB functions were slightly superior to the Weibull function for Pinus pinaster stands; the Johnson's SB and beta functions were more accurate in the best fits for Pinus radiata stands; and the best results of the Weibull and Johnson's SB functions were slightly superior to the beta function for Pinus sylvestris stands. However, all three functions are suitable for these stands given an appropriate value of the location parameter and a suitable parameter estimation method. (Author) 44 refs.
Application of Two-Parameter Extrapolation for Solution of Boundary-Value Problem on Semi-Axis
Zhidkov, E P
2000-01-01
A method for refining approximate eigenvalues and eigenfunctions for a boundary-value problem on a half-axis is suggested. To solve the problem numerically, one has to solve a problem on a finite segment [0, R] instead of the original problem on the interval [0, ∞). This replacement introduces errors in the eigenvalues and eigenfunctions. Choosing R beforehand to obtain a required accuracy is often impossible, so one has to re-solve the problem on [0, R] with a larger R. Given two eigenvalues or two eigenfunctions corresponding to different segments, the suggested method improves the accuracy of the eigenvalue and eigenfunction for the original problem by extrapolation along the segment, in an approach similar to Richardson's method. Moreover, a two-parameter extrapolation is described: a combination of extrapolation along the segment and Richardson's extrapolation along the discretization step.
DEFF Research Database (Denmark)
Boolsen, Merete Watt
The book explains the fundamental steps of the research process and applies them to selected qualitative analyses: content analysis, Grounded Theory, argumentation analysis and discourse analysis.
Energy Technology Data Exchange (ETDEWEB)
Gabriel Filho, Luis Roberto Almeida [Universidade Estadual Paulista (CE/UNESP), Tupa, SP (Brazil). Coordenacao de Estagio; Cremasco, Camila Pires [Faculdade de Tecnologia de Presidente Prudente, SP (Brazil); Seraphim, Odivaldo Jose [Universidade Estadual Paulista (FCA/UNESP), Botucatu, SP (Brazil). Fac. de Ciencias Agronomicas; Cagnon, Jose Angelo [Universidade Estadual Paulista (FEB/UNESP), Bauru, SP (Brazil). Faculdade de Engenharia
2008-07-01
The wind behavior of a region can be described by frequency distributions that provide the information and characteristics needed for a possible deployment of wind energy harvesting in the region. These characteristics, such as the annual average speed, the variance and standard deviation of the recorded speeds, and the average hourly wind power density, can be obtained from the frequency of occurrence of a given speed, which in turn must be studied through analytical expressions. The analytical function best suited to wind distributions is the Weibull density function, which can be determined by numerical methods and linear regression. Once this function is determined, all the wind characteristics mentioned above can be computed accurately. The objective of this work is to characterize the wind behavior in the region of Botucatu-SP and to determine the energy potential for the installation of wind turbines. A Young Wind monitoring anemometer from Campbell was installed at a height of 10 meters. The experiment was carried out at the Nucleus of Alternative and Renewable Energies (NEAR) of the Laboratory of Agricultural Energy of the Department of Agricultural Engineering of UNESP, Agronomy Sciences Faculty, Lageado Experimental Farm, located in the city of Botucatu-SP. The geographic location is defined by the coordinates 22°51' South latitude (S) and 48°26' West longitude (W), at an average altitude of 786 meters above sea level. The analysis used wind speed records from September 2004 to September 2005. After the frequency distribution of the hourly average wind speed was determined, the associated Weibull function was fitted, making it possible to determine the annual average wind speed (2.77 m/s), the standard deviation of the recorded speeds (0.55 m/s), …
T-stress estimation by the two-parameter approach for a specimen with a V-shaped notch
Bouledroua, O.; Elazzizi, A.; Hadj Meliani, M.; Pluvinage, G.; Matvienko, Y. G.
2017-05-01
In the present research, T-stress solutions are provided for a V-shaped notch in the case of surface defects in a pressurised pipeline. The V-shaped notch is analyzed with the use of the finite element method by the Castem2000 commercial software to determine the stress distribution ahead of the notch tip. The notch aspect ratio is varied. In contrast to a crack, it is found that the T-stress is not constant and depends on the distance from the notch tip. To estimate the T-stress in the case of a notch, a novel method is developed, inspired by the volumetric method approach proposed by Pluvinage. The method is based on averaging the T-stress over the effective distance ahead of the notch tip. The effective distance is determined by the point with the minimum stress gradient in the fracture process zone. This approach is successfully used to quantify the constraints of the notch-tip fields for various geometries and loading conditions. Moreover, the proposed T-stress estimation creates a basis for analyzing the crack path under mixed-mode loading from the viewpoint of the two-parameter fracture mechanics.
Yusuf, Madaki Umar; Bakar, Mohd. Rizam B. Abu
2016-06-01
Models for survival data that include the proportion of individuals who are not subject to the event under study are known as cure fraction models, or long-term survival models. The two most common models used to estimate the cure fraction are the mixture model and the non-mixture model. In this work, we present mixture and non-mixture cure fraction models for survival data based on the beta-Weibull distribution. This four-parameter distribution has been proposed as an alternative extension of the Weibull distribution in the analysis of lifetime data. The approach allows the inclusion of covariates in the models, with the parameters estimated under a Bayesian approach using Gibbs sampling methods.
Directory of Open Access Journals (Sweden)
Alireza Taravat
2014-12-01
Oil spills represent a major threat to ocean ecosystems and their environmental status. Previous studies have shown that Synthetic Aperture Radar (SAR), whose recording is independent of clouds and weather, can be effectively used for the detection and classification of oil spills. Dark formation detection is the first and critical stage in oil-spill detection procedures. In this paper, a novel approach for automated dark-spot detection in SAR imagery is presented. A new approach combining an adaptive Weibull Multiplicative Model (WMM) and MultiLayer Perceptron (MLP) neural networks is proposed to differentiate between dark spots and the background. The results have been compared with those of a model combining a non-adaptive WMM and pulse-coupled neural networks. The presented approach overcomes the need to set the non-adaptive WMM filter parameters by developing an adaptive WMM model, a step towards fully automatic dark-spot detection. The proposed approach was tested on 60 ENVISAT and ERS2 images containing dark spots. For the overall dataset, an average accuracy of 94.65% was obtained. Our experimental results demonstrate that the proposed approach is very robust and effective, whereas the non-adaptive WMM and pulse-coupled neural network (PCNN) model yields poor accuracies.
Kim, Kyu Rang; Kim, Mijin; Choe, Ho-Seong; Han, Mae Ja; Lee, Hye-Rim; Oh, Jae-Won; Kim, Baek-Jo
2016-07-01
Pollen is an important cause of respiratory allergic reactions. As individual sanitation has improved, allergy risk has increased, and this trend is expected to continue due to climate change. Atmospheric pollen concentration is highly influenced by weather conditions. Regression analysis and modeling of the relationships between airborne pollen concentrations and weather conditions were performed to analyze and forecast pollen conditions. Traditionally, daily pollen concentration has been estimated using regression models that describe the relationships between observed pollen concentrations and weather conditions. These models were able to forecast daily concentrations at the sites of observation, but lacked broader spatial applicability beyond those sites. To overcome this limitation, an integrated modeling scheme was developed that is designed to represent the underlying processes of pollen production and distribution. A maximum potential for airborne pollen is first determined using the Weibull probability density function. Then, daily pollen concentration is estimated using multiple regression models. Daily risk grade levels are determined based on the risk criteria used in Korea. The mean percentages of agreement between the observed and estimated levels were 81.4-88.2% and 92.5-98.5% for oak and Japanese hop pollens, respectively. The new models estimated daily pollen risk more accurately than the original statistical models because of the newly integrated biological response curves. However, they overestimated the seasonal mean concentration and did not simulate all of the peak concentrations. This issue could be resolved by adding more variables that affect the prevalence and internal maturity of pollens.
Energy Technology Data Exchange (ETDEWEB)
Neilson, Henry J., E-mail: hjn2@case.edu [Case Western Reserve University, 10900 Euclid Ave, Cleveland, OH (United States); Petersen, Alex S.; Cheung, Andrew M.; Poon, S. Joseph; Shiflet, Gary J. [University of Virginia, 395 McCormick Road, P.O. Box 400745, Charlottesville, VA 22904 (United States); Widom, Mike [Carnegie Mellon University, 5000 Forbes Avenue, Wean Hall 3325, Pittsburgh, PA 15213 (United States); Lewandowski, John J. [Case Western Reserve University, 10900 Euclid Ave, Cleveland, OH (United States)
2015-05-14
In this study, the variations in mechanical properties of Ni-Co-Ta-based metallic glasses have been analyzed. Three different chemistries of metallic glass ribbons were analyzed: Ni₄₅Ta₃₅Co₂₀, Ni₄₀Ta₃₅Co₂₀Nb₅, and Ni₃₀Ta₃₅Co₃₀Nb₅. These alloys possess very high density (approximately 12.5 g/cm³) and very high strength (e.g. >3 GPa). Differential scanning calorimetry (DSC) and x-ray diffraction (XRD) were used to characterize the amorphicity of the ribbons. Mechanical properties were measured via a combination of Vickers hardness, bending strength, and tensile strength for each chemistry. At least 50 tests were conducted for each chemistry and each test technique in order to quantify the variability of properties using both 2- and 3-parameter Weibull statistics. The variability in properties and their source(s) were compared to that of other engineering materials, while the nature of deformation via shear bands as well as fracture surface features have been determined using scanning electron microscopy (SEM). Toughness, the role of defects, and volume effects are also discussed.
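The 2-parameter Weibull statistics used here are commonly obtained from ranked strength data by median-rank linear regression. A sketch under that assumption, using synthetic strengths rather than the measured ribbon data:

```python
import math

def weibull_modulus(strengths):
    """Median-rank regression for a two-parameter Weibull fit:
    ln(-ln(1 - F_i)) = m * ln(s_i) - m * ln(s0), with Bernard's
    plotting positions F_i = (i - 0.3) / (n + 0.4)."""
    xs = sorted(strengths)
    n = len(xs)
    X = [math.log(s) for s in xs]
    Y = [math.log(-math.log(1.0 - (i + 1 - 0.3) / (n + 0.4))) for i in range(n)]
    xb, yb = sum(X) / n, sum(Y) / n
    m = sum((x - xb) * (y - yb) for x, y in zip(X, Y)) / sum((x - xb) ** 2 for x in X)
    s0 = math.exp(xb - yb / m)  # characteristic strength from the intercept
    return m, s0

# Synthetic strengths taken as quantiles of a Weibull with m = 5, s0 = 3000 MPa:
data = [3000.0 * (-math.log(1.0 - (i + 0.5) / 50.0)) ** (1.0 / 5.0) for i in range(50)]
m_hat, s0_hat = weibull_modulus(data)
```

Maximum likelihood fitting, also common for Weibull strength statistics, generally gives tighter estimates but requires an iterative solve.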
Polyzois, Gregory L; Lagouvardos, Panagiotis E; Frangou, Maria J
2012-06-01
The aim of this study was to (1) investigate the flexural strengths of three denture resins (heat-polymerised, photopolymerised and microwaved) and how they were affected by relining with auto- and visible-light-polymerised hard reliners, (2) investigate the bond strengths between denture resins and hard reliners and (3) interpret the results of both tests using Weibull analysis. Specimens (65 × 10 × 2.5 mm) of denture resins, and of relined and bonded combinations, were tested using a four-point bending test in a universal testing machine at a crosshead speed of 5 mm/min. Ten specimens for each bulk resin and each denture resin-reliner combination, for a total of 150, were tested. Statistical analysis indicated significant differences between bulk materials (p < 0.001) and between reliners (p < 0.001) for the flexural and bond strength tests. It was concluded that (1) the four-point flexural strength differed between the denture base materials, (2) flexural strength differed between bulk and relined bases and between bases relined with autopolymerised and photopolymerised reliners, (3) flexural strength differed among relined denture bases and (4) bond strengths differed among relined denture bases. © 2011 The Gerodontology Society and John Wiley & Sons A/S.
Directory of Open Access Journals (Sweden)
Ewa Wąsik
2016-06-01
The article presents the reliability of the municipal sewage treatment plant serving the Niepołomice Industrial Zone. The analysis is based on five pollution indicators: BOD5, CODCr, total suspended solids, total nitrogen and total phosphorus. Samples of treated sewage were collected once a month from January 2011 to December 2013. The paper analyzes the removal effectiveness of the individual indicators and identifies their basic statistical characteristics. The studies showed that the wastewater treatment plant in Niepołomice removes pollutants with high efficiency: mean effectiveness of 98.8% for BOD5, 97.0% for CODCr, 97.3% for total suspended solids, 88.6% for total nitrogen, and 97.0% for total phosphorus. The reliability forecast for the plant, based on fitting a Weibull model to the distributions of the indicators in treated wastewater, showed that the facility meets the removal requirements 365 days a year for BOD5, CODCr, suspended solids and total phosphorus, and 336 days a year for total nitrogen.
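The "days per year" reliability figures follow from evaluating the fitted Weibull CDF of an effluent indicator at its permitted limit. A minimal sketch (the limit and parameters below are illustrative, not the plant's fitted values):

```python
import math

def expected_compliant_days(limit, k, lam):
    """Expected days per year a Weibull(k, lam)-distributed effluent
    indicator stays at or below its permitted limit:
    365 * P(X <= limit) = 365 * (1 - exp(-(limit/lam)**k))."""
    return 365.0 * (1.0 - math.exp(-(limit / lam) ** k))

# Illustrative: a BOD5 limit of 25 mg/l against a fitted Weibull(k=1.8, lam=8.0):
days = expected_compliant_days(25.0, 1.8, 8.0)
```

A forecast of "365 days" then simply means the fitted exceedance probability rounds to less than one day per year.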
Gross, Bernard
1996-01-01
Material characterization parameters obtained from naturally flawed specimens are necessary for reliability evaluation of non-deterministic advanced ceramic structural components. The least squares best fit method is applied to the three parameter uniaxial Weibull model to obtain the material parameters from experimental tests on volume or surface flawed specimens subjected to pure tension, pure bending, four point or three point loading. Several illustrative example problems are provided.
Energy Technology Data Exchange (ETDEWEB)
Han, Dong, E-mail: radon.han@gmail.com; Williamson, Jeffrey F. [Medical Physics Graduate Program, Department of Radiation Oncology, Virginia Commonwealth University, Richmond, Virginia 23298 (United States); Siebers, Jeffrey V. [Department of Radiation Oncology, University of Virginia, Charlottesville, Virginia 22908 (United States)
2016-01-15
Purpose: To evaluate the accuracy and robustness of a simple, linear, separable, two-parameter model (basis vector model, BVM) in mapping proton stopping powers via dual energy computed tomography (DECT) imaging. Methods: The BVM assumes that photon cross sections (attenuation coefficients) of unknown materials are linear combinations of the corresponding radiological quantities of dissimilar basis substances (i.e., polystyrene, CaCl2 aqueous solution, and water). The authors have extended this approach to the estimation of electron density and mean excitation energy, which are required parameters for computing proton stopping powers via the Bethe–Bloch equation. The authors compared the stopping power estimation accuracy of the BVM with that of a nonlinear, nonseparable photon cross section Torikoshi parametric fit model (VCU tPFM) as implemented by the authors and by Yang et al. ["Theoretical variance analysis of single- and dual-energy computed tomography methods for calculating proton stopping power ratios of biological tissues," Phys. Med. Biol. 55, 1343–1362 (2010)]. Using an idealized monoenergetic DECT imaging model, proton ranges estimated by the BVM, VCU tPFM, and Yang tPFM were compared to International Commission on Radiation Units and Measurements (ICRU) published reference values. The robustness of the stopping power prediction accuracy to tissue composition variations was assessed for both the BVM and VCU tPFM. The sensitivity of accuracy to CT image uncertainty was also evaluated. Results: Based on the authors' idealized, error-free DECT imaging model, the root-mean-square error of BVM proton stopping power estimation for 175 MeV protons relative to ICRU reference values for 34 ICRU standard tissues is 0.20%, compared to 0.23% and 0.68% for the Yang and VCU tPFM models, respectively. The range estimation errors were less than 1 mm for both the BVM and Yang tPFM models. The BVM estimation accuracy is not dependent on
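Because the BVM is linear and separable, the two basis weights at each voxel follow from a 2×2 linear solve of the measured low- and high-energy attenuations against the basis attenuations. A schematic sketch with made-up attenuation values (not the paper's basis data):

```python
def bvm_weights(mu_low, mu_high, basis_low, basis_high):
    """Solve mu_E = c1*b1_E + c2*b2_E at the two DECT energies for (c1, c2).
    basis_low / basis_high each hold (b1, b2) at the low / high energy."""
    (b1L, b2L), (b1H, b2H) = basis_low, basis_high
    det = b1L * b2H - b2L * b1H        # nonzero when the bases are dissimilar
    c1 = (mu_low * b2H - b2L * mu_high) / det
    c2 = (b1L * mu_high - mu_low * b1H) / det
    return c1, c2

# Made-up basis attenuations; a 0.3/0.7 mixture is recovered exactly:
c1, c2 = bvm_weights(0.193, 0.157, basis_low=(0.20, 0.19), basis_high=(0.15, 0.16))
```

The weights would then feed linear combinations of the basis materials' electron densities and (log) mean excitation energies in the Bethe-Bloch stopping power.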
Lin, Wei-Shao; Ercoli, Carlo; Feng, Changyong; Morton, Dean
2012-07-01
The objective of this study was to compare the effect of veneering porcelain (monolithic or bilayer specimens) and core fabrication technique (heat-pressed or CAD/CAM) on the biaxial flexural strength and Weibull modulus of leucite-reinforced and lithium-disilicate glass ceramics. In addition, the effect of veneering technique (heat-pressed or powder/liquid layering) for zirconia ceramics on the biaxial flexural strength and Weibull modulus was studied. Five ceramic core materials (IPS Empress Esthetic, IPS Empress CAD, IPS e.max Press, IPS e.max CAD, IPS e.max ZirCAD) and three corresponding veneering porcelains (IPS Empress Esthetic Veneer, IPS e.max Ceram, IPS e.max ZirPress) were selected for this study. Each core material group contained three subgroups based on the core material thickness and the presence of corresponding veneering porcelain: 1.5 mm core material only (subgroup 1.5C), 0.8 mm core material only (subgroup 0.8C), and a 1.5 mm core/veneer group of 0.8 mm core with 0.7 mm corresponding veneering porcelain applied with a powder/liquid layering technique (subgroup 0.8C-0.7VL). The ZirCAD group had one additional 1.5 mm core/veneer subgroup with 0.7 mm heat-pressed veneering porcelain (subgroup 0.8C-0.7VP). The biaxial flexural strengths were compared for each subgroup (n = 10) according to ISO standard 6872:2008 with ANOVA and Tukey's post hoc multiple comparison test (p ≤ 0.05). The reliability of strength was analyzed with the Weibull distribution. For all core materials, the 1.5 mm core/veneer subgroups (0.8C-0.7VL, 0.8C-0.7VP) had significantly lower mean biaxial flexural strengths (p < 0.05), and subgroup 0.8C-0.7VL had lower strength (p = 0.004) than subgroup 0.8C-0.7VP. Nonetheless, both veneered ZirCAD groups showed greater flexural strength than the monolithic Empress and e.max groups, regardless of core thickness and fabrication technique. Comparing fabrication techniques, Empress Esthetic/CAD and e.max Press/CAD had similar biaxial flexural strengths (p = 0.28 for the Empress pair; p = 0
Doungmo Goufo, Emile Franc
2016-08-01
After having the issues of singularity and locality addressed recently in mathematical modelling, another question regarding the description of natural phenomena was raised: How influential is the second parameter β of the two-parameter Mittag-Leffler function Eα,β(z), z∈ℂ? To answer this question, we generalize the newly introduced one-parameter derivative with non-singular and non-local kernel [A. Atangana and I. Koca, Chaos, Solitons Fractals 89, 447 (2016); A. Atangana and D. Baleanu (e-print)] by developing a similar two-parameter derivative with non-singular and non-local kernel based on Eα,β(z). We exploit the Agarwal/Erdelyi higher transcendental functions together with their Laplace transforms to explicitly establish the Laplace transforms of the two-parameter derivatives, necessary for solving related fractional differential equations. An explicit expression of the associated two-parameter fractional integral is also established. Concrete applications are made to the atmospheric convection process by using the Lorenz non-linear simple system. An existence result for the model is provided and a numerical scheme established. As expected, solutions exhibit chaotic behavior for α less than 0.55, and this chaos is not interrupted by the impact of β. Rather, this second parameter seems to indirectly squeeze and rotate the solutions, giving an impression of twisting. The whole graphic seems to have completely changed its orientation to a particular direction. This is a great observation that clearly shows the substantial impact of the second parameter of Eα,β(z), certainly opening new doors to modeling with two-parameter derivatives.
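For readers who want to experiment with Eα,β(z) directly, its defining power series Σₙ zⁿ/Γ(αn+β) can be summed naively for moderate arguments. A sketch (plain series truncation; not a production algorithm for large |z|):

```python
import math

def mittag_leffler(alpha, beta, z, tol=1e-15):
    """Naive series evaluation of E_{alpha,beta}(z) = sum_n z**n / Gamma(alpha*n + beta).
    Stops once terms are negligible or Gamma's argument would overflow (> ~170)."""
    total, n = 0.0, 0
    while True:
        g = alpha * n + beta
        if g > 170.0:                     # math.gamma overflows past ~171
            break
        term = z ** n / math.gamma(g)
        total += term
        if n > 0 and abs(term) < tol:
            break
        n += 1
    return total
```

Special cases give quick checks: E₁,₁(z) = eᶻ and E₂,₁(z) = cosh(√z) for z ≥ 0.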
1981-12-01
For the Weibull pdf, the likelihood function evaluated at x1, x2, …, xn can be represented by L = f(x1, x2, …, xn; k, θ, c) (Eq. 16). [Appendix listing residue: programs to calculate the Cramér-von Mises critical values and to evaluate the endpoints.]
Pal, Suvra; Balakrishnan, N
2017-05-16
In this paper, we develop likelihood inference based on the expectation maximization (EM) algorithm for the Box-Cox transformation cure rate model, assuming the lifetimes follow a Weibull distribution. A simulation study is carried out to demonstrate the performance of the proposed estimation method. Through Monte Carlo simulations, we also study the effect of model mis-specification on the estimate of the cure rate. Finally, we analyze a well-known melanoma data set with the model and the inferential method developed here.
Directory of Open Access Journals (Sweden)
Gloria Bueno-García
2011-01-01
Under the established experimental conditions, it is shown that the survival curves of Pseudomonas aeruginosa followed non-linear kinetics, with a rapid initial drop in the cell count followed by a tail caused by a decrease in the inactivation rate. The Weibull model accurately described the inactivation kinetics. The statistical parameters that best explain the observed frequency were estimated: mean, variance and skewness coefficient. For Pseudomonas aeruginosa, the b value depends on temperature while the n value is independent of it. The Weibull distribution model was able to predict the heating time needed to inactivate eight log10 cycles and to estimate the equivalent heating time for the same surviving proportion of P. aeruginosa at other temperatures.
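Under the Weibull survival model used here, log10(N/N0) = -(t/b)^n, the heating time for a given number of log10 reductions has a closed form. A sketch with hypothetical b and n values (not the paper's fitted parameters):

```python
def weibull_heating_time(b, n, log_reductions):
    """Time to reach `log_reductions` decades of inactivation under the
    Weibull survival model log10(N/N0) = -(t/b)**n,
    i.e. t = b * log_reductions**(1/n)."""
    return b * log_reductions ** (1.0 / n)

# Hypothetical parameters: b = 2.5 min, n = 0.8, eight-log10-cycle target:
t8 = weibull_heating_time(2.5, 0.8, 8.0)
```

With n = 1 the model reduces to classical first-order (log-linear) inactivation, where the time is simply b times the number of decades.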
Directory of Open Access Journals (Sweden)
S. Lakshmi
2016-12-01
In this paper, we introduce the probability density function of a four-variate Weibull distribution. A multivariate Weibull survival function is used for four variables; from it, the probability density function and the cumulative distribution function are derived. Ghrelin may affect reproductive function in animals and humans. In the application, under the experimental conditions of an acute injection of ghrelin (1 µg/kg) to normal women, basal and GnRH-induced LH and FSH secretion were not affected, suggesting that ghrelin does not play a major physiological role in gonadotrophin secretion in women. In the mathematical part, we found that the survival function curve drops suddenly in the mid-luteal phase compared with the other phases. The pdf curve is suppressed in the late follicular phase and increases at 7 min; the pdf for the early follicular phase of the control cycle increases from 4 min. The pdf curves for the early follicular phase with ghrelin administration and for the mid-luteal phase with ghrelin and GnRH also increase at 5 and 3 minutes, respectively.
Directory of Open Access Journals (Sweden)
Jianwei Yang
2016-06-01
In order to address the reliability assessment of a braking system component of high-speed electric multiple units, this article provides maximum likelihood and Bayes estimation for the two-parameter exponential distribution under a type-I life test. First, we evaluate the failure probability value according to the classical estimation method and obtain the maximum likelihood estimates of the parameters of the two-parameter exponential distribution using the modified likelihood function. On the other hand, based on Bayesian theory, this article selects the beta and gamma distributions as priors, combines them with the modified maximum likelihood function, and applies a Markov chain Monte Carlo (MCMC) algorithm to parameter assessment under the Bayes estimation method for the two-parameter exponential distribution, so that two reliability mathematical models of the electromagnetic valve are obtained. Finally, through the type-I life test, the failure rates according to the maximum likelihood estimation and the MCMC-based Bayes estimation method are, respectively, 2.650 × 10−5 and 3.037 × 10−5. Compared with the observed failure rate of the electromagnetic valve, 3.005 × 10−5, this shows that the Bayes method can use an MCMC algorithm to estimate the reliability of the two-parameter exponential distribution, and that the Bayes estimate is closer to the observed value. Thus, by fully integrating multi-source information, the Bayes estimation method can better and more precisely estimate the parameters, providing a theoretical basis for the safe operation of high-speed electric multiple units.
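For reference, the complete-sample maximum likelihood estimates for the two-parameter exponential distribution are closed-form; the type-I censored and MCMC-based Bayes treatments in the article build on this baseline. A minimal sketch with made-up lifetimes:

```python
def two_param_exponential_mle(data):
    """Complete-sample MLEs for f(x) = (1/theta) * exp(-(x - mu)/theta), x >= mu:
    location mu_hat = min(x_i), scale theta_hat = mean(x_i) - mu_hat."""
    mu = min(data)
    theta = sum(data) / len(data) - mu
    return mu, theta

# Made-up lifetimes (arbitrary time units, illustrative only):
mu_hat, theta_hat = two_param_exponential_mle([2.0, 3.0, 5.0, 6.0])
```

Under type-I censoring, the likelihood also carries a survival term exp(-(T - mu)/theta) for each unit unfailed at the test cutoff T, which is what the article's modified likelihood handles.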
Cortellini, Davide; Canale, Angelo; Souza, Rodrigo O A; Campos, Fernanda; Lima, Julia C; Özcan, Mutlu
2015-12-01
The aim of this study was to evaluate the durability of lithium disilicate crowns bonded to abutments prepared with two types of finish lines after long-term cyclic loading. Pressed lithium disilicate all-ceramic molar crowns were bonded (Variolink II) to epoxy abutments (height: 5.5 mm, Ø: 7.5 mm, conicity: 6°) (N = 20) with either knife-edge (KE) or large chamfer (LC) finish lines. Each assembly was submitted to cyclic loading (1,200,000×; 200 N; 1 Hz) in water and then tested until fracture in a universal testing machine (1 mm/min). Failure types were classified and further evaluated under stereomicroscope and SEM. The data (N) were analyzed using one-way ANOVA. Weibull distribution values, including the Weibull modulus (m), characteristic strength (σ0), strength at 5% (σ5%) and 1% (σ1%) probability of failure, and the correlation coefficient, were calculated. Type of finish line did not significantly influence the mean fracture strength of pressed ceramic crowns (KE: 1655 ± 353 N; LC: 1618 ± 263 N) (p = 0.7898). The Weibull distribution presented a lower shape value (m) for KE (m = 5.48; CI: 3.5 to 8.6) compared to LC (m = 7.68; CI: 5.2 to 11.3). Characteristic strengths (σ0) (KE: 1784.9 N; LC: 1712.1 N) were higher than the strengths at 5% probability of failure (σ5%) (KE: 1038.1 N; LC: 1163.4 N), followed by those at 1% (σ1%) (KE: 771 N; LC: 941.1 N), with a correlation coefficient of 0.966 for KE and 0.924 for LC. Type V failures (severe fracture of the crown and/or tooth) were the most common in both groups. SEM findings showed that fractures originated mainly from the cement/ceramic interface at the occlusal side of the crowns. Lithium disilicate ceramic crowns bonded onto abutment teeth with a KE preparation resulted in fracture strength similar to those bonded on abutments with an LC finish line. Pressed lithium disilicate ceramic crowns may not require invasive finish line preparations, since finish line type did not impair the strength after aging conditions.
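The Weibull modulus m and characteristic strength σ0 reported above are commonly estimated by linear regression on a double-log transform of empirical failure probabilities. The sketch below follows that standard graphical approach (not necessarily the exact procedure used in the study), with simulated fracture loads in place of real data:

```python
import math, random

def weibull_fit_regression(strengths):
    """Estimate Weibull modulus m and characteristic strength sigma0 by
    linear regression of y = ln(-ln(1 - F_i)) on x = ln(sigma_i), using
    empirical failure probabilities F_i = (i + 0.5)/n for sorted strengths."""
    s = sorted(strengths)
    n = len(s)
    x = [math.log(v) for v in s]
    y = [math.log(-math.log(1.0 - (i + 0.5) / n)) for i in range(n)]
    xb, yb = sum(x) / n, sum(y) / n
    m = (sum((xi - xb) * (yi - yb) for xi, yi in zip(x, y))
         / sum((xi - xb) ** 2 for xi in x))
    sigma0 = math.exp(xb - yb / m)  # intercept of the fit is -m * ln(sigma0)
    return m, sigma0

random.seed(0)
# simulated fracture loads drawn from a Weibull law (m = 7, sigma0 = 1700 N)
loads = [1700.0 * (-math.log(random.random())) ** (1.0 / 7.0) for _ in range(1000)]
m_hat, sigma0_hat = weibull_fit_regression(loads)
print(m_hat, sigma0_hat)  # near 7 and 1700
```

A higher m means less scatter in strength, which is why the LC group (m = 7.68) is more predictable than KE (m = 5.48) even though their mean strengths do not differ.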
Directory of Open Access Journals (Sweden)
Marlen Navarro
Full Text Available With the objective of assessing the vigour of Albizia lebbeck seeds by evaluating seedling emergence through the modified Weibull function, sowing was carried out under three environmental conditions and at different seed storage times. The design was completely randomized, with a factorial arrangement. Analysis of variance was performed for the parameters M (maximum cumulative emergence), k (emergence rate) and Z (lag before the onset of emergence) of the modified Weibull function. From six months of storage onwards (44.1%), a sharp loss in the percentage M was observed in the nursery (A), with slight variations in the chamber (C) compared with A and B (shade house). The dispersion range of the parameter k was 0.4-2.6, 0.29-1.9 and 0.5-1.4% emergence d-1 for the evaluations carried out in A, B and C, respectively. From the analysis of Z it was interpreted that the time to onset of emergence, irrespective of the sowing environment, fell between 3.0 and 7.3 days after sowing. In the nursery under full sun, in the evaluation at 6 months after the start of storage, the best results for the biological parameters of the Weibull equation were obtained, which allowed a global analysis indicating a high degree of vigour in A. lebbeck seeds compared with the remaining evaluations
Directory of Open Access Journals (Sweden)
Petros Damos
Full Text Available Temperature implies contrasting biological causes of demographic aging in poikilotherms. In this work, we used reliability theory to describe the consistency of mortality with age in moth populations and to show that differentiation in hazard rates is related to extrinsic environmental causes such as temperature. Moreover, experiments that manipulate extrinsic mortality were used to distinguish temperature-related death rates and to assess the pertinence of the Weibull aging model. The Newton-Raphson optimization method was applied to calculate parameters for small samples of ages at death, estimating the maximum likelihood surfaces using scored gradient vectors and the Hessian matrix. The study reveals for the first time that the Weibull function is able to describe contrasting biological causes of demographic aging for moth populations maintained at different temperature regimes. We demonstrate that under favourable conditions the insect death rate accelerates as age advances, in contrast to the extreme temperatures, at which each individual drifts toward death in a linear fashion and has a constant chance of dying. Moreover, the slope of the hazard rate shifts towards a constant initial rate, a pattern demonstrated by systems that are not wearing out (e.g. non-aging systems), since failure, or death, is a random event independent of time. This finding may appear surprising because, traditionally, it was mostly taken as a rule that in an aging population the force of mortality increases exponentially until all individuals have died. Moreover, in contrast to other studies, we have not observed any typical decelerating aging patterns at late life (mortality leveling-off), but rather accelerated hazard rates at optimum temperatures and a stabilized increase at the extremes. In most cases, the increase in aging-related mortality was simulated reasonably well by the Weibull survivorship model that was applied.
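The contrast the abstract draws, an accelerating death rate versus a constant chance of dying, corresponds to the shape parameter of the Weibull hazard function. A small generic illustration (not the authors' fitted parameters):

```python
def weibull_hazard(t, beta, eta):
    """Weibull hazard (instantaneous death rate):
    h(t) = (beta/eta) * (t/eta)**(beta - 1).
    beta > 1: hazard grows with age (demographic aging at favourable temperatures);
    beta = 1: constant hazard, death is a random age-independent event;
    beta < 1: hazard declines with age."""
    return (beta / eta) * (t / eta) ** (beta - 1.0)

for beta in (2.0, 1.0, 0.5):  # illustrative shapes, eta = 10 time units
    print(beta, weibull_hazard(1.0, beta, 10.0), weibull_hazard(10.0, beta, 10.0))
```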
Institute of Scientific and Technical Information of China (English)
Fan Hong-Yi; Hu Li-Yun
2009-01-01
This paper proves a new theorem on the relationship between the two-parameter Radon transform of an optical field's Wigner function and the optical Fresnel transform of the field, i.e., when an input field ψ(x') propagates through an optical [D (-B); (-C) A] system, the energy density of the output field is equal to the Radon transform of the Wigner function of the input field, where the Radon transform parameters are D and B. We prove this theorem in both the spatial domain and the frequency domain; in the latter case the Radon transform parameters are A and C.
Directory of Open Access Journals (Sweden)
Valliathal M.
2013-01-01
Full Text Available This paper deals with the effects of inflation and time discounting on an inventory model with a general ramp-type demand rate, a time-dependent (Weibull) deterioration rate and partial backlogging of unsatisfied demand. The model is studied under a replenishment policy starting with shortages, under two different types of backlogging rates, and a comparative study of the two is also provided. We then use the computer software MATLAB to find the optimal replenishment policies. The duration of positive inventory level is taken as the decision variable to minimize the total cost of the proposed system. Numerical examples are given to illustrate the solution procedure. Finally, the sensitivity of the optimal solution to changes in the values of different system parameters is also studied.
Energy Technology Data Exchange (ETDEWEB)
Lienkamp, M. (Technische Hochschule Darmstadt, Fachgebiet Physikalische Metallkunde, Fachbereich Materialwissenschaft (Germany)); Exner, H.E. (Technische Hochschule Darmstadt, Fachgebiet Physikalische Metallkunde, Fachbereich Materialwissenschaft (Germany))
1993-04-01
Present test methods used to determine the strength distribution of high performance fibres are either time consuming or not very reliable. A method is used which enables the derivation of the strength distribution function from one single tensile test. The load/elongation diagram of a bundle of fibres is taken from an elongation-controlled tensile test. From the ratio of the measured load to a fictive load, necessary to obtain an identical elongation in the bundle assuming all fibres are intact, the fraction of broken fibres for each point of the load/elongation diagram is determined. From this the strength distribution function and the Weibull parameters of the fibres can be calculated. Application of this simple but very effective method is demonstrated for a schematic example and for three fibre materials (carbon, aramid and ceramic fibres). (orig.)
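The ratio-of-loads idea described above can be sketched as follows, with a simulated bundle whose fibre failure strains follow a hypothetical Weibull law; the broken fraction recovered from the measured-to-fictive load ratio traces the Weibull CDF of the fibre strengths:

```python
import math, random

random.seed(1)
m_true, e0 = 5.0, 0.02  # hypothetical Weibull shape / scale of fibre failure strain
thresholds = [e0 * (-math.log(random.random())) ** (1.0 / m_true)
              for _ in range(100000)]

def broken_fraction(eps, stiffness=1.0):
    """Fraction of broken fibres at elongation eps, obtained from the ratio
    of the measured bundle load (only intact fibres carry load) to the
    fictive load the bundle would carry if every fibre were still intact."""
    measured = stiffness * eps * sum(1 for t in thresholds if t > eps)
    fictive = stiffness * eps * len(thresholds)
    return 1.0 - measured / fictive

# at eps = e0 the Weibull CDF gives F = 1 - exp(-1) ~ 0.632
frac = broken_fraction(e0)
print(frac)
```

Once the broken fraction is known at each elongation, the Weibull parameters can be read off by the usual double-log regression against ln(eps).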
Energy Technology Data Exchange (ETDEWEB)
Sun, Huarui, E-mail: huarui.sun@bristol.ac.uk; Bajo, Miguel Montes; Uren, Michael J.; Kuball, Martin [Center for Device Thermography and Reliability (CDTR), H. H. Wills Physics Laboratory, University of Bristol, Tyndall Avenue, Bristol BS8 1TL (United Kingdom)
2015-01-26
Gate leakage degradation of AlGaN/GaN high electron mobility transistors under OFF-state stress is investigated using a combination of electrical, optical, and surface morphology characterizations. The generation of leakage “hot spots” at the edge of the gate is found to be strongly temperature accelerated. The time for the formation of each failure site follows a Weibull distribution with a shape parameter in the range of 0.7–0.9 from room temperature up to 120 °C. The average leakage per failure site is only weakly temperature dependent. The stress-induced structural degradation at the leakage sites exhibits a temperature dependence in the surface morphology, which is consistent with a surface defect generation process involving temperature-associated changes in the breakdown sites.
Directory of Open Access Journals (Sweden)
J. Piątkowski
2012-12-01
Full Text Available Purpose: The main purpose of the study was to determine a methodology for estimating operational reliability based on the statistical results of abrasive wear testing. Design/methodology/approach: For the research, a traditional tribological system, i.e. a friction pair of AlSi17CuNiMg silumin in contact with spheroidal graphite cast iron of EN-GJN-200 grade, was chosen. Conditions of dry friction were assumed. This system was chosen based on the mechanical cooperation between the cylinder (silumin) and piston rings (spheroidal graphite cast iron) in conventional internal combustion piston engines with spark ignition. Findings: Using material parameters of the cylinder and piston rings, the nominal losses qualifying the cylinder for repair and the maximum weight losses that can be tolerated were determined. Based on the theoretical number of engine revolutions to repair and the stress acting on the cylinder bearing surface, the maximum distance that the motor vehicle can travel before seizure of the cylinder occurs was calculated. These results were the basis for a statistical analysis carried out with the Weibull modulus, the end result of which was the estimation of material reliability (the survival probability of the tribological system) and the determination of a pre-operation warranty period of the tribological system. Research limitations/implications: The analysis of the Weibull distribution modulus used to estimate the reliability of the tribological cylinder-ring system enabled the determination of an approximate theoretical time of failure-free running of the combustion engine. Originality/value: The results are valuable statistical data, and the methodology proposed in this paper can be used to determine a theoretical lifetime of the combustion engine.
Evaluation of the reliability of Si3N4-Al2O3 -CTR2O3 ceramics through Weibull analysis
Directory of Open Access Journals (Sweden)
Santos Claudinei dos
2003-01-01
Full Text Available The objective of this work has been to compare the reliability of two Si3N4 ceramics, with Y2O3/Al2O3 or CTR2O3/Al2O3 mixtures as additives, in regard to their 4-point bending strength, and to confirm the potential of the rare earth oxide mixture, CTR2O3, produced at FAENQUIL, as an alternative, low-cost sinter additive to pure Y2O3 in the sintering of Si3N4 ceramics. The oxide mixture CTR2O3 is a solid solution formed mainly by Y2O3, Er2O3, Yb2O3 and Dy2O3 with other minor constituents and is obtained at a cost of only 20% of pure Y2O3. Samples were sintered by a gas pressure sintering process at 1900 °C under a nitrogen pressure of 1.5 MPa and an isothermal holding time of 2 h. The obtained materials were characterized by their relative density, phase composition and bending strength. The Weibull analysis was used to describe the reliability of these materials. Both materials presented relative densities higher than 99.5% t.d., β-Si3N4 and Y3Al5O12 (YAG) as crystalline phases, and bending strengths higher than 650 MPa, thus demonstrating similar behaviors regarding their physical, chemical and mechanical characteristics. The statistical analysis of their strength also showed similar results for both materials, with Weibull moduli m of about 15 and characteristic stress values σ0 of about 700 MPa. These results confirmed the possibility of using the rare earth oxide mixture, CTR2O3, as a sinter additive for high performance Si3N4 ceramics, without prejudice to the mechanical properties when compared to Si3N4 ceramics sintered with pure Y2O3.
Hausmann, Michael; Doelle, Juergen; Arnold, Armin; Stepanow, Boris; Wickert, Burkhard; Boscher, Jeannine; Popescu, Paul C.; Cremer, Christoph
1992-07-01
Laser fluorescence activated slit-scan flow cytometry offers an approach to fast, quantitative characterization of chromosomes based on morphological features. It can be applied to screening for chromosomal abnormalities. We give a preliminary report on the development of the Heidelberg slit-scan flow cytometer. Time-resolved measurement of the fluorescence intensity along the chromosome axis can be registered simultaneously for two parameters when the chromosome passes perpendicularly through a narrowly focused laser beam combined with a detection slit in the image plane. So far, automated data analysis has been performed off-line on a PC. In its final configuration, the Heidelberg slit-scan flow cytometer will achieve on-line data analysis that allows electro-acoustical sorting of chromosomes of interest. Interest is high in the agricultural field in studying chromosome aberrations that influence litter size in pig (Sus scrofa domestica) breeding. Slit-scan measurements have been performed to characterize pig chromosomes; we present results for chromosome 1 and a translocation chromosome 6/15.
Directory of Open Access Journals (Sweden)
Ingo W Nader
Full Text Available Parameters of the two-parameter logistic model are generally estimated via the expectation-maximization algorithm, which improves initial values for all parameters iteratively until convergence is reached. Effects of initial values are rarely discussed in item response theory (IRT), but initial values were recently found to affect item parameters when estimating the latent distribution with full non-parametric maximum likelihood. However, this method is rarely used in practice. Hence, the present study investigated effects of initial values on item parameter bias and on recovery of item characteristic curves in BILOG-MG 3, a widely used IRT software package. Results showed notable effects of initial values on item parameters. For tighter convergence criteria, effects of initial values decreased, but item parameter bias increased, and the recovery of the latent distribution worsened. For practical application, it is advised to use the BILOG default convergence criterion with appropriate initial values when estimating the latent distribution from data.
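The two-parameter logistic (2PL) model discussed above specifies the probability of a correct response through an item characteristic curve; a minimal generic sketch (hypothetical discrimination a and difficulty b, not values from the study):

```python
import math

def icc_2pl(theta, a, b):
    """Item characteristic curve of the two-parameter logistic (2PL) model:
    P(correct | ability theta) = 1 / (1 + exp(-a * (theta - b))),
    with item discrimination a and item difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# at theta == b the probability of a correct response is exactly 0.5
print(icc_2pl(0.0, 1.2, 0.0))  # 0.5
print(icc_2pl(2.0, 1.2, 0.0))  # well above 0.5 for an able examinee
```

These are the curves whose recovery the study measures; EM estimation of a and b is what depends on the initial values.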
Postharvest evaluation and shelf-life estimation of fresh guava using the Weibull model
Directory of Open Access Journals (Sweden)
García Mogollón Carlos
2010-09-01
Full Text Available
Guava (Psidium guajava L.) is a tropical fruit susceptible to undesirable alterations that affect its shelf life, owing to inadequate storage and handling conditions. In this work the shelf life of fresh guava was estimated using the Weibull probabilistic model, and fruit quality was evaluated during storage under different temperature and packaging conditions. The postharvest evaluation was carried out for 15 days with guavas of the regional red variety. A completely randomized design with a factorial arrangement was used, consisting of three factors: storage time with six levels (0, 3, 6, 9, 12 and 15 days); storage temperature with two levels (ambient, 37 °C and 85-90% relative humidity (RH), and refrigerated, 9 ± 2 °C and 85-90% RH); and two types of packaging (polystyrene tray with PVC cling film, and aluminium foil). A three-point structured satisfaction scale was used for the sensory evaluation during the storage period. The Weibull model proved adequate for predicting the shelf life of fresh guava based on the fitting criteria and the acceptance and failure confidence limits. During storage, the factors time, temperature and packaging type had a statistically significant effect (P < 0.05) on the equivalent diameter, sphericity, apparent specific mass, TSS, pH, acidity and sensory evaluation of the fruit. The product can be consumed as fresh fruit up to ten days of storage at ambient temperature and at most fifteen days in refrigerated storage.
Analysis of the strength of synthetic marble beams using the Weibull statistical distribution
Rabahi, Ricardo Fouad
2011-01-01
In the present work, Weibull statistical analysis was applied as a tool for evaluating the flexural strength of both pure synthetic marble and synthetic marble reinforced with glass fibre. One objective of the work is to observe the behaviour of the Weibull modulus and the mechanical strength as chopped glass fibre is introduced into the composition of the synthetic marble. Its influence on the mechanical strength is investigated, taking into consideration the gains...
Directory of Open Access Journals (Sweden)
A Lakshmana Rao
2015-02-01
Full Text Available Inventory models play an important role in determining optimal ordering and pricing policies. Much work has been reported in the literature regarding inventory models with finite or infinite replenishment. But in many practical situations replenishment is governed by random factors such as procurement, transportation, environmental conditions and availability of raw material; hence, it is necessary to develop inventory models with random replenishment. In this paper, an EPQ model for deteriorating items is developed and analyzed under the assumption that replenishment is random and follows a Weibull distribution. It is further assumed that the lifetime of a commodity is random and follows a generalized Pareto distribution, and that demand is a function of on-hand inventory. Using differential equations, the instantaneous state of inventory is derived. With suitable cost considerations, the total cost function is obtained. By minimizing the total cost function, the optimal ordering policies are derived. Through numerical illustrations, a sensitivity analysis is carried out. The sensitivity analysis of the model reveals that random replenishment has a significant influence on the ordering and pricing policies of the model. This model also includes some earlier models as particular cases for specific values of the parameters.
An EOQ Model for Items with Weibull Distribution Deterioration Rate
Institute of Scientific and Technical Information of China (English)
王道平; 于俊娣; 李向阳
2011-01-01
For deteriorating items, the deterioration rate can be described by a Weibull distribution. Based on this assumption, a new economic order quantity (EOQ) model with time-varying demand and purchase prices is developed to analyze the effect of deterioration on inventory management. With this model, numerical analysis and parameter sensitivity analysis are carried out. The results show that an optimal solution for this problem exists and that the main parameters affect the optimal inventory control policy to different degrees.
Directory of Open Access Journals (Sweden)
Jinping Liu
2016-06-01
Full Text Available The topic of online product quality inspection (OPQI) with smart visual sensors is attracting increasing interest in both the academic and industrial communities on account of the natural connection between the visual appearance of products and their underlying qualities. Visual images captured from granulated products (GPs), e.g., cereal products, fabric textiles, are comprised of a large number of independent particles or stochastically stacking locally homogeneous fragments, whose analysis and understanding remains challenging. A method of image statistical modeling-based OPQI for GP quality grading and monitoring by a Weibull distribution (WD) model with a semi-supervised learning classifier is presented. WD-model parameters (WD-MPs) of GP images' spatial structures, obtained with omnidirectional Gaussian derivative filtering (OGDF), which were demonstrated theoretically to obey a specific WD model of integral form, were extracted as the visual features. Then, a co-training-style semi-supervised classifier algorithm, named COSC-Boosting, was exploited for semi-supervised GP quality grading, by integrating two independent classifiers of complementary nature in the face of scarce labeled samples. Effectiveness of the proposed OPQI method was verified in the field of automated rice quality grading, where it showed superior performance compared with commonly used methods, which lays a foundation for the quality control of GPs on assembly lines.
Institute of Scientific and Technical Information of China (English)
YAN Zhen-Ya
2002-01-01
The two-parameter family of Estevez-Mansfield-Clarkson equations with fully nonlinear dispersion (called E(m, n) equations), (u_z^m)_{zzτ} + γ(u_z^n u_τ)_z + u_{ττ} = 0, which is a generalized model of the integrable Estevez-Mansfield-Clarkson equation u_{zzzτ} + γ(u_z u_{zτ} + u_{zz} u_τ) + u_{ττ} = 0, is presented. Five types of symmetries of the E(m, n) equation are obtained by making use of the direct reduction method. Using these reductions and some simple transformations, we obtain solitary-wave-like solutions of the E(1, n) equation. In addition, we also find compacton solutions (solitary waves with the property that after colliding with other compacton solutions they re-emerge with the same coherent shape) of the E(3, 2) equation and of the E(m, m-1) equations for their potential u_z, and compacton-like solutions of the E(m, m-1) equations, respectively. Whether there exist compacton-like solutions of the other E(m, n) equations with m ≠ n+1 is still an open problem.
New Two-Parameter Estimation for the Linear Model with Linear Restrictions
Institute of Scientific and Technical Information of China (English)
郭淑妹; 顾勇为; 郭杰
2013-01-01
In order to overcome the inability of the ordinary restricted least squares estimator to handle multicollinearity in parameter estimation, stochastic linear restrictions are introduced and a restricted new two-parameter estimator is proposed. In the mean squared error sense, the proposed estimator is shown to be superior to the ordinary restricted least squares estimator, the restricted ridge estimator and the restricted Liu estimator.
Directory of Open Access Journals (Sweden)
Rodrigo Geroni Mendes Nascimento
2012-06-01
Full Text Available
In 1979 the technique of modeling diameter distributions by probabilistic functions was first applied by Hyink & Moser to forecast the growth and yield of uneven-aged, heterogeneous forests. However, few current studies use this method for planning production in such forests, owing to unfamiliarity with the operational feasibility of the technique. This paper therefore presents a review of the characteristics that enable the modeling of growth and yield by diameter class, highlighting the importance of the dynamics of recruitment, mortality and survival, as well as the population attributes related to the modeling of the Weibull distribution, and presenting the specific statistics used in modeling yield by this method.
doi: 10.4336/2012.pfb.32.70.93
Energy Technology Data Exchange (ETDEWEB)
Toure, S. [Cocody Univ. (Ivory Coast). Lab. d' Energie Solaire
2005-04-01
The 2-parameter Weibull distribution is the hypothesis widely used in fitting studies of random series of wind speeds. Several procedures are used to find the set of the two fitting parameters k and c. From an experimental study, the fitting parameters were first determined by the regression method. The basic ideas of the eigen-coordinates method were reported in previous works for the case of the 4-parameter Stauffer distribution. In the present paper, the new method is applied to identify the 2-parameter Weibull distribution. The differential equation was identified, and the study disclosed a linear relationship between two eigen-coordinates. Two complemental errors ε_j and e_j were introduced as criteria to assess the goodness-of-fit of the distribution: ε_j was linked to the linear relationship, and e_j was used to test the goodness-of-fit between the observed and Weibull cumulative distribution functions. The fitting parameters were then determined using the eigen-coordinates method. The results showed better reliability. (Author)
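For reference, a common alternative to the regression and eigen-coordinates methods discussed above is the moment-based (Justus) approximation for k and c; a sketch with synthetic wind speeds (the site parameters are hypothetical, and this is not the paper's method):

```python
import math, random, statistics

def weibull_from_moments(speeds):
    """Moment-based fit of the 2-parameter Weibull wind-speed distribution:
    shape k from the coefficient of variation (Justus approximation,
    valid for roughly 1 <= k <= 10), scale c from the mean via the gamma
    function, since mean = c * Gamma(1 + 1/k)."""
    mean = statistics.fmean(speeds)
    k = (statistics.stdev(speeds) / mean) ** -1.086
    c = mean / math.gamma(1.0 + 1.0 / k)
    return k, c

random.seed(7)
# synthetic wind record drawn from a Weibull law with k = 2.0, c = 6.0 m/s
speeds = [6.0 * (-math.log(random.random())) ** 0.5 for _ in range(20000)]
k_hat, c_hat = weibull_from_moments(speeds)
print(k_hat, c_hat)  # near 2 and 6
```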
Chakraborty, Snehasis; Rao, Pavuluri Srinivasa; Mishra, Hari Niwas
2015-10-15
High pressure inactivation of natural microbiota, viz. aerobic mesophiles (AM), psychrotrophs (PC), yeasts and molds (YM), total coliforms (TC) and lactic acid bacteria (LAB), in pineapple puree was studied within the experimental domain of 0.1-600 MPa and 30-50 °C with a treatment time up to 20 min. A complete destruction of yeasts and molds was obtained at 500 MPa/50 °C/15 min, whereas no counts were detected for TC and LAB at 300 MPa/30 °C/15 min. A maximum of two log cycle reductions was obtained for YM during pulse pressurization at the severe process intensity of 600 MPa/50 °C/20 min. The Weibull model clearly described the non-linearity of the survival curves during the isobaric period. The tailing effect, as confirmed by the shape parameter (β) of the survival curve, was obtained in the case of YM (β < 1), whereas β > 1 was observed for the other microbial groups. Analogous to thermal death kinetics, the activation energy (Ea, kJ·mol(-1)) and the activation volume (Va, mL·mol(-1)) values were computed further to describe the temperature and pressure dependencies of the scale parameter (δ, min), respectively. A higher δ value was obtained for each microbe at a lower temperature, and it decreased with an increase in pressure. A secondary kinetic model was developed describing the inactivation rate (k, min(-1)) as a function of pressure (P, MPa) and temperature (T, K), including the dependencies of Ea and Va on P and T, respectively.
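The Weibull survival model referred to above, with scale δ (the time for the first decimal reduction, in min) and shape β, is usually written on a log10 scale; a minimal sketch with illustrative parameter values (not the study's fitted values):

```python
def log10_survival_ratio(t, delta, beta):
    """Weibullian survival model for inactivation under constant conditions:
    log10(N/N0) = -(t / delta)**beta, where delta (min) is the time for the
    first decimal reduction and beta is the shape parameter
    (beta < 1: tailing; beta > 1: downward concavity; beta = 1: first-order)."""
    return -((t / delta) ** beta)

# at t = delta the model predicts exactly one log10 reduction, for any beta
print(log10_survival_ratio(5.0, 5.0, 0.7))   # -1.0
print(log10_survival_ratio(10.0, 5.0, 1.0))  # -2.0 (first-order kinetics)
```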
Aragao, Glaucia M F; Corradini, Maria G; Normand, Mark D; Peleg, Micha
2007-11-01
Published survival curves of Escherichia coli in two growth media, with and without the presence of salt, at various temperatures and in a Greek eggplant salad having various levels of essential oil, all had a characteristic downward concavity when plotted on semi-logarithmic coordinates. Some also exhibited what appeared as a 'shoulder' of considerable length. Regardless of whether a shoulder was noticed, the survival pattern could be considered a manifestation of an underlying unimodal distribution of the cells' death times. Mathematically, the data could be described equally well by the Weibull and log-normal distribution functions, which had similar modes, means, standard deviations and coefficients of skewness. When plotted in their probability density function (PDF) form, the curves also appeared very similar visually. This enabled us to quantify and compare the effect of temperature or essential oil concentration on the organism's survival in terms of these temporal distributions' characteristics. Increased lethality was generally expressed in a shorter mean and mode, a smaller standard deviation and increased overall symmetry as judged by the distributions' degree of skewness. The 'shoulder', as expected, simply indicated that the distribution's standard deviation was much smaller than its mode. Rate models based on the two distribution functions could be used to predict non-isothermal survival patterns. They were derived on the assumption that the momentary inactivation rate is the isothermal rate at the momentary temperature at a time that corresponds to the momentary survival ratio. In this application, however, the Weibullian model with a fixed power was not only simpler and more convenient mathematically than the one based on the log-normal distribution, but it also provided more accurate estimates of the dynamic inactivation patterns.
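The distribution characteristics used above to compare lethality (mode, mean, standard deviation) have closed forms for the Weibull distribution; a small generic sketch (illustrative parameters, not fitted values from the study):

```python
import math

def weibull_stats(beta, eta):
    """Mean, mode and standard deviation of a Weibull distribution with
    shape beta and scale eta -- the temporal characteristics used to
    compare lethality between treatments (the mode is positive only
    for beta > 1; for beta <= 1 the density peaks at the origin)."""
    g1 = math.gamma(1.0 + 1.0 / beta)
    g2 = math.gamma(1.0 + 2.0 / beta)
    mean = eta * g1
    mode = eta * ((beta - 1.0) / beta) ** (1.0 / beta) if beta > 1.0 else 0.0
    std = eta * math.sqrt(g2 - g1 * g1)
    return mean, mode, std

print(weibull_stats(1.0, 5.0))  # exponential special case: mean = std = eta
print(weibull_stats(2.5, 5.0))  # std < mean and a positive mode: a 'shoulder'
```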
Quantifying Fire Cycle from Dendroecological Records Using Survival Analyses
Directory of Open Access Journals (Sweden)
Dominic Cyr
2016-06-01
Quantifying fire regimes in the boreal forest ecosystem is crucial for understanding its past and present dynamics, as well as for predicting its future dynamics. Survival analyses have often been used to estimate the fire cycle in eastern Canada because they make it possible to take into account the censored information that is made prevalent by the typically long fire return intervals and the limited scope of the dendroecological methods used to quantify them. Here, we assess how the true length of the fire cycle, short-term temporal variations in fire activity, and the sampling effort affect the accuracy and precision of estimates obtained from two types of parametric survival models, the Weibull and the exponential, and one non-parametric model obtained with the Cox regression. We then apply those results to a case area located in eastern Canada. Our simulation experiment confirms some documented concerns regarding the detrimental effects of temporal variations in fire activity on parametric estimation of the fire cycle. Cox regression appears to provide the most accurate and robust estimator, being by far the least affected by temporal variations in fire activity. The Cox-based estimate of the fire cycle for the last 300 years in the case study area is 229 years (CI95: 162-407), compared with the likely overestimated 319 years obtained with the commonly used exponential model.
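Under the exponential survival model mentioned above, the fire cycle is the mean fire return interval, and the right-censored maximum likelihood estimate has a simple closed form: total observed time at risk divided by the number of uncensored (fire) events. A sketch on made-up interval data, not the study's dendroecological records:

```python
import numpy as np

# Time-since-last-fire intervals (years); censored=True means the stand has
# not yet burned, so only a lower bound on the interval is observed.
intervals = np.array([120.0, 45.0, 300.0, 80.0, 250.0, 60.0, 410.0, 150.0])
censored = np.array([True, False, True, False, True, False, True, False])

# Exponential MLE under right-censoring:
# fire cycle = total exposure / number of observed fire events
events = int((~censored).sum())
fire_cycle = intervals.sum() / events
print(events, fire_cycle)
```

This is the estimator whose sensitivity to temporal variation in fire activity the simulation experiment examines; the Cox model avoids the constant-hazard assumption that makes it fragile.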
DEFF Research Database (Denmark)
le Fevre Jakobsen, Bjarne
The publication contains exercise materials, texts, PowerPoint presentations, and handouts for the course Sproglig Metode og Analyse (Linguistic Method and Analysis) in the BA and elective programmes in Dansk/Nordisk, 2010-2011.
威布尔分布对整体旋转式斯特林制冷机的可靠性计算%Calculation on the reliability of rotary Stirling cryocoolers by Weibull Law
Institute of Scientific and Technical Information of China (English)
罗高乔; 范仙红; 何世安
2011-01-01
This article describes the calculation procedure for the Weibull distribution and analyzes reliability test data for Thales integral rotary Stirling cryocoolers. The results of numerical analytic and graphical estimation of the Weibull reliability parameters are compared, and the reliability calculation process and the determination of the acceleration factor for Thales integral rotary Stirling cryocoolers are summarized, providing a reference for reliability test schemes and lifetime calculation methods for similar domestic products.
Directory of Open Access Journals (Sweden)
Daniel Henrique Breda Binoti
2013-09-01
The objective of this study was to evaluate the efficiency of the log-Pearson type V function for describing the diameter structure of even-aged eucalyptus stands, and to propose a diameter-distribution model based on this function. Modeling with the log-Pearson type V function was compared with modeling using the Weibull and hyperbolic functions, using data from permanent eucalyptus plots located in the midwestern region of Minas Gerais state, Brazil. The Pearson type V function was tested in three configurations: with three parameters, with two parameters, and with the location parameter replaced by the minimum diameter of the plot. The adherence of the functions to the data was verified with the Kolmogorov-Smirnov (K-S) test; all fits showed adherence to the data. The Weibull and hyperbolic functions outperformed the Pearson type V function.
Institute of Scientific and Technical Information of China (English)
蔡改贫; 郭进山; 夏刘洋
2016-01-01
To analyze the particle-size distribution of limestone after impact crushing, single limestone particles of different sizes were subjected to impact crushing tests at different pendulum impact angles on a Bond impact crushing tester. The results show that the particle sizes of limestone after Bond impact crushing follow a Weibull distribution. The cumulative mass probability of the broken particles increases with impact energy, and the peak of the cumulative mass probability density curve decreases with increasing feed size. Once the impact energy exceeds a certain value, further increases in energy produce progressively weaker gains in the mass of each size fraction as the feed size grows. For a given feed size, the increase in fine particles diminishes as the impact energy rises, the peak of the cumulative mass probability density curve increases with impact energy, and the width of the curve widens with increasing feed size.
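The Weibull form used for particle sizes here is the Rosin-Rammler cumulative distribution P(d) = 1 - exp(-(d/d0)^n), which can be fitted by linearization. The sieve data below are synthetic, with hypothetical size modulus d0 and uniformity index n:

```python
import numpy as np

# Sieve analysis: particle size d (mm) and cumulative mass fraction P passing.
# Synthetic data from a Weibull (Rosin-Rammler) law with d0 = 2.0 mm, n = 1.3.
d = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
P = 1.0 - np.exp(-(d / 2.0) ** 1.3)

# Linearize P(d) = 1 - exp(-(d/d0)^n):  ln(-ln(1 - P)) = n*ln(d) - n*ln(d0)
x, y = np.log(d), np.log(-np.log(1.0 - P))
n, c = np.polyfit(x, y, 1)
d0 = np.exp(-c / n)
print(n, d0)
```

On noise-free data the linear fit recovers n = 1.3 and d0 = 2.0 mm exactly; with real sieve data, the slope and intercept give least-squares estimates of the two Weibull parameters.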
Institute of Scientific and Technical Information of China (English)
王永泉; 陈花玲; 赵建平; 朱子才
2013-01-01
A probabilistic approach to modeling and predicting the reliability of MEMS (micro-electro-mechanical systems) devices against mechanical failure is proposed. Starting from the size effect on the mechanical properties of microstructures, the uncertainty of brittle materials' fracture strength and its Weibull probability distribution are presented. Then, for a typical MEMS surface micromachining process, characterized by chemical vapor deposition and sacrificial-layer technology, an expression for the residual thermal stress of the deposited thin-film structure is derived. On this basis, the fracture failure of a cantilevered polysilicon MEMS device under shock load is taken as a case study: a reliability model reflecting its scale, process, and load characteristics is established, and the shock reliability of the device is calculated quantitatively using published test data on the mechanical properties of polysilicon. The analysis shows that typical polysilicon MEMS structures can withstand shocks on the order of 10³g to 10⁴g (g is the gravitational acceleration). It is also seen that MEMS reliability is affected by a combination of interrelated factors, and that accurate reliability modeling and design depend to a large degree on extensive basic experimental data on material properties and behavior at the micro scale.
Directory of Open Access Journals (Sweden)
Luís R. A Gabriel Filho
2011-02-01
The wind regime of a region can be described by frequency distributions that provide information and characteristics needed for a possible deployment of wind energy capture systems in the region, with resulting applications in rural areas and remote regions. These characteristics, such as the annual mean speed, the variance of the recorded speeds, and the hourly mean wind power density, can be obtained from the frequency of occurrence of each speed, which in turn should be studied through analytical expressions. The most suitable analytical function for wind distributions is the Weibull density function, whose parameters can be determined by numerical methods and linear regression. The objective of this work is to characterize, analytically and geometrically, all the methodological procedures necessary for a complete characterization of the wind regime of a region and their application to the region of Botucatu, SP, Brazil, in order to determine the energy potential for the installation of wind turbines. It was thus possible to establish theorems related to the characterization of the wind regime, providing a concise analytical methodology for defining the wind parameters of any region to be studied. A CAMPBELL anemometer was used for this research.
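A common way to carry out the Weibull characterization described above is to fit the shape parameter k and scale parameter c to recorded wind speeds and then compute the mean wind power density as 0.5·ρ·c³·Γ(1 + 3/k). A sketch on synthetic wind-speed data with hypothetical k and c, not the Botucatu measurements:

```python
import numpy as np
from scipy import stats
from scipy.special import gamma

# Synthetic hourly wind speeds (m/s) from a Weibull law with hypothetical
# shape k = 2.0 and scale c = 6.0 m/s.
rng = np.random.default_rng(0)
v = 6.0 * rng.weibull(2.0, size=5000)

# Maximum-likelihood fit of the Weibull density, location fixed at zero
k, _, c = stats.weibull_min.fit(v, floc=0)

# Mean wind power density: 0.5 * rho * E[v^3] = 0.5 * rho * c^3 * Gamma(1 + 3/k)
rho = 1.225  # air density, kg/m^3
power_density = 0.5 * rho * c**3 * gamma(1.0 + 3.0 / k)  # W/m^2
print(k, c, power_density)
```

The same two fitted parameters also yield the annual mean speed, c·Γ(1 + 1/k), and the variance, so the whole wind-regime characterization reduces to estimating k and c.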
Boissière, Louis; Takemoto, Mitsuru; Bourghli, Anouar; Vital, Jean-Marc; Pellisé, Ferran; Alanay, Ahmet; Yilgor, Caglar; Acaroglu, Emre; Perez-Grueso, Francisco Javier; Kleinstück, Frank; Obeid, Ibrahim
2017-04-01
Many radiological parameters have been reported to correlate with patient's disability including sagittal vertical axis (SVA), pelvic tilt (PT), and pelvic incidence minus lumbar lordosis (PI-LL). European literature reports other parameters such as lumbar lordosis index (LLI) and the global tilt (GT). If most parameters correlate with health-related quality of life scores (HRQLs), their impact on disability remains unclear. This study aimed to validate these parameters by investigating their correlation with HRQLs. It also aimed to evaluate the relationship between each of these sagittal parameters and HRQLs to fully understand the impact in adult spinal deformity management. A retrospective review of a multicenter, prospective database was carried out. The database inclusion criteria were adults (>18 years old) presenting any of the following radiographic parameters: scoliosis (Cobb ≥20°), SVA ≥5 cm, thoracic kyphosis ≥60° or PT ≥25°. All patients with complete data at baseline were included. Health-related quality of life scores, demographic variables (DVs), and radiographic parameters were collected at baseline. Differences in HRQLs among groups of each DV were assessed with analyses of variance. Correlations between radiographic variables and HRQLs were assessed using the Spearman rank correlation. Multivariate linear regression models were fitted for each of the HRQLs (Oswestry Disability Index [ODI], Scoliosis Research Society-22 subtotal score, or physical component summaries) with sagittal parameters and covariants as independent variables. A p<.05 value was considered statistically significant. Among a total of 755 included patients (mean age, 52.1 years), 431 were non-surgical candidates and 324 were surgical candidates. Global tilt and LLI significantly correlated with HRQLs (r=0.4 and -0.3, respectively) for univariate analysis. Demographic variables such as age, gender, body mass index, past surgery, and surgical or non-surgical candidate
Assessing senescence patterns in populations of large mammals
Directory of Open Access Journals (Sweden)
Gaillard, J.-M.
2004-06-01
Theoretical models such as those of Gompertz and Weibull are commonly used to study senescence in survival for humans and laboratory or captive animals. For wild populations of vertebrates, senescence in survival has more commonly been assessed by fitting simple linear or quadratic relationships between survival and age. By using appropriate constraints on survival parameters in Capture-Mark-Recapture (CMR) models, we propose a first analysis of the suitability of the Gompertz and the two-parameter Weibull models for describing aging-related mortality in free-ranging populations of ungulates. We first show how to handle the Gompertz and the two-parameter Weibull models in the context of CMR analyses. We then perform a comparative analysis of senescence patterns in both sexes of two ungulate species that contrast strongly in the intensity of sexual selection. Our analyses provide support for the Gompertz model for describing senescence patterns in ungulates. Evolutionary implications of our results are discussed.
DEFF Research Database (Denmark)
Nielsen, Peter Carøe; Hansen, Hans Nørgaard; Olsen, Flemming Ove
2007-01-01
The quantitative and qualitative description of laser beam characteristics is important for process implementation and optimisation. In particular, a need for quantitative characterisation of beam diameter was identified when using fibre lasers for micro manufacturing. Here the beam diameter limits the obtainable features in direct laser machining as well as heat affected zones in welding processes. This paper describes the development of a measuring unit capable of analysing beam shape and diameter of lasers to be used in manufacturing processes. The analyser is based on the principle of a rotating mechanical wire being swept through the laser beam at varying Z-heights. The reflected signal is analysed and the resulting beam profile determined. The development comprised the design of a flexible fixture capable of providing both rotation and Z-axis movement, control software including data capture...
Hendriks, M.A.; Luyten, J.W.; Scheerens, J.; Sleegers, P.J.C.; Scheerens, J.
2014-01-01
In this chapter results of a research synthesis and quantitative meta-analyses of three facets of time effects in education are presented, namely time at school during regular lesson hours, homework, and extended learning time. The number of studies for these three facets of time that could be used
Contesting Citizenship: Comparative Analyses
DEFF Research Database (Denmark)
Siim, Birte; Squires, Judith
2007-01-01
Comparative citizenship analyses need to be considered in relation to multiple inequalities and their intersections, and to multiple governance and trans-national organising. This, in turn, suggests that comparative citizenship analysis needs to consider new spaces in which struggles for equal citizenship occur...
Wavelet Analyses and Applications
Bordeianu, Cristian C.; Landau, Rubin H.; Paez, Manuel J.
2009-01-01
It is shown how a modern extension of Fourier analysis known as wavelet analysis is applied to signals containing multiscale information. First, a continuous wavelet transform is used to analyse the spectrum of a nonstationary signal (one whose form changes in time). The spectral analysis of such a signal gives the strength of the signal in each…
Veldman, M.; Schelvis-Smit, A.A.M.
2005-01-01
On behalf of a client of Animal Sciences Group, different varieties of veal were analyzed by both instrumental and sensory analyses. The sensory evaluation was performed with a sensory analytical panel in the period of 13th of May and 31st of May, 2005. The three varieties of veal were: young bull,
Institute of Scientific and Technical Information of China (English)
张建平; 韩熠; 刘宇; 朱群志
2015-01-01
To analyze the temperature rise of the heating module for sodium-sulfur batteries, a theoretical model of the heating module and a fitting model of the experimental temperature-rise data were established on the basis of the three-dimensional transient heat conduction equation and the Weibull function, respectively. The temperature-rise process and the transient temperature distribution of the heating module were numerically simulated, and the effects of the Weibull parameters on the temperature-rise curve were investigated. The results indicate that the Weibull fitting model describes the temperature-rise process of the heating module accurately and with high reliability; the overall temperature-rise rate inside the module decreases nonlinearly with both time and distance from the module center; and the shape and scale parameters govern the efficiency of the sectional and overall temperature rise, respectively. These results provide a technical reference for the optimal design of heating modules for sodium-sulfur batteries and other heating devices.
Institute of Scientific and Technical Information of China (English)
胡建军; 许洪斌; 高孝旺; 祖世华
2012-01-01
A random-amplitude fatigue load spectrum for experiments was constructed according to the three-parameter Weibull distribution commonly encountered in gear transmission. Gear bending fatigue tests under this random load were carried out on an MTS electro-hydraulic servo fatigue tester using the group test method, and the S-N curve of gear bending strength under a three-parameter Weibull load spectrum with a specified coefficient of variation was obtained. The test results show that, when the load follows a three-parameter Weibull random load spectrum, the fatigue life of the gear teeth obtained from random-amplitude fatigue tests is far shorter than that obtained from constant-load fatigue tests. The theoretical fatigue limit for gear design under random loading was predicted and compared with the test results; the theoretical values agree with the experiments, so the gear bending fatigue strength under random loading can be inferred from the load ratio coefficient of the random load spectrum.
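A random-amplitude spectrum following a three-parameter Weibull distribution, as used in the tests above, can be generated by inverse-CDF sampling. The location, scale, and shape values below are hypothetical placeholders, not the paper's spectrum:

```python
import numpy as np

rng = np.random.default_rng(3)

# Three-parameter Weibull load spectrum (hypothetical placeholder values):
# location gamma0 = 200 MPa, scale eta = 150 MPa, shape m = 1.8
gamma0, eta, m = 200.0, 150.0, 1.8

# Inverse-CDF sampling from F(x) = 1 - exp(-((x - gamma0)/eta)^m), x >= gamma0
u = rng.random(100_000)
loads = gamma0 + eta * (-np.log(1.0 - u)) ** (1.0 / m)

# Coefficient of variation characterizes the scatter of the spectrum,
# the quantity the paper holds fixed when deriving its S-N curve.
cv = loads.std() / loads.mean()
print(loads.mean(), cv)
```

The location parameter acts as a minimum load, so every sampled amplitude is at least gamma0; tuning gamma0, eta, and m lets the sampler hit a prescribed mean load and coefficient of variation.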
Geiser, Achim
2015-01-01
A variety of possible future analyses of HERA data in the context of the HERA data preservation programme is collected, motivated, and commented. The focus is placed on possible future analyses of the existing $ep$ collider data and their physics scope. Comparisons to the original scope of the HERA programme are made, and cross references to topics also covered by other participants of the workshop are given. This includes topics on QCD, proton structure, diffraction, jets, hadronic final states, heavy flavours, electroweak physics, and the application of related theory and phenomenology topics like NNLO QCD calculations, low-x related models, nonperturbative QCD aspects, and electroweak radiative corrections. Synergies with other collider programmes are also addressed. In summary, the range of physics topics which can still be uniquely covered using the existing data is very broad and of considerable physics interest, often matching the interest of results from colliders currently in operation. Due to well-e...
Analysing Access Control Specifications
DEFF Research Database (Denmark)
Probst, Christian W.; Hansen, René Rydhof
2009-01-01
When prosecuting crimes, the main question to answer is often who had a motive and the possibility to commit the crime. When investigating cyber crimes, the question of possibility is often hard to answer, as in a networked system almost any location can be accessed from almost anywhere. Recent events have revealed intimate knowledge of surveillance and control systems on the side of the attacker, making it often impossible to deduce the identity of an inside attacker from logged data. In this work we present an approach that analyses the access control configuration to identify the set of credentials needed to reach a certain location in a system. This knowledge allows identifying a set of (inside) actors who have the possibility to commit an insider attack at that location. This has immediate applications in analysing log files, but also non-technical applications such as identifying possible...
Energy Technology Data Exchange (ETDEWEB)
Wilen, C.; Moilanen, A.; Kurkela, E. [VTT Energy, Espoo (Finland). Energy Production Technologies
1996-12-31
The overall objectives of the project 'Feasibility of electricity production from biomass by pressurized gasification systems' within the EC Research Programme JOULE II were to evaluate the potential of advanced power production systems based on biomass gasification and to study the technical and economic feasibility of these new processes with different types of biomass feed stocks. This report was prepared as part of this R and D project. The objectives of this task were to perform fuel analyses of potential woody and herbaceous biomasses with specific regard to the gasification properties of the selected feed stocks. The analyses of 15 Scandinavian and European biomass feed stocks included density, proximate and ultimate analyses, trace compounds, ash composition and fusion behaviour in oxidizing and reducing atmospheres. The wood-derived fuels, such as whole-tree chips, forest residues, bark and to some extent willow, can be expected to have good gasification properties. Difficulties caused by ash fusion and sintering in straw combustion and gasification are generally known. The ash and alkali metal contents of the European biomasses harvested in Italy resembled those of the Nordic straws, and it is expected that they behave to a great extent like straw in gasification. No direct relation between the ash fusion behaviour (determined according to the standard method) and, for instance, the alkali metal content was found in the laboratory determinations. A more profound characterisation of the fuels would require gasification experiments in a thermobalance and a PDU (Process Development Unit) rig. (orig.) (10 refs.)
Energy Technology Data Exchange (ETDEWEB)
Geiser, Achim
2015-12-15
A variety of possible future analyses of HERA data in the context of the HERA data preservation programme is collected, motivated, and commented. The focus is placed on possible future analyses of the existing ep collider data and their physics scope. Comparisons to the original scope of the HERA programme are made, and cross references to topics also covered by other participants of the workshop are given. This includes topics on QCD, proton structure, diffraction, jets, hadronic final states, heavy flavours, electroweak physics, and the application of related theory and phenomenology topics like NNLO QCD calculations, low-x related models, nonperturbative QCD aspects, and electroweak radiative corrections. Synergies with other collider programmes are also addressed. In summary, the range of physics topics which can still be uniquely covered using the existing data is very broad and of considerable physics interest, often matching the interest of results from colliders currently in operation. Due to well-established data and MC sets, calibrations, and analysis procedures the manpower and expertise needed for a particular analysis is often very much smaller than that needed for an ongoing experiment. Since centrally funded manpower to carry out such analyses is not available any longer, this contribution not only targets experienced self-funded experimentalists, but also theorists and master-level students who might wish to carry out such an analysis.
Daniel Henrique Breda Binoti; Mayra Luiza Marques da Silva Binoti; Helio Garcia Leite
2013-01-01
Objetivou-se neste estudo avaliar a eficiência da função log-Pearson tipo V para a descrição da estrutura diamétrica de povoamentos equiâneos de eucaliptos, bem como propor um modelo de distribuição diamétrica utilizando essa função. A modelagem realizada pela função log-Pearson tipo V foi comparada com a modelagem realizada com a função Weibull e hiperbólica. Para isso utilizou-se dados de parcelas permanentes de eucalipto, localizadas na região centro oeste do estado de Minas Gerais. A funç...
Institute of Scientific and Technical Information of China (English)
韩亮; 毕文广; 李红江; 师相; 李柄成
2011-01-01
Many factors influence the shape of the blasted stockpile (muckpile) in open-pit blasting. To identify the main controlling factors, grey correlation theory was introduced. Because the muckpile shape cannot be expressed with numerical parameters, it is difficult to apply grey correlation theory to it directly; by introducing a Weibull model to fit the measured muckpile profile curves, the shape parameters were quantified. Grey correlation degrees were then calculated for 49 cases from the Heidaigou open-pit coal mine, yielding a ranked sequence of the factors influencing muckpile shape, which was then analyzed. The study provides guidance for optimizing the design of muckpile shape in open-pit mines.
On generalized trigonometric functions with two parameters
Bhayo, Barkat Ali; Vuorinen, Matti
2011-01-01
The generalized $p$-trigonometric and $(p,q)$-trigonometric functions were introduced by P. Lindqvist and S. Takeuchi, respectively. We prove some inequalities and present a few conjectures for the $(p,q)$-functions.
Mirror symmetry for two parameter models, 2
Candelas, Philip; Katz, S; Morrison, Douglas Robert Ogston; Philip Candelas; Anamaria Font; Sheldon Katz; David R Morrison
1994-01-01
We describe in detail the space of the two Kähler parameters of the Calabi-Yau manifold $\mathbb{P}_4^{(1,1,1,6,9)}[18]$ by exploiting mirror symmetry. The large complex structure limit of the mirror, which corresponds to the classical large radius limit, is found by studying the monodromy of the periods about the discriminant locus, the boundary of the moduli space corresponding to singular Calabi-Yau manifolds. A symplectic basis of periods is found and the action of the $Sp(6,\mathbb{Z})$ generators of the modular group is determined. From the mirror map we compute the instanton expansion of the Yukawa couplings and the generalized $N=2$ index, arriving at the numbers of instantons of genus zero and genus one of each degree. We also investigate an $SL(2,\mathbb{Z})$ symmetry that acts on a boundary of the moduli space.
Energy Technology Data Exchange (ETDEWEB)
Lawson, E.M. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia). Physics Division
1998-03-01
The major use of ANTARES is Accelerator Mass Spectrometry (AMS), with ¹⁴C being the most commonly analysed radioisotope - presently about 35% of the available beam time on ANTARES is used for ¹⁴C measurements. The accelerator measurements are supported by, and dependent on, a strong sample preparation section. The ANTARES AMS facility supports a wide range of investigations into fields such as global climate change, ice cores, oceanography, dendrochronology, anthropology, and classical and Australian archaeology. Described here are some examples of the ways in which AMS has been applied to support research into the archaeology, prehistory and culture of this continent's indigenous Aboriginal peoples. (author)
Directory of Open Access Journals (Sweden)
Nelly Morais
2006-11-01
1. Preamble - Conditions under which this analysis was carried out. A group of first-year master's students in FLE (French as a Foreign Language) at Université Paris 3 (that is, students in language didactics preparing to teach FLE) examined the product during a module on ICT (Information and Communication Technologies) and language didactics. A discussion then took place on the forum of a distance-learning platform, based on a few questions posed by the teacher...
Energy Technology Data Exchange (ETDEWEB)
Glickman, Matthew R.; Tang, Akaysha (University of New Mexico, Albuquerque, NM)
2009-02-01
The motivating vision behind Sandia's MENTOR/PAL LDRD project has been that of systems which use real-time psychophysiological data to support and enhance human performance, both individually and of groups. Relevant and significant psychophysiological data being a necessary prerequisite to such systems, this LDRD has focused on identifying and refining such signals. The project has focused in particular on EEG (electroencephalogram) data as a promising candidate signal because it (potentially) provides a broad window on brain activity with relatively low cost and logistical constraints. We report here on two analyses performed on EEG data collected in this project using the SOBI (Second Order Blind Identification) algorithm to identify two independent sources of brain activity: one in the frontal lobe and one in the occipital. The first study looks at directional influences between the two components, while the second study looks at inferring gender based upon the frontal component.
Network class superposition analyses.
Directory of Open Access Journals (Sweden)
Carl A B Pearson
Networks are often used to understand a whole system by modeling the interactions among its pieces. Examples include biomolecules in a cell interacting to provide some primary function, or species in an environment forming a stable community. However, these interactions are often unknown; instead, the pieces' dynamic states are known, and network structure must be inferred. Because observed function may be explained by many different networks (e.g., ≈ 10^30 for the yeast cell cycle process), considering dynamics beyond this primary function means picking a single network or suitable sample: measuring over all networks exhibiting the primary function is computationally infeasible. We circumvent that obstacle by calculating the network class ensemble. We represent the ensemble by a stochastic matrix T, which is a transition-by-transition superposition of the system dynamics for each member of the class. We present concrete results for T derived from boolean time series dynamics on networks obeying the Strong Inhibition rule, by applying T to several traditional questions about network dynamics. We show that the distribution of the number of point attractors can be accurately estimated with T. We show how to generate Derrida plots based on T. We show that T-based Shannon entropy outperforms other methods at selecting experiments to further narrow the network structure. We also outline an experimental test of predictions based on T. We motivate all of these results in terms of a popular molecular biology boolean network model for the yeast cell cycle, but the methods and analyses we introduce are general. We conclude with open questions for T, for example, application to other models, computational considerations when scaling up to larger systems, and other potential analyses.
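The superposition matrix T described above can be illustrated on a toy ensemble: each member network contributes a deterministic state-transition matrix, and T is their equal-weight mixture, which remains row-stochastic. The two update maps below are hypothetical, not taken from the yeast cell cycle model:

```python
import numpy as np

# Two deterministic boolean-network dynamics on 2 nodes (4 states each),
# written as transition matrices: row = current state, column = next state.
# Both update maps are hypothetical examples.
T1 = np.zeros((4, 4))
T2 = np.zeros((4, 4))
for state, nxt in enumerate([0, 2, 3, 3]):  # member network 1's update map
    T1[state, nxt] = 1.0
for state, nxt in enumerate([0, 0, 3, 3]):  # member network 2's update map
    T2[state, nxt] = 1.0

# Class superposition: the equal-weight mixture of the members' dynamics.
# Each row still sums to 1, so T is a stochastic matrix for the ensemble.
T = 0.5 * (T1 + T2)
print(T[1])  # from state 1 the ensemble goes to state 0 or 2, each with prob. 0.5
```

Entry T[i, j] is then the fraction of ensemble members that send state i to state j, which is what makes ensemble-level quantities such as attractor counts and Shannon entropy computable without enumerating individual networks.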
DEFF Research Database (Denmark)
Thorlacius, Lisbeth
2009-01-01
planning of the functional and content-related aspects of websites. There is a large body of theory and method books specializing in the technical issues of interaction and navigation, as well as in the linguistic content of websites. The Danish HCI (Human Computer Interaction) ... hyper-functional websites. The primary concern of the HCI experts is to produce websites that are user-friendly. According to their guidelines, websites must be built with fast and efficient navigation and interaction structures, where the user can obtain information unhindered by long download times ... or dead ends when visiting the site. However, studies in the design and analysis of the visual and aesthetic aspects of planning and using websites have only to a limited extent been treated reflectively. That is the background for this chapter, which opens with a review of aesthetics...
Directory of Open Access Journals (Sweden)
O. S. Vallejos-Barra
2009-01-01
A set of computational procedures was developed to estimate the three parameters of the Weibull 3P probability density function. In addition, the optimal parameters of this function were estimated and evaluated with the developed procedures to model the diameters at breast height of Pinus taeda trees. The trees were measured over eight years in six plots for each of the five planting densities considered. The Weibull 3P parameters were estimated by four alternative methods: maximum likelihood, moments, percentiles, and a hybrid method. The optimization procedures sought to minimize both the error index and the statistics of the goodness-of-fit tests: Kolmogorov-Smirnov, Anderson-Darling, Kuiper, Cramer-von Mises, and Watson. There were four main results. First, the estimation method and the plantation age affected the value of the location parameter. Second, 45% of the location-parameter values were negative; in these cases, a highly significant linear relationship was found among the location, scale, and shape parameters, so the effect of a negative location parameter was compensated by the values of the other parameters. Third, the percentile and maximum likelihood methods produced the smallest and largest location-parameter values, respectively. Fourth, the most accurate fits were achieved with the percentile and moment estimation methods; the best Anderson-Darling results were associated with the method of moments, and the remaining goodness-of-fit tests with the percentile method.
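The Weibull 3P fitting compared above can be reproduced in outline with SciPy, whose `weibull_min.fit` estimates shape, location, and scale by maximum likelihood (one of the four methods in the study). The DBH sample below is synthetic, with hypothetical parameters, not the Pinus taeda data:

```python
import numpy as np
from scipy import stats

# Synthetic diameter-at-breast-height sample (cm) from a three-parameter
# Weibull law with hypothetical shape 2.2, location 4.0 cm, scale 10.0 cm.
rng = np.random.default_rng(1)
dbh = 4.0 + 10.0 * rng.weibull(2.2, size=2000)

# Maximum likelihood fit of all three parameters: shape, location, scale
shape, loc, scale = stats.weibull_min.fit(dbh)

# Goodness of fit via the Kolmogorov-Smirnov test against the fitted law
ks = stats.kstest(dbh, "weibull_min", args=(shape, loc, scale))
print(shape, loc, scale, ks.pvalue)
```

The location parameter must stay below the smallest observation for the likelihood to be finite, which is the constraint that makes three-parameter MLE delicate and motivates the percentile and moment alternatives examined in the study.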
Institute of Scientific and Technical Information of China (English)
谭伟; 师义民; 孙玉东
2012-01-01
Based on a two-parameter exponential grouped life test with binomial removals (at each observation time, the number of units removed follows a binomial distribution), this paper studies how to determine the grouping times and derives the maximum likelihood estimates of the threshold parameter, the lifetime parameter and the removal probability. It then discusses the reliability analysis of two-parameter exponential units under a constant-stress accelerated grouped life test with binomial removals: using the accelerated life equation, the reliability estimate of the two-parameter exponential unit is given. Finally, a simulation example verifies the correctness of the results.
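For intuition about the two-parameter exponential model above, the MLEs have a well-known closed form in the simplest setting — a complete, uncensored sample, i.e. without the grouping and binomial removals treated in the paper. A minimal sketch (all numeric values are assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
mu_true, theta_true = 2.0, 5.0           # threshold and mean lifetime (assumed)
x = mu_true + rng.exponential(theta_true, size=1000)

# Closed-form MLEs for a complete two-parameter exponential sample:
mu_hat = x.min()                          # threshold parameter
theta_hat = (x - mu_hat).mean()           # lifetime (scale) parameter
print(mu_hat, theta_hat)
```

With grouping and removals, as in the paper, these estimators are replaced by numerical maximisation of the grouped-data likelihood.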
Weibull distribution for modeling drying of grapes and its application
Institute of Scientific and Technical Information of China (English)
白竣文; 王吉亮; 肖红伟; 巨浩羽; 刘嫣红; 高振江
2013-01-01
To explore the factors influencing the parameters of the Weibull distribution function and its application in drying, the drying of grapes was studied under different drying methods (air-impingement jet drying and pulsed-vacuum drying), drying temperatures (50, 55, 60 and 65°C) and blanching pretreatments (30, 60, 90 and 120 s), and the drying kinetics curves were fitted and analysed with the Weibull distribution function. The results show that the Weibull distribution function fits the drying process of grapes well under the experimental conditions. The scale parameter α is related to the drying temperature and decreases as the temperature rises; the shape parameter β depends on the drying method and the state of the material, while the drying temperature has little effect on it. The calculated moisture diffusion coefficient Dcal during drying ranged from 0.2982×10⁻⁹ to 2.7700×10⁻⁹ m²/s, and the drying activation energies of hot-air drying and pulsed-vacuum drying, computed from the Arrhenius equation, were 72.87 and 61.43 kJ/mol, respectively. The results provide a reference for applying the Weibull distribution function to grape drying. Grapes, as a seasonal fruit, have relatively high sugar and moisture contents and are very sensitive to microbial spoilage during storage; once harvested, they must be consumed or processed into various products within a few weeks to reduce economic losses. Drying grapes into raisins is the major processing method in almost all grape-growing countries. Knowledge of the drying mechanism is essential for heat and moisture transport efficiency, energy savings and product quality. Several empirical and semi-empirical models have been used to describe and predict drying curves; some give a good fit, but they treat the process as a "black box", so the drying materials and drying conditions are difficult to relate to the model parameters. In this study, the Weibull distribution model was applied to the drying process under different
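The Weibull model referred to above is commonly written for the moisture ratio as MR(t) = exp(-(t/α)^β), with α the scale parameter and β the shape parameter. A minimal illustration, fitting α and β to a synthetic drying curve (the paper's measurements are not reproduced here; all values are assumptions):

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_mr(t, alpha, beta):
    # Weibull moisture ratio: MR = exp(-(t/alpha)**beta)
    return np.exp(-((t / alpha) ** beta))

# Synthetic drying curve (time in minutes); true parameters are assumptions.
t = np.linspace(0.0, 600.0, 25)
mr_obs = weibull_mr(t, 180.0, 1.1) + np.random.default_rng(2).normal(0.0, 0.01, t.size)

popt, _ = curve_fit(weibull_mr, t, mr_obs, p0=(100.0, 1.0))
alpha_hat, beta_hat = popt
print(alpha_hat, beta_hat)
```

The fitted α then feeds the effective moisture diffusivity and, across temperatures, an Arrhenius estimate of the activation energy, as done in the paper.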
Institute of Scientific and Technical Information of China (English)
刘婷
2011-01-01
Accelerated life tests, in which more than one stress is often involved, have become widely used in industry. A log-linear acceleration model is proposed to describe the quantitative relation between multiple stresses and product lifetime, and Weibull log-linear accelerated models are established. Because the log-likelihood function is highly nonlinear and non-monotonic, a genetic algorithm is adopted to determine the maximum likelihood estimates of the acceleration model parameters, after which reliability assessment and lifetime prediction can be carried out under the various stresses. Finally, simulation results illustrate the reasonableness of the proposed method.
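A sketch of the estimation step described above, using SciPy's differential evolution (a population-based global optimiser) as a stand-in for the paper's genetic algorithm, an assumed log-linear life-stress form η(S) = exp(a + b/S), and simulated single-stress data; all numeric values are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(3)
# Simulated test data: Weibull lifetimes whose scale follows the assumed
# log-linear life-stress model eta(S) = exp(a + b/S), with common shape m.
a_true, b_true, m_true = 1.0, 600.0, 2.0
stresses = [300.0, 350.0, 400.0]
data = [(S, rng.weibull(m_true, 50) * np.exp(a_true + b_true / S)) for S in stresses]

def neg_log_lik(p):
    # Negative Weibull log-likelihood, pooled over the stress levels.
    a, b, m = p
    nll = 0.0
    for S, t in data:
        eta = np.exp(a + b / S)
        z = t / eta
        nll -= np.sum(np.log(m / eta) + (m - 1.0) * np.log(z) - z ** m)
    return nll

res = differential_evolution(neg_log_lik, bounds=[(-5, 5), (0, 2000), (0.5, 5)],
                             seed=3)
a_hat, b_hat, m_hat = res.x
print(a_hat, b_hat, m_hat)
```

Once the parameters are estimated, the fitted η(S) converts reliability quantities between stress levels, mirroring the paper's use of the acceleration model.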
Wind speed analysis in La Ventosa, Mexico: a bimodal probability distribution case
Energy Technology Data Exchange (ETDEWEB)
Jaramillo, O.A.; Borja, M.A. [Energias No Convencionales, Morelos (Mexico). Instituto de Investigaciones Electricas
2004-08-01
The statistical characteristics of the wind speed in La Ventosa, Oaxaca, Mexico, have been analyzed using wind speed data recorded by the Instituto de Investigaciones Electricas (IIE). By grouping the observations annually, seasonally and by wind direction, we show that the wind speed distribution, with calms included, is not represented by the typical two-parameter Weibull function. A mathematical formulation using a bimodal Weibull and Weibull probability distribution function (PDF) has been developed to analyse the wind speed frequency distribution in that region. The model developed here can be applied to similar regions where the wind speed distribution presents a bimodal PDF. The two-parameter Weibull wind speed distribution must not be generalised, since it does not accurately represent some wind regimes, such as that of La Ventosa, Mexico. The analysis of the wind data shows that computing the capacity factor for wind power plants to be installed in La Ventosa must be carried out by means of a bimodal PDF instead of the typical Weibull PDF; otherwise, the capacity factor will be underestimated. (author)
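The bimodal formulation described above is a weighted mixture of two two-parameter Weibull densities. A minimal sketch with assumed illustrative parameters (not the La Ventosa estimates) verifies that the mixture is a valid probability density:

```python
import numpy as np

def weibull_pdf(v, k, c):
    # Two-parameter Weibull density with shape k and scale c.
    return (k / c) * (v / c) ** (k - 1.0) * np.exp(-((v / c) ** k))

def bimodal_pdf(v, w, k1, c1, k2, c2):
    # Weighted mixture of two Weibull densities (weight w on the first mode).
    return w * weibull_pdf(v, k1, c1) + (1.0 - w) * weibull_pdf(v, k2, c2)

v = np.linspace(0.01, 30.0, 3000)                  # wind speed grid, m/s
pdf = bimodal_pdf(v, 0.45, 2.0, 3.0, 3.5, 14.0)    # illustrative parameters
integral = pdf.sum() * (v[1] - v[0])               # simple Riemann check
print(integral)
```

Weighting the power curve of a turbine by such a mixture, rather than by a single Weibull, is what corrects the capacity-factor underestimate noted in the abstract.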
Institute of Scientific and Technical Information of China (English)
王大林; 赵博
2013-01-01
For most nuclear power plants, equipment life-cycle management, including the mean replacement time and reliable life, has been determined by fitting distributions to historical life data. This method is widely used, but it ignores the influence of repairs and cumulative operating time on equipment life. To solve this problem, based on the operating characteristics of nuclear power plant components, a new approach to data processing and life determination is proposed that combines expert experience with a failure-rate indicator through the Weibull process. A worked example shows that the new approach overcomes defects of the original method, such as incomplete use of the information contained in field data and the risk of erroneous distributional assumptions.
Institute of Scientific and Technical Information of China (English)
王道平; 于俊娣; 李向阳
2011-01-01
Based on an EOQ model in which both demand and purchase price are time-varying, and taking the deterioration rate of the items to follow the more realistic three-parameter Weibull distribution, an EOQ model is constructed that also accounts for partial backlogging of shortages and the time value of money. The inventory model is simulated with Matlab and a sensitivity analysis of the main parameters is performed. The results show that the model has an optimal solution and that the main parameters affect the optimal inventory policy to different degrees; the time value of money influences the net present value of the total inventory cost more than partial backlogging does, so the time value of money deserves closer attention when designing a sound inventory policy.
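For the three-parameter Weibull deterioration assumed above, the instantaneous deterioration rate is typically taken as θ(t) = αβ(t-γ)^(β-1) for t > γ (and zero before the delay time γ), so the fraction of stock not yet deteriorated is exp(-α(t-γ)^β). A small sketch with assumed parameter values:

```python
import numpy as np

def deterioration_rate(t, alpha, beta, gamma):
    # theta(t) = alpha*beta*(t - gamma)**(beta - 1) for t > gamma, else 0.
    t = np.asarray(t, dtype=float)
    out = np.zeros_like(t)
    mask = t > gamma
    out[mask] = alpha * beta * (t[mask] - gamma) ** (beta - 1.0)
    return out

def surviving_fraction(t, alpha, beta, gamma):
    # Fraction of stock not yet deteriorated: exp(-alpha*(t - gamma)**beta).
    t = np.asarray(t, dtype=float)
    return np.where(t > gamma,
                    np.exp(-alpha * np.maximum(t - gamma, 0.0) ** beta),
                    1.0)

t = np.array([0.0, 1.0, 2.0, 4.0])       # time since replenishment (assumed units)
sf = surviving_fraction(t, alpha=0.02, beta=1.5, gamma=1.0)
print(sf)
```

In the EOQ model this surviving fraction enters the inventory-level differential equation alongside the time-varying demand.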
Multiple Imputation for Network Analyses
Krause, Robert; Huisman, Mark; Steglich, Christian; Snijders, Thomas
2016-01-01
Missing data on network ties is a fundamental problem for network analyses. The biases induced by missing edge data, even when missing completely at random (MCAR), are widely acknowledged and problematic for network analyses (Kossinets, 2006; Huisman & Steglich, 2008; Huisman, 2009). Although model-
Analysis of Two-Parameter (ΔK and Kmax) Fatigue Crack Propagation Models
Institute of Scientific and Technical Information of China (English)
钱怡; 崔维成
2011-01-01
Apart from material characteristics and environmental effects, fatigue crack growth is governed by the stress field around the crack tip. This field is the superposition of the residual stress and the externally applied stress, and is jointly influenced by the stress intensity factor range ΔK, which produces the cyclic plastic zone, and the maximum stress intensity factor Kmax, which produces the monotonic plastic zone. The external driving force for crack growth should therefore comprise the two parameters ΔK and Kmax. After comparing three representative two-parameter fatigue crack growth rate models, proposed by Vasudevan and Sadananda, by Kujawski, and by Zhang, a new fatigue crack growth rate model is proposed that accounts for both internal (residual) and externally applied stresses and is suitable for fatigue crack growth under variable-amplitude loading.
The Nullness Analyser of julia
Spoto, Fausto
This experimental paper describes the implementation and evaluation of a static nullness analyser for single-threaded Java and Java bytecode programs, built inside the julia tool. Nullness analysis determines, at compile time, those program points where the null value might be dereferenced, leading to a run-time exception. In order to improve the quality of software, it is important to prove that such a situation does not occur. Our analyser is based on a denotational abstract interpretation of Java bytecode through Boolean logical formulas, strengthened with a set of denotational and constraint-based supporting analyses for locally non-null fields and full arrays and collections. The complete integration of all these analyses results in a correct system of very high precision whose analysis time remains on the order of minutes, as we show with some examples of analysis of large software.
Energy Technology Data Exchange (ETDEWEB)
Gouronnec, A.M. [Institut de Radioprotection et de Surete Nucleaire (IRSN), 92 - Clamart (France)
2004-06-15
The olfactometric analyses presented here are applied to industrial odours capable of producing harmful effects on people. The aim of olfactometric analysis is to quantify odours, to characterise them, or to attach a pleasant or unpleasant character to them (the notion of hedonics). The aim of this work is first to present the different measurements carried out, the different measurement methods used, and the current applications of each method. (O.M.)
Tenero, David; Green, Justin A; Goyal, Navin
2015-10-01
Tafenoquine (TQ), a new 8-aminoquinoline with activity against all stages of the Plasmodium vivax life cycle, is being developed for the radical cure of acute P. vivax malaria in combination with chloroquine. The efficacy and exposure data from a pivotal phase 2b dose-ranging study were used to conduct exposure-response analyses for TQ after administration to subjects with P. vivax malaria. TQ exposure (i.e., area under the concentration-time curve [AUC]) and region (Thailand compared to Peru and Brazil) were found to be statistically significant predictors of clinical response based on multivariate logistic regression analyses. After accounting for region/country, the odds of being relapse free at 6 months increased by approximately 51% (95% confidence intervals [CI], 25%, 82%) for each 25-U increase in AUC above the median value of 54.5 μg · h/ml. TQ exposure was also a significant predictor of the time to relapse of the infection. The final parametric, time-to-event model for the time to relapse, included a Weibull distribution hazard function, AUC, and country as covariates. Based on the model, the risk of relapse decreased by 30% (95% CI, 17% to 42%) for every 25-U increase in AUC. Monte Carlo simulations indicated that the 300-mg dose of TQ would provide an AUC greater than the clinically relevant breakpoint obtained in a classification and regression tree (CART) analysis (56.4 μg · h/ml) in more than 90% of subjects and consequently result in a high probability of being relapse free at 6 months. This model-based approach was critical in selecting an appropriate phase 3 dose. (This study has been registered at ClinicalTrials.gov under registration no. NCT01376167.).
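The reported effects above are multiplicative per 25-unit step in AUC, so they compound over larger exposure differences. A tiny sketch of that interpretation, with a hypothetical 50-unit increase above the median (the increase is an assumption; the ratios come from the abstract):

```python
# Effects reported in the abstract, each per 25-unit increase in AUC:
odds_ratio_per_25 = 1.51    # odds of being relapse free at 6 months (+51%)
hazard_ratio_per_25 = 0.70  # risk of relapse (-30%)

auc_increase = 50.0         # hypothetical increase above the median AUC
steps = auc_increase / 25.0

# Per-step ratios compound multiplicatively across steps.
print(odds_ratio_per_25 ** steps, hazard_ratio_per_25 ** steps)
```

That is, a 50-unit AUC increase would correspond to roughly 1.51² ≈ 2.3-fold odds of being relapse free and a 0.70² ≈ 0.49-fold relapse hazard, within the range where the fitted models apply.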
Reliability analysis of DOOF for Weibull distribution
Institute of Scientific and Technical Information of China (English)
陈文华; 崔杰; 樊晓燕; 卢献彪; 相平
2003-01-01
A hierarchical Bayesian method is proposed for estimating the failure probability pi under DOOF, taking the quasi-Beta distribution B(pi-1, 1, 1, b) as the prior distribution. The weighted least squares estimation method is used to obtain formulas for computing the reliability distribution parameters and estimating the reliability characteristic values under DOOF. Taking one type of aerospace electrical connector as an example, the correctness of the above method is verified through statistical analysis of accelerated life test data for the connector.
Mitogenomic analyses from ancient DNA
DEFF Research Database (Denmark)
Paijmans, Johanna L.A.; Gilbert, M Thomas P; Hofreiter, Michael
2013-01-01
…analyses (whether using modern or ancient DNA) were largely restricted to the analysis of short fragments of the mitochondrial genome. However, due to many technological advances during the past decade, a growing number of studies have explored the power of complete mitochondrial genome sequences (mitogenomes). Such studies were initially limited to analyses of extant organisms, but developments in both DNA sequencing technologies and general methodological aspects related to working with degraded DNA have resulted in complete mitogenomes becoming increasingly popular for ancient DNA studies as well. To date, at least 124 partially or fully assembled mitogenomes from more than 20 species have been obtained, and, given the rapid progress in sequencing technology, this number is likely to dramatically increase in the future. The increased information content offered by analysing full mitogenomes has…
Descriptive Analyses of Mechanical Systems
DEFF Research Database (Denmark)
Andreasen, Mogens Myrup; Hansen, Claus Thorp
2003-01-01
Foreword: Product analysis and technology analysis can be carried out with a broad socio-technical aim, in order to understand cultural, sociological, design-related, business-related and many other conditions. One sub-area of this is the systematic analysis and description of products and systems. The present compendium…
Evaluation "Risk analyses of agroparks"
Ge, L.
2011-01-01
This TransForum project focuses on the analysis of the uncertainties and opportunities of agroparks. It has resulted in a risk model that maps the qualitative and/or quantitative uncertainties of an agropark project, with which measures and management strategies can be identified…
Multivariate Evolutionary Analyses in Astrophysics
Fraix-Burnet, Didier
2011-01-01
The large amount of data on galaxies, up to higher and higher redshifts, calls for sophisticated statistical approaches to build adequate classifications. Multivariate cluster analyses, which compare objects for their global similarities, are still little used in astrophysics, probably because their results are somewhat difficult to interpret. We believe that the missing key is an unavoidable characteristic of our Universe: evolution. Our approach, known as Astrocladistics, is based on the evolutionary nature of both galaxies and their properties. It gathers objects according to their "histories" and establishes an evolutionary scenario among groups of objects. In this presentation, I show two recent results, on globular clusters and early-type galaxies, to illustrate how the evolutionary concepts of Astrocladistics can also be useful for multivariate analyses such as K-means Cluster Analysis.
Workload analysis of an assembly process
Ghenghea, L. D.
2015-11-01
Workload is the most important indicator for managers responsible for industrial technological processes; whether these are automated, mechanized or simply manual, the machines or workers will be the focus of workload measurements. The paper presents workload analyses of a largely manual assembly technology for a roller-bearing assembly process, carried out in a large company with integrated bearing manufacturing processes. In these analyses, the work sampling (delay sample) technique was used to identify and classify all the bearing assemblers' activities and to obtain information about how the workers distribute the 480-minute working day among those activities. The study shows some ways to increase process productivity without supplementary investment, and also indicates that automation of the process could be the solution for attaining maximum productivity.
An extensible analysable system model
DEFF Research Database (Denmark)
Probst, Christian W.; Hansen, Rene Rydhof
2008-01-01
…this does not hold for real physical systems. Approaches such as threat modelling try to target the formalisation of the real-world domain, but are still far from the rigid techniques available in security research. Many currently available approaches to assurance of critical infrastructure security… allows for easy development of analyses for the abstracted systems. We briefly present one application of our approach, namely the analysis of systems for potential insider threats.
DebriSat Laboratory Analyses
Adams, Paul M.; Lingley, Zachary; Alaan, Dianna
2015-01-05
Briefing charts (The Aerospace Corporation, 5 January 2015) on DebriSat laboratory analyses: semiquantitative elemental composition, with elemental mapping and line scans; Fourier Transform Infrared (FTIR) spectroscopy for chemical identification, using a Nicolet 6700 spectrometer with a Harrick Scientific "praying mantis" diffuse reflectance accessory; and qualitative UV-VIS-NIR spectroscopy.
Mitogenomic analyses of eutherian relationships.
Arnason, U; Janke, A
2002-01-01
Reasonably correct phylogenies are fundamental to the testing of evolutionary hypotheses. Here, we present phylogenetic findings based on analyses of 67 complete mammalian mitochondrial (mt) genomes. The analyses, irrespective of whether they were performed at the amino acid (aa) level or on nucleotides (nt) of first and second codon positions, placed Erinaceomorpha (hedgehogs and their kin) as the sister group of remaining eutherians. Thus, the analyses separated Erinaceomorpha from other traditional lipotyphlans (e.g., tenrecs, moles, and shrews), making traditional Lipotyphla polyphyletic. Both the aa and nt data sets identified the two order-rich eutherian clades, the Cetferungulata (comprising Pholidota, Carnivora, Perissodactyla, Artiodactyla, and Cetacea) and the African clade (Tenrecomorpha, Macroscelidea, Tubulidentata, Hyracoidea, Proboscidea, and Sirenia). The study corroborated recent findings that have identified a sister-group relationship between Anthropoidea and Dermoptera (flying lemurs), thereby making our own order, Primates, a paraphyletic assembly. Molecular estimates using paleontologically well-established calibration points, placed the origin of most eutherian orders in Cretaceous times, 70-100 million years before present (MYBP). The same estimates place all primate divergences much earlier than traditionally believed. For example, the divergence between Homo and Pan is estimated to have taken place approximately 10 MYBP, a dating consistent with recent findings in primate paleontology.
Directory of Open Access Journals (Sweden)
D. M. Priyantha Wedagama
2017-04-01
Full Text Available This study aims to develop time headway distribution models to analyse traffic safety performance and road link capacities for motorcycle-dominated traffic in Denpasar, Bali. Three road links selected as the case study are Jl. Hayam Wuruk, Jl. Hang Tuah, and Jl. Padma. Data analysis showed that between 55% and 80% of motorists in Denpasar during morning and evening peak hours paid little attention to keeping a safe distance from the vehicle in front. The study found that lognormal distribution models best fit the time headway data during morning peak hours, while either the Weibull (3P) or Pearson III distribution does so for evening peak hours. Road link capacities for mixed traffic, predominantly motorcycles, are apparently affected by the behaviour of motorists in keeping a safe distance from the vehicle in front. Theoretical road link capacities for Jl. Hayam Wuruk, Jl. Hang Tuah and Jl. Padma are 3,186 vehicles/hour, 3,077 vehicles/hour and 1,935 vehicles/hour respectively.
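The distribution comparison described above can be sketched with SciPy: synthetic lognormal headways stand in for the Denpasar observations (which are not reproduced here; the lognormal parameters and sample size are assumptions), candidate distributions are fitted, and fits are compared via the Kolmogorov-Smirnov statistic:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Hypothetical headways in seconds, drawn lognormal since that was the
# best-fitting morning-peak model in the study.
headway = rng.lognormal(mean=0.6, sigma=0.5, size=400)

ks_stats = {}
for name in ("lognorm", "weibull_min"):
    dist = getattr(stats, name)
    params = dist.fit(headway, floc=0)  # fix location at zero for headways
    ks_stats[name] = stats.kstest(headway, name, args=params).statistic
print(ks_stats)
```

On real data, the candidate set would also include Pearson III and the three-parameter Weibull, with the lowest KS (or Anderson-Darling) statistic selecting the model, as in the study.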
Holovatch, Yurij; Szell, Michael; Thurner, Stefan
2016-01-01
We present an overview of a series of results obtained from the analysis of human behavior in a virtual environment. We focus on the massive multiplayer online game (MMOG) Pardus which has a worldwide participant base of more than 400,000 registered players. We provide evidence for striking statistical similarities between social structures and human-action dynamics in the real and virtual worlds. In this sense MMOGs provide an extraordinary way for accurate and falsifiable studies of social phenomena. We further discuss possibilities to apply methods and concepts developed in the course of these studies to analyse oral and written narratives.
[Laboratory analyses in sports medicine].
Clénin, German E; Cordes, Mareike
2015-05-01
Laboratory analyses in sports medicine are relevant for three reasons: 1. In actively exercising individuals, laboratory analyses are one of the central elements in the diagnosis of diseases and overreaching. 2. Regular laboratory analyses in competitive athletes with high training and competition loads may help to detect certain deficiencies early on. 3. Physical activity in general, and competitive exercise training specifically, significantly changes certain routine laboratory parameters without reflecting pathological changes. These so-called preanalytic variations should be taken into consideration when interpreting laboratory data in emergency and routine medical diagnostics. This article intends to help the physician interpret the laboratory data of actively exercising sportsmen.
THOR Turbulence Electron Analyser: TEA
Fazakerley, Andrew; Moore, Tom; Owen, Chris; Pollock, Craig; Wicks, Rob; Samara, Marilia; Rae, Jonny; Hancock, Barry; Kataria, Dhiren; Rust, Duncan
2016-04-01
Turbulence Heating ObserveR (THOR) is the first mission ever flown in space dedicated to plasma turbulence. The Turbulence Electron Analyser (TEA) will measure the plasma electron populations in the mission's Regions of Interest. It will collect a 3D electron velocity distribution with cadences as short as 5 ms. The instrument will be capable of measuring energies up to 30 keV. TEA consists of multiple electrostatic analyser heads arranged so as to measure electrons arriving from look directions covering the full sky, i.e. 4π solid angle. The baseline concept is similar to the successful FPI-DES instrument currently operating on the MMS mission. TEA is intended to have a similar angular resolution, but a larger geometric factor. In comparison to earlier missions, TEA improves on the measurement cadence. For example, MMS FPI-DES routinely operates at 30 ms cadence. The objective of measuring distributions at rates as fast as 5 ms is driven by the mission's scientific requirements to resolve electron gyroscale size structures, where plasma heating and fluctuation dissipation is predicted to occur. TEA will therefore be capable of making measurements of the evolution of distribution functions across thin (a few km) current sheets travelling past the spacecraft at up to 600 km/s, of the Power Spectral Density of fluctuations of electron moments and of distributions fast enough to match frequencies with waves expected to be dissipating turbulence (e.g. with 100 Hz whistler waves).
Severe accident recriticality analyses (SARA)
DEFF Research Database (Denmark)
Frid, W.; Højerup, C.F.; Lindholm, I.
2001-01-01
…three computer codes and to further develop and adapt them for the task. The codes were SIMULATE-3K, APROS and RECRIT. Recriticality analyses were carried out for a number of selected reflooding transients for the Oskarshamn 3 plant in Sweden with SIMULATE-3K, and for the Olkiluoto I plant in Finland with all three codes. The core initial and boundary conditions prior to recriticality were studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality - both super-prompt power bursts and quasi steady-state power generation - for the range of parameters studied, i.e. with core uncovering and heat-up to maximum core temperatures of approximately 1800 K, and water flow rates of 45-2000 kg s(-1) injected into the downcomer. Since recriticality takes place in a small fraction of the core, the power densities are high…
Perturbation analyses of intermolecular interactions
Koyama, Yohei M.; Kobayashi, Tetsuya J.; Ueda, Hiroki R.
2011-08-01
Conformational fluctuations of a protein molecule are important to its function, and it is known that environmental molecules, such as water molecules, ions, and ligand molecules, significantly affect the function by changing the conformational fluctuations. However, it is difficult to systematically understand the role of environmental molecules because intermolecular interactions related to the conformational fluctuations are complicated. To identify important intermolecular interactions with regard to the conformational fluctuations, we develop herein (i) distance-independent and (ii) distance-dependent perturbation analyses of the intermolecular interactions. We show that these perturbation analyses can be realized by performing (i) a principal component analysis using conditional expectations of truncated and shifted intermolecular potential energy terms and (ii) a functional principal component analysis using products of intermolecular forces and conditional cumulative densities. We refer to these analyses as intermolecular perturbation analysis (IPA) and distance-dependent intermolecular perturbation analysis (DIPA), respectively. For comparison of the IPA and the DIPA, we apply them to the alanine dipeptide isomerization in explicit water. Although the first IPA principal components discriminate two states (the α state and PPII (polyproline II) + β states) for larger cutoff length, the separation between the PPII state and the β state is unclear in the second IPA principal components. On the other hand, in the large cutoff value, DIPA eigenvalues converge faster than that for IPA and the top two DIPA principal components clearly identify the three states. By using the DIPA biplot, the contributions of the dipeptide-water interactions to each state are analyzed systematically. Since the DIPA improves the state identification and the convergence rate with retaining distance information, we conclude that the DIPA is a more practical method compared with the
Directory of Open Access Journals (Sweden)
Borza Natalia
2015-03-01
Full Text Available English as a second language (ESL) teachers instructing general English and English for specific purposes (ESP) in bilingual secondary schools face various challenges when it comes to choosing the main linguistic foci of language preparatory courses enabling non-native students to study academic subjects in English. ESL teachers intending to analyse English-language subject textbooks written for secondary school students, with the aim of gaining information about what bilingual secondary school students need to know in terms of language to process academic textbooks, cannot avoid dealing with a dilemma: it needs to be decided which way of analysing the texts in question is most appropriate. Handbooks of English applied linguistics are not immensely helpful with regard to this problem, as they tend not to give recommendations as to which major text-analytical approaches are advisable to follow in a pre-college setting. The present theoretical research aims to address this lacuna. Accordingly, the purpose of this pedagogically motivated theoretical paper is to investigate two major approaches to ESP text analysis, register analysis and genre analysis, in order to find the one more suitable for exploring the language use of secondary school subject texts from the point of view of an English as a second language teacher. Comparing and contrasting the merits and limitations of the two contrastive approaches allows for a better understanding of the nature of the two different perspectives on text analysis. The study examines the goals, the scope of analysis, and the achievements of the register perspective and those of the genre approach alike. The paper also investigates and reviews in detail the starkly different methods of ESP text analysis applied by the two perspectives. Discovering text analysis from a theoretical and methodological angle supports a practical aspect of English teaching, namely making an informed choice when setting out to analyse
Proteins analysed as virtual knots
Alexander, Keith; Dennis, Mark R
2016-01-01
Long, flexible physical filaments are naturally tangled and knotted, from macroscopic string down to long-chain molecules. The existence of knotting in a filament naturally affects its configuration and properties, and may be very stable or disappear rapidly under manipulation and interaction. Knotting has been previously identified in protein backbone chains, for which these mechanical constraints are of fundamental importance to their molecular functionality, despite their being open curves in which the knots are not mathematically well defined; knotting can only be identified by closing the termini of the chain somehow. We introduce a new method for resolving knotting in open curves using virtual knots, a wider class of topological objects that do not require a classical closure and so naturally capture the topological ambiguity inherent in open curves. We describe the results of analysing proteins in the Protein Data Bank by this new scheme, recovering and extending previous knotting results, and identify...
Retorisk analyse af historiske tekster
DEFF Research Database (Denmark)
Kock, Christian Erik J
2014-01-01
In recent years, rhetoric and the rhetorical tradition have attracted increasing interest from historians, such as, e.g., Quentin Skinner. The paper aims to explain and illustrate what may be understood by a rhetorical analysis (or "rhetorical criticism") of historical documents, i.e., how those scholars who identify themselves as rhetoricians tend to define and conduct such an analysis. It is argued that while rhetoricians would sympathize with Skinner's adoption of speech act theory in his reading of historical documents, they would generally extend their rhetorical readings of such documents to many more features than just the key concepts invoked in them. The paper discusses examples of rhetorical analyses done by prominent contemporary rhetoricians, including Edwin Black, Kenneth Burke, Maurice Charland, and Michael Leff. It relates its view of rhetorical documents to trends in current…
HGCal Simulation Analyses for CMS
Bruno, Sarah Marie
2015-01-01
This summer, I approached the topic of fast-timing detection of photons from Higgs decays via simulation analyses, working under the supervision of Dr. Adolf Bornheim of the California Institute of Technology. My specific project focused on simulating the high granularity calorimeter for the Compact Muon Solenoid (CMS) experiment. CMS detects particles using calorimeters. The Electromagnetic Calorimeter (ECal) is arranged cylindrically to form a barrel section and two “endcaps.” Previously, both the barrel and endcap have employed lead tungstate crystal detectors, known as the “shashlik” design. The crystal detectors, however, rapidly degrade from exposure to radiation. This effect is most pronounced in the endcaps. To avoid the high expense of frequently replacing degraded detectors, it was recently decided to eliminate the endcap crystals in favor of an arrangement of silicon detectors known as the “High Granularity Calorimeter” (HGCal), while leaving the barrel detector technology unchanged. T...
Analysing Protocol Stacks for Services
DEFF Research Database (Denmark)
Gao, Han; Nielson, Flemming; Nielson, Hanne Riis
2011-01-01
We show an approach, CaPiTo, to model service-oriented applications using process algebras such that, on the one hand, we can achieve a certain level of abstraction without being overwhelmed by the underlying implementation details and, on the other hand, we respect the concrete industrial standards used for implementing the service-oriented applications. By doing so, we will be able not only to reason about applications at different levels of abstraction, but also to build a bridge between the views of researchers on formal methods and developers in industry. We apply our approach to the financial case study taken from Chapter 0-3. Finally, we develop a static analysis to analyse the security properties as they emerge at the level of concrete industrial protocols.
Energy Technology Data Exchange (ETDEWEB)
Worapittayaporn, S.; Eyink, J.; Movahed, M. [AREVA NP GmbH, P.O. Box 3220, D-91050 Erlangen (Germany)
2008-07-01
In severe accidents with core melting large amounts of hydrogen may be released into the containment. The EPR provides a combustible gas control system to prevent hydrogen combustion modes with the potential to challenge the containment integrity due to excessive pressure and temperature loads. This paper outlines the approach for the verification of the effectiveness and efficiency of this system. Specifically, the justification is a multi-step approach. It involves the deployment of integral codes, lumped parameter containment codes and CFD codes and the use of the sigma criterion, which provides the link to the broad experimental data base for flame acceleration (FA) and deflagration to detonation transition (DDT). The procedure is illustrated with an example. The performed analyses show that hydrogen combustion at any time does not lead to pressure or temperature loads that threaten the containment integrity of the EPR. (authors)
Analysing performance through value creation
Directory of Open Access Journals (Sweden)
Adrian TRIFAN
2015-12-01
This paper draws a parallel between measuring financial performance in two variants: the first using data offered by accounting, which lays emphasis on maximizing profit, and the second which aims to create value. The traditional approach to performance is based on indicators from accounting data: ROI, ROE, EPS. Traditional management, based on analysing accounting data, has shown its limits, and a new approach is needed, based on creating value. The evaluation of value-based performance tries to avoid the errors due to accounting data by using other specific indicators: EVA, MVA, TSR, CVA. The main objective shifts from maximizing income to maximizing the value created for shareholders. The theoretical part is accompanied by a practical analysis regarding the creation of value and an analysis of the main indicators which evaluate this concept.
Uncertainty and Sensitivity Analyses Plan
Energy Technology Data Exchange (ETDEWEB)
Simpson, J.C.; Ramsdell, J.V. Jr.
1993-04-01
Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project.
Analysing the Wrongness of Killing
DEFF Research Database (Denmark)
Di Nucci, Ezio
2014-01-01
This article provides an in-depth analysis of the wrongness of killing by comparing different versions of three influential views: the traditional view that killing is always wrong; the liberal view that killing is wrong if and only if the victim does not want to be killed; and Don Marquis' future of value account of the wrongness of killing. In particular, I illustrate the advantages that a basic version of the liberal view and a basic version of the future of value account have over competing alternatives. Still, ultimately none of the views analysed here is satisfactory; but the different reasons why those competing views fail provide important insights into the ethics of killing.
Proteins analysed as virtual knots
Alexander, Keith; Taylor, Alexander J.; Dennis, Mark R.
2017-02-01
Long, flexible physical filaments are naturally tangled and knotted, from macroscopic string down to long-chain molecules. The existence of knotting in a filament naturally affects its configuration and properties, and may be very stable or disappear rapidly under manipulation and interaction. Knotting has been previously identified in protein backbone chains, for which these mechanical constraints are of fundamental importance to their molecular functionality, despite their being open curves in which the knots are not mathematically well defined; knotting can only be identified by closing the termini of the chain somehow. We introduce a new method for resolving knotting in open curves using virtual knots, which are a wider class of topological objects that do not require a classical closure and so naturally capture the topological ambiguity inherent in open curves. We describe the results of analysing proteins in the Protein Data Bank by this new scheme, recovering and extending previous knotting results, and identifying topological interest in some new cases. The statistics of virtual knots in protein chains are compared with those of open random walks and Hamiltonian subchains on cubic lattices, identifying a regime of open curves in which the virtual knotting description is likely to be important.
Severe Accident Recriticality Analyses (SARA)
Energy Technology Data Exchange (ETDEWEB)
Frid, W. [Swedish Nuclear Power Inspectorate, Stockholm (Sweden); Hoejerup, F. [Risoe National Lab. (Denmark); Lindholm, I.; Miettinen, J.; Puska, E.K. [VTT Energy, Helsinki (Finland); Nilsson, Lars [Studsvik Eco and Safety AB, Nykoeping (Sweden); Sjoevall, H. [Teollisuuden Voima Oy (Finland)
1999-11-01
Recriticality in a BWR has been studied for a total loss of electric power accident scenario. In a BWR, the B₄C control rods would melt and relocate from the core before the fuel during core uncovery and heat-up. If electric power returns during this time-window, unborated water from ECCS systems will start to reflood the partly control-rod-free core. Recriticality might take place, for which the only mitigating mechanisms are the Doppler effect and void formation. In order to assess the impact of recriticality on reactor safety, including accident management measures, the following issues have been investigated in the SARA project: 1. the energy deposition in the fuel during the super-prompt power burst, 2. the quasi steady-state reactor power following the initial power burst, and 3. the containment response to elevated quasi steady-state reactor power. The approach was to use three computer codes and to further develop and adapt them for the task. The codes were SIMULATE-3K, APROS and RECRIT. Recriticality analyses were carried out for a number of selected reflooding transients for the Oskarshamn 3 plant in Sweden with SIMULATE-3K and for the Olkiluoto 1 plant in Finland with all three codes. The core state initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality - both super-prompt power bursts and quasi steady-state power generation - for the studied range of parameters, i.e. with core uncovery and heat-up to maximum core temperatures around 1800 K and water flow rates of 45 kg/s to 2000 kg/s injected into the downcomer. Since the recriticality takes place in a small fraction of the core, the power densities are high, which results in large energy deposition in the fuel during the power burst in some accident scenarios. The highest value, 418 cal/g, was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding
ITER Safety Analyses with ISAS
Gulden, W.; Nisan, S.; Porfiri, M.-T.; Toumi, I.; de Gramont, T. Boubée
1997-06-01
Detailed analyses of accident sequences for the International Thermonuclear Experimental Reactor (ITER), from an initiating event to the environmental release of activity, have involved in the past the use of different types of computer codes in a sequential manner. Since these codes were developed at different time scales in different countries, there is no common computing structure to enable automatic data transfer from one code to the other, and no possibility exists to model or to quantify the effect of coupled physical phenomena. To solve this problem, the Integrated Safety Analysis System of codes (ISAS) is being developed, which allows users to integrate existing computer codes in a coherent manner. This approach is based on the utilization of a command language (GIBIANE) acting as a “glue” to integrate the various codes as modules of a common environment. The present version of ISAS allows comprehensive (coupled) calculations of a chain of codes such as ATHENA (thermal-hydraulic analysis of transients and accidents), INTRA (analysis of in-vessel chemical reactions, pressure build-up, and distribution of reaction products inside the vacuum vessel and adjacent rooms), and NAUA (transport of radiological species within buildings and to the environment). In the near future, the integration of SAFALY (simultaneous analysis of plasma dynamics and thermal behavior of in-vessel components) is also foreseen. The paper briefly describes the essential features of ISAS development and the associated software architecture. It gives first results of a typical ITER accident sequence, a loss of coolant accident (LOCA) in the divertor cooling loop inside the vacuum vessel, amply demonstrating ISAS capabilities.
Asquith, William H.; Kiang, Julie E.; Cohn, Timothy A.
2017-07-17
The U.S. Geological Survey (USGS), in cooperation with the U.S. Nuclear Regulatory Commission, has investigated statistical methods for probabilistic flood hazard assessment to provide guidance on very low annual exceedance probability (AEP) estimation of peak-streamflow frequency and the quantification of corresponding uncertainties using streamgage-specific data. The term “very low AEP” implies exceptionally rare events, defined as those having AEPs less than about 0.001 (or 1 × 10⁻³ in scientific notation, or for brevity 10⁻³). Such low AEPs are of great interest to those involved with peak-streamflow frequency analyses for critical infrastructure, such as nuclear power plants. Flood frequency analyses at streamgages are most commonly based on annual instantaneous peak streamflow data and a probability distribution fit to these data. The fitted distribution provides a means to extrapolate to very low AEPs. Within the United States, the Pearson type III probability distribution, when fit to the base-10 logarithms of streamflow, is widely used, but other distribution choices exist. The USGS-PeakFQ software, implementing the Pearson type III within the Federal agency guidelines of Bulletin 17B (method of moments) and updates to the expected moments algorithm (EMA), was specially adapted for an “Extended Output” user option to provide estimates at selected AEPs from 10⁻³ to 10⁻⁶. Parameter estimation methods, in addition to product moments and EMA, include L-moments, maximum likelihood, and maximum product of spacings (maximum spacing estimation). This study comprehensively investigates multiple distributions and parameter estimation methods for two USGS streamgages (01400500 Raritan River at Manville, New Jersey, and 01638500 Potomac River at Point of Rocks, Maryland). The results of this study specifically involve the four methods for parameter estimation and up to nine probability distributions, including the generalized extreme value, generalized
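The fitting-and-extrapolation step described above can be sketched compactly. The following is a minimal method-of-moments illustration, not the USGS-PeakFQ implementation: it uses the standard Wilson-Hilferty frequency-factor approximation and the bias-adjusted sample skew, and omits Bulletin 17B refinements such as regional skew weighting and low-outlier handling.

```python
import math
from statistics import NormalDist, mean, stdev

def lp3_quantile(peaks, aep):
    """Estimate a peak-flow quantile by fitting a Pearson type III
    distribution to the base-10 logarithms of annual peaks (method of
    moments), then applying the Wilson-Hilferty frequency factor."""
    logs = [math.log10(q) for q in peaks]
    m, s = mean(logs), stdev(logs)
    n = len(logs)
    # Bias-adjusted sample skew of the log-transformed peaks.
    g = (n / ((n - 1) * (n - 2))) * sum(((x - m) / s) ** 3 for x in logs)
    z = NormalDist().inv_cdf(1.0 - aep)   # standard normal deviate
    if abs(g) < 1e-9:
        k = z  # zero log-space skew: reduces to a log-normal quantile
    else:
        # Wilson-Hilferty approximation of the Pearson III frequency factor.
        k = (2.0 / g) * ((1.0 + g * z / 6.0 - g ** 2 / 36.0) ** 3 - 1.0)
    return 10.0 ** (m + k * s)
```

Extrapolating to AEPs of 10⁻³ and below magnifies the influence of the fitted skew, which is why the study compares several distributions and estimation methods.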
Pawnee Nation Energy Option Analyses
Energy Technology Data Exchange (ETDEWEB)
Matlock, M.; Kersey, K.; Riding In, C.
2009-07-21
Pawnee Nation of Oklahoma Energy Option Analyses In 2003, the Pawnee Nation leadership identified the need for the tribe to comprehensively address its energy issues. During a strategic energy planning workshop a general framework was laid out and the Pawnee Nation Energy Task Force was created to work toward further development of the tribe’s energy vision. The overarching goals of the “first steps” project were to identify the most appropriate focus for its strategic energy initiatives going forward, and to provide information necessary to take the next steps in pursuit of the “best fit” energy options. Description of Activities Performed The research team reviewed existing data pertaining to the availability of biomass (focusing on woody biomass, agricultural biomass/bio-energy crops, and methane capture), solar, wind and hydropower resources on the Pawnee-owned lands. Using these data, combined with assumptions about costs and revenue streams, the research team performed preliminary feasibility assessments for each resource category. The research team also reviewed available funding resources and made recommendations to Pawnee Nation highlighting those resources with the greatest potential for financially-viable development, both in the near-term and over a longer time horizon. Findings and Recommendations Due to a lack of financial incentives for renewable energy, particularly at the state level, combined with mediocre renewable energy resources, renewable energy development opportunities are limited for Pawnee Nation. However, near-term potential exists for development of solar hot water at the gym, and an exterior wood-fired boiler system at the tribe’s main administrative building. Pawnee Nation should also explore options for developing LFGTE resources in collaboration with the City of Pawnee. Significant potential may also exist for development of bio-energy resources within the next decade. Pawnee Nation representatives should closely monitor
Techniques for Analysing Problems in Engineering Projects
DEFF Research Database (Denmark)
Thorsteinsson, Uffe
1998-01-01
Description of how CPM networks can be used for analysing complex problems in engineering projects.
10 CFR 61.13 - Technical analyses.
2010-01-01
Title 10, Energy; Nuclear Regulatory Commission (continued); Licensing Requirements for Land Disposal of Radioactive Waste; Licenses; § 61.13 Technical analyses. The specific technical information must also include the following analyses...
Automatic incrementalization of Prolog based static analyses
DEFF Research Database (Denmark)
Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan;
2007-01-01
Modern development environments integrate various static analyses into the build process. Analyses that reanalyse the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic prog...
Two-parameter Rankine Heat Pumps’ COP Equations
Directory of Open Access Journals (Sweden)
Samuel Sunday Adefila
2012-05-01
Equations for the coefficient of performance of an ideal vapour-compression heat pump (COPR) containing two fit parameters are reported in this work. These equations contain either a temperature term alone, or temperature and pressure terms, as the only thermodynamic variable(s). The best equation gave errors within 5% over a wide range of temperature lift and for different working-fluid types, including fluorocarbons, hydrocarbons and inorganic fluids. In these respects the equation performs better than the one-parameter models reported earlier.
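The abstract does not reproduce the fitted equations themselves, but the idea of a two-fit-parameter COP correlation can be illustrated. A minimal sketch under an assumed, hypothetical model form COP = a·Tc/(Tc − Te) + b, where Tc/(Tc − Te) is the ideal (Carnot) heating COP serving as the temperature term, and a, b are the two fit parameters determined by ordinary least squares:

```python
def fit_two_param_cop(T_evap, T_cond, cop_measured):
    """Least-squares fit of the assumed two-parameter model
    COP = a * Tc/(Tc - Te) + b. Temperatures in kelvin.
    The functional form is illustrative, not the paper's."""
    x = [tc / (tc - te) for te, tc in zip(T_evap, T_cond)]
    y = cop_measured
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(v * v for v in x)
    sxy = sum(u * v for u, v in zip(x, y))
    # Closed-form simple linear regression: slope a, intercept b.
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b
```

With measured COPs for a set of evaporating/condensing temperature pairs, the two fitted constants then predict COP across the temperature-lift range.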
Parallel axes gear set optimization in two-parameter space
Theberge, Y.; Cardou, A.; Cloutier, L.
1991-05-01
This paper presents a method for optimal spur and helical gear transmission design that may be used in a computer aided design (CAD) approach. The design objective is generally taken as obtaining the most compact set for a given power input and gear ratio. A mixed design procedure is employed which relies both on heuristic considerations and computer capabilities. Strength and kinematic constraints are considered in order to define the domain of feasible designs. Constraints allowed include: pinion tooth bending strength, gear tooth bending strength, surface stress (resistance to pitting), scoring resistance, pinion involute interference, gear involute interference, minimum pinion tooth thickness, minimum gear tooth thickness, and profile or transverse contact ratio. A computer program was developed which allows the user to input the problem parameters, to select the calculation procedure, to see constraint curves in graphic display, to have an objective function level curve drawn through the design space, to point at a feasible design point and to have constraint values calculated at that point. The user can also modify some of the parameters during the design process.
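The procedure described above amounts to scanning a two-parameter design space against strength and kinematic constraints and keeping the most compact feasible point. A minimal sketch with hypothetical, simplified stand-in constraints (the paper's actual bending-stress, pitting, scoring and interference formulas are not reproduced; the module/face-width parameters and numeric bounds below are illustrative assumptions):

```python
def most_compact_design(candidates, constraints, size):
    """Scan a two-parameter design space and return the feasible design
    minimizing a compactness measure. `constraints` is a list of
    predicates design -> bool; `size` maps a design to a scalar."""
    feasible = [d for d in candidates if all(ok(d) for ok in constraints)]
    return min(feasible, key=size, default=None)

# Hypothetical illustration: module m [mm] and face width b [mm].
candidates = [(m, b) for m in (2, 3, 4, 5) for b in range(10, 61, 5)]
constraints = [
    lambda d: d[0] * d[1] >= 90,   # stand-in for a tooth-strength constraint
    lambda d: d[1] <= 12 * d[0],   # stand-in for a face-width/module limit
]
best = most_compact_design(candidates, constraints, size=lambda d: d[0] * d[1])
```

In the paper this scan is done interactively, with constraint curves drawn in the two-parameter plane and an objective-function level curve passed through the feasible domain.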
Flux Vacua Statistics for Two-Parameter Calabi-Yau's
Misra, A
2004-01-01
We study the number of flux vacua for type IIB string theory on an orientifold of the Calabi-Yau expressed as a hypersurface in WCP^4[1,1,2,2,6] by evaluating a suitable integral over the complex-structure moduli space as per the conjecture of Douglas and Ashok. We show that away from the singular conifold locus, one gets the expected power law, and that the (neighborhood) of the conifold locus indeed acts as an attractor in the (complex structure) moduli space. We also study (non)supersymmetric solutions near the conifold locus.
7 CFR 94.102 - Analyses available.
2010-01-01
... analyses for total ash, fat by acid hydrolysis, moisture, salt, protein, beta-carotene, catalase... monocytogenes, proteolytic count, psychrotrophic bacteria, Salmonella, Staphylococcus, thermoduric bacteria,...
[Anne Arold. Kontrastive Analyse...] / Paul Alvre
Alvre, Paul, 1921-2008
2001-01-01
Review of: Arold, Anne. Kontrastive Analyse der Wortbildungsmuster im Deutschen und im Estnischen (am Beispiel der Aussehensadjektive). Tartu, 2000. (Dissertationes philologiae germanicae Universitatis Tartuensis)
Institute of Scientific and Technical Information of China (English)
史建红; 林红梅
2013-01-01
Methods for interval estimation and hypothesis testing about the ratio of the mean lifetimes of two independent two-parameter exponential distributions, based on the concepts of generalized p-values and generalized confidence intervals, are proposed. The generalized confidence interval for the ratio of mean lifetimes is derived, and its coverage probability and interval length are assessed by simulation and compared with the approximate confidence interval in the existing literature. The coverage probabilities of the proposed approach are very close to the nominal level, and both coverage and interval length are better than those of the approximate interval, especially for small samples. The proposed approaches are conceptually simple and easy to use. Similar procedures are developed for constructing confidence intervals and hypothesis tests about the difference between the means of two independent two-parameter exponential distributions.
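The generalized-variable approach can be sketched with a Monte Carlo generalized pivotal quantity. This is an illustrative implementation, not the authors' code; it uses the standard facts that for a two-parameter exponential sample with minimum X₍₁₎ and S = Σ(Xᵢ − X₍₁₎), the quantities 2n(X₍₁₎ − μ)/σ ~ χ²(2) and 2S/σ ~ χ²(2n−2) are independent pivots.

```python
import random

def gpq_mean_draws(data, n_draws, rng):
    """Generalized pivotal quantity draws for the mean (mu + sigma) of a
    two-parameter exponential sample, built from the independent pivots
    2n(X(1) - mu)/sigma ~ chi2(2) and 2S/sigma ~ chi2(2n-2)."""
    n = len(data)
    x1 = min(data)
    s = sum(data) - n * x1
    draws = []
    for _ in range(n_draws):
        v = rng.gammavariate((2 * n - 2) / 2.0, 2.0)  # chi2(2n-2) variate
        e = rng.gammavariate(1.0, 2.0)                # chi2(2) variate
        sigma_star = 2.0 * s / v
        mu_star = x1 - sigma_star * e / (2.0 * n)
        draws.append(mu_star + sigma_star)            # GPQ for the mean
    return draws

def ratio_gci(sample1, sample2, level=0.95, n_draws=20000, seed=1):
    """Monte Carlo generalized confidence interval for the ratio of the
    two samples' mean lifetimes."""
    rng = random.Random(seed)
    d1 = gpq_mean_draws(sample1, n_draws, rng)
    d2 = gpq_mean_draws(sample2, n_draws, rng)
    ratios = sorted(a / b for a, b in zip(d1, d2))
    alpha = (1.0 - level) / 2.0
    lo = ratios[int(alpha * n_draws)]
    hi = ratios[int((1.0 - alpha) * n_draws) - 1]
    return lo, hi
```

The empirical percentiles of the ratio draws give the generalized confidence limits; a generalized p-value for a hypothesized ratio can be read off the same Monte Carlo sample.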
Novel Algorithms for Astronomical Plate Analyses
Indian Academy of Sciences (India)
Rene Hudec; Lukas Hudec
2011-03-01
Powerful computers and dedicated software allow effective data mining and scientific analyses in astronomical plate archives. We give and discuss examples of newly developed algorithms for astronomical plate analyses, e.g., searches for optical transients, as well as for major spectral and brightness changes.
Moving Crystal Slow-Neutron Wavelength Analyser
DEFF Research Database (Denmark)
Buras, B.; Kjems, Jørgen
1973-01-01
Experimental proof that a moving single crystal can serve as a slow-neutron wavelength analyser of special features is presented. When the crystal moves with a velocity h/(2 md) (h-Planck constant, m-neutron mass, d-interplanar spacing) perpendicular to the diffracting plane and the analysed...
Diversity of primary care systems analysed.
Kringos, D.; Boerma, W.; Bourgueil, Y.; Cartier, T.; Dedeu, T.; Hasvold, T.; Hutchinson, A.; Lember, M.; Oleszczyk, M.; Pavlick, D.R.
2015-01-01
This chapter analyses differences between countries and explains why countries differ regarding the structure and process of primary care. The components of primary care strength used in the analyses are health policy-making, workforce development, and the care process itself (see Fig.
Institute of Scientific and Technical Information of China (English)
张仕新; 刘义乐; 陈杰翔
2012-01-01
Building on the Weibull proportional hazards model, the condition-monitoring test interval is calculated with an acceptable failure risk as the constraint. Because equipment use is affected by multiple attributes such as test expense, failure risk, operational availability and downtime, a weighted-projection compromise method is used to establish a fuzzy multi-attribute decision-making model for the test interval, realizing comprehensively optimized decision-making on the condition-test interval under multi-factor conditions. Finally, the applicability of the model is validated through a worked example.
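The risk-constrained interval in the first step has a closed form for a plain two-parameter Weibull hazard. A minimal sketch under that simplification (the proportional-hazards covariates and the multi-attribute weighting of the paper are not modeled here):

```python
import math

def weibull_test_interval(beta, eta, p_acc):
    """First inspection interval tau such that the cumulative Weibull
    failure probability F(tau) = 1 - exp(-(tau/eta)^beta) equals the
    acceptable failure risk p_acc (shape beta, scale eta)."""
    return eta * (-math.log(1.0 - p_acc)) ** (1.0 / beta)

def next_interval(t_now, beta, eta, p_acc):
    """Next interval given survival to t_now: choose dt so that the
    conditional failure probability over (t_now, t_now + dt] is p_acc,
    i.e. solve R(t_now + dt) / R(t_now) = 1 - p_acc with
    R(t) = exp(-(t/eta)^beta)."""
    target = (t_now / eta) ** beta - math.log(1.0 - p_acc)
    return eta * target ** (1.0 / beta) - t_now
```

For wear-out behaviour (beta > 1) the hazard increases with age, so successive risk-constrained intervals shrink, which is the usual motivation for condition-based scheduling.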
Cost-Benefit Analyses of Transportation Investments
DEFF Research Database (Denmark)
Næss, Petter
2006-01-01
This paper discusses the practice of cost-benefit analyses of transportation infrastructure investment projects from the meta-theoretical perspective of critical realism. Such analyses are based on a number of untenable ontological assumptions about social value, human nature and the natural environment, reflected among other things in willingness-to-pay investigations. Accepting the ontological and epistemological assumptions of cost-benefit analysis involves an implicit acceptance of the ethical and political values favoured by these assumptions. Cost-benefit analyses of transportation investment projects tend to neglect long-term environmental consequences...
Level II Ergonomic Analyses, Dover AFB, DE
1999-02-01
IERA-RS-BR-TR-1999-0002, United States Air Force IERA: Level II Ergonomic Analyses, Dover AFB, DE. Andrew Marcotte, Marilyn Joyce (The Joyce Institute). Contents: 1.0 Introduction; 1.1 Purpose of the Level II Ergonomic Analyses; 1.2 Approach; 1.2.1 Initial Shop Selection and Administration of the...
Comparison with Russian analyses of meteor impact
Energy Technology Data Exchange (ETDEWEB)
Canavan, G.H.
1997-06-01
The inversion model for meteor impacts is used to discuss Russian analyses and compare principal results. For common input parameters, the models produce consistent estimates of impactor parameters. Directions for future research are discussed and prioritized.
Analyse of Maintenance Cost in ST
Jenssen, B W
2001-01-01
An analysis has been carried out in ST concerning the total costs for the division. Even though the target was the maintenance costs in ST, the overall budget has been analysed, since there is a close relation between investment and consolidation and the required level of maintenance. The purpose of the analysis was to focus on maintenance cost in ST as a ratio of total maintenance cost over the replacement value of the equipment, and to make some comparisons with other industries and laboratories. Families of equipment have been defined and their corresponding ratios calculated. This first approach gives us some "quantitative" measurements. The analysis should be combined with performance indicators (more "qualitative" measurements) that tell us how well we are performing. This will help us defend our budget, set better priorities, and satisfy the requirements of our external auditors.
Understanding Human Error Based on Automated Analyses
National Aeronautics and Space Administration — This is a report on a continuing study of automated analyses of experiential textual reports to gain insight into the causal factors of human errors in aviation...
Multivariate differential analyses of adolescents' experiences of ...
African Journals Online (AJOL)
and second order factor analyses, correlations, multiple regression, MANOVA, ... This does not mean that the high levels of violence, crime and abuse that are aggravated by socio economic factors such as poverty, unemployment, corruption, ...
Anthocyanin analyses of Vaccinium fruit dietary supplements
Vaccinium fruit ingredients within dietary supplements were identified by comparisons with anthocyanin analyses of known Vaccinium profiles (demonstration of anthocyanin fingerprinting). Available Vaccinium supplements were purchased and analyzed; their anthocyanin profiles (based on HPLC separation...
A critical analysis of Leviathan & Samfundspagten (The Social Contract)
Bjelobrk, Aleksandar
2006-01-01
This report examines the contract theories of the political philosophers Thomas Hobbes (1588-1679) and Jean-Jacques Rousseau (1712-1778). Taking as its starting point Part 1, Of Man, and Part 2, Of Common-Wealth, of Leviathan by Thomas Hobbes (Hobbes 1991, originally published 1651) and Samfundspagten (The Social Contract) by Jean-Jacques Rousseau (Rousseau 1987, originally published 1762), inconsistencies in their theories are identified through two separate analyses. On the basis of the inconsistencies found, a comparative analysis is then carried out,...
Thermal Analyses of Cross-Linked Polyethylene
Directory of Open Access Journals (Sweden)
Radek Polansky
2007-01-01
The paper summarizes results obtained from structural analysis measurements: Differential Scanning Calorimetry (DSC), Thermogravimetry (TG), Thermomechanical Analysis (TMA) and Fourier-transform infrared spectroscopy (FT-IR). Samples of cross-linked polyethylene cable insulation were tested via these analyses. The DSC and TG were carried out using a TA Instruments SDT Q600 simultaneous thermal analyzer coupled to a Nicolet 380 Fourier-transform infrared spectrometer. The thermomechanical analysis was carried out with a TA Instruments TMA Q400EM apparatus.
Study Of Ceramic-Polymer Composites Reliability Based On The Bending Strength Test
Directory of Open Access Journals (Sweden)
Walczak Agata
2015-11-01
In this paper an assessment of the structural reliability of selected light-cured dental composites, based on biaxial flexural strength test results, is presented. A two-parameter Weibull distribution was applied as the reliability model in order to estimate the probability of strength maintenance in the analysed population. The Weibull parameters were interpreted as the characteristic material strength (the scale parameter) and a measure of structural reliability, i.e. the ability of each specimen from the general population to maintain that strength (the shape parameter). Twenty composite specimens underwent strength tests, including two "flow"-type composites and two standard composites (with typical filler content). The "flow"-type composites were characterized by lower characteristic strength and higher structural reliability compared with the other composites studied.
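A two-parameter Weibull strength model of this kind is commonly fitted by median-rank regression on a Weibull probability plot. A minimal sketch under that assumption (the paper does not state its estimation method; Bernard's approximation is used here for the median ranks):

```python
import math

def weibull_fit_mrr(strengths):
    """Fit a two-parameter Weibull distribution to strength data by
    median-rank regression: ln(-ln(1 - F_i)) = beta*ln(x_i) - beta*ln(eta),
    with Bernard's median-rank estimate F_i = (i - 0.3)/(n + 0.4)."""
    xs = sorted(strengths)
    n = len(xs)
    pts = []
    for i, x in enumerate(xs, start=1):
        f = (i - 0.3) / (n + 0.4)
        pts.append((math.log(x), math.log(-math.log(1.0 - f))))
    # Ordinary least squares: slope = shape beta, intercept = -beta*ln(eta).
    mx = sum(p[0] for p in pts) / n
    my = sum(p[1] for p in pts) / n
    sxx = sum((p[0] - mx) ** 2 for p in pts)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in pts)
    beta = sxy / sxx
    eta = math.exp(mx - my / beta)
    return beta, eta

def reliability(stress, beta, eta):
    """Probability that a specimen survives the given stress level."""
    return math.exp(-(stress / eta) ** beta)
```

The fitted scale parameter eta is the characteristic strength (63.2nd-percentile failure stress) and the shape parameter beta is the Weibull modulus: a higher beta means less scatter, hence the "higher structural reliability" reported for the flow composites despite their lower characteristic strength.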
Impact of ontology evolution on functional analyses.
Groß, Anika; Hartung, Michael; Prüfer, Kay; Kelso, Janet; Rahm, Erhard
2012-10-15
Ontologies are used in the annotation and analysis of biological data. As knowledge accumulates, ontologies and annotation undergo constant modifications to reflect this new knowledge. These modifications may influence the results of statistical applications such as functional enrichment analyses that describe experimental data in terms of ontological groupings. Here, we investigate to what degree modifications of the Gene Ontology (GO) impact these statistical analyses for both experimental and simulated data. The analysis is based on new measures for the stability of result sets and considers different ontology and annotation changes. Our results show that past changes in the GO are non-uniformly distributed over different branches of the ontology. Considering the semantic relatedness of significant categories in analysis results allows a more realistic stability assessment for functional enrichment studies. We observe that the results of term-enrichment analyses tend to be surprisingly stable despite changes in ontology and annotation.
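Functional enrichment analyses of the kind whose stability is studied here typically score a category with a hypergeometric (one-sided Fisher) test, so a change in annotation changes the category size K and total N, and hence the p-value. A minimal sketch of that scoring step (the stability measures introduced in the paper are not reproduced):

```python
from math import comb

def enrichment_pvalue(k, n, K, N):
    """Hypergeometric upper-tail p-value: the probability of observing at
    least k study-set genes annotated to a category, when the study set
    has n genes and the category has K genes out of N annotated genes."""
    denom = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / denom
```

Re-running this score before and after an ontology/annotation update, and comparing which categories stay significant, is the kind of stability question the paper quantifies.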
Random error in cardiovascular meta-analyses
DEFF Research Database (Denmark)
Albalawi, Zaina; McAlister, Finlay A; Thorlund, Kristian;
2013-01-01
BACKGROUND: Cochrane reviews are viewed as the gold standard in meta-analyses given their efforts to identify and limit systematic error which could cause spurious conclusions. The potential for random error to cause spurious conclusions in meta-analyses is less well appreciated. METHODS: We...... examined all reviews approved and published by the Cochrane Heart Group in the 2012 Cochrane Library that included at least one meta-analysis with 5 or more randomized trials. We used trial sequential analysis to classify statistically significant meta-analyses as true positives if their pooled sample size...... but infrequently recognized, even among methodologically robust reviews published by the Cochrane Heart Group. Meta-analysts and readers should incorporate trial sequential analysis when interpreting results....
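The pooled-sample-size benchmark used in trial sequential analysis is usually the optimal information size: the sample a single adequately powered trial would need. A hedged sketch using the textbook two-proportion formula, not the Cochrane/TSA software's exact implementation; the event rates below are invented:

```python
import math
from statistics import NormalDist

def required_information_size(p_control, rrr, alpha=0.05, power=0.80):
    """Approximate total sample size (both arms) needed to detect a
    relative risk reduction `rrr` from a control event proportion
    `p_control`, via the standard two-proportion normal-approximation
    formula."""
    p_exp = p_control * (1 - rrr)
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided type I error
    z_b = NormalDist().inv_cdf(power)           # power (type II error)
    per_group = ((z_a + z_b) ** 2
                 * (p_control * (1 - p_control) + p_exp * (1 - p_exp))
                 / (p_control - p_exp) ** 2)
    return 2 * math.ceil(per_group)

# e.g. 10% control event rate, 20% relative risk reduction (made-up values)
n = required_information_size(0.10, 0.20)
```

A meta-analysis whose pooled sample falls well short of this benchmark can be statistically significant purely through random error, which is the paper's point.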
Eksakte metodar for analyse av tovegstabellar
Aaberge, Rolf
1980-01-01
Most of the mathematical-statistical methods developed for the analysis of tables rest on the assumption that the number of observations in the table cells is large. Haldorsen (1977a, 1977b) describes methods that depend on this requirement. In this report we present exact methods for the analysis of two-way tables, i.e. methods that are valid even when the cell counts are small. In many surveys the observations will often express what kind of ...
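The classic exact method for a 2x2 two-way table with small cell counts is Fisher's exact test, which enumerates every table with the observed margins. A minimal pure-Python sketch of the two-sided version:

```python
from math import comb

def fisher_exact_p(a, b, c, d):
    """Two-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]]:
    sum the probabilities of every table with the same margins whose
    probability does not exceed that of the observed table."""
    row1, row2, col1 = a + b, c + d, a + c
    n = row1 + row2
    def prob(x):  # hypergeometric probability of x in the top-left cell
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)
    p_obs = prob(a)
    lo, hi = max(0, col1 - row2), min(row1, col1)
    return sum(prob(x) for x in range(lo, hi + 1)
               if prob(x) <= p_obs * (1 + 1e-9))
```

Because the enumeration is exact, no large-sample approximation is involved, which is what makes such methods valid for small cell counts.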
A theoretical framework for analysing preschool teaching
DEFF Research Database (Denmark)
Chaiklin, Seth
2014-01-01
This article introduces a theoretical framework for analysing preschool teaching as a historically-grounded societal practice. The aim is to present a unified framework that can be used to analyse and compare both historical and contemporary examples of preschool teaching practice within and across...... national traditions. The framework has two main components, an analysis of preschool teaching as a practice, formed in relation to societal needs, and an analysis of the categorical relations which necessarily must be addressed in preschool teaching activity. The framework is introduced and illustrated...
The application analyses for primary spectrum pyrometer
Institute of Scientific and Technical Information of China (English)
FU; TaiRan
2007-01-01
In applications of primary spectrum pyrometry, the application issues of measurement range and measurement partition were investigated through theoretical analyses based on the dynamic range and minimum sensitivity of the sensor. For a developed primary spectrum pyrometer, theoretical predictions of the measurement range and the distributions of the measurement partition were obtained through numerical simulations. Measurement experiments with a high-temperature blackbody and a standard temperature lamp were then performed to verify the theoretical analyses and numerical results. The research therefore provides helpful support for applications of the primary spectrum pyrometer and other radiation pyrometers.
Identifying, analysing and solving problems in practice.
Hewitt-Taylor, Jaqui
When a problem is identified in practice, it is important to clarify exactly what it is and establish the cause before seeking a solution. This solution-seeking process should include input from those directly involved in the problematic situation, to enable individuals to contribute their perspective, appreciate why any change in practice is necessary and what will be achieved by the change. This article describes some approaches to identifying and analysing problems in practice so that effective solutions can be devised. It includes a case study and examples of how the Five Whys analysis, fishbone diagram, problem tree analysis, and Seven-S Model can be used to analyse a problem.
Analyses of hydraulic performance of velocity caps
DEFF Research Database (Denmark)
Christensen, Erik Damgaard; Degn Eskesen, Mark Chr.; Buhrkall, Jeppe
2014-01-01
The hydraulic performance of a velocity cap has been investigated. Velocity caps are often used in connection with offshore intakes. CFD (computational fluid dynamics) examined the flow through the cap openings and further down into the intake pipes. This was combined with dimension analyses...... in order to analyse the effect of different layouts on the flow characteristics. In particular, flow configurations going all the way through the structure were revealed. A couple of suggestions to minimize the risk for flow through have been tested....
Uncertainty quantification approaches for advanced reactor analyses.
Energy Technology Data Exchange (ETDEWEB)
Briggs, L. L.; Nuclear Engineering Division
2009-03-24
The original approach to nuclear reactor design or safety analyses was to make very conservative modeling assumptions so as to ensure meeting the required safety margins. Traditional regulation, as established by the U.S. Nuclear Regulatory Commission, required conservatisms which have subsequently been shown to be excessive. The commission has therefore moved away from excessively conservative evaluations and has determined best-estimate calculations to be an acceptable alternative to conservative models, provided the best-estimate results are accompanied by an uncertainty evaluation which can demonstrate that, when a set of analysis cases which statistically account for uncertainties of all types are generated, there is a 95% probability that at least 95% of the cases meet the safety margins. To date, nearly all published work addressing uncertainty evaluations of nuclear power plant calculations has focused on light water reactors and on large-break loss-of-coolant accident (LBLOCA) analyses. However, there is nothing in the uncertainty evaluation methodologies that is limited to a specific type of reactor or to specific types of plant scenarios. These same methodologies can be equally well applied to analyses for high-temperature gas-cooled reactors and to liquid metal reactors, and they can be applied to steady-state calculations, operational transients, or severe accident scenarios. This report reviews and compares both statistical and deterministic uncertainty evaluation approaches. Recommendations are given for selection of an uncertainty methodology and for considerations to be factored into the process of evaluating uncertainties for advanced reactor best-estimate analyses.
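The 95%/95% criterion described above is commonly met with order-statistics (Wilks) sampling: run the code n times and take the bounding case. A sketch of the first-order, one-sided version of that criterion, not taken from the report itself:

```python
from math import ceil, log

def wilks_sample_size(coverage=0.95, confidence=0.95):
    """Smallest number of code runs n such that the maximum observed value
    bounds the `coverage` quantile of the output with probability at least
    `confidence` (first-order, one-sided Wilks criterion:
    1 - coverage**n >= confidence)."""
    return ceil(log(1 - confidence) / log(coverage))

n95 = wilks_sample_size()  # 59 runs for the 95%/95% criterion
```

The well-known result is 59 runs for 95%/95%; tightening both levels to 99% raises the requirement to 459 runs, which is why the choice of tolerance levels dominates the computational cost.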
Integrated genomic analyses of ovarian carcinoma
Bell, D.; Berchuck, A.; Birrer, M.; Chien, J.; Dao, F.; Dhir, R.; DiSaia, P.; Gabra, H.; Glenn, P.; Godwin, A. K.; Gross, J.; Hartmann, L.; Huang, M.; Huntsman, D. G.; Iacocca, M.; Imielinski, M.; Kalloger, S.; Karlan, B. Y.; Levine, D. A.; Mills, G. B.; Morrison, C.; Mutch, D.; Olvera, N.; Orsulic, S.; Park, K.; Petrelli, N.; Rabeno, B.; Rader, J. S.; Sikic, B. I.; Smith-McCune, K.; Sood, A. K.; Bowtell, D.; Penny, R.; Testa, J. R.; Chang, K.; Dinh, H. H.; Drummond, J. A.; Fowler, G.; Gunaratne, P.; Hawes, A. C.; Kovar, C. L.; Lewis, L. R.; Morgan, M. B.; Newsham, I. F.; Santibanez, J.; Reid, J. G.; Trevino, L. R.; Wu, Y. -Q.; Wang, M.; Muzny, D. M.; Wheeler, D. A.; Gibbs, R. A.; Getz, G.; Lawrence, M. S.; Cibulskis, K.; Sivachenko, A. Y.; Sougnez, C.; Voet, D.; Wilkinson, J.; Bloom, T.; Ardlie, K.; Fennell, T.; Baldwin, J.; Gabriel, S.; Lander, E. S.; Ding, L.; Fulton, R. S.; Koboldt, D. C.; McLellan, M. D.; Wylie, T.; Walker, J.; O'Laughlin, M.; Dooling, D. J.; Fulton, L.; Abbott, R.; Dees, N. D.; Zhang, Q.; Kandoth, C.; Wendl, M.; Schierding, W.; Shen, D.; Harris, C. C.; Schmidt, H.; Kalicki, J.; Delehaunty, K. D.; Fronick, C. C.; Demeter, R.; Cook, L.; Wallis, J. W.; Lin, L.; Magrini, V. J.; Hodges, J. S.; Eldred, J. M.; Smith, S. M.; Pohl, C. S.; Vandin, F.; Raphael, B. J.; Weinstock, G. M.; Mardis, R.; Wilson, R. K.; Meyerson, M.; Winckler, W.; Getz, G.; Verhaak, R. G. W.; Carter, S. L.; Mermel, C. H.; Saksena, G.; Nguyen, H.; Onofrio, R. C.; Lawrence, M. S.; Hubbard, D.; Gupta, S.; Crenshaw, A.; Ramos, A. H.; Ardlie, K.; Chin, L.; Protopopov, A.; Zhang, Juinhua; Kim, T. M.; Perna, I.; Xiao, Y.; Zhang, H.; Ren, G.; Sathiamoorthy, N.; Park, R. W.; Lee, E.; Park, P. J.; Kucherlapati, R.; Absher, D. M.; Waite, L.; Sherlock, G.; Brooks, J. D.; Li, J. Z.; Xu, J.; Myers, R. M.; Laird, P. W.; Cope, L.; Herman, J. G.; Shen, H.; Weisenberger, D. J.; Noushmehr, H.; Pan, F.; Triche, T.; Berman, B. P.; Van den Berg, D. J.; Buckley, J.; Baylin, S. B.; Spellman, P. 
T.; Purdom, E.; Neuvial, P.; Bengtsson, H.; Jakkula, L. R.; Durinck, S.; Han, J.; Dorton, S.; Marr, H.; Choi, Y. G.; Wang, V.; Wang, N. J.; Ngai, J.; Conboy, J. G.; Parvin, B.; Feiler, H. S.; Speed, T. P.; Gray, J. W.; Levine, D. A.; Socci, N. D.; Liang, Y.; Taylor, B. S.; Schultz, N.; Borsu, L.; Lash, A. E.; Brennan, C.; Viale, A.; Sander, C.; Ladanyi, M.; Hoadley, K. A.; Meng, S.; Du, Y.; Shi, Y.; Li, L.; Turman, Y. J.; Zang, D.; Helms, E. B.; Balu, S.; Zhou, X.; Wu, J.; Topal, M. D.; Hayes, D. N.; Perou, C. M.; Getz, G.; Voet, D.; Saksena, G.; Zhang, Junihua; Zhang, H.; Wu, C. J.; Shukla, S.; Cibulskis, K.; Lawrence, M. S.; Sivachenko, A.; Jing, R.; Park, R. W.; Liu, Y.; Park, P. J.; Noble, M.; Chin, L.; Carter, H.; Kim, D.; Karchin, R.; Spellman, P. T.; Purdom, E.; Neuvial, P.; Bengtsson, H.; Durinck, S.; Han, J.; Korkola, J. E.; Heiser, L. M.; Cho, R. J.; Hu, Z.; Parvin, B.; Speed, T. P.; Gray, J. W.; Schultz, N.; Cerami, E.; Taylor, B. S.; Olshen, A.; Reva, B.; Antipin, Y.; Shen, R.; Mankoo, P.; Sheridan, R.; Ciriello, G.; Chang, W. K.; Bernanke, J. A.; Borsu, L.; Levine, D. A.; Ladanyi, M.; Sander, C.; Haussler, D.; Benz, C. C.; Stuart, J. M.; Benz, S. C.; Sanborn, J. Z.; Vaske, C. J.; Zhu, J.; Szeto, C.; Scott, G. K.; Yau, C.; Hoadley, K. A.; Du, Y.; Balu, S.; Hayes, D. N.; Perou, C. M.; Wilkerson, M. D.; Zhang, N.; Akbani, R.; Baggerly, K. A.; Yung, W. K.; Mills, G. B.; Weinstein, J. N.; Penny, R.; Shelton, T.; Grimm, D.; Hatfield, M.; Morris, S.; Yena, P.; Rhodes, P.; Sherman, M.; Paulauskis, J.; Millis, S.; Kahn, A.; Greene, J. M.; Sfeir, R.; Jensen, M. A.; Chen, J.; Whitmore, J.; Alonso, S.; Jordan, J.; Chu, A.; Zhang, Jinghui; Barker, A.; Compton, C.; Eley, G.; Ferguson, M.; Fielding, P.; Gerhard, D. S.; Myles, R.; Schaefer, C.; Shaw, K. R. Mills; Vaught, J.; Vockley, J. B.; Good, P. J.; Guyer, M. S.; Ozenberger, B.; Peterson, J.; Thomson, E.; Cramer, D.W.
2011-01-01
A catalogue of molecular aberrations that cause ovarian cancer is critical for developing and deploying therapies that will improve patients' lives. The Cancer Genome Atlas project has analysed messenger RNA expression, microRNA expression, promoter methylation and DNA copy number in 489 high-grade
Challenges and Opportunities in Analysing Students Modelling
Blanco-Anaya, Paloma; Justi, Rosária; Díaz de Bustamante, Joaquín
2017-01-01
Modelling-based teaching activities have been designed and analysed from distinct theoretical perspectives. In this paper, we use one of them--the model of modelling diagram (MMD)--as an analytical tool in a regular classroom context. This paper examines the challenges that arise when the MMD is used as an analytical tool to characterise the…
Chemical analyses, antibacterial activity and genetic diversity ...
African Journals Online (AJOL)
2014-06-25
Key words: Citrus, genetic diversity, ISSR markers, chemical analyses, antibacterial. ... Development of DNA-based marker systems has advanced our ... Total acidity of the juices was determined by titration ... flavonoids have a large spectrum of biological activity.
Written Case Analyses and Critical Reflection.
Harrington, Helen L.; And Others
1996-01-01
The study investigated the use of case-based pedagogy to develop critical reflection in prospective teachers. Analysis of students' written analyses of dilemma-based cases found patterns showing evidence of students' open-mindedness, sense of professional responsibility, and wholeheartedness in their approach to teaching. (DB)
Comparison of veterinary import risk analyses studies
Vos-de Jong, de C.J.; Conraths, F.J.; Adkin, A.; Jones, E.M.; Hallgren, G.S.; Paisley, L.G.
2011-01-01
Twenty-two veterinary import risk analyses (IRAs) were audited: a) for inclusion of the main elements of risk analysis; b) between different types of IRAs; c) between reviewers' scores. No significant differences were detected between different types of IRAs, although quantitative IRAs and IRAs publ
Hybrid Logical Analyses of the Ambient Calculus
DEFF Research Database (Denmark)
Bolander, Thomas; Hansen, Rene Rydhof
2010-01-01
In this paper, hybrid logic is used to formulate three control flow analyses for Mobile Ambients, a process calculus designed for modelling mobility. We show that hybrid logic is very well-suited to express the semantic structure of the ambient calculus and how features of hybrid logic can...
Cosmetology: Task Analyses. Competency-Based Education.
Henrico County Public Schools, Glen Allen, VA. Virginia Vocational Curriculum Center.
These task analyses are designed to be used in combination with the "Trade and Industrial Education Service Area Resource" in order to implement competency-based education in the cosmetology program in Virginia. The task analysis document contains the task inventory, suggested task sequence lists, and content outlines for the secondary…
Meta-analyses on viral hepatitis
DEFF Research Database (Denmark)
Gluud, Lise L; Gluud, Christian
2009-01-01
This article summarizes the meta-analyses of interventions for viral hepatitis A, B, and C. Some of the interventions assessed are described in small trials with unclear bias control. Other interventions are supported by large, high-quality trials. Although attempts have been made to adjust...
Control of optics in random access analysers
Truchaud, A.
1988-01-01
The technology behind random access analysers involves flexible optical systems which can measure absorbances for one reaction at different scheduled times, and for several reactions performed simultaneously at different wavelengths. Optics control involves light sources (continuous and flash mode), indexing of monochromatic filters, injection-moulded plastic cuvettes, optical fibres, and polychromatic analysis.
The Economic Cost of Homosexuality: Multilevel Analyses
Baumle, Amanda K.; Poston, Dudley, Jr.
2011-01-01
This article builds on earlier studies that have examined "the economic cost of homosexuality," by using data from the 2000 U.S. Census and by employing multilevel analyses. Our findings indicate that partnered gay men experience a 12.5 percent earnings penalty compared to married heterosexual men, and a statistically insignificant earnings…
What's missing from avian global diversification analyses?
Reddy, Sushma
2014-08-01
The accumulation of vast numbers of molecular phylogenetic studies has contributed to huge knowledge gains in the evolutionary history of birds. This permits subsequent analyses of avian diversity, such as how and why diversification varies across the globe and among taxonomic groups. However, available genetic data for these meta-analyses are unevenly distributed across different geographic regions and taxonomic groups. To comprehend the impact of this variation on the interpretation of global diversity patterns, I examined the availability of genetic data for possible biases in geographic and taxonomic sampling of birds. I identified three main disparities of sampling that are geographically associated with latitude (temperate, tropical), hemispheres (East, West), and range size. Tropical regions, which host the vast majority of species, are substantially less studied. Moreover, Eastern regions, such as the Old World Tropics and Australasia, stand out as being disproportionately undersampled, with up to half of communities not being represented in recent studies. In terms of taxonomic discrepancies, a majority of genetically undersampled clades are exclusively found in tropical regions. My analysis identifies several disparities in the key regions of interest of global diversity analyses. Differential sampling can have considerable impacts on these global comparisons and call into question recent interpretations of latitudinal or hemispheric differences of diversification rates. Moreover, this review pinpoints understudied regions whose biota are in critical need of modern systematic analyses.
Antigen/Antibody Analyses in Leishmaniasis.
1983-09-01
antibodies in human sera with antigens of protozoan parasites. It was found that enzyme-substrate reactions had distinct advantages over typical ... autoradiographic procedures. Analyses of various sera identified a number of antigens of protozoan parasites which may be useful in discriminating infections.
Microstructure Analyses of NA-Nanodiamond Particles
2016-08-01
TEM analyses used a ...420 electron microscope at 120 kV. The objective of the scanning electron microscope (SEM) analysis was to examine the morphology and elemental chemistry of the detonated nanodiamonds (DND).
En kvantitativ metode til analyse af radio
Directory of Open Access Journals (Sweden)
Christine Lejre
2014-06-01
In the Danish as well as the international literature on radio, proposed methods for analysing the radio medium are sparse. This is presumably because radio is difficult to analyse: it is a medium that is not visualised in the form of images or supported by printed text. The purpose of this article is to describe a new quantitative method for the analysis of radio that takes particular account of the modality of the radio medium – sound structured as a linear progression in time. The method thus supports the radio medium both as a medium in time and as a blind medium. The method was developed in connection with a comparative analysis of cultural programmes on P1 and Radio24syv carried out for Danmarks Radio. The article argues that the method is well suited to analysing not only radio but also other media platforms and various journalistic subject areas.
Chemical Analyses of Silicon Aerogel Samples
van der Werf, I; De Leo, R; Marrone, S
2008-01-01
After five years of operating, two Aerogel counters: A1 and A2, taking data in Hall A at Jefferson Lab, suffered a loss of performance. In this note possible causes of degradation have been studied. In particular, various chemical and physical analyses have been carried out on several Aerogel tiles and on adhesive tape in order to reveal the presence of contaminants.
Metodekombination med kritisk analyse
DEFF Research Database (Denmark)
Bilfeldt, Annette
2007-01-01
"Metodekombination med kritisk teoretisk analyse" presents a combined method built around a research problem with a normative basis for the study of gender and working life. Focusing on rationality and barriers to humanisation in working life, and drawing inspiration from Marie Jahoda, it proposes...
Direct amino acid analyses of mozzarella cheese.
Hoskins, M N
1985-12-01
The amino acid content of mozzarella (low moisture, part skim milk) and asadero cheeses was determined by the column chromatographic method. Data from the direct analyses of the mozzarella cheeses were compared with the calculated amino acid composition reported in tables in Agriculture Handbook No. 8-1. Phenylalanine and tyrosine contents were found to be higher in the direct analyses than in the calculated data in Handbook No. 8-1 (1.390 gm and 1.127 gm for phenylalanine, and 1.493 gm and 1.249 gm for tyrosine per 100 gm edible portion, respectively). That is of particular concern in the dietary management of phenylketonuria, in which accuracy in computing levels of phenylalanine and tyrosine is essential.
Analyses of cavitation instabilities in ductile metals
DEFF Research Database (Denmark)
Tvergaard, Viggo
2007-01-01
, and also tests for a thin ductile metal layer bonding two ceramic blocks have indicated rapid void growth. Analyses for these material configurations are discussed here. When the void radius is very small, a nonlocal plasticity model is needed to account for observed size-effects, and recent analyses...... analyzed for a material containing a periodic distribution of spherical voids with two different void sizes, where the stress fields around larger voids may accelerate the growth of smaller voids. Another approach has been an analysis of a unit cell model in which a central cavity is discretely represented......, while the surrounding voids are represented by a porous ductile material model in terms of a field quantity that specifies the variation of the void volume fraction in the surrounding metal....
DCH analyses using the CONTAIN code
Energy Technology Data Exchange (ETDEWEB)
Hong, Sung Wan; Kim, Hee Dong [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)
1996-08-01
This report describes CONTAIN analyses performed during participation in the project 'DCH issue resolution for ice condenser plants', sponsored by the NRC at SNL. Even though the calculations were performed for an ice condenser plant, the CONTAIN code has been used to analyse many phenomena in PWR containments, and the DCH module can be applied to any plant type. The present ice condenser issue resolution effort was intended to provide guidance as to what might be needed to resolve DCH for ice condenser plants. It includes both a screening analysis and a scoping study if the screening analysis cannot provide a complete resolution. The following are the results concerning DCH loads, in descending order: 1. availability of ignition sources prior to vessel breach; 2. availability and effectiveness of ice in the ice condenser; 3. loads modeling uncertainties related to co-ejected RPV water; 4. other loads modeling uncertainties. 10 tabs., 3 figs., 14 refs. (Author).
Pratique de l'analyse fonctionelle
Tassinari, Robert
1997-01-01
Developing a product or service that is perfectly suited to the customer's needs and requirements is essential for a company. To leave nothing to chance, a rigorous methodology must be followed: that of functional analysis. This book defines the method precisely, along with its fields of application. It describes the most effective methods for product design and the pursuit of quality, and introduces the notion of internal functional analysis. A key work for optimising product design processes within a company. -- Key ideas, by Business Digest
Reliability of chemical analyses of water samples
Energy Technology Data Exchange (ETDEWEB)
Beardon, R.
1989-11-01
Ground-water quality investigations require reliable chemical analyses of water samples. Unfortunately, laboratory analytical results are often unreliable. The Uranium Mill Tailings Remedial Action (UMTRA) Project's solution to this problem was to establish a two-phase quality assurance program for the analysis of water samples. In the first phase, eight laboratories analyzed three solutions of known composition. The analytical accuracy of each laboratory was ranked and three laboratories were awarded contracts. The second phase consists of ongoing monitoring of the reliability of the selected laboratories. The following conclusions are based on two years' experience with the UMTRA Project's Quality Assurance Program. The reliability of laboratory analyses should not be taken for granted. Analytical reliability may be independent of the prices charged by laboratories. Quality assurance programs benefit both the customer and the laboratory.
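The first-phase ranking described above, scoring laboratories against solutions of known composition, can be sketched as a mean relative-error ranking. The scoring rule and all data below are illustrative only, not the UMTRA procedure:

```python
def rank_laboratories(reported, known):
    """Order labs by mean absolute relative error on reference solutions
    of known analyte concentrations (lower error ranks first)."""
    def score(lab):
        return sum(abs(v - k) / k
                   for v, k in zip(reported[lab], known)) / len(known)
    return sorted(reported, key=score)

# Hypothetical reference concentrations and lab reports (mg/L)
known = [10.0, 50.0, 2.5]
reported = {
    "lab_A": [10.1, 49.0, 2.6],
    "lab_B": [12.0, 55.0, 2.0],
    "lab_C": [10.0, 50.5, 2.5],
}
ranking = rank_laboratories(reported, known)
```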
Introduction à l'analyse fonctionnelle
Reischer, Corina; Hengartner, Walter
1981-01-01
The fruit of a collaboration between Professor Walter Hengartner of Université Laval and Marcel Lambert and Corina Reischer of the Université du Québec à Trois-Rivières, Introduction à l'analyse fonctionnelle is distinguished both by the breadth of its content and by the accessibility of its presentation. Without conceding anything in rigour, it is perfectly suited to a first course in functional analysis. Though intended first of all for students of mathematics, it will certainly be useful to graduate students in science and engineering.
En Billig GPS Data Analyse Platform
DEFF Research Database (Denmark)
Andersen, Ove; Christiansen, Nick; Larsen, Niels T.
2011-01-01
This article presents a complete software platform for the analysis of GPS data. The platform is built exclusively from open-source components. The individual components of the platform are described in detail. Advantages and disadvantages of using open source are discussed, including which IT policy measures ... organisations with a digital road map and GPS data can begin to carry out traffic analyses on these data. It is a requirement that suitable IT competencies are present in the organisation.
Center for Naval Analyses Annual Report 1982.
1982-01-01
recipients alike, was due mainly to temporary rather than permanent layoffs; they were unemployed for about the same length of time, and their post-layoff ... equal to 70 percent of average weekly wages for 52 weeks in the two years following layoff) apparently encouraged workers to remain unemployed longer ... Institute for Defense Analyses. William A. Nierenberg, Director of the Scripps Institution of Oceanography. Member, NASA Advisory Council. Member
ANALYSING SPACE: ADAPTING AND EXTENDING MULTIMODAL SEMIOTICS
Directory of Open Access Journals (Sweden)
Louise J. Ravelli
2015-07-01
In the field of multimodal discourse analysis, one of the most exciting sites of application is that of 3D space: examining aspects of the built environment for their meaning-making potential. The built environment – homes, offices, public buildings, parks, etc. – does indeed make meaning. These are spaces which speak; often their meanings are so familiar that we no longer hear what they say, while sometimes new and unusual sites draw attention to their meanings and are hotly contested. This article suggests ways of analysing 3D texts based on the framework of Kress and van Leeuwen (2006). This framework, developed primarily for the analysis of 2D images, has been successfully extended to a range of other multimodal texts. Extensions to the built environment include Pang (2004), O'Toole (1994), Ravelli (2006), Safeyton (2004), Stenglin (2004) and White (1994), whose studies inform the analyses presented here. The article identifies some of the key theoretical principles underlying this approach, including the notions of text, context and metafunction, and describes some of the main areas of analysis for 3D texts. Ways of bringing the analyses together are also considered. The analyses are demonstrated in relation to the Scientia building at the University of New South Wales, Australia.
Statistical analyses in disease surveillance systems.
Lescano, Andres G; Larasati, Ria Purwita; Sedyaningsih, Endang R; Bounlu, Khanthong; Araujo-Castillo, Roger V; Munayco-Escate, Cesar V; Soto, Giselle; Mundaca, C Cecilia; Blazes, David L
2008-11-14
The performance of disease surveillance systems is evaluated and monitored using a diverse set of statistical analyses throughout each stage of surveillance implementation. An overview of their main elements is presented, with a specific emphasis on syndromic surveillance directed to outbreak detection in resource-limited settings. Statistical analyses are proposed for three implementation stages: planning, early implementation, and consolidation. Data sources and collection procedures are described for each analysis. During the planning and pilot stages, we propose to estimate the average data collection, data entry and data distribution time. This information can be collected by surveillance systems themselves or through specially designed surveys. During the initial implementation stage, epidemiologists should study the completeness and timeliness of the reporting, and describe thoroughly the population surveyed and the epidemiology of the health events recorded. Additional data collection processes or external data streams are often necessary to assess reporting completeness and other indicators. Once data collection processes are operating in a timely and stable manner, analyses of surveillance data should expand to establish baseline rates and detect aberrations. External investigations can be used to evaluate whether abnormally increased case frequency corresponds to a true outbreak, and thereby establish the sensitivity and specificity of aberration detection algorithms. Statistical methods for disease surveillance have focused mainly on the performance of outbreak detection algorithms without sufficient attention to data quality and representativeness, two factors that are especially important in developing countries. It is important to assess data quality at each stage of implementation using a diverse mix of data sources and analytical methods. Careful, close monitoring of selected indicators is needed to evaluate whether systems are reaching their
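The baseline-and-aberration step described above can be illustrated with a simple moving-baseline detector in the spirit of the EARS C-family algorithms; the window length and threshold below are illustrative choices, not a specific system's settings:

```python
from statistics import mean, stdev

def detect_aberrations(counts, baseline=7, z=3.0):
    """Flag days whose count exceeds the mean of the preceding `baseline`
    days by more than z standard deviations (a moving-baseline detector
    in the spirit of the EARS C1 algorithm)."""
    flags = []
    for t in range(baseline, len(counts)):
        window = counts[t - baseline:t]
        m, s = mean(window), stdev(window)
        # guard against a zero standard deviation on a flat baseline
        flags.append(counts[t] > m + z * max(s, 1e-9))
    return flags
```

Any flagged day would then be followed up with the external investigations the abstract describes, which is how sensitivity and specificity of the detector are established.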
Stable isotopic analyses in paleoclimatic reconstruction
Energy Technology Data Exchange (ETDEWEB)
Wigand, P.E. [Univ. and Community College System of Nevada, Reno, NV (United States)
1995-09-01
Most traditional paleoclimatic proxy data have inherent time lags between climatic input and system response that constrain their use in accurate reconstruction of paleoclimate chronology, scaling of its variability, and the elucidation of the processes that determine its impact on the biotic and abiotic environment. With the exception of dendroclimatology, and studies of short-lived organisms and pollen recovered from annually varved lacustrine sediments, significant periods of time ranging from years, to centuries, to millennia may intervene between climate change and its first manifestation in paleoclimatic proxy data records. Reconstruction of past climate through changes in plant community composition derived from pollen sequences and plant remains from ancient woodrat middens, wet environments and dry caves all suffer from these lags. However, stable isotopic analyses can provide more immediate indication of biotic response to climate change. Evidence of past physiological response of organisms to changes in effective precipitation as climate varies can be provided by analyses of the stable isotopic content of plant macrofossils from various contexts. These analyses consider variation in the stable isotopic (hydrogen, oxygen and carbon) content of plant tissues as it reflects (1) past global or local temperature through changes in meteoric (rainfall) water chemistry in the case of the first two isotopes, and (2) plant stress through changes in plant respiration/transpiration processes under differing water availability and varying atmospheric CO2 composition (which itself may actually be a net result of biotic response to climate change). Studies currently being conducted in the Intermountain West indicate both long- and short-term responses that, when calibrated with modern analogue studies, have the potential of revealing not only the timing of climate events, but their direction, magnitude and rapidity.
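Stable-isotope results of the kind described here are conventionally reported in delta notation relative to a standard. The definition is simple enough to state as code; the reference ratio below is approximately the 18O/16O ratio of the VSMOW standard, used purely for illustration:

```python
def delta_permil(r_sample, r_standard):
    """Delta value in per mil: the relative deviation of the sample's
    isotope ratio (e.g. 18O/16O or 13C/12C) from the standard's,
    multiplied by 1000."""
    return (r_sample / r_standard - 1.0) * 1000.0

# A sample whose ratio sits 2% below the standard gives delta = -20 per mil
d = delta_permil(0.98 * 0.0020052, 0.0020052)
```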
Environmental monitoring final report: groundwater chemical analyses
Energy Technology Data Exchange (ETDEWEB)
1984-02-01
This report presents the results of analyses of groundwater quality at the SRC-I Demonstration Plant site in Newman, Kentucky. Samples were obtained from a network of 23 groundwater observation wells installed during previous studies. The groundwater was well within US EPA Interim Primary Drinking Water Standards for trace metals, radioactivity, and pesticides, but exceeded the standard for coliform bacteria. Several US EPA Secondary Drinking Water Standards were exceeded, namely, manganese, color, iron, and total dissolved solids. Based on the results, Dames and Moore recommend that all wells should be sterilized and those wells built in 1980 should be redeveloped. 1 figure, 6 tables.
Introduction: Analysing Emotion and Theorising Affect
Directory of Open Access Journals (Sweden)
Peta Tait
2016-08-01
Full Text Available This discussion introduces ideas of emotion and affect for a volume of articles demonstrating the scope of approaches used in their study within the humanities and creative arts. The volume offers multiple perspectives on emotion and affect within 20th-century and 21st-century texts, arts and organisations and their histories. The discussion explains how emotion encompasses the emotions, emotional feeling, sensation and mood and how these can be analysed particularly in relation to literature, art and performance. It briefly summarises concepts of affect theory within recent approaches before introducing the articles.
Fully Coupled FE Analyses of Buried Structures
Directory of Open Access Journals (Sweden)
James T. Baylot
1994-01-01
Full Text Available Current procedures for determining the response of buried structures to the effects of the detonation of buried high explosives recommend decoupling the free-field stress analysis from the structure response analysis. A fully coupled (explosive–soil–structure) finite element analysis procedure was developed so that the accuracies of current decoupling procedures could be evaluated. Comparisons of the results of analyses performed using this procedure with scale-model experiments indicate that this finite element procedure can be used to effectively evaluate the accuracies of the methods currently being used to decouple the free-field stress analysis from the structure response analysis.
Analyses of containment structures with corrosion damage
Energy Technology Data Exchange (ETDEWEB)
Cherry, J.L.
1996-12-31
Corrosion damage to a nuclear power plant containment structure can degrade the pressure capacity of the vessel. For the low-carbon, low-strength steels used in containments, the effect of corrosion on material properties is discussed. Strain-to-failure tests, in uniaxial tension, have been performed on corroded material samples. Results were used to select strain-based failure criteria for corroded steel. Using the ABAQUS finite element analysis code, the capacity of a typical PWR Ice Condenser containment with corrosion damage has been studied. Multiple analyses were performed, with the location of the corrosion in the containment and the amount of corrosion varied in each analysis.
Visuelle Analyse von Eye-Tracking-Daten
Chen, Xuemei
2011-01-01
Eye tracking is one of the most frequently used techniques for analysing human-computer interaction and for studying perception. The recorded eye-tracking data are usually analysed with heat maps or scan paths in order to assess the usability of the tested application or to draw conclusions about higher cognitive processes. The goal of this diploma thesis is the development of new visualization techniques for eye-tracking data, as well as the design of a study concept...
Practitioners' Tools in Analysing Financial Markets Evolution
Directory of Open Access Journals (Sweden)
Mihaela Nicolau
2010-10-01
Full Text Available In a place as chaotic and confusing as the world of investing, practitioners, who operate on markets every day, have continuously sought to forecast market movements properly. More inclined to financial speculation, practitioners analyse financial markets looking for potential weaknesses of the Efficient Market Hypothesis, and most of the time their methods are criticized by academics. This article presents the traditional tools used by traders and brokers in analysing financial markets, with emphasis on critical opinions and the scientific work published on this subject to date.
Institute of Scientific and Technical Information of China (English)
田霆; 刘次华; 陈家清
2012-01-01
Under doubly type-II censored sampling, when the product lifetime follows a two-parameter bathtub-shaped life distribution, this article provides the maximum likelihood estimate of the parameter and, under squared-error loss, the Bayes estimates based on a non-informative prior and on a conjugate prior. These estimators are compared with the maximum likelihood estimate through extensive Monte Carlo simulation. When the sample size n is large, with r larger and s smaller (i.e., fewer observations censored), the Bayes estimate under the conjugate prior is about as accurate as the maximum likelihood estimate and closer to the true value, and both are more precise than the Bayes estimate under the non-informative prior.
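The record does not name the bathtub-shaped model. A common two-parameter choice is Chen's distribution, with survival function S(t) = exp(λ(1 − exp(t^β))). The sketch below shows maximum likelihood fitting for a complete (uncensored) sample via the profile likelihood in β; the distribution choice, parameter values and grid search are illustrative assumptions, and the doubly censored likelihood of the article is not reproduced:

```python
import math
import random

# Hypothetical choice: Chen's bathtub-shaped distribution,
# S(t) = exp(lam * (1 - exp(t**beta))); not necessarily the article's model.

def sample_chen(beta, lam, n, rng):
    """Draw lifetimes by inverting the survival function."""
    out = []
    for _ in range(n):
        u = 1.0 - rng.random()          # u in (0, 1]
        out.append(math.log(1.0 - math.log(u) / lam) ** (1.0 / beta))
    return out

def profile_loglik(beta, data):
    """Log-likelihood maximized over lam for a fixed beta (complete sample)."""
    n = len(data)
    s = sum(math.exp(t ** beta) - 1.0 for t in data)
    lam = n / s                          # closed-form MLE of lam given beta
    ll = (n * math.log(lam) + n * math.log(beta)
          + sum((beta - 1.0) * math.log(t) + t ** beta for t in data)
          - lam * s)
    return ll, lam

def fit_chen(data):
    """Grid search on beta; returns (beta_hat, lam_hat)."""
    best = None
    for i in range(5, 301):              # beta in [0.05, 3.00]
        beta = i * 0.01
        ll, lam = profile_loglik(beta, data)
        if best is None or ll > best[0]:
            best = (ll, beta, lam)
    return best[1], best[2]

rng = random.Random(1)
data = sample_chen(0.7, 0.5, 2000, rng)
beta_hat, lam_hat = fit_chen(data)
print(beta_hat, lam_hat)   # both should land near the true (0.7, 0.5)
```

A grid search is used instead of Newton's method purely to keep the sketch short; profiling out λ reduces the problem to one dimension either way.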
Sensitivity in risk analyses with uncertain numbers.
Energy Technology Data Exchange (ETDEWEB)
Tucker, W. Troy; Ferson, Scott
2006-06-01
Sensitivity analysis is a study of how changes in the inputs to a model influence the results of the model. Many techniques have recently been proposed for use when the model is probabilistic. This report considers the related problem of sensitivity analysis when the model includes uncertain numbers that can involve both aleatory and epistemic uncertainty and the method of calculation is Dempster-Shafer evidence theory or probability bounds analysis. Some traditional methods for sensitivity analysis generalize directly for use with uncertain numbers, but, in some respects, sensitivity analysis for these analyses differs from traditional deterministic or probabilistic sensitivity analyses. A case study of a dike reliability assessment illustrates several methods of sensitivity analysis, including traditional probabilistic assessment, local derivatives, and a "pinching" strategy that hypothetically reduces the epistemic uncertainty or aleatory uncertainty, or both, in an input variable to estimate the reduction of uncertainty in the outputs. The prospects for applying the methods to black box models are also considered.
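The "pinching" idea can be illustrated with a toy dike model (entirely hypothetical, not the report's case study): failure occurs when an aleatory load exceeds a strength that is epistemically known only to an interval. Pinching the strength interval to a point shows how much output uncertainty that extra knowledge would remove; the load distribution, interval endpoints and pinch value below are invented for illustration:

```python
import random

rng = random.Random(42)

# Aleatory uncertainty: the load, sampled from a known distribution.
loads = [rng.gauss(5.0, 1.0) for _ in range(5000)]

def pfail_bounds(strength_lo, strength_hi):
    """Bounds on P(load > strength) when strength is only known to lie
    in the epistemic interval [strength_lo, strength_hi]."""
    p = lambda s: sum(1 for x in loads if x > s) / len(loads)
    return p(strength_hi), p(strength_lo)   # monotone, so endpoints suffice

lo, hi = pfail_bounds(6.0, 8.0)             # baseline epistemic interval
base_width = hi - lo

plo, phi = pfail_bounds(7.0, 7.0)           # "pinch" strength to a point value
pinched_width = phi - plo

reduction = 100.0 * (1.0 - pinched_width / base_width)
print(base_width, reduction)                # pinching removes all epistemic width
```

Here the complete pinch removes 100% of the epistemic width by construction; in practice one pinches to a narrower interval, or pinches the aleatory input instead, and compares the resulting reductions across inputs to rank them.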
Hitchhikers’ guide to analysing bird ringing data
Directory of Open Access Journals (Sweden)
Harnos Andrea
2015-12-01
Full Text Available Bird ringing datasets constitute possibly the largest source of temporal and spatial information on vertebrate taxa available on the globe. Initially, the method was invented to understand avian migration patterns. However, data deriving from bird ringing have been used in an array of other disciplines, including population monitoring, changes in demography, conservation management, and studying the effects of climate change, to name a few. Despite this widespread usage and importance, there are no guidelines available specifically describing the practice of data management, preparation and analysis of ringing datasets. Here, we present the first of a series of comprehensive tutorials that may help fill this gap. We describe in detail, through a real-life example, the intricacies of data cleaning and how to create a data table ready for analysis from raw ringing data in the R software environment. Moreover, we created and present here the R package ringR, designed to carry out various specific tasks and plots related to bird ringing data. Most methods described here can also be applied to a wide range of capture-recapture type data based on individual marking, regardless of taxa or research question.
NEXT Ion Thruster Performance Dispersion Analyses
Soulas, George C.; Patterson, Michael J.
2008-01-01
The NEXT ion thruster is a low specific mass, high performance thruster with a nominal throttling range of 0.5 to 7 kW. Numerous engineering model and one prototype model thrusters have been manufactured and tested. Of significant importance to propulsion system performance is thruster-to-thruster performance dispersions. This type of information can provide a bandwidth of expected performance variations both on a thruster and a component level. Knowledge of these dispersions can be used to more conservatively predict thruster service life capability and thruster performance for mission planning, facilitate future thruster performance comparisons, and verify power processor capabilities are compatible with the thruster design. This study compiles the test results of five engineering model thrusters and one flight-like thruster to determine unit-to-unit dispersions in thruster performance. Component level performance dispersion analyses will include discharge chamber voltages, currents, and losses; accelerator currents, electron backstreaming limits, and perveance limits; and neutralizer keeper and coupling voltages and the spot-to-plume mode transition flow rates. Thruster level performance dispersion analyses will include thrust efficiency.
Fractal and multifractal analyses of bipartite networks.
Liu, Jin-Long; Wang, Jian; Yu, Zu-Guo; Xie, Xian-Hua
2017-03-31
Bipartite networks have attracted considerable interest in various fields. Fractality and multifractality of unipartite (classical) networks have been studied in recent years, but there has been no work studying these properties of bipartite networks. In this paper, we try to unfold the self-similarity structure of bipartite networks by performing fractal and multifractal analyses for a variety of real-world bipartite network data sets and models. First, we find fractality in some bipartite networks, including the CiteULike, Netflix, MovieLens (ml-20m) and Delicious data sets and the (u, v)-flower model. Meanwhile, we observe shifted power-law or exponential behavior in several other networks. We then focus on the multifractal properties of bipartite networks. Our results indicate that multifractality exists in those bipartite networks possessing fractality. To capture the inherent attribute of bipartite networks with two different types of nodes, we give different weights to the nodes of the different classes, and show the existence of multifractality in these node-weighted bipartite networks. In addition, for the data sets with ratings, we modify the two existing algorithms for fractal and multifractal analyses of edge-weighted unipartite networks to study the self-similarity of the corresponding edge-weighted bipartite networks. The results show that our modified algorithms are feasible and can effectively uncover the self-similarity structure of these edge-weighted bipartite networks and their corresponding node-weighted versions.
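The fractal analysis described above rests on box-covering: count the number of boxes N_B of diameter l_B needed to cover the network, and read a fractal dimension from the slope of log N_B versus log l_B. A greedy sketch on a simple path graph follows; the greedy covering, the graph, and the box sizes are illustrative stand-ins, and the paper's algorithms for weighted bipartite networks are considerably more involved:

```python
import math
from collections import deque

def bfs_dist(adj, s):
    """Shortest-path distances from s by breadth-first search."""
    dist = {s: 0}
    q = deque([s])
    while q:
        v = q.popleft()
        for w in adj[v]:
            if w not in dist:
                dist[w] = dist[v] + 1
                q.append(w)
    return dist

def box_count(adj, lb):
    """Greedy box covering: nodes in a box must be pairwise closer than lb."""
    dist = {v: bfs_dist(adj, v) for v in adj}
    uncovered = set(adj)
    boxes = 0
    while uncovered:
        seed = min(uncovered)            # deterministic seed choice
        box = {seed}
        for v in sorted(uncovered - {seed}):
            if all(dist[u].get(v, 10**9) < lb for u in box):
                box.add(v)
        uncovered -= box
        boxes += 1
    return boxes

# A 64-node path graph: its fractal dimension should come out near 1.
n = 64
adj = {i: [j for j in (i - 1, i + 1) if 0 <= j < n] for i in range(n)}
sizes = [1, 2, 4, 8, 16]
counts = [box_count(adj, lb) for lb in sizes]
xs = [math.log(lb) for lb in sizes]
ys = [math.log(nb) for nb in counts]
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
dim = -sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
      sum((x - mx) ** 2 for x in xs)
print(counts, dim)   # counts shrink roughly as 1/lb; dim close to 1
```

On real networks the greedy covering only approximates the minimal one, so the count is usually averaged over random seed orders before fitting the slope.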
Bioinformatics tools for analysing viral genomic data.
Orton, R J; Gu, Q; Hughes, J; Maabar, M; Modha, S; Vattipally, S B; Wilkie, G S; Davison, A J
2016-04-01
The field of viral genomics and bioinformatics is experiencing a strong resurgence due to high-throughput sequencing (HTS) technology, which enables the rapid and cost-effective sequencing and subsequent assembly of large numbers of viral genomes. In addition, the unprecedented power of HTS technologies has enabled the analysis of intra-host viral diversity and quasispecies dynamics in relation to important biological questions on viral transmission, vaccine resistance and host jumping. HTS also enables the rapid identification of both known and potentially new viruses from field and clinical samples, thus adding new tools to the fields of viral discovery and metagenomics. Bioinformatics has been central to the rise of HTS applications because new algorithms and software tools are continually needed to process and analyse the large, complex datasets generated in this rapidly evolving area. In this paper, the authors give a brief overview of the main bioinformatics tools available for viral genomic research, with a particular emphasis on HTS technologies and their main applications. They summarise the major steps in various HTS analyses, starting with quality control of raw reads and encompassing activities ranging from consensus and de novo genome assembly to variant calling and metagenomics, as well as RNA sequencing.
ANALYSES ON SYSTEMATIC CONFRONTATION OF FIGHTER AIRCRAFT
Institute of Scientific and Technical Information of China (English)
HuaiJinpeng; WuZhe; HuangJun
2002-01-01
Analyses of the systematic confrontation between two military forces are the highest hierarchy in operational-effectiveness studies of weapon systems. The physical model for tactical many-on-many engagements of an aerial warfare with heterogeneous fighter aircraft is established. On the basis of the Lanchester multivariate equations of the square law, a mathematical model corresponding to the established physical model is given. A superiority parameter is then derived directly from the mathematical model. In view of the high-tech conditions of modern warfare, the concept of a superiority parameter that more truly reflects the essence of an air-to-air engagement is further formulated. The attrition coefficients, which are key to the differential equations, are determined using tactics of random target assignment and the air-to-air capability index of the fighter aircraft. Taking the mathematical model and superiority parameter as cores, calculations and analyses of complicated systemic problems such as evaluation of battle superiority, prognostication of the combat process and optimization of collocations have been accomplished. Results indicate that a classical combat theory, with its recent developments, has found new applications in military operations research for complicated confrontation analysis.
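The square-law model underlying the abstract can be sketched numerically. For red and blue strengths R, B with attrition coefficients α (red's per-unit effectiveness) and β (blue's), the equations are dB/dt = −αR and dR/dt = −βB, and the quantity αR² − βB² is conserved, so its initial sign predicts the winner. The force sizes and coefficients below are invented for illustration and are not the paper's heterogeneous many-on-many model:

```python
def lanchester(r, b, alpha, beta, dt=1e-3):
    """Euler integration of the Lanchester square-law equations."""
    while r > 0.0 and b > 0.0:
        r, b = r - beta * b * dt, b - alpha * r * dt
    return max(r, 0.0), max(b, 0.0)

# Illustrative engagement: red is outnumbered but more effective per unit.
r0, b0, alpha, beta = 100.0, 120.0, 0.02, 0.01
superiority = alpha * r0**2 - beta * b0**2   # > 0 predicts a red win

r_final, b_final = lanchester(r0, b0, alpha, beta)
# The invariant gives the survivor count directly:
# alpha * r_final**2 = superiority, i.e. r_final = sqrt(superiority / alpha).
print(superiority, r_final, b_final)
```

The closed-form survivor count from the invariant is a useful check on the numerical integration: here sqrt(56 / 0.02) ≈ 52.9 red units should survive.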
Transportation systems analyses: Volume 1: Executive Summary
1993-05-01
The principal objective of this study is to accomplish a systems engineering assessment of the nation's space transportation infrastructure. This analysis addresses the necessary elements to perform man delivery and return, cargo transfer, cargo delivery, payload servicing, and the exploration of the Moon and Mars. Specific elements analyzed, but not limited to, include the Space Exploration Initiative (SEI), the National Launch System (NLS), the current expendable launch vehicle (ELV) fleet, ground facilities, the Space Station Freedom (SSF), and other civil, military and commercial payloads. The performance of this study entails maintaining a broad perspective on the large number of transportation elements that could potentially comprise the U.S. space infrastructure over the next several decades. To perform this systems evaluation, top-level trade studies are conducted to enhance our understanding of the relationships between elements of the infrastructure. This broad 'infrastructure-level perspective' permits the identification of preferred infrastructures. Sensitivity analyses are performed to assure the credibility and usefulness of study results. This executive summary of the transportation systems analyses (TSM) semi-annual report addresses the SSF logistics resupply. Our analysis parallels the ongoing NASA SSF redesign effort. Therefore, there could be no SSF design to drive our logistics analysis. Consequently, the analysis attempted to bound the reasonable SSF design possibilities (and the subsequent transportation implications). No other strategy really exists until after a final decision is rendered on the SSF configuration.
Autisme et douleur – analyse bibliographique
Dubois, Amandine; Rattaz, Cécile; Pry, René; Baghdadli, Amaria
2010-01-01
This literature review takes stock of the work published in the field of pain and autism. The article first covers published studies on the modes of pain expression observed in this population. Several hypotheses proposed to explain the expressive particularities of people with autism are then reviewed: an excess of endorphins, particularities in sensory processing, and socio-communicative deficits. The review closes with the question of how pain is assessed and taken into account in people with autism. The authors conclude that the results of the published studies are not homogeneous and that further research is needed to reach consensual findings in a field still little explored scientifically. Clinically, deeper knowledge of this area should make it possible to develop pain assessment tools and thereby ensure better day-to-day pain management. PMID:20808970
Analyses of containment structures with corrosion damage
Energy Technology Data Exchange (ETDEWEB)
Cherry, J.L. [Sandia National Labs., Albuquerque, NM (United States)
1997-01-01
Corrosion damage that has been found in a number of nuclear power plant containment structures can degrade the pressure capacity of the vessel. This has prompted concerns regarding the capacity of corroded containments to withstand accident loadings. To address these concerns, finite element analyses have been performed for a typical PWR Ice Condenser containment structure. Using ABAQUS, the pressure capacity was calculated for a typical vessel with no corrosion damage. Multiple analyses were then performed with the location of the corrosion and the amount of corrosion varied in each analysis. Using a strain-based failure criterion, a "lower bound", "best estimate", and "upper bound" failure level was predicted for each case. These limits were established by: determining the amount of variability that exists in material properties of typical containments, estimating the amount of uncertainty associated with the level of modeling detail and modeling assumptions, and estimating the effect of corrosion on the material properties.
Waste Stream Analyses for Nuclear Fuel Cycles
Energy Technology Data Exchange (ETDEWEB)
N. R. Soelberg
2010-08-01
A high-level study was performed in Fiscal Year 2009 for the U.S. Department of Energy (DOE) Office of Nuclear Energy (NE) Advanced Fuel Cycle Initiative (AFCI) to provide information for a range of nuclear fuel cycle options (Wigeland 2009). At that time, some fuel cycle options could not be adequately evaluated since they were not well defined and lacked sufficient information. As a result, five families of these fuel cycle options are being studied during Fiscal Year 2010 by the Systems Analysis Campaign for the DOE NE Fuel Cycle Research and Development (FCRD) program. The quality and completeness of data available to date for the fuel cycle options is insufficient to perform quantitative radioactive waste analyses using recommended metrics. This study has been limited thus far to qualitative analyses of waste streams from the candidate fuel cycle options, because quantitative data for wastes from the front end, fuel fabrication, reactor core structure, and used fuel for these options is generally not yet available.
CHEMICAL ANALYSES OF SODIUM SYSTEMS FINAL REPORT
Energy Technology Data Exchange (ETDEWEB)
Greenhalgh, W. O.; Yunker, W. H.; Scott, F. A.
1970-06-01
BNWL-1407 summarizes information gained from the Chemical Analyses of Sodium Systems Program pursued by Battelle-Northwest over the period from July 1967 through June 1969. Tasks included feasibility studies for performing coulometric titration and polarographic determinations of oxygen in sodium, and the development of new separation techniques for sodium impurities and their subsequent analyses. The program was terminated ahead of schedule so firm conclusions were not obtained in all areas of the work. At least 40 coulometric titrations were carried out and special test cells were developed for coulometric application. Data indicated that polarographic measurements are theoretically feasible, but practical application of the method was not verified. An emission spectrographic procedure for trace metal impurities was developed and published. Trace metal analysis by a neutron activation technique was shown to be feasible; key to the success of the activation technique was the application of a new ion exchange resin which provided a sodium separation factor of 10^11. Preliminary studies on direct scavenging of trace metals produced no conclusive results.
Estimation of wind speed and wave height during cyclones
Digital Repository Service at National Institute of Oceanography (India)
SanilKumar, V.; Mandal, S.; AshokKumar, K.
reported by ships were comparable. Empirical expressions relating wind speed, wave height and wave period to storm parameters were derived. The design wave height for different return periods was obtained by fitting a two-parameter Weibull distribution...
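The final step the abstract describes, fitting a two-parameter Weibull to storm wave heights and reading off a return-period design height, can be sketched as follows. The data, storm rate, shape and scale are synthetic stand-ins, and the fit uses a simple least-squares line on the linearized CDF (a "Weibull plot"), which is not necessarily the authors' estimation method:

```python
import math
import random

rng = random.Random(7)

# Synthetic storm wave heights (metres) from a Weibull(shape=1.4, scale=2.5).
shape_true, scale_true = 1.4, 2.5
heights = [scale_true * (-math.log(1.0 - rng.random())) ** (1.0 / shape_true)
           for _ in range(3000)]

# Linearize F(H) = 1 - exp(-(H/scale)^shape):
#   ln(-ln(1 - F)) = shape*ln(H) - shape*ln(scale)
hs = sorted(heights)
n = len(hs)
xs = [math.log(h) for h in hs]
ys = [math.log(-math.log(1.0 - (i + 0.5) / n)) for i in range(n)]

mx, my = sum(xs) / n, sum(ys) / n
shape_hat = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
scale_hat = math.exp(mx - my / shape_hat)   # OLS line passes through the means

# Design height for a 100-year return period, assuming (hypothetically)
# 10 independent storms per year: non-exceedance prob q = 1 - 1/(10*100).
q = 1.0 - 1.0 / (10 * 100)
h_design = scale_hat * (-math.log(1.0 - q)) ** (1.0 / shape_hat)
print(shape_hat, scale_hat, h_design)
```

With the assumed storm rate, the design height is simply the fitted Weibull quantile at the exceedance probability implied by the return period.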
Les conditions de l’analyse qualitative
Directory of Open Access Journals (Sweden)
Pierre Paillé
2011-07-01
Full Text Available The methods of qualitative data analysis and the world of computing were made to meet, and indeed the question is topical and the software tools numerous and advanced. This phenomenon is unlikely to fade, all the less so as secondary data analysis is at the same time undergoing important developments. But the appeal of analysis software can become such that one no longer quite sees on what grounds, and for what reasons, one might do without it. This article attempts to delineate a vision and a practice of qualitative analysis that, in its essence, does not lend itself to the use of specialized computer tools. It situates its reflection within the framework of qualitative methodology (the qualitative approach, qualitative research, qualitative analysis), and more particularly at the level of qualitative fieldwork.
Remuestreo Bootstrap y Jackknife en confiabilidad: Caso Exponencial y Weibull
Directory of Open Access Journals (Sweden)
Javier Ramírez-Montoya
2016-01-01
Full Text Available The Bootstrap-t and the Jackknife delete-I and delete-II resampling methods are compared using the non-parametric Kaplan-Meier and Nelson-Aalen estimators, which are frequently used in practice, taking into account different censoring percentages, sample sizes and times of interest. The comparison is carried out via simulation, using the mean squared error.
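A minimal version of the comparison's ingredients, the Kaplan-Meier estimator plus resampling, can be sketched with an ordinary percentile bootstrap (simpler than the Bootstrap-t of the article). The exponential lifetimes, uniform censoring, time point and replicate count are illustrative assumptions:

```python
import random
import statistics

rng = random.Random(3)

def km_survival(pairs, t):
    """Kaplan-Meier estimate of S(t) from (time, event) pairs,
    event=1 for an observed failure, 0 for right-censoring."""
    s, at_risk = 1.0, len(pairs)
    for time, event in sorted(pairs):
        if time > t:
            break
        if event:
            s *= 1.0 - 1.0 / at_risk
        at_risk -= 1
    return s

# Synthetic data: exponential(1) lifetimes, uniform(0, 2) censoring times.
pairs = []
for _ in range(1000):
    life, cens = rng.expovariate(1.0), rng.uniform(0.0, 2.0)
    pairs.append((min(life, cens), 1 if life <= cens else 0))

s_hat = km_survival(pairs, 1.0)          # true S(1) = exp(-1) ≈ 0.368

# Percentile bootstrap: resample pairs with replacement, re-estimate S(1).
boot = []
for _ in range(200):
    sample = [pairs[rng.randrange(len(pairs))] for _ in range(len(pairs))]
    boot.append(km_survival(sample, 1.0))
se_hat = statistics.stdev(boot)
print(s_hat, se_hat)
```

The bootstrap standard error of S(1) is what a study like the one above would compare, under varying censoring rates and sample sizes, against jackknife-based alternatives.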
Weibull Analysis and Area Scaling for Infrared Window Materials (U)
2016-08-01
...are ground and polished by the same methods used to make the window. Even if machining of coupons is matched as well as possible to that of the
Weibull Parameters Estimation Based on Physics of Failure Model
DEFF Research Database (Denmark)
Kostandyan, Erik; Sørensen, John Dalsgaard
2012-01-01
Reliability estimation procedures are discussed for the example of fatigue development in solder joints using a physics of failure model. The accumulated damage is estimated based on a physics of failure model, the Rainflow counting algorithm and the Miner’s rule. A threshold model is used...... distribution. Methods from structural reliability analysis are used to model the uncertainties and to assess the reliability for fatigue failure. Maximum Likelihood and Least Square estimation techniques are used to estimate fatigue life distribution parameters....
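One of the two estimation routes named above, maximum likelihood for the two-parameter Weibull, reduces to a one-dimensional root-find for the shape parameter, after which the scale follows in closed form. Simulated "fatigue lives" stand in for the damage-model output, and all numbers are illustrative:

```python
import math
import random

def weibull_mle(data):
    """MLE of (shape k, scale lam) for a complete Weibull sample.
    Solves the profile equation  sum(t^k ln t)/sum(t^k) - 1/k = mean(ln t)
    for k by bisection, then sets lam = (mean(t^k))^(1/k)."""
    n = len(data)
    mean_log = sum(math.log(t) for t in data) / n

    def g(k):
        sk = sum(t ** k for t in data)
        skl = sum(t ** k * math.log(t) for t in data)
        return skl / sk - 1.0 / k - mean_log

    lo, hi = 0.05, 20.0                  # g is increasing in k on this range
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if g(mid) > 0.0:
            hi = mid
        else:
            lo = mid
    k = 0.5 * (lo + hi)
    lam = (sum(t ** k for t in data) / n) ** (1.0 / k)
    return k, lam

rng = random.Random(11)
# Synthetic "fatigue lives" from a Weibull with shape 2.0, scale 3.0.
lives = [3.0 * (-math.log(1.0 - rng.random())) ** (1.0 / 2.0)
         for _ in range(2000)]
k_hat, lam_hat = weibull_mle(lives)
print(k_hat, lam_hat)   # should recover roughly (2.0, 3.0)
```

A least-squares fit on the linearized CDF, the other route the abstract names, gives similar point estimates here but different small-sample behavior, which is one reason to compare the two.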
Pathway analyses implicate glial cells in schizophrenia.
Directory of Open Access Journals (Sweden)
Laramie E Duncan
Full Text Available BACKGROUND: The quest to understand the neurobiology of schizophrenia and bipolar disorder is ongoing with multiple lines of evidence indicating abnormalities of glia, mitochondria, and glutamate in both disorders. Despite high heritability estimates of 81% for schizophrenia and 75% for bipolar disorder, compelling links between findings from neurobiological studies, and findings from large-scale genetic analyses, are only beginning to emerge. METHOD: Ten publicly available gene sets (pathways) related to glia, mitochondria, and glutamate were tested for association to schizophrenia and bipolar disorder using MAGENTA as the primary analysis method. To determine the robustness of associations, secondary analyses were performed with: ALIGATOR, INRICH, and Set Screen. Data from the Psychiatric Genomics Consortium (PGC) were used for all analyses. There were 1,068,286 SNP-level p-values for schizophrenia (9,394 cases/12,462 controls), and 2,088,878 SNP-level p-values for bipolar disorder (7,481 cases/9,250 controls). RESULTS: The Glia-Oligodendrocyte pathway was associated with schizophrenia, after correction for multiple tests, according to primary analysis (MAGENTA p = 0.0005, 75% requirement for individual gene significance) and also achieved nominal levels of significance with INRICH (p = 0.0057) and ALIGATOR (p = 0.022). For bipolar disorder, Set Screen yielded nominally and method-wide significant associations to all three glial pathways, with strongest association to the Glia-Astrocyte pathway (p = 0.002). CONCLUSIONS: Consistent with findings of white matter abnormalities in schizophrenia by other methods of study, the Glia-Oligodendrocyte pathway was associated with schizophrenia in our genomic study. These findings suggest that the abnormalities of myelination observed in schizophrenia are at least in part due to inherited factors, contrasted with the alternative of purely environmental causes (e.g. medication effects or
Genetic analyses of a seasonal interval timer.
Prendergast, Brian J; Renstrom, Randall A; Nelson, Randy J
2004-08-01
Seasonal clocks (e.g., circannual clocks, seasonal interval timers) permit anticipation of regularly occurring environmental events by timing the onset of seasonal transitions in reproduction, metabolism, and behavior. Implicit in the concept that seasonal clocks reflect adaptations to the local environment is the unexamined assumption that heritable genetic variance exists in the critical features of such clocks, namely, their temporal properties. These experiments quantified the intraspecific variance in, and heritability of, the photorefractoriness interval timer in Siberian hamsters (Phodopus sungorus), a seasonal clock that provides temporal information to mechanisms that regulate seasonal transitions in body weight. Twenty-seven families consisting of 54 parents and 109 offspring were raised in a long-day photoperiod and transferred as adults to an inhibitory photoperiod (continuous darkness; DD). Weekly body weight measurements permitted specification of the interval of responsiveness to DD, a reflection of the duration of the interval timer, in each individual. Body weights of males and females decreased after exposure to DD, but 3 to 5 months later, somatic recrudescence occurred, indicative of photorefractoriness to DD. The interval timer was approximately 5 weeks longer and twice as variable in females relative to males. Analyses of variance of full siblings revealed an overall intraclass correlation of 0.71 +/- 0.04 (0.51 +/- 0.10 for male offspring and 0.80 +/- 0.06 for female offspring), suggesting a significant family resemblance in the duration of interval timers. Parent-offspring regression analyses yielded an overall heritability estimate of 0.61 +/- 0.2; h(2) estimates from parent-offspring regression analyses were significant for female offspring (0.91 +/- 0.4) but not for male offspring (0.35 +/- 0.2), indicating strong additive genetic components for this trait, primarily in females. In nature, individual differences, both within and between
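The parent-offspring regression used in the abstract estimates narrow-sense heritability as the slope of offspring values on midparent values (regression on a single parent estimates h²/2 instead). A toy simulation with an assumed h² of 0.6 illustrates the mechanics; all values are synthetic, not the hamster data:

```python
import random

rng = random.Random(5)

h2_true = 0.6   # assumed narrow-sense heritability of the trait
n = 5000

# Midparent trait values, and offspring values that deviate from the
# population mean by h2 * (midparent deviation) plus non-additive noise.
midparent = [rng.gauss(0.0, 1.0) for _ in range(n)]
offspring = [h2_true * x + rng.gauss(0.0, 0.8) for x in midparent]

mx = sum(midparent) / n
my = sum(offspring) / n
slope = sum((x - mx) * (y - my) for x, y in zip(midparent, offspring)) / \
        sum((x - mx) ** 2 for x in midparent)
print(slope)   # parent-offspring regression estimate of h2
```

The intraclass correlation of full siblings reported in the study is a complementary estimate; it bounds broad-sense resemblance and includes shared-environment effects, which is why the two figures need not agree exactly.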
Feasibility Analyses of Integrated Broiler Production
Directory of Open Access Journals (Sweden)
L. Komalasari
2010-12-01
The major obstacles to the development of broiler raising are the high price of feed and the fluctuating price of day-old chicks (DOCs). The cheap price of imported leg quarters reduces the competitiveness of local broilers. Therefore, an effort to increase production efficiency is needed through integration between broiler raising, corn farmers and feed producers (integrated farming). The purpose of this study is to analyze the feasibility of integrating broiler raising with corn cultivation and feed production. In addition, a simulation was conducted to analyze the effects of changes in DOC price, broiler price and production capacity. The analyses showed that integrated farming, and a mere combination of broiler raising and a feed factory with a 10,000-bird capacity, is not financially feasible. Increasing production to 25,000 broiler chickens makes the integrated farming financially feasible. Unintegrated broiler raising is relatively sensitive to broiler price decreases and DOC price increases compared to integrated farming.
Cointegration Approach to Analysing Inflation in Croatia
Directory of Open Access Journals (Sweden)
Lena Malešević-Perović
2009-06-01
The aim of this paper is to analyse the determinants of inflation in Croatia in the period 1994:6-2006:6. We use a cointegration approach and find that increases in wages positively influence inflation in the long run. Furthermore, in the period from June 1994 onward, the depreciation of the currency also contributed to inflation. Money does not explain Croatian inflation. This irrelevance of the money supply is consistent with its endogeneity to exchange rate targeting, whereby the money supply is determined by developments in the foreign exchange market. The value of inflation in the previous period is also found to be significant, indicating some inflation inertia.
Analysing transfer phenomena in osmotic evaporation
Directory of Open Access Journals (Sweden)
Freddy Forero Longas
2011-12-01
Osmotic evaporation is a modification of traditional membrane processes: by means of a vapour pressure differential, produced by a highly concentrated extraction solution, water is transferred through a hydrophobic membrane as vapour. This technique has many advantages over traditional processes, allowing work at atmospheric pressure and low temperatures, which makes it ideal for heat-sensitive products. This paper presents and concisely analyses the heat and mass transfer phenomena which occur in the process and describes the models used for estimating the parameters of interest, such as flow, temperature and heat transfer rate, and the relationships that exist amongst them when hollow fibre modules are used, providing a quick reference tool and specific information about this process.
Modelling and Analysing Socio-Technical Systems
DEFF Research Database (Denmark)
Aslanyan, Zaruhi; Ivanova, Marieta Georgieva; Nielson, Flemming
2015-01-01
with social engineering. Due to this combination of attack steps on technical and social levels, risk assessment in socio-technical systems is complex. Therefore, established risk assessment methods often abstract away the internal structure of an organisation and ignore human factors when modelling and assessing attacks. In our work we model all relevant levels of socio-technical systems, and propose evaluation techniques for analysing the security properties of the model. Our approach simplifies the identification of possible attacks and provides qualified assessment and ranking of attacks based on the expected impact. We demonstrate our approach on a home-payment system. The system is specifically designed to help elderly or disabled people, who may have difficulties leaving their home, to pay for some services, e.g., care-taking or rent. The payment is performed using the remote control of a television...
Digital analyses of cartometric Fruska Gora guidelines
Directory of Open Access Journals (Sweden)
Živković Dragica
2013-01-01
Modern geomorphological research uses quantitative statistical and cartographic methods to analyse topographic relief features and the mutual connections between them on the basis of good-quality numeric parameters. Important morphological characteristics include the slope angle of the terrain, hypsometry, topographic exposition and so on. Even small, little-known relief slopes can deeply affect land configuration, hypsometry, topographic exposition, etc. Expositions modify light and heat and thereby interconnected phenomena: soil and air temperature, soil disintegration, the length of the vegetation period, the intensity of photosynthesis, the yield of agricultural crops, the height of the snow line, etc. [Projekat Ministarstva nauke Republike Srbije, br. 176008 i br. III44006]
Conceptualizing analyses of ecological momentary assessment data.
Shiffman, Saul
2014-05-01
Ecological momentary assessment (EMA) methods, which involve collection of real-time data in subjects' real-world environments, are particularly well suited to studying tobacco use. Analyzing EMA datasets can be challenging, as the datasets include a large and varied number of observations per subject and are relatively unstructured. This paper suggests that time is typically a key organizing principle in EMA data and that conceptualizing the data as a timeline of events, behaviors, and experiences can help define analytic approaches. EMA datasets lend themselves to answering a diverse array of research questions, and the research question must drive how data are arranged for analysis and the kinds of statistical models that are applied. This is illustrated with brief examples of diverse analyses applied to answer different questions from an EMA study of tobacco use and relapse.
Analysing Medieval Urban Space; a methodology
Directory of Open Access Journals (Sweden)
Marlous L. Craane MA
2007-08-01
This article has been written in reaction to recent developments in medieval history and archaeology, to study not only the buildings in a town but also the spaces that hold them together. It discusses a more objective and interdisciplinary approach for analysing urban morphology and use of space. It proposes a 'new' methodology by combining town plan analysis and space syntax. This methodology was trialled on the city of Utrecht in the Netherlands. By comparing the results of this 'new' methodology with the results of previous, more conventional, research, this article shows that space syntax can be applied successfully to medieval urban contexts. It does this by demonstrating a strong correlation between medieval economic spaces and the most integrated spaces, just as is found in the study of modern urban environments. It thus provides a strong basis for the use of this technique in future research of medieval urban environments.
Lagune de Salses - Leucate. I - Analyse bibliographique.
Ladagnous, Helene; Le Bec, Claude
1997-01-01
The recent establishment of procedures aimed at improving the water quality of the Salses-Leucate lagoon (Schéma d'Aménagement et de Gestion de l'Eau, pond contract) led us to produce a synthesis of the available knowledge on this area, a necessary preliminary phase to any joint consultation. This bibliographic analysis shows that many of the studies carried out are now obsolete from the perspective of integrated management of the site. The hydrogeological knowledge...
Spatial Analyses of Harappan Urban Settlements
Directory of Open Access Journals (Sweden)
Hirofumi Teramura
2006-12-01
The Harappan Civilization occupies a unique place among the early civilizations of the world, with its well-planned urban settlements, advanced handicrafts and technology, and religious and trade activities. Using a Geographical Information System (GIS), this study presents spatial analyses that locate urban settlements on a digital elevation model (DEM) according to the three phases of early, mature and late. Understanding the relationship between the spatial distribution of Harappan sites and changes in factors such as topographic features, river passages or sea level will lead to an understanding of the dynamism of this civilization. It will also afford a glimpse of the factors behind the formation, development, and decline of the Harappan Civilization.
ANALYSE OF PULSE WAVE PROPAGATION IN ARTERIES
Institute of Scientific and Technical Information of China (English)
PAN Yi-shan; JIA Xiao-bo; CUI Chang-kui; XIAO Xiao-chun
2006-01-01
Treating the blood vessel as an elastic tube whose wall is constrained by the surrounding tissue, the propagation of the pulse wave in the blood vessel was studied. The influence of the viscosity of the blood, the elastic modulus of the blood vessel and the tube radius on pulse wave propagation was analyzed. Comparing results that account for the viscosity of the blood with results that neglect it shows that the influence of blood viscosity on pulse wave propagation cannot be neglected; as the elastic modulus increases, the propagation speed and the blood stream pressure both rise; and when the diameter of the blood vessel decreases, the blood stream pressure also rises and the pulse wave speed increases. These results will contribute to using pulse wave information to analyse, and to aid in diagnosing, some causes of human disease.
Analysing Terrorism from a Systems Thinking Perspective
Directory of Open Access Journals (Sweden)
Lukas Schoenenberger
2014-02-01
Given the complexity of terrorism, solutions based on single factors are destined to fail. Systems thinking offers various tools for helping researchers and policy makers comprehend terrorism in its entirety. We have developed a semi-quantitative systems thinking approach for characterising relationships between variables critical to terrorism and their impact on the system as a whole. For a better understanding of the mechanisms underlying terrorism, we present a 16-variable model characterising the critical components of terrorism and perform a series of highly focused analyses. We show how to determine which variables are best suited for government intervention, describing in detail their effects on the key variable—the political influence of a terrorist network. We also offer insights into how to elicit variables that destabilise and ultimately break down these networks. Because we clarify our novel approach with fictional data, the primary importance of this paper lies in the new framework for reasoning that it provides.
First international intercomparison of image analysers
Pálfalvi, J; Eoerdoegh, I
1999-01-01
Image analyser systems used for evaluating solid state nuclear track detectors (SSNTD) were compared in order to establish the minimum hardware and software requirements and the methodology necessary in different fields of radiation dosimetry. For this purpose, CR-39 detectors (TASL, Bristol, U.K.) were irradiated with different (n,alpha) and (n,p) converters in a reference Pu-Be neutron field, in an underground laboratory with high radon concentration, and by different alpha sources at the Atomic Energy Research Institute (AERI) in Budapest, Hungary. Six sets of etched and pre-evaluated detectors, and a seventh without etching, were distributed among 14 laboratories from 11 countries. The participants measured the different track parameters and statistically evaluated the results to determine the performance of their systems. The statistical analysis of the results showed high deviations from the mean values in many cases. As the conclusion of the intercomparison, recommendations were given to fulfill those requirements ...
Genetic Analyses of Meiotic Recombination in Arabidopsis
Institute of Scientific and Technical Information of China (English)
无
2007-01-01
Meiosis is essential for sexual reproduction, and recombination is a critical step required for normal meiosis. Understanding the underlying molecular mechanisms that regulate recombination is important for medical, agricultural and ecological reasons. Readily available molecular and cytological tools make Arabidopsis an excellent system in which to study meiosis. Here we review recent developments in molecular genetic analyses of meiotic recombination. These include studies on plant homologs of yeast and animal genes, as well as novel genes that were first identified in plants. The characterization of these genes has demonstrated essential functions from the initiation of recombination by double-strand breaks to the repair of such breaks, and from the formation of double Holliday junctions to the possible resolution of these junctions, both of which are critical for crossover formation. These recent advances have ushered in a new era in plant meiosis, in which the combination of genetics, genomics, and molecular cytology can uncover important gene functions.
Externalizing Behaviour for Analysing System Models
DEFF Research Database (Denmark)
Ivanova, Marieta Georgieva; Probst, Christian W.; Hansen, René Rydhof
2013-01-01
System models have recently been introduced to model organisations and evaluate their vulnerability to threats, especially insider threats. For the latter these models are very suitable, since insiders can be assumed to have more knowledge about the attacked organisation than outside attackers. Therefore, many attacks are considerably easier to perform for insiders than for outsiders. However, current models do not support explicit specification of different behaviours. Instead, behaviour is deeply embedded in the analyses supported by the models, meaning that it is a complex, if not impossible, task to change behaviours. Especially when considering social engineering or the human factor in general, the ability to use different kinds of behaviours is essential. In this work we present an approach to make the behaviour a separate component in system models, and explore how to integrate...
Autisme et Douleur – Analyse Bibliographique
Amandine Dubois; Cécile Rattaz; René Pry; Amaria Baghdadli
2010-01-01
This bibliographic review aims to take stock of the work published in the field of pain and autism. The article first addresses the published studies concerning the modes of pain expression observed in this population. Different hypotheses that might explain the expressive particularities of people with autism are then reviewed: an excess of endorphins, particularities in sensory processing, a sociocom... deficit
Accurate renormalization group analyses in neutrino sector
Energy Technology Data Exchange (ETDEWEB)
Haba, Naoyuki [Graduate School of Science and Engineering, Shimane University, Matsue 690-8504 (Japan); Kaneta, Kunio [Kavli IPMU (WPI), The University of Tokyo, Kashiwa, Chiba 277-8568 (Japan); Takahashi, Ryo [Graduate School of Science and Engineering, Shimane University, Matsue 690-8504 (Japan); Yamaguchi, Yuya [Department of Physics, Faculty of Science, Hokkaido University, Sapporo 060-0810 (Japan)
2014-08-15
We investigate accurate renormalization group analyses in the neutrino sector between the ν-oscillation and seesaw energy scales. We consider the decoupling effects of the top quark and the Higgs boson on the renormalization group equations of the light neutrino mass matrix. Since the decoupling effects arise at the standard model scale and are independent of high energy physics, our method can in principle be applied to any model beyond the standard model. We find that the decoupling effects of the Higgs boson are negligible, while those of the top quark are not. In particular, the decoupling effects of the top quark affect the neutrino mass eigenvalues, which are important for analyzing predictions such as mass squared differences and neutrinoless double beta decay in an underlying theory existing at a high energy scale.
Risques naturels en montagne et analyse spatiale
Directory of Open Access Journals (Sweden)
Yannick Manche
1999-06-01
The concept of risk rests on two notions: the hazard, which represents the physical phenomenon through its magnitude and return period; and vulnerability, which represents the set of assets and people that can be affected by a natural phenomenon. Risk is then defined as the intersection of these two notions. This theoretical view makes it possible to model hazards and vulnerability independently. This work focuses mainly on taking vulnerability into account in the management of natural risks. Its assessment necessarily involves a certain amount of spatial analysis that takes into account human occupation and the different scales of land use. But spatial assessment, whether of assets and people or of indirect effects, runs into numerous problems. The extent of land occupation must be estimated. Moreover, processing the data involves constant changes of scale to pass from point elements to surfaces, which geographic information systems do not handle perfectly. Risk management entails strong planning constraints; taking vulnerability into account makes it possible to better understand and manage the spatial constraints that natural risks imply. Keywords: hazard, spatial analysis, natural risks, GIS, vulnerability.
GPU based framework for geospatial analyses
Cosmin Sandric, Ionut; Ionita, Cristian; Dardala, Marian; Furtuna, Titus
2017-04-01
Parallel processing on multiple CPU cores is already used at large scale in geocomputing, but parallel processing on graphics cards is just beginning. Being able to use a simple laptop with a dedicated graphics card for advanced and very fast geocomputation is an advantage that every scientist wants to have. The need for high-speed computation in the geosciences has increased over the last 10 years, mostly due to the growth of the available datasets. These datasets are becoming more and more detailed and hence require more space to store and more time to process. Distributed computation on multicore CPUs and GPUs plays an important role, processing these big datasets in small parts, one by one. This approach speeds up the process because, instead of using just one process per dataset, the user can use all the cores of a CPU or up to hundreds of cores on a GPU. The framework provides the end user with standalone tools for morphometric analyses at multiple scales. An important part of the framework is dedicated to uncertainty propagation in geospatial analyses. The uncertainty may come from the data collection, may be induced by the model, or may have many other sources. These uncertainties play an important role when a spatial delineation of a phenomenon is modelled. Uncertainty propagation is implemented inside the GPU framework using Monte Carlo simulations. The GPU framework with its standalone tools proved to be a reliable tool for modelling complex natural phenomena. The framework is based on NVidia Cuda technology and is written in the C++ programming language. The source code will be available on github at https://github.com/sandricionut/GeoRsGPU Acknowledgement: GPU framework for geospatial analysis, Young Researchers Grant (ICUB-University of Bucharest) 2016, director Ionut Sandric
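The Monte Carlo uncertainty propagation described in this abstract can be sketched as repeated perturbation of the input raster. The snippet below is only an illustration of the idea, not the GeoRsGPU code: the finite-difference slope model, the noise level and the synthetic DEM are my assumptions, and NumPy stands in for the framework's CUDA kernels.

```python
import numpy as np

def slope_percent(dem, cell=30.0):
    """Simple finite-difference slope (%) for a DEM grid with the given cell size (m)."""
    dzdx = np.gradient(dem, cell, axis=1)
    dzdy = np.gradient(dem, cell, axis=0)
    return 100.0 * np.hypot(dzdx, dzdy)

def monte_carlo_slope(dem, sigma=1.0, n_runs=200, seed=0):
    """Propagate DEM elevation uncertainty (std dev `sigma`, metres)
    through the slope model by Monte Carlo perturbation."""
    rng = np.random.default_rng(seed)
    runs = np.stack([slope_percent(dem + rng.normal(0.0, sigma, dem.shape))
                     for _ in range(n_runs)])
    # per-cell mean and standard deviation across the perturbed runs
    return runs.mean(axis=0), runs.std(axis=0)

dem = np.outer(np.arange(50), np.ones(50)) * 2.0  # synthetic tilted plane, 2 m rise per row
mean_slope, slope_sd = monte_carlo_slope(dem, sigma=0.5)
```

Each run perturbs the DEM with Gaussian elevation noise and recomputes the slope; the per-cell spread across runs is the propagated uncertainty.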
Topological Analyses of Symmetric Eruptive Prominences
Panasenco, O.; Martin, S. F.
Erupting prominences (filaments) that we have analyzed from Hα Doppler data at Helio Research and from SOHO/EIT 304 Å show strong coherency between their chirality, the direction of the vertical and lateral motions of the top of the prominences, and the directions of twisting of their legs. These coherent properties in erupting prominences occur in two patterns of opposite helicity; they constitute a form of dynamic chirality called the ``roll effect.'' Viewed from the positive network side as they erupt, many symmetrically-erupting dextral prominences develop rolling motion toward the observer along with right-hand helicity in the left leg and left-hand helicity in the right leg. Many symmetrically-erupting sinistral prominences, also viewed from the positive network field side, have the opposite pattern: rolling motion at the top away from the observer, left-hand helical twist in the left leg, and right-hand twist in the right leg. We have analysed the motions seen in the famous movie of the ``Grand Daddy'' erupting prominence and found that it has all the motions that define the roll effect. From our analyses of this and other symmetric erupting prominences, we show that the roll effect is an alternative to the popular hypothetical configuration of an eruptive prominence as a twisted flux rope or flux tube. Instead we find that a simple flat ribbon can be bent such that it reproduces nearly all of the observed forms. The flat ribbon is the most logical beginning topology because observed prominence spines already have this topology prior to eruption, and an initial long magnetic ribbon with parallel, non-twisted threads, as a basic form, can be bent into many more and different geometrical forms than a flux rope.
WIND SPEED AND ENERGY POTENTIAL ANALYSES
Directory of Open Access Journals (Sweden)
A. TOKGÖZLÜ
2013-01-01
This paper provides a case study on the application of wavelet techniques to analyze wind speed and energy (renewable and environmentally friendly energy). Solar and wind are main sources of energy that give farmers the potential to use the kinetic energy captured by a wind mill for pumping water, drying crops, heating greenhouses, rural electrification or cooking. Larger wind turbines (over 1 MW) can pump enough water for small-scale irrigation. This study initiated the data-gathering process for wavelet analyses of different scale effects and their role in wind speed and direction variations. The wind data gathering system is mounted at latitude 37°50' N, longitude 30°33' E, at a height of 1200 m above mean sea level, on a hill near the Süleyman Demirel University campus. Ten-minute average values of wind speed and direction at two levels (10 m and 30 m above ground level) were recorded by a data logger between July 2001 and February 2002. Wind speed values ranged between 0 m/s and 54 m/s. The annual mean speed is 4.5 m/s at the 10 m level. Prevalent wind
Fast and accurate methods for phylogenomic analyses
Directory of Open Access Journals (Sweden)
Warnow Tandy
2011-10-01
Background: Species phylogenies are not estimated directly, but rather through phylogenetic analyses of different gene datasets. However, true gene trees can differ from the true species tree (and hence from one another) due to biological processes such as horizontal gene transfer, incomplete lineage sorting, and gene duplication and loss, so that no single gene tree is a reliable estimate of the species tree. Several methods have been developed to estimate species trees from estimated gene trees, differing according to the specific algorithmic technique used and the biological model used to explain differences between species and gene trees. Relatively little is known about the relative performance of these methods. Results: We report on a study evaluating several different methods for estimating species trees from sequence datasets, simulating sequence evolution under a complex model including indels (insertions and deletions), substitutions, and incomplete lineage sorting. The most important finding of our study is that some fast and simple methods are nearly as accurate as the most accurate methods, which employ sophisticated statistical methods and are computationally quite intensive. We also observe that methods that explicitly consider errors in the estimated gene trees produce more accurate trees than methods that assume the estimated gene trees are correct. Conclusions: Our study shows that highly accurate estimations of species trees are achievable, even when gene trees differ from each other and from the species tree, and that these estimations can be obtained using fairly simple and computationally tractable methods.
Network-Based and Binless Frequency Analyses.
Directory of Open Access Journals (Sweden)
Sybil Derrible
We introduce and develop a new network-based and binless methodology to perform frequency analyses and produce histograms. In contrast with traditional frequency analysis techniques that use fixed intervals to bin values, we place a range ±ζ around each individual value in a data set and count the number of values within that range, which allows us to compare every single value of a data set with one another. In essence, the methodology is identical to the construction of a network, where two values are connected if they lie within a given range (±ζ). The value with the highest degree (i.e., most connections) is therefore assimilated to the mode of the distribution. To select an optimal range, we look at the stability of the proportion of nodes in the largest cluster. The methodology is validated by sampling 12 typical distributions, and it is applied to a number of real-world data sets with both spatial and temporal components. The methodology can be applied to any data set and provides a robust means to uncover meaningful patterns and trends. A free python script and a tutorial are also made available to facilitate the application of the method.
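The ±ζ counting scheme described in this abstract amounts to building an adjacency matrix and taking the node of highest degree as the mode. The following is my own minimal reconstruction of that idea (the function name, ζ value and toy data are invented for illustration, not taken from the authors' released script):

```python
import numpy as np

def binless_mode(values, zeta):
    """Network-based binless frequency analysis: connect two values if
    they lie within +/-zeta of each other; the value with the highest
    degree (most connections) approximates the mode."""
    v = np.asarray(values, dtype=float)
    # adjacency matrix: |v_i - v_j| <= zeta, excluding self-connections
    adj = np.abs(v[:, None] - v[None, :]) <= zeta
    np.fill_diagonal(adj, False)
    degree = adj.sum(axis=1)  # per-value connection count
    return v[np.argmax(degree)], degree

# 2.1 lies within 0.15 of both 2.0 and 2.2, so it has the highest degree
mode, deg = binless_mode([1.0, 2.0, 2.1, 2.2, 8.0], zeta=0.15)
```

The degree vector here plays the role of the binless histogram: plotting degree against value gives the frequency profile without any fixed bin edges.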
Evaluation of the Olympus AU-510 analyser.
Farré, C; Velasco, J; Ramón, F
1991-01-01
The selective multitest Olympus AU-510 analyser was evaluated according to the recommendations of the Comision de Instrumentacion de la Sociedad Española de Quimica Clinica and the European Committee for Clinical Laboratory Standards. The evaluation was carried out in two stages: an examination of the analytical units and then an evaluation under routine working conditions. The operational characteristics of the system were also studied. The first stage included a photometric study: depending on the absorbance, the inaccuracy varies from +0.5% to -0.6% at 405 nm and from -5.6% to 10.6% at 340 nm; the imprecision ranges between -0.22% and 0.56% at 405 nm and between 0.09% and 2.74% at 340 nm. Linearity was acceptable, apart from a very low absorbance for NADH at 340 nm, and the imprecision of the serum sample pipetter was satisfactory. Twelve serum analytes were studied under routine conditions: glucose, urea, urate, cholesterol, triglycerides, total bilirubin, creatinine, phosphate, iron, aspartate aminotransferase, alanine aminotransferase and gamma-glutamyl transferase. The within-run imprecision (CV%) ranged from 0.67% for phosphate to 2.89% for iron, and the between-run imprecision from 0.97% for total bilirubin to 7.06% for iron. There was no carryover in a study of the serum sample pipetter. Carry-over studies with the reagent and sample pipetters showed some cross-contamination in the iron assay.
Trend Analyses of Nitrate in Danish Groundwater
Hansen, B.; Thorling, L.; Dalgaard, T.; Erlandsen, M.
2012-04-01
This presentation assesses the long-term development of the nitrate concentration in oxic groundwater and of nitrogen (N) losses due to intensive farming in Denmark. Firstly, up to 20-year time series from the national groundwater monitoring network enable a statistically systematic analysis of the distribution, trends and trend reversals in the groundwater nitrate concentration. Secondly, knowledge about the N surplus in Danish agriculture since 1950 is used as an indicator of the potential loss of N. Thirdly, CFC (chlorofluorocarbon) age determination of groundwater recharge allows linking of the first two datasets. The development of the nitrate concentration in oxic groundwater clearly mirrors the development of the national agricultural N surplus, and a corresponding trend reversal is found in groundwater. Regulation and technical improvements in intensive farming in Denmark have succeeded in decreasing the N surplus by 40% since the mid 1980s while at the same time maintaining crop yields and increasing animal production, especially of pigs. Trend analyses prove that the youngest (0-15 years old) oxic groundwater shows more pronounced significant downward nitrate trends (44%) than the oldest (25-50 years old) oxic groundwater (9%). This amounts to clear evidence of the effect of reduced nitrate leaching on groundwater nitrate concentrations in Denmark. Is the Danish groundwater monitoring strategy optimal for the detection of nitrate trends? Will the nitrate concentrations in Danish groundwater continue to decrease, or are the Danish nitrate concentration levels now appropriate according to the Water Framework Directive?
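Trend detection in monitoring series of this kind is commonly done with the nonparametric Mann-Kendall test. The abstract does not state which test the authors applied, so the sketch below is only a plausible illustration; the function and the synthetic nitrate series are my own:

```python
import numpy as np
from math import erf, sqrt

def mann_kendall(x):
    """Mann-Kendall trend test (ties ignored): returns the S statistic
    and a two-sided p-value from the normal approximation."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # S = sum over all pairs i<j of sign(x[j] - x[i])
    s = 0
    for i in range(n - 1):
        s += int(np.sign(x[i + 1:] - x[i]).sum())
    var = n * (n - 1) * (2 * n + 5) / 18.0  # variance of S under H0
    if s > 0:
        z = (s - 1) / sqrt(var)
    elif s < 0:
        z = (s + 1) / sqrt(var)
    else:
        z = 0.0
    # two-sided p-value via the standard normal CDF
    p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return s, p

# a steadily declining synthetic nitrate series: strongly negative S, small p
s, p = mann_kendall([50, 48, 47, 45, 44, 42, 41, 40, 38, 37])
```

A negative S with a small p-value indicates a significant downward trend, which is the kind of evidence the abstract reports for the youngest oxic groundwater.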
Consumption patterns and perception analyses of hangwa.
Kwock, Chang Geun; Lee, Min A; Park, So Hyun
2012-03-01
Hangwa is a traditional food that fits current consumption trends but needs marketing strategies to extend its consumption. Therefore, the purpose of this study was to analyze consumers' consumption patterns and perception of Hangwa in order to increase its consumption in the market. A questionnaire was sent to 250 consumers by e-mail from Oct 8∼23, 2009, and the data from 231 persons were analyzed in this study. Descriptive statistics, paired-samples t-tests, and importance-performance analyses were conducted using SPSS WIN 17.0. According to the results, Hangwa was purchased mainly 'as a present' (39.8%), and the main reasons for buying it were its 'traditional image' (33.3%) and 'taste' (22.5%). When the importance and performance of attributes considered in purchasing Hangwa were evaluated, performance was assessed to be lower than importance for all attributes. The attributes in the first quadrant, with high importance and high performance, were 'a sanitary process', 'a rigorous quality mark' and 'taste', which were related to the quality of the products. The attributes with high importance but low performance were 'popularization through advertisement', 'promotion through mass media', 'conversion of thought on traditional foods', 'a reasonable price' and 'a wide range of prices'. In conclusion, Hangwa manufacturers need to diversify products and extend the expiration date based on technologies to promote consumption. In terms of price, Hangwa should become more affordable by lowering the price barrier for consumers who are sensitive to price.
Comparative analyses of bidirectional promoters in vertebrates
Directory of Open Access Journals (Sweden)
Taylor James
2008-05-01
Background: Orthologous genes with deep phylogenetic histories are likely to retain similar regulatory features. In this report we utilize orthology assignments for pairs of genes co-regulated by bidirectional promoters to map the ancestral history of the promoter regions. Results: Our mapping of bidirectional promoters from humans to fish shows that many such promoters emerged after the divergence of chickens and fish. Furthermore, annotations of promoters in deep phylogenies enable detection of missing data or assembly problems present in higher vertebrates. The functional importance of bidirectional promoters is indicated by selective pressure to maintain the arrangement of genes regulated by the promoter over long evolutionary time spans. Characteristics unique to bidirectional promoters are further elucidated using a technique for unsupervised classification, known as ESPERR. Conclusion: Results of these analyses will aid in our understanding of the evolution of bidirectional promoters, including whether the regulation of two genes evolved as a consequence of their proximity or if function dictated their co-regulation.
Kinematic gait analyses in healthy Golden Retrievers
Directory of Open Access Journals (Sweden)
Gabriela C.A. Silva
2014-12-01
Full Text Available Kinematic analysis concerns the relative movement between rigid bodies and is applied to gait and other body movements; interpreting changes in these data guides the choice of treatment. The objective of this study was to standardize the gait of healthy Golden Retrievers to assist in the diagnosis and treatment of musculoskeletal disorders. A kinematic analysis system was used to analyse the gait of seven clinically normal female Golden Retrievers, aged between 2 and 4 years and weighing 21.5 to 28 kg. Flexion and extension were described for the shoulder, elbow, carpal, hip, femorotibial and tarsal joints. The gait was characterized from the lateral view, and the hypothesis of normality was accepted for all variables except the stance phase of the hip and elbow, at a confidence level of 95% (significance level α = 0.05). Variations were attributed to displacement of the marker stripes during movement and to the duplicated number of evaluations. Kinematic analysis proved to be a consistent method for evaluating movement during canine gait, and the data can be used in the diagnosis and evaluation of canine gait, in comparison with other studies, and in the treatment of dogs with musculoskeletal disorders.
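Joint flexion/extension angles of the kind reported here are conventionally computed from marker coordinates; a minimal sketch with made-up 2-D marker positions (not data from this study):

```python
import math

# Included angle at a joint, from three markers placed on the proximal segment,
# the joint centre, and the distal segment, as in kinematic gait analysis.

def joint_angle(prox, joint, dist):
    """Included angle (degrees) at `joint` between the two limb segments."""
    v1 = (prox[0] - joint[0], prox[1] - joint[1])
    v2 = (dist[0] - joint[0], dist[1] - joint[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    return math.degrees(math.acos(dot / (math.hypot(*v1) * math.hypot(*v2))))

# Toy example: perpendicular segments give a 90-degree joint angle.
angle = joint_angle((0.0, 1.0), (0.0, 0.0), (1.0, 0.0))
```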
A new modular chemiluminescence immunoassay analyser evaluated.
Ognibene, A; Drake, C J; Jeng, K Y; Pascucci, T E; Hsu, S; Luceri, F; Messeri, G
2000-03-01
Thyrotropin (TSH), free thyroxine (fT4) and testosterone assays have been used as probes to evaluate the performance of a new modular chemiluminescence (CL) immunoassay analyser, the Abbott Architect 2000. The evaluation was run in parallel on other systems that use CL as the detection reaction: DPC Immulite, Chiron Diagnostics ACS-180 and ACS Centaur (TSH functional sensitivity only). TSH functional sensitivity was 0.0012, 0.009, 0.033 and 0.039 mU/l for the Architect, Immulite, ACS Centaur and ACS-180, respectively. Testosterone functional sensitivity was 0.38, 3.7 and 2.0 nmol/l for the Architect, Immulite and ACS-180, respectively. Good correlation was obtained between the ACS-180 and Architect for all assays. The Immulite did not agree well with the Architect or ACS-180 for fT4 and testosterone but was in good agreement for TSH. For fT4 and testosterone, equilibrium dialysis and isotope-dilution gas chromatography-mass spectrometry (GC-MS), respectively, were used as reference methods. For both within- and between-run precision, the Architect showed the best reproducibility for all three analytes (CV < 6%).
Statistical analyses of a screen cylinder wake
Mohd Azmi, Azlin; Zhou, Tongming; Zhou, Yu; Cheng, Liang
2017-02-01
The evolution of a screen cylinder wake was studied by analysing its statistical properties over a streamwise range of x/d = 10 to 60. The screen cylinder was made of a stainless steel screen mesh of 67% porosity. The experiments were conducted in a wind tunnel at a Reynolds number of 7000 using an X-probe. The results were compared with those obtained in the wake generated by a solid cylinder. It was observed that the evolution of the statistics in the wake of the screen cylinder was different from that of a solid cylinder, reflecting the differences in the formation of the organized large-scale vortices in both wakes. The streamwise evolution of the Reynolds stresses, energy spectra and cross-correlation coefficients indicated that there exists a critical location that differentiates the screen cylinder wake into two regions over the measured streamwise range. The formation of the fully formed large-scale vortices was delayed until this critical location. Comparison with existing results for screen strips showed that although the near-wake characteristics and the vortex formation mechanism were similar between the two wake generators, variation in the Strouhal frequencies was observed and the self-preservation states were non-universal, reconfirming the dependence of a wake on its initial condition.
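Extracting a Strouhal frequency from an X-probe velocity record amounts to locating the spectral peak and non-dimensionalizing it by the cylinder diameter and free-stream velocity; a sketch on a synthetic signal, with the diameter, velocity and imposed shedding frequency invented for illustration (not the experiment's values):

```python
import numpy as np

# Strouhal number St = f_shed * d / U from the power-spectrum peak of a
# wake velocity signal. The signal is synthetic: a sine at the shedding
# frequency plus Gaussian noise standing in for turbulence.

fs = 2000.0                        # sampling frequency, Hz
d, U = 0.02, 7.0                   # cylinder diameter (m), free-stream velocity (m/s)
f_shed = 70.0                      # imposed shedding frequency for the sketch, Hz

t = np.arange(0.0, 2.0, 1.0 / fs)
rng = np.random.default_rng(0)
u = np.sin(2 * np.pi * f_shed * t) + 0.3 * rng.standard_normal(t.size)

spec = np.abs(np.fft.rfft(u)) ** 2
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
f_peak = freqs[np.argmax(spec[1:]) + 1]   # skip the DC bin
strouhal = f_peak * d / U                 # here 70 * 0.02 / 7 = 0.2
```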
ANALYSES AND INFLUENCES OF GLAZED BUILDING ENVELOPES
Directory of Open Access Journals (Sweden)
Sabina Jordan
2011-01-01
Full Text Available The article presents the results of an analytical study of the functioning of glazing at two different yet interacting levels: at the level of the building as a whole, and at that of glazing as a building element. At the building level, analyses were performed on a sample of high-rise business buildings in Slovenia, where the glazing's share of the building envelope was calculated, and estimates of the proportion of shade provided by external blinds were made. It is shown that, especially in the case of modern buildings with large proportions of glazing and buildings with no shading devices, careful glazing design is needed, together with a sound knowledge of energy performance. In the second part of the article, the energy balance values relating to selected types of glazing are presented, including solar control glazing. The paper demonstrates the need for a holistic energy approach to glazing problems, as well as how different types of glazing can be methodically compared, thus improving the design of sustainability-orientated buildings.
Genomic analyses of the CAM plant pineapple.
Zhang, Jisen; Liu, Juan; Ming, Ray
2014-07-01
The innovation of crassulacean acid metabolism (CAM) photosynthesis in arid and/or low CO2 conditions is a remarkable case of adaptation in flowering plants. Pineapple is the most important crop that utilizes CAM photosynthesis, and its genetic and genomic resources have been developed over many years. Genetic diversity studies using various types of DNA markers led to the reclassification of the two genera Ananas and Pseudananas and nine species into one genus Ananas and two species, A. comosus and A. macrodontes, with five botanical varieties in A. comosus. Five genetic maps have been constructed using F1 or F2 populations, and high-density genetic maps generated by genotype sequencing are essential resources for sequencing and assembling the pineapple genome and for marker-assisted selection. There are abundant expressed sequence tag resources but limited genomic sequences in pineapple. Genes involved in the CAM pathway have been analysed in several CAM plants, but only a few of them are from pineapple. A reference genome of pineapple is being generated and will accelerate genetic and genomic research in this major CAM crop. This reference genome provides the foundation for studying the origin and regulatory mechanism of CAM photosynthesis, and the opportunity to evaluate the classification of Ananas species and botanical cultivars.
Abundance analyses of cool extreme helium stars
Pandey, G; Lambert, D L; Jeffery, C S; Asplund, M; Pandey, Gajendra; Lambert, David L.; Asplund, Martin
2001-01-01
Extreme helium stars (EHe) with effective temperatures from 8000K to 13000K are among the coolest EHe stars and overlap the hotter R CrB stars in effective temperature. The cool EHes may represent an evolutionary link between the hot EHes and the R CrBs. Abundance analyses of four cool EHes are presented. To test for an evolutionary connection, the chemical compositions of cool EHes are compared with those of hot EHes and R CrBs. Relative to Fe, the N abundance of these stars is intermediate between those of hot EHes and R CrBs. For the R CrBs, the metallicity M derived from the mean of Si and S appears to be more consistent with the kinematics than that derived from Fe. When metallicity M derived from Si and S replaces Fe, the observed N abundances of EHes and R CrBs fall at or below the upper limit corresponding to thorough conversion of initial C and O to N. There is an apparent difference between the composition of R CrBs and EHes; the former having systematically higher [N/M] ratios. The material present...
Social Media Analyses for Social Measurement.
Schober, Michael F; Pasek, Josh; Guggenheim, Lauren; Lampe, Cliff; Conrad, Frederick G
2016-01-01
Demonstrations that analyses of social media content can align with measurement from sample surveys have raised the question of whether survey research can be supplemented or even replaced with less costly and burdensome data mining of already-existing or "found" social media content. But just how trustworthy such measurement can be, say, as a replacement for official statistics, is unknown. Survey researchers and data scientists approach key questions from starting assumptions and analytic traditions that differ on, for example, the need for representative samples drawn from frames that fully cover the population. New conversations between these scholarly communities are needed to understand the potential points of alignment and non-alignment. Across these approaches, there are major differences in (a) how participants (survey respondents and social media posters) understand the activity they are engaged in; (b) the nature of the data produced by survey responses and social media posts, and the inferences that are legitimate given the data; and (c) practical and ethical considerations surrounding the use of the data. Estimates are likely to align to differing degrees depending on the research topic and the populations under consideration, the particular features of the surveys and social media sites involved, and the analytic techniques for extracting opinions and experiences from social media. Traditional population coverage may not be required for social media content to effectively predict social phenomena to the extent that social media content distills or summarizes broader conversations that are also measured by surveys.
Field analyses of tritium at environmental levels
Energy Technology Data Exchange (ETDEWEB)
Hofstetter, K.J.; Cable, P.R.; Beals, D.M
1999-02-11
An automated, remote system to analyze tritium in aqueous solutions at environmental levels has been tested and has demonstrated laboratory-quality tritium analysis capability in near real time. The field deployable tritium analysis system (FDTAS) consists of a novel multi-port autosampler, an on-line water purification system, and a prototype stop-flow liquid scintillation counter (LSC) which can be remotely controlled for unmanned operation. Backgrounds of approximately 1.5 counts/min in the tritium channel are routinely measured, with a tritium detection efficiency of approximately 25% for the custom 11 ml cell. A detection limit of <0.3 pCi/ml has been achieved for 100-min counts using a 50:50 mixture of sample and cocktail. To assess the long-term performance characteristics of the FDTAS, a composite sampler was installed on the Savannah River, downstream of the Savannah River Site, and collected repetitive 12-hour composite samples over a 14-day period. The samples were analyzed using the FDTAS and in the laboratory using a standard bench-top LSC. The results of the tritium analyses by the FDTAS and by the laboratory LSC were consistent for comparable counting times at the typical river tritium background levels (approximately 1 pCi/ml).
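The quoted detection limit can be sanity-checked with standard counting statistics; a hedged sketch in the spirit of Currie's detection limit, L_D ≈ 2.71 + 4.65·√B counts, with efficiency, count time and sample volume loosely based on the figures above (the exact FDTAS calculation may differ):

```python
import math

# Minimum detectable activity (MDA) for a liquid scintillation count, using
# the Currie-style detection limit in counts, then converting to pCi/ml.
# Inputs: ~1.5 cpm background, 100-min count, ~25% efficiency, and a 5.5 ml
# sample aliquot (half of the 11 ml cell, 50:50 sample/cocktail).

def minimum_detectable_activity(bkg_cpm, count_min, efficiency, sample_ml,
                                dpm_per_pci=2.22):
    b = bkg_cpm * count_min                  # expected background counts
    ld_counts = 2.71 + 4.65 * math.sqrt(b)   # Currie detection limit, counts
    net_cpm = ld_counts / count_min          # net count rate at the limit
    dpm = net_cpm / efficiency               # disintegrations per minute
    return dpm / dpm_per_pci / sample_ml     # activity concentration, pCi/ml

mda = minimum_detectable_activity(bkg_cpm=1.5, count_min=100,
                                  efficiency=0.25, sample_ml=5.5)
```

With these illustrative inputs the sketch gives roughly 0.2 pCi/ml, consistent with the reported <0.3 pCi/ml limit.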
Reliability Analyses of Groundwater Pollutant Transport
Energy Technology Data Exchange (ETDEWEB)
Dimakis, Panagiotis
1997-12-31
This thesis develops a probabilistic finite element model for the analysis of groundwater pollution problems. Two computer codes were developed: (1) a finite element code that solves the two-dimensional steady-state equations of groundwater flow and pollutant transport, and (2) a first-order reliability method code that can perform a probabilistic analysis of any given analytical or numerical equation. The two codes were combined into one model, PAGAP (Probability Analysis of Groundwater And Pollution). PAGAP can be used to obtain (1) the probability that the concentration at a given point at a given time will exceed a specified value, (2) the probability that the maximum concentration at a given point will exceed a specified value, and (3) the probability that the residence time at a given point will exceed a specified period. PAGAP can serve as a tool for assessment and risk analyses, for instance to assess the efficiency of a proposed remediation technique or to study the effects of parameter distributions for a given problem (sensitivity study). The model has been applied to study the largest self-sustained, precipitation-controlled aquifer in northern Europe, which underlies Oslo's new main airport. 92 refs., 187 figs., 26 tabs.
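A Monte Carlo counterpart to the first-order reliability calculation can illustrate what PAGAP's outputs mean; the response function and parameter distributions below are invented for the sketch (PAGAP itself couples a first-order reliability method to the finite element model rather than sampling):

```python
import random

# Estimate the probability that a simulated concentration exceeds a limit
# when input parameters are uncertain. The "concentration" response is a toy
# stand-in for the finite element solution.

random.seed(1)

def concentration(velocity, dispersivity):
    # Toy response: higher velocity and lower dispersivity -> higher peak.
    return 100.0 * velocity / (1.0 + dispersivity)

limit = 40.0
n = 100_000
exceed = 0
for _ in range(n):
    v = random.gauss(0.5, 0.1)    # uncertain seepage velocity (arbitrary units)
    a = random.gauss(0.8, 0.2)    # uncertain dispersivity (arbitrary units)
    if concentration(v, a) > limit:
        exceed += 1
prob = exceed / n                 # analytically about 0.04 for these inputs
```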
Reproducibility of neuroimaging analyses across operating systems.
Glatard, Tristan; Lewis, Lindsay B; Ferreira da Silva, Rafael; Adalat, Reza; Beck, Natacha; Lepage, Claude; Rioux, Pierre; Rousseau, Marc-Etienne; Sherif, Tarek; Deelman, Ewa; Khalili-Mahani, Najmeh; Evans, Alan C
2015-01-01
Neuroimaging pipelines are known to generate different results depending on the computing platform where they are compiled and executed. We quantify these differences for brain tissue classification, fMRI analysis, and cortical thickness (CT) extraction, using three of the main neuroimaging packages (FSL, Freesurfer and CIVET) and different versions of GNU/Linux. We also identify some causes of these differences using library and system call interception. We find that these packages use mathematical functions based on single-precision floating-point arithmetic whose implementations in operating systems continue to evolve. While these differences have little or no impact on simple analysis pipelines such as brain extraction and cortical tissue classification, their accumulation creates important differences in longer pipelines such as subcortical tissue classification, fMRI analysis, and cortical thickness extraction. With FSL, most Dice coefficients between subcortical classifications obtained on different operating systems remain above 0.9, but values as low as 0.59 are observed. Independent component analyses (ICA) of fMRI data differ between operating systems in one third of the tested subjects, due to differences in motion correction. With Freesurfer and CIVET, in some brain regions we find an effect of build or operating system on cortical thickness. A first step to correct these reproducibility issues would be to use more precise representations of floating-point numbers in the critical sections of the pipelines. The numerical stability of pipelines should also be reviewed.
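The Dice coefficient used above to compare subcortical classifications across operating systems is straightforward to compute; a sketch on toy label arrays (not actual segmentation output):

```python
# Dice coefficient for one label between two segmentations:
# 2 * |A ∩ B| / (|A| + |B|), where A and B are the voxel sets with that label.

def dice(a, b, label):
    a_mask = [x == label for x in a]
    b_mask = [x == label for x in b]
    inter = sum(1 for x, y in zip(a_mask, b_mask) if x and y)
    return 2.0 * inter / (sum(a_mask) + sum(b_mask))

# Toy "segmentations" of the same volume produced on two operating systems.
seg_os_a = [0, 1, 1, 1, 2, 2, 0, 1]
seg_os_b = [0, 1, 1, 2, 2, 2, 0, 1]
d = dice(seg_os_a, seg_os_b, label=1)   # 2*3 / (4+3) = 6/7
```

A value of 1 means identical label masks; the paper reports most cross-OS values above 0.9, with outliers as low as 0.59.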
Non-independence and sensitivity analyses in ecological and evolutionary meta-analyses.
Noble, Daniel W A; Lagisz, Malgorzata; O'dea, Rose E; Nakagawa, Shinichi
2017-01-30
Meta-analysis is an important tool for synthesizing research on a variety of topics in ecology and evolution, including molecular ecology, but can be susceptible to non-independence. Non-independence can affect two major interrelated components of a meta-analysis: 1) the calculation of effect size statistics and 2) the estimation of overall meta-analytic estimates and their uncertainty. While some solutions to non-independence exist at the statistical analysis stages, there is little advice on what to do when complex analyses are not possible, or when studies with non-independent experimental designs exist in the data. Here we argue that exploring the effects of procedural decisions in a meta-analysis (e.g., inclusion of different quality data, choice of effect size) and statistical assumptions (e.g., assuming no phylogenetic covariance) using sensitivity analyses are extremely important in assessing the impact of non-independence. Sensitivity analyses can provide greater confidence in results and highlight important limitations of empirical work (e.g., impact of study design on overall effects). Despite their importance, sensitivity analyses are seldom applied to problems of non-independence. To encourage better practice for dealing with non-independence in meta-analytic studies, we present accessible examples demonstrating the impact that ignoring non-independence can have on meta-analytic estimates. We also provide pragmatic solutions for dealing with non-independent study designs, and for analyzing dependent effect sizes. Additionally, we offer reporting guidelines that will facilitate disclosure of the sources of non-independence in meta-analyses, leading to greater transparency and more robust conclusions. This article is protected by copyright. All rights reserved.
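One of the simplest sensitivity analyses the authors advocate is leave-one-out: re-estimate the pooled effect with each study removed in turn. A sketch for a fixed-effect, inverse-variance-weighted meta-analysis with invented effect sizes and variances:

```python
# Leave-one-out sensitivity analysis for a fixed-effect meta-analysis.
# The pooled estimate is the inverse-variance-weighted mean of study effects.

def pooled(effects, variances):
    w = [1.0 / v for v in variances]
    return sum(wi * e for wi, e in zip(w, effects)) / sum(w)

effects   = [0.30, 0.25, 0.90, 0.28]   # study 3 is a deliberate outlier
variances = [0.02, 0.03, 0.02, 0.04]

overall = pooled(effects, variances)
loo = [pooled(effects[:i] + effects[i + 1:], variances[:i] + variances[i + 1:])
       for i in range(len(effects))]
```

Here dropping the outlying third study pulls the pooled estimate from about 0.48 down to 0.28, exactly the kind of single-study influence a leave-one-out run is meant to expose.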
Textual discourse analysis: levels or planes of analysis
Directory of Open Access Journals (Sweden)
Jean-Michel Adam
2012-12-01
Full Text Available This article deals with the theory of Textual Discourse Analysis (ATD), starting from the Brazilian translation of La linguistique textuelle: introduction à l'analyse textuelle des discours (Cortez, 2008). ATD rests on three preliminary observations: text linguistics is one of the disciplines of discourse analysis; the text is ATD's object of analysis; and as soon as there is a text, that is, recognition that a sequence of utterances forms a communicative whole, there is an effect of genericity, that is, the inscription of that sequence of utterances in a class of discourse. The theoretical model of ATD is clarified through its diagram 4, which represents eight levels of analysis. ATD is approached from a double requirement, the theoretical and the methodological-didactic reasons that lead to these levels, and the five planes or levels of textual analysis are detailed and illustrated. Finally, parts of the work are revisited and expanded, with further analyses in which new theoretical aspects are detailed.
Fracturing and brittleness index analyses of shales
Barnhoorn, Auke; Primarini, Mutia; Houben, Maartje
2016-04-01
The formation of a fracture network in rocks exerts a crucial control on fluid flow. In addition, an existing fracture network influences the propagation of new fractures during, for example, hydraulic fracturing or a seismic event. Understanding the type and characteristics of the fracture network that will form during hydraulic fracturing is thus crucial to better predict the outcome of a hydraulic fracturing job, and this requires knowledge of the rock properties. The brittleness index is often used as a rock property to predict the fracturing behaviour of a rock, e.g. for hydraulic fracturing of shales. Several definitions of the brittleness index (BI1, BI2 and BI3) exist, based on mineralogy, elastic constants and stress-strain behaviour (Jin et al., 2014; Jarvie et al., 2007; Holt et al., 2011). A maximum brittleness index of 1 predicts very efficient fracturing behaviour, while a minimum of 0 predicts much more ductile behaviour. Here, we have performed systematic petrophysical, acoustic and geomechanical analyses on a set of shale samples from Whitby (UK) and determined the three brittleness indices on each sample. We show that the three indices differ strongly for the same sample, so the brittleness index is not a good predictor of the fracturing behaviour of shales. The brittleness indices based on the acoustic data (BI1) all lie around 0.5, those based on the stress-strain data (BI2) average around 0.75, whereas the mineralogy-based index (BI3) predicts values below 0.2. Different estimates of the brittleness index would thus lead to different hydraulic fracturing decisions. If we were to rely on the mineralogy (BI3), the Whitby mudstone is not a suitable
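Two of the brittleness-index families mentioned can be sketched directly; the normalization bounds and sample values below are illustrative (chosen to echo the reported BI3 < 0.2 and BI1 ≈ 0.5), not measurements from the Whitby samples:

```python
# Hedged sketches of two common brittleness-index definitions: a mineralogy
# index in the spirit of Jarvie et al. (2007) and an elastic-constants index
# normalizing Young's modulus and Poisson's ratio.

def bi_mineralogy(quartz, carbonate, clay):
    """Mineralogy-style index: brittle (quartz) fraction of the total."""
    return quartz / (quartz + carbonate + clay)

def bi_elastic(E_gpa, nu, e_lo=1.0, e_hi=80.0, nu_lo=0.1, nu_hi=0.4):
    """Elastic-style index: mean of normalized E (high = brittle)
    and normalized Poisson's ratio (low = brittle). Bounds are illustrative."""
    e_term = (E_gpa - e_lo) / (e_hi - e_lo)
    nu_term = (nu_hi - nu) / (nu_hi - nu_lo)
    return 0.5 * (e_term + nu_term)

bi_min = bi_mineralogy(quartz=0.15, carbonate=0.10, clay=0.75)  # clay-rich shale
bi_el = bi_elastic(E_gpa=40.0, nu=0.25)
```

On the same hypothetical sample, the mineralogy index signals ductile behaviour while the elastic index sits near the middle of the scale, illustrating how the different definitions can point to different fracturing decisions.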
Summary of the analyses for recovery factors
Verma, Mahendra K.
2017-07-17
Introduction: In order to determine the hydrocarbon potential of oil reservoirs within the U.S. sedimentary basins for which the carbon dioxide enhanced oil recovery (CO2-EOR) process has been considered suitable, the CO2 Prophet model was chosen by the U.S. Geological Survey (USGS) to be the primary source for estimating recovery-factor values for individual reservoirs. The choice was made because of the model's reliability and the ease with which it can be used to assess a large number of reservoirs. The other two approaches, the empirical decline curve analysis (DCA) method and a review of published literature on CO2-EOR projects, were deployed to verify the results of the CO2 Prophet model. This chapter discusses the results from CO2 Prophet (chapter B, by Emil D. Attanasi, this report) and compares them with results from decline curve analysis (chapter C, by Hossein Jahediesfanjani) and those reported in the literature for selected reservoirs with adequate data for analyses (chapter D, by Ricardo A. Olea). To estimate the technically recoverable hydrocarbon potential for oil reservoirs where CO2-EOR has been applied, two of the three approaches, CO2 Prophet modeling and DCA, do not include analysis of economic factors, while the third approach, review of published literature, implicitly includes economics. For selected reservoirs, DCA has provided estimates of the technically recoverable hydrocarbon volumes, which, in combination with calculated amounts of original oil in place (OOIP), helped establish incremental CO2-EOR recovery factors for individual reservoirs. The review of published technical papers and reports has provided substantial information on recovery factors for 70 CO2-EOR projects that are either commercially profitable or classified as pilot tests. When comparing the results, it is important to bear in mind the differences and limitations of these three approaches.
Pipeline for macro- and microarray analyses
Directory of Open Access Journals (Sweden)
R. Vicentini
2007-05-01
Full Text Available The pipeline for macro- and microarray analyses (PMmA) is a set of scripts with a web interface developed to analyze DNA array data generated by array image quantification software. PMmA is designed for use with single- or double-color array data and works as a pipeline with five classes of modules (data format, normalization, data analysis, clustering, and array maps). It can also be used as a plugin in the BioArray Software Environment, an open-source database for array analysis, or in a local version of the web service. All scripts in PMmA were developed in the PERL programming language, and the statistical analysis functions were implemented in the R statistical language; the package is therefore platform-independent. Our algorithms can correctly select almost 90% of the differentially expressed genes, a performance superior to other methods of analysis. The pipeline software has been applied to public macroarray data of 1536 expressed sequence tags from sugarcane exposed to cold for 3 to 48 h. PMmA identified thirty cold-responsive genes previously unidentified in this public dataset. Fourteen genes were up-regulated, two showed variable expression, and the other fourteen were down-regulated in the treatments. These new findings are likely a consequence of using a superior statistical analysis approach, since the original study did not take into account the dependence of data variability on the average signal intensity of each gene. The web interface, supplementary information, and the package source code are available, free, to non-commercial users at http://ipe.cbmeg.unicamp.br/pub/PMmA.
Computational Analyses of Pressurization in Cryogenic Tanks
Ahuja, Vineet; Hosangadi, Ashvin; Lee, Chun P.; Field, Robert E.; Ryan, Harry
2010-01-01
A comprehensive numerical framework utilizing multi-element unstructured CFD and rigorous real fluid property routines has been developed to carry out analyses of propellant tank and delivery systems at NASA SSC. Traditionally, CFD modeling of pressurization and mixing in cryogenic tanks has been difficult, primarily because the fluids in the tank co-exist in different sub-critical and supercritical states with largely varying properties that have to be accurately accounted for in order to predict the correct mixing and phase change between the ullage and the propellant. For example, during tank pressurization under some circumstances, rapid mixing of relatively warm pressurant gas with cryogenic propellant can lead to rapid densification of the gas and loss of pressure in the tank. This phenomenon can cause serious problems during testing because of the resulting decrease in propellant flow rate. With proper physical models implemented, CFD can model the coupling between the propellant and pressurant, including heat transfer and phase change effects, and accurately capture the complex physics in the evolving flowfields. This holds the promise of allowing the specification of operational conditions and procedures that could minimize the undesirable mixing and heat transfer inherent in propellant tank operation. In our modeling framework, we incorporated two different approaches to real fluids modeling: (a) the first approach is based on the HBMS model developed by Hirschfelder, Buehler, McGee and Sutton and (b) the second approach is based on a cubic equation of state developed by Soave, Redlich and Kwong (SRK). Both approaches cover fluid properties and property variation spanning sub-critical gas and liquid states as well as the supercritical states. Both models were rigorously tested, and properties for common fluids such as oxygen, nitrogen and hydrogen were compared against NIST data in both the sub-critical and supercritical regimes.
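The second real-fluid approach, the SRK cubic equation of state, can be sketched as a compressibility-factor calculation; this is a generic textbook SRK routine with handbook critical constants for nitrogen, not the NASA SSC implementation:

```python
import numpy as np

# Soave-Redlich-Kwong (SRK) compressibility factor Z from the cubic
# Z^3 - Z^2 + (A - B - B^2) Z - A B = 0, with the standard SRK a, b and
# Soave alpha(T) correlations.

R = 8.314  # J/(mol K)

def srk_z(T, P, Tc, Pc, omega):
    """Vapor-phase compressibility factor from the SRK equation of state."""
    m = 0.480 + 1.574 * omega - 0.176 * omega**2
    alpha = (1.0 + m * (1.0 - np.sqrt(T / Tc)))**2
    a = 0.42748 * R**2 * Tc**2 / Pc * alpha
    b = 0.08664 * R * Tc / Pc
    A = a * P / (R * T)**2
    B = b * P / (R * T)
    roots = np.roots([1.0, -1.0, A - B - B**2, -A * B])
    real = roots[np.isreal(roots)].real
    return real.max()   # largest real root = vapor phase

# Nitrogen at near-ambient conditions should be close to ideal (Z ~ 1).
z_n2 = srk_z(T=300.0, P=1.0e5, Tc=126.2, Pc=3.39e6, omega=0.037)
```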
Analysing policy transfer: perspectives for operational research.
Bissell, K; Lee, K; Freeman, R
2011-09-01
Policy transfer occurs regularly. In essence, a strategy developed elsewhere is taken up and applied in another policy context. Yet what precisely is policy transfer and, more importantly, under what conditions does it occur? This paper describes policy transfer and addresses three main questions, exploring what perspectives of policy transfer might contribute to operational research (OR) efforts. First, what facilitates the transfer of OR results into policy and practice? Second, what facilitates effective lesson-drawing about OR results and processes between and within countries? And third, what would increase the amount of OR being carried out by low- and middle-income countries and used to inform policy and practice at local and global levels? Mexico's adoption and adaptation of the DOTS strategy is used here as an example of policy transfer. Policy transfer is relevant to all countries, levels and arenas of people, institutions and organisations involved in health. With a more systematic analysis of learning and policy processes, OR policy and practice outcomes could be improved at all levels, from local to global. Policy transfer offers theory and concepts for analysing OR from a new perspective. The present paper proposes a model of the policy transfer process for qualitative research use. Comprehensive policy transfer research, given its length, complexity and need for qualitative researchers, should not be envisaged for all OR projects. All OR projects could, however, incorporate some concepts and practical tools inspired from this model. This should help to plan, evaluate and improve OR processes and the resulting changes in policy and practice.
Kuosheng Mark III containment analyses using GOTHIC
Energy Technology Data Exchange (ETDEWEB)
Lin, Ansheng, E-mail: samuellin1999@iner.gov.tw; Chen, Yen-Shu; Yuann, Yng-Ruey
2013-10-15
Highlights: • The Kuosheng Mark III containment model is established using GOTHIC. • Containment pressure and temperature responses due to LOCA are presented. • The calculated results are all below the design values and compared with the FSAR results. • The calculated results can serve as an analysis reference for an SPU project in the future. -- Abstract: Kuosheng nuclear power plant in Taiwan is a twin-unit BWR/6 plant, and both units utilize the Mark III containment. Currently, the plant is performing a stretch power uprate (SPU) project to increase the core thermal power to 103.7% OLTP (original licensed thermal power). However, the containment response in the Kuosheng Final Safety Analysis Report (FSAR) was completed more than twenty-five years ago. The purpose of this study is to establish a Kuosheng Mark III containment model using the containment program GOTHIC. The containment pressure and temperature responses under the design-basis accidents, which are the main steam line break (MSLB) and the recirculation line break (RCLB) accidents, are investigated. Short-term and long-term analyses are presented in this study. The short-term analysis calculates the drywell peak pressure and temperature, which occur in the early stage of the LOCAs. The long-term analysis calculates the peak pressure and temperature of the reactor building space. In the short-term analysis, the calculated peak drywell-to-wetwell differential pressure is 140.6 kPa for the MSLB, which is below the design value of 189.6 kPa. The calculated peak drywell temperature is 158 °C, which is still below the design value of 165.6 °C. In addition, in the long-term analysis, the calculated peak containment pressure is 47 kPa G, which is below the design value of 103.4 kPa G. The calculated peak containment temperature is 74.7 °C, which is lower than the design value of 93.3 °C. Therefore, the Kuosheng Mark III containment can maintain the integrity after
Genome-Facilitated Analyses of Geomicrobial Processes
Energy Technology Data Exchange (ETDEWEB)
Kenneth H. Nealson
2012-05-02
that makes up chitin, virtually all of the strains were in fact capable. This led to the discovery of a great many new genes involved with chitin and NAG metabolism (7). In a similar vein, a detailed study of the sugar utilization pathway revealed a major new insight into the regulation of sugar metabolism in this genus (19). Systems Biology and Comparative Genomics of the shewanellae: Several publications were put together describing the use of comparative genomics for analyses of the group Shewanella, and these were a logical culmination of our genomic-driven research (10,15,18). Eight graduate students received their Ph.D. degrees doing part of the work described here, and four postdoctoral fellows were supported. In addition, approximately 20 undergraduates took part in projects during the grant period.
First Super-Earth Atmosphere Analysed
2010-12-01
The atmosphere around a super-Earth exoplanet has been analysed for the first time by an international team of astronomers using ESO's Very Large Telescope. The planet, which is known as GJ 1214b, was studied as it passed in front of its parent star and some of the starlight passed through the planet's atmosphere. We now know that the atmosphere is either mostly water in the form of steam or is dominated by thick clouds or hazes. The results will appear in the 2 December 2010 issue of the journal Nature. The planet GJ 1214b was confirmed in 2009 using the HARPS instrument on ESO's 3.6-metre telescope in Chile (eso0950) [1]. Initial findings suggested that this planet had an atmosphere, which has now been confirmed and studied in detail by an international team of astronomers, led by Jacob Bean (Harvard-Smithsonian Center for Astrophysics), using the FORS instrument on ESO's Very Large Telescope. "This is the first super-Earth to have its atmosphere analysed. We've reached a real milestone on the road toward characterising these worlds," said Bean. GJ 1214b has a radius of about 2.6 times that of the Earth and is about 6.5 times as massive, putting it squarely into the class of exoplanets known as super-Earths. Its host star lies about 40 light-years from Earth in the constellation of Ophiuchus (the Serpent Bearer). It is a faint star [2], but it is also small, which means that the size of the planet is large compared to the stellar disc, making it relatively easy to study [3]. The planet travels across the disc of its parent star once every 38 hours as it orbits at a distance of only two million kilometres: about seventy times closer than the Earth orbits the Sun. To study the atmosphere, the team observed the light coming from the star as the planet passed in front of it [4]. During these transits, some of the starlight passes through the planet's atmosphere and, depending on the chemical composition and weather on the planet, specific wavelengths of light are
Analyse énonciative du point de vue, narration et analyse de discours
Directory of Open Access Journals (Sweden)
Alain Rabatel
2007-01-01
Full Text Available This article shows that the enunciative analysis of point of view (POV), breaking with Genette's typology of focalizations, can partially renew narratology, on condition that the immanentist approach to the narrative is replaced by an interactional analysis of narration. The article first presents the enunciative approach to POV, drawing on Ducrot's theories, and on that basis proposes various modes of POV (represented, recounted, asserted) that give substance to the point of view of the characters or of the narrator, substantially modifying Genette's analyses. In a second part, the article considers the role of POV in narration, notably in re-evaluating the cognitive and pragmatic dimensions of mimesis, then in inferential-interpretative mechanisms close to Jouve's system of sympathy, and finally in revaluing the role of the narrator, since the latter constructs himself at the same time as he constructs his characters.
Bowden, Jack; Fall, Tove; Ingelsson, Erik; Thompson, Simon G.
2017-01-01
Mendelian randomization investigations are becoming more powerful and simpler to perform, due to the increasing size and coverage of genome-wide association studies and the increasing availability of summarized data on genetic associations with risk factors and disease outcomes. However, when using multiple genetic variants from different gene regions in a Mendelian randomization analysis, it is highly implausible that all the genetic variants satisfy the instrumental variable assumptions. This means that a simple instrumental variable analysis alone should not be relied on to give a causal conclusion. In this article, we discuss a range of sensitivity analyses that will either support or question the validity of causal inference from a Mendelian randomization analysis with multiple genetic variants. We focus on sensitivity analyses of greatest practical relevance for ensuring robust causal inferences, and those that can be undertaken using summarized data. Aside from cases in which the justification of the instrumental variable assumptions is supported by strong biological understanding, a Mendelian randomization analysis in which no assessment of the robustness of the findings to violations of the instrumental variable assumptions has been made should be viewed as speculative and incomplete. In particular, Mendelian randomization investigations with large numbers of genetic variants without such sensitivity analyses should be treated with skepticism. PMID:27749700
Runtime and Pressurization Analyses of Propellant Tanks
Field, Robert E.; Ryan, Harry M.; Ahuja, Vineet; Hosangadi, Ashvin; Lee, Chung P.
2007-01-01
Multi-element unstructured CFD has been utilized at NASA SSC to carry out analyses of propellant tank systems in different modes of operation. The three regimes of interest at SSC include (a) tank chill-down, (b) tank pressurization, and (c) runtime propellant draw-down and purge. While tank chill-down is an important event that is best addressed with long time-scale heat transfer calculations, CFD can play a critical role in the tank pressurization and runtime modes of operation. In these situations, contamination of the propellant by inclusion of the pressurant gas from the ullage causes a deterioration of the quality of the propellant delivered to the test article. CFD can be used to help quantify the mixing and propellant degradation. During tank pressurization, under some circumstances, rapid mixing of relatively warm pressurant gas with cryogenic propellant can lead to rapid densification of the gas and loss of pressure in the tank. This phenomenon can cause serious problems during testing because of the resulting decrease in propellant flow rate. With proper physical models implemented, CFD can model the coupling between the propellant and pressurant, including heat transfer and phase change effects, and accurately capture the complex physics in the evolving flowfields. This holds the promise of allowing the specification of operational conditions and procedures that could minimize the undesirable mixing and heat transfer inherent in propellant tank operation. It should be noted that traditional CFD modeling is inadequate for such simulations because the fluids in the tank are in a range of different sub-critical and supercritical states, and elaborate phase change and mixing rules have to be developed to accurately model the interaction between the ullage gas and the propellant. We show a typical run-time simulation of a spherical propellant tank, containing RP-1 in this case, being pressurized with room-temperature nitrogen at 540 R. Nitrogen
FY01 Supplemental Science and Performance Analyses, Volume 1: Scientific Bases and Analyses, Rev 00
Energy Technology Data Exchange (ETDEWEB)
David Dobson
2001-06-30
The U.S. Department of Energy (DOE) is considering the possible recommendation of a site at Yucca Mountain, Nevada, for development as a geologic repository for the disposal of high-level radioactive waste and spent nuclear fuel. To facilitate public review and comment, in May 2001 the DOE released the Yucca Mountain Science and Engineering Report (S&ER) (DOE 2001 [DIRS 153849]), which presents technical information supporting the consideration of the possible site recommendation. The report summarizes the results of more than 20 years of scientific and engineering studies. A decision to recommend the site has not been made: the DOE has provided the S&ER and its supporting documents as an aid to the public in formulating comments on the possible recommendation. When the S&ER (DOE 2001 [DIRS 153849]) was released, the DOE acknowledged that technical and scientific analyses of the site were ongoing. Therefore, the DOE noted in the Federal Register Notice accompanying the report (66 FR 23 013 [DIRS 155009], p. 2) that additional technical information would be released before the dates, locations, and times for public hearings on the possible recommendation were announced. This information includes: (1) the results of additional technical studies of a potential repository at Yucca Mountain, contained in this FY01 Supplemental Science and Performance Analyses: Vol. 1, Scientific Bases and Analyses; and FY01 Supplemental Science and Performance Analyses: Vol. 2, Performance Analyses (McNeish 2001 [DIRS 155023]) (collectively referred to as the SSPA) and (2) a preliminary evaluation of the Yucca Mountain site's preclosure and postclosure performance against the DOE's proposed site suitability guidelines (10 CFR Part 963 [64 FR 67054] [DIRS 124754]). By making the large amount of information developed on Yucca Mountain available in stages, the DOE intends to provide the public and interested parties with time to review the available materials and to formulate and
MEDUSA (Martian Environmental DUst Systematic Analyser)
Battaglia, R.; Colangeli, L.; della Corte, V.; Esposito, F.; Ferrini, G.; Mazzotta Epifani, E.; Palomba, E.; Palumbo, P.; Panizza, A.; Rotundi, A.
2003-04-01
) onboard the Mars Global Surveyor. Seasonal variations in the column abundance are due to the combined effect of exchange of H_2O between atmosphere and water reservoirs (i.e. polar caps, regolith) and atmospheric transport. Despite the low absolute water content (0.03% by volume), relative humidity can exceed 100%, leading to frosting phenomena, thanks to low Martian temperatures. The typical value of the pressure at the surface, close to the triple-point value of the water phase diagram, makes the persistence of liquid water at the surface of Mars highly improbable. This means that the water is probably present exclusively in gaseous and solid states at the surface level. Attempts to use space-borne and Earth-based observations to quantitatively estimate surface and near-surface sources and sinks of water vapour have had good but only partial success. The most important questions arising from present knowledge are how the atmospheric circulation of water vapour occurs and how to explain the difference in the hemispheric and seasonal behaviour of the water vapour. Although TES results showed that a percentage of the hemispheric "asymmetry" of the seasonal vapour abundance was probably due to the presence of two dust storms during MAWD observations, an evident difference remains partially unexplained. In this context, it is extremely important to study the role of the different contributions to the production of atmospheric vapour from the main reservoirs and to the formation of water ice clouds, most probably catalysed by the atmospheric dust. At present, no in situ measurement of water vapour content has yet been performed. We discuss the possibility of using a new-concept instrument for extraterrestrial planetary environments, based on the past experience acquired in dust monitoring in space and on Earth and on new possible technologies for space applications.
MEDUSA (Martian Environmental Dust Analyser) project is a multisensor and multistage instrument based on an optical detector of dust
L'analyse qualitative comme approche multiple
Directory of Open Access Journals (Sweden)
Roberto Cipriani
2009-11-01
Full Text Available The example of historical inquiry, which seeks to identify the characteristics of the birth and development of a science and of the readings it offers of social events, is among the most original. Any historical methodology produces not merely a sheer mass of episodes and events, but also a narration and a critical elaboration of those same facts. Michael Postan rightly writes that the complexity of historical data is nevertheless such, and the differences and similarities so difficult to pin down, that the efforts of historians and sociologists to construct explicit comparisons have for the most part amounted to crude and naive attempts. The lesson of the Annales has indeed helped to build the idea of a history able to read and explain both what is uniform and what is singular. Nothing is more natural than the gathering of "psychical beings", like the assembly of cells into an organism, into a new and different "psychical being". A turn toward broader and more rigorous empirical experimentation is therefore needed, so as to have adequate instruments capable of guaranteeing sufficient reliability to micro-level, qualitative and biographical methodology.
Non-destructive infrared analyses: a method for provenance analyses of sandstones
Bowitz, Jörg; Ehling, Angela
2008-12-01
Infrared spectroscopy (IR spectroscopy) is commonly applied in the laboratory for mineral analyses in addition to XRD. Because such laboratory work is time- and cost-consuming, we present an infrared-based mobile method for non-destructive mineral and provenance analyses of sandstones. IR spectroscopy is based on activating chemical bonds: by irradiating a mineral mixture, particular bonds are excited to vibrate depending on the bond energy (resonance vibration). Accordingly, the energy of the IR spectrum is reduced, thereby generating an absorption spectrum. The positions of the absorption maxima within the spectral region indicate the type of the bonds and in many cases identify minerals containing these bonds. The non-destructive reflection spectroscopy operates in the near-infrared region (NIR) and can detect all common clay minerals as well as sulfates, hydroxides and carbonates. The spectra produced have been interpreted by computer using digital mineral libraries compiled especially for sandstones. The comparison of all results with XRD, RFA and interpretations of thin sections impressively demonstrates the accuracy and reliability of this method. Not only are different minerals detectable; differently ordered kaolinites and varieties of illite can also be identified by the shape and size of the absorption bands. Especially clay minerals and their varieties, in combination with their relative contents, form the characteristic spectra of sandstones. Other components such as limonite, hematite and amorphous silica also influence the spectra. Sandstones similar in colour and texture can often be identified by their characteristic reflectance spectra. Reference libraries with more than 60 spectra of important German sandstones have been created to enable entirely computerized interpretations and identifications of these dimension stones. The analysis of infrared spectroscopy results is demonstrated with examples of different sandstones
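Computerized matching against a digital mineral library, as described above, can be sketched as a similarity search over reference spectra. The minerals, band shapes, and cosine-similarity measure below are illustrative assumptions, not the authors' actual library or matching algorithm:

```python
import numpy as np

def best_match(sample, library):
    """Rank reference spectra by cosine similarity to a sample spectrum;
    a minimal stand-in for library-based mineral identification."""
    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    scores = {name: cos(sample, ref) for name, ref in library.items()}
    return max(scores, key=scores.get), scores

# Toy 'spectra': Gaussian absorption bands on a shared wavelength grid
grid = np.linspace(0.0, 1.0, 200)
def band(center, width):
    return np.exp(-((grid - center) / width) ** 2)

library = {"kaolinite": band(0.3, 0.05), "illite": band(0.6, 0.08)}
sample = band(0.3, 0.05) + 0.05 * np.random.default_rng(2).normal(size=grid.size)
name, scores = best_match(sample, library)
print(name)
```

A real implementation would compare band positions and shapes against a curated reference library rather than raw cosine similarity, but the ranking idea is the same.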
Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses.
Faul, Franz; Erdfelder, Edgar; Buchner, Axel; Lang, Albert-Georg
2009-11-01
G*Power is a free power analysis program for a variety of statistical tests. We present extensions and improvements of the version introduced by Faul, Erdfelder, Lang, and Buchner (2007) in the domain of correlation and regression analyses. In the new version, we have added procedures to analyze the power of tests based on (1) single-sample tetrachoric correlations, (2) comparisons of dependent correlations, (3) bivariate linear regression, (4) multiple linear regression based on the random predictor model, (5) logistic regression, and (6) Poisson regression. We describe these new features and provide a brief introduction to their scope and handling.
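As an illustration of the kind of computation such programs automate, the power of a two-sided test of a bivariate correlation can be approximated with the standard Fisher z transformation. This is a generic normal-approximation sketch, not G*Power's own routine:

```python
from math import atanh, sqrt
from scipy.stats import norm

def correlation_power(r, n, alpha=0.05):
    """Approximate power of a two-sided test of H0: rho = 0 for an
    assumed population correlation r and sample size n, using the
    Fisher z transformation (normal approximation)."""
    z_r = atanh(r)                     # Fisher z of the assumed effect
    se = 1.0 / sqrt(n - 3)             # standard error of the z statistic
    z_crit = norm.ppf(1 - alpha / 2)   # two-sided critical value
    # Probability of rejecting H0 in either tail
    return norm.sf(z_crit - z_r / se) + norm.cdf(-z_crit - z_r / se)

print(round(correlation_power(r=0.3, n=100), 3))
```

For r = 0.3 and n = 100 at alpha = 0.05 this gives power of roughly 0.86, in line with standard tables.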
Integrating and scheduling an open set of static analyses
DEFF Research Database (Denmark)
Eichberg, Michael; Mezini, Mira; Kloppenburg, Sven
2006-01-01
to keep the set of analyses open. We propose an approach to integrating and scheduling an open set of static analyses which decouples the individual analyses and coordinates the analysis executions such that the overall time and space consumption is minimized. The approach has been implemented for the Eclipse IDE and has been used to integrate a wide range of analyses such as finding bug patterns, detecting violations of design guidelines, or type-system extensions for Java.
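The coordination idea, running decoupled analyses in an order that respects their data dependencies so shared results are computed only once, can be sketched as a topological schedule. The analysis names and dependency graph below are hypothetical, not the paper's actual analyses:

```python
from graphlib import TopologicalSorter

# Hypothetical dependency graph: each analysis names the analyses
# whose results it consumes.
deps = {
    "type_hierarchy": [],
    "call_graph": ["type_hierarchy"],
    "bug_patterns": ["call_graph"],
    "design_guidelines": ["type_hierarchy"],
}

def schedule(deps):
    """Order interdependent analyses so each runs exactly once, after
    everything it depends on; shared results are thus computed one time."""
    return list(TopologicalSorter(deps).static_order())

order = schedule(deps)
print(order)
```

In an IDE setting the scheduler would additionally batch independent analyses and evict intermediate results once no remaining analysis needs them, which is where the time/space trade-off in the abstract comes in.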
Analyses of the stability and core taxonomic memberships of the human microbiome.
Li, Kelvin; Bihan, Monika; Methé, Barbara A
2013-01-01
Analyses of the taxonomic diversity associated with the human microbiome continue to be an area of great importance. The study of the nature and extent of the commonly shared taxa ("core"), versus those less prevalent, establishes a baseline for comparing healthy and diseased groups by quantifying the variation among people, across body habitats and over time. The National Institutes of Health (NIH) sponsored Human Microbiome Project (HMP) has provided an unprecedented opportunity to examine and better define what constitutes the taxonomic core within and across body habitats and individuals through pyrosequencing-based profiling of 16S rRNA gene sequences from oral, skin, distal gut (stool), and vaginal body habitats from over 200 healthy individuals. A two-parameter model is introduced to quantitatively identify the core taxonomic members of each body habitat's microbiota across the healthy cohort. Using only cutoffs for taxonomic ubiquity and abundance, core taxonomic members were identified for each of the 18 body habitats and also for the 4 higher-level body regions. Although many microbes were shared at low abundance, they exhibited a relatively continuous spread in both their abundance and ubiquity, as opposed to a more discretized separation. The numbers of core taxa members in the body regions are comparatively small and stable, reflecting the relatively high, but conserved, interpersonal variability within the cohort. Core sizes increased across the body regions in the order of: vagina, skin, stool, and oral cavity. A number of "minor" oral core taxa were also identified by their majority presence across the cohort, but with relatively low and stable abundances. A method for quantifying the difference between two cohorts was introduced and applied to samples collected on a second visit, revealing that over time, the oral, skin, and stool body regions tended to be more transient in their taxonomic structure than the vaginal body region.
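The two-parameter (ubiquity and abundance cutoff) selection described above can be sketched as follows; the cutoff values and the synthetic abundance table are illustrative assumptions, not the HMP's actual parameters or data:

```python
import numpy as np

def core_taxa(abundance, ubiquity_cut=0.95, abundance_cut=1e-4):
    """Flag taxa as 'core', echoing the two-parameter idea in the text:
    a taxon is core if it is present (relative abundance >= abundance_cut)
    in at least a ubiquity_cut fraction of subjects. The threshold values
    here are illustrative only."""
    present = abundance >= abundance_cut   # subjects x taxa boolean matrix
    ubiquity = present.mean(axis=0)        # fraction of subjects per taxon
    return ubiquity >= ubiquity_cut

# Toy relative-abundance table: 200 subjects x 50 taxa, rows sum to 1
rng = np.random.default_rng(0)
rel = rng.dirichlet(np.ones(50), size=200)
core = core_taxa(rel)
print(int(core.sum()), "taxa pass the core cutoffs")
```

Sweeping the two cutoffs traces out how "core" membership grows or shrinks, which is what makes the continuous spread in abundance and ubiquity noted above visible.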
Energy Technology Data Exchange (ETDEWEB)
Jantzen, Carol M. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Missimer, David M. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Guenther, Chris P. [National Energy Technology Lab. (NETL), Morgantown, WV (United States); Shekhawat, Dushyant [National Energy Technology Lab. (NETL), Morgantown, WV (United States); VanEssendelft, Dirk T. [National Energy Technology Lab. (NETL), Morgantown, WV (United States); Means, Nicholas C. [AECOM Technology Corp., Oak Ridge, TN (United States)
2015-04-23
in process piping and materials, in excessive off-gas absorbent loading, and in undesired process emissions. The ash content of the coal is important as the ash adds to the DMR and other vessel products, which affect the final waste product mass and composition. The amount and composition of the ash also affect the reaction kinetics. Thus ash content and composition contribute to the mass balance. In addition, sodium, potassium, calcium, sulfur, and possibly silica and alumina in the ash may contribute to wall-scale formation. Sodium, potassium, and alumina in the ash will be overwhelmed by the sodium, potassium, and alumina from the feed, but the impact from the other ash components needs to be quantified. A maximum coal particle size is specified so the feed system does not plug, and a minimum particle size is specified to prevent excess elutriation from the DMR to the Process Gas Filter (PGF). A vendor specification was used to procure the calcined coal for IWTU processing. While the vendor supplied a composite analysis for the 22 tons of coal (Appendix A), this study compares independent analyses of the coal performed at the Savannah River National Laboratory (SRNL) and at the National Energy Technology Laboratory (NETL). Three supersacks were sampled at three different heights within each sack in order to determine within-bag and between-bag variability of the coal. These analyses were compared to the vendor's composite analyses and to the coal specification, as well as to historic data on Bestac coal analyses performed at Hazen Research Inc. (HRI) between 2004 and 2011.
Energy Technology Data Exchange (ETDEWEB)
Moussa, P. [Commissariat a l' Energie Atomique, 91 - Saclay (France). Centre d' Etudes Nucleaires
1968-06-01
This work describes the angular analysis of reactions between particles with spin in a fully relativistic fashion. One-particle states are introduced, following Wigner's method, as representations of the inhomogeneous Lorentz group. In order to perform the angular analyses, the reduction of the product of two representations of the inhomogeneous Lorentz group is studied. Clebsch-Gordan coefficients are computed for the following couplings: l-s coupling, helicity coupling, multipolar coupling, and symmetric coupling for more than two particles. Massless and massive particles are handled simultaneously. Along the way we construct spinorial amplitudes and free fields, and we recall how to establish convergence theorems for angular expansions from analyticity hypotheses. Finally, we substitute these hypotheses for the idea of a 'potential radius', which at low energy gives the usual 'centrifugal barrier' factors. The presence of such factors had never before been deduced from hypotheses compatible with relativistic invariance. (author)
SENSITIVITY ANALYSIS FOR SALTSTONE DISPOSAL UNIT COLUMN DEGRADATION ANALYSES
Energy Technology Data Exchange (ETDEWEB)
Flach, G.
2014-10-28
PORFLOW-related analyses supporting a sensitivity analysis for Saltstone Disposal Unit (SDU) column degradation were performed. Previous analyses (Flach and Taylor 2014) used a model in which the SDU columns degraded piecewise from the top and bottom simultaneously. The current analysis employs a model in which all pieces of the column degrade at the same time. Information was extracted from the analyses that may be useful in determining the distribution of Tc-99 in the various SDUs over time and in determining flow balances for the SDUs.
Analysing harmonic motions with an iPhone’s magnetometer
Yavuz, Ahmet; Kağan Temiz, Burak
2016-05-01
In this paper, we propose an experiment for analysing harmonic motion using an iPhone’s (or iPad’s) magnetometer. This experiment consists of the detection of magnetic field variations obtained from an iPhone’s magnetometer sensor. A graph of harmonic motion is directly displayed on the iPhone’s screen using the Sensor Kinetics application. Data from this application was analysed with Eureqa software to establish the equation of the harmonic motion. Analyses show that the use of an iPhone’s magnetometer to analyse harmonic motion is a practical and effective method for small oscillations and frequencies less than 15-20 Hz.
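Establishing the equation of the harmonic motion from recorded data, as done here with Eureqa, can also be sketched as a least-squares sinusoid fit; the synthetic data and parameter values below are illustrative stand-ins for actual Sensor Kinetics recordings:

```python
import numpy as np
from scipy.optimize import curve_fit

def harmonic(t, A, f, phi, offset):
    """Magnetometer reading modelled as simple harmonic motion."""
    return A * np.sin(2 * np.pi * f * t + phi) + offset

# Synthetic stand-in for magnetometer data: a 2 Hz oscillation plus noise
t = np.linspace(0, 5, 500)
rng = np.random.default_rng(1)
b = harmonic(t, 12.0, 2.0, 0.3, 40.0) + rng.normal(0, 0.5, t.size)

# Crude initial frequency guess from the spectrum, then refine by least squares
freqs = np.fft.rfftfreq(t.size, d=t[1] - t[0])
f0 = freqs[np.argmax(np.abs(np.fft.rfft(b - b.mean())))]
popt, _ = curve_fit(harmonic, t, b, p0=[b.std() * 2 ** 0.5, f0, 0.0, b.mean()])
print(f"fitted frequency: {popt[1]:.2f} Hz")
```

The FFT-based initial guess matters: sinusoid fits converge poorly if the starting frequency is more than a fraction of a cycle off over the recording length, which is also why the method above works best for the small, low-frequency oscillations the authors recommend.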
41 CFR 101-27.208 - Inventory analyses.
2010-07-01
Management of Shelf-Life Materials, § 101-27.208 Inventory analyses: (a) An inventory analysis shall be... (Title 41, Public Contracts and Property Management, 2010-07-01 edition)
Aftaler om arbejdsmiljø - en analyse af udvalgte overenskomster
DEFF Research Database (Denmark)
Petersen, Jens Voxtrup; Wiegmann, Inger-Marie; Vogt-Nielsen, Karl
An analysis of the significance of collective agreements for the working environment within industry, slaughterhouses, cleaning, the green sector, hotels and restaurants, and bus services.
What can we do about exploratory analyses in clinical trials?
Moyé, Lem
2015-11-01
The research community has alternately embraced and then repudiated exploratory analyses since the inception of clinical trials in the middle of the twentieth century. After a series of important but ultimately unreproducible findings, these non-prospectively declared evaluations were relegated to hypothesis generation. Since the majority of evaluations conducted in clinical trials, with their rich data sets, are exploratory, the absence of their persuasive power adds to the inefficiency of clinical trial analyses in an atmosphere of fiscal frugality. However, the principal argument against exploratory analyses is not based in statistical theory, but in pragmatism and observation. The absence of any theoretical treatment of exploratory analyses postpones the day when their statistical weaknesses might be repaired. Here, we introduce an examination of the characteristics of exploratory analyses from a probabilistic and statistical framework. Setting the obvious logistical concerns aside (i.e., the absence of planning produces poor precision), exploratory analyses do not appear to suffer from estimation-theory weaknesses. The problem appears to be a difficulty in what is actually reported as the p-value. The use of Bayes' theorem provides p-values that are more in line with confirmatory analyses. This development may inaugurate a body of work that could lead to the readmission of exploratory analyses to a position of persuasive power in clinical trials.
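The general point about Bayes' theorem can be illustrated with a standard back-of-the-envelope calculation; this is a generic sketch of how a low prior tempers a "significant" finding, not the specific development introduced in the article:

```python
def posterior_true_effect(prior, power, alpha=0.05):
    """Posterior probability that a rejected null reflects a real effect,
    by Bayes' theorem:
        P(effect | reject H0) = prior*power / (prior*power + (1-prior)*alpha)
    Exploratory hypotheses typically start from a much lower prior than
    prespecified confirmatory ones, so the same nominal p-value carries
    less evidential weight."""
    return prior * power / (prior * power + (1 - prior) * alpha)

# An exploratory 'significant' finding (low prior) vs a confirmatory one
print(round(posterior_true_effect(prior=0.05, power=0.8), 2))  # → 0.46
print(round(posterior_true_effect(prior=0.50, power=0.8), 2))  # → 0.94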
The EADGENE and SABRE post-analyses workshop
DEFF Research Database (Denmark)
Jaffrezic, Florence; Hedegaard, Jakob; Sancristobal, Magali;
2009-01-01
on the statistical analyses of a microarray experiment (i.e. getting a gene list), the subsequent analysis of the gene list is still an area of much confusion for many scientists. During a three-day workshop in November 2008, we discussed five aspects of these so-called post-analyses of microarray data: 1) re...
On the reproducibility of meta-analyses : six practical recommendations
Lakens, D.; Hilgard, J.; Staaks, J.
2016-01-01
Meta-analyses play an important role in cumulative science by combining information across multiple studies and attempting to provide effect size estimates corrected for publication bias. Research on the reproducibility of meta-analyses reveals that errors are common, and the percentage of effect si
Integrating and scheduling an open set of static analyses
DEFF Research Database (Denmark)
Eichberg, Michael; Mezini, Mira; Kloppenburg, Sven
2006-01-01
To improve the productivity of the development process, more and more tools for static software analysis are tightly integrated into the incremental build process of an IDE. If multiple interdependent analyses are used simultaneously, the coordination between the analyses becomes a major obstacle...