Statistical modeling of urban air temperature distributions under different synoptic conditions
Beck, Christoph; Breitner, Susanne; Cyrys, Josef; Hald, Cornelius; Hartz, Uwe; Jacobeit, Jucundus; Richter, Katja; Schneider, Alexandra; Wolf, Kathrin
2015-04-01
Within urban areas, air temperature may vary distinctly between locations, and these intra-urban variations partly reach magnitudes relevant to human thermal comfort. For this reason, and also in view of potential interrelations with other health-related environmental factors (e.g. air quality), it is important to estimate spatial patterns of intra-urban air temperature that can be incorporated into urban planning processes. In this contribution we present an approach for estimating spatial temperature distributions in the urban area of Augsburg (Germany) by means of statistical modeling. At 36 locations in the urban area of Augsburg, air temperatures have been measured at high temporal resolution (4 min) since December 2012. These 36 locations represent typical urban land use characteristics in terms of varying percentage coverages of different land cover categories (e.g. impervious, built-up, vegetated). Percentage coverages of these categories have been extracted from different sources (OpenStreetMap, European Urban Atlas, Urban Morphological Zones) for regular grids of varying size (50, 100 and 200 meter horizontal resolution) covering the urban area of Augsburg. It is well known from numerous studies that land use characteristics have a distinct influence on air temperature, as well as on other climatic variables, at a given location. Air temperatures at the 36 locations are therefore modeled using land use characteristics (percentage coverages of land cover categories) as predictor variables in stepwise multiple regression models and in Random Forest based model approaches. After model evaluation via cross-validation, appropriate statistical models are applied to the gridded land use data to derive spatial urban air temperature distributions. Different models are tested and applied for different seasons and times of day, and also for different synoptic conditions (e.g. clear and calm …
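The regression step described in this abstract can be sketched as follows. This is a minimal illustration on synthetic data: ordinary least squares stands in for the study's stepwise regression and Random Forest models, and all numbers (station count aside) are assumptions, not values from the study.

```python
import numpy as np

# Hypothetical example: model station air temperature from land-cover
# percentages (impervious, built-up, vegetated) by ordinary least squares,
# a simplified stand-in for the stepwise multiple regression in the abstract.
rng = np.random.default_rng(0)
n_stations = 36
X = rng.uniform(0, 100, size=(n_stations, 3))   # % cover per category
beta_true = np.array([0.02, 0.015, -0.01])      # assumed effects (K per % cover)
t = 15.0 + X @ beta_true + rng.normal(0, 0.2, n_stations)

# Fit: add an intercept column and solve the least-squares problem.
A = np.column_stack([np.ones(n_stations), X])
coef, *_ = np.linalg.lstsq(A, t, rcond=None)

# Predict for a hypothetical grid cell: 80% impervious, 60% built-up,
# 10% vegetated (the study would do this for every cell of the raster).
cell = np.array([1.0, 80.0, 60.0, 10.0])
t_cell = cell @ coef
print(round(t_cell, 2))
```

In the study's workflow, a model evaluated by cross-validation would then be applied cell by cell to the gridded land-cover data to map the temperature field.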
Statistical distribution sampling
Johnson, E. S.
1975-01-01
Determining the distribution of statistics by sampling was investigated. Characteristic functions, the quadratic regression problem, and the differential equations for the characteristic functions are analyzed.
Heimann, G; Neuhaus, G
1998-03-01
In the random censorship model, the log-rank test is often used for comparing a control group with different dose groups. If the number of tumors is small, so-called exact methods are often applied for computing critical values from a permutational distribution. Two of these exact methods are discussed and shown to be incorrect. The correct permutational distribution is derived and studied with respect to its behavior under unequal censoring in the light of recent results proving that the permutational version and the unconditional version of the log-rank test are asymptotically equivalent even under unequal censoring. The log-rank test is studied by simulations of a realistic scenario from a bioassay with small numbers of tumors.
Statistical distribution of quantum particles
Indian Academy of Sciences (India)
S B Khasare
2018-02-08
In this work, the statistical distribution functions for bosons, fermions and their mixtures have been … index is greater than unity, then it is easy in the present approach to … ability W. Section 3 gives the derivation and graphical …
International Nuclear Information System (INIS)
Dai, Wu-Sheng; Xie, Mi
2013-01-01
In this paper, we give a general discussion on the calculation of the statistical distribution from a given operator relation of creation, annihilation, and number operators. Our result shows that as long as the relation between the number operator and the creation and annihilation operators can be expressed as a † b=Λ(N) or N=Λ −1 (a † b), where N, a † , and b denote the number, creation, and annihilation operators, i.e., N is a function of quadratic product of the creation and annihilation operators, the corresponding statistical distribution is the Gentile distribution, a statistical distribution in which the maximum occupation number is an arbitrary integer. As examples, we discuss the statistical distributions corresponding to various operator relations. In particular, besides the Bose–Einstein and Fermi–Dirac cases, we discuss the statistical distributions for various schemes of intermediate statistics, especially various q-deformation schemes. Our result shows that the statistical distributions corresponding to various q-deformation schemes are various Gentile distributions with different maximum occupation numbers which are determined by the deformation parameter q. This result shows that the results given in much literature on the q-deformation distribution are inaccurate or incomplete. -- Highlights: ► A general discussion on calculating statistical distribution from relations of creation, annihilation, and number operators. ► A systemic study on the statistical distributions corresponding to various q-deformation schemes. ► Arguing that many results of q-deformation distributions in literature are inaccurate or incomplete
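The Gentile distribution named above has a standard closed form; a small numerical sketch (with x = (ε − μ)/kT as the reduced energy) confirms that maximum occupation number M = 1 reproduces Fermi-Dirac and that large M approaches Bose-Einstein:

```python
import numpy as np

# Gentile distribution with maximum occupation number M:
#   n(x) = 1/(e^x - 1) - (M + 1)/(e^{(M+1)x} - 1),  x = (eps - mu)/kT
def gentile(x, M):
    return 1.0 / np.expm1(x) - (M + 1) / np.expm1((M + 1) * x)

x = 0.5
fd = 1.0 / (np.exp(x) + 1.0)   # Fermi-Dirac occupation number
be = 1.0 / np.expm1(x)         # Bose-Einstein occupation number

print(gentile(x, 1) - fd)      # M = 1 recovers Fermi-Dirac exactly
print(gentile(x, 100) - be)    # large M approaches Bose-Einstein
```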
Distributions with given marginals and statistical modelling
Fortiana, Josep; Rodriguez-Lallena, José
2002-01-01
This book contains a selection of the papers presented at the meeting `Distributions with given marginals and statistical modelling', held in Barcelona (Spain), July 17-20, 2000. In 24 chapters, this book covers topics such as the theory of copulas and quasi-copulas, the theory and compatibility of distributions, models for survival distributions and other well-known distributions, time series, categorical models, definition and estimation of measures of dependence, monotonicity and stochastic ordering, shape and separability of distributions, hidden truncation models, diagonal families, orthogonal expansions, tests of independence, and goodness of fit assessment. These topics share the use and properties of distributions with given marginals, this being the fourth specialised text on this theme. The innovative aspect of the book is the inclusion of statistical aspects such as modelling, Bayesian statistics, estimation, and tests.
Predicting Statistical Distributions of Footbridge Vibrations
DEFF Research Database (Denmark)
Pedersen, Lars; Frier, Christian
2009-01-01
The paper considers vibration response of footbridges to pedestrian loading. Employing Newmark and Monte Carlo simulation methods, a statistical distribution of bridge vibration levels is calculated modelling walking parameters such as step frequency and stride length as random variables...
Distributional Properties of Order Statistics and Record Statistics
Directory of Open Access Journals (Sweden)
Abdul Hamid Khan
2012-07-01
Full Text Available Distributional properties of the order statistics and of upper and lower records have been utilized to characterize distributions of interest. Further, one-sided random dilation and contraction are utilized to obtain the distribution of non-adjacent order statistics, and important deductions from these results are discussed.
Statistical distributions applications and parameter estimates
Thomopoulos, Nick T
2017-01-01
This book gives a description of the group of statistical distributions that have ample application to studies in statistics and probability. Understanding statistical distributions is fundamental for researchers in almost all disciplines. The informed researcher will select the statistical distribution that best fits the data in the study at hand. Some of the distributions are well known to the general researcher and are in use in a wide variety of ways. Other useful distributions are less understood and are not in common use. The book describes when and how to apply each of the distributions in research studies, with a goal to identify the distribution that best applies to the study. The distributions are for continuous, discrete, and bivariate random variables. In most studies, the parameter values are not known a priori, and sample data is needed to estimate parameter values. In other scenarios, no sample data is available, and the researcher seeks some insight that allows the estimate of ...
Statistical models based on conditional probability distributions
International Nuclear Information System (INIS)
Narayanan, R.S.
1991-10-01
We present a formulation of statistical mechanics models based on conditional probability distribution rather than a Hamiltonian. We show that it is possible to realize critical phenomena through this procedure. Closely linked with this formulation is a Monte Carlo algorithm, in which a configuration generated is guaranteed to be statistically independent from any other configuration for all values of the parameters, in particular near the critical point. (orig.)
Football goal distributions and extremal statistics
Greenhough, J.; Birch, P. C.; Chapman, S. C.; Rowlands, G.
2002-12-01
We analyse the distributions of the number of goals scored by home teams, away teams, and the total scored in the match, in domestic football games from 169 countries between 1999 and 2001. The probability density functions (PDFs) of goals scored are too heavy-tailed to be fitted over their entire ranges by Poisson or negative binomial distributions which would be expected for uncorrelated processes. Log-normal distributions cannot include zero scores and here we find that the PDFs are consistent with those arising from extremal statistics. In addition, we show that it is sufficient to model English top division and FA Cup matches in the seasons of 1970/71-2000/01 on Poisson or negative binomial distributions, as reported in analyses of earlier seasons, and that these are not consistent with extremal statistics.
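The over-dispersion argument in this abstract can be illustrated with a quick simulation. The goal counts below are synthetic (generated directly from a negative binomial; the parameters are assumptions, not fitted values from the paper): a Poisson fit with the same mean underestimates the tail, while a moment-fitted negative binomial matches it.

```python
import numpy as np
from scipy import stats

# Synthetic over-dispersed "goals per match" counts.
rng = np.random.default_rng(1)
goals = rng.negative_binomial(n=2, p=0.55, size=5000)

m, v = goals.mean(), goals.var()
# Poisson fit: single parameter lam = m.  Negative binomial by moments:
# m = n(1-p)/p and v = n(1-p)/p^2  =>  p = m/v, n = m*p/(1-p).
p_hat = m / v
n_hat = m * p_hat / (1 - p_hat)

# Compare tail probabilities P(X >= 6) under both fits with the data.
tail_emp = (goals >= 6).mean()
tail_poi = stats.poisson.sf(5, m)
tail_nb = stats.nbinom.sf(5, n_hat, p_hat)
print(tail_emp, tail_poi, tail_nb)
```

The heavier tail of the negative binomial is exactly the feature that Poisson models of uncorrelated scoring fail to capture.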
Van Hulle, Carol A; Rathouz, Paul J
2015-02-01
Accurately identifying interactions between genetic vulnerabilities and environmental factors is of critical importance for genetic research on health and behavior. In the previous work of Van Hulle et al. (Behavior Genetics, Vol. 43, 2013, pp. 71-84), we explored the operating characteristics for a set of biometric (e.g., twin) models of Rathouz et al. (Behavior Genetics, Vol. 38, 2008, pp. 301-315), for testing gene-by-measured environment interaction (GxM) in the presence of gene-by-measured environment correlation (rGM) where data followed the assumed distributional structure. Here we explore the effects that violating distributional assumptions have on the operating characteristics of these same models even when structural model assumptions are correct. We simulated N = 2,000 replicates of n = 1,000 twin pairs under a number of conditions. Non-normality was imposed on either the putative moderator or on the ultimate outcome by ordinalizing or censoring the data. We examined the empirical Type I error rates and compared Bayesian information criterion (BIC) values. In general, non-normality in the putative moderator had little impact on the Type I error rates or BIC comparisons. In contrast, non-normality in the outcome was often mistaken for or masked GxM, especially when the outcome data were censored.
Statistical analysis of partial reduced width distributions
International Nuclear Information System (INIS)
Tran Quoc Thuong.
1973-01-01
The aim of this study was to develop rigorous methods for analysing experimental event distributions according to a chi-squared law, and to check whether the number of degrees of freedom ν is compatible with the value 1 for the reduced neutron width distribution. Two statistical methods were used (the maximum-likelihood method and the method of moments); it was shown, in a few particular cases, that ν is compatible with 1. The difference between ν and 1, if any, should not exceed 3%. These results confirm the validity of the compound nucleus model [fr]
Nonparametric Bayesian predictive distributions for future order statistics
Richard A. Johnson; James W. Evans; David W. Green
1999-01-01
We derive the predictive distribution for a specified order statistic, determined from a future random sample, under a Dirichlet process prior. Two variants of the approach are treated and some limiting cases studied. A practical application to monitoring the strength of lumber is discussed including choices of prior expectation and comparisons made to a Bayesian...
Improvement of Statistical Decisions under Parametric Uncertainty
Nechval, Nicholas A.; Nechval, Konstantin N.; Purgailis, Maris; Berzins, Gundars; Rozevskis, Uldis
2011-10-01
A large number of problems in production planning and scheduling, location, transportation, finance, and engineering design require that decisions be made in the presence of uncertainty. Decision-making under uncertainty is a central problem in statistical inference, and has been formally studied in virtually all approaches to inference. The aim of the present paper is to show how the invariant embedding technique, the idea of which belongs to the authors, may be employed in the particular case of finding the improved statistical decisions under parametric uncertainty. This technique represents a simple and computationally attractive statistical method based on the constructive use of the invariance principle in mathematical statistics. Unlike the Bayesian approach, an invariant embedding technique is independent of the choice of priors. It allows one to eliminate unknown parameters from the problem and to find the best invariant decision rule, which has smaller risk than any of the well-known decision rules. To illustrate the proposed technique, application examples are given.
Aftershock Energy Distribution by Statistical Mechanics Approach
Daminelli, R.; Marcellini, A.
2015-12-01
The aim of our work is to find the most probable distribution of the energy of aftershocks. We start from one of the fundamental principles of statistical mechanics, which for aftershock sequences can be expressed as: the greater the number of ways in which the energy of aftershocks can be arranged among the energy cells in phase space, the more probable the distribution. We assume that each cell in phase space has the same probability of being occupied, and that more than one cell can have the same energy. Since seismic energy is proportional to products of different parameters, different combinations of parameters can produce the same energy (e.g., different combinations of stress drop and fault area can release the same seismic energy). Let there be gi cells in the aftershock phase space characterised by the same released energy ɛi. We can then assume that Maxwell-Boltzmann statistics apply to aftershock sequences, with the proviso that the validity of this hypothesis is judged by its agreement with the data. The aftershock energy distribution can therefore be written as n(ɛ) = A g(ɛ) exp(−βɛ), where n(ɛ) is the number of aftershocks with energy ɛ, and A and β are constants. Under the above hypothesis, we can take g(ɛ) proportional to ɛ. We selected and analysed different aftershock sequences (data extracted from the earthquake catalogues of SCEC, INGV-CNT and other institutions) with a minimum retained magnitude ML = 2 (in some cases ML = 2.6) and a time window of 35 days. The results of our model agree with the data, except in the very low energy band, where the model moderately overestimates the counts.
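With g(ɛ) proportional to ɛ, the proposed law n(ɛ) ∝ ɛ·exp(−βɛ) is a Gamma density with shape 2 and scale 1/β, which suggests a simple moment estimate of β. A sketch on synthetic energies (β = 0.5 is an assumed value, not one from the paper):

```python
import numpy as np

# n(eps) ∝ eps * exp(-beta * eps) is Gamma(shape=2, scale=1/beta)
# when the degeneracy g(eps) is proportional to eps.
rng = np.random.default_rng(2)
beta = 0.5                                   # assumed inverse "temperature"
energies = rng.gamma(shape=2.0, scale=1.0 / beta, size=20000)

# For Gamma(2, 1/beta) the mean is 2/beta, so a moment estimate of beta is:
beta_hat = 2.0 / energies.mean()
print(round(beta_hat, 3))
```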
Device for flattening statistically distributed pulses
International Nuclear Information System (INIS)
Il'kanaev, G.I.; Iskenderov, V.G.; Rudnev, O.V.; Teller, V.S.
1976-01-01
The description is given of a device that converts a series of statistically distributed pulses into a pseudo-uniform one. The inlet pulses switch over the first counter, and the second counter is switched over by the clock pulses each time the uniformity of the counters' states is violated. This violation is recorded by a logic circuit which passes to the output a number of clock pulses equal to the number of pulses that reached the device inlet. For a ratio of the count rate to the sampling rate of up to 0.3, losses do not exceed 0.7 per cent for a pulse-counter memory of 3, and 0.035 per cent for a memory of 7
Empirical distribution function under heteroscedasticity
Czech Academy of Sciences Publication Activity Database
Víšek, Jan Ámos
2011-01-01
Roč. 45, č. 5 (2011), s. 497-508 ISSN 0233-1888 Grant - others: GA UK(CZ) GA402/09/0557 Institutional research plan: CEZ:AV0Z10750506 Keywords: Robustness * Convergence * Empirical distribution * Heteroscedasticity Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.724, year: 2011 http://library.utia.cas.cz/separaty/2011/SI/visek-0365534.pdf
Modified Distribution-Free Goodness-of-Fit Test Statistic.
Chun, So Yeon; Browne, Michael W; Shapiro, Alexander
2018-03-01
Covariance structure analysis and its structural equation modeling extensions have become one of the most widely used methodologies in social sciences such as psychology, education, and economics. An important issue in such analysis is to assess the goodness of fit of a model under analysis. One of the most popular test statistics used in covariance structure analysis is the asymptotically distribution-free (ADF) test statistic introduced by Browne (Br J Math Stat Psychol 37:62-83, 1984). The ADF statistic can be used to test models without any specific distribution assumption (e.g., multivariate normal distribution) of the observed data. Despite its advantage, it has been shown in various empirical studies that unless sample sizes are extremely large, this ADF statistic could perform very poorly in practice. In this paper, we provide a theoretical explanation for this phenomenon and further propose a modified test statistic that improves the performance in samples of realistic size. The proposed statistic deals with the possible ill-conditioning of the involved large-scale covariance matrices.
Soil nuclide distribution coefficients and their statistical distributions
International Nuclear Information System (INIS)
Sheppard, M.I.; Beals, D.I.; Thibault, D.H.; O'Connor, P.
1984-12-01
Environmental assessments of the disposal of nuclear fuel waste in plutonic rock formations require analysis of the migration of nuclides from the disposal vault to the biosphere. Analyses of nuclide migration via groundwater through the disposal vault, the buffer and backfill, the plutonic rock, and the consolidated and unconsolidated overburden use models requiring distribution coefficients (Ksub(d)) to describe the interaction of the nuclides with the geological and man-made materials. This report presents element-specific soil distribution coefficients and their statistical distributions, based on a detailed survey of the literature. Radioactive elements considered were actinium, americium, bismuth, calcium, carbon, cerium, cesium, iodine, lead, molybdenum, neptunium, nickel, niobium, palladium, plutonium, polonium, protactinium, radium, samarium, selenium, silver, strontium, technetium, terbium, thorium, tin, uranium and zirconium. Stable elements considered were antimony, boron, cadmium, tellurium and zinc. Where sufficient data were available, distribution coefficients and their distributions are given for sand, silt, clay and organic soils. Our values are recommended for use in assessments for the Canadian Nuclear Fuel Waste Management Program
Alternative derivations of the statistical mechanical distribution laws.
Wall, F T
1971-08-01
A new approach is presented for the derivation of statistical mechanical distribution laws. The derivations are accomplished by minimizing the Helmholtz free energy under constant temperature and volume, instead of maximizing the entropy under constant energy and volume. An alternative method involves stipulating equality of chemical potential, or equality of activity, for particles in different energy levels. This approach leads to a general statement of distribution laws applicable to all systems for which thermodynamic probabilities can be written. The methods also avoid use of the calculus of variations, Lagrangian multipliers, and Stirling's approximation for the factorial. The results are applied specifically to Boltzmann, Fermi-Dirac, and Bose-Einstein statistics. The special significance of chemical potential and activity is discussed for microscopic systems.
On precipitation monitoring with theoretical statistical distributions
Cindrić, Ksenija; Juras, Josip; Pasarić, Zoran
2018-04-01
A common practice in meteorological drought monitoring is to transform the observed precipitation amounts to the standardised precipitation index (SPI). Though the gamma distribution is usually employed for this purpose, some other distribution may be used, particularly in regions where zero precipitation amounts are recorded frequently. In this study, two distributions are considered alongside with the gamma distribution: the compound Poisson exponential distribution (CPE) and the square root normal distribution (SRN). They are fitted to monthly precipitation amounts measured at 24 stations in Croatia in the 55-year-long period (1961-2015). At five stations, long-term series (1901-2015) are available and they have been used for a more detailed investigation. The accommodation of the theoretical distributions to empirical ones is tested by comparison of the corresponding empirical and theoretical ratios of the skewness and the coefficient of variation. Furthermore, following the common approach to precipitation monitoring (CLIMAT reports), the comparison of the empirical and theoretical quintiles in the two periods (1961-1990 and 1991-2015) is examined. The results from the present study reveal that it would be more appropriate to implement theoretical distributions in such climate reports, since they provide better evaluation for monitoring purposes than the current empirical distribution. Nevertheless, deciding on an optimal theoretical distribution for different climate regimes and for different time periods is not easy to accomplish. With regard to Croatian stations (covering different climate regimes), the CPE or SRN distribution could also be the right choice in the climatological practice, in addition to the gamma distribution.
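The SPI transformation this abstract builds on can be sketched in a few lines. The monthly totals below are synthetic, and the handling of zero-precipitation months (the motivation for the CPE and SRN alternatives) is deliberately omitted:

```python
import numpy as np
from scipy import stats

# Minimal SPI sketch: fit a gamma distribution to monthly precipitation
# totals, then map each value through the fitted CDF to a standard
# normal quantile.  Zero-precipitation handling is omitted here.
rng = np.random.default_rng(3)
precip = rng.gamma(shape=2.0, scale=40.0, size=55 * 12)  # synthetic monthly sums, mm

shape, loc, scale = stats.gamma.fit(precip, floc=0)      # location fixed at 0
cdf = stats.gamma.cdf(precip, shape, loc=loc, scale=scale)
spi = stats.norm.ppf(cdf)                                # standardised index

# A well-fitted SPI series is approximately standard normal.
print(round(spi.mean(), 2), round(spi.std(), 2))
```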
statistical tests for frequency distribution of mean gravity anomalies
African Journals Online (AJOL)
ES Obe
1980-03-01
… approach. Kaula [1,2] discussed the method of applying statistical techniques in the … mathematical foundation of physical …
Bug Distribution and Statistical Pattern Classification.
Tatsuoka, Kikumi K.; Tatsuoka, Maurice M.
1987-01-01
The rule space model permits measurement of cognitive skill acquisition and error diagnosis. Further discussion introduces Bayesian hypothesis testing and bug distribution. An illustration involves an artificial intelligence approach to testing fractions and arithmetic. (Author/GDC)
Statistical distribution for generalized ideal gas of fractional-statistics particles
International Nuclear Information System (INIS)
Wu, Y.
1994-01-01
We derive the occupation-number distribution in a generalized ideal gas of particles obeying fractional statistics, including mutual statistics, by adopting a state-counting definition. When there is no mutual statistics, the statistical distribution interpolates between bosons and fermions, and respects a fractional exclusion principle (except for bosons). Anyons in a strong magnetic field at low temperatures constitute such a physical system. Applications to the thermodynamic properties of quasiparticle excitations in the Laughlin quantum Hall fluid are discussed
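The interpolation between bosons and fermions can be made concrete with the standard occupation-number equation of fractional exclusion statistics, n = 1/(w + g) with w solving w^g (1 + w)^(1−g) = exp((ɛ − μ)/kT). This is a numerical sketch of that textbook form (solved by bracketing), not a reimplementation of the paper's state-counting derivation:

```python
import numpy as np
from scipy.optimize import brentq

# Occupation number for exclusion-statistics parameter g, where
# x = (eps - mu)/kT.  g = 0 gives Bose-Einstein, g = 1 Fermi-Dirac.
def occupation(x, g):
    if g == 0.0:
        return 1.0 / np.expm1(x)             # Bose-Einstein limit
    if g == 1.0:
        return 1.0 / (np.exp(x) + 1.0)       # Fermi-Dirac limit
    # Solve g*ln(w) + (1-g)*ln(1+w) = x for w (monotone in w).
    f = lambda w: g * np.log(w) + (1.0 - g) * np.log1p(w) - x
    w = brentq(f, 1e-12, 1e12)
    return 1.0 / (w + g)

x = 1.0
n_be = occupation(x, 0.0)
n_half = occupation(x, 0.5)   # "semion"-like intermediate case
n_fd = occupation(x, 1.0)
print(n_be, n_half, n_fd)     # intermediate g sits between the two limits
```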
A goodness of fit statistic for the geometric distribution
Ferreira, J.A.
2003-01-01
textabstractWe propose a goodness of fit statistic for the geometric distribution and compare it in terms of power, via simulation, with the chi-square statistic. The statistic is based on the Lau-Rao theorem and can be seen as a discrete analogue of the total time on test statistic. The results suggest that the test based on the new statistic is generally superior to the chi-square test.
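The chi-square benchmark against which the new statistic is compared can be sketched as follows. The data are synthetic geometric draws, the cell boundaries and pooled tail are arbitrary choices, and this is the comparison baseline, not the proposed Lau-Rao-based statistic:

```python
import numpy as np
from scipy import stats

# Chi-square goodness-of-fit test for the geometric distribution.
rng = np.random.default_rng(4)
x = rng.geometric(p=0.3, size=1000)              # support 1, 2, 3, ...

p_hat = 1.0 / x.mean()                           # MLE of the geometric parameter
k = np.arange(1, 8)
observed = np.array([(x == v).sum() for v in k], dtype=float)
observed = np.append(observed, (x >= 8).sum())   # pool the tail into one cell
expected = 1000 * np.append(stats.geom.pmf(k, p_hat), stats.geom.sf(7, p_hat))

chi2 = ((observed - expected) ** 2 / expected).sum()
# Degrees of freedom: 8 cells - 1 - 1 estimated parameter = 6.
p_value = stats.chi2.sf(chi2, df=6)
print(round(chi2, 2), round(p_value, 3))
```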
Football fever: goal distributions and non-Gaussian statistics
Bittner, E.; Nußbaumer, A.; Janke, W.; Weigel, M.
2009-02-01
Analyzing football score data with statistical techniques, we investigate how the not purely random, but highly co-operative nature of the game is reflected in averaged properties such as the probability distributions of goals scored by the home and away teams. As it turns out, especially the tails of the distributions are not well described by the Poissonian or binomial model that results from the assumption of uncorrelated random events. Instead, a good effective description of the data is provided by less basic distributions such as the negative binomial one or the probability densities of extreme value statistics. To understand this behavior from a microscopic point of view, however, no waiting time problem or extremal process need be invoked. Instead, modifying the Bernoulli random process underlying the Poissonian model to include a simple component of self-affirmation seems to describe the data surprisingly well and allows us to understand the observed deviation from Gaussian statistics. The phenomenological distributions used before can be understood as special cases within this framework. We analyzed historical football score data from many leagues in Europe as well as from international tournaments, including data from all past tournaments of the “FIFA World Cup” series, and found the proposed models to be applicable rather universally. In particular, here we analyze the results of the German women’s premier football league and consider the two separate German men’s premier leagues in the East and West during the Cold War period as well as the unified league after 1990 to see how scoring in football and the component of self-affirmation depend on cultural and political circumstances.
Statistical decisions under nonparametric a priori information
International Nuclear Information System (INIS)
Chilingaryan, A.A.
1985-01-01
The basic module of an applied program package for statistical analysis of the ANI experiment data is described. By means of this module, the tasks of choosing the theoretical model that most adequately fits the experimental data, selecting events of a definite type, and identifying elementary particles are carried out. To solve these problems, Bayes rules, the leave-one-out test and KNN (K Nearest Neighbour) adaptive density estimation are utilized.
Comparing Simulated and Theoretical Sampling Distributions of the U3 Person-Fit Statistic.
Emons, Wilco H. M.; Meijer, Rob R.; Sijtsma, Klaas
2002-01-01
Studied whether the theoretical sampling distribution of the U3 person-fit statistic is in agreement with the simulated sampling distribution under different item response theory models and varying item and test characteristics. Simulation results suggest that the use of standard normal deviates for the standardized version of the U3 statistic may…
Fitting the Statistical Distribution for Daily Rainfall in Ibadan, Based ...
African Journals Online (AJOL)
PROF. O. E. OSUAGWU
2013-06-01
This paper presents several types of statistical distributions to describe rainfall distribution in the Ibadan metropolis over a period of 30 years. The exponential, gamma, normal and Poisson distributions are compared to identify the optimal model for daily rainfall amount based on data recorded at rain …
Prototyping a Distributed Information Retrieval System That Uses Statistical Ranking.
Harman, Donna; And Others
1991-01-01
Built using a distributed architecture, this prototype distributed information retrieval system uses statistical ranking techniques to provide better service to the end user. Distributed architecture was shown to be a feasible alternative to centralized or CD-ROM information retrieval, and user testing of the ranking methodology showed both…
Statistical distributions as applied to environmental surveillance data
International Nuclear Information System (INIS)
Speer, D.R.; Waite, D.A.
1975-09-01
Application of normal, log normal, and Weibull distributions to environmental surveillance data was investigated for approximately 300 nuclide-medium-year-location combinations. Corresponding W test calculations were made to determine the probability of a particular data set falling within the distribution of interest. Conclusions are drawn as to the fit of any data group to the various distributions. The significance of fitting statistical distributions to the data is discussed
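The distribution-screening workflow this abstract describes can be illustrated with scipy's Shapiro-Wilk W test. The data below are synthetic lognormal "surveillance" values, and the two-scale comparison is an assumed simplification of the study's procedure:

```python
import numpy as np
from scipy import stats

# Apply the Shapiro-Wilk W test on the raw and log scales to judge
# normal versus lognormal fits for surveillance-like data.
rng = np.random.default_rng(5)
conc = rng.lognormal(mean=0.0, sigma=0.8, size=200)   # e.g. activity concentrations

w_raw, p_raw = stats.shapiro(conc)           # normality of the raw data
w_log, p_log = stats.shapiro(np.log(conc))   # normality after log transform
print(round(p_raw, 4), round(p_log, 4))      # log scale fits far better here
```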
Statistical Inference for a Class of Multivariate Negative Binomial Distributions
DEFF Research Database (Denmark)
Rubak, Ege H.; Møller, Jesper; McCullagh, Peter
This paper considers statistical inference procedures for a class of models for positively correlated count variables called -permanental random fields, which can be viewed as a family of multivariate negative binomial distributions. Their appealing probabilistic properties have earlier been studied in the literature, while this is the first statistical paper on -permanental random fields. The focus is on maximum likelihood estimation, maximum quasi-likelihood estimation and maximum composite likelihood estimation based on uni- and bivariate distributions. Furthermore, new results …
Analysis of thrips distribution: application of spatial statistics and Kriging
John Aleong; Bruce L. Parker; Margaret Skinner; Diantha Howard
1991-01-01
Kriging is a statistical technique that provides predictions for spatially and temporally correlated data. Observations of thrips distribution and density in Vermont soils are made in both space and time. Traditional statistical analysis of such data assumes that the counts taken over space and time are independent, which is not necessarily true. Therefore, to analyze...
The Role of the Sampling Distribution in Understanding Statistical Inference
Lipson, Kay
2003-01-01
Many statistics educators believe that few students develop the level of conceptual understanding essential for them to apply correctly the statistical techniques at their disposal and to interpret their outcomes appropriately. It is also commonly believed that the sampling distribution plays an important role in developing this understanding.…
Illustrating Sampling Distribution of a Statistic: Minitab Revisited
Johnson, H. Dean; Evans, Marc A.
2008-01-01
Understanding the concept of the sampling distribution of a statistic is essential for the understanding of inferential procedures. Unfortunately, this topic proves to be a stumbling block for students in introductory statistics classes. In efforts to aid students in their understanding of this concept, alternatives to a lecture-based mode of…
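A minimal version of the kind of demonstration such tools provide: draw many replicate samples, compute the statistic on each, and watch the resulting distribution tighten as the sample size grows.

```python
import numpy as np

# Simulate the sampling distribution of the sample mean for an
# exponential population with mean 1, for several sample sizes n.
rng = np.random.default_rng(6)

for n in (5, 50, 500):
    # 10,000 replicate samples of size n; one mean per replicate.
    means = rng.exponential(scale=1.0, size=(10000, n)).mean(axis=1)
    # The standard error shrinks like 1/sqrt(n).
    print(n, round(means.mean(), 3), round(means.std(), 3))
```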
Statistical test for the distribution of galaxies on plates
International Nuclear Information System (INIS)
Garcia Lambas, D.
1985-01-01
A statistical test for the distribution of galaxies on plates is presented. We apply the test to synthetic astronomical plates obtained by means of numerical simulation (Garcia Lambas and Sersic 1983) with three different models for the 3-dimensional distribution; comparison with an observational plate suggests the presence of filamentary structure. (author)
Exact distributions of two-sample rank statistics and block rank statistics using computer algebra
Wiel, van de M.A.
1998-01-01
We derive generating functions for various rank statistics and we use computer algebra to compute the exact null distribution of these statistics. We present various techniques for reducing time and memory space used by the computations. We use the results to write Mathematica notebooks for
On the statistical mechanics of species abundance distributions.
Bowler, Michael G; Kelly, Colleen K
2012-09-01
A central issue in ecology is that of the factors determining the relative abundance of species within a natural community. The proper application of the principles of statistical physics to species abundance distributions (SADs) shows that simple ecological properties could account for the near universal features observed. These properties are (i) a limit on the number of individuals in an ecological guild and (ii) per capita birth and death rates. They underpin the neutral theory of Hubbell (2001), the master equation approach of Volkov et al. (2003, 2005) and the idiosyncratic (extreme niche) theory of Pueyo et al. (2007); they result in an underlying log series SAD, regardless of neutral or niche dynamics. The success of statistical mechanics in this application implies that communities are in dynamic equilibrium and hence that niches must be flexible and that temporal fluctuations on all sorts of scales are likely to be important in community structure. Copyright © 2012 Elsevier Inc. All rights reserved.
Statistical Analysis Of Failure Strength Of Material Using Weibull Distribution
International Nuclear Information System (INIS)
Entin Hartini; Mike Susmikanti; Antonius Sitompul
2008-01-01
In the evaluation of the strength of ceramic and glass materials, a statistical approach is necessary: the strength of ceramic and glass depends on the size and size distribution of the flaws in these materials. The distribution of strength for a ductile material is narrow and close to Gaussian, while the strength of brittle materials such as ceramic and glass follows a Weibull distribution. The Weibull distribution is an indicator of material strength failure resulting from a distribution of flaw sizes. In this paper, the cumulative probability relating material strength to failure probability, the cumulative probability of failure versus fracture stress, and the cumulative probability of material reliability were calculated. Statistical criteria supporting the strength analysis of silicon nitride were computed using MATLAB. (author)
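Median-rank regression is one common way to estimate the Weibull modulus and characteristic strength from fracture data; this sketch does not reproduce the paper's MATLAB calculations, and Bernard's approximation for the median ranks is an assumed convention.

```python
import numpy as np

def weibull_fit_mrr(strengths):
    """Estimate Weibull modulus m and characteristic strength s by
    median-rank regression: ln(-ln(1 - F_i)) = m*ln(x_i) - m*ln(s)
    for the ordered strengths x_i."""
    x = np.sort(np.asarray(strengths, float))
    n = len(x)
    # Bernard's approximation to the median ranks
    F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)
    y = np.log(-np.log(1.0 - F))
    m, c = np.polyfit(np.log(x), y, 1)   # slope m, intercept c = -m*ln(s)
    return m, np.exp(-c / m)
```

A high modulus m means a narrow strength distribution (consistent flaw population); brittle ceramics typically show much lower m than the Gaussian-like spread of ductile metals.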
Statistical distributions as applied to environmental surveillance data
International Nuclear Information System (INIS)
Speer, D.R.; Waite, D.A.
1976-01-01
Application of normal, lognormal, and Weibull distributions to radiological environmental surveillance data was investigated for approximately 300 nuclide-medium-year-location combinations. The fit of data to distributions was compared through probability plotting (special graph paper provides a visual check) and W test calculations. Results show that 25% of the data fit the normal distribution, 50% fit the lognormal, and 90% fit the Weibull. Demonstration of how to plot each distribution shows that the normal and lognormal distributions are comparatively easy to use, while the Weibull distribution is complicated and difficult to use. Although current practice is to use normal distribution statistics, the normal distribution fit the fewest of the data groups considered in this study.
A NEW STATISTICAL PERSPECTIVE TO THE COSMIC VOID DISTRIBUTION
International Nuclear Information System (INIS)
Pycke, J-R; Russell, E.
2016-01-01
In this study, we obtain the size distribution of voids as a three-parameter redshift-independent log-normal void probability function (VPF) directly from the Cosmic Void Catalog (CVC). Although many statistical models of void distributions are based on the counts in randomly placed cells, the log-normal VPF that we obtain here is independent of the shape of the voids due to the parameter-free void finder of the CVC. We use three void populations drawn from the CVC generated by the Halo Occupation Distribution (HOD) Mocks, which are tuned to three mock SDSS samples to investigate the void distribution statistically and to investigate the effects of the environments on the size distribution. As a result, it is shown that void size distributions obtained from the HOD Mock samples are satisfied by the three-parameter log-normal distribution. In addition, we find that there may be a relation between the hierarchical formation, skewness, and kurtosis of the log-normal distribution for each catalog. We also show that the shape of the three-parameter distribution from the samples is strikingly similar to the galaxy log-normal mass distribution obtained from numerical studies. This similarity between void size and galaxy mass distributions may possibly indicate evidence of nonlinear mechanisms affecting both voids and galaxies, such as large-scale accretion and tidal effects. Considering the fact that in this study, all voids are generated by galaxy mocks and show hierarchical structures in different levels, it may be possible that the same nonlinear mechanisms of mass distribution affect the void size distribution.
Fisher information and statistical inference for phase-type distributions
DEFF Research Database (Denmark)
Bladt, Mogens; Esparza, Luz Judith R; Nielsen, Bo Friis
2011-01-01
This paper is concerned with statistical inference for both continuous and discrete phase-type distributions. We consider maximum likelihood estimation, where traditionally the expectation-maximization (EM) algorithm has been employed. Certain numerical aspects of this method are revised and we...
Statistical inference for a class of multivariate negative binomial distributions
DEFF Research Database (Denmark)
Rubak, Ege Holger; Møller, Jesper; McCullagh, Peter
This paper considers statistical inference procedures for a class of models for positively correlated count variables called α-permanental random fields, and which can be viewed as a family of multivariate negative binomial distributions. Their appealing probabilistic properties have earlier been...
[Rank distributions in community ecology from the statistical viewpoint].
Maksimov, V N
2004-01-01
Traditional statistical methods for defining empirical distribution functions of species abundance (population, biomass, production, etc.) in a community are applicable to processing the multivariate data contained in such quantitative indices of communities. In particular, evaluating the moments of the distribution suffices to condense the data contained in a list of species and their abundances. At the same time, the species should be ranked in the list in ascending rather than descending order of abundance, and the distribution models should be analyzed on the basis of data on abundant species only.
Log-concave Probability Distributions: Theory and Statistical Testing
DEFF Research Database (Denmark)
An, Mark Yuing
1996-01-01
This paper studies the broad class of log-concave probability distributions that arise in the economics of uncertainty and information. For univariate, continuous, and log-concave random variables we prove useful properties without imposing differentiability of the density functions. Discrete and multivariate distributions are also discussed. We propose simple non-parametric testing procedures for log-concavity. The test statistics are constructed to test one of the two implications of log-concavity: increasing hazard rates and the new-is-better-than-used (NBU) property. The tests for increasing hazard rates are based on normalized spacings of the sample order statistics. The tests for the NBU property fall into the category of Hoeffding's U-statistics...
A generalized statistical model for the size distribution of wealth
International Nuclear Information System (INIS)
Clementi, F; Gallegati, M; Kaniadakis, G
2012-01-01
In a recent paper in this journal (Clementi et al 2009 J. Stat. Mech. P02037), we proposed a new, physically motivated, distribution function for modeling individual incomes, having its roots in the framework of the κ-generalized statistical mechanics. The performance of the κ-generalized distribution was checked against real data on personal income for the United States in 2003. In this paper we extend our previous model so as to be able to account for the distribution of wealth. Probabilistic functions and inequality measures of this generalized model for wealth distribution are obtained in closed form. In order to check the validity of the proposed model, we analyze the US household wealth distributions from 1984 to 2009 and conclude an excellent agreement with the data that is superior to any other model already known in the literature. (paper)
A generalized statistical model for the size distribution of wealth
Clementi, F.; Gallegati, M.; Kaniadakis, G.
2012-12-01
In a recent paper in this journal (Clementi et al 2009 J. Stat. Mech. P02037), we proposed a new, physically motivated, distribution function for modeling individual incomes, having its roots in the framework of the κ-generalized statistical mechanics. The performance of the κ-generalized distribution was checked against real data on personal income for the United States in 2003. In this paper we extend our previous model so as to be able to account for the distribution of wealth. Probabilistic functions and inequality measures of this generalized model for wealth distribution are obtained in closed form. In order to check the validity of the proposed model, we analyze the US household wealth distributions from 1984 to 2009 and conclude an excellent agreement with the data that is superior to any other model already known in the literature.
Fitting statistical distributions
Karian, Zaven A
2000-01-01
Throughout the physical and social sciences, researchers face the challenge of fitting statistical distributions to their data. Although the study of statistical modelling has made great strides in recent years, the number and variety of distributions to choose from, all with their own formulas, tables, diagrams, and general properties, continue to create problems. For a specific application, which of the dozens of distributions should one use? What if none of them fit well? Fitting Statistical Distributions helps answer those questions. Focusing on techniques used successfully across many fields, the authors present all of the relevant results related to the Generalized Lambda Distribution (GLD), the Generalized Bootstrap (GB), and Monte Carlo simulation (MC). They provide the tables, algorithms, and computer programs needed for fitting continuous probability distributions to data in a wide variety of circumstances, covering bivariate as well as univariate distributions, and including situations where moments do...
Statistical distributions of optimal global alignment scores of random protein sequences
Directory of Open Access Journals (Sweden)
Tang Jiaowei
2005-10-01
Full Text Available Abstract Background The inference of homology from statistically significant sequence similarity is a central issue in sequence alignments. So far the statistical distribution function underlying the optimal global alignments has not been completely determined. Results In this study, random and real but unrelated sequences prepared in six different ways were selected as reference datasets to obtain their respective statistical distributions of global alignment scores. All alignments were carried out with the Needleman-Wunsch algorithm and optimal scores were fitted to the Gumbel, normal and gamma distributions respectively. The three-parameter gamma distribution performs the best as the theoretical distribution function of global alignment scores, as it agrees perfectly well with the distribution of alignment scores. The normal distribution also agrees well with the score distribution frequencies when the shape parameter of the gamma distribution is sufficiently large, for this is the scenario when the normal distribution can be viewed as an approximation of the gamma distribution. Conclusion We have shown that the optimal global alignment scores of random protein sequences fit the three-parameter gamma distribution function. This would be useful for the inference of homology between sequences whose relationship is unknown, through the evaluation of gamma distribution significance between sequences.
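A three-parameter gamma distribution can be fitted quickly by the method of moments, using the fact that the skewness of a gamma variate is 2/sqrt(k). This is only a moment-based sketch, not the fitting procedure used in the paper.

```python
import numpy as np

def gamma3_moment_fit(scores):
    """Method-of-moments fit of a three-parameter gamma distribution
    (shape k, scale theta, location mu) to a sample of alignment scores."""
    s = np.asarray(scores, float)
    mean, var = s.mean(), s.var()
    skew = ((s - mean) ** 3).mean() / var ** 1.5
    k = (2.0 / skew) ** 2          # skewness of gamma is 2 / sqrt(k)
    theta = np.sqrt(var / k)       # variance is k * theta^2
    mu = mean - k * theta          # mean is mu + k * theta
    return k, theta, mu
```

As the abstract notes, when k grows large the skewness 2/sqrt(k) vanishes and the fitted gamma becomes indistinguishable from a normal distribution, which explains why the normal fit also works in that regime.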
Experimental investigation of statistical models describing distribution of counts
International Nuclear Information System (INIS)
Salma, I.; Zemplen-Papp, E.
1992-01-01
The binomial, Poisson and modified Poisson models which are used for describing the statistical nature of the distribution of counts are compared theoretically, and conclusions for application are considered. The validity of the Poisson and the modified Poisson statistical distribution for observing k events in a short time interval is investigated experimentally for various measuring times. The experiments to measure the influence of the significant radioactive decay were performed with 89mY (T1/2 = 16.06 s), using a multichannel analyser (4096 channels) in the multiscaling mode. According to the results, Poisson statistics describe the counting experiment for short measuring times (up to T = 0.5 T1/2) and its application is recommended. However, analysis of the data demonstrated, with confidence, that for long measurements (T ≥ T1/2) the Poisson distribution is not valid and the modified Poisson function is preferable. The practical implications in calculating uncertainties and in optimizing the measuring time are discussed. Differences between the standard deviations evaluated on the basis of the Poisson and binomial models are especially significant for experiments with long measuring time (T/T1/2 ≥ 2) and/or large detection efficiency (ε > 0.30). Optimization of the measuring time for paired observations yields the same solution for either the binomial or the Poisson distribution. (orig.)
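The effect described above can be illustrated with an idealized binomial simulation (parameter values here are illustrative, not the paper's): each of a fixed number of atoms decays within the measuring time with probability 1 - 2^(-T/T1/2) and is detected with efficiency ε, so the counts are binomial and their variance-to-mean ratio falls below the Poisson value of 1 as T grows.

```python
import numpy as np

def simulate_counts(n_atoms, eff, t_half, t_meas, reps, rng):
    # Each atom decays within t_meas with probability 1 - 2^(-t/t_half);
    # a decay is detected with probability eff -> counts are binomial.
    p = eff * (1.0 - 2.0 ** (-t_meas / t_half))
    return rng.binomial(n_atoms, p, size=reps)

rng = np.random.default_rng(3)
short = simulate_counts(10000, 0.4, 16.06, 0.5 * 16.06, 50000, rng)
long_ = simulate_counts(10000, 0.4, 16.06, 2.0 * 16.06, 50000, rng)
# Fano factor var/mean equals 1 - p for a binomial: closer to the
# Poisson value 1 for short runs, clearly below it for long runs
# with high detection efficiency.
```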
Neti, Prasad V.S.V.; Howell, Roger W.
2010-01-01
Recently, the distribution of radioactivity among a population of cells labeled with 210Po was shown to be well described by a log-normal (LN) distribution function (J Nucl Med. 2006;47:1049–1058) with the aid of autoradiography. To ascertain the influence of Poisson statistics on the interpretation of the autoradiographic data, the present work reports on a detailed statistical analysis of these earlier data. Methods The measured distributions of α-particle tracks per cell were subjected to statistical tests with Poisson, LN, and Poisson-lognormal (P-LN) models. Results The LN distribution function best describes the distribution of radioactivity among cell populations exposed to 0.52 and 3.8 kBq/mL of 210Po-citrate. When cells were exposed to 67 kBq/mL, the P-LN distribution function gave a better fit; however, the underlying activity distribution remained log-normal. Conclusion The present analysis generally provides further support for the use of LN distributions to describe the cellular uptake of radioactivity. Care should be exercised when analyzing autoradiographic data on activity distributions to ensure that Poisson processes do not distort the underlying LN distribution. PMID:18483086
New advances in the statistical parton distributions approach
Directory of Open Access Journals (Sweden)
Soffer Jacques
2016-01-01
Full Text Available The quantum statistical parton distributions approach proposed more than one decade ago is revisited by considering a larger set of recent and accurate Deep Inelastic Scattering experimental results. It enables us to improve the description of the data by means of a new determination of the parton distributions. This global next-to-leading order QCD analysis leads to a good description of several structure functions, involving unpolarized parton distributions and helicity distributions, in terms of a rather small number of free parameters. There are many serious challenging issues. The predictions of this theoretical approach will be tested for single-jet production and charge asymmetry in W± production in p̄p and pp collisions up to LHC energies, using recent data and also for forthcoming experimental results.
On widths of mass distributions in statistical theory of fission
International Nuclear Information System (INIS)
Volkov, N.G.; Emel'yanov, V.M.
1979-01-01
The process of nucleon tunneling from one fragment to another near the point of the compound-nucleus fragmentation has been studied in the model of a two-center oscillator. The effect of the number of transferred nucleons on the mass distribution of fragments is estimated. Sensitivity of the model to the form of the single-particle potential, excitation energies and deformation of fragments is examined. The calculations performed show that it is possible to calculate the mass distributions at the point of fragment contact in the statistical fission model, taking account of the nucleon exchange between fragments.
Encryption of covert information into multiple statistical distributions
International Nuclear Information System (INIS)
Venkatesan, R.C.
2007-01-01
A novel strategy to encrypt covert information (code) via unitary projections into the null spaces of ill-conditioned eigenstructures of multiple host statistical distributions, inferred from incomplete constraints, is presented. The host pdf's are inferred using the maximum entropy principle. The projection of the covert information is dependent upon the pdf's of the host statistical distributions. The security of the encryption/decryption strategy is based on the extreme instability of the encoding process. A self-consistent procedure to derive keys for both symmetric and asymmetric cryptography is presented. The advantages of using a multiple pdf model to achieve encryption of covert information are briefly highlighted. Numerical simulations exemplify the efficacy of the model
Robustness of S1 statistic with Hodges-Lehmann for skewed distributions
Ahad, Nor Aishah; Yahaya, Sharipah Soaad Syed; Yin, Lee Ping
2016-10-01
Analysis of variance (ANOVA) is a commonly used parametric method to test for differences in means among more than two groups when the populations are normally distributed. ANOVA is highly inefficient under non-normal and heteroscedastic settings. When the assumptions are violated, researchers look for alternatives such as the nonparametric Kruskal-Wallis test or robust methods. This study focused on a flexible method, the S1 statistic, for comparing groups using the median as the location estimator. The S1 statistic was modified by substituting the median with the Hodges-Lehmann estimator, and the default scale estimator with the variance of Hodges-Lehmann and with MADn, to produce two different test statistics for comparing groups. The bootstrap method was used for testing the hypotheses, since the sampling distributions of these modified S1 statistics are unknown. The performance of the proposed statistics in terms of Type I error was measured and compared against the original S1 statistic, ANOVA and Kruskal-Wallis. The proposed procedures show improvement over the original statistic, especially under extremely skewed distributions.
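Only the Hodges-Lehmann location estimator that the modification relies on is sketched here (not the full bootstrap test): it is the median of all pairwise Walsh averages, which makes it robust to outliers and skewness.

```python
import numpy as np

def hodges_lehmann(x):
    """One-sample Hodges-Lehmann estimator: the median of all pairwise
    Walsh averages (x_i + x_j) / 2 with i <= j."""
    x = np.asarray(x, float)
    i, j = np.triu_indices(len(x))   # all pairs including i == j
    return float(np.median((x[i] + x[j]) / 2.0))
```

For example, a single extreme value barely moves the estimate: for [1, 2, 3, 4, 100] it returns 3.0, close to the median of the bulk of the data.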
Statistical analysis of the spatial distribution of galaxies and clusters
International Nuclear Information System (INIS)
Cappi, Alberto
1993-01-01
This thesis deals with the analysis of the distribution of galaxies and clusters, describing some observational problems and statistical results. The first chapter gives a theoretical introduction, aiming to describe the framework of the formation of structures, tracing the history of the Universe from the Planck time, t_p = 10^-43 s, and a temperature corresponding to 10^19 GeV, to the present epoch. The most usual statistical tools and models of the galaxy distribution, with their advantages and limitations, are described in chapter two. A study of the main observed properties of galaxy clustering, together with a detailed statistical analysis of the effects of selecting galaxies according to apparent magnitude or diameter, is reported in chapter three. Chapter four delineates some properties of groups of galaxies, explaining the reasons for discrepant results on group distributions. Chapter five is a study of the distribution of galaxy clusters, with different statistical tools, like correlations, percolation, the void probability function and counts in cells; the same scaling-invariant behaviour as for galaxies is found. Chapter six describes our finding that rich galaxy clusters too belong to the fundamental plane of elliptical galaxies, and gives a discussion of its possible implications. Finally, chapter seven reviews the possibilities offered by multi-slit and multi-fibre spectrographs, and I present some observational work on nearby and distant galaxy clusters. In particular, I show the opportunities offered by ongoing surveys of galaxies coupled with multi-object fibre spectrographs, focusing on the ESO Key Programme "A galaxy redshift survey in the south galactic pole region", in which I collaborate, and on MEFOS, a multi-fibre instrument with automatic positioning. Published papers related to the work described in this thesis are reported in the last appendix. (author) [fr]
Environmental radionuclide concentrations: statistical model to determine uniformity of distribution
International Nuclear Information System (INIS)
Cawley, C.N.; Fenyves, E.J.; Spitzberg, D.B.; Wiorkowski, J.; Chehroudi, M.T.
1980-01-01
In the evaluation of data from environmental sampling and measurement, a basic question is whether the radionuclide (or pollutant) is distributed uniformly. Since physical measurements have associated errors, it is inappropriate to consider the measurements alone in this determination. Hence, a statistical model has been developed. It consists of a weighted analysis of variance with subsequent t-tests between weighted and independent means. A computer program to perform the calculations is included
Statistical distribution of solar soft X-ray bursts
Energy Technology Data Exchange (ETDEWEB)
Kaufmann, P; Piazza, L R; Schaal, R E [Universidade Mackenzie, Sao Paulo (Brazil). Centro de Radio-Astronomia e Astrofisica
1979-03-01
Nearly 1000 solar events with fluxes measured in the 0.5-3 Å, 1-8 Å and 8-20 Å bands by the Explorer 37 (US NRL Solrad) satellite are statistically analyzed. The differential distributions of peak fluxes can be represented by power laws with exponents -1.4, -2.2 and -2.9 respectively, which are compared to 2-12 Å results. For the 0.5-3 Å band there is a suggested peak in the distribution. Autocorrelation analyses of the distribution have shown that in the harder band (0.5-3 Å) there is a concentration of events at preferred values, multiples of about 10x10^-5 erg cm^-2 s^-1, of unknown origin.
Statistical distribution of solar soft X-ray bursts
International Nuclear Information System (INIS)
Kaufmann, P.; Piazza, L.R.; Schaal, R.E.
1979-01-01
Nearly 1000 solar events with fluxes measured in the 0.5-3 Å, 1-8 Å and 8-20 Å bands by the Explorer 37 (US NRL Solrad) satellite are statistically analysed. The differential distributions of peak fluxes can be represented by power laws with exponents -1.4, -2.2 and -2.9 respectively, which are compared to 2-12 Å results. For the 0.5-3 Å band there is a suggested peak in the distribution. Autocorrelation analyses of the distribution have shown that in the harder band (0.5-3 Å) there is a concentration of events at preferred values, multiples of about 10x10^-5 erg cm^-2 s^-1, of unknown origin [pt]
A method for statistically comparing spatial distribution maps
Directory of Open Access Journals (Sweden)
Reynolds Mary G
2009-01-01
Full Text Available Abstract Background Ecological niche modeling is a method for estimation of species distributions based on certain ecological parameters. Thus far, empirical determination of significant differences between independently generated distribution maps for a single species (maps which are created through equivalent processes, but with different ecological input parameters) has been challenging. Results We describe a method for comparing model outcomes, which allows a statistical evaluation of whether the strength of prediction and breadth of predicted areas is measurably different between projected distributions. To create ecological niche models for statistical comparison, we utilized GARP (Genetic Algorithm for Rule-Set Production software to generate ecological niche models of human monkeypox in Africa. We created several models, keeping constant the case location input records for each model but varying the ecological input data. In order to assess the relative importance of each ecological parameter included in the development of the individual predicted distributions, we performed pixel-to-pixel comparisons between model outcomes and calculated the mean difference in pixel scores. We used a two-sample Student's t-test (assuming as null hypothesis that both maps were identical to each other regardless of which input parameters were used) to examine whether the mean difference in corresponding pixel scores from one map to another was greater than would be expected by chance alone. We also utilized weighted kappa statistics, frequency distributions, and percent difference to look at the disparities in pixel scores. Multiple independent statistical tests indicated precipitation as the single most important independent ecological parameter in the niche model for human monkeypox disease. Conclusion In addition to improving our understanding of the natural factors influencing the distribution of human monkeypox disease, such pixel-to-pixel comparison
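The paper's full workflow (GARP models, weighted kappa) is not reproduced here; this sketch covers only the pixel-to-pixel paired t-statistic step, applied to synthetic maps for illustration.

```python
import numpy as np

def paired_map_ttest(map_a, map_b):
    """Paired t-test on corresponding pixel scores of two equally sized
    prediction maps; returns (mean pixel difference, t statistic)."""
    d = (np.asarray(map_a, float) - np.asarray(map_b, float)).ravel()
    n = d.size
    t = d.mean() / (d.std(ddof=1) / np.sqrt(n))
    return d.mean(), t
```

Note that adjacent pixels in real prediction maps are spatially correlated, so the nominal degrees of freedom overstate the effective sample size; the t statistic is best read as a descriptive effect-size measure here.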
Exact null distributions of quadratic distribution-free statistics for two-way classification
Wiel, van de M.A.
2004-01-01
Abstract We present new techniques for computing exact distributions of 'Friedman-type' statistics. Representing the null distribution by a generating function allows for the use of general, not necessarily integer-valued rank scores. Moreover, we use symmetry properties of the multivariate
Wu, Hao
2018-05-01
In structural equation modelling (SEM), a robust adjustment to the test statistic or to its reference distribution is needed when its null distribution deviates from a χ² distribution, which usually arises when data do not follow a multivariate normal distribution. Unfortunately, existing studies on this issue typically focus on only a few methods and neglect the majority of alternative methods in statistics. Existing simulation studies typically consider only non-normal distributions of data that either satisfy asymptotic robustness or lead to an asymptotic scaled χ² distribution. In this work we conduct a comprehensive study that involves both typical methods in SEM and less well-known methods from the statistics literature. We also propose the use of several novel non-normal data distributions that are qualitatively different from the non-normal distributions widely used in existing studies. We found that several under-studied methods give the best performance under specific conditions, but the Satorra-Bentler method remains the most viable method for most situations. © 2017 The British Psychological Society.
Precision Statistical Analysis of Images Based on Brightness Distribution
Directory of Open Access Journals (Sweden)
Muzhir Shaban Al-Ani
2017-07-01
Full Text Available Studying the content of images is an important topic through which reasonable and accurate analysis of images can be achieved. Image analysis has recently become a vital field because of the huge number of images transferred via transmission media in our daily life; these media, crowded with images, have made image analysis a highlighted research area. In this paper, the implemented system passes through many steps to compute the statistical measures of standard deviation and mean values of both colour and grey images, and the last step of the proposed method compares the obtained results across the different cases of the test phase. The statistical parameters are implemented to characterize the content of an image and its texture. Standard deviation, mean and correlation values are used to study the intensity distribution of the tested images. Reasonable results are obtained for both standard deviation and mean value via the implementation of the system. The major issue addressed in this work is brightness distribution via statistical measures, applying different types of lighting.
Yuan, Ke-Hai
2008-01-01
In the literature of mean and covariance structure analysis, noncentral chi-square distribution is commonly used to describe the behavior of the likelihood ratio (LR) statistic under alternative hypothesis. Due to the inaccessibility of the rather technical literature for the distribution of the LR statistic, it is widely believed that the…
Inverted rank distributions: Macroscopic statistics, universality classes, and critical exponents
Eliazar, Iddo; Cohen, Morrel H.
2014-01-01
An inverted rank distribution is an infinite sequence of positive sizes ordered in a monotone increasing fashion. Interlacing together Lorenzian and oligarchic asymptotic analyses, we establish a macroscopic classification of inverted rank distributions into five “socioeconomic” universality classes: communism, socialism, criticality, feudalism, and absolute monarchy. We further establish that: (i) communism and socialism are analogous to a “disordered phase”, feudalism and absolute monarchy are analogous to an “ordered phase”, and criticality is the “phase transition” between order and disorder; (ii) the universality classes are characterized by two critical exponents, one governing the ordered phase, and the other governing the disordered phase; (iii) communism, criticality, and absolute monarchy are characterized by sharp exponent values, and are inherently deterministic; (iv) socialism is characterized by a continuous exponent range, is inherently stochastic, and is universally governed by continuous power-law statistics; (v) feudalism is characterized by a continuous exponent range, is inherently stochastic, and is universally governed by discrete exponential statistics. The results presented in this paper yield a universal macroscopic socioeconophysical perspective of inverted rank distributions.
Structure Learning and Statistical Estimation in Distribution Networks - Part II
Energy Technology Data Exchange (ETDEWEB)
Deka, Deepjyoti [Univ. of Texas, Austin, TX (United States); Backhaus, Scott N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Chertkov, Michael [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2015-02-13
Limited placement of real-time monitoring devices in the distribution grid, recent trends notwithstanding, has prevented the easy implementation of demand-response and other smart grid applications. Part I of this paper discusses the problem of learning the operational structure of the grid from nodal voltage measurements. In this work (Part II), the learning of the operational radial structure is coupled with the problem of estimating nodal consumption statistics and inferring the line parameters in the grid. Based on a Linear-Coupled (LC) approximation of the AC power flow equations, polynomial-time algorithms are designed to identify the structure and estimate nodal load characteristics and/or line parameters in the grid using the available nodal voltage measurements. Then the structure learning algorithm is extended to cases with missing data, where available observations are limited to a fraction of the grid nodes. The efficacy of the presented algorithms is demonstrated through simulations on several distribution test cases.
Radio resource allocation over fading channels under statistical delay constraints
Le-Ngoc, Tho
2017-01-01
This SpringerBrief presents radio resource allocation schemes for buffer-aided communications systems over fading channels under statistical delay constraints, expressed in terms of an upper-bounded average delay or a delay-outage probability. The Brief starts by considering a source-destination communications link with data arriving at the source transmission buffer. In the first scenario, the joint optimal data admission control and power allocation problem for throughput maximization is considered, where the source is assumed to have maximum power and average delay constraints. In the second scenario, optimal power allocation problems for energy harvesting (EH) communications systems under average delay or delay-outage constraints are explored, where the EH source harvests random amounts of energy from renewable energy sources and stores the harvested energy in a battery during data transmission. Online resource allocation algorithms are developed when the statistical knowledge of the random channel fading, data arrivals...
Hayslett, H T
1991-01-01
Statistics covers the basic principles of statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the
Estimating Predictive Variance for Statistical Gas Distribution Modelling
International Nuclear Information System (INIS)
Lilienthal, Achim J.; Asadi, Sahar; Reggente, Matteo
2009-01-01
Recent publications in statistical gas distribution modelling have proposed algorithms that model both the mean and the variance of a distribution. This paper argues that estimating the predictive concentration variance is not merely a gradual improvement but a significant step forward for the field. First, such models much better fit the particular structure of gas distributions, which exhibit strong fluctuations with considerable spatial variations as a result of the intermittent character of gas dispersal. Second, estimating the predictive variance makes it possible to evaluate model quality in terms of the data likelihood. This offers a solution to the problem of ground-truth evaluation, which has always been a critical issue for gas distribution modelling. It also enables sound comparisons of different modelling approaches, and provides the means to learn meta-parameters of the model, to determine when the model should be updated or re-initialised, and to suggest new measurement locations based on the current model. We also point out directions of related ongoing or potential future research work.
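The likelihood-based evaluation the abstract advocates can be sketched as an average negative log predictive density (NLPD) under a Gaussian predictive model; all numbers below are made up for illustration.

```python
import math

def avg_nlpd(y, mean, var):
    """Average negative log predictive density under Gaussian predictions;
    lower is better, and overconfident variances are penalised."""
    return sum(0.5 * math.log(2 * math.pi * v) + (yi - m) ** 2 / (2 * v)
               for yi, m, v in zip(y, mean, var)) / len(y)

obs = [0.9, 1.4, 0.2, 1.1]
means = [1.0, 1.0, 1.0, 1.0]
well_calibrated = avg_nlpd(obs, means, [0.25] * 4)   # variance matches spread
overconfident = avg_nlpd(obs, means, [0.01] * 4)     # same mean, tiny variance
print(well_calibrated, overconfident)
```

A model whose predictive variance matches the actual residual spread scores a lower (better) NLPD than one with the same mean but an overconfident variance, which is what makes the likelihood usable for model comparison and meta-parameter learning.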
Statistical distributions of extreme dry spell in Peninsular Malaysia
Zin, Wan Zawiah Wan; Jemain, Abdul Aziz
2010-11-01
Statistical distributions of annual extreme (AE) series and partial duration (PD) series for dry-spell events are analyzed for a database of daily rainfall records from 50 rain-gauge stations in Peninsular Malaysia, with recording periods extending from 1975 to 2004. The three-parameter generalized extreme value (GEV) and generalized Pareto (GP) distributions are considered to model both series. In both cases, the parameters of the two distributions are fitted by the method of L-moments, which provides robust estimates. The goodness of fit (GOF) between empirical data and theoretical distributions is then evaluated by means of the L-moment ratio diagram and several goodness-of-fit tests for each of the 50 stations. It is found that for the majority of stations the AE and PD series are well fitted by the GEV and GP models, respectively. Based on the models that have been identified, we can reasonably predict the risks associated with extreme dry spells for various return periods.
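A minimal sketch of the L-moments fitting step for the GEV case, using Hosking's probability-weighted-moment estimators and his rational approximation for the shape parameter; the synthetic Gumbel sample (a GEV with zero shape) stands in for real annual-maximum dry-spell data.

```python
import math, random

def sample_lmoments(data):
    """First two sample L-moments and L-skewness t3, via
    probability-weighted moments (Hosking, 1990)."""
    x = sorted(data)
    n = len(x)
    b0 = sum(x) / n
    b1 = sum(i / (n - 1) * x[i] for i in range(n)) / n
    b2 = sum(i * (i - 1) / ((n - 1) * (n - 2)) * x[i] for i in range(n)) / n
    l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
    return l1, l2, l3 / l2

def gev_from_lmoments(l1, l2, t3):
    """GEV (location, scale, shape k) from L-moments, using Hosking's
    rational approximation for the shape parameter."""
    c = 2.0 / (3.0 + t3) - math.log(2) / math.log(3)
    k = 7.8590 * c + 2.9554 * c ** 2
    g = math.gamma(1 + k)
    scale = l2 * k / ((1 - 2 ** (-k)) * g)
    loc = l1 - scale * (1 - g) / k
    return loc, scale, k

# Synthetic "annual maxima" from a Gumbel (GEV with shape 0),
# location 10, scale 2; the fit should land close to these values.
random.seed(42)
draws = [10 - 2 * math.log(-math.log(random.random())) for _ in range(5000)]
loc, scale, k = gev_from_lmoments(*sample_lmoments(draws))
print(loc, scale, k)
```

The point of the L-moment route, as the abstract notes, is robustness: unlike ordinary moments, L-moments are linear in the ordered data, so a few large dry spells do not dominate the fit.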
Statistical Distributions of Optical Flares from Gamma-Ray Bursts
International Nuclear Information System (INIS)
Yi, Shuang-Xi; Yu, Hai; Wang, F. Y.; Dai, Zi-Gao
2017-01-01
We statistically study gamma-ray burst (GRB) optical flares from the Swift /UVOT catalog. We compile 119 optical flares, including 77 flares with redshift measurements. Some tight correlations among the timescales of optical flares are found. For example, the rise time is correlated with the decay time, and the duration time is correlated with the peak time of optical flares. These two tight correlations indicate that longer rise times are associated with longer decay times of optical flares and also suggest that broader optical flares peak at later times, which are consistent with the corresponding correlations of X-ray flares. We also study the frequency distributions of optical flare parameters, including the duration time, rise time, decay time, peak time, and waiting time. Similar power-law distributions for optical and X-ray flares are found. Our statistical results imply that GRB optical flares and X-ray flares may share a similar physical origin, and both are possibly related to central engine activities.
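The power-law fits mentioned above can be sketched with the standard maximum-likelihood estimator for a continuous power-law tail (not necessarily the authors' fitting method); the synthetic "timescales" below are drawn from a known p(x) ∝ x⁻² tail so the estimate can be checked.

```python
import math, random

def powerlaw_alpha(samples, xmin):
    """MLE (Hill-type) exponent for a continuous power law p(x) ~ x^-alpha,
    x >= xmin: alpha_hat = 1 + n / sum(ln(x / xmin))."""
    tail = [x for x in samples if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

# Synthetic durations from p(x) ~ x^-2 above xmin = 1, drawn by
# inverse-CDF sampling: x = xmin * (1 - u)^(-1/(alpha - 1)).
random.seed(1)
xs = [(1.0 - random.random()) ** -1.0 for _ in range(20000)]
alpha_hat = powerlaw_alpha(xs, 1.0)
print(alpha_hat)
```

This estimator avoids the known biases of fitting a straight line to a log-log histogram, which matters when comparing exponents between the optical and X-ray flare samples.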
Elastohydrodynamics of microfilament under distributed body actuation
Singh, T. Sonamani; Yadava, R. D. S.
2018-05-01
The dynamics of an active filament in the low Reynolds number (Re) regime is analyzed under distributed body actuation represented by the sliding filament model. The governing elastohydrodynamic equations are formulated under the assumptions of resistive force theory (RFT). The effect of geometric nonlinearity in bending stiffness on the propulsive thrust is analyzed, where the nonlinearity is introduced by cross-sectional tapering. Two types of boundary conditions (clamped-free and hinged-free) are considered. A comparison with uniform filament dynamics reveals that the tapering enhances the thrust under both types of boundary conditions.
Planar-channeling spatial density under statistical equilibrium
International Nuclear Information System (INIS)
Ellison, J.A.; Picraux, S.T.
1978-01-01
The phase-space density for planar channeled particles has been derived for the continuum model under statistical equilibrium. This is used to obtain the particle spatial probability density as a function of incident angle. The spatial density is shown to depend on only two parameters, a normalized incident angle and a normalized planar spacing. This normalization is used to obtain, by numerical calculation, a set of universal curves for the spatial density and also for the channeled-particle wavelength as a function of amplitude. Using these universal curves, the statistical-equilibrium spatial density and the channeled-particle wavelength can be easily obtained for any case for which the continuum model can be applied. Also, a new one-parameter analytic approximation to the spatial density is developed. This parabolic approximation is shown to give excellent agreement with the exact calculations
Directory of Open Access Journals (Sweden)
Chaeyoung Lee
2012-11-01
Epistasis, which may explain a large portion of the phenotypic variation in complex economic traits of animals, has been ignored in many genetic association studies. A Bayesian method was introduced to draw inferences about multilocus genotypic effects based on their marginal posterior distributions obtained by a Gibbs sampler. A simulation study was conducted to provide statistical powers under various unbalanced designs using this method. Data were simulated by combined designs of number of loci, within-genotype variance, and sample size in unbalanced designs with or without null combined-genotype cells. Mean empirical statistical power was estimated for testing the posterior mean estimate of the combined genotype effect. A practical example of obtaining empirical statistical power estimates with a given sample size was provided under unbalanced designs. The empirical statistical powers would be useful for determining an optimal design when interactive associations of multiple loci with complex phenotypes are examined.
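Setting aside the Gibbs-sampler machinery, the core notion of empirical statistical power, the fraction of simulated experiments in which a true effect is detected, can be sketched generically; the two-sample z-test, effect size, and sample size here are arbitrary stand-ins for the paper's genotype contrasts.

```python
import random, statistics

def empirical_power(n, effect, sd, reps=2000):
    """Fraction of simulated two-sample experiments in which a z-test
    (|z| > 1.96) detects a true mean difference `effect`."""
    rejections = 0
    for _ in range(reps):
        a = [random.gauss(0.0, sd) for _ in range(n)]
        b = [random.gauss(effect, sd) for _ in range(n)]
        se = (statistics.variance(a) / n + statistics.variance(b) / n) ** 0.5
        if abs(statistics.mean(b) - statistics.mean(a)) / se > 1.96:
            rejections += 1
    return rejections / reps

random.seed(7)
power = empirical_power(n=30, effect=1.0, sd=1.0)
print(power)
```

Running the same recipe over a grid of sample sizes and effect magnitudes is exactly how one would choose an "optimal design" in the sense of the abstract.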
Statistical study of ion pitch-angle distributions
International Nuclear Information System (INIS)
Sibeck, D.G.; Mcentire, R.W.; Lui, A.T.Y.; Krimigis, S.M.
1987-01-01
Preliminary results of a statistical study of energetic (34-50 keV) ion pitch-angle distributions (PADs) within 9 Earth radii (RE) provide evidence for an orderly pattern consistent with both drift-shell splitting and magnetopause shadowing. Normal ion PADs dominate the dayside and inner magnetosphere. Butterfly PADs typically occur in a narrow belt stretching from dusk to dawn through midnight, where they approach within 6 RE of Earth. While the ion butterfly PADs that typically occur on closed drift paths are mainly caused by drift-shell splitting, there is also evidence for magnetopause shadowing in observations of more frequent butterfly PAD occurrence in the outer magnetosphere near dawn than near dusk. Isotropic and gradient boundary PADs terminate the tailward extent of the butterfly ion PAD belt. 9 references
Statistical properties of the ice particle distribution in stratiform clouds
Delanoe, J.; Tinel, C.; Testud, J.
2003-04-01
This paper presents an extensive analysis of several microphysical databases (CEPEX, EUCREX, CLARE and CARL) to determine statistical properties of the particle size distribution (PSD). The database covers different types of stratiform clouds: tropical cirrus (CEPEX), mid-latitude cirrus (EUCREX) and mid-latitude cirrus and stratus (CARL, CLARE). The approach for analysis uses the concept of normalization of the PSD developed by Testud et al. (2001). The normalization aims at isolating three independent characteristics of the PSD: its "intrinsic" shape, the "average size" of the spectrum, and the ice water content IWC; by "average size" is meant the mean mass-weighted diameter. It is shown that concentration should be normalized by N_0^* proportional to IWC/D_m^4. The "intrinsic" shape is defined as F(Deq/D_m) = N(Deq)/N_0^*, where Deq is the equivalent melted diameter. The "intrinsic" shape is found to be very stable for Deq/D_m up to about 1.5; beyond 1.5 more scatter is observed, and future analysis should decide whether it represents real physical variation or statistical "error" due to counting problems. Considering the overall statistics over the full database, a large scatter of the N_0^* against D_m plot is found. But in the case of a particular event or a particular leg of a flight, the N_0^* vs. D_m plot is much less scattered and shows a systematic trend for N_0^* to decay as D_m increases. This trend is interpreted as a manifestation of the predominance of the aggregation process. Finally, an important point for cloud remote sensing is investigated: the normalized relationship of IWC/N_0^* against Z/N_0^* is much less scattered than the classical relationship of IWC against Z, the radar reflectivity factor.
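The normalization itself is easy to sketch. Assuming the Testud et al. (2001) definitions, with Dm the ratio of the fourth to the third moment of the melted-equivalent spectrum and N0* = (4⁴/Γ(4)) M₃/Dm⁴ (consistent with the proportionality to IWC/Dm⁴ stated above, since IWC is proportional to M₃):

```python
import math

def normalize_psd(diameters, concentrations, bin_width):
    """Normalization of Testud et al. (2001): Dm = M4 / M3 and
    N0* = (4^4 / Gamma(4)) * M3 / Dm^4, so that the normalized shape
    F(Deq/Dm) = N(Deq)/N0* is independent of water content and mean size."""
    m3 = bin_width * sum(n * d ** 3 for d, n in zip(diameters, concentrations))
    m4 = bin_width * sum(n * d ** 4 for d, n in zip(diameters, concentrations))
    dm = m4 / m3
    n0_star = (4.0 ** 4 / 6.0) * m3 / dm ** 4
    shape = [(d / dm, n / n0_star) for d, n in zip(diameters, concentrations)]
    return dm, n0_star, shape

# Sanity check on an exponential spectrum N(D) = N0 exp(-lambda D),
# for which Dm = 4/lambda and N0* recovers the intercept N0 exactly.
lam, n0, dd = 2.0, 5.0e6, 0.001
ds = [dd * (i + 0.5) for i in range(int(15 / dd / lam))]
ns = [n0 * math.exp(-lam * d) for d in ds]
dm, n0_star, _ = normalize_psd(ds, ns, dd)
print(dm, n0_star)
```

The exponential check is the reason for the 4⁴/Γ(4) constant: it makes N0* coincide with the familiar exponential intercept parameter, so normalized and classical PSD descriptions stay comparable.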
Structure Learning and Statistical Estimation in Distribution Networks - Part I
Energy Technology Data Exchange (ETDEWEB)
Deka, Deepjyoti [Univ. of Texas, Austin, TX (United States); Backhaus, Scott N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Chertkov, Michael [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2015-02-13
Traditionally, power distribution networks are either not observable or only partially observable. This complicates development and implementation of new smart grid technologies, such as those related to demand response, outage detection and management, and improved load monitoring. In this two-part paper, inspired by the proliferation of metering technology, we discuss estimation problems in structurally loopy but operationally radial distribution grids from measurements, e.g. voltage data, which are either already available or can be made available with a relatively minor investment. In Part I, the objective is to learn the operational layout of the grid. Part II of this paper presents algorithms that estimate load statistics or line parameters in addition to learning the grid structure. Further, Part II discusses the problem of structure estimation for systems with incomplete measurement sets. Our newly suggested algorithms apply to a wide range of realistic scenarios. The algorithms are also computationally efficient (polynomial in time), which is proven theoretically and illustrated computationally on a number of test cases. The technique developed can be applied to detect line failures in real time as well as to understand the scope of possible adversarial attacks on the grid.
Statistical thermodynamics and the size distributions of tropical convective clouds.
Garrett, T. J.; Glenn, I. B.; Krueger, S. K.; Ferlay, N.
2017-12-01
Parameterizations for sub-grid cloud dynamics are commonly developed by using fine-scale modeling or measurements to explicitly resolve the mechanistic details of clouds to the best extent possible, and then formulating these behaviors in terms of a cloud state for use within a coarser grid. A second approach is to invoke physical intuition and some very general theoretical principles from equilibrium statistical thermodynamics. This second approach is quite widely used elsewhere in the atmospheric sciences: for example, to explain the heat capacity of air, blackbody radiation, or even the density profile of air in the atmosphere. Here we describe how entrainment and detrainment across cloud perimeters are limited by the amount of available air and the range of moist static energy in the atmosphere, which constrains cloud perimeter distributions to a power law with a -1 exponent along isentropes and to a Boltzmann distribution across isentropes. Further, the total cloud perimeter density in a cloud field is directly tied to the buoyancy frequency of the column. These simple results are shown to be reproduced within a complex dynamic simulation of a tropical convective cloud field and in passive satellite observations of cloud 3D structures. The implication is that equilibrium tropical cloud structures can be inferred from the bulk thermodynamic structure of the atmosphere without having to analyze computationally expensive dynamic simulations.
Garrido, Marta Isabel; Teng, Chee Leong James; Taylor, Jeremy Alexander; Rowe, Elise Genevieve; Mattingley, Jason Brett
2016-06-01
The ability to learn about regularities in the environment and to make predictions about future events is fundamental for adaptive behaviour. We have previously shown that people can implicitly encode statistical regularities and detect violations therein, as reflected in neuronal responses to unpredictable events that carry a unique prediction error signature. In the real world, however, learning about regularities will often occur in the context of competing cognitive demands. Here we asked whether learning of statistical regularities is modulated by concurrent cognitive load. We compared electroencephalographic metrics associated with responses to pure-tone sounds with frequencies sampled from narrow or wide Gaussian distributions. We showed that outliers evoked a larger response than tones in the centre of the stimulus distribution (i.e., an effect of surprise) and that this difference was greater for physically identical outliers in the narrow than in the broad distribution. These results demonstrate an early neurophysiological marker of the brain's ability to implicitly encode complex statistical structure in the environment. Moreover, we manipulated concurrent cognitive load by having participants perform a visual working memory task while listening to these streams of sounds. We again observed greater prediction error responses in the narrower distribution under both low and high cognitive load. Furthermore, there was no reliable reduction in prediction error magnitude under high relative to low cognitive load. Our findings suggest that statistical learning is not a capacity-limited process, and that it proceeds automatically even when cognitive resources are taxed by concurrent demands.
The statistical distribution of aerosol properties in southern West Africa
Haslett, Sophie; Taylor, Jonathan; Flynn, Michael; Bower, Keith; Dorsey, James; Crawford, Ian; Brito, Joel; Denjean, Cyrielle; Bourrianne, Thierry; Burnet, Frederic; Batenburg, Anneke; Schulz, Christiane; Schneider, Johannes; Borrmann, Stephan; Sauer, Daniel; Duplissy, Jonathan; Lee, James; Vaughan, Adam; Coe, Hugh
2017-04-01
The population and economy in southern West Africa have been growing at an exceptional rate in recent years and this trend is expected to continue, with the population projected to more than double to 800 million by 2050. This will result in a dramatic increase in anthropogenic pollutants, already estimated to have tripled between 1950 and 2000 (Lamarque et al., 2010). It is known that aerosols can modify the radiative properties of clouds. As such, the entrainment of anthropogenic aerosol into the large banks of clouds forming during the onset of the West African Monsoon could have a substantial impact on the region's response to climate change. Such projections, however, are greatly limited by the scarcity of observations in this part of the world. As part of the Dynamics-Aerosol-Chemistry-Cloud Interactions in West Africa (DACCIWA) project, three research aircraft were deployed, each carrying equipment capable of measuring aerosol properties in-situ. Instrumentation included Aerosol Mass Spectrometers (AMS), Single Particle Soot Photometers (SP2), Condensation Particle Counters (CPC) and Scanning Mobility Particle Sizers (SMPS). Throughout the intensive aircraft campaign, 155 hours of scientific flights covered an area including large parts of Benin, Togo, Ghana and parts of Côte D'Ivoire. Approximately 70 hours were dedicated to the measurement of cloud-aerosol interactions, with many other flights producing data contributing towards this objective. Using datasets collected during this campaign period, it is possible to build a robust statistical understanding of aerosol properties in this region for the first time, including size distributions and optical and chemical properties. Here, we describe preliminary results from aerosol measurements on board the three aircraft. These have been used to describe aerosol properties throughout the region and time period encompassed by the DACCIWA aircraft campaign. Such statistics will be invaluable for improving future
DEFF Research Database (Denmark)
Jurado-Navas, Antonio
2015-01-01
in homogeneous, isotropic turbulence. Málaga distribution was demonstrated to have the advantage of unifying most of the proposed statistical models derived until now in the scientific literature in a closed-form and mathematically-tractable expression. Furthermore, it unifies most of the proposed statistical...... models for the irradiance fluctuations derived in the bibliography providing, in addition, an excellent agreement with published plane wave and spherical wave simulation data over a wide range of turbulence conditions (weak to strong). In this communication, reviews of its different features...... scintillation in atmospheric optical communication links under any turbulence conditions...
Yigzaw, Kassaye Yitbarek; Michalas, Antonis; Bellika, Johan Gustav
2017-01-03
Techniques have been developed to compute statistics on distributed datasets without revealing private information except the statistical results. However, duplicate records in a distributed dataset may lead to incorrect statistical results. Therefore, to increase the accuracy of the statistical analysis of a distributed dataset, secure deduplication is an important preprocessing step. We designed a secure protocol for the deduplication of horizontally partitioned datasets with deterministic record linkage algorithms. We provided a formal security analysis of the protocol in the presence of semi-honest adversaries. The protocol was implemented and deployed across three microbiology laboratories located in Norway, and we ran experiments on the datasets in which the number of records for each laboratory varied. Experiments were also performed on simulated microbiology datasets and data custodians connected through a local area network. The security analysis demonstrated that the protocol protects the privacy of individuals and data custodians under a semi-honest adversarial model. More precisely, the protocol remains secure with the collusion of up to N - 2 corrupt data custodians. The total runtime for the protocol scales linearly with the addition of data custodians and records. One million simulated records distributed across 20 data custodians were deduplicated within 45 s. The experimental results showed that the protocol is more efficient and scalable than previous protocols for the same problem. The proposed deduplication protocol is efficient and scalable for practical uses while protecting the privacy of patients and data custodians.
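For intuition only, deterministic record linkage via keyed hashes can be sketched as below; this is a naive centralized stand-in, not the secure multi-party protocol of the paper, which avoids revealing even hashed identifiers to any single party. The field names and records are invented.

```python
import hashlib

def record_keys(records, salt):
    """Deterministic linkage keys: a salted hash over normalized
    identifying fields. NOTE: a naive sketch only -- the protocol in
    the paper uses secure multi-party computation instead of letting
    any single party see these hashes."""
    keys = set()
    for rec in records:
        canonical = "|".join(str(rec[f]).strip().lower() for f in sorted(rec))
        keys.add(hashlib.sha256((salt + canonical).encode()).hexdigest())
    return keys

# Two hypothetical laboratory datasets sharing one patient record.
lab_a = [{"name": "Ann Berg", "dob": "1980-01-02"},
         {"name": "Carl Dahl", "dob": "1975-05-06"}]
lab_b = [{"name": " ann berg", "dob": "1980-01-02"}]
dups = record_keys(lab_a, "shared-salt") & record_keys(lab_b, "shared-salt")
print(len(dups))
```

The normalization step (trim, lowercase, fixed field order) is what makes the linkage deterministic across custodians; everything security-critical in the actual protocol happens on top of keys like these.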
Investigation of the statistical distance to reach stationary distributions
International Nuclear Information System (INIS)
Nicholson, S.B.; Kim, Eun-jin
2015-01-01
The thermodynamic length gives a Riemannian metric to a system's phase space. Here we extend the traditional thermodynamic length to an information length (L) out of equilibrium and examine its properties. We utilise L as a useful methodology for analysing non-equilibrium systems without invoking conventional assumptions such as Gaussian statistics, detailed balance, a priori known constraints, or ergodicity, and numerically examine how L evolves in time for the logistic map in the chaotic regime depending on initial conditions. To this end, we propose a discrete version of L which is mathematically well defined by taking a set-theoretic approach. We identify the areas of phase space where the loss of information of the system takes place most rapidly. In particular, we present an interesting result that the unstable fixed points turn out to drive the logistic map most efficiently towards a stationary distribution through L. - Highlights: • Define a set-theoretic version of the discrete thermodynamic length. • These sets allow one to analyse systems having zero probabilities in their evolution. • Numerically analyse the logistic map using the thermodynamic length. • Show how the unstable fixed points most efficiently lead the system to equilibrium
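A toy discrete version of the information length can be sketched as follows, assuming the discrete-time, discrete-state form L = Σ_t sqrt(Σ_i (Δp_i)²/p_i); the binning, ensemble size, and skipping of empty bins are simplifications of this sketch, not choices made in the paper.

```python
import random

def information_length(snapshots):
    """Discrete information length: sum over time steps of
    sqrt(sum_i (dp_i)^2 / p_i); empty bins are skipped in this sketch."""
    total = 0.0
    for p_prev, p_next in zip(snapshots, snapshots[1:]):
        step = sum((b - a) ** 2 / a
                   for a, b in zip(p_prev, p_next) if a > 0)
        total += step ** 0.5
    return total

def histogram(xs, bins):
    counts = [0] * bins
    for x in xs:
        counts[min(int(x * bins), bins - 1)] += 1
    return [c / len(xs) for c in counts]

# Ensemble of logistic-map (r = 4) trajectories started near x = 0.3;
# snapshots of the binned ensemble density at each iteration.
random.seed(3)
xs = [0.3 + 1e-4 * random.random() for _ in range(20000)]
snapshots = [histogram(xs, 20)]
for _ in range(15):
    xs = [4 * x * (1 - x) for x in xs]
    snapshots.append(histogram(xs, 20))
L_disc = information_length(snapshots)
print(L_disc)
```

Comparing L_disc across different initial conditions is the spirit of the paper's numerical experiment: initial points whose ensembles relax quickly to the stationary density accumulate less information length.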
International Nuclear Information System (INIS)
EI-Shanshoury, G.I.
2011-01-01
Several statistical distributions are used to model various reliability and maintainability parameters. The applied distribution depends on the nature of the data being analyzed. This paper deals with the analysis of some statistical distributions used in reliability, in order to reach the best-fit distribution. The calculations rely on circuit quantity parameters obtained using the Relex 2009 computer program. The statistical analysis of ten different distributions indicated that the Weibull distribution gives the best fit for modeling the reliability of the data set of a Temperature Alarm Circuit (TAC), whereas the Exponential distribution is found to be the best fit for modeling the failure rate.
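Best-fit selection of this kind can be sketched with a one-sample Kolmogorov-Smirnov comparison between candidate distributions (the paper relies on the Relex 2009 program, so the procedure below is only an analogous stand-in with synthetic lifetimes):

```python
import math, random

def ks_stat(data, cdf):
    """One-sample Kolmogorov-Smirnov statistic against a candidate CDF."""
    xs = sorted(data)
    n = len(xs)
    return max(max((i + 1) / n - cdf(x), cdf(x) - i / n)
               for i, x in enumerate(xs))

def exp_cdf(rate):
    return lambda x: 1 - math.exp(-rate * x) if x > 0 else 0.0

def weibull_cdf(shape, scale):
    return lambda x: 1 - math.exp(-((x / scale) ** shape)) if x > 0 else 0.0

# Lifetimes simulated from a Weibull (scale 1, shape 2): a constant-rate
# exponential model should fit visibly worse.
random.seed(5)
data = [random.weibullvariate(1.0, 2.0) for _ in range(2000)]
rate = len(data) / sum(data)                  # moment-matched exponential
fits = {"weibull": ks_stat(data, weibull_cdf(2.0, 1.0)),
        "exponential": ks_stat(data, exp_cdf(rate))}
best = min(fits, key=fits.get)
print(best, fits)
```

Ranking candidates by a goodness-of-fit statistic like this is the generic logic behind "Weibull best fits the reliability data": the distribution with the smallest distance to the empirical CDF wins.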
Alimi, Jean-Michel; de Fromont, Paul
2018-04-01
The statistical properties of cosmic structures are well known to be strong probes for cosmology. In particular, several studies have tried to use the cosmic void counting number to obtain tight constraints on dark energy. In this paper, we model the statistical properties of these regions using the CoSphere formalism (de Fromont & Alimi) in both the primordial and the non-linearly evolved Universe in the standard Λ cold dark matter model. This formalism applies similarly to minima (voids) and maxima (such as DM haloes), which are here considered symmetrically. We first derive the full joint Gaussian distribution of CoSphere's parameters in the Gaussian random field. We recover the results of Bardeen et al. only in the limit where the compensation radius becomes very large, i.e. when the central extremum decouples from its cosmic environment. We compute the probability distribution of the compensation size in this primordial field. We show that this distribution is redshift independent and can be used to model the cosmic void size distribution. We also derive the statistical distribution of the peak parameters introduced by Bardeen et al. and discuss their correlation with the cosmic environment. We show that small central extrema with low density are associated with narrow compensation regions with deep compensation density, while higher central extrema are preferentially located in larger but smoother over/under-massive regions.
Statistical properties of the normalized ice particle size distribution
Delanoë, Julien; Protat, Alain; Testud, Jacques; Bouniol, Dominique; Heymsfield, A. J.; Bansemer, A.; Brown, P. R. A.; Forbes, R. M.
2005-05-01
Testud et al. (2001) have recently developed a formalism, known as the "normalized particle size distribution (PSD)", which consists in scaling the diameter and concentration axes in such a way that the normalized PSDs are independent of water content and mean volume-weighted diameter. In this paper we investigate the statistical properties of the normalized PSD for the particular case of ice clouds, which are known to play a crucial role in the Earth's radiation balance. To do so, an extensive database of airborne in situ microphysical measurements has been constructed. A remarkable stability in the shape of the normalized PSD is obtained. The impact of using a single analytical shape to represent all PSDs in the database is estimated through an error analysis on the instrumental (radar reflectivity and attenuation) and cloud (ice water content, effective radius, terminal fall velocity of ice crystals, visible extinction) properties. This resulted in a roughly unbiased estimate of the instrumental and cloud parameters, with small standard deviations ranging from 5 to 12%. This error is found to be roughly independent of the temperature range. This stability in shape and its single analytical approximation imply that two parameters are now sufficient to describe any normalized PSD in ice clouds: the intercept parameter N*0 and the mean volume-weighted diameter Dm. Statistical relationships (parameterizations) between N*0 and Dm have then been evaluated in order to further reduce the number of unknowns. It has been shown that a parameterization of N*0 and Dm by temperature could not be envisaged to retrieve the cloud parameters. Nevertheless, Dm-T and mean maximum dimension diameter-T parameterizations have been derived and compared to the parameterization of Kristjánsson et al. (2000) currently used to characterize particle size in climate models. The new parameterization generally produces larger particle sizes at any temperature than the Kristjánsson et al. (2000
On the Limit Distribution of Lower Extreme Generalized Order Statistics
Indian Academy of Sciences (India)
In a wide subclass of generalized order statistics (gOs), which contains most of the known and important models of ordered random variables, weak convergence of lower extremes is developed. A recent result of extreme value theory of m-gOs (as well as the classical extreme value theory of ordinary order statistics) ...
Larwin, Karen H.; Larwin, David A.
2011-01-01
Bootstrapping methods and random distribution methods are increasingly recommended as better approaches for teaching students about statistical inference in introductory-level statistics courses. The authors examined the effect of teaching undergraduate business statistics students using random distribution and bootstrapping simulations. It is the…
Distribution function of excitations in systems with fractional statistics
International Nuclear Information System (INIS)
Protogenov, A.P.
1992-08-01
The distribution function of low-energy excitations in 2+1D systems has been considered. It is shown that in these systems the quantum distribution function differs from the usual one by having a finite value of the entropy of linked braids. (author). 47 refs
Statistical Tests for Frequency Distribution of Mean Gravity Anomalies
African Journals Online (AJOL)
The hypothesis that a very large number of 1° x 1° mean gravity anomalies are normally distributed has been rejected at the 5% significance level based on the χ2 and the unit normal deviate tests. However, the 50 equal-area mean anomalies derived from the 1° x 1° data have been found to be normally distributed at the same ...
Optimal skill distribution under convex skill costs
Directory of Open Access Journals (Sweden)
Tin Cheuk Leung
2018-03-01
This paper studies optimal distribution of skills in an optimal income tax framework with convex skill constraints. The problem is cast as a social planning problem where a redistributive planner chooses how to distribute a given amount of aggregate skills across people. We find that optimal skill distribution is either perfectly equal or perfectly unequal, but an interior level of skill inequality is never optimal.
Baijal, Shruti; Nakatani, Chie; van Leeuwen, Cees; Srinivasan, Narayanan
2013-06-07
Human observers show remarkable efficiency in statistical estimation; they are able, for instance, to estimate the mean size of visual objects, even if their number exceeds the capacity limits of focused attention. This ability has been understood as the result of a distinct mode of attention, i.e. distributed attention. Compared to the focused attention mode, working memory representations under distributed attention are proposed to be more compressed, leading to reduced working memory loads. An alternate proposal is that distributed attention uses less structured, feature-level representations. These would fill up working memory (WM) more, even when target set size is low. Using event-related potentials, we compared WM loading in a typical distributed attention task (mean size estimation) to that in a corresponding focused attention task (object recognition), using a measure called contralateral delay activity (CDA). Participants performed both tasks on 2, 4, or 8 different-sized target disks. In the recognition task, CDA amplitude increased with set size; notably, however, in the mean estimation task the CDA amplitude was high regardless of set size. In particular for set-size 2, the amplitude was higher in the mean estimation task than in the recognition task. The result showed that the task involves full WM loading even with a low target set size. This suggests that in the distributed attention mode, representations are not compressed, but rather less structured than under focused attention conditions. Copyright © 2012 Elsevier Ltd. All rights reserved.
Asymptotic distribution of ∆AUC, NRIs, and IDI based on theory of U-statistics.
Demler, Olga V; Pencina, Michael J; Cook, Nancy R; D'Agostino, Ralph B
2017-09-20
The change in area under the curve (∆AUC), the integrated discrimination improvement (IDI), and the net reclassification index (NRI) are commonly used measures of risk prediction model performance. Some authors have reported good validity of associated methods of estimating their standard errors (SE) and constructing confidence intervals, whereas others have questioned their performance. To address these issues, we unite the ∆AUC, IDI, and three versions of the NRI under the umbrella of the U-statistics family. We rigorously show that the asymptotic behavior of ∆AUC, NRIs, and IDI fits the asymptotic distribution theory developed for U-statistics. We prove that the ∆AUC, NRIs, and IDI are asymptotically normal, unless they compare nested models under the null hypothesis. In the latter case, asymptotic normality and existing SE estimates cannot be applied to ∆AUC, NRIs, or IDI. In the former case, SE formulas proposed in the literature are equivalent to SE formulas obtained from U-statistics theory if we ignore adjustment for estimated parameters. We use the Sukhatme-Randles-deWet condition to determine when adjustment for estimated parameters is necessary. We show that adjustment is not necessary for SEs of the ∆AUC and two versions of the NRI when added predictor variables are significant and normally distributed. The SEs of the IDI and three-category NRI should always be adjusted for estimated parameters. These results allow us to define when existing formulas for SE estimates can be used and when resampling methods such as the bootstrap should be used instead when comparing nested models. We also use U-statistic theory to develop a new SE estimate of ∆AUC. Copyright © 2017 John Wiley & Sons, Ltd.
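The U-statistic view of the AUC is easy to make concrete: the empirical AUC is exactly the two-sample Mann-Whitney U-statistic, i.e. the fraction of (positive, negative) score pairs that the model ranks correctly. A minimal sketch with made-up scores (illustrative only, not data from the paper):

```python
import numpy as np

def auc_u_statistic(scores_pos, scores_neg):
    """Empirical AUC as a two-sample U-statistic (Mann-Whitney form):
    the fraction of (positive, negative) pairs ranked correctly,
    with ties counted as 1/2."""
    s_pos = np.asarray(scores_pos, dtype=float)
    s_neg = np.asarray(scores_neg, dtype=float)
    diff = s_pos[:, None] - s_neg[None, :]   # all pairwise comparisons
    kernel = np.where(diff > 0, 1.0, np.where(diff == 0, 0.5, 0.0))
    return float(kernel.mean())

# Toy scores: the model ranks most positives above most negatives
auc = auc_u_statistic([0.9, 0.8, 0.4], [0.7, 0.3, 0.2])   # → 8/9
```

The SE machinery the paper develops builds on exactly this pairwise-kernel structure.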
Statistical analysis of rockfall volume distributions: Implications for rockfall dynamics
Dussauge, Carine; Grasso, Jean-Robert; Helmstetter, AgnèS.
2003-06-01
We analyze the volume distribution of natural rockfalls in different geological settings (i.e., calcareous cliffs in the French Alps, Grenoble area, and granite Yosemite cliffs, California Sierra) and different volume ranges (i.e., regional and worldwide catalogs). Contrary to previous studies that included several types of landslides, we restrict our analysis to rockfall sources which originated on subvertical cliffs. For the three data sets, we find that the rockfall volumes follow a power law distribution with a similar exponent value, within error bars. This power law distribution was also proposed for rockfall volumes that occurred along road cuts. All these results argue for a recurrent power law distribution of rockfall volumes on subvertical cliffs, for a large range of rockfall sizes (10^2-10^10 m^3), regardless of the geological settings and of the preexisting geometry of fracture patterns, which are drastically different in the three study areas. The power law distribution for rockfall volumes could emerge from two types of processes. First, the observed power law distribution of rockfall volumes is similar to the one reported for both fragmentation experiments and fragmentation models. This argues for the geometry of rock mass fragment sizes to possibly control the rockfall volumes. In this way neither cascade nor avalanche processes would influence the rockfall volume distribution. Second, without any requirement of scale-invariant quenched heterogeneity patterns, the rock mass dynamics can arise from avalanche processes driven by fluctuations of the rock mass properties, e.g., cohesion or friction angle. This model may also explain the power law distribution reported for landslides involving unconsolidated materials. We find that the exponent value of rockfall volumes on subvertical cliffs, 0.5 ± 0.2, is significantly smaller than the 1.2 ± 0.3 value reported for mixed landslide types. This change of exponents can be driven by the material strength, which
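For readers who want to reproduce this kind of analysis, the exponent b of a cumulative power law P(V > v) = (v/v_min)^(-b) has a closed-form maximum-likelihood (Hill-type) estimator. A sketch on synthetic Pareto-tailed volumes (illustrative only, not the rockfall catalogs themselves):

```python
import numpy as np

def powerlaw_exponent_mle(volumes, v_min):
    """Hill-type maximum-likelihood estimate of the exponent b in the
    cumulative distribution P(V > v) = (v / v_min)^(-b), v >= v_min."""
    v = np.asarray([x for x in volumes if x >= v_min], dtype=float)
    return float(len(v) / np.sum(np.log(v / v_min)))

# Synthetic check: inverse-CDF sampling of a Pareto tail with true b = 0.5
rng = np.random.default_rng(0)
true_b, v_min = 0.5, 1.0
volumes = v_min * rng.uniform(size=20_000) ** (-1.0 / true_b)
b_hat = powerlaw_exponent_mle(volumes, v_min)   # ≈ 0.5
```

In practice the choice of v_min (the catalog completeness threshold) dominates the uncertainty, which is why the paper quotes exponents with error bars.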
On the limit distribution of lower extreme generalized order statistics
Indian Academy of Sciences (India)
Abstract. In a wide subclass of generalized order statistics (gOs), which contains most of the known and important models of ordered random variables, weak convergence of lower extremes is developed. A recent result of extreme value theory of m-gOs (as well as the classical extreme value theory of ordinary order ...
Metal Distribution and Mobility under alkaline conditions
International Nuclear Information System (INIS)
Dario, Maarten
2004-01-01
The adsorption of an element, expressed as its distribution between liquid (aquatic) and solid phases in the biogeosphere, largely determines its mobility and transport properties. This is of fundamental importance in the assessment of the performance of e.g. geologic repositories for hazardous elements like radionuclides. Geologic repositories for low and intermediate level nuclear waste will most likely be based on concrete constructions in a suitable bedrock, leading to a local chemical environment with pH well above 12. At this pH metal adsorption is very high, and thus the mobility is hindered. Organic complexing agents, such as natural humic matter from the ground and in the groundwater, as well as components in the waste (cleaning agents, degradation products from ion exchange resins and cellulose, cement additives etc.) would affect the sorption properties of the various elements in the waste. Trace element migration from a cementitious repository through the pH- and salinity gradient created around the repository would be affected by the presence and creation of particulate matter (colloids) that may serve as carriers that enhance the mobility. The objective of this thesis was to describe and quantify the sorption of some selected elements representative of spent nuclear fuel (Eu, Am) and other heavy metals (Zn, Cd, Hg) in a clay/cement environment (pH 10-13) and in the pH-gradient outside this environment. The potential of organic complexing agents and colloids to enhance metal migration was also investigated. It was shown that many organic ligands are able to reduce trace metal sorption under these conditions. It was not possible to calculate the effect of well-defined organic ligands on the metal sorption in a cement environment by using stability constants from the literature. A simple method for comparing the effect of different complexing agents on metal sorption is, however, suggested. The stability in terms of the particle size of suspended
Dynamic Response to Pedestrian Loads with Statistical Frequency Distribution
DEFF Research Database (Denmark)
Krenk, Steen
2012-01-01
on the magnitude of the resulting response. A frequency representation of vertical pedestrian load is developed, and a compact explicit formula is derived for the magnitude of the resulting response in terms of the damping ratio of the structure, the bandwidth of the pedestrian load, and the mean footfall frequency. The accuracy of the formula is verified by a statistical moment analysis using the Lyapunov equations.
Statistical distribution of the estimator of Weibull modulus
Barbero, Enrique; Fernández-Sáez, José; Navarro Ugena, Carlos
2001-01-01
The Weibull statistic has been widely used to study the inherent scatter existing in the strength properties of many advanced materials, as well as in the fracture toughness of steels in the ductile-brittle transition region. The authors are indebted to the Fundación Ramón Areces (Área de Materiales, IX Concurso Nacional) for its financial support of this research.
Vali Ahmadi, Mohammad; Doostparast, Mahdi; Ahmadi, Jafar
2015-04-01
In manufacturing industries, the lifetime of an item is usually characterised by a random variable X and considered to be satisfactory if X exceeds a given lower lifetime limit L. The probability of a satisfactory item is then ηL := P(X ≥ L), called conforming rate. In industrial companies, however, the lifetime performance index, proposed by Montgomery and denoted by CL, is widely used as a process capability index instead of the conforming rate. Assuming a parametric model for the random variable X, we show that there is a connection between the conforming rate and the lifetime performance index. Consequently, the statistical inferences about ηL and CL are equivalent. Hence, we restrict ourselves to statistical inference for CL based on generalised order statistics, which contains several ordered data models such as usual order statistics, progressively Type-II censored data and records. Various point and interval estimators for the parameter CL are obtained and optimal critical regions for the hypothesis testing problems concerning CL are proposed. Finally, two real data-sets on the lifetimes of insulating fluid and ball bearings, due to Nelson (1982) and Caroni (2002), respectively, and a simulated sample are analysed.
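The connection between the conforming rate and the lifetime performance index can be illustrated under an exponential lifetime model (our assumption for this sketch, one of the parametric families such analyses consider): with mean θ, C_L = (μ − L)/σ = 1 − L/θ and η_L = exp(−L/θ), so η_L = exp(C_L − 1), making inference about either quantity equivalent.

```python
import math

def cl_exponential(theta, lifetime_limit):
    """Lifetime performance index C_L = (mu - L) / sigma for an
    exponential lifetime with mean theta (so mu = sigma = theta)."""
    return 1.0 - lifetime_limit / theta

def conforming_rate_from_cl(c_l):
    """Under the exponential model, eta_L = P(X >= L) = exp(-L/theta)
    = exp(C_L - 1): a one-to-one map between the two indices."""
    return math.exp(c_l - 1.0)

theta, limit = 100.0, 30.0   # hypothetical mean lifetime and lower limit
c_l = cl_exponential(theta, limit)     # → 0.7
eta_l = conforming_rate_from_cl(c_l)   # → exp(-0.3)
```

The paper's point is precisely that such a monotone link exists, so hypothesis tests on C_L double as tests on η_L.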
Statistical analysis of lightning electric field measured under Malaysian condition
Salimi, Behnam; Mehranzamir, Kamyar; Abdul-Malek, Zulkurnain
2014-02-01
Lightning is an electrical discharge during thunderstorms that can occur either within clouds (inter-cloud) or between clouds and ground (cloud-ground). Lightning characteristics and their statistical information are the foundation for the design of lightning protection systems as well as for the calculation of lightning radiated fields. Nowadays, there are various techniques to detect lightning signals and to determine the various parameters produced by a lightning flash. Each technique provides its own claimed performance. In this paper, the characteristics of captured broadband electric fields generated by cloud-to-ground lightning discharges in the south of Malaysia are analyzed. A total of 130 cloud-to-ground lightning flashes from 3 separate thunderstorm events (each lasting about 4-5 hours) were examined. Statistical analyses of the following signal parameters are presented: preliminary breakdown pulse train time duration, time interval between preliminary breakdown and return stroke, stroke multiplicity, and the percentage of single-stroke flashes. The BIL model is also introduced to characterize the lightning signature patterns. The statistical analyses show that about 79% of lightning signals fit well with the BIL model. The maximum and minimum preliminary breakdown time durations of the observed lightning signals are 84 ms and 560 µs, respectively. The statistical results show that 7.6% of the flashes were single-stroke flashes, and the maximum number of strokes recorded was 14 strokes per flash. A preliminary breakdown signature can be identified in more than 95% of the flashes.
Statistical complexity without explicit reference to underlying probabilities
Pennini, F.; Plastino, A.
2018-06-01
We show that extremely simple systems of a not too large number of particles can be simultaneously thermally stable and complex. To this end, we extend the notion of statistical complexity to simple configurations of non-interacting particles, without appeal to probabilities, and discuss configurational properties.
CDFTBL: A statistical program for generating cumulative distribution functions from data
International Nuclear Information System (INIS)
Eslinger, P.W.
1991-06-01
This document describes the theory underlying the CDFTBL code and gives details for using the code. The CDFTBL code provides an automated tool for generating a statistical cumulative distribution function that describes a set of field data. The cumulative distribution function is written in the form of a table of probabilities, which can be used in a Monte Carlo computer code. As a specific application, CDFTBL can be used to analyze field data collected for parameters required by the PORMC computer code. Section 2.0 discusses the mathematical basis of the code. Section 3.0 discusses the code structure. Section 4.0 describes the free-format input command language, while Section 5.0 describes in detail the commands to run the program. Section 6.0 provides example program runs, and Section 7.0 provides references. The Appendix provides a program source listing. 11 refs., 2 figs., 19 tabs
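The core idea of such a tool is compact: tabulate the empirical CDF of the field data and draw Monte Carlo samples from it by inverse transform. A sketch of that idea (the CDFTBL command language and PORMC coupling are not reproduced here; the "field data" are synthetic):

```python
import numpy as np

def cdf_table(data):
    """Tabulated empirical CDF: sorted values paired with cumulative
    probabilities i/n, the form a Monte Carlo code can consume."""
    x = np.sort(np.asarray(data, dtype=float))
    p = np.arange(1, len(x) + 1) / len(x)
    return x, p

def sample_from_table(x, p, n, rng):
    """Monte Carlo draws by inverse transform on the tabulated CDF,
    with linear interpolation between table entries."""
    u = rng.uniform(size=n)
    return np.interp(u, p, x)

rng = np.random.default_rng(1)
field_data = rng.normal(10.0, 2.0, size=5000)   # synthetic "field data"
x, p = cdf_table(field_data)
draws = sample_from_table(x, p, 10_000, rng)    # draws mimic the field data
```

The tabulated form decouples the sampling code from any parametric assumption about the data.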
New distributions of the statistical time delay of electrical breakdown in nitrogen
International Nuclear Information System (INIS)
Markovic, V Lj; Gocic, S R; Stamenkovic, S N
2006-01-01
Two new distributions of the statistical time delay of electrical breakdown in nitrogen are reported in this paper. The Gaussian and Gauss-exponential distributions of statistical time delay have been obtained on the basis of thousands of time delay measurements on a gas tube with a plane-parallel electrode system. Distributions of the statistical time delay are theoretically founded on the binomial distribution for the occurrence of initiating electrons and described using simple analytical and numerical models. The shapes of the distributions depend on the electron yields in the interelectrode space originating from residual states. It is shown that the distribution of the statistical time delay changes from exponential and Gauss-exponential to Gaussian due to the influence of residual ionization.
Optimal dividend distribution under Markov regime switching
Jiang, Z.; Pistorius, M.
2012-01-01
We investigate the problem of optimal dividend distribution for a company in the presence of regime shifts. We consider a company whose cumulative net revenues evolve as a Brownian motion with positive drift that is modulated by a finite state Markov chain, and model the discount rate as a
Modeling Malaria Vector Distribution under Climate Change Scenarios in Kenya
Ngaina, J. N.
2017-12-01
Projecting the distribution of malaria vectors under climate change is essential for planning integrated vector control strategies for sustaining elimination and preventing the reintroduction of malaria. However, in Kenya, little knowledge exists on the possible effects of climate change on malaria vectors. Here we assess the potential impact of future climate change on the locally dominant Anopheles vectors, including Anopheles gambiae, Anopheles arabiensis, Anopheles merus, Anopheles funestus, Anopheles pharoensis and Anopheles nili. Environmental data (climate, land cover and elevation) and primary empirical geo-located species-presence data were identified. The principle of maximum entropy (Maxent) was used to model each species' potential distribution area under paleoclimate, current and future climates. The Maxent model was highly accurate, with a statistically significant AUC value. Simulation-based estimates suggest that the environmentally suitable area (ESA) for Anopheles gambiae, An. arabiensis, An. funestus and An. pharoensis would increase under both scenarios for mid-century (2016-2045), but decrease for the end of the century (2071-2100). An increase in the ESA of An. funestus was estimated under the medium stabilizing (RCP4.5) and very heavy (RCP8.5) emission scenarios for mid-century. Our findings can be applied in various ways, such as the identification of additional localities where Anopheles malaria vectors may already exist but have not yet been detected, and the recognition of localities to which they are likely to spread. Moreover, they will help guide future sampling location decisions, help with the planning of vector control suites nationally, and encourage broader research inquiry into vector species niche modeling.
Distribution, Statistics, and Resurfacing of Large Impact Basins on Mercury
Fassett, Caleb I.; Head, James W.; Baker, David M. H.; Chapman, Clark R.; Murchie, Scott L.; Neumann, Gregory A.; Oberst, Juergen; Prockter, Louise M.; Smith, David E.; Solomon, Sean C.;
2012-01-01
The distribution and geological history of large impact basins (diameter D greater than or equal to 300 km) on Mercury are important to understanding the planet's stratigraphy and surface evolution. It is also informative to compare the density of impact basins on Mercury with that of the Moon to understand similarities and differences in their impact crater and basin populations [1, 2]. A variety of impact basins were proposed on the basis of geological mapping with Mariner 10 data [e.g. 3]. This basin population can now be re-assessed and extended to the full planet, using data from the MErcury Surface, Space ENvironment, GEochemistry, and Ranging (MESSENGER) spacecraft. Note that small-to-medium-sized peak-ring basins on Mercury are being examined separately [4, 5]; only the three largest peak-ring basins on Mercury overlap with the size range we consider here. In this study, we (1) re-examine the large basins suggested on the basis of Mariner 10 data, (2) suggest additional basins from MESSENGER's global coverage of Mercury, (3) assess the size-frequency distribution of mercurian basins on the basis of these global observations and compare it to the Moon, and (4) analyze the implications of these observations for the modification history of basins on Mercury.
Charged-particle thermonuclear reaction rates: I. Monte Carlo method and statistical distributions
International Nuclear Information System (INIS)
Longland, R.; Iliadis, C.; Champagne, A.E.; Newton, J.R.; Ugalde, C.; Coc, A.; Fitzgerald, R.
2010-01-01
A method based on Monte Carlo techniques is presented for evaluating thermonuclear reaction rates. We begin by reviewing commonly applied procedures and point out that reaction rates that have been reported up to now in the literature have no rigorous statistical meaning. Subsequently, we associate each nuclear physics quantity entering in the calculation of reaction rates with a specific probability density function, including Gaussian, lognormal and chi-squared distributions. Based on these probability density functions the total reaction rate is randomly sampled many times until the required statistical precision is achieved. This procedure results in a median (Monte Carlo) rate which agrees under certain conditions with the commonly reported recommended 'classical' rate. In addition, we present at each temperature a low rate and a high rate, corresponding to the 0.16 and 0.84 quantiles of the cumulative reaction rate distribution. These quantities are in general different from the statistically meaningless 'minimum' (or 'lower limit') and 'maximum' (or 'upper limit') reaction rates which are commonly reported. Furthermore, we approximate the output reaction rate probability density function by a lognormal distribution and present, at each temperature, the lognormal parameters μ and σ. The values of these quantities will be crucial for future Monte Carlo nucleosynthesis studies. Our new reaction rates, appropriate for bare nuclei in the laboratory, are tabulated in the second paper of this issue (Paper II). The nuclear physics input used to derive our reaction rates is presented in the third paper of this issue (Paper III). In the fourth paper of this issue (Paper IV) we compare our new reaction rates to previous results.
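The sampling scheme can be sketched in a few lines: draw each input quantity from its probability density function, form the rate, and report the 0.50 quantile as the recommended rate and the 0.16/0.84 quantiles as the low and high rates. The inputs below are hypothetical lognormal factors, not the paper's nuclear physics data:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy rate built from two hypothetical lognormal input factors
# (stand-ins for resonance strengths etc.; not the paper's inputs)
n = 100_000
factor_1 = rng.lognormal(mean=0.0, sigma=0.3, size=n)
factor_2 = rng.lognormal(mean=1.0, sigma=0.5, size=n)
rate = factor_1 * factor_2

# Recommended (median) rate and the 0.16 / 0.84 quantile low/high rates
low, median, high = np.quantile(rate, [0.16, 0.50, 0.84])

# Lognormal summary parameters mu and sigma of the output rate
mu = float(np.log(rate).mean())
sigma = float(np.log(rate).std())
```

For this toy product of lognormals the output is itself lognormal, which is why the paper's lognormal approximation of the output rate works well in many cases.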
Distribution-level electricity reliability: Temporal trends using statistical analysis
International Nuclear Information System (INIS)
Eto, Joseph H.; LaCommare, Kristina H.; Larsen, Peter; Todd, Annika; Fisher, Emily
2012-01-01
This paper helps to address the lack of comprehensive, national-scale information on the reliability of the U.S. electric power system by assessing trends in U.S. electricity reliability based on the information reported by electric utilities on power interruptions experienced by their customers. The research analyzes up to 10 years of electricity reliability information collected from 155 U.S. electric utilities, which together account for roughly 50% of total U.S. electricity sales. We find that reported annual average duration and annual average frequency of power interruptions have been increasing over time at a rate of approximately 2% annually. We find that, independent of this trend, installation or upgrade of an automated outage management system is correlated with an increase in the reported annual average duration of power interruptions. We also find that reliance on IEEE Standard 1366-2003 is correlated with higher reported reliability compared to reported reliability not using the IEEE standard. However, we caution that we cannot attribute reliance on the IEEE standard as having caused or led to higher reported reliability because we could not separate the effect of reliance on the IEEE standard from other utility-specific factors that may be correlated with reliance on the IEEE standard. - Highlights: ► We assess trends in electricity reliability based on the information reported by the electric utilities. ► We use rigorous statistical techniques to account for utility-specific differences. ► We find modest declines in reliability analyzing interruption duration and frequency experienced by utility customers. ► Installation or upgrade of an OMS is correlated with an increase in reported duration of power interruptions. ► We find reliance on IEEE Standard 1366 is correlated with higher reported reliability.
Investigating the Statistical Distribution of Learning Coverage in MOOCs
Directory of Open Access Journals (Sweden)
Xiu Li
2017-11-01
Learners participating in Massive Open Online Courses (MOOCs) have a wide range of backgrounds and motivations. Many MOOC learners enroll in the courses to take a brief look; only a few go through the entire content, and even fewer are able to eventually obtain a certificate. We discovered this phenomenon after having examined 92 courses on both the xuetangX and edX platforms. More specifically, we found that the learning coverage in many courses—one of the metrics used to estimate the learners' active engagement with the online courses—follows a Zipf distribution. We apply the maximum likelihood estimation method to fit Zipf's law and test our hypothesis using a chi-square test. In the xuetangX dataset, the learning coverage in 53 of 76 courses fits Zipf's law, while in all 16 courses on the edX platform the learning coverage rejects Zipf's law. The result from our study is expected to bring insight into the unique learning behavior on MOOCs.
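The fitting procedure the authors describe (maximum-likelihood estimation of the Zipf exponent followed by a chi-square goodness-of-fit check) can be sketched as follows; the coverage histogram is invented for illustration:

```python
import numpy as np

def zipf_mle(counts):
    """Grid-search maximum-likelihood estimate of the Zipf exponent s
    for ranked count data: p_k ∝ k^(-s), k = 1..K."""
    counts = np.asarray(counts, dtype=float)
    ranks = np.arange(1, len(counts) + 1)
    best_s, best_ll = None, -np.inf
    for s in np.linspace(0.1, 3.0, 581):          # step 0.005
        p = ranks ** -s
        p /= p.sum()
        ll = float(np.sum(counts * np.log(p)))    # multinomial log-likelihood
        if ll > best_ll:
            best_s, best_ll = s, ll
    return best_s

def zipf_chi2(counts, s):
    """Pearson chi-square statistic of the counts against Zipf(s)."""
    counts = np.asarray(counts, dtype=float)
    ranks = np.arange(1, len(counts) + 1)
    p = ranks ** -s
    p /= p.sum()
    expected = counts.sum() * p
    return float(np.sum((counts - expected) ** 2 / expected))

# Hypothetical learning-coverage histogram, roughly Zipfian with s ≈ 1
counts = [1000, 480, 330, 260, 200, 170, 140, 120, 110, 100]
s_hat = zipf_mle(counts)
chi2 = zipf_chi2(counts, s_hat)   # compare against the chi-square critical value, K-2 dof
```

With K = 10 bins and two quantities fitted (total and exponent), chi2 is compared to the critical value for 8 degrees of freedom (15.51 at the 0.05 level).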
Higher Moments of Underlying Event Distributions
Xu, Zhen
2017-01-01
We perform an Underlying Event analysis for real data sets from pp collisions at center of mass energy $\sqrt{s} = 5$ and 13 TeV and pPb collisions at $\sqrt{s} = 7$ TeV at the LHC, together with Monte Carlo data sets generated with Pythia8 and EPOS under the same conditions. The analysis is focused on the transverse region, which is more sensitive to the Underlying Event, and performed as a function of the leading track transverse momentum $p_t$ in each event. In our work, not only the average underlying event activity but also its fluctuations, namely its root mean square (RMS), skewness and kurtosis, are analyzed. We find that the particle density, energy density and their fluctuation magnitude (RMS) are suppressed at leading $p_t \approx 5$ GeV/c in all these cases, with EPOS showing an evident deviation of 10%-25%. The higher moments skewness and kurtosis decrease rapidly in the low leading $p_t$ region, and follow an interesting Gaussian-like peak centered at leading $p_t \approx 15$ GeV/c.
Derivation of some new distributions in statistical mechanics using maximum entropy approach
Directory of Open Access Journals (Sweden)
Ray Amritansu
2014-01-01
The maximum entropy principle has been used earlier to derive the Bose-Einstein (B.E.), Fermi-Dirac (F.D.) and Intermediate Statistics (I.S.) distributions of statistical mechanics. The central idea of these distributions is to predict the distribution of the microstates, which are the particles of the system, on the basis of the knowledge of some macroscopic data. The latter information is specified in the form of some simple moment constraints. One distribution differs from the other in the way in which the constraints are specified. In the present paper, we have derived some new distributions similar to the B.E. and F.D. distributions of statistical mechanics by using the maximum entropy principle. Some proofs of the B.E. and F.D. distributions are shown, and at the end some new results are discussed.
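As a sketch of the method (the standard textbook derivation, not the paper's new distributions): for bosons, one maximizes the occupation-number entropy subject to the particle-number and energy moment constraints.

```latex
% Standard maximum-entropy derivation of the B.E. occupation numbers
% (textbook material, included for illustration):
S = \sum_i \left[(1+n_i)\ln(1+n_i) - n_i \ln n_i\right],
\qquad \sum_i n_i = N, \qquad \sum_i n_i \varepsilon_i = E .
% Stationarity with Lagrange multipliers \alpha and \beta:
\frac{\partial}{\partial n_i}\Bigl[S - \alpha \sum_j n_j - \beta \sum_j n_j \varepsilon_j\Bigr]
 = \ln\frac{1+n_i}{n_i} - \alpha - \beta \varepsilon_i = 0
\quad\Longrightarrow\quad
n_i = \frac{1}{e^{\alpha + \beta \varepsilon_i} - 1}.
% Replacing (1+n_i) by (1-n_i) in S gives the F.D. result
% n_i = 1 / (e^{\alpha + \beta \varepsilon_i} + 1).
```

Changing how the moment constraints are specified, as the paper does, changes the stationarity condition and hence the resulting distribution.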
International Nuclear Information System (INIS)
Zambra, M.; Favre, M.; Moreno, J.; Wyndham, E.; Chuaqui, H.; Choi, P.
1998-01-01
The charge formation processes in the hollow cathode region (HCR) of a transient hollow cathode discharge have been studied in the final phase. The statistical distributions that describe the different ionization processes have been represented by Gaussian distributions. Nevertheless, a better representation of these distributions was observed when the pressure is near a minimum value, just before breakdown.
Error Distribution Evaluation of the Third Vanishing Point Based on Random Statistical Simulation
Directory of Open Access Journals (Sweden)
C. Li
2012-07-01
POS, integrated by GPS/INS (Inertial Navigation Systems), has allowed rapid and accurate determination of the position and attitude of remote sensing equipment for MMS (Mobile Mapping Systems). However, not only does INS have system error, but it is also very expensive. Therefore, in this paper the error distributions of vanishing points are studied and tested in order to substitute INS for MMS in some special land-based scenes, such as ground façades where usually only two vanishing points can be detected. Thus, the traditional calibration approach based on three orthogonal vanishing points is being challenged. In this article, firstly, the line clusters, which are parallel to each other in object space and correspond to the vanishing points, are detected based on RANSAC (Random Sample Consensus) and a parallelism geometric constraint. Secondly, condition adjustment with parameters is utilized to estimate the nonlinear error equations of two vanishing points (VX, VY). How to set initial weights for the adjustment solution of single-image vanishing points is presented. The vanishing points are solved and their error distributions estimated based on an iteration method with variable weights, the co-factor matrix and error ellipse theory. Thirdly, under the condition of known error ellipses of two vanishing points (VX, VY) and on the basis of the triangle geometric relationship of the three vanishing points, the error distribution of the third vanishing point (VZ) is calculated and evaluated by random statistical simulation, ignoring camera distortion. Moreover, the Monte Carlo methods utilized for random statistical estimation are presented. Finally, experimental results of the vanishing point coordinates and their error distributions are shown and analyzed.
Probabilistic analysis of flaw distribution on structure under cyclic load
International Nuclear Information System (INIS)
Kwak, Sang Log; Choi, Young Hwan; Kim, Hho Jung
2003-01-01
Flaw geometries, applied stress, and material properties are major input variables for fracture mechanics analysis. A probabilistic approach can be applied to account for the uncertainties in these input variables, but probabilistic analysis requires many assumptions due to the lack of initial flaw distribution data. In this study, correlations are examined between initial flaw distributions and in-service flaw distributions on structures under cyclic load. For the analysis, LEFM theory and Monte Carlo simulation are applied. Results show that in-service flaw distributions are determined by the initial flaw distributions rather than by the fatigue crack growth rate, so the initial flaw distribution can be derived from in-service flaw distributions.
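A minimal version of such a simulation: sample an (assumed) initial flaw-depth distribution and grow each flaw with a Paris-law model under cyclic load. All parameter values below are illustrative, not from the study:

```python
import numpy as np

rng = np.random.default_rng(7)

# Illustrative (assumed) Paris-law parameters and loading, not the study's values
C, m = 1e-12, 3.0        # da/dN = C * (delta K)^m, delta K in MPa*sqrt(m)
dsigma = 100.0           # cyclic stress range, MPa
Y = 1.12                 # geometry factor for a shallow surface flaw
n_cycles, block = 100_000, 1_000

# Assumed lognormal initial flaw-depth distribution (meters)
a0 = rng.lognormal(mean=np.log(1e-3), sigma=0.3, size=2000)
a = a0.copy()

# Grow each sampled flaw in blocks of cycles (explicit Euler on the Paris law)
for _ in range(n_cycles // block):
    dK = Y * dsigma * np.sqrt(np.pi * a)   # stress intensity factor range
    a = a + block * C * dK ** m

# With these parameters growth is slow, so the in-service distribution
# largely mirrors the initial one, consistent with the study's conclusion.
```

Because the growth map is strictly increasing in flaw depth, the ordering of flaws is preserved, which is the mechanism behind inferring initial distributions from in-service inspections.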
Ipsen, Andreas
2015-02-03
Despite the widespread use of mass spectrometry (MS) in a broad range of disciplines, the nature of MS data remains very poorly understood, and this places important constraints on the quality of MS data analysis as well as on the effectiveness of MS instrument design. In the following, a procedure for calculating the statistical distribution of the mass peak intensity for MS instruments that use analog-to-digital converters (ADCs) and electron multipliers is presented. It is demonstrated that the physical processes underlying the data-generation process, from the generation of the ions to the signal induced at the detector, and on to the digitization of the resulting voltage pulse, result in data that can be well-approximated by a Gaussian distribution whose mean and variance are determined by physically meaningful instrumental parameters. This allows for a very precise understanding of the signal-to-noise ratio of mass peak intensities and suggests novel ways of improving it. Moreover, it is a prerequisite for being able to address virtually all data analytical problems in downstream analyses in a statistically rigorous manner. The model is validated with experimental data.
International Nuclear Information System (INIS)
Gorokhovski, M A; Saveliev, V L
2008-01-01
This paper analyses statistical universalities that arise over time during constant-frequency fragmentation under scaling symmetry. The explicit expression of the particle-size distribution obtained from the evolution kinetic equation shows that, with increasing time, the initial distribution tends to the ultimate steady-state delta function through at least two intermediate universal asymptotics. The earlier asymptotic is the well-known log-normal distribution of Kolmogorov (1941 Dokl. Akad. Nauk. SSSR 31 99-101). This distribution is the first universality and has two parameters: the first and the second logarithmic moments of the fragmentation intensity spectrum. The later asymptotic is a power function (a stronger universality) with a single parameter that is given by the ratio of the first two logarithmic moments. At large times, the first universality implies that the evolution equation can be reduced exactly to the Fokker-Planck equation instead of making the widely used but inconsistent assumption about the smallness of higher than second order moments. At even larger times, the second universality shows evolution towards a fractal state with dimension identified as a measure of the fracture resistance of the medium.
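The Kolmogorov log-normal asymptotic is easy to demonstrate numerically: under repeated multiplicative breakup, the logarithm of particle size is a sum of i.i.d. terms, so by the central limit theorem the size distribution approaches a lognormal. A toy simulation with an assumed uniform breakup spectrum:

```python
import numpy as np

rng = np.random.default_rng(3)

# Each fragmentation event multiplies a particle's size by a random
# factor in (0.2, 1.0) (an assumed breakup spectrum); in log space the
# size is then a sum of i.i.d. terms, so the CLT drives it lognormal.
n_particles, n_events = 50_000, 40
sizes = np.ones(n_particles)
for _ in range(n_events):
    sizes *= rng.uniform(0.2, 1.0, size=n_particles)

log_sizes = np.log(sizes)
# Skewness of the log-sizes: zero for an exact lognormal
skew = float(((log_sizes - log_sizes.mean()) ** 3).mean() / log_sizes.std() ** 3)
```

The two lognormal parameters here are set by the first and second logarithmic moments of the breakup factor, mirroring the paper's parametrization by the logarithmic moments of the fragmentation intensity spectrum.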
International Nuclear Information System (INIS)
Gupta, S.S.; Panchapakesan, S.
1975-01-01
A quantile selection procedure in reliability problems pertaining to a restricted family of probability distributions is discussed. This family is assumed to be star-ordered with respect to the standard normal distribution folded at the origin. Motivation for this formulation of the problem is described. Both exact and asymptotic results dealing with the distribution of the maximum of ratios of order statistics from such a family are obtained and tables of the appropriate constants, percentiles of this statistic, are given in order to facilitate the use of the selection procedure
Project management under uncertainty beyond beta: The generalized bicubic distribution
Directory of Open Access Journals (Sweden)
José García Pérez
2016-01-01
The beta distribution has traditionally been employed in the PERT methodology and is generally used for modeling bounded continuous random variables based on expert judgment. The impossibility of estimating four parameters from the three values provided by the expert when the beta distribution is assumed to be the underlying distribution has been widely debated. This paper presents the generalized bicubic distribution as a good alternative to the beta distribution since, when the variance depends on the mode, the generalized bicubic distribution approximates the kurtosis of the Gaussian distribution better than the beta distribution does. In addition, this distribution has good properties in the PERT methodology with respect to the moderation and conservatism criteria. Two empirical applications are presented to demonstrate the adequacy of this new distribution.
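For context, the classical three-point PERT moment estimates that the beta distribution (and here its proposed alternative) is meant to underpin are simple to state. This sketch shows only the textbook beta-PERT formulas, not the generalized bicubic distribution itself:

```python
def pert_moments(a, m, b):
    """Classical beta-PERT moment estimates from the expert's three values:
    minimum a, most likely m, maximum b."""
    mean = (a + 4.0 * m + b) / 6.0
    variance = ((b - a) / 6.0) ** 2
    return mean, variance

# Hypothetical activity-duration estimate: optimistic 2, most likely 5, pessimistic 14
mean, variance = pert_moments(2.0, 5.0, 14.0)   # → mean 6.0, variance 4.0
```

The debate the abstract refers to arises because these two moments plus the (a, m, b) constraints over-determine the four beta parameters, forcing ad hoc restrictions.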
Nguyen, Hung T; Wu, Berlin; Xiang, Gang
2012-01-01
In many practical situations, we are interested in statistics characterizing a population of objects: e.g. in the mean height of people from a certain area. Most algorithms for estimating such statistics assume that the sample values are exact. In practice, sample values come from measurements, and measurements are never absolutely accurate. Sometimes, we know the exact probability distribution of the measurement inaccuracy, but often, we only know the upper bound on this inaccuracy. In this case, we have interval uncertainty: e.g. if the measured value is 1.0, and inaccuracy is bounded by 0.1, then the actual (unknown) value of the quantity can be anywhere between 1.0 - 0.1 = 0.9 and 1.0 + 0.1 = 1.1. In other cases, the values are expert estimates, and we only have fuzzy information about the estimation inaccuracy. This book shows how to compute statistics under such interval and fuzzy uncertainty. The resulting methods are applied to computer science (optimal scheduling of different processors), to in...
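A minimal stdlib sketch of the book's central idea: statistics computed from interval data yield interval answers, not point answers. The mean bounds are exact; for the variance, endpoint enumeration gives the exact maximum (the variance is convex in each coordinate), while the reported minimum is only an upper bound on the true minimum, whose optimizer may lie inside the box:

```python
from itertools import product

def interval_mean(intervals):
    """Exact bounds on the sample mean when each observation is only known
    to lie in an interval [lo, hi]."""
    n = len(intervals)
    return (sum(a for a, _ in intervals) / n,
            sum(b for _, b in intervals) / n)

def interval_variance(intervals):
    """Bound the population variance by enumerating all 2^n endpoint
    combinations -- feasible only for tiny n (the general problem is NP-hard)."""
    lo, hi = float("inf"), float("-inf")
    for combo in product(*intervals):
        m = sum(combo) / len(combo)
        v = sum((x - m) ** 2 for x in combo) / len(combo)
        lo, hi = min(lo, v), max(hi, v)
    return lo, hi

# Hypothetical readings of the form measurement ± error bound
data = [(0.9, 1.1), (1.4, 1.6), (2.0, 2.4)]
m_lo, m_hi = interval_mean(data)
v_lo, v_hi = interval_variance(data)
```

The gap between efficient interval-mean computation and the combinatorial hardness of interval-variance computation is precisely the kind of issue the book's algorithms address.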
International Nuclear Information System (INIS)
Huang Zhifu; Lin Bihong; Chen Jincan
2009-01-01
In order to overcome the limitations of the original expression of the probability distribution appearing in the literature on incomplete statistics, a new expression of the probability distribution is derived, in which the Lagrange multiplier β introduced here is proved to be identical with the one introduced in the second and third choices for the internal energy constraint in Tsallis' statistics, and to be exactly the physical inverse temperature. It is shown that the probability distribution described by the new expression is invariant under uniform translation of the energy spectrum. Moreover, several fundamental thermodynamic relations are given, and the relationship between the new and the original expressions of the probability distribution is discussed.
New statistical function for the angular distribution of evaporation residues produced by heavy ions
International Nuclear Information System (INIS)
Rigol, J.
1994-01-01
A new statistical function has been found for modelling the angular distribution of evaporation residues produced by heavy ions. Experimental results are compared with the calculated ones. 11 refs.; 4 figs. (author)
Application of Maximum Entropy Distribution to the Statistical Properties of Wave Groups
Institute of Scientific and Technical Information of China (English)
Anonymous
2007-01-01
New distributions for the statistics of wave groups, based on the maximum entropy principle, are presented. Maximum entropy distributions appear to be superior to conventional distributions when applied to a limited amount of information. Applications to wave group properties show the effectiveness of the maximum entropy distribution. An FFT filtering method is employed to obtain the wave envelope quickly and efficiently. Comparisons of both the maximum entropy distribution and the distribution of Longuet-Higgins (1984) with laboratory wind-wave data show that the former gives a better fit.
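The FFT-based envelope extraction mentioned in the abstract is conventionally done via the analytic signal: zero the negative-frequency half of the spectrum, double the positive half, and take the modulus. A stdlib-only sketch with a naive O(N²) DFT (a real implementation would use an FFT library); the test signal is an invented amplitude-modulated carrier:

```python
import cmath
import math

def dft(x):
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * math.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

def envelope(x):
    """Wave envelope via the analytic signal: keep DC and Nyquist, double the
    positive frequencies, zero the negative ones, then take the modulus."""
    N = len(x)
    X = dft(x)
    H = [0j] * N
    H[0] = X[0]
    for k in range(1, N // 2):
        H[k] = 2 * X[k]
    H[N // 2] = X[N // 2]
    return [abs(z) for z in idft(H)]

# Amplitude-modulated carrier whose true envelope is 1 + 0.5*cos(2*pi*n/N)
N = 64
sig = [(1 + 0.5 * math.cos(2 * math.pi * n / N)) * math.cos(2 * math.pi * 8 * n / N)
       for n in range(N)]
env = envelope(sig)
max_err = max(abs(env[n] - (1 + 0.5 * math.cos(2 * math.pi * n / N)))
              for n in range(N))
```

Because the carrier and sidebands here fall on exact DFT bins, the recovered envelope matches the true modulation up to floating-point error.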
Bellera, Carine A.; Julien, Marilyse; Hanley, James A.
2010-01-01
The Wilcoxon statistics are usually taught as nonparametric alternatives for the 1- and 2-sample Student-"t" statistics in situations where the data appear to arise from non-normal distributions, or where sample sizes are so small that we cannot check whether they do. In the past, critical values, based on exact tail areas, were…
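For very small samples, the exact null distribution of the rank-sum statistic can be enumerated directly, which is how the exact tail areas behind those historical critical-value tables were computed. A stdlib sketch (assumes no ties; the sample values are invented):

```python
from itertools import combinations

def wilcoxon_rank_sum(x, y):
    """Wilcoxon rank-sum statistic W: sum of the ranks of sample x in the
    pooled sorted data (assumes no ties)."""
    pooled = sorted(x + y)
    rank = {v: i + 1 for i, v in enumerate(pooled)}
    return sum(rank[v] for v in x)

def exact_two_sided_p(x, y):
    """Exact two-sided p-value by enumerating all C(n+m, n) equally likely
    rank assignments -- feasible exactly where normal approximations fail."""
    n, m = len(x), len(y)
    w_obs = wilcoxon_rank_sum(x, y)
    center = n * (n + m + 1) / 2.0          # E[W] under the null
    ws = [sum(c) for c in combinations(range(1, n + m + 1), n)]
    extreme = sum(abs(w - center) >= abs(w_obs - center) for w in ws)
    return w_obs, extreme / len(ws)

w, p = exact_two_sided_p([1.1, 2.3, 2.9], [3.6, 4.2, 5.0, 5.8])
# w = 6 (the minimum possible); p = 2/35, since only the two most extreme
# of the 35 assignments are at least this far from E[W] = 12
```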
Improving Statistics Education through Simulations: The Case of the Sampling Distribution.
Earley, Mark A.
This paper presents a summary of action research investigating statistics students' understandings of the sampling distribution of the mean. With four sections of an introductory Statistics in Education course (n=98 students), a computer simulation activity (R. delMas, J. Garfield, and B. Chance, 1999) was implemented and evaluated to show…
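The kind of simulation activity described above is easy to reproduce: draw many samples from a skewed population and watch the distribution of sample means cluster around the population mean with standard error σ/√n. A small illustrative sketch (population and sizes are invented):

```python
import random
import statistics

random.seed(0)

def sampling_distribution(population, n, reps):
    """Empirical sampling distribution of the mean: draw `reps` samples of
    size n (with replacement) and record each sample mean."""
    return [statistics.mean(random.choices(population, k=n)) for _ in range(reps)]

# A right-skewed (exponential-like) population; the CLT predicts the sample
# means are roughly normal around mu with standard error sigma / sqrt(n)
pop = [random.expovariate(1.0) for _ in range(10000)]
mu, sigma = statistics.mean(pop), statistics.pstdev(pop)
means = sampling_distribution(pop, 36, 2000)
center = statistics.mean(means)
se = statistics.pstdev(means)
```

Students typically find the near-normality of `means`, despite the skewed population, the most persuasive part of such a demonstration.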
STATISTICAL DISTRIBUTION PATTERNS IN MECHANICAL AND FATIGUE PROPERTIES OF METALLIC MATERIALS
Tatsuo, SAKAI; Masaki, NAKAJIMA; Keiro, TOKAJI; Norihiko, HASEGAWA; Department of Mechanical Engineering, Ritsumeikan University; Department of Mechanical Engineering, Toyota College of Technology; Department of Mechanical Engineering, Gifu University; Department of Mechanical Engineering, Gifu University
1997-01-01
Many papers on the statistical aspects of materials strength have been collected and reviewed by The Research Group for Statistical Aspects of Materials Strength. A book, "Statistical Aspects of Materials Strength", was written by this group and published in 1992. Based on the experimental data compiled in this book, distribution patterns of mechanical properties are systematically surveyed, paying attention to metallic materials. Thus one can obtain the fundamental knowledge for a reliabilit...
International Nuclear Information System (INIS)
Yu, J.; Sommer, W.F.; Bradbury, J.N.
1986-01-01
Microstructural evolution in metals under particle irradiation is described by a non-equilibrium statistics method. This method gives a set of equations for the evolution of bubbles and an approximate solution for a distribution function of bubble size as a function of fluence and temperature. The distribution function gives the number of bubbles of radius r at time t, N(r,t)dr, as a function of size, r/r₀ (r₀ is the radius of a bubble nucleus). It is found that N(r,t)dr increases with fluence. Also, the peak value of N(r,t)dr shifts to higher r/r₀ with increasing fluence. Nucleation depends mainly on helium concentration and defect cluster concentration, while bubble growth is controlled mainly by the vacancy concentration and a fluctuation coefficient. If suitable material parameters are chosen, a reasonable distribution function for bubble size is obtained. The helium diffusion coefficient is found to be less than that for vacancies by five orders of magnitude. The fraction of helium remaining in the matrix is less than 10⁻²; the majority of the helium is associated with the bubbles
Zhang, Yu; Li, Fei; Zhang, Shengkai; Zhu, Tingting
2017-04-01
Synthetic Aperture Radar (SAR) is important for polar remote sensing since it provides continuous observations in all illumination and weather conditions. SAR can be used to extract surface roughness information, characterized by the variance of dielectric properties across different polarization channels, which makes it possible to observe different ice types and surface structure for deformation analysis. In November 2016, the 33rd Chinese National Antarctic Research Expedition (CHINARE) cruise set sail into the Antarctic sea ice zone. An accurate spatial distribution of leads in the sea ice zone is essential for routine planning of ship navigation. In this study, the semantic relationship between leads and sea ice categories is described by a Conditional Random Field (CRF) model, and lead characteristics are modeled by statistical distributions in SAR imagery. In the proposed algorithm, a mixture-statistical-distribution-based CRF is developed that considers contextual information and the statistical characteristics of sea ice to improve lead detection in Sentinel-1A dual-polarization SAR imagery. The unary and pairwise potentials in the CRF model are constructed by integrating the posterior probabilities estimated from the statistical distributions. For parameter estimation, the Method of Logarithmic Cumulants (MoLC) is used for single statistical distributions, and an iterative Expectation Maximization (EM) algorithm computes the parameters of the mixture-distribution-based CRF model. In posterior probability inference, graph-cut energy minimization is adopted for initial lead detection. Post-processing procedures, including an aspect-ratio constraint and spatial smoothing, are used to improve the visual result. The proposed method is validated on Sentinel-1A SAR C-band Extra Wide Swath (EW) Ground Range Detected (GRD) imagery with a
The Effect of Distributed Practice in Undergraduate Statistics Homework Sets: A Randomized Trial
Crissinger, Bryan R.
2015-01-01
Most homework sets in statistics courses are constructed so that students concentrate or "mass" their practice on a certain topic in one problem set. Distributed practice homework sets include review problems in each set so that practice on a topic is distributed across problem sets. There is a body of research that points to the…
Current state of the art for statistical modeling of species distributions [Chapter 16
Troy M. Hegel; Samuel A. Cushman; Jeffrey Evans; Falk Huettmann
2010-01-01
Over the past decade the number of statistical modelling tools available to ecologists to model species' distributions has increased at a rapid pace (e.g. Elith et al. 2006; Austin 2007), as have the number of species distribution models (SDM) published in the literature (e.g. Scott et al. 2002). Ten years ago, basic logistic regression (Hosmer and Lemeshow 2000)...
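The "basic logistic regression" cited above as the decade-old baseline for species distribution modeling can be sketched from scratch. The single 'temperature' covariate and the synthetic presence/absence data below are invented for illustration:

```python
import math
import random

random.seed(1)

def fit_logistic(X, y, lr=0.5, epochs=300):
    """Plain logistic regression by batch gradient descent:
    P(presence) = 1 / (1 + exp(-(w0 + w·x)))."""
    n, d = len(X), len(X[0])
    w = [0.0] * (d + 1)
    for _ in range(epochs):
        grad = [0.0] * (d + 1)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1.0 / (1.0 + math.exp(-z))
            err = p - yi
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj - lr * g / n for wj, g in zip(w, grad)]
    return w

# Synthetic presence/absence records: occupancy probability rises with a
# hypothetical 'temperature' covariate (true intercept 0.5, true slope 2.0)
X = [[random.uniform(-2.0, 2.0)] for _ in range(400)]
y = [1 if random.random() < 1.0 / (1.0 + math.exp(-(0.5 + 2.0 * x[0]))) else 0
     for x in X]
w = fit_logistic(X, y)
```

Fitting recovers a clearly positive slope for the covariate; the newer SDM tools the chapter surveys (GAMs, boosted trees, MaxEnt and the like) relax exactly this linear-in-the-link assumption.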
Comparing simulated and theoretical sampling distributions of the U3 person-fit statistic
Emons, W.H.M.; Meijer, R.R.; Sijtsma, K.
2002-01-01
The accuracy with which the theoretical sampling distribution of van der Flier's person-fit statistic U3 approaches the empirical U3 sampling distribution is affected by the item discrimination. A simulation study showed that for tests with a moderate or a strong mean item discrimination, the Type I
Statistical Characterization of 18650-Format Lithium-Ion Cell Thermal Runaway Energy Distributions
Walker, William Q.; Rickman, Steven; Darst, John; Finegan, Donal; Bayles, Gary; Darcy, Eric
2017-01-01
Effective thermal management systems, designed to handle the impacts of thermal runaway (TR) and to prevent cell-to-cell propagation, are key to safe operation of lithium-ion (Li-ion) battery assemblies. Critical factors for optimizing these systems include the total energy released during a single-cell TR event and the fraction of the total energy that is released through the cell casing vs. through the ejecta material. A unique calorimeter was utilized to examine the TR behavior of a statistically significant number of 18650-format Li-ion cells with varying manufacturers, chemistries, and capacities. The calorimeter was designed to contain the TR energy in a format conducive to discerning the fractions of energy released through the cell casing vs. through the ejecta material. Other benefits of this calorimeter included the ability to rapidly test large quantities of cells and the intentional minimization of secondary combustion effects. High energy (270 Wh/kg) and moderate energy (200 Wh/kg) 18650 cells were tested. Some of the cells had an embedded short circuit (ISC) device installed to aid in the examination of TR mechanisms under more realistic conditions. Other variations included cells with bottom vent (BV) features and cells with thin casings (0.22 µm). After combining the data gathered with the calorimeter, a statistical approach was used to examine the probability of certain TR behavior, and the associated energy distributions, as a function of capacity, venting features, cell casing thickness and temperature.
The exact probability distribution of the rank product statistics for replicated experiments.
Eisinga, Rob; Breitling, Rainer; Heskes, Tom
2013-03-18
The rank product method is a widely accepted technique for detecting differentially regulated genes in replicated microarray experiments. To approximate the sampling distribution of the rank product statistic, the original publication proposed a permutation approach, whereas recently an alternative approximation based on the continuous gamma distribution was suggested. However, both approximations are imperfect for estimating small tail probabilities. In this paper we relate the rank product statistic to number theory and provide a derivation of its exact probability distribution and the true tail probabilities. Copyright © 2013 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.
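The rank product statistic itself is one line, and the permutation approach the original publication proposed is a short loop. The sketch below (uniform-rank null, invented gene ranks) also illustrates the paper's point: with feasible permutation counts, the estimated p-value for a strongly ranked gene collapses to zero, which is why small tail probabilities need the exact distribution:

```python
import math
import random

random.seed(7)

def rank_product(ranks):
    """Rank product: geometric mean of one gene's ranks across k replicates."""
    return math.prod(ranks) ** (1.0 / len(ranks))

def permutation_pvalue(observed_ranks, n_genes, n_perm=20000):
    """Permutation approximation of P(RP <= observed) under the null that
    each replicate's rank is (approximately) uniform on 1..n_genes."""
    obs = rank_product(observed_ranks)
    k = len(observed_ranks)
    hits = sum(
        rank_product([random.randint(1, n_genes) for _ in range(k)]) <= obs
        for _ in range(n_perm)
    )
    return hits / n_perm

# A gene ranked 2nd, 1st and 3rd among 1000 genes in three replicates
p = permutation_pvalue([2, 1, 3], 1000)
```

The true tail probability here is on the order of 10⁻⁸ (only rank triples with product ≤ 6 qualify), so 20,000 permutations almost always report p = 0: exactly the imperfection for small tail probabilities that motivates the exact derivation.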
A spatial scan statistic for survival data based on Weibull distribution.
Bhatt, Vijaya; Tiwari, Neeraj
2014-05-20
The spatial scan statistic has been developed as a geographical cluster detection analysis tool for different types of data sets such as Bernoulli, Poisson, ordinal, normal and exponential. We propose a scan statistic for survival data based on Weibull distribution. It may also be used for other survival distributions, such as exponential, gamma, and log normal. The proposed method is applied on the survival data of tuberculosis patients for the years 2004-2005 in Nainital district of Uttarakhand, India. Simulation studies reveal that the proposed method performs well for different survival distribution functions. Copyright © 2013 John Wiley & Sons, Ltd.
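Scan statistics of this kind repeatedly fit the survival distribution inside and outside candidate geographic windows, so the core ingredient is a Weibull maximum-likelihood fit. A stdlib-only sketch for uncensored times on simulated data (a real scan statistic would additionally handle censoring and scan over spatial windows):

```python
import math
import random

random.seed(3)

def weibull_mle(times):
    """Maximum-likelihood Weibull(shape k, scale lam) fit for uncensored
    survival times: solve the profile likelihood equation for k by bisection
    (the left-hand side is increasing in k), then recover lam."""
    logs = [math.log(t) for t in times]
    mean_log = sum(logs) / len(times)

    def g(k):
        tk = [t ** k for t in times]
        return sum(w * l for w, l in zip(tk, logs)) / sum(tk) - 1.0 / k - mean_log

    lo, hi = 0.01, 50.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if g(mid) > 0.0:
            hi = mid
        else:
            lo = mid
    k = 0.5 * (lo + hi)
    lam = (sum(t ** k for t in times) / len(times)) ** (1.0 / k)
    return k, lam

# Simulated uncensored survival times from Weibull(shape 1.5, scale 2.0)
data = [random.weibullvariate(2.0, 1.5) for _ in range(3000)]
k_hat, lam_hat = weibull_mle(data)
```

Because exponential, and approximately gamma and lognormal, hazards can be mimicked by the Weibull's shape parameter, this one fit is what lets the proposed scan statistic serve several survival families.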
Comment on the asymptotics of a distribution-free goodness of fit test statistic.
Browne, Michael W; Shapiro, Alexander
2015-03-01
In a recent article Jennrich and Satorra (Psychometrika 78: 545-552, 2013) showed that a proof by Browne (British Journal of Mathematical and Statistical Psychology 37: 62-83, 1984) of the asymptotic distribution of a goodness of fit test statistic is incomplete because it fails to prove that the orthogonal component function employed is continuous. Jennrich and Satorra (Psychometrika 78: 545-552, 2013) showed how Browne's proof can be completed satisfactorily but this required the development of an extensive and mathematically sophisticated framework for continuous orthogonal component functions. This short note provides a simple proof of the asymptotic distribution of Browne's (British Journal of Mathematical and Statistical Psychology 37: 62-83, 1984) test statistic by using an equivalent form of the statistic that does not involve orthogonal component functions and consequently avoids all complicating issues associated with them.
DEFF Research Database (Denmark)
Hansen, Kurt Schaldemose
2007-01-01
The statistical distribution of extreme wind excursions above a mean level, for a specified recurrence period, is of crucial importance in relation to the design of wind-sensitive structures. This is particularly true for wind turbine structures. Based on an assumption of a Gaussian "mother" distribution, Cartwright and Longuet-Higgins [1] derived an asymptotic expression for the distribution of the largest excursion from the mean level during an arbitrary recurrence period. From its inception, this celebrated expression has been widely used in wind engineering (as well as in off-shore engineering) ... associated with large excursions from the mean [2]. Thus, the more extreme turbulence excursions (i.e. the upper tail of the turbulence PDF) seem to follow an Exponential-like distribution rather than a Gaussian distribution, and a Gaussian estimate may under-predict the probability of large turbulence
Nishino, Ko; Lombardi, Stephen
2011-01-01
We introduce a novel parametric bidirectional reflectance distribution function (BRDF) model that can accurately encode a wide variety of real-world isotropic BRDFs with a small number of parameters. The key observation we make is that a BRDF may be viewed as a statistical distribution on a unit hemisphere. We derive a novel directional statistics distribution, which we refer to as the hemispherical exponential power distribution, and model real-world isotropic BRDFs as mixtures of it. We derive a canonical probabilistic method for estimating the parameters, including the number of components, of this novel directional statistics BRDF model. We show that the model captures the full spectrum of real-world isotropic BRDFs with high accuracy, but a small footprint. We also demonstrate the advantages of the novel BRDF model by showing its use for reflection component separation and for exploring the space of isotropic BRDFs.
Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.
Statistical distribution of blood serotonin as a predictor of early autistic brain abnormalities
Directory of Open Access Journals (Sweden)
Janušonis Skirmantas
2005-07-01
Background: A wide range of abnormalities has been reported in autistic brains, but these abnormalities may be the result of an earlier underlying developmental alteration that may no longer be evident by the time autism is diagnosed. The most consistent biological finding in autistic individuals has been their statistically elevated levels of 5-hydroxytryptamine (5-HT, serotonin) in blood platelets (platelet hyperserotonemia). The early developmental alteration of the autistic brain and the autistic platelet hyperserotonemia may be caused by the same biological factor expressed in the brain and outside the brain, respectively. Unlike the brain, blood platelets are short-lived and continue to be produced throughout the life span, suggesting that this factor may continue to operate outside the brain years after the brain is formed. The statistical distributions of the platelet 5-HT levels in normal and autistic groups have characteristic features and may contain information about the nature of this yet unidentified factor. Results: The identity of this factor was studied by using a novel, quantitative approach that was applied to published distributions of the platelet 5-HT levels in normal and autistic groups. It was shown that the published data are consistent with the hypothesis that a factor that interferes with brain development in autism may also regulate the release of 5-HT from gut enterochromaffin cells. Numerical analysis revealed that this factor may be non-functional in autistic individuals. Conclusion: At least some biological factors, the abnormal function of which leads to the development of the autistic brain, may regulate the release of 5-HT from the gut years after birth. If the present model is correct, it will allow future efforts to be focused on a limited number of gene candidates, some of which have not been suspected to be involved in autism (such as the 5-HT4 receptor gene) based on currently available clinical and
The κ parameter and κ-distribution in κ-deformed statistics for the systems in an external field
International Nuclear Information System (INIS)
Guo, Lina; Du, Jiulin
2007-01-01
It is a natural and important question to ask under what physical situation the κ-deformed statistics is suitable for the statistical description of a system, and what the κ parameter stands for. In this Letter, a formula for the κ parameter is derived on the basis of the κ-H theorem, the κ-velocity distribution and the generalized Boltzmann equation in the framework of κ-deformed statistics. We thus obtain a physical interpretation for the parameter κ ≠ 0 with regard to the temperature gradient and the external force field. We show that, like the q-statistics based on Tsallis entropy, the κ-deformed statistics may also be a candidate suitable for the statistical description of systems in external fields when they are in a nonequilibrium stationary state, but it has different physical characteristics. Namely, the κ-distribution is found to describe the nonequilibrium stationary state of a system where the external force is perpendicular to the temperature gradient
Handbook of tables for order statistics from lognormal distributions with applications
Balakrishnan, N
1999-01-01
Lognormal distributions are one of the most commonly studied models in the statistical literature while being most frequently used in the applied literature. The lognormal distributions have been used in problems arising from such diverse fields as hydrology, biology, communication engineering, environmental science, reliability, agriculture, medical science, mechanical engineering, material science, and pharmacology. Though the lognormal distributions have been around from the beginning of this century (see Chapter 1), much of the work concerning inferential methods for the parameters of lognormal distributions has been done in the recent past. Most of these methods of inference, particularly those based on censored samples, involve extensive use of numerical methods to solve some nonlinear equations. Order statistics and their moments have been discussed quite extensively in the literature for many distributions. It is very well known that the moments of order statistics can be derived explicitly only...
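Where explicit moment formulas are unavailable, order-statistic moments of the lognormal can at least be approximated by simulation; a quick Monte-Carlo sketch (illustrative sample size, not the handbook's tabulation method):

```python
import math
import random
import statistics

random.seed(5)

def order_statistic_means(n, reps, mu=0.0, sigma=1.0):
    """Monte-Carlo estimates of E[X_(1)], ..., E[X_(n)] for the order
    statistics of a lognormal(mu, sigma) sample of size n."""
    sums = [0.0] * n
    for _ in range(reps):
        sample = sorted(random.lognormvariate(mu, sigma) for _ in range(n))
        for i, v in enumerate(sample):
            sums[i] += v
    return [s / reps for s in sums]

means = order_statistic_means(5, 20000)
# The order-statistic means average back to the population mean E[X] = e^{1/2}
grand_mean = statistics.mean(means)
```

Tables like those in the handbook replace this noisy estimate with exact quadrature, which matters for the censored-sample inference the preface describes.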
Hardpan and maize root distribution under conservation and ...
African Journals Online (AJOL)
Hardpan and maize root distribution under conservation and conventional tillage in agro-ecological zone IIa, Zambia. ... There is no scientific basis for the recommendation given to farmers by agricultural extension workers to “break the hardpan” in fields under manual or animal tillage in the study areas. Key Words: Soil ...
International Nuclear Information System (INIS)
2005-01-01
For the years 2004 and 2005 the figures shown in the tables of Energy Review are partly preliminary. The annual statistics published in Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time-series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004). The applied energy units and conversion coefficients are shown in the back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO₂ emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes, precautionary stock fees and oil pollution fees
International Nuclear Information System (INIS)
2001-01-01
For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO₂ emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products
International Nuclear Information System (INIS)
2000-01-01
For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO₂ emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products
International Nuclear Information System (INIS)
1999-01-01
For the years 1998 and 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO₂ emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products
Zhang, Y.; Li, F.; Zhang, S.; Hao, W.; Zhu, T.; Yuan, L.; Xiao, F.
2017-09-01
In this paper, a Statistical Distribution based Conditional Random Fields (STA-CRF) algorithm is exploited to improve marginal ice-water classification. Pixel-level ice concentration is presented for the comparison of CRF-based methods. Furthermore, in order to identify an effective statistical distribution model to integrate into STA-CRF, five statistical distribution models are investigated. The STA-CRF methods are tested on two scenes around Prydz Bay and the Adélie Depression, which contain a variety of ice types during the melt season. Experimental results indicate that the proposed method resolves the sea ice edge well in the Marginal Ice Zone (MIZ) and shows a robust distinction between ice and water.
Forecasting Value-at-Risk under Different Distributional Assumptions
Directory of Open Access Journals (Sweden)
Manuela Braione
2016-01-01
Financial asset returns are known to be conditionally heteroskedastic and generally non-normally distributed: fat-tailed and often skewed. These features must be taken into account to produce accurate forecasts of Value-at-Risk (VaR). We provide a comprehensive look at the problem by considering the impact that different distributional assumptions have on the accuracy of both univariate and multivariate GARCH models in out-of-sample VaR prediction. The set of analyzed distributions comprises the normal, Student-t, Multivariate Exponential Power and their corresponding skewed counterparts. The accuracy of the VaR forecasts is assessed by implementing standard statistical backtesting procedures used to rank the different specifications. The results show the importance of allowing for heavy tails and skewness in the distributional assumption, with the skew-Student outperforming the others across all tests and confidence levels.
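The practical consequence of the distributional assumption is easy to demonstrate: for fat-tailed returns, a moment-matched Gaussian VaR sits below the empirical tail quantile. A small sketch with simulated Student-t returns (illustrative parameters, not the paper's GARCH models):

```python
import random
import statistics

random.seed(11)

def student_t(df):
    """Student-t draw via Z / sqrt(V/df), with V a sum of df squared normals."""
    z = random.gauss(0.0, 1.0)
    v = sum(random.gauss(0.0, 1.0) ** 2 for _ in range(df))
    return z / (v / df) ** 0.5

# Fat-tailed daily returns: 1% scale, t with 4 degrees of freedom
returns = [0.01 * student_t(4) for _ in range(50000)]

alpha = 0.01
# Historical 1% VaR: the empirical lower-tail quantile, reported as a positive loss
hist_var = -sorted(returns)[int(alpha * len(returns))]
# Gaussian parametric VaR with moment-matched mean and standard deviation
mu = statistics.mean(returns)
sd = statistics.stdev(returns)
norm_var = -(mu + statistics.NormalDist().inv_cdf(alpha) * sd)
```

The gap between `hist_var` and `norm_var` is the under-prediction of tail risk that the backtesting procedures in the paper are designed to penalize.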
Huang, N. E.; Long, S. R.
1980-01-01
Laboratory experiments were performed to measure the surface elevation probability density function and associated statistical properties for a wind-generated wave field. The laboratory data were compared with some limited field data. The statistical properties of the surface elevation were processed for comparison with the results derived from the Longuet-Higgins (1963) theory. It is found that, even for the highly non-Gaussian cases, the distribution function proposed by Longuet-Higgins still gives good approximations.
Directory of Open Access Journals (Sweden)
María Gabriela Mago Ramos
2012-05-01
A methodology was developed for analysing faults in distribution transformers using the statistical package for the social sciences (SPSS); it consisted of organising and creating a database of failed equipment, incorporating these data into the processing programme and converting all the information into numerical variables to be processed, thereby obtaining descriptive statistics and enabling factor and discriminant analysis. The research was based on information provided by companies in areas served by Corpoelec (Valencia, Venezuela) and Codensa (Bogotá, Colombia).
Influence of the statistical distribution of bioassay measurement errors on the intake estimation
International Nuclear Information System (INIS)
Lee, T. Y; Kim, J. K
2006-01-01
The purpose of this study is to provide the guidance necessary for selecting an error distribution, by analyzing the influence of the statistical distribution assumed for bioassay measurement errors on the intake estimation. For this purpose, intakes were estimated using the maximum likelihood method for cases in which the error distribution is normal or lognormal, and the intakes estimated under the two distributions were compared. According to the results of this study, when the measurement results for lung retention are somewhat greater than the limit of detection, the distribution type has a negligible influence on the results. In contrast, for measurement results for the daily excretion rate, the results obtained under the assumption of a lognormal distribution were 10% higher than those obtained under the assumption of a normal distribution. In view of these facts, where the uncertainty is governed by counting statistics, the distribution type has no influence on the intake estimation, whereas where other uncertainty components predominate, it is clearly desirable to estimate the intake assuming a lognormal distribution
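The contrast between the two error models can be sketched with the closed-form maximum-likelihood intake estimators commonly used in internal dosimetry: a weighted least-squares form for normal errors and a weighted geometric-mean form for lognormal errors. This is an illustrative sketch, not the authors' code; m is the measured bioassay quantity and r the predicted retention (or excretion) per unit intake:

```python
import math

def intake_normal(m, r, sd):
    # ML estimate when measurement errors are normal with known std dev sd_i:
    # minimize sum(((m_i - I*r_i)/sd_i)^2)  =>  closed-form weighted ratio.
    num = sum(mi * ri / s**2 for mi, ri, s in zip(m, r, sd))
    den = sum(ri**2 / s**2 for ri, s in zip(r, sd))
    return num / den

def intake_lognormal(m, r, gsd):
    # ML estimate when errors are lognormal with geometric std dev gsd_i:
    # ln(I) is the weighted mean of ln(m_i/r_i), weights 1/ln(gsd_i)^2.
    w = [1.0 / math.log(g)**2 for g in gsd]
    ln_i = sum(wi * math.log(mi / ri) for wi, mi, ri in zip(w, m, r)) / sum(w)
    return math.exp(ln_i)

# Noise-free check: measurements exactly proportional to intake I = 2.
r = [0.5, 0.2, 0.1]
m = [2.0 * ri for ri in r]
i_norm = intake_normal(m, r, [0.1, 0.1, 0.1])
i_logn = intake_lognormal(m, r, [1.5, 1.5, 1.5])
```

On noisy data the two estimators diverge, which is the effect the study quantifies; the lognormal form weights relative rather than absolute deviations.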
Poppe, L.J.; Eliason, A.H.; Hastings, M.E.
2004-01-01
Measures that describe and summarize sediment grain-size distributions are important to geologists because of the large amount of information contained in textural data sets. Statistical methods are usually employed to simplify the necessary comparisons among samples and quantify the observed differences. The two statistical methods most commonly used by sedimentologists to describe particle distributions are mathematical moments (Krumbein and Pettijohn, 1938) and inclusive graphics (Folk, 1974). The choice of which of these statistical measures to use is typically governed by the amount of data available (Royse, 1970). If the entire distribution is known, the method of moments may be used; if the next-to-last accumulated percent is greater than 95, inclusive graphics statistics can be generated. Unfortunately, earlier programs designed to describe sediment grain-size distributions statistically do not run in a Windows environment, do not allow extrapolation of the distribution's tails, or do not generate both moment and graphic statistics (Kane and Hubert, 1963; Collias et al., 1963; Schlee and Webster, 1967; Poppe et al., 2000). Owing to analytical limitations, electro-resistance multichannel particle-size analyzers, such as Coulter Counters, commonly truncate the tails of the fine-fraction part of grain-size distributions. These devices do not detect fine clay in the 0.6–0.1 μm range (part of the 11-phi and all of the 12-phi and 13-phi fractions). Although size analyses performed down to 0.6 μm are adequate for most freshwater and nearshore marine sediments, samples from many deeper-water marine environments (e.g. rise and abyssal plain) may contain significant material in the fine clay fraction, and these analyses benefit from extrapolation. The program (GSSTAT) described herein generates statistics to characterize sediment grain-size distributions and can extrapolate the fine-grained end of the particle distribution. It is written in Microsoft
Statistical γ-ray multiplicity distributions in Dy and Yb nuclei
International Nuclear Information System (INIS)
Tveter, T.S.; Bergholt, L.; Guttormsen, M.; Rekstad, J.
1994-03-01
The statistical γ-ray multiplicity distributions following the reactions 163Dy(3He,αxn)162-xDy and 173Yb(3He,αxn)172-xYb have been studied. The mean value and standard deviation have been extracted as functions of excitation energy. The method is based on the probability distribution of k-fold events, where an α-particle is observed in coincidence with signals in k γ-ray detectors. Techniques for isolating statistical γ-rays and subtracting random background, cross-talk and neutron contributions are discussed. 22 refs., 10 figs., 3 tabs
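The fold method rests on a simple relation: if each of the M statistical γ-rays in a cascade is registered independently with probability p (the per-γ detection efficiency of the array), the fold k is binomial, so the mean multiplicity can be unfolded as ⟨M⟩ = ⟨k⟩/p. A toy Monte Carlo illustrating this (the efficiency and multiplicity values are invented for illustration, and pile-up in a single detector is neglected):

```python
import random

def simulate_folds(multiplicity, p, n_events, seed=1):
    """Draw the fold k ~ Binomial(multiplicity, p) for n_events cascades."""
    rng = random.Random(seed)
    folds = []
    for _ in range(n_events):
        k = sum(1 for _ in range(multiplicity) if rng.random() < p)
        folds.append(k)
    return folds

def mean_multiplicity(folds, p):
    # <k> = M * p  =>  estimate M as <k> / p.
    return (sum(folds) / len(folds)) / p

folds = simulate_folds(multiplicity=10, p=0.3, n_events=20000)
m_est = mean_multiplicity(folds, 0.3)
```

The paper's actual extraction additionally corrects for random background, cross-talk and neutrons; this sketch only shows the statistical core of the unfolding.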
International Nuclear Information System (INIS)
2003-01-01
For the year 2002, part of the figures shown in the tables of the Energy Review are preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supply and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Excise taxes, precautionary stock fees and oil pollution fees on energy products
International Nuclear Information System (INIS)
2004-01-01
For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-March 2004, Energy exports by recipient country in January-March 2004, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Excise taxes, precautionary stock fees and oil pollution fees
International Nuclear Information System (INIS)
2000-01-01
For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity consumption, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 2000, Energy exports by recipient country in January-June 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes and precautionary stock fees on oil products
Directory of Open Access Journals (Sweden)
Mehmet KURBAN
2007-01-01
In this paper, the wind energy potential of the region is analyzed with the Weibull and Rayleigh statistical distribution functions, using wind speed data measured every 15 seconds in July, August, September, and October of 2005 at a height of 10 m on the 30-m observation pole of the wind observation station constructed within the scientific research project titled "The Construction of a Hybrid (Wind-Solar) Power Plant Model by Determining the Wind and Solar Potential in the Iki Eylul Campus of A.U.", supported by Anadolu University. The maximum likelihood method is used to find the parameters of these distributions. The analysis for the months considered shows that the Weibull distribution models the wind speeds better than the Rayleigh distribution. Furthermore, the error in the monthly power density values computed using the Weibull distribution is smaller than that for the Rayleigh distribution.
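The maximum-likelihood fit used here has a standard iterative form for the Weibull shape k and scale c. A self-contained sketch under the usual parameterization (synthetic wind speeds stand in for the measured series; the damping factor is a stabilizing choice, not from the paper):

```python
import math
import random

def fit_weibull_mle(v, iters=200):
    """Fit Weibull shape k and scale c by the standard MLE fixed-point iteration."""
    ln_v = [math.log(x) for x in v]
    mean_ln = sum(ln_v) / len(v)
    k = 1.0
    for _ in range(iters):
        vk = [x**k for x in v]
        a = sum(x * l for x, l in zip(vk, ln_v)) / sum(vk)
        # MLE condition: 1/k = a - mean_ln; damped update for stability.
        k = 0.5 * k + 0.5 / (a - mean_ln)
    c = (sum(x**k for x in v) / len(v)) ** (1.0 / k)
    return k, c

# Synthetic speeds from Weibull(k=2, c=6 m/s) via inverse-transform sampling.
rng = random.Random(42)
speeds = [6.0 * (-math.log(1.0 - rng.random())) ** 0.5 for _ in range(2000)]
k_hat, c_hat = fit_weibull_mle(speeds)
```

With the shape and scale in hand, the mean power density follows from the third Weibull moment, which is how the monthly power density comparison in the abstract is made.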
Statistical distribution of components of energy eigenfunctions: from nearly-integrable to chaotic
International Nuclear Information System (INIS)
Wang, Jiaozi; Wang, Wen-ge
2016-01-01
We study the statistical distribution of components in the non-perturbative parts of energy eigenfunctions (EFs), in which the main bodies of the EFs lie. Our numerical simulations in five models show that the deviation of this distribution from the prediction of random matrix theory (RMT) is useful in characterizing the transition from nearly integrable to chaotic, in a way somewhat similar to the nearest-level-spacing distribution. The statistics of EFs, however, reveal some further properties, as described below. (i) In the process of approaching quantum chaos, the distribution of components shows a delayed response compared with the nearest-level-spacing distribution in most of the models studied. (ii) In the quantum chaotic regime, the distribution of components always shows a small but notable deviation from the prediction of RMT in models possessing classical counterparts, while the deviation can be almost negligible in models without classical counterparts. (iii) In models whose Hamiltonian matrices possess a clear band structure, the tails of the EFs show statistical behaviors clearly different from those in the main bodies, and the difference is smaller for Hamiltonian matrices without a clear band structure.
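The RMT baseline referenced here is that eigenfunction components behave like independent Gaussian variables, so the rescaled squared components N|c_i|^2 follow the Porter-Thomas (chi-squared with one degree of freedom) distribution, with mean 1 and variance 2. A numerical sanity check of that baseline (not of the paper's five models):

```python
import math
import random

def rmt_component_moments(n, seed=0):
    """Moments of rescaled squared components of a random unit vector.

    For a normalized Gaussian vector, y_i = n * c_i^2 approaches the
    Porter-Thomas distribution: mean exactly 1, variance close to 2.
    """
    rng = random.Random(seed)
    g = [rng.gauss(0.0, 1.0) for _ in range(n)]
    norm = math.sqrt(sum(x * x for x in g))
    y = [n * (x / norm) ** 2 for x in g]
    mean = sum(y) / n
    var = sum((v - mean) ** 2 for v in y) / n
    return mean, var

mean_y, var_y = rmt_component_moments(5000)
```

Deviations of a model's component distribution from these Porter-Thomas moments are the kind of diagnostic the abstract uses to track the approach to quantum chaos.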
Fissure formation in coke. 3: Coke size distribution and statistical analysis
Energy Technology Data Exchange (ETDEWEB)
D.R. Jenkins; D.E. Shaw; M.R. Mahoney [CSIRO, North Ryde, NSW (Australia). Mathematical and Information Sciences
2010-07-15
A model of coke stabilization, based on a fundamental model of fissuring during carbonisation, is used to demonstrate the applicability of the fissuring model to actual coke size distributions. The results indicate that the degree of stabilization is important in determining the size distribution. A modified form of the Weibull distribution is shown to provide a better representation of the whole coke size distribution compared to the Rosin-Rammler distribution, which is generally only fitted to the lump coke. A statistical analysis of a large number of experiments in a pilot-scale coke oven shows reasonably good prediction of the coke mean size, based on parameters related to blend rank, amount of low-rank coal, fluidity and ash. However, the prediction of measures of the spread of the size distribution is more problematic. The fissuring model, the size distribution representation and the statistical analysis together provide a comprehensive capability for understanding and predicting the mean size and distribution of coke lumps produced during carbonisation. 12 refs., 16 figs., 4 tabs.
Optimal Intermittent Operation of Water Distribution Networks under Water Shortage
Directory of Open Access Journals (Sweden)
mohamad Solgi
2017-07-01
Under water shortage conditions, it is necessary to exercise water consumption management practices in water distribution networks (WDN). Intermittent supply of water is one such practice that makes it possible to meet nodal consumption demands with the required pressure via water cutoffs to some consumers during certain hours of the day. One of the most important issues that must be observed in this management practice is equitable and uniform water distribution among the consumers. In the present study, uniformity in water distribution and a minimum supply of water to all consumers are defined as justice and equity, respectively. Also, an optimization model has been developed to find an optimal intermittent supply schedule that ensures the maximum number of demand nodes is supplied with water while the constraints on the operation of water distribution networks are observed. To show the efficiency of the proposed model, it has been applied to the Two-Loop distribution network under several different water shortage scenarios. The optimization model has been solved using the honey bee mating optimization (HBMO) algorithm linked to the hydraulic simulator EPANET. The results obtained confirm the efficiency of the proposed model in achieving an optimal intermittent supply schedule. Moreover, the model is found capable of distributing the available water in an equitable and just manner among all the consumers, even under severe water shortages.
International Nuclear Information System (INIS)
Kawasaki, Keiichi; Ishii, Kenji; Saito, Yoko; Oda, Keiichi; Kimura, Yuichi; Ishiwata, Kiichi
2008-01-01
In clinical cerebral 2-[18F]fluoro-2-deoxy-D-glucose positron emission tomography (FDG-PET) studies, we sometimes encounter hyperglycemic patients with diabetes mellitus or patients who have not adhered to the fasting requirement. The objective of this study was to investigate the influence of mild hyperglycemia (plasma glucose range 110-160 mg/dl) on the cerebral FDG distribution patterns calculated by statistical parametric mapping (SPM). We studied 19 healthy subjects (mean age 66.2 years). First, all the subjects underwent FDG-PET scans in the fasting condition. Then, 9 of the 19 subjects (mean age 64.3 years) underwent second FDG-PET scans in the mild hyperglycemic condition. The alterations in the FDG-PET scans were investigated using SPM- and region-of-interest (ROI)-based analyses. We used three reference regions: the SPM global brain (SPMgb) used for the SPM global mean calculation, the gray and white matter region computed from the magnetic resonance image (MRIgw), and the cerebellar cortex (Cbll). The FDG uptake, calculated as the standardized uptake value (average), in the SPMgb, MRIgw, and Cbll regions in the mild hyperglycemic condition was 42.7%, 41.3%, and 40.0%, respectively, of that observed in the fasting condition. In the SPM analysis, mild hyperglycemia was found to affect the cerebral distribution patterns of FDG. The FDG uptake was relatively decreased in the gray matter, mainly in the frontal, temporal, and parietal association cortices, posterior cingulate, and precuneus in both the SPMgb- and MRIgw-reference-based analyses. When Cbll was adopted as the reference region, those decrease patterns disappeared. The FDG uptake was relatively increased in the white matter, mainly in the centrum semiovale, in all the reference-based analyses. It is noteworthy that the FDG distribution patterns were altered under mild hyperglycemia in the SPM analysis. The decreased uptake patterns in the SPMgb- (SPM default) and MRIgw-reference-based analyses resembled those observed in
Directory of Open Access Journals (Sweden)
Xiliang Zheng
2015-04-01
We uncovered universal statistical laws for the biomolecular recognition/binding process. We quantified the statistical energy landscapes for binding, from which we can characterize the distributions of the binding free energy (affinity), the equilibrium constants, the kinetics and the specificity by exploring different ligands binding with a particular receptor. The results of the analytical studies are confirmed by microscopic flexible docking simulations. The distribution of binding affinity is Gaussian around the mean and becomes exponential near the tail. The equilibrium constants of the binding follow a log-normal distribution around the mean and a power-law distribution in the tail. The intrinsic specificity for biomolecular recognition measures the degree of discrimination of native versus non-native binding; its optimization amounts to maximizing the ratio of the free energy gap between the native state and the average of non-native states to the roughness, measured by the variance of the free energy landscape around its mean. The intrinsic specificity obeys a Gaussian distribution near the mean and an exponential distribution near the tail. Furthermore, the kinetics of binding follows a log-normal distribution near the mean and a power-law distribution at the tail. Our study provides new insights into the statistical nature of thermodynamics, kinetics and function from different ligands binding with a specific receptor or, equivalently, a specific ligand binding with different receptors. The elucidation of the distributions of the kinetics and free energy has a guiding role in studying biomolecular recognition and function through small-molecule evolution and chemical genetics.
Cheng, P.W.; Kuik, van G.A.M.; Bussel, van G.J.W.; Vrouwenvelder, A.C.W.M.
2002-01-01
Extreme response is an important design variable for wind turbines. The statistical uncertainties concerning the extreme response distribution are simulated here with data concerning physical characteristics obtained from measurements. The extreme responses are the flap moment at the blade root and
Directory of Open Access Journals (Sweden)
Gökhan Gökdere
2014-05-01
In this paper, closed-form expressions for the moments of the truncated Pareto order statistics are obtained by using the conditional distribution. We also derive some results for the moments which will be useful for moment computations based on ordered data.
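A closed-form moment of the (upper-)truncated Pareto parent distribution can be checked by Monte Carlo, which also illustrates the inverse-CDF sampling that order-statistic computations build on. A sketch under the usual parameterization (shape α, support [L, H]); the specific numbers are illustrative, not from the paper:

```python
import math
import random

def truncated_pareto_mean(alpha, low, high):
    # E[X] for pdf f(x) = alpha * low^alpha * x^(-alpha-1) / (1 - (low/high)^alpha),
    # low <= x <= high, valid for alpha != 1.
    norm = 1.0 - (low / high) ** alpha
    return alpha * low**alpha * (low**(1 - alpha) - high**(1 - alpha)) / (norm * (alpha - 1))

def sample_truncated_pareto(alpha, low, high, n, seed=7):
    # Inverse-CDF sampling: F(x) = (1 - (low/x)^alpha) / (1 - (low/high)^alpha).
    rng = random.Random(seed)
    norm = 1.0 - (low / high) ** alpha
    return [low * (1.0 - rng.random() * norm) ** (-1.0 / alpha) for _ in range(n)]

analytic = truncated_pareto_mean(2.0, 1.0, 10.0)   # equals 20/11
sample = sample_truncated_pareto(2.0, 1.0, 10.0, 50000)
empirical = sum(sample) / len(sample)
```

Sorting such a sample gives order statistics whose empirical moments can be compared against the paper's closed-form expressions.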
Directory of Open Access Journals (Sweden)
Wenzhi Wang
2016-07-01
Modeling the random fiber distribution of a fiber-reinforced composite is of great importance for studying the progressive failure behavior of the material on the micro scale. In this paper, we develop a new algorithm for generating random representative volume elements (RVEs) with a fiber distribution statistically equivalent to the actual material microstructure. Realistic statistical data are utilized as inputs to the new method, which is achieved through implementation of the probability equations. Extensive statistical analysis is conducted to examine the capability of the proposed method and to compare it with existing methods. It is found that the proposed method matches experimental results well in all aspects, including the nearest neighbor distance, nearest neighbor orientation, Ripley's K function, and the radial distribution function. Finite element analysis is presented to predict the effective elastic properties of a carbon/epoxy composite, to validate the generated random representative volume elements, and to provide insights into the effect of fiber distribution on the elastic properties. The present algorithm is shown to be highly accurate and can be used to generate statistically equivalent RVEs not only for fiber-reinforced composites but also for other materials such as foam materials and particle-reinforced composites.
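Ripley's K function, one of the descriptors the authors match, has a simple naive estimator: K(r) = (A/n²)·Σ over ordered pairs of the indicator d_ij ≤ r, which for complete spatial randomness should be close to πr². A sketch without edge correction (so values run slightly low near the boundary); the point set here is a synthetic stand-in for fiber centers:

```python
import math
import random

def ripley_k(points, r, area):
    """Naive Ripley's K estimate (no edge correction)."""
    n = len(points)
    pairs = 0
    for i in range(n):
        xi, yi = points[i]
        for j in range(n):
            if i == j:
                continue
            xj, yj = points[j]
            if (xi - xj) ** 2 + (yi - yj) ** 2 <= r * r:
                pairs += 1
    return area * pairs / (n * n)

rng = random.Random(3)
pts = [(rng.random(), rng.random()) for _ in range(400)]  # CSR in unit square
k_hat = ripley_k(pts, 0.05, 1.0)   # CSR reference value: pi * 0.05^2
```

Comparing such an estimate, over a range of r, between a generated RVE and micrograph data is the kind of check the statistical-equivalence claim rests on.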
A model of seismic focus and related statistical distributions of earthquakes
International Nuclear Information System (INIS)
Apostol, Bogdan-Felix
2006-01-01
A growth model for accumulating seismic energy in a localized seismic focus is described, which introduces a fractional parameter r on geometrical grounds. The model is employed to derive a power-type law for the statistical distribution in energy, where the parameter r contributes to the exponent, as well as the corresponding time and magnitude distributions for earthquakes. The accompanying seismic activity of foreshocks and aftershocks is discussed in connection with this approach, based on Omori distributions, and the rate of released energy is derived
Local multiplicity adjustment for the spatial scan statistic using the Gumbel distribution.
Gangnon, Ronald E
2012-03-01
The spatial scan statistic is an important and widely used tool for cluster detection. It is based on the simultaneous evaluation of the statistical significance of the maximum likelihood ratio test statistic over a large collection of potential clusters. In most cluster detection problems, there is variation in the extent of local multiplicity across the study region. For example, using a fixed maximum geographic radius for clusters, urban areas typically have many overlapping potential clusters, whereas rural areas have relatively few. The spatial scan statistic does not account for local multiplicity variation. We describe a previously proposed local multiplicity adjustment based on a nested Bonferroni correction and propose a novel adjustment based on a Gumbel distribution approximation to the distribution of a local scan statistic. We compare the performance of all three statistics in terms of power and a novel unbiased cluster detection criterion. These methods are then applied to the well-known New York leukemia dataset and a Wisconsin breast cancer incidence dataset. © 2011, The International Biometric Society.
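The Gumbel approximation replaces a full Monte Carlo p-value with a two-parameter fit to a modest number of simulated maxima. A generic sketch using a method-of-moments Gumbel fit (the simulated maxima below are placeholders for local scan-statistic replicates, not the article's data):

```python
import math
import random

def gumbel_pvalue(observed, maxima):
    """Upper-tail p-value from a method-of-moments Gumbel fit to simulated maxima."""
    n = len(maxima)
    mean = sum(maxima) / n
    var = sum((m - mean) ** 2 for m in maxima) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi          # Gumbel scale
    mu = mean - 0.5772156649 * beta                # Gumbel location (Euler-Mascheroni)
    return 1.0 - math.exp(-math.exp(-(observed - mu) / beta))

# Placeholder replicates: maximum of 50 standard normals, 500 times.
rng = random.Random(11)
maxima = [max(rng.gauss(0.0, 1.0) for _ in range(50)) for _ in range(500)]
p = gumbel_pvalue(4.0, maxima)
```

Because the Gumbel tail is smooth, far fewer replicates are needed than for an empirical Monte Carlo p-value of comparable tail resolution, which is the practical appeal of the approximation.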
Statistical Modelling of Synaptic Vesicles Distribution and Analysing their Physical Characteristics
DEFF Research Database (Denmark)
Khanmohammadi, Mahdieh
This Ph.D. thesis deals with mathematical and statistical modeling of synaptic vesicle distribution, shape, orientation and interactions. The first major part of the thesis treats the problem of determining the effect of stress on synaptic vesicle distribution and interactions. Serial section transmission electron microscopy is used to acquire images from two experimental groups of rats: 1) rats subjected to a behavioral model of stress and 2) rats subjected to sham stress as the control group. The synaptic vesicle distribution and interactions are modeled by employing a point process approach based on differences of statistical measures within a section and the same measures between sections. Three-dimensional (3D) datasets are reconstructed by using image registration techniques and estimated section thicknesses. We distinguish the effect of stress by estimating the synaptic vesicle densities and modeling...
Statistical Analysis of Video Frame Size Distribution Originating from Scalable Video Codec (SVC)
Directory of Open Access Journals (Sweden)
Sima Ahmadpour
2017-01-01
Designing an effective and high-performance network requires an accurate characterization and modeling of network traffic. The modeling of video frame sizes is normally applied in simulation studies and mathematical analysis, and in generating streams for testing and compliance purposes. Besides, video traffic is assumed to be a major source of multimedia traffic in future heterogeneous networks. Therefore, the statistical distribution of video data can be used as an input for performance modeling of networks. This paper identifies the theoretical distribution that seems most relevant to the video trace in terms of its statistical properties, and finds the best-fitting distribution using both a graphical method and a hypothesis test. The data set used in this article consists of layered video traces generated with the Scalable Video Codec (SVC) video compression technique from three different movies.
Directory of Open Access Journals (Sweden)
Cabello Daniel R
1998-01-01
A statistical evaluation of the population dynamics of Panstrongylus geniculatus is based on a cohort experiment conducted under controlled laboratory conditions. Animals were fed on hen every 15 days. Egg incubation took 21 days; the mean durations of the 1st, 2nd, 3rd, 4th, and 5th instar nymphs were 25, 30, 58, 62, and 67 days, respectively; mean nymphal development time was 39 weeks and adult longevity was 72 weeks. Females reproduced during 30 weeks, producing an average of 61.6 eggs per female over a lifetime; the average number of eggs/female/week was 2.1. The total number of eggs produced by the cohort was 1379. The average hatch rate for the cohort was 88.9%; it was not affected by the age of the mother. Age-specific survival and reproduction tables were constructed. The following population parameters were evaluated: generation time was 36.1 weeks; net reproduction rate was 89.4; intrinsic rate of natural increase was 0.125; instantaneous birth and death rates were 0.163 and 0.039, respectively; finite rate of increase was 1.13; total reproductive value was 1196; and the stable age distribution was 31.2% eggs, 64.7% nymphs and 4.1% adults. Finally, the population characteristics of P. geniculatus lead to the conclusion that this species is a K-strategist.
Coolant rate distribution in horizontal steam generator under natural circulation
International Nuclear Information System (INIS)
Blagovechtchenski, A.; Leontieva, V.; Mitrioukhin, A.
1997-01-01
In this presentation, the major factors determining the conditions of natural coolant circulation (NCC) in the primary circuit, and in particular the coolant rate distribution over the horizontal tubes of PGV-1000 steam generators in NPPs with VVER-1000 under NCC, are considered.
Coolant rate distribution in horizontal steam generator under natural circulation
Energy Technology Data Exchange (ETDEWEB)
Blagovechtchenski, A.; Leontieva, V.; Mitrioukhin, A. [St. Petersburg State Technical Univ. (Russian Federation)
1997-12-31
In this presentation, the major factors determining the conditions of natural coolant circulation (NCC) in the primary circuit, and in particular the coolant rate distribution over the horizontal tubes of PGV-1000 steam generators in NPPs with VVER-1000 under NCC, are considered. 5 refs.
Water and nitrogen distribution in uncropped ridgetilled soil under ...
African Journals Online (AJOL)
A ridge-tillage configuration, with placement of nitrate nitrogen (NO3--N) or its source in the elevated portion of the ridge, can potentially isolate fertilizer from downward water flow and minimize nitrate leaching. In the experiment, the simultaneous distribution of water, nitrate, and ammonium under three ridge widths was ...
Coolant rate distribution in horizontal steam generator under natural circulation
Energy Technology Data Exchange (ETDEWEB)
Blagovechtchenski, A; Leontieva, V; Mitrioukhin, A [St. Petersburg State Technical Univ. (Russian Federation)
1998-12-31
In this presentation, the major factors determining the conditions of natural coolant circulation (NCC) in the primary circuit, and in particular the coolant rate distribution over the horizontal tubes of PGV-1000 steam generators in NPPs with VVER-1000 under NCC, are considered. 5 refs.
Epping, Ruben; Panne, Ulrich; Falkenhagen, Jana
2017-02-07
Statistical ethylene oxide (EO) and propylene oxide (PO) copolymers of different monomer compositions and different average molar masses, additionally containing two kinds of end groups (FTD), were investigated by ultrahigh-pressure liquid chromatography under critical conditions (UP-LCCC) combined with electrospray ionization time-of-flight mass spectrometry (ESI-TOF-MS). Theoretical predictions of the existence of a critical adsorption point (CPA) for statistical copolymers with a given chemical and sequence distribution could be studied and confirmed. A fundamentally new approach to determining these critical conditions in a copolymer, alongside the inevitable chemical composition distribution (CCD), with mass spectrometric detection is described. The shift of the critical eluent composition with the monomer composition of the polymers was determined. Due to the broad molar mass distribution (MMD) and the presumed existence of different end-group functionalities as well as a monomer sequence distribution (MSD), gradient separation by CCD alone was not possible. Therefore, isocratic separation conditions at the CPA of definite CCD fractions were developed. Although the various distributions present partly superimposed the separation process, the goal of separation by end-group functionality was still achieved on the basis of the additional dimension of ESI-TOF-MS. The existence of HO-H besides the desired allylO-H end-group functionalities was confirmed and their amount estimated. Furthermore, indications of an MSD were found by UPLC/MS/MS measurements. This approach offers for the first time the possibility to obtain a fingerprint of a broadly distributed statistical copolymer including MMD, FTD, CCD, and MSD.
Statistical thermodynamics and mean-field theory for the alloy under irradiation model
International Nuclear Information System (INIS)
Kamyshendo, V.
1993-01-01
A generalization of statistical thermodynamics to the case of open systems is discussed, using the alloy-under-irradiation model as an example. The statistical properties of stationary states are described with the use of generalized thermodynamic potentials and 'quasi-interactions' determined from the master equation for micro-configuration probabilities. Methods for solving this equation are illustrated by mean-field-type calculations of correlators, thermodynamic potentials and phase diagrams for disordered alloys
Spatial statistical analysis of basal stem rot disease under natural field epidemic of oil palm
Kamu, Assis; Phin, Chong Khim; Seman, Idris Abu; Wan, Hoong Hak; Mun, Ho Chong
2015-02-01
Oil palm, scientifically known as Elaeis guineensis Jacq., is the most important commodity crop in Malaysia and has greatly contributed to the economic growth of the country. As far as disease is concerned in the industry, basal stem rot (BSR), caused by Ganoderma boninense, remains the most important disease. BSR is the most widely studied oil palm disease in Malaysia, with the most information available. However, there are still limited studies on the spatial as well as temporal pattern or distribution of the disease, especially under natural field epidemic conditions in oil palm plantations. The objective of this study is to spatially identify the pattern of BSR disease under natural field epidemic using two geospatial analytical techniques: quadrat analysis for the first-order properties of point pattern analysis and nearest-neighbor analysis (NNA) for the second-order properties. Two study sites were selected, with trees of different ages. Both sites are located in Tawau, Sabah, and managed by the same company. The results showed that at least one of the point pattern analyses used, namely NNA (i.e. the second-order properties), confirmed that the disease exhibits complete spatial randomness. This suggests that the spread of the disease is not from tree to tree and that the age of the palms does not play a significant role in determining the spatial pattern of the disease. Knowledge of the spatial pattern of the disease will help in disease management programs and benefit the industry in the future. The statistical modelling is expected to help in identifying the right model to estimate the yield loss of oil palm due to BSR disease in the future.
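The second-order check described here, nearest-neighbor analysis against complete spatial randomness (CSR), is commonly summarized by the Clark-Evans ratio R = observed mean nearest-neighbor distance divided by the CSR expectation 0.5/√λ, with R ≈ 1 under CSR. A sketch on synthetic tree positions (no edge correction, which biases R slightly upward; the point pattern is invented, not the plantation data):

```python
import math
import random

def clark_evans_ratio(points, area):
    """Clark-Evans nearest-neighbor ratio: ~1 under CSR, <1 clustered, >1 regular."""
    n = len(points)
    nn_sum = 0.0
    for i, (xi, yi) in enumerate(points):
        best = float("inf")
        for j, (xj, yj) in enumerate(points):
            if i != j:
                d2 = (xi - xj) ** 2 + (yi - yj) ** 2
                if d2 < best:
                    best = d2
        nn_sum += math.sqrt(best)
    observed = nn_sum / n
    expected = 0.5 / math.sqrt(n / area)   # CSR expectation for intensity n/area
    return observed / expected

rng = random.Random(5)
trees = [(rng.random(), rng.random()) for _ in range(300)]  # CSR stand-in
r_ratio = clark_evans_ratio(trees, 1.0)
```

A ratio near 1, as for this CSR stand-in, is consistent with the study's conclusion that BSR cases are not clustered tree to tree.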
International Nuclear Information System (INIS)
Pinotti, E.; Brenna, M.; Puppin, E.
2008-01-01
In magneto-optical Kerr measurements of the Barkhausen noise, a magnetization jump ΔM due to a domain reversal produces a variation ΔI of the intensity of a laser beam reflected by the sample, which is the physical quantity actually measured. Due to the non-uniform beam intensity profile, the magnitude of ΔI depends both on ΔM and on its position on the laser spot. This could distort the statistical distribution p(ΔI) of the measured ΔI with respect to the true distribution p(ΔM) of the magnetization jumps ΔM. In this work the exact relationship between the two distributions is derived in a general form, which will be applied to some possible beam profiles. It will be shown that in most cases the usual Gaussian beam produces a negligible statistical distortion. Moreover, for small ΔI the noise of the experimental setup can also distort the statistical distribution p(ΔI), by erroneously rejecting small ΔI as noise. This effect has been calculated for white noise, and it will be shown that it is relatively small but not totally negligible as the measured ΔI approaches the detection limit
Stress Distribution in Graded Cellular Materials Under Dynamic Compression
Directory of Open Access Journals (Sweden)
Peng Wang
Full Text Available Abstract Dynamic compression behaviors of density-homogeneous and density-graded irregular honeycombs are investigated using cell-based finite element models under a constant-velocity impact scenario. A method based on the cross-sectional engineering stress is developed to obtain the one-dimensional stress distribution along the loading direction in a cellular specimen. The cross-sectional engineering stress is contributed by two parts: the node-transitive stress and the contact-induced stress, which are caused by the nodal force and the contact of cell walls, respectively. It is found that the contact-induced stress is dominant for the significantly enhanced stress behind the shock front. The stress enhancement and the compaction wave propagation can be observed through the stress distributions in honeycombs under high-velocity compression. The single and double compaction wave modes are observed directly from the stress distributions. Theoretical analysis of the compaction wave propagation in the density-graded honeycombs based on the rigid-plastic hardening (R-PH) idealization is carried out and verified by the numerical simulations. It is found that the stress distribution in cellular materials and the compaction wave propagation characteristics under dynamic compression can be approximately predicted by the R-PH shock model.
Conditional maximum-entropy method for selecting prior distributions in Bayesian statistics
Abe, Sumiyoshi
2014-11-01
The conditional maximum-entropy method (abbreviated here as C-MaxEnt) is formulated for selecting prior probability distributions in Bayesian statistics for parameter estimation. This method is inspired by a statistical-mechanical approach to systems governed by dynamics with largely separated time scales and is based on three key concepts: conjugate pairs of variables, dimensionless integration measures with coarse-graining factors and partial maximization of the joint entropy. The method enables one to calculate a prior purely from a likelihood in a simple way. It is shown, in particular, how it not only yields Jeffreys's rules but also reveals new structures hidden behind them.
Statistical issues in the parton distribution analysis of the Tevatron jet data
International Nuclear Information System (INIS)
Alekhin, S.; Bluemlein, J.; Moch, S.O.; Hamburg Univ.
2012-11-01
We analyse a tension between the D0 and CDF inclusive jet data and perturbative QCD calculations based on the ABKM09 and ABM11 parton distribution functions (PDFs) within the nuisance-parameter framework. Particular attention is paid to the uncertainties in the nuisance parameters due to data fluctuations and PDF errors. We show that once these uncertainties are taken into account, the nuisance parameters do not exhibit a statistically significant excess. A statistical bias of the estimator based on the nuisance parameters is also discussed.
Estimation of current density distribution under electrodes for external defibrillation
Directory of Open Access Journals (Sweden)
Papazov Sava P
2002-12-01
Full Text Available Abstract Background Transthoracic defibrillation is the most common life-saving technique for the restoration of the heart rhythm of cardiac arrest victims. The procedure requires adequate application of large electrodes on the patient chest, to ensure low-resistance electrical contact. The current density distribution under the electrodes is non-uniform, leading to muscle contraction and pain, or risks of burning. The recent introduction of automatic external defibrillators and even wearable defibrillators, presents new demanding requirements for the structure of electrodes. Method and Results Using the pseudo-elliptic differential equation of Laplace type with appropriate boundary conditions and applying finite element method modeling, electrodes of various shapes and structure were studied. The non-uniformity of the current density distribution was shown to be moderately improved by adding a low resistivity layer between the metal and tissue and by a ring around the electrode perimeter. The inclusion of openings in long-term wearable electrodes additionally disturbs the current density profile. However, a number of small-size perforations may result in acceptable current density distribution. Conclusion The current density distribution non-uniformity of circular electrodes is about 30% less than that of square-shaped electrodes. The use of an interface layer of intermediate resistivity, comparable to that of the underlying tissues, and a high-resistivity perimeter ring, can further improve the distribution. The inclusion of skin aeration openings disturbs the current paths, but an appropriate selection of number and size provides a reasonable compromise.
International Nuclear Information System (INIS)
Gao, Li-Na; Liu, Fu-Hu; Lacey, Roy A.
2016-01-01
Experimental results of the transverse-momentum distributions of φ mesons and Ω hyperons produced in gold-gold (Au-Au) collisions with different centrality intervals, measured by the STAR Collaboration at different energies (7.7, 11.5, 19.6, 27, and 39 GeV) in the beam energy scan (BES) program at the relativistic heavy-ion collider (RHIC), are approximately described by the single Erlang distribution and the two-component Schwinger mechanism. Moreover, the STAR experimental transverse-momentum distributions of negatively charged particles, produced in Au-Au collisions at RHIC BES energies, are approximately described by the two-component Erlang distribution and the single Tsallis statistics. The excitation functions of free parameters are obtained from the fit to the experimental data. A weak softest point in the string tension in Ω hyperon spectra is observed at 7.7 GeV. (orig.)
International Nuclear Information System (INIS)
Varjas, Geza; Jozsef, Gabor; Gyenes, Gyoergy; Petranyi, Julia; Bozoky, Laszlo; Pataki, Gezane
1985-01-01
The establishment of the National Computerized Irradiation Planning Network made the statistical evaluation presented in this report possible. During the first 5 years, 13389 dose-distribution charts were calculated for the treatment of 5320 patients, i.e. on average 2.5 dose-distribution chart variants per patient. This number remained practically unchanged over the last 4 years. Irradiation plans for certain tumour localizations were based on the calculation of, on average, 1.6-3.0 dose-distribution charts. Recently, radiation procedures assuring optimal dose distribution, such as the use of moving fields and of two or three irradiation fields, have been gaining ground. (author)
Noise and the statistical mechanics of distributed transport in a colony of interacting agents
Katifori, Eleni; Graewer, Johannes; Ronellenfitsch, Henrik; Mazza, Marco G.
Inspired by the process of liquid food distribution between individuals in an ant colony, in this work we consider the statistical mechanics of resource dissemination between interacting agents with finite carrying capacity. The agents move inside a confined space (nest), pick up the food at the entrance of the nest and share it with other agents that they encounter. We calculate analytically and via a series of simulations the global food intake rate for the whole colony as well as observables describing how uniformly the food is distributed within the nest. Our model and predictions provide a useful benchmark to assess which strategies can lead to efficient food distribution within the nest and also to what level the observed food uptake rates and efficiency in food distribution are due to stochastic fluctuations or specific food exchange strategies by an actual ant colony.
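A toy version of such an agent-based food-sharing process can be simulated in a few lines. This is an illustrative sketch under strong simplifying assumptions (a single forager refilling at the entrance, random pairwise encounters, an equal split on contact), not the authors' model:

```python
import numpy as np

def share_food(n_agents=100, steps=2000, capacity=1.0, seed=0):
    """Toy trophallaxis model: agent 0 refills to full capacity at the nest
    entrance each step; a random pair then splits its combined load equally.
    Returns final loads and the colony mean load over time (a proxy for the
    global food intake rate)."""
    rng = np.random.default_rng(seed)
    load = np.zeros(n_agents)
    history = []
    for _ in range(steps):
        load[0] = capacity                      # forager refills at the entrance
        i, j = rng.choice(n_agents, 2, replace=False)
        total = load[i] + load[j]               # cannot exceed 2 * capacity
        load[i] = load[j] = total / 2.0         # equal split on encounter
        history.append(load.mean())
    return load, np.array(history)
```

Observables such as the variance of `load` across agents play the role of the uniformity measures discussed in the abstract.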
Assessing Regional Scale Variability in Extreme Value Statistics Under Altered Climate Scenarios
Energy Technology Data Exchange (ETDEWEB)
Brunsell, Nathaniel [Univ. of Kansas, Lawrence, KS (United States); Mechem, David [Univ. of Kansas, Lawrence, KS (United States); Ma, Chunsheng [Wichita State Univ., KS (United States)
2015-02-20
Recent studies have suggested that low-frequency modes of climate variability can significantly influence regional climate. The climatology associated with extreme events has been shown to be particularly sensitive. This has profound implications for droughts, heat waves, and food production. We propose to examine regional climate simulations conducted over the continental United States by applying a recently developed technique which combines wavelet multi-resolution analysis with information theory metrics. This research is motivated by two fundamental questions concerning the spatial and temporal structure of extreme events: 1) what temporal scales of the extreme value distributions are most sensitive to alteration by low-frequency climate forcings, and 2) what is the nature of the spatial structure of variation in these timescales? The primary objective is to assess to what extent information theory metrics can be useful in characterizing the nature of extreme weather phenomena. Specifically, we hypothesize that (1) changes in the nature of extreme events will impact the temporal probability density functions and that information theory metrics will be sensitive to these changes, and (2) via a wavelet multi-resolution analysis, we will be able to characterize the relative contribution of different timescales to the stochastic nature of extreme events. In order to address these hypotheses, we propose a unique combination of an established regional climate modeling approach and advanced statistical techniques to assess the effects of low-frequency modes on climate extremes over North America. The behavior of climate extremes in RCM simulations for the 20th century will be compared with statistics calculated from the United States Historical Climatology Network (USHCN) and simulations from the North American Regional Climate Change Assessment Program (NARCCAP). This effort will serve to establish the baseline behavior of climate extremes, the
International Nuclear Information System (INIS)
Ballini, J.-P.; Cazes, P.; Turpin, P.-Y.
1976-01-01
Analysing the histogram of anode pulse amplitudes allows a discussion of the hypotheses that have been proposed to account for the statistical processes of secondary multiplication in a photomultiplier. In an earlier work, good agreement was obtained between experimental and reconstructed spectra, assuming a first-dynode distribution consisting of two Poisson distributions with distinct mean values. This first approximation led to a search for a method that could give the weights of several Poisson distributions with distinct mean values. Three methods are briefly outlined: classical linear regression, constrained regression (d'Esopo's method), and regression on variables subject to error. These methods yield an approximation to the frequency function that represents the dispersion of the point mean gain around the overall first-dynode mean gain. Comparison between this function and the one employed in the Polya distribution shows that the latter is inadequate to describe the statistical process of secondary multiplication. Numerous spectra obtained with two kinds of photomultiplier working under different physical conditions have been analysed. Two points are then discussed: does the frequency function represent the dynode structure and the interdynode collection process? And is the model, in which the multiplication process of all dynodes but the first is Poissonian, valid whatever the photomultiplier and the conditions of use? (Auth.)
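The weight-estimation step described here, resolving a pulse-amplitude histogram into Poisson components with fixed, distinct means, amounts to a linear regression on the component distributions. A minimal numpy sketch with hypothetical mean values, using ordinary least squares with clipping rather than the constrained (d'Esopo) variant named in the abstract:

```python
import numpy as np
from math import exp, factorial

def poisson_pmf(ks, mu):
    """Poisson probabilities P(k; mu) for an array of integer counts ks."""
    return np.array([mu ** k * exp(-mu) / factorial(k) for k in ks])

def mixture_weights(hist, means):
    """Least-squares estimate of the weights of a mixture of Poisson
    distributions with known, distinct means, from a normalized histogram."""
    ks = np.arange(len(hist))
    A = np.column_stack([poisson_pmf(ks, m) for m in means])
    w, *_ = np.linalg.lstsq(A, hist, rcond=None)
    w = np.clip(w, 0.0, None)   # crude nonnegativity, cf. constrained regression
    return w / w.sum()
```

A true constrained solver (nonnegative least squares) would replace the clip step; the unconstrained fit already recovers exact mixtures.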
Directory of Open Access Journals (Sweden)
Scheid Anika
2012-07-01
Full Text Available Abstract Background Over the past years, statistical and Bayesian approaches have become increasingly appreciated to address the long-standing problem of computational RNA structure prediction. Recently, a novel probabilistic method for the prediction of RNA secondary structures from a single sequence has been studied which is based on generating statistically representative and reproducible samples of the entire ensemble of feasible structures for a particular input sequence. This method samples the possible foldings from a distribution implied by a sophisticated (traditional or length-dependent) stochastic context-free grammar (SCFG) that mirrors the standard thermodynamic model applied in modern physics-based prediction algorithms. Specifically, that grammar represents an exact probabilistic counterpart to the energy model underlying the Sfold software, which employs a sampling extension of the partition function (PF) approach to produce statistically representative subsets of the Boltzmann-weighted ensemble. Although both sampling approaches have the same worst-case time and space complexities, it has been indicated that they differ in performance (both with respect to prediction accuracy and quality of generated samples), where neither of these two competing approaches generally outperforms the other. Results In this work, we will consider the SCFG-based approach in order to perform an analysis of how the quality of generated sample sets and the corresponding prediction accuracy changes when different degrees of disturbance are incorporated into the needed sampling probabilities. This is motivated by the fact that if the results prove to be resistant to large errors on the distinct sampling probabilities (compared to the exact ones), then it will be an indication that these probabilities do not need to be computed exactly, but it may be sufficient and more efficient to approximate them. Thus, it might then be possible to decrease the worst
Czech Academy of Sciences Publication Activity Database
Netopilík, Miloš; Kratochvíl, Pavel
2006-01-01
Roč. 55, č. 2 (2006), s. 196-203 ISSN 0959-8103 R&D Projects: GA AV ČR IAA100500501; GA AV ČR IAA4050403; GA AV ČR IAA4050409; GA ČR GA203/03/0617 Institutional research plan: CEZ:AV0Z40500505 Keywords : statistical branching * tetrafunctional branch points * molecular-weight distribution Subject RIV: CD - Macromolecular Chemistry Impact factor: 1.475, year: 2006
Statistical distribution of time to crack initiation and initial crack size using service data
Heller, R. A.; Yang, J. N.
1977-01-01
Crack growth inspection data gathered during the service life of the C-130 Hercules airplane were used in conjunction with a crack propagation rule to estimate the distribution of crack initiation times and of initial crack sizes. A Bayesian statistical approach was used to calculate the fraction of undetected initiation times as a function of the inspection time and the reliability of the inspection procedure used.
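The Bayesian step, updating the chance that a crack went undetected given the reliability of the inspection procedure, follows directly from Bayes' rule. A textbook sketch in the spirit of the paper's approach, not its actual equations:

```python
def posterior_crack_given_miss(prior_crack: float, pod: float) -> float:
    """Probability that a crack is present given a negative inspection.

    prior_crack: prior probability that a crack of detectable size exists
    pod: probability of detection of the inspection procedure
    """
    miss = prior_crack * (1.0 - pod)    # crack present but missed
    clean = 1.0 - prior_crack           # no crack, correctly negative
    return miss / (miss + clean)
```

Repeated negative inspections are chained by feeding each posterior back in as the next prior, which is how the fraction of undetected initiation times shrinks with inspection time.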
Best Statistical Distribution of flood variables for Johor River in Malaysia
Salarpour Goodarzi, M.; Yusop, Z.; Yusof, F.
2012-12-01
A complex flood event is characterized by a few, possibly mutually correlated, characteristics such as flood peak, flood volume, and flood duration. This study explored the statistical distributions of peak flow, flood duration and flood volume at the Rantau Panjang gauging station on the Johor River in Malaysia. Hourly data were recorded for 45 years and analysed by water year (July-June). Five distributions, namely Log Normal, Generalized Pareto, Log Pearson, Normal and Generalized Extreme Value (GEV), were used to model the distributions of the three variables. The Anderson-Darling and Kolmogorov-Smirnov goodness-of-fit tests were used to evaluate the best fit. Goodness-of-fit tests at the 5% level of significance indicate that all the models can be used to model the distributions of peak flow, flood duration and flood volume. However, the Generalized Pareto distribution is found to be the most suitable model when tested with the Anderson-Darling test, while the Kolmogorov-Smirnov test suggests that GEV is best for peak flow. The results of this research can be used to improve flood frequency analysis. (Figure: comparison between the Generalized Extreme Value, Generalized Pareto and Log Pearson distributions in the cumulative distribution function of peak flow.)
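A common first pass at this kind of flood-frequency fitting is an extreme-value fit checked against the empirical CDF with a Kolmogorov-Smirnov distance. A minimal sketch using a method-of-moments Gumbel (EV-I) fit; this is illustrative only, not the study's GEV/Generalized Pareto machinery:

```python
import numpy as np

EULER_GAMMA = 0.5772156649015329

def fit_gumbel(x):
    """Method-of-moments fit of a Gumbel (EV-I) distribution to maxima."""
    beta = np.sqrt(6.0) * np.std(x) / np.pi
    mu = np.mean(x) - EULER_GAMMA * beta
    return mu, beta

def gumbel_cdf(x, mu, beta):
    return np.exp(-np.exp(-(np.asarray(x, float) - mu) / beta))

def ks_statistic(x, cdf):
    """Kolmogorov-Smirnov distance between the empirical CDF of x and a
    fitted CDF, used as a goodness-of-fit score."""
    xs = np.sort(np.asarray(x, float))
    n = len(xs)
    F = cdf(xs)
    d_plus = np.max(np.arange(1, n + 1) / n - F)
    d_minus = np.max(F - np.arange(0, n) / n)
    return max(d_plus, d_minus)
```

Comparing the KS distance across candidate distributions fitted to the same sample reproduces, in miniature, the model-selection step described in the abstract.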
A study of outliers in statistical distributions of mechanical properties of structural steels
International Nuclear Information System (INIS)
Oefverbeck, P.; Oestberg, G.
1977-01-01
The safety against failure of pressure vessels can be assessed by statistical methods, so-called probabilistic fracture mechanics. The data base for such estimations is admittedly rather meagre, making it necessary to assume certain conventional statistical distributions. Since the failure rates arrived at are low, for nuclear vessels of the order of 10 - to 10 - per year, the extremes of the variables involved, among them the mechanical properties of the steel used, are of particular interest. A question sometimes raised is whether outliers, i.e. values exceeding the extremes in the assumed distributions, might occur. In order to explore this possibility, a study has been made of strength values of three qualities of structural steels, available in samples of up to about 12,000. Statistical evaluation of these samples with respect to outliers, using standard methods for this purpose, revealed the presence of outliers in most cases, with a typical frequency of occurrence of a few values per thousand. Obviously, statistical analysis alone cannot be expected to shed any light on the causes of outliers. Thus, the interpretation of these results with respect to their implications for the probabilistic estimation of the integrity of pressure vessels must await further studies of a similar nature, in which the test specimens corresponding to outliers can be recovered and examined metallographically. For the moment the results should be regarded only as a factor to be considered in discussions of the safety of pressure vessels. (author)
Galaxies distribution in the universe: large-scale statistics and structures
International Nuclear Information System (INIS)
Maurogordato, Sophie
1988-01-01
This research thesis addresses the distribution of galaxies in the Universe, and more particularly large-scale statistics and structures. Based on an assessment of the main statistical techniques in use, the author outlines the need to develop tools complementing correlation functions in order to characterise the distribution. She introduces a new indicator: the probability that a volume randomly placed in the distribution is empty. This allows a characterisation of void properties at the working scales (up to 10 h-1 Mpc) in the Harvard-Smithsonian Center for Astrophysics Redshift Survey (CfA catalog). A systematic analysis of the statistical properties of different sub-samples has then been performed with respect to size and location, luminosity class, and morphological type. This analysis is then extended to different scenarios of structure formation. A program of radial velocity measurements based on observations allows the determination of possible relationships between apparent structures. The author also presents results of the search for southern extensions of the Perseus supercluster. [fr]
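The void indicator introduced here, the probability that a randomly placed test volume contains no galaxies, is naturally estimated by Monte Carlo. A sketch under simplifying assumptions (a cubic box, no edge or selection corrections; all names are mine, not the thesis'):

```python
import numpy as np

def void_probability(points, box, radius, n_trials=2000, seed=0):
    """Monte-Carlo estimate of the void probability function P0(r): the
    chance that a sphere of the given radius, placed uniformly at random
    in a cubic box of side `box`, contains none of `points`."""
    rng = np.random.default_rng(seed)
    pts = np.asarray(points, float)
    centres = rng.uniform(0.0, box, size=(n_trials, pts.shape[1]))
    # squared distance from every trial centre to every point
    d2 = ((centres[:, None, :] - pts[None, :, :]) ** 2).sum(axis=-1)
    empty = d2.min(axis=1) > radius ** 2
    return empty.mean()
```

For a Poisson point process P0(r) decays as exp of minus the expected count in the sphere, so departures from that curve measure clustering.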
Using a Statistical Approach to Anticipate Leaf Wetness Duration Under Climate Change in France
Huard, F.; Imig, A. F.; Perrin, P.
2014-12-01
Leaf wetness plays a major role in the development of fungal plant diseases. Leaf wetness duration (LWD) above a threshold value is determinant for infection and can be seen as a good indicator of the impact of climate on infection occurrence and risk. As LWD is not widely measured, several methods, based on physical and empirical approaches, have been developed to estimate it from weather data. Many statistical LWD models exist, but the lack of a measurement standard requires reassessment. A new empirical LWD model, called MEDHI (Modèle d'Estimation de la Durée d'Humectation à l'Inra), was developed for the French configuration of wetness sensors (angle: 90°, height: 50 cm). This deployment differs from what is usually recommended by manufacturers or by authors in other countries (angles from 10 to 60°, heights from 10 to 150 cm…). MEDHI is a decision support system based on hourly climatic conditions at time steps n and n-1, taking into account relative humidity, rainfall and previously simulated LWD. Air temperature, relative humidity, wind speed, rain and LWD data from several sensors in two configurations were measured during 6 months in Toulouse and Avignon (South West and South East of France) to calibrate MEDHI. A comparison of empirical models, NHRH (RH threshold), DPD (dew point depression), CART (classification and regression tree analysis depending on RH, wind speed and dew point depression) and MEDHI, using meteorological and LWD measurements obtained during 5 months in Toulouse, showed that this new model MEDHI was definitely better adapted to French conditions. In the context of climate change, MEDHI was used for mapping the evolution of leaf wetness duration in France from 1950 to 2100 with the French regional climate model ALADIN under different Representative Concentration Pathways (RCPs), using a quantile-mapping (QM) statistical downscaling method. The results give information on the spatial distribution of infection risks.
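The simplest of the reference models compared here, NHRH, declares the leaf wet in any hour whose relative humidity reaches a threshold. A sketch of that baseline; the 87% value is a commonly used choice in the LWD literature, not necessarily the one in this study:

```python
def lwd_rh_threshold(hourly_rh, threshold=87.0):
    """NHRH-style estimate: leaf wetness duration, in hours, is the count
    of hours with relative humidity at or above the threshold."""
    return sum(1 for rh in hourly_rh if rh >= threshold)
```

MEDHI refines this idea by also conditioning on rainfall and on the wetness state simulated at the previous hour.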
International Nuclear Information System (INIS)
Dobierzewska-Mozrzymas, E.; Szymczak, G.; Bieganski, P.; Pieciul, E.
2003-01-01
The ranges of statistical description of the systems may be determined on the basis of Mandelbrot's inverse power law. The slope of the straight line representing the power law in a double-logarithmic plot, defined as -1/μ (μ being a critical exponent), characterizes the distribution of elements in the system. In this paper, the inverse power law is used to describe the statistical distribution of discontinuous metal films with higher coverage coefficients (near the percolation threshold). For these films the critical exponent μ∼1, and both the mean value and the variance are infinite. Objects with such a microstructure are described by Levy-type distributions: the Cauchy, inverse Gauss and inverse gamma distributions, respectively. The experimental histograms are compared with the calculated ones. Inhomogeneous metal films were obtained experimentally, and their microstructures were examined by electron microscopy. On the basis of electron micrographs, the fractal dimensions were determined for metal films with coverage coefficients ranging from 0.35 to 1.00.
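The exponent extraction described here, a straight-line fit to the histogram in double-logarithmic coordinates, can be sketched as follows (log-spaced bins, sparse bins dropped; a maximum-likelihood estimator would be more robust, and mapping the slope to μ follows whatever convention the paper adopts):

```python
import numpy as np

def loglog_slope(samples, n_bins=20, min_count=5):
    """Least-squares slope of the empirical density in a double-logarithmic
    plot, using geometric bins and ignoring sparsely populated bins."""
    x = np.asarray(samples, float)
    edges = np.geomspace(x.min(), x.max(), n_bins + 1)
    counts, _ = np.histogram(x, bins=edges)
    centres = np.sqrt(edges[:-1] * edges[1:])          # geometric bin centres
    density = counts / (np.diff(edges) * counts.sum())  # normalized density
    keep = counts >= min_count
    slope, _intercept = np.polyfit(np.log(centres[keep]),
                                   np.log(density[keep]), 1)
    return slope
```

Geometric bins keep the fit unbiased for pure power laws, since the ratio of bin-averaged to centre density is then constant across bins.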
Yang, Jaw-Yen; Yan, Chih-Yuan; Diaz, Manuel; Huang, Juan-Chen; Li, Zhihui; Zhang, Hanxin
2014-01-08
The ideal quantum gas dynamics as manifested by the semiclassical ellipsoidal-statistical (ES) equilibrium distribution derived in Wu et al. (Wu et al . 2012 Proc. R. Soc. A 468 , 1799-1823 (doi:10.1098/rspa.2011.0673)) is numerically studied for particles of three statistics. This anisotropic ES equilibrium distribution was derived using the maximum entropy principle and conserves the mass, momentum and energy, but differs from the standard Fermi-Dirac or Bose-Einstein distribution. The present numerical method combines the discrete velocity (or momentum) ordinate method in momentum space and the high-resolution shock-capturing method in physical space. A decoding procedure to obtain the necessary parameters for determining the ES distribution is also devised. Computations of two-dimensional Riemann problems are presented, and various contours of the quantities unique to this ES model are illustrated. The main flow features, such as shock waves, expansion waves and slip lines and their complex nonlinear interactions, are depicted and found to be consistent with existing calculations for a classical gas.
Statistical distributions of avalanche size and waiting times in an inter-sandpile cascade model
Batac, Rene; Longjas, Anthony; Monterola, Christopher
2012-02-01
Sandpile-based models have successfully shed light on key features of nonlinear relaxational processes in nature, particularly the occurrence of fat-tailed magnitude distributions and exponential return times, from simple local stress redistributions. In this work, we extend the existing sandpile paradigm into an inter-sandpile cascade, wherein the avalanches emanating from a uniformly driven sandpile (first layer) are used to trigger the next (second layer), and so on, in a successive fashion. Statistical characterizations reveal that avalanche size distributions evolve from a power law p(S) ∝ S^(-1.3) for the first layer to gamma distributions p(S) ∝ S^α exp(-S/S0) for layers far away from the uniformly driven sandpile. The resulting avalanche size statistics is found to be associated with the corresponding waiting time distribution, as explained in an accompanying analytic formulation. Interestingly, both the numerical and analytic models show good agreement with actual inventories of non-uniformly driven events in nature.
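The uniformly driven sandpile forming the first layer of such a cascade is, in its classic form, the Bak-Tang-Wiesenfeld model. A minimal single-layer sketch (the inter-sandpile coupling of the paper is not reproduced here, and the central-drive choice is mine):

```python
import numpy as np

def relax(grid, zc=4):
    """Topple a BTW sandpile until every cell is below the threshold zc;
    return the avalanche size (total number of topplings). Grains that
    fall off the open boundary are lost."""
    size = 0
    while True:
        unstable = np.argwhere(grid >= zc)
        if len(unstable) == 0:
            return size
        for i, j in unstable:
            grid[i, j] -= zc
            size += 1
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < grid.shape[0] and 0 <= nj < grid.shape[1]:
                    grid[ni, nj] += 1

def drive(n_grains, shape=(11, 11)):
    """Add grains one at a time at the centre, recording avalanche sizes."""
    grid = np.zeros(shape, int)
    sizes = []
    for _ in range(n_grains):
        grid[shape[0] // 2, shape[1] // 2] += 1
        sizes.append(relax(grid))
    return grid, sizes
```

The avalanche-size list `sizes` is exactly the sequence that, in the paper's construction, would be fed forward to drive the second layer.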
Rapp, J.B.
1991-01-01
Q-mode factor analysis was used to quantitate the distribution of the major aliphatic hydrocarbon (n-alkanes, pristane, phytane) systems in sediments from a variety of marine environments. The compositions of the pure end members of the systems were obtained from factor scores and the distribution of the systems within each sample was obtained from factor loadings. All the data, from the diverse environments sampled (estuarine (San Francisco Bay), fresh-water (San Francisco Peninsula), polar-marine (Antarctica) and geothermal-marine (Gorda Ridge) sediments), were reduced to three major systems: a terrestrial system (mostly high molecular weight aliphatics with odd-numbered-carbon predominance), a mature system (mostly low molecular weight aliphatics without predominance) and a system containing mostly high molecular weight aliphatics with even-numbered-carbon predominance. With this statistical approach, it is possible to assign the percentage contribution from various sources to the observed distribution of aliphatic hydrocarbons in each sediment sample. © 1991.
International Nuclear Information System (INIS)
Aringazin, A.K.; Mazhitov, M.I.
2003-01-01
We describe a formal procedure to obtain and specify the general form of a marginal distribution for the Lagrangian acceleration of a fluid particle in developed turbulent flow, using a Langevin-type equation and the assumption that the velocity fluctuation u follows a normal distribution with zero mean, in accord with the Heisenberg-Yaglom picture. For a particular representation, β=exp[u], of the fluctuating parameter β, we reproduce the underlying log-normal distribution and the associated marginal distribution, which was found to be in very good agreement with the new experimental data by Crawford, Mordant, and Bodenschatz on the acceleration statistics. We discuss possible refinements of the log-normal model.
Statistical distributions of earthquakes and related non-linear features in seismic waves
International Nuclear Information System (INIS)
Apostol, B.-F.
2006-01-01
A few basic facts in the science of earthquakes are briefly reviewed. An accumulation, or growth, model is put forward for the focal mechanisms and the critical focal zone of earthquakes, which relates the average earthquake recurrence time to the released seismic energy. The temporal statistical distribution of the average recurrence time is introduced for earthquakes and, on this basis, the Omori-type distribution in energy is derived, as well as the distribution in magnitude, by making use of the semi-empirical Gutenberg-Richter law relating seismic energy to earthquake magnitude. On geometric grounds, the accumulation model suggests the value r = 1/3 for the Omori parameter in the power law of the energy distribution, which leads to β = 1.17 for the coefficient in the Gutenberg-Richter recurrence law, in fair agreement with the statistical analysis of the empirical data. Making use of this value, the empirical Båth law for the average magnitude of the aftershocks (which is 1.2 less than the magnitude of the main seismic shock) is discussed, by assuming that the aftershocks are relaxation events of the seismic zone. The time distribution of earthquakes with a fixed average recurrence time is also derived, earthquake occurrence prediction is discussed by means of the average recurrence time and the seismicity rate, and the application of this discussion to the seismic region Vrancea, Romania, is outlined. Finally, a special effect of the non-linear behaviour of seismic waves is discussed, by describing an exact solution derived recently for the elastic wave equation with cubic anharmonicities, its relevance, and its connection to the approximate quasi-plane-wave picture. The properties of the seismic activity accompanying a main seismic shock, both as foreshocks and as aftershocks, are relegated to forthcoming publications. (author)
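The Gutenberg-Richter magnitude distribution invoked here is routinely fitted in practice with Aki's maximum-likelihood estimator for the b-value. A standard seismology sketch, not part of the author's accumulation model:

```python
import numpy as np

def gutenberg_richter_b(magnitudes, m_min):
    """Aki's maximum-likelihood b-value for the Gutenberg-Richter law
    log10 N(>=M) = a - b*M, using only events with M >= m_min
    (the catalog's completeness magnitude)."""
    m = np.asarray(magnitudes, float)
    m = m[m >= m_min]
    return np.log10(np.e) / (m.mean() - m_min)
```

The estimator follows because magnitudes above m_min are exponentially distributed under the Gutenberg-Richter law, so b is fixed by their mean excess over m_min.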
Automatic generation of 3D statistical shape models with optimal landmark distributions.
Heimann, T; Wolf, I; Meinzer, H-P
2007-01-01
To point out the problem of non-uniform landmark placement in statistical shape modeling, to present an improved method for generating landmarks in the 3D case, and to propose an unbiased evaluation metric to determine model quality. Our approach minimizes a cost function based on the minimum description length (MDL) of the shape model to optimize landmark correspondences over the training set. In addition to the standard technique, we employ an extended remeshing method to change the landmark distribution without losing correspondences, thus ensuring a uniform distribution over all training samples. To break the dependency of the established evaluation measures, generalization and specificity, on the landmark distribution, we change the internal metric from landmark distance to volumetric overlap. Redistributing landmarks to an equally spaced distribution during the model construction phase improves the quality of the resulting models significantly if the shapes feature prominent bulges or other complex geometry. The distribution of landmarks on the training shapes is, beyond the correspondence issue, a crucial point in model construction.
Fitting Statistical Distribution Functions to Ozone Concentration Data at Coastal Areas
International Nuclear Information System (INIS)
Muhammad Yazid Nasir; Nurul Adyani Ghazali; Muhammad Izwan Zariq Mokhtar; Norhazlina Suhaimi
2016-01-01
Ozone is known as one of the pollutants that contribute to the air pollution problem. Therefore, it is important to carry out studies on ozone. The objective of this study is to find the best statistical distribution for ozone concentration. Three distributions, namely the Inverse Gaussian, Weibull and Lognormal, were chosen to fit one year of hourly average ozone concentration data from 2010 at Port Dickson and Port Klang. The maximum likelihood estimation (MLE) method was used to estimate the parameters and to develop the probability density function (PDF) and cumulative distribution function (CDF) graphs. Three performance indicators (PI), namely the normalized absolute error (NAE), prediction accuracy (PA) and coefficient of determination (R²), were used to assess the goodness of fit of the distributions. Results show that the Weibull distribution is the best distribution, with the smallest error measure (NAE) at Port Klang and Port Dickson of 0.08 and 0.31, respectively. The best score for the adequacy measure (PA: 0.99) is accompanied by R² values of 0.98 (Port Klang) and 0.99 (Port Dickson). These results provide useful information to local authorities for prediction purposes. (author)
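The fit-and-score procedure described above can be sketched with synthetic data (the actual 2010 ozone records are not reproduced here). Only the lognormal case is shown, since its MLE is closed-form; the Weibull and Inverse Gaussian fits would need iterative optimization, e.g. via `scipy.stats`:

```python
import numpy as np

rng = np.random.default_rng(1)
# Stand-in for one year of hourly ozone concentrations (ppb); the real
# Port Dickson / Port Klang data are not reproduced here.
obs = rng.lognormal(mean=3.0, sigma=0.4, size=8760)

# Lognormal MLE is closed-form: mu and sigma are the mean and std of the log-data.
logs = np.log(obs)
mu_hat, sigma_hat = logs.mean(), logs.std()

# Goodness of fit via the normalized absolute error (NAE) between sorted
# observations and same-rank draws from the fitted distribution.
fitted = np.sort(rng.lognormal(mu_hat, sigma_hat, size=obs.size))
nae = np.abs(fitted - np.sort(obs)).sum() / np.sort(obs).sum()
```

A small NAE here simply confirms the fitted distribution reproduces the sample quantiles; comparing NAE across candidate distributions is how the study ranks them.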
Statistical models for the analysis of water distribution system pipe break data
International Nuclear Information System (INIS)
Yamijala, Shridhar; Guikema, Seth D.; Brumbelow, Kelly
2009-01-01
The deterioration of pipes leading to pipe breaks and leaks in urban water distribution systems is of concern to water utilities throughout the world. Pipe breaks and leaks may result in reduction in the water-carrying capacity of the pipes and contamination of water in the distribution systems. Water utilities incur large expenses in the replacement and rehabilitation of water mains, making it critical to evaluate the current and future condition of the system for maintenance decision-making. This paper compares different statistical regression models proposed in the literature for estimating the reliability of pipes in a water distribution system on the basis of short time histories. The goals of these models are to estimate the likelihood of pipe breaks in the future and determine the parameters that most affect the likelihood of pipe breaks. The data set used for the analysis comes from a major US city, and these data include approximately 85,000 pipe segments with nearly 2500 breaks from 2000 through 2005. The results show that the set of statistical models previously proposed for this problem do not provide good estimates with the test data set. However, logistic generalized linear models do provide good estimates of pipe reliability and can be useful for water utilities in planning pipe inspection and maintenance.
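A logistic generalized linear model of break probability, the model class the study found to work well, can be sketched as follows. The covariates (`age`, `length`) and their true effects are invented for illustration, and the fit uses plain Newton iterations rather than a statistics package:

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic stand-in for pipe-segment records: age (years) and length (m),
# with break probability rising with both (illustrative effects, not the study's).
n = 5000
age = rng.uniform(0, 80, n)
length = rng.uniform(10, 500, n)
X = np.column_stack([np.ones(n), age / 80, length / 500])
true_w = np.array([-3.0, 2.0, 1.0])
y = rng.random(n) < 1 / (1 + np.exp(-X @ true_w))  # simulated break indicator

# Logistic GLM via Newton-Raphson on the log-likelihood
w = np.zeros(3)
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ w))
    H = X.T @ (X * (p * (1 - p))[:, None])     # observed information
    w = w + np.linalg.solve(H, X.T @ (y - p))  # Newton step

p_hat = 1 / (1 + np.exp(-X @ w))  # per-segment break probability
```

The fitted coefficients recover the simulated effects, and the per-segment probabilities `p_hat` are exactly the quantity a utility would rank segments by when planning inspections.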
Statistical Distribution of Fatigue Life for Cast TiAl Alloy
Directory of Open Access Journals (Sweden)
WAN Wenjuan
2016-08-01
The statistical distribution of fatigue life data and its controlling factors for cast Ti-47.5Al-2.5V-1.0Cr-0.2Zr (atom fraction/%) alloy were investigated. Fatigue tests were conducted by means of load-controlled rotating bending fatigue tests (R = -1) performed at a frequency of 100 Hz at 750 °C in air. The fracture mechanism was analyzed by observing the fracture surface morphologies with a scanning electron microscope, and the fatigue life data were analyzed by Weibull statistics. The results show that the fatigue life data present remarkable scatter, ranging from 10³ to 10⁶ cycles, and are distributed mainly in the short- and long-life regimes. The reason for this phenomenon is that the fatigue crack initiators differ between specimens: cracks in short-life specimens initiate at shrinkage porosity, while those in long-life specimens initiate at bridged porosity interfaces and soft-oriented lamellar interfaces. Based on the fracture surface observations, a two-parameter Weibull distribution model for the fatigue life data can be used to predict fatigue life at a given failure probability. Shrinkage porosity was also shown to be the most detrimental to fatigue life.
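A two-parameter Weibull fit of fatigue lives, and the resulting life at a fixed failure probability, can be sketched as follows. The data are synthetic stand-ins, and the fit uses median-rank regression (a common fatigue-life technique), not necessarily the paper's exact procedure:

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic fatigue lives (cycles); the real TiAl data spanned roughly
# 1e3 to 1e6 cycles and are not reproduced here.
shape_true, scale_true = 1.2, 2.0e5
lives = np.sort(scale_true * rng.weibull(shape_true, size=60))

# Median-rank regression: the Weibull CDF F = 1 - exp(-(N/scale)^shape)
# linearizes as ln(-ln(1-F)) = shape*ln(N) - shape*ln(scale).
n = lives.size
F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # Bernard's median-rank estimate
yy = np.log(-np.log(1 - F))
xx = np.log(lives)
slope, intercept = np.polyfit(xx, yy, 1)
shape_hat = slope
scale_hat = np.exp(-intercept / slope)

# Life at a given failure probability from the fitted CDF
def life_at(prob):
    return scale_hat * (-np.log(1 - prob)) ** (1 / shape_hat)

b10, b50 = life_at(0.10), life_at(0.50)  # 10% and 50% failure-probability lives
```

The `life_at` step is exactly the "prediction of fatigue life at a certain failure probability" the abstract refers to.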
Determination of the Temperature Distribution of Perforated Fins under Natural Convection
Directory of Open Access Journals (Sweden)
Aziz M. Mhamuad
2015-02-01
This work treats the problem of heat transfer from perforated fins under natural convection. The temperature distribution is examined for an array of rectangular fins (15 fins) with uniform cross-sectional area (100 x 270 mm) embedded with various vertical body perforations that extend through the fin thickness. The perforation pattern comprises 18 circular perforations (holes). Experiments were carried out in a facility specifically designed and constructed for this purpose. The heat transfer rate and the heat transfer coefficient increase as the perforation diameter increases.
Kovalevsky, Louis; Langley, Robin S.; Caro, Stephane
2016-05-01
Due to the high cost of experimental EMI measurements, significant attention has been focused on numerical simulation. Classical methods such as the Method of Moments or Finite-Difference Time-Domain are not well suited to this type of problem, as they require a fine discretisation of space and fail to take uncertainties into account. In this paper, the authors show that Statistical Energy Analysis (SEA) is well suited for this type of application. SEA is a statistical approach employed to solve high-frequency problems of electromagnetically reverberant cavities at a reduced computational cost. The key aspects of this approach are (i) to consider an ensemble of systems that share the same gross parameters, and (ii) to avoid solving Maxwell's equations inside the cavity, using the power balance principle. The output is an estimate of the field magnitude distribution in each cavity. The method is applied to a typical aircraft structure.
International Nuclear Information System (INIS)
Ayodele, T.R.; Ogunjuyigbe, A.S.O.
2015-01-01
In this paper, a probability distribution of the clearness index is proposed for the prediction of global solar radiation. First, the clearness index is obtained from past global solar radiation data; then, the parameters of the distribution that best fits the clearness index are determined. The global solar radiation is thereafter predicted from the clearness index using inverse transformation of the cumulative distribution function. To validate the proposed method, eight years of global solar radiation data (2000-2007) for Ibadan, Nigeria are used to determine the parameters of the appropriate probability distribution for the clearness index. The calculated parameters are then used to predict the future monthly average global solar radiation for the following year (2008). The predicted values are compared with the measured values using four statistical tests: the Root Mean Square Error (RMSE), Mean Absolute Error (MAE), Mean Absolute Percentage Error (MAPE) and the coefficient of determination (R²). The proposed method is also compared to existing regression models. The results show that the logistic distribution provides the best fit for the clearness index of Ibadan, and the proposed method is effective in predicting the monthly average global solar radiation with an overall RMSE of 0.383 MJ/m²/day, MAE of 0.295 MJ/m²/day, MAPE of 2% and R² of 0.967. - Highlights: • A distribution of the clearness index is proposed for prediction of global solar radiation. • The clearness index is obtained from past global solar radiation data. • The parameters of the distribution that best fit the clearness index are determined. • Solar radiation is predicted from the clearness index using inverse transformation. • The method is effective in predicting the monthly average global solar radiation.
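The inverse-transformation step described above can be sketched for the logistic distribution, whose quantile function is closed-form. The clearness-index sample, the method-of-moments fit, and the extraterrestrial radiation constant `H0` are all illustrative stand-ins for the paper's data and estimates:

```python
import numpy as np

rng = np.random.default_rng(4)
# Stand-in monthly clearness-index values (the study fitted 2000-2007 Ibadan data).
kt = np.clip(rng.normal(0.5, 0.06, size=96), 0.05, 0.95)

# Method-of-moments fit of the logistic distribution (scale = std*sqrt(3)/pi),
# a simpler stand-in for the paper's parameter estimation.
loc = kt.mean()
scale = kt.std() * np.sqrt(3) / np.pi

# Inverse transformation: push uniform draws through the logistic quantile
# function Q(u) = loc + scale*ln(u/(1-u)), then convert clearness index to
# radiation with an assumed extraterrestrial value H0.
u = np.clip(rng.random(10_000), 1e-12, 1 - 1e-12)  # guard against log(0)
kt_sim = loc + scale * np.log(u / (1 - u))
H0 = 35.0  # MJ/m^2/day, assumed constant for illustration only
H_pred = kt_sim * H0
```

Averaging `H_pred` over the draws for each month is the analogue of the paper's monthly average prediction, which it then scores with RMSE, MAE, MAPE and R².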
Energy Technology Data Exchange (ETDEWEB)
NONE
2008-07-01
The general directorate of energy and raw materials (DGEMP) is in charge of the follow-up of the French market of petroleum products distribution. Each year, a statistical inquiry is carried out with the suppliers and published in the annual report of the petroleum industry. A particular emphasis is laid on fuel sales at highway service stations, which provide some additional information. The 2007 sales remain stable with respect to 2006, but they show a significant progress of diesel with respect to gasoline. Super-ethanol (E85) sales remain modest but showed an important rise throughout the year. Finally, 70% of the market is in the hands of only 3 suppliers. (J.S.)
Directory of Open Access Journals (Sweden)
M.M. Mohie El-Din
2011-10-01
In this paper, two-sample Bayesian prediction intervals for order statistics (OS) are obtained. This prediction is based on a certain class of inverse exponential-type distributions using a right-censored sample. A general class of prior density functions is used, and the predictive cumulative distribution function is obtained in the two-sample case. The class of inverse exponential-type distributions includes several important distributions, such as the inverse Weibull distribution, the inverse Burr distribution, the loglogistic distribution, the inverse Pareto distribution and the inverse paralogistic distribution. Special cases of the inverse Weibull model, such as the inverse exponential model and the inverse Rayleigh model, are considered.
Ryu, Jihye; Torres, Elizabeth B.
2018-01-01
The field of enacted/embodied cognition has emerged as a contemporary attempt to connect the mind and body in the study of cognition. However, there has been a paucity of methods that enable a multi-layered approach tapping into different levels of functionality within the nervous systems (e.g., continuously capturing in tandem multi-modal biophysical signals in naturalistic settings). The present study introduces a new theoretical and statistical framework to characterize the influences of cognitive demands on biophysical rhythmic signals harnessed from deliberate, spontaneous and autonomic activities. In this study, nine participants performed a basic pointing task to communicate a decision while they were exposed to different levels of cognitive load. Within these decision-making contexts, we examined the moment-by-moment fluctuations in the peak amplitude and timing of the biophysical time series data (e.g., continuous waveforms extracted from hand kinematics and heart signals). These spike-trains data offered high statistical power for personalized empirical statistical estimation and were well-characterized by a Gamma process. Our approach enabled the identification of different empirically estimated families of probability distributions to facilitate inference regarding the continuous physiological phenomena underlying cognitively driven decision-making. We found that the same pointing task revealed shifts in the probability distribution functions (PDFs) of the hand kinematic signals under study and were accompanied by shifts in the signatures of the heart inter-beat-interval timings. Within the time scale of an experimental session, marked changes in skewness and dispersion of the distributions were tracked on the Gamma parameter plane with 95% confidence. The results suggest that traditional theoretical assumptions of stationarity and normality in biophysical data from the nervous systems are incongruent with the true statistical nature of empirical data.
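Estimating the Gamma (shape, scale) pair that locates a biophysical signal on the "Gamma parameter plane" can be sketched with the method of moments; the inter-beat-interval data below are synthetic, and the study's own estimation procedure may differ:

```python
import numpy as np

rng = np.random.default_rng(5)
# Synthetic inter-beat intervals (seconds); person-specific, empirically
# estimated Gamma parameters were the object of the study.
ibi = rng.gamma(shape=20.0, scale=0.04, size=3000)

# Method-of-moments Gamma fit: shape = mean^2/var, scale = var/mean.
m, v = ibi.mean(), ibi.var()
shape_hat = m * m / v
scale_hat = v / m

# Skewness and dispersion follow directly from the fitted parameters,
# which is what makes shifts easy to track on the (shape, scale) plane.
skew_hat = 2 / np.sqrt(shape_hat)   # Gamma skewness
dispersion = scale_hat              # noise-to-signal ratio var/mean
```

Tracking how `(shape_hat, scale_hat)` moves between experimental conditions is the essence of the parameter-plane analysis described in the abstract.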
Gossip and Distributed Kalman Filtering: Weak Consensus Under Weak Detectability
Kar, Soummya; Moura, José M. F.
2011-04-01
The paper presents the gossip interactive Kalman filter (GIKF) for distributed Kalman filtering in networked systems and sensor networks, where inter-sensor communication and observations occur at the same time-scale. The communication among sensors is random; each sensor occasionally exchanges its filtering state information with a neighbor depending on the availability of the appropriate network link. We show that under a weak distributed detectability condition: 1. the GIKF error process remains stochastically bounded, irrespective of the instability properties of the random process dynamics; and 2. the network achieves weak consensus, i.e., the conditional estimation error covariance at a (uniformly) randomly selected sensor converges in distribution to a unique invariant measure on the space of positive semi-definite matrices (independent of the initial state). To prove these results, we interpret the filtered states (estimates and error covariances) at each node of the GIKF as stochastic particles with local interactions. We analyze the asymptotic properties of the error process by studying the associated switched (random) Riccati equation as a random dynamical system, the switching being dictated by a non-stationary Markov chain on the network graph.
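The randomized pairwise interaction pattern underlying the GIKF can be illustrated with a much simpler scalar gossip-averaging sketch. This is not the filter itself (the GIKF also propagates estimates and error covariances through a random Riccati recursion), only the gossip mechanism that drives consensus:

```python
import numpy as np

rng = np.random.default_rng(6)
# Ten sensors hold scalar states; at each tick a uniformly random pair
# exchanges and averages its values, as in gossip protocols.
x = rng.normal(0.0, 5.0, size=10)
target = x.mean()  # pairwise averaging preserves the network mean

for _ in range(2000):
    i, j = rng.choice(10, size=2, replace=False)
    x[i] = x[j] = 0.5 * (x[i] + x[j])

spread = x.max() - x.min()  # shrinks geometrically toward consensus
```

After a few thousand random pairwise exchanges the states agree to machine precision while the network average is conserved, the scalar analogue of the consensus behaviour proved for the GIKF error covariances.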
Distributed and organized decision making under resource boundedness
International Nuclear Information System (INIS)
Sawaragi, Tetsuo
1994-01-01
The coming bottleneck to be overcome in the era of distributed and open-architectured environments will be the rational design and coordination of total systems in which multiple decision makers, problem solvers and automated machinery components coexist and interact with each other. In such an environment, they do not achieve some absolute standard of performance with unlimited resources or with simple algorithms, but do as well as possible given the resources they have. In this article, we focus on the potential of decision theory as a tool for tackling limited rationality under resource boundedness. First, the bottlenecks in establishing organized and distributed decision making are summarized, and the importance of formalizing the decision activities of intelligent agents is stressed in order to establish efficient and effective cooperation through distributed and organized decision making and/or problem solving. Some practical systems developed on this principle are reviewed briefly with respect to real-time man-machine collaboration and a cooperative computational framework for intelligent mobile robots. (author)
Energy Technology Data Exchange (ETDEWEB)
Murata, Isao [Osaka Univ., Suita (Japan); Mori, Takamasa; Nakagawa, Masayuki; Itakura, Hirofumi
1996-03-01
A method to calculate the neutronics parameters of a core composed of randomly distributed spherical fuels has been developed, based on a statistical geometry model with a continuous-energy Monte Carlo method. This method was implemented in the general-purpose Monte Carlo code MCNP, and a new code, MCNP-CFP, has been developed. This paper describes the model, the method of its use, and the validation results. In the Monte Carlo calculation, the location of a spherical fuel is sampled probabilistically along the particle flight path from the spatial probability distribution of spherical fuels, called the nearest neighbor distribution (NND). This sampling method was validated through the following two comparisons: (1) calculations of the inventory of coated fuel particles (CFPs) in a fuel compact by both a track length estimator and a direct evaluation method, and (2) criticality calculations for ordered packed geometries. The method was also confirmed by applying it to an analysis of the critical assembly experiment at VHTRC. The method established in the present study is unique in that it provides a probabilistic model of a geometry with a great number of randomly distributed spherical fuels. With future speed-up by vector or parallel computation, it is expected to be widely used in the calculation of nuclear reactor cores, especially HTGR cores. (author)
International Nuclear Information System (INIS)
Luneva, K.V.; Kryshev, A.I.; Nikitin, A.I.; Kryshev, I.I.
2010-01-01
The article presents the results of a statistical analysis of radiation monitoring data on contamination of the Techa-Iset'-Tobol-Irtysh river system. A short description of the analyzed data and the territory under consideration is given. The distribution-free statistical methods used for the comparative analysis are described, together with the reasons for their selection and the features of their application. A comparative analysis with traditional statistical methods is presented. A reliable decrease of ⁹⁰Sr specific activity from object to object along the river system was determined, which is evidence of radionuclide transport in the Techa-Iset'-Tobol-Irtysh river system.
Perneger, Thomas V; Combescure, Christophe
2017-07-01
Published P-values provide a window into the global enterprise of medical research. The aim of this study was to use the distribution of published P-values to estimate the relative frequencies of null and alternative hypotheses and to seek irregularities suggestive of publication bias. This cross-sectional study included P-values published in 120 medical research articles in 2016 (30 each from the BMJ, JAMA, the Lancet, and the New England Journal of Medicine). The observed distribution of P-values was compared with the expected distributions under the null hypothesis (i.e., uniform between 0 and 1) and the alternative hypothesis (strictly decreasing from 0 to 1). P-values were categorized according to conventional levels of statistical significance and in one-percent intervals. Among 4,158 recorded P-values, 26.1% were highly significant; irregularities in the distribution included P-values equal to 1 and about twice as many P-values less than 0.05 as above 0.05. The latter finding was seen in both randomized trials and observational studies, and in most types of analyses, excepting heterogeneity tests and interaction tests. Under plausible assumptions, we estimate that about half of the tested hypotheses were null and the other half were alternative. This analysis suggests that statistical tests published in medical journals are not a random sample of null and alternative hypotheses, but that selective reporting is prevalent. In particular, significant results are about twice as likely to be reported as nonsignificant results.
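The mechanism the authors infer, a roughly 50/50 mix of null and alternative hypotheses plus selective reporting, can be simulated with one-sided z-tests. The effect size (2.5 SE under the alternative) and the 30% publication rate for nonsignificant results are illustrative assumptions, not the paper's estimates:

```python
import math
import random

random.seed(7)
Phi = lambda z: 0.5 * (1 + math.erf(z / math.sqrt(2)))  # standard normal CDF

# Half null (effect 0, P uniform), half alternative (effect 2.5 SE).
pvals = []
for _ in range(20_000):
    effect = 0.0 if random.random() < 0.5 else 2.5
    z = random.gauss(effect, 1.0)
    pvals.append(1 - Phi(z))  # one-sided P-value

below = sum(p < 0.05 for p in pvals)
ratio_all = below / (len(pvals) - below)  # P<0.05 vs P>0.05 before selection

# Selective reporting: all significant results published, only a fraction
# of nonsignificant ones (assumed 30% here).
published = [p for p in pvals if p < 0.05 or random.random() < 0.3]
pub_below = sum(p < 0.05 for p in published)
pub_ratio = pub_below / (len(published) - pub_below)
```

Before selection the significant-to-nonsignificant ratio is below 1; selective publication pushes it to roughly 2, the irregularity the study observed in print.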
Simulation of snow distribution and melt under cloudy conditions in an Alpine watershed
Directory of Open Access Journals (Sweden)
H.-Y. Li
2011-07-01
An energy balance method and remote-sensing data were used to simulate snow distribution and melt in an alpine watershed in northwestern China over a complete snow accumulation-melt period. The spatial energy budgets were simulated using meteorological observations and a digital elevation model of the watershed. A linear interpolation method was used to estimate the daily snow cover area under cloudy conditions, using Moderate Resolution Imaging Spectroradiometer (MODIS) data. Hourly snow distribution and melt, snow cover extent and daily discharge were included in the simulated results. The root mean square error between the measured snow-water equivalent samples and the simulated results is 3.2 cm. The Nash-Sutcliffe efficiency statistic (NSE) between the measured and simulated discharges is 0.673, and the volume difference (Dv) is 3.9%. Using the method introduced in this article, modelling spatial snow distribution and melt runoff becomes relatively convenient.
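The two skill scores quoted for the discharge simulation, the Nash-Sutcliffe efficiency and the volume difference, are straightforward to compute; the short hydrographs below are invented for illustration:

```python
import numpy as np

# Toy measured and simulated discharge series (arbitrary units).
obs = np.array([1.0, 1.4, 2.5, 4.0, 3.1, 2.0, 1.2])
sim = np.array([1.1, 1.3, 2.2, 3.6, 3.3, 2.2, 1.1])

# Nash-Sutcliffe efficiency: 1 means a perfect match, 0 means no better
# than predicting the observed mean.
nse = 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Volume difference Dv (percent): mismatch in total simulated volume.
dv = 100 * (obs.sum() - sim.sum()) / obs.sum()
```

On these toy series the NSE is high and Dv is a few percent, the same kind of summary the study reports (NSE 0.673, Dv 3.9%) for its full hydrographs.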
Energy Technology Data Exchange (ETDEWEB)
Fhager, V
2000-01-01
In order to make correct predictions of the second moment of statistical nuclear variables, such as the number of fissions and the number of thermalized neutrons, the dependence of the energy distribution of the source particles on their number should be considered. It has been pointed out recently that neglecting this number dependence in accelerator-driven systems might result in bad estimates of the second moment, and this paper contains qualitative and quantitative estimates of the size of these effects. We proceed towards the requested results in two steps. First, models of the number-dependent energy distributions of the neutrons ejected in spallation reactions are constructed, both by simple assumptions and by extracting energy distributions of spallation neutrons from a high-energy particle transport code. Then, the second moment of nuclear variables in a sub-critical reactor, into which spallation neutrons are injected, is calculated. The results from second moment calculations using number-dependent energy distributions for the source neutrons are compared to those where only the average energy distribution is used. Two physical models are employed to simulate the neutron transport in the reactor. One is analytical, treating only the slowing down of neutrons by elastic scattering in the core material. For this model, equations are written down and solved for the second moment of thermalized neutrons that include the energy distribution of the spallation neutrons. The other model uses Monte Carlo methods to track the source neutrons as they travel inside the reactor material. Fast and thermal fission reactions are considered, as well as neutron capture and elastic scattering, and the second moment of the number of fissions, the number of neutrons that leaked out of the system, etc. are calculated. Both models use a cylindrical core with a homogeneous mixture of core material. Our results indicate that the number dependence of the energy
Game Related Statistics Which Discriminate Between Winning and Losing Under-16 Male Basketball Games
Lorenzo, Alberto; Gómez, Miguel Ángel; Ortega, Enrique; Ibáñez, Sergio José; Sampaio, Jaime
2010-01-01
The aim of the present study was to identify the game-related statistics which discriminate between winning and losing teams in under-16 male basketball games. The sample gathered all 122 games in the 2004 and 2005 Under-16 European Championships. The game-related statistics analysed were free-throws (both successful and unsuccessful), 2- and 3-point field-goals (both successful and unsuccessful), offensive and defensive rebounds, blocks, assists, fouls, turnovers and steals. The winning teams exhibited lower ball possessions per game and better offensive and defensive efficacy coefficients than the losing teams. Results from discriminant analysis were statistically significant and allowed several structure coefficients (SC) to be emphasized. In close games (final score differences below 9 points), the discriminant variables were turnovers (SC = -0.47) and assists (SC = 0.33). In balanced games (final score differences between 10 and 29 points), the variables that discriminated between the groups were successful 2-point field-goals (SC = -0.34) and defensive rebounds (SC = -0.36); and in unbalanced games (final score differences above 30 points) the variable that best discriminated both groups was successful 2-point field-goals (SC = 0.37). These results show that these players' specific characteristics result in a different game-related statistical profile and point out the importance of the perceptive and decision-making process in practice and in competition. Key points: The players' game-related statistical profile varied according to game type, game outcome and formative category in basketball. The results help point out the different player performance described in U-16 men's basketball teams compared with senior and professional men's basketball teams. The results enhance the importance of the perceptive and decision-making process in practice and in competition.
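A two-group discriminant analysis with structure coefficients, the statistic reported above, can be sketched with Fisher's linear discriminant on invented winner/loser stats (turnovers and assists); the means and spreads below are illustrative, not the championship data:

```python
import numpy as np

rng = np.random.default_rng(11)
# Toy per-game stats [turnovers, assists]: winners commit fewer turnovers
# and give more assists, mimicking the close-games finding.
win = np.column_stack([rng.normal(12, 3, 60), rng.normal(14, 3, 60)])
lose = np.column_stack([rng.normal(16, 3, 60), rng.normal(11, 3, 60)])
X = np.vstack([win, lose])

# Fisher's two-group discriminant: w = Sw^{-1} (m_win - m_lose)
Sw = np.cov(win, rowvar=False) + np.cov(lose, rowvar=False)
w = np.linalg.solve(Sw, win.mean(0) - lose.mean(0))
scores = X @ w

# Structure coefficients (SC): correlation of each raw variable with the
# discriminant scores, the quantity the study reports per game type.
sc = np.array([np.corrcoef(X[:, j], scores)[0, 1] for j in range(2)])
```

With this setup the turnover SC comes out negative and the assist SC positive, the same sign pattern the study found for close games (SC = -0.47 and 0.33).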
International Nuclear Information System (INIS)
Heinrich, S.
2006-01-01
The nuclear fission process is a very complex phenomenon and, even nowadays, no realistic models describing the overall process are available. The work presented here deals with a theoretical description of fission fragment distributions in mass, charge, energy and deformation. We have reconsidered and updated the Wilkins scission point model. Our purpose was to test whether this statistical model, applied at the scission point and incorporating new results of modern microscopic calculations, allows the fission fragment distributions to be described quantitatively. We calculate the surface energy available at the scission point as a function of the fragment deformations. This surface is obtained from a Hartree-Fock-Bogoliubov microscopic calculation, which guarantees a realistic description of the dependence of the potential on the deformation of each fragment. The statistical balance is described by the level densities of the fragments. We have tried to avoid as much as possible the input of empirical parameters in the model. Our only parameter, the distance between the fragments at the scission point, is discussed by comparison with scission configurations obtained from fully dynamical microscopic calculations. The comparison between our results and experimental data is very satisfying and allows us to discuss the successes and limitations of our approach. We finally propose ideas to improve the model, in particular by applying dynamical corrections. (author)
Directory of Open Access Journals (Sweden)
Hsueh-Hsien Chang
2017-04-01
This paper proposes statistical feature extraction methods combined with artificial intelligence (AI) approaches for fault location in non-intrusive single-line-to-ground fault (SLGF) detection in low voltage distribution systems. The input features of the AI algorithms are extracted using a statistical moment transformation to reduce the dimensions of the power signature inputs measured using non-intrusive fault monitoring (NIFM) techniques. The data required to develop the network are generated by simulating SLGFs using the Electromagnetic Transients Program (EMTP) in a test system. To enhance the identification accuracy, these features are normalized and then given to the AI algorithms presented and evaluated in this paper. Different AI techniques are then compared to determine which identification algorithms are suitable for diagnosing SLGFs for various power signatures in a NIFM system. The simulation results show that the proposed method is effective and can identify fault locations using non-intrusive monitoring techniques in low voltage distribution systems.
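A statistical moment transformation of the kind described, collapsing a long power signature into a handful of moment features before feeding a classifier, can be sketched as follows; the sinusoid-plus-noise signal is a stand-in for a measured power waveform:

```python
import numpy as np

rng = np.random.default_rng(8)
# Stand-in power signature: a sinusoid with additive measurement noise.
signal = np.sin(np.linspace(0, 8 * np.pi, 1024)) + 0.1 * rng.normal(size=1024)

def moment_features(x):
    """Reduce a 1024-sample waveform to four moment features."""
    mu = x.mean()
    sd = x.std()
    skew = np.mean(((x - mu) / sd) ** 3)   # standardized 3rd moment
    kurt = np.mean(((x - mu) / sd) ** 4)   # standardized 4th moment
    return np.array([mu, sd, skew, kurt])

feats = moment_features(signal)  # 1024 inputs -> 4 features
```

The dimensionality reduction is the point: the AI stage then trains on a few moment features per signature instead of the raw waveform.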
Coelho, Carlos A.; Marques, Filipe J.
2013-09-01
In this paper the authors combine the equicorrelation and equivariance test introduced by Wilks [13] with the likelihood ratio test (l.r.t.) for independence of groups of variables to obtain the l.r.t. of block equicorrelation and equivariance. This test, or its single-block version, may find applications in many areas, such as psychology, education, medicine and genetics, and is important "in many tests of multivariate analysis, e.g. in MANOVA, Profile Analysis, Growth Curve analysis, etc." [12, 9]. By decomposing the overall hypothesis into the hypothesis of independence of groups of variables and the hypothesis of equicorrelation and equivariance, we are able to obtain expressions for the overall l.r.t. statistic and its moments. From these we obtain a suitable factorization of the characteristic function (c.f.) of the logarithm of the l.r.t. statistic, which enables us to develop highly manageable and precise near-exact distributions for the test statistic.
Han, Fang; Liu, Han
2016-01-01
Correlation matrices play a key role in many multivariate methods (e.g., graphical model estimation and factor analysis). The current state-of-the-art in estimating large correlation matrices focuses on the use of Pearson's sample correlation matrix. Although Pearson's sample correlation matrix enjoys various good properties under Gaussian models, it is not an effective estimator when facing heavy-tailed distributions. As a robust alternative, Han and Liu [J. Am. Stat. Assoc. 109 (2015) 275-2...
Han, Fang; Liu, Han
2017-02-01
The correlation matrix plays a key role in many multivariate methods (e.g., graphical model estimation and factor analysis). The current state of the art in estimating large correlation matrices focuses on the use of Pearson's sample correlation matrix. Although Pearson's sample correlation matrix enjoys various good properties under Gaussian models, it is not an effective estimator when facing heavy-tailed distributions with possible outliers. As a robust alternative, Han and Liu (2013b) advocated the use of a transformed version of the Kendall's tau sample correlation matrix for estimating the high-dimensional latent generalized correlation matrix under the transelliptical distribution family (or elliptical copula). The transelliptical family assumes that, after unspecified marginal monotone transformations, the data follow an elliptical distribution. In this paper, we study the theoretical properties of the Kendall's tau sample correlation matrix and its transformed version proposed in Han and Liu (2013b) for estimating the population Kendall's tau correlation matrix and the latent Pearson's correlation matrix under both the spectral and the restricted spectral norm. With regard to the spectral norm, we highlight the role of the "effective rank" in quantifying the rate of convergence. With regard to the restricted spectral norm, we present for the first time a "sign subgaussian condition" which is sufficient to guarantee that the rank-based correlation matrix estimator attains the optimal rate of convergence. In both cases, we do not need any moment condition.
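The transformed rank-based estimator discussed above has a compact closed form: the latent Pearson correlation is recovered from Kendall's tau via sin(π·τ/2). A minimal NumPy sketch (a naive O(n²)-per-pair tau without tie handling, not the paper's code):

```python
import numpy as np

def kendall_tau_matrix(X):
    """Pairwise Kendall's tau (tau-a, assumes no ties) for the columns of
    X, an (n samples x d variables) array, from concordant/discordant pairs."""
    n, d = X.shape
    T = np.eye(d)
    for j in range(d):
        for k in range(j + 1, d):
            # sign of all pairwise sample differences for each variable
            sj = np.sign(X[:, j][:, None] - X[:, j][None, :])
            sk = np.sign(X[:, k][:, None] - X[:, k][None, :])
            tau = (sj * sk).sum() / (n * (n - 1))
            T[j, k] = T[k, j] = tau
    return T

def latent_correlation(X):
    """Transformed estimator sin(pi/2 * tau) of the latent Pearson matrix."""
    return np.sin(np.pi / 2 * kendall_tau_matrix(X))
```

For monotone but non-linear dependence the estimate stays at ±1, which is exactly the robustness the transelliptical family exploits.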
A statistical analysis of North East Atlantic (submicron) aerosol size distributions
Directory of Open Access Journals (Sweden)
M. Dall'Osto
2011-12-01
Full Text Available The Global Atmospheric Watch research station at Mace Head (Ireland) offers the possibility to sample some of the cleanest air masses being imported into Europe as well as some of the most polluted being exported out of Europe. We present a statistical cluster analysis of the physical characteristics of aerosol size distributions in air ranging from the cleanest to the most polluted for the year 2008. Data coverage achieved was 75% throughout the year. By applying the Hartigan-Wong k-means method, 12 clusters were identified as systematically occurring. These 12 clusters could be further combined into 4 categories of aerosol size distributions with similar characteristics, namely: a coastal nucleation category (occurring 21.3% of the time), an open ocean nucleation category (32.6% of the time), a background clean marine category (26.1% of the time) and an anthropogenic category (20% of the time). The coastal nucleation category is characterised by a clear and dominant nucleation mode at sizes less than 10 nm, while the open ocean nucleation category is characterised by a dominant Aitken mode between 15 nm and 50 nm. The background clean marine aerosol exhibited a clear bimodality in the sub-micron size distribution, although it should be noted that either the Aitken mode or the accumulation mode may dominate the number concentration. However, peculiar background clean marine size distributions with coarser accumulation modes are also observed during winter months. By contrast, the continentally-influenced size distributions are generally more monomodal (accumulation mode), albeit with traces of bimodality. The open ocean category occurs more often during May, June and July, corresponding with the North East (NE) Atlantic high biological period. Combined with the relatively high frequency of occurrence (32.6%), this suggests that the marine biota is an important source of new nano aerosol particles in NE Atlantic air.
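The clustering step above can be illustrated with a plain Lloyd's k-means on synthetic spectra. Note the paper uses the Hartigan-Wong variant and real Mace Head measurements; both the algorithm variant and the data here are stand-ins:

```python
import numpy as np

def lloyd_kmeans(X, k, n_iter=100, seed=0):
    """Plain Lloyd's k-means (a stand-in for the Hartigan-Wong variant).
    X: (n_spectra, n_size_bins) array of normalized size distributions."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(n_iter):
        # assign each spectrum to its nearest centroid (squared Euclidean)
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        # recompute centroids; keep the old one if a cluster empties out
        new = np.array([X[labels == j].mean(0) if (labels == j).any() else centers[j]
                        for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers
```

With real spectra one would cluster many more bins and inspect the centroids as "characteristic" size distributions, then merge similar clusters into categories as done in the paper.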
Pedretti, Daniele; Beckie, Roger Daniel
2014-05-01
Missing data are ubiquitous in hydrological time-series databases, yet complete time-series knowledge is of fundamental importance for making educated decisions in many practical problems. This includes precipitation datasets, since recording or human failures can produce gaps in these time series. For applications directly involving the ratio between precipitation and some other quantity, the lack of complete information can result in a poor understanding of the basic physical and chemical dynamics of precipitated water. For instance, the ratio between precipitation (recharge) and outflow rates at a discharge point of an aquifer (e.g. rivers, pumping wells, lysimeters) can be used to obtain aquifer parameters and thus to constrain model-based predictions. We tested a suite of methodologies to reconstruct missing information in rainfall datasets. The goal was to obtain a suitable and versatile method to reduce the errors caused by the lack of data in specific time windows. Our analyses included both a classical chronological pairing approach between rainfall stations and a probability-based approach, which accounted for the probability of exceedance of rain depths measured at two or more stations. Our analyses showed that it is not clear a priori which method performs best; rather, the selection should be made considering the specific statistical properties of the rainfall dataset. In this presentation, our emphasis is on the effects of a few typical parametric distributions used to model the behavior of rainfall. Specifically, we analyzed the role of distributional "tails", which exert an important control on the occurrence of extreme rainfall events. The latter strongly affect several hydrological applications, including recharge-discharge relationships. The heavy-tailed distributions we considered were the parametric Log-Normal, Generalized Pareto, Generalized Extreme Value and Gamma distributions. The methods were
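The probability-based reconstruction can be sketched as empirical quantile mapping: a missing value at the target station is assigned the value whose probability of non-exceedance matches that of the concurrent reading at a donor station. This is an illustrative simplification of the approach described, with hypothetical station records:

```python
import numpy as np

def quantile_map(donor_value, target_hist, donor_hist):
    """Fill a missing value at the target station from a concurrent donor
    reading by matching empirical probabilities of non-exceedance."""
    # non-exceedance probability of the donor reading in its own record
    p = (np.asarray(donor_hist) <= donor_value).mean()
    # value with the same non-exceedance probability at the target station
    return np.quantile(np.asarray(target_hist), p)
```

A heavy-tailed fitted distribution could replace the empirical quantiles here, which is where the choice of tail behavior discussed in the abstract enters.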
Shaw, R. L.
1979-01-01
A sample of 228 supernovae that occurred in galaxies with known redshifts is used to show that the mean projected linear supernova distance from the center of the parent galaxy increases with increasing redshift. This effect is interpreted as an observational bias: the discovery rate of supernovae is reduced in the inner parts of distant, poorly resolved galaxies. Even under the optimistic assumption that no selection effects work in galaxies closer than 33 Mpc, about 50% of all supernovae are lost in the inner regions of galaxies beyond 150 Mpc. This observational bias must be taken into account in the derivation of statistical properties of supernovae.
Statistical measurement of the gamma-ray source-count distribution as a function of energy
Zechlin, H.-S.; Cuoco, A.; Donato, F.; Fornengo, N.; Regis, M.
2017-01-01
Photon count statistics have recently been shown to provide a sensitive observable for characterizing gamma-ray source populations and for measuring the composition of the gamma-ray sky. In this work, we generalize the use of the standard 1-point probability distribution function (1pPDF) to decompose the high-latitude gamma-ray emission observed with Fermi-LAT into: (i) point-source contributions, (ii) the Galactic foreground contribution, and (iii) a diffuse isotropic background contribution. We analyze gamma-ray data in five adjacent energy bands between 1 and 171 GeV. We measure the source-count distribution dN/dS as a function of energy, and demonstrate that our results extend current measurements from source catalogs to the regime of so-far undetected sources. Our method improves the sensitivity for resolving point-source populations by about one order of magnitude in flux. The dN/dS distribution as a function of flux is found to be compatible with a broken power law. We derive upper limits on further possible breaks as well as on the angular power of unresolved sources. We discuss the composition of the gamma-ray sky and the capabilities of the 1pPDF method.
International Nuclear Information System (INIS)
Maluckov, Cedomir A.; Karamarkovic, Jugoslav P.; Radovic, Miodrag K.; Pejovic, Momcilo M.
2004-01-01
The convolution-based model of the electrical breakdown time delay distribution is applied to the statistical analysis of experimental results obtained in a neon-filled diode tube at 6.5 mbar. First, the numerical breakdown time delay density distributions are obtained by stochastic modeling as the sum of two independent random variables: the electrical breakdown statistical time delay, with an exponential distribution, and the discharge formative time, with a Gaussian distribution. Then, the single characteristic breakdown time delay distribution is obtained as the convolution of these two random variables with previously determined parameters. These distributions show good correspondence with the experimental distributions obtained on the basis of 1000 successive and independent measurements. The shape of the distributions is investigated, and the corresponding skewness and kurtosis are plotted, in order to follow the transition from a Gaussian to an exponential distribution.
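The convolution model is easy to reproduce numerically: draw the statistical time delay from an exponential distribution, the formative time from a Gaussian, and sum them; the result is the ex-Gaussian distribution whose skewness interpolates between the Gaussian (0) and exponential (2) limits. The parameters below are illustrative, not the measured neon-tube values:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
# statistical time delay: exponential; formative time: Gaussian (assumed params)
t_s = rng.exponential(scale=50.0, size=n)       # mean 50 (arbitrary time units)
t_f = rng.normal(loc=20.0, scale=3.0, size=n)   # mean 20, sd 3
t_d = t_s + t_f                                 # sum ~ convolution (ex-Gaussian)

def skewness(x):
    """Sample skewness: third central moment over cubed standard deviation."""
    x = x - x.mean()
    return (x**3).mean() / x.std()**3

# exponential part dominates here, so skewness approaches the exponential limit 2
skew = skewness(t_d)
```

Sweeping the ratio of the exponential scale to the Gaussian sd reproduces the Gaussian-to-exponential transition that the paper tracks via skewness and kurtosis.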
Node vulnerability of water distribution networks under cascading failures
International Nuclear Information System (INIS)
Shuang, Qing; Zhang, Mingyuan; Yuan, Yongbo
2014-01-01
Water distribution networks (WDNs) are important components of modern lifeline systems. Their stability and reliability are critical for guaranteeing a high quality of life and the continuous operation of urban functions. The aim of this paper is to evaluate the nodal vulnerability of WDNs under cascading failures. Vulnerability is defined to analyze the effects of the consequent failures. A cascading failure is a step-by-step process which is quantitatively investigated by numerical simulation with intentional attack. Monitored pressures at different nodes and flows in different pipes have been used to estimate the network topological structure and the consequences of nodal failure. Based on the connectivity loss of the topological structure, the nodal vulnerability has been evaluated. A load variation function is established to record the nodal failure reason and describe the relative differences between the load and the capacity. The proposed method is validated by an illustrative example. The results revealed that network vulnerability should be evaluated with consideration of both hydraulic analysis and network topology. In the case study, 70.59% of the node failures trigger cascading failures with different failure processes. It is shown that the cascading failures result in severe consequences in WDNs. - Highlights: • The aim of this paper is to evaluate the nodal vulnerability of water distribution networks under cascading failures. • Monitored pressures and flows have been used to estimate the network topological structure and the consequences of nodal failure. • Based on the connectivity loss of topological structure, the nodal vulnerability has been evaluated. • A load variation function is established to record the failure reason and describe the relative differences between load and capacity. • The results show that 70.59% of the node failures trigger the cascading failures with different failure processes
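The connectivity-loss measure can be sketched on a toy topology: after removing failed nodes, count which demand nodes can still reach a source. This deliberately omits the hydraulic (pressure/flow) analysis that the paper couples with the topology, so it is only the topological half of the method:

```python
from collections import deque

def reachable(adj, src, removed):
    """Breadth-first search over an adjacency dict, skipping removed nodes."""
    if src in removed:
        return set()
    seen, q = {src}, deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in removed and v not in seen:
                seen.add(v)
                q.append(v)
    return seen

def connectivity_loss(adj, sources, demands, removed=frozenset()):
    """Fraction of demand nodes no longer reachable from any source node."""
    served = set()
    for s in sources:
        served |= reachable(adj, s, removed)
    connected = sum(1 for d in demands if d in served)
    return 1 - connected / len(demands)
```

In a cascading-failure simulation, `removed` would grow step by step as overloaded nodes fail, and the loss would be re-evaluated after each step.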
Toward the detection of gravitational waves under non-Gaussian noises I. Locally optimal statistic.
Yokoyama, Jun'ichi
2014-01-01
After reviewing the standard hypothesis test and the matched filter technique for identifying gravitational waves under Gaussian noises, we introduce two methods to deal with non-Gaussian stationary noises. We formulate the likelihood ratio function under weakly non-Gaussian noises through the Edgeworth expansion, and under strongly non-Gaussian noises in terms of a new method we call Gaussian mapping, where the observed marginal distribution and the two-body correlation function are fully taken into account. We then apply these two approaches to Student's t-distribution, which has larger tails than the Gaussian. It is shown that while both methods work well when the non-Gaussianity is small, only the latter works well in the highly non-Gaussian case.
Ion induced electron emission statistics under Agm- cluster bombardment of Ag
Breuers, A.; Penning, R.; Wucher, A.
2018-05-01
The electron emission from a polycrystalline silver surface under bombardment with Agm- cluster ions (m = 1, 2, 3) is investigated in terms of ion-induced kinetic excitation. The electron yield γ is determined directly by a current measurement method on the one hand and implicitly by analysis of the electron emission statistics on the other hand. Successful measurements of the electron emission spectra enable a deeper understanding of the ion-induced kinetic electron emission process, with particular emphasis on the effect of the projectile cluster size on the yield as well as on the emission statistics. The results allow a quantitative comparison to computer simulations performed for silver atoms and clusters impinging onto a silver surface.
NDVI statistical distribution of pasture areas at different times in the Community of Madrid (Spain)
Martín-Sotoca, Juan J.; Saa-Requejo, Antonio; Díaz-Ambrona, Carlos G. H.; Tarquis, Ana M.
2015-04-01
The severity of drought has many implications for society, including its impacts on water supply, water pollution, reservoir management and ecosystems. Its impacts on rain-fed agriculture, however, are especially direct. Because of the importance of drought, there have been many attempts to characterize its severity, resulting in the numerous drought indices that have been developed (Niemeyer 2008). A 'biomass index' based on the satellite-image-derived Normalized Difference Vegetation Index (NDVI) has been used for pasture and forage crops for some years in countries such as the United States, Canada and Spain (Rao, 2010). This type of agricultural insurance is known as 'index-based insurance' (IBI). IBI is perceived to be substantially less costly to operate and manage than multiple-peril insurance. IBI contracts pay indemnities based not on the actual yield (or revenue) losses experienced by the insurance purchaser but rather on realized NDVI values (historical data) that are correlated with farm-level losses (Xiaohui Deng et al., 2008). The occurrence of a drought event is defined by NDVI threshold values based mainly on the statistical parameters, mean and standard deviation, that characterize a normal distribution. In this work a pasture area in the north of the Community of Madrid (Spain) has been delimited. NDVI historical data were then reconstructed from MODIS remote sensing imagery at 500 m x 500 m resolution. A statistical analysis of the NDVI histograms at 46 consecutive intervals over that area was applied to search for the best statistical distribution based on the maximum likelihood criterion. The results show that the normal distribution is not the optimal representation when IBI is available; the implications in the context of crop insurance are discussed (Martín-Sotoca, 2014). References Kolli N Rao. 2010. Index based Crop Insurance. Agriculture and Agricultural Science Procedia 1, 193-203. Martín-Sotoca, J.J. (2014) Estructura Espacial
DEFF Research Database (Denmark)
Conradsen, Knut; Nielsen, Allan Aasbjerg; Schou, Jesper
2003-01-01
. Based on this distribution, a test statistic for equality of two such matrices and an associated asymptotic probability for obtaining a smaller value of the test statistic are derived and applied successfully to change detection in polarimetric SAR data. In a case study, EMISAR L-band data from April 17...... to HH, VV, or HV data alone, the derived test statistic reduces to the well-known gamma likelihood-ratio test statistic. The derived test statistic and the associated significance value can be applied as a line or edge detector in fully polarimetric SAR data also....
Moigne, Le N.; Oever, van den M.J.A.; Budtova, T.
2011-01-01
Using high-resolution optical microscopy coupled with image analysis software and statistical methods, fibre length and aspect ratio distributions in polypropylene composites were characterized. Three types of fibres, flax, sisal and wheat straw, were studied. Number and surface weighted
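The distinction between number-weighted and surface-weighted statistics can be shown with a hypothetical fibre-length histogram; surface weighting (lateral surface scales with length at fixed diameter) shifts the mean toward the long-fibre tail:

```python
import numpy as np

# hypothetical measured fibre-length classes (mm) and their counts
lengths = np.array([0.2, 0.5, 0.8, 1.2, 3.0])
counts = np.array([120, 80, 40, 15, 5])

# number-weighted mean length: every fibre counts equally
L_n = (counts * lengths).sum() / counts.sum()

# surface-weighted mean: each fibre weighted by its length (proportional to
# lateral surface for a fixed diameter), emphasising the long-fibre tail
L_s = (counts * lengths**2).sum() / (counts * lengths).sum()
```

For any distribution with nonzero spread, L_s exceeds L_n, which is why the two weightings are reported separately when characterizing reinforcement fibres.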
Pérez, Darío G; Funes, Gustavo
2012-12-03
Under the geometrical optics approximation it is possible to estimate the covariance between the displacements of two thin beams after they have propagated through a turbulent medium. Previous works have concentrated on long propagation distances to provide models for the wandering statistics. These models are useful when the separation between beams is smaller than the propagation path, regardless of the characteristic scales of the turbulence. In this work we give a complete model for the behavior of these covariances, introducing absolute limits on the validity of the former approximations. Moreover, these generalizations are established for non-Kolmogorov atmospheric models.
Evaluation of statistical distributions to analyze the pollution of Cd and Pb in urban runoff.
Toranjian, Amin; Marofi, Safar
2017-05-01
Heavy metal pollution in urban runoff causes severe environmental damage. Identification of these pollutants and their statistical analysis is necessary to provide management guidelines. In this study, 45 continuous probability distribution functions were selected and fitted to the Cd and Pb data from the runoff events of an urban area during October 2014-May 2015. The sampling was conducted at the outlet of the city basin during seven precipitation events. For evaluation and ranking of the functions, we used the Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit tests. The results of the Cd analysis showed that the Hyperbolic Secant, Wakeby and Log-Pearson 3 distributions are suitable for frequency analysis of the event mean concentration (EMC), the instantaneous concentration series (ICS) and the instantaneous concentration of each event (ICEE), respectively. In addition, the LP3, Wakeby and Generalized Extreme Value functions were chosen for the EMC, ICS and ICEE related to Pb contamination.
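The fit-and-rank procedure can be sketched with SciPy: fit a few candidate families by maximum likelihood and rank them by the Kolmogorov-Smirnov distance (the Anderson-Darling ranking is analogous). The concentrations below are synthetic stand-ins, not the measured Cd/Pb data, and only four of the study's 45 candidate families are shown:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# stand-in for measured metal concentrations (positive, right-skewed)
conc = rng.lognormal(mean=-2.0, sigma=0.6, size=200)

candidates = {
    "lognorm": stats.lognorm,
    "gamma": stats.gamma,
    "genextreme": stats.genextreme,
    "pearson3": stats.pearson3,
}
results = {}
for name, dist in candidates.items():
    params = dist.fit(conc)                              # maximum-likelihood fit
    results[name] = stats.kstest(conc, dist.cdf, args=params).statistic

# smallest KS distance first
ranking = sorted(results, key=results.get)
best = ranking[0]
```

With real data one would also apply the Anderson-Darling statistic, which weights the tails more heavily and can reorder the ranking for tail-sensitive applications.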
Statistics on Near Wall Structures and Shear Stress Distribution from 3D Holographic Measurement.
Sheng, J.; Malkiel, E.; Katz, J.
2007-11-01
Digital holographic microscopy performs 3D velocity measurement in the near-wall region of a turbulent boundary layer in a square channel over a smooth wall at Reτ=1,400. Resolution of ~1 μm over a sample volume of 1.5 x 2 x 1.5 mm (x^+=50, y^+=60, z^+=50) is sufficient for resolving buffer layer and lower log layer structures, and for measuring instantaneous wall shear stress distributions from velocity gradients in the viscous sublayer. Results, based on 700 instantaneous realizations, provide detailed statistics on the spatial distribution of both wall stress components along with characteristic flow structures. Conditional sampling based on maxima and minima of wall shear stresses, as well as examination of instantaneous flow structures, leads to the development of a conceptual model for a characteristic flow phenomenon that seems to generate extreme stress events. This structure develops as an initially spanwise vortex element rises away from the surface, due to a local disturbance, causing a local stress minimum. Due to increasing velocity with elevation, this element bends downstream, forming a pair of inclined streamwise vortices, aligned at 45° to the freestream, with ejection-like flow between them. Entrainment of high streamwise momentum on the outer sides of this vortex pair generates streamwise shear stress maxima, 70 δν downstream, which are displaced laterally by 35 δν from the local minimum.
Distributed Monitoring of the R² Statistic for Linear Regression
Bhaduri, Kanishka; Das, Kamalika; Giannella, Chris R.
2011-01-01
The problem of monitoring a multivariate linear regression model is relevant in studying the evolving relationship between a set of input variables (features) and one or more dependent target variables. This problem becomes challenging for large-scale data in a distributed computing environment when only a subset of instances is available at individual nodes and the local data change frequently. Data centralization and periodic model recomputation can add high overhead to tasks like anomaly detection in such dynamic settings. Therefore, the goal is to develop techniques for monitoring and updating the model over the union of all nodes' data in a communication-efficient fashion. Correctness guarantees on such techniques are also often highly desirable, especially in safety-critical application scenarios. In this paper we develop DReMo, a distributed algorithm with very low resource overhead for monitoring the quality of a regression model in terms of its coefficient of determination (R² statistic). When the nodes collectively determine that R² has dropped below a fixed threshold, the linear regression model is recomputed via a network-wide convergecast and the updated model is broadcast back to all nodes. We show empirically, using both synthetic and real data, that our proposed method is highly communication-efficient and scalable, and we also provide theoretical guarantees on correctness.
A statistical model for deriving probability distributions of contamination for accidental releases
International Nuclear Information System (INIS)
ApSimon, H.M.; Davison, A.C.
1986-01-01
Results generated from a detailed long-range transport model, MESOS, simulating dispersal of a large number of hypothetical releases of radionuclides in a variety of meteorological situations over Western Europe have been used to derive a simpler statistical model, MESOSTAT. This model may be used to generate probability distributions of different levels of contamination at a receptor point 100-1000 km or so from the source (for example, across a frontier in another country) without considering individual release and dispersal scenarios. The model is embodied in a series of equations involving parameters which are determined from such factors as distance between source and receptor, nuclide decay and deposition characteristics, release duration, and geostrophic windrose at the source. Suitable geostrophic windrose data have been derived for source locations covering Western Europe. Special attention has been paid to the relatively improbable extreme values of contamination at the top end of the distribution. The MESOSTAT model and its development are described, with illustrations of its use and comparison with the original more detailed modelling techniques. (author)
Shifflett, Benjamin; Huang, Rong; Edland, Steven D
2017-01-01
Genotypic association studies are prone to inflated type I error rates if multiple hypothesis testing is performed, e.g., sequentially testing for recessive, multiplicative, and dominant risk. Alternatives to multiple hypothesis testing include the model-independent genotypic χ² test; the efficiency-robust MAX statistic, which corrects for multiple comparisons but with some loss of power; or a single Armitage test for multiplicative trend, which has optimal power when the multiplicative model holds but loses some power when dominant or recessive models underlie the genetic association. We used Monte Carlo simulations to describe the relative performance of these three approaches under a range of scenarios. All three approaches maintained their nominal type I error rates. The genotypic χ² and MAX statistics were more powerful when testing a strictly recessive genetic effect or when testing a dominant effect when the allele frequency was high. The Armitage test for multiplicative trend was most powerful for the broad range of scenarios where heterozygote risk is intermediate between recessive and dominant risk. Moreover, all tests had limited power to detect recessive genetic risk unless the sample size was large, and conversely all tests were relatively well powered to detect dominant risk. Taken together, these results suggest the general utility of the multiplicative trend test when the underlying genetic model is unknown.
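A textbook Cochran-Armitage trend test can be sketched as follows, with hypothetical genotype counts rather than the study's simulated data (genotype scores 0/1/2 encode the multiplicative-trend alternative):

```python
import numpy as np
from scipy import stats

def armitage_trend(cases, controls, scores=(0, 1, 2)):
    """Cochran-Armitage test for trend across genotype categories.
    cases/controls: counts per genotype, e.g. [aa, aA, AA].
    Returns (z, two-sided asymptotic p-value); no continuity correction."""
    cases = np.asarray(cases, float)
    controls = np.asarray(controls, float)
    s = np.asarray(scores, float)
    n = cases + controls                     # totals per genotype
    N, R = n.sum(), cases.sum()
    p = R / N                                # overall case fraction under H0
    num = (s * (cases - n * p)).sum()        # score-weighted deviation
    var = p * (1 - p) * ((s**2 * n).sum() - (s * n).sum() ** 2 / N)
    z = num / np.sqrt(var)
    return z, 2 * stats.norm.sf(abs(z))
```

Changing the score vector (e.g. (0, 0, 1) for recessive or (0, 1, 1) for dominant) is what makes a single trend test model-dependent, which is the trade-off the abstract quantifies.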
International Nuclear Information System (INIS)
Reutter, Bryan W.; Gullberg, Grant T.; Huesman, Ronald H.
2001-01-01
The estimation of time-activity curves and kinetic model parameters directly from projection data is potentially useful for clinical dynamic single photon emission computed tomography (SPECT) studies, particularly in those clinics that have only single-detector systems and thus are not able to perform rapid tomographic acquisitions. Because the radiopharmaceutical distribution changes while the SPECT gantry rotates, projections at different angles come from different tracer distributions. A dynamic image sequence reconstructed from the inconsistent projections acquired by a slowly rotating gantry can contain artifacts that lead to biases in kinetic parameters estimated from time-activity curves generated by overlaying regions of interest on the images. If cone beam collimators are used and the focal point of the collimators always remains in a particular transaxial plane, additional artifacts can arise in other planes reconstructed using insufficient projection samples [1]. If the projection samples truncate the patient's body, this can result in additional image artifacts. To overcome these sources of bias in conventional image-based dynamic data analysis, we and others have been investigating the estimation of time-activity curves and kinetic model parameters directly from dynamic SPECT projection data by modeling the spatial and temporal distribution of the radiopharmaceutical throughout the projected field of view [2-8]. In our previous work we developed a computationally efficient method for fully four-dimensional (4-D) direct estimation of spatiotemporal distributions from dynamic SPECT projection data [5], which extended Formiconi's least squares algorithm for reconstructing temporally static distributions [9]. In addition, we studied the biases that result from modeling various orders of temporal continuity and using various time samplings [5]. In the present work, we address computational issues associated with evaluating the statistical uncertainty of
Kleibergen, F.R.
2003-01-01
We show that the sensitivity of the limit distribution of commonly used GMM statistics to weak and many instruments results from superfluous elements in the higher order expansion of these statistics. When the instruments are strong and their number is small, these elements are of higher order and
International Nuclear Information System (INIS)
Zhang Yu; Wang Guangyi; Lu Xinmiao; Hu Yongcai; Xu Jiangtao
2016-01-01
The random telegraph signal noise in the pixel source follower MOSFET is the principal component of the noise in a CMOS image sensor under low light. In this paper, a physical and statistical model of the random telegraph signal noise in the pixel source follower, based on the binomial distribution, is set up. The number of electrons captured or released by the oxide traps per unit time is described by random variables which obey the binomial distribution. As a result, the output states and the corresponding probabilities of the first and second samples of the correlated double sampling circuit are acquired. The standard deviation of the output states after the correlated double sampling circuit can be obtained accordingly. In the simulation section, one hundred thousand samples of the source follower MOSFET have been simulated, and the simulation results show that the proposed model has statistical characteristics similar to those of existing models under the effect of the channel length and the density of the oxide traps. Moreover, the noise histogram of the proposed model has been evaluated at different environmental temperatures. (paper)
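A minimal single-trap sketch of this kind of model: trap occupancy at each CDS sample is a Bernoulli (binomial with one trial) draw, and the CDS output is the difference of the two samples. Treating the two samples as independent and using a single trap are simplifying assumptions (real RTS occupancy is time-correlated and pixels may host several traps); the occupancy probability and amplitude are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000
p_occ = 0.3   # assumed probability the oxide trap is occupied at a sample instant
amp = 1.0     # assumed RTS amplitude at the source-follower output (arb. units)

# trap state at the reset and signal samples of the CDS circuit
s1 = rng.binomial(1, p_occ, n) * amp
s2 = rng.binomial(1, p_occ, n) * amp
cds = s2 - s1   # correlated double sampling cancels the common DC level

# analytic std of the difference of two independent Bernoulli(p) levels
expected_std = np.sqrt(2 * p_occ * (1 - p_occ)) * amp
```

The three-valued histogram of `cds` ({-amp, 0, +amp}) is the single-trap analogue of the noise histograms evaluated in the paper, and its standard deviation matches the binomial prediction.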
Knowledge Flow Rules of Modern Design under Distributed Resource Environment
Directory of Open Access Journals (Sweden)
Junning Li
2013-01-01
Full Text Available The process of modern design under the distributed resource environment is interpreted as a process of knowledge flow and integration. As the acquisition of new knowledge strongly depends on resources, knowledge flow can be influenced by technical, economic, and social-relation factors, among others. In order to achieve greater efficiency of knowledge flow and make the product more competitive, the root causes of the above factors should be identified first. In this paper, the authors attempt to reveal the nature of design knowledge flow from the perspectives of fluid dynamics and energy. The knowledge field effect and the knowledge agglomeration effect are analyzed, respectively: a knowledge field effect model considering a single task node and a single knowledge energy model in the knowledge flow are established, and then a general expression of knowledge energy conservation, taking into account the kinetic and potential energy of knowledge, is built. Next, the knowledge flow rules and their influential factors, including complete and incomplete transfer of design knowledge, are studied. Finally, the coupling knowledge flows in the knowledge service platform for modern design are analyzed to verify the feasibility of the research work.
Bustamante, Javier; Seoane, Javier
2004-01-01
Aim To test the effectiveness of statistical models based on explanatory environmental variables vs. existing distribution information (maps and breeding atlas), for predicting the distribution of four species of raptors (family Accipitridae): common buzzard Buteo buteo (Linnaeus, 1758), short-toed eagle Circaetus gallicus (Gmelin, 1788), booted eagle Hieraaetus pennatus (Gmelin, 1788) and black kite Milvus migrans (Boddaert, 1783). Location Andalusia, southe...
Energy Technology Data Exchange (ETDEWEB)
Ohri, Shigehisa; Shimada, Daisaburo; Ishida, Morthiro; Onishi, Shigeyuki
1961-09-19
An evaluation was made of the reliability and validity of the information obtained in the first examination completed under the ABSMTL. Results of the analysis show clearly that the materials can hardly be utilized for studying the relationship between findings obtained from the medical examination and distance from the hypocenter. From the standpoint of clinical medicine, the lack of exactness in the examinations may be a major difficulty. However, as long as the degree of inexactness of the medical examinations is distributed equally among all sample members, comparison of the findings may be made within the limits of their accuracy. 4 references, 1 figure, 3 tables.
Under the hood of IRIS's Distributed REU Site
Hubenthal, M.; Taber, J.
2014-12-01
Since 1998 the IRIS Undergraduate Internship Program has provided research experiences for up to 15 students each summer. Through this 9- to 11-week internship program, students take part in an intensive week-long preparatory course and work with leaders in seismological research, in both lab-based and field-based settings, to produce research products worthy of presentation and recognition at large professional conferences. The IRIS internship program employs a distributed REU model that has been demonstrated to bond students into a cohort and to maintain group cohesion despite students conducting their research at geographically distributed sites. Over the past 16 years the program has encountered numerous anticipated and unanticipated challenges. The primary challenges have involved exploring how to modify the REU system to produce outcomes that are better aligned with our programmatic goals. For example, some questions we have attempted to address include: How can the success of an REU site be measured? How do you find, interest, and recruit under-represented minorities into a geophysics program? Can the program increase the probability of interns receiving some minimal level of mentoring across the program? While it is likely that no single answer to these questions exists, we have developed and piloted a number of strategies. These approaches have been developed by identifying relevant research results from other REUs and combining this information with data from our own programmatic evaluations. These data inform the development of specific changes within our program, which are then measured as feedback. We will present our current strategies for addressing each question along with measures of their effectiveness. In addition to broad-scale systematic issues, we have also placed significant effort into responding to smaller process challenges that all REU sites face. These range from simple logistical issues (e.g. liability) to educational
International Nuclear Information System (INIS)
Labatie, Antoine
2012-01-01
Baryon Acoustic Oscillations (BAOs) correspond to the acoustic phenomenon in the baryon-photon plasma before recombination. BAOs imprint a particular scale, corresponding to the sound horizon, that can be observed in large-scale structures of the Universe. Using this standard-ruler property, BAOs can be used to probe the distance-redshift relation in galaxy catalogues, thus providing a very promising tool to study dark energy properties. BAOs can be studied through the second-order statistics (the correlation function or the power spectrum) of the distribution of galaxies. In this thesis we restrict our attention to the case of the correlation function. BAOs appear in the correlation function as a small localized bump at the scale of the sound horizon in co-moving coordinates. There are two major applications of BAO study: BAO detection and cosmological parameter constraints using the standard-ruler property. The detection of BAOs at the expected scale makes it possible to confirm the current cosmological model. Cosmological parameter constraints, which enable the study of dark energy, are a major goal of modern cosmology. In this thesis we tackle different statistical problems concerning correlation function analysis of the galaxy distribution, with a focus on the study of BAOs. In the first part, we make both a theoretical and practical study of the bias due to the integral constraints in correlation function estimators. We show that this bias is very small for current galaxy surveys. In the second part we study BAO detection. We show the limitations of the classical detection method and propose a new method, which is more rigorous. In particular, our method makes it possible to take into account the model-dependence of the covariance matrix of the estimators. In the third part, we focus again on the model-dependence of the covariance matrix, but this time for parameter constraints. We estimate a model-dependent covariance matrix and compare our constraints with constraints obtained by
Abbey, Craig K.; Samuelson, Frank W.; Gallas, Brandon D.; Boone, John M.; Niklason, Loren T.
2013-03-01
The receiver operating characteristic (ROC) curve has become a common tool for evaluating diagnostic imaging technologies, and the primary endpoint of such evaluations is the area under the curve (AUC), which integrates sensitivity over the entire false-positive range. An alternative figure of merit for ROC studies is expected utility (EU), which focuses on the relevant region of the ROC curve as defined by disease prevalence and the relative utility of the task. However, if this measure is to be used, it must also have statistical properties that keep the burden of observer performance studies as low as possible. Here, we evaluate effect size and variability for EU and AUC. We use two observer performance studies recently submitted to the FDA to compare the EU and AUC endpoints. The studies were conducted using the multi-reader multi-case methodology, in which all readers score all cases in all modalities. ROC curves from each study were used to generate both the AUC and EU values for each reader and modality. The EU measure was computed assuming an iso-utility slope of 1.03. We find mean effect sizes, the reader-averaged difference between modalities, to be roughly 2.0 times as large for EU as for AUC. The standard deviation across readers is only roughly 1.4 times as large, suggesting better statistical properties for the EU endpoint. In a simple power analysis of paired comparisons across readers, the utility measure required 36% fewer readers on average to achieve 80% statistical power compared to AUC.
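The two figures of merit can be sketched as follows. This is a minimal illustration, not the FDA-study analysis: the EU form used here, the best achievable TPF − β·FPF over the empirical operating points, is one common simplification, and the ROC points are hypothetical.

```python
# Sketch: AUC (trapezoidal) vs. an expected-utility (EU) figure of merit
# for an empirical ROC curve. beta = 1.03 is the iso-utility slope quoted
# in the abstract; the operating points below are hypothetical.

def roc_auc(fpf, tpf):
    """Trapezoidal area under an ROC curve given sorted operating points."""
    area = 0.0
    for i in range(1, len(fpf)):
        area += (fpf[i] - fpf[i - 1]) * (tpf[i] + tpf[i - 1]) / 2.0
    return area

def expected_utility(fpf, tpf, beta=1.03):
    """EU as the best achievable TPF - beta*FPF over the operating points."""
    return max(t - beta * f for f, t in zip(fpf, tpf))

# Hypothetical ROC operating points from (0,0) to (1,1).
fpf = [0.0, 0.1, 0.3, 0.6, 1.0]
tpf = [0.0, 0.5, 0.8, 0.95, 1.0]

print(roc_auc(fpf, tpf), expected_utility(fpf, tpf))
```

Whereas AUC integrates over the whole false-positive range, EU picks out a single clinically relevant operating point, which is why the two endpoints can rank modalities differently.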
International Nuclear Information System (INIS)
Nam, Cheol; Choi, Byeong Kwon; Jeong, Yong Hwan; Jung, Youn Ho
2001-01-01
During the last decade, the failure behavior of high-burnup fuel rods under RIA has been an extensive concern since the observation of fuel rod failures at low enthalpy. Great importance is placed on the failure prediction of fuel rods from the standpoint of licensing criteria and of safety in extending burnup. To address the issue, a statistics-based methodology is introduced to predict the failure probability of irradiated fuel rods. Based on RIA simulation results in the literature, a failure enthalpy correlation for irradiated fuel rods is constructed as a function of oxide thickness, fuel burnup, and pulse width. From the failure enthalpy correlation, a single damage parameter, the equivalent enthalpy, is defined to reflect the effects of the three primary factors as well as peak fuel enthalpy. Moreover, the failure distribution function with equivalent enthalpy is derived by applying a two-parameter Weibull statistical model. Using these equations, a sensitivity analysis is carried out to estimate the effects of burnup, corrosion, peak fuel enthalpy, pulse width and the cladding materials used
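A two-parameter Weibull failure distribution of the kind described can be sketched as below. The scale (eta) and shape (beta) values are illustrative only, not taken from the paper; the abstract's "equivalent enthalpy" plays the role of the damage variable.

```python
import math

# Sketch: two-parameter Weibull failure-probability curve over the
# equivalent-enthalpy damage parameter, F(h) = 1 - exp(-(h/eta)^beta).
# eta (scale, cal/g) and beta (shape) below are hypothetical.

def weibull_failure_probability(h_eq, eta=150.0, beta=4.0):
    """P(failure) for an equivalent enthalpy h_eq."""
    return 1.0 - math.exp(-((h_eq / eta) ** beta))

for h in (50.0, 100.0, 150.0, 200.0):
    print(h, weibull_failure_probability(h))
```

At h_eq equal to the scale parameter the failure probability is 1 − e⁻¹ ≈ 0.632, a standard property of the Weibull model; the shape parameter controls how sharply the probability rises around that point.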
Directory of Open Access Journals (Sweden)
Emmanouil Styvaktakis
2007-01-01
Full Text Available This paper presents the two main types of classification methods for power quality disturbances based on underlying causes: deterministic classification, giving an expert system as an example, and statistical classification, with support vector machines (a novel method) as an example. An expert system is suitable when one has a limited amount of data and sufficient power system expert knowledge; however, its application requires a set of threshold values. Statistical methods are suitable when a large amount of data is available for training. Two issues important to guaranteeing the effectiveness of a classifier, data segmentation and feature extraction, are discussed. Segmentation of a recorded data sequence is a preprocessing step that partitions the data into segments, each representing a duration containing either an event or a transition between two events. Extraction of features is applied to each segment individually. Some useful features and their effectiveness are then discussed. Some experimental results are included to demonstrate the effectiveness of both systems. Finally, conclusions are given together with a discussion of some future research directions.
Santos, João A.; Malheiro, Aureliano C.; Karremann, Melanie K.; Pinto, Joaquim G.
2011-03-01
The impact of projected climate change on wine production was analysed for the Demarcated Region of Douro, Portugal. A statistical grapevine yield model (GYM) was developed using climate parameters as predictors. Statistically significant correlations were identified between annual yield and monthly mean temperatures and monthly precipitation totals during the growing cycle. These atmospheric factors control grapevine yield in the region, with the GYM explaining 50.4% of the total variance in the yield time series in recent decades. Anomalously high March rainfall (during budburst, shoot and inflorescence development) favours yield, as well as anomalously high temperatures and low precipitation amounts in May and June (May: flowering and June: berry development). The GYM was applied to a regional climate model output, which was shown to realistically reproduce the GYM predictors. Finally, using ensemble simulations under the A1B emission scenario, projections for GYM-derived yield in the Douro Region, and for the whole of the twenty-first century, were analysed. A slight upward trend in yield is projected to occur until about 2050, followed by a steep and continuous increase until the end of the twenty-first century, when yield is projected to be about 800 kg/ha above current values. While this estimate is based on meteorological parameters alone, changes due to elevated CO2 may further enhance this effect. In spite of the associated uncertainties, it can be stated that projected climate change may significantly benefit wine yield in the Douro Valley.
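A statistical yield model of the kind described can be sketched in miniature. This is not the GYM itself: it reduces the model to a single predictor (March rainfall) fitted by ordinary least squares, and all data values are hypothetical rather than Douro observations.

```python
# Sketch: a minimal climate-driven yield model, reduced to one predictor
# (March rainfall) fitted by ordinary least squares. Data are hypothetical.

def ols_fit(x, y):
    """Return (intercept, slope) minimising squared error."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sxy / sxx
    return my - slope * mx, slope

march_rain = [40.0, 80.0, 120.0, 60.0, 100.0]      # mm, hypothetical
yield_kg_ha = [550.0, 640.0, 720.0, 600.0, 660.0]  # kg/ha, hypothetical

a, b = ols_fit(march_rain, yield_kg_ha)
print(a, b)  # intercept and slope of the fitted line
```

The full GYM uses several monthly temperature and precipitation predictors in the same least-squares framework; the positive slope here mirrors the abstract's finding that high March rainfall favours yield.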
Statistical Analysis of Wave Climate Data Using Mixed Distributions and Extreme Wave Prediction
Directory of Open Access Journals (Sweden)
Wei Li
2016-05-01
Full Text Available The investigation of various aspects of the wave climate at a wave energy test site is essential for the development of reliable and efficient wave energy conversion technology. This paper presents studies of the wave climate based on nine years of wave observations from the 2005–2013 period, measured with a wave measurement buoy at the Lysekil wave energy test site located off the west coast of Sweden. A detailed analysis of the wave statistics is carried out to reveal the characteristics of the wave climate at this specific test site. The long-term extreme waves are estimated by applying the Peak over Threshold (POT) method to the measured wave data. The significant wave height and the maximum wave height at the test site for different return periods are also compared. In this study, a new approach using a mixed-distribution model is proposed to describe the long-term behavior of the significant wave height, and it shows an impressive goodness of fit to wave data from the test site. The mixed-distribution model is also applied to measured wave data from four other sites, which illustrates the general applicability of the proposed model. The methodologies used in this paper can be applied to general wave climate analysis of wave energy test sites to estimate extreme waves for the survivability assessment of wave energy converters, and to characterize the long-term wave climate in order to forecast the wave energy resource of the test sites and the energy production of the wave energy converters.
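The POT step can be sketched as follows. The significant-wave-height series here is synthetic, not the Lysekil record, and the generalized Pareto fit uses simple method-of-moments estimators rather than the maximum-likelihood fits typically used in practice.

```python
import random

# Sketch: Peaks-over-Threshold on a significant wave height (Hs) series.
# Exceedances over the threshold are fitted to a generalized Pareto
# distribution (GPD) via method of moments. Hs data are synthetic.

def gpd_fit_moments(excesses):
    """Method-of-moments GPD fit: returns (shape xi, scale sigma)."""
    n = len(excesses)
    mean = sum(excesses) / n
    var = sum((e - mean) ** 2 for e in excesses) / (n - 1)
    xi = 0.5 * (1.0 - mean * mean / var)
    sigma = 0.5 * mean * (mean * mean / var + 1.0)
    return xi, sigma

random.seed(1)
hs = [random.expovariate(1.0) + 0.5 for _ in range(2000)]  # synthetic Hs (m)
threshold = 3.0
excesses = [h - threshold for h in hs if h > threshold]

xi, sigma = gpd_fit_moments(excesses)
print(len(excesses), xi, sigma)
```

For an exponential-tailed series like this synthetic one, the fitted shape parameter should be close to zero; return levels for a given return period then follow from the fitted GPD and the exceedance rate.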
Morgenthaler, George W.
1989-01-01
The ability to launch-on-time and to send payloads into space has progressed dramatically since the days of the earliest missile and space programs. Causes for delay during launch, i.e., unplanned 'holds', are attributable to several sources: weather, range activities, vehicle conditions, human performance, etc. Recent developments in space programs, particularly the need for highly reliable logistic support of space construction, the subsequent planned operation of space stations, large unmanned space structures, lunar and Mars bases, and the necessity of providing 'guaranteed' commercial launches, have placed increased emphasis on understanding and mastering every aspect of launch vehicle operations. The Center of Space Construction has acquired historical launch vehicle data and is applying these data to the analysis of space launch vehicle logistic support of space construction. This analysis will include development of a better understanding of launch-on-time capability and simulation of required support systems for vehicle assembly and launch which are necessary to support national space program construction schedules. In this paper, the author presents actual launch data on unscheduled 'hold' distributions of various launch vehicles. The data have been supplied by industrial associate companies of the Center for Space Construction. The paper seeks to determine suitable probability models that describe these historical data and that can be used for several purposes, such as providing inputs to broader simulations of launch vehicle logistic space construction support processes and determining which launch operations sources cause the majority of the unscheduled 'holds', and hence suggesting changes which might improve launch-on-time capability. In particular, the paper investigates the ability of a compound distribution probability model to fit actual data, versus alternative models, and recommends the most productive avenues for future statistical work.
Studies of halo distributions under beam-beam interaction
International Nuclear Information System (INIS)
Chen, T.; Irwin, J.; Siemann, R.H.
1995-01-01
The halo distribution due to the beam-beam interaction in circular electron-positron colliders is simulated with a program which uses a technique that saves a factor of hundreds to thousands in CPU time. The distribution and the interference between the beam-beam interaction and lattice nonlinearities have been investigated. The effects on the halo distribution due to radiation damping, misalignment at the collision point, and chromatic effects are presented
Statistical inference on censored data for targeted clinical trials under enrichment design.
Chen, Chen-Fang; Lin, Jr-Rung; Liu, Jen-Pei
2013-01-01
For traditional clinical trials, inclusion and exclusion criteria are usually based on clinical endpoints; the genetic or genomic variability of the trial participants is not fully utilized in the criteria. After completion of the human genome project, disease targets at the molecular level can be identified and utilized for the treatment of diseases. However, the accuracy of diagnostic devices for identification of such molecular targets is usually not perfect. Some of the patients enrolled in targeted clinical trials with a positive result for the molecular target might not actually have the specific molecular targets. As a result, the treatment effect may be underestimated in the patient population that truly has the molecular target. To resolve this issue, under the exponential distribution, we develop inferential procedures for the treatment effects of the targeted drug based on the censored endpoints in the patients who truly have the molecular targets. Under an enrichment design, we propose using the expectation-maximization algorithm in conjunction with the bootstrap technique to incorporate the inaccuracy of the diagnostic device for detection of the molecular targets into the inference on the treatment effects. A simulation study was conducted to empirically investigate the performance of the proposed methods. Simulation results demonstrate that under the exponential distribution, the proposed estimator is nearly unbiased with adequate precision, and the confidence interval can provide adequate coverage probability. In addition, the proposed testing procedure can adequately control the size with sufficient power. On the other hand, when the proportional hazard assumption is violated, additional simulation studies show that the type I error rate is not controlled at the nominal level and is an increasing function of the positive predictive value. A numerical example illustrates the proposed procedures. Copyright © 2013 John Wiley & Sons, Ltd.
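Two of the ingredients above, exponential inference on right-censored endpoints and a bootstrap interval, can be sketched in miniature. This does not reproduce the enrichment-design EM machinery for imperfect diagnostics; it only shows the censored exponential MLE and a percentile bootstrap on simulated data.

```python
import random

# Sketch: maximum-likelihood estimation of an exponential hazard rate from
# right-censored survival times, with a percentile bootstrap interval.
# Data are simulated; the enrichment/EM correction is not included.

def exp_rate_mle(times, events):
    """lambda_hat = (# events) / (total follow-up time) under censoring."""
    return sum(events) / sum(times)

def bootstrap_ci(times, events, n_boot=500, alpha=0.05, seed=7):
    rng = random.Random(seed)
    n = len(times)
    stats = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        stats.append(exp_rate_mle([times[i] for i in idx],
                                  [events[i] for i in idx]))
    stats.sort()
    lo = stats[int(alpha / 2 * n_boot)]
    hi = stats[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

rng = random.Random(42)
true_rate, censor_at = 0.5, 3.0
raw = [rng.expovariate(true_rate) for _ in range(300)]
times = [min(t, censor_at) for t in raw]          # observed follow-up
events = [1 if t < censor_at else 0 for t in raw]  # 1 = event, 0 = censored

lam = exp_rate_mle(times, events)
lo, hi = bootstrap_ci(times, events)
print(lam, lo, hi)
```

The paper's approach additionally reweights the likelihood by the probability that an enrolled "target-positive" patient truly carries the target, which is where the EM algorithm enters.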
Quantifying Distribution of Flow Cytometric TCR-Vβ Usage with Economic Statistics.
Directory of Open Access Journals (Sweden)
Kornelis S M van der Geest
Full Text Available Measuring changes of the T cell receptor (TCR repertoire is important to many fields of medicine. Flow cytometry is a popular technique to study the TCR repertoire, as it quickly provides insight into the TCR-Vβ usage among well-defined populations of T cells. However, the interpretation of the flow cytometric data remains difficult, and subtle TCR repertoire changes may go undetected. Here, we introduce a novel means for analyzing the flow cytometric data on TCR-Vβ usage. By applying economic statistics, we calculated the Gini-TCR skewing index from the flow cytometric TCR-Vβ analysis. The Gini-TCR skewing index, which is a direct measure of TCR-Vβ distribution among T cells, allowed us to track subtle changes of the TCR repertoire among distinct populations of T cells. Application of the Gini-TCR skewing index to the flow cytometric TCR-Vβ analysis will greatly help to gain better understanding of the TCR repertoire in health and disease.
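The economic statistic at the heart of the index can be sketched directly. The exact normalisation of the published Gini-TCR skewing index may differ; the Vβ usage percentages below are illustrative only.

```python
# Sketch: a Gini-style skewing index over flow-cytometric TCR-Vbeta usage
# percentages. An evenly used repertoire gives 0; a repertoire dominated
# by a single Vbeta family approaches 1. Values are illustrative.

def gini(frequencies):
    """Gini coefficient of a list of non-negative usage frequencies."""
    xs = sorted(frequencies)
    n = len(xs)
    total = sum(xs)
    cum = sum(i * x for i, x in enumerate(xs, start=1))
    return (2.0 * cum) / (n * total) - (n + 1.0) / n

even = [5.0] * 20             # 20 Vbeta families, perfectly even usage
skewed = [1.0] * 19 + [81.0]  # one dominant family, e.g. a clonal expansion

print(gini(even), gini(skewed))
```

Because the index summarises the whole Vβ distribution in one number, it can reveal subtle repertoire skewing that inspection of the individual Vβ percentages would miss.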
Riandry, M. A.; Ismet, I.; Akhsan, H.
2017-09-01
This study aims to produce a valid and practical statistical physics course handout on distribution function materials based on STEM. The Rowntree development model is used to produce this handout. The model consists of three stages: planning, development and evaluation. In this study, the evaluation stage used Tessmer formative evaluation, which consists of five stages: self-evaluation, expert review, one-to-one evaluation, small group evaluation and field test. However, the handout is tested only on validity and practicality aspects, so the field test stage was not implemented. The data collection techniques used were walkthroughs and questionnaires. Subjects of this study are students of the 6th and 8th semester of academic year 2016/2017 of the Physics Education Study Program of Sriwijaya University. The average result of the expert review is 87.31% (very valid category). The one-to-one evaluation yielded an average result of 89.42%, and the small group evaluation 85.92%. From the one-to-one and small group evaluation stages, the average student response to this handout is 87.67% (very practical category). Based on the results of the study, it can be concluded that the handout is valid and practical.
On the Distribution of the Peña Rodríguez Portmanteau Statistic
Directory of Open Access Journals (Sweden)
Serge B. Provost
2012-07-01
Full Text Available Peña and Rodríguez (2002) introduced a portmanteau test for time series which turns out to be more powerful than those proposed by Ljung and Box (1986) and Monti (1994), and approximated its distribution by means of a two-parameter gamma random variable. A polynomially adjusted beta approximation is proposed in this paper. This approximant is based on the moments of the statistic, which can be estimated by simulation or determined by symbolic computation or numerical integration. Various types of time series processes, such as AR(1), MA(1) and ARMA(2,2), are considered. The proposed approximation turns out to be nearly exact.
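The moment-matching step behind such approximations can be sketched for the two-parameter gamma case: choose the shape k and scale θ so that kθ and kθ² match the sample mean and variance of simulated statistic values. The simulated sample below is a stand-in, not the actual portmanteau statistic.

```python
import random

# Sketch: fitting a two-parameter gamma approximation to a statistic's
# distribution by matching the first two sample moments. The "simulated
# statistic values" are a stand-in drawn from a known gamma.

def gamma_moment_fit(values):
    """Return (shape k, scale theta) with k*theta = mean, k*theta^2 = var."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)
    theta = var / mean
    k = mean / theta
    return k, theta

rng = random.Random(0)
sims = [rng.gammavariate(3.0, 2.0) for _ in range(5000)]  # stand-in sample

k, theta = gamma_moment_fit(sims)
print(k, theta)
```

The polynomially adjusted beta approximation of the paper extends this idea by using higher-order moments to correct the base density, which is why it can be nearly exact where the two-moment gamma is only approximate.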
Energy Technology Data Exchange (ETDEWEB)
Hacke, P.; Spataru, S.
2014-08-01
We propose a method for increasing the frequency of data collection and reducing the time and cost of accelerated lifetime testing of photovoltaic modules undergoing potential-induced degradation (PID). The method consists of in-situ measurements of dark current-voltage curves of the modules at elevated stress temperature, their use to determine the maximum power at 25 degrees C standard test conditions (STC), and distribution statistics for determining degradation rates as a function of stress level. The semi-continuous data obtained by this method clearly show degradation curves of the maximum power, including an incubation phase, rates and extent of degradation, precise time to failure, and partial recovery. Stress tests were performed on crystalline silicon modules at 85% relative humidity and 60 degrees C, 72 degrees C, and 85 degrees C. An activation energy of 0.85 eV was determined for the mean time to failure (1% relative degradation), and a mean time to failure of 8,000 h at 25 degrees C and 85% relative humidity is predicted. No clear trend in maximum degradation as a function of stress temperature was observed.
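The extrapolation from elevated stress temperatures to 25 °C can be sketched with an Arrhenius fit. The mean-time-to-failure values at the three stress temperatures below are hypothetical; the abstract reports Ea = 0.85 eV and about 8,000 h at 25 °C / 85% RH for the tested modules.

```python
import math

# Sketch: Arrhenius extrapolation of mean time to failure (MTTF) from
# elevated-temperature stress tests to 25 C, via a least-squares fit of
# ln(MTTF) = ln(A) + Ea / (kB * T). MTTF inputs are hypothetical.

K_B = 8.617e-5  # Boltzmann constant, eV/K

def fit_arrhenius(temps_c, mttf_h):
    """Return (pre-factor A in hours, activation energy Ea in eV)."""
    xs = [1.0 / (K_B * (t + 273.15)) for t in temps_c]
    ys = [math.log(m) for m in mttf_h]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    ea = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
          / sum((x - mx) ** 2 for x in xs))
    return math.exp(my - ea * mx), ea

# Hypothetical stress-test results (temperature in C, MTTF in hours).
temps = [60.0, 72.0, 85.0]
mttf = [500.0, 180.0, 60.0]

a, ea = fit_arrhenius(temps, mttf)
mttf_25 = a * math.exp(ea / (K_B * (25.0 + 273.15)))
print(ea, mttf_25)
```

With these illustrative inputs the fitted activation energy comes out near the reported 0.85 eV, and the model then predicts a much longer lifetime at field temperature than at any stress temperature, which is the point of accelerated testing.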
Directory of Open Access Journals (Sweden)
L. Boeckli
2012-07-01
Full Text Available The objective of this study is the production of an Alpine Permafrost Index Map (APIM covering the entire European Alps. A unified statistical model that is based on Alpine-wide permafrost observations is used for debris and bedrock surfaces across the entire Alps. The explanatory variables of the model are mean annual air temperatures, potential incoming solar radiation and precipitation. Offset terms were applied to make model predictions for topographic and geomorphic conditions that differ from the terrain features used for model fitting. These offsets are based on literature review and involve some degree of subjective choice during model building. The assessment of the APIM is challenging because limited independent test data are available for comparison and these observations represent point information in a spatially highly variable topography. The APIM provides an index that describes the spatial distribution of permafrost and comes together with an interpretation key that helps to assess map uncertainties and to relate map contents to their actual expression in the terrain. The map can be used as a first resource to estimate permafrost conditions at any given location in the European Alps in a variety of contexts such as research and spatial planning.
Results show that Switzerland likely is the country with the largest permafrost area in the Alps, followed by Italy, Austria, France and Germany. Slovenia and Liechtenstein may have marginal permafrost areas. In all countries the permafrost area is expected to be larger than the glacier-covered area.
Kwon, Hyun-Han; Lall, Upmanu; Engel, Vic
2011-09-01
The ability to map relationships between ecological outcomes and hydrologic conditions in the Everglades National Park (ENP) is a key building block for their restoration program, a primary goal of which is to improve conditions for wading birds. This paper presents a model linking wading bird foraging numbers to hydrologic conditions in the ENP. Seasonal hydrologic statistics derived from a single water level recorder are well correlated with water depths throughout most areas of the ENP, and are effective as predictors of wading bird numbers when using a nonlinear hierarchical Bayesian model to estimate the conditional distribution of bird populations. Model parameters are estimated using a Markov chain Monte Carlo (MCMC) procedure. Parameter and model uncertainty is assessed as a byproduct of the estimation process. Water depths at the beginning of the nesting season, the average dry season water level, and the numbers of reversals from the dry season recession are identified as significant predictors, consistent with the hydrologic conditions considered important in the production and concentration of prey organisms in this system. Long-term hydrologic records at the index location allow for a retrospective analysis (1952-2006) of foraging bird numbers showing low frequency oscillations in response to decadal fluctuations in hydroclimatic conditions. Simulations of water levels at the index location used in the Bayesian model under alternative water management scenarios allow the posterior probability distributions of the number of foraging birds to be compared, thus providing a mechanism for linking management schemes to seasonal rainfall forecasts.
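The MCMC estimation step can be sketched with a toy version of such a model. This is not the paper's hierarchical model: it is a plain Metropolis sampler for a Poisson regression of counts on a single hydrologic predictor, with flat priors and hypothetical data.

```python
import math
import random

# Sketch: a Metropolis sampler for a toy model in the spirit of the paper:
# bird counts ~ Poisson(exp(a + b * depth)), flat priors on (a, b).
# Data are hypothetical, not ENP records.

def log_post(a, b, depth, counts):
    lp = 0.0
    for d, c in zip(depth, counts):
        lp += c * (a + b * d) - math.exp(a + b * d)  # Poisson log-likelihood
    return lp

def metropolis(depth, counts, n_iter=6000, step=0.1, seed=3):
    rng = random.Random(seed)
    a, b = 0.0, 0.0
    lp = log_post(a, b, depth, counts)
    samples = []
    for _ in range(n_iter):
        a_new = a + rng.gauss(0.0, step)
        b_new = b + rng.gauss(0.0, step)
        lp_new = log_post(a_new, b_new, depth, counts)
        if math.log(rng.random()) < lp_new - lp:  # accept/reject step
            a, b, lp = a_new, b_new, lp_new
        samples.append((a, b))
    return samples[n_iter // 2:]  # discard burn-in

depth = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6]  # water depth (m), hypothetical
counts = [3, 5, 9, 14, 22, 35]          # foraging birds, hypothetical

post = metropolis(depth, counts)
b_mean = sum(s[1] for s in post) / len(post)
print(b_mean)
```

The retained samples approximate the posterior, so parameter uncertainty falls out of the same run, which is the "byproduct of the estimation process" the abstract refers to.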
Aryan, H.; Yearby, K.; Balikhin, M. A.; Krasnoselskikh, V.; Agapitov, O. V.
2013-12-01
The interaction of gyroresonant particles with chorus waves largely determines the dynamics of the Earth's radiation belts, affecting the acceleration and loss of radiation belt electrons. The common approach is to present model wave distributions in the inner magnetosphere under different values of geomagnetic activity as expressed by the geomagnetic indices. However, it is known that solar wind parameters such as bulk velocity (V) and density (n) are more effective in the control of high-energy fluxes at the geostationary orbit. Therefore, in the present study the set of parameters of the wave distribution is expanded to include the solar wind parameters in addition to the geomagnetic indices. The present study examines almost four years (1 January 2004 to 29 September 2007) of Cluster STAFF-SA, Double Star TC1 and OMNI data in order to present a combined model of wave magnetic field intensities for the chorus waves as a function of magnetic local time (MLT), L-shell (L*), geomagnetic activity, and solar wind velocity and density. Generally, the largest wave intensities are observed during average solar wind velocities (300 < V < 600 km/s) and densities n > 6 cm-3. On the other hand, the wave intensity is lower and limited to between 06:00 and 18:00 MLT for V < 300 km/s and V > 700 km/s.
Understanding the Sampling Distribution and Its Use in Testing Statistical Significance.
Breunig, Nancy A.
Despite increasing criticism of statistical significance testing by researchers, particularly following the publication of the 1994 American Psychological Association style manual, statistical significance test results are still popular in journal articles. For this reason, it remains important to understand the logic of inferential statistics. A…
Directory of Open Access Journals (Sweden)
V. E. Merzlikin
2015-01-01
Full Text Available The article deals with the search for optimal estimation of the parameters of the homogenization process for dairy products. It provides a theoretical basis for the relationship between the relaxation time of the fat globules and the attenuation coefficient of ultrasonic oscillations in dairy products, and suggests calculating the mass distribution of fat globules from the measured acoustic properties of milk. Studies were carried out to test this hypothesis. A morphological analysis procedure was performed for milk samples homogenized at different pressures, as well as for unhomogenized samples. As a result, distribution histograms of the fat globules as a function of homogenization pressure were obtained. Acoustic studies were also performed to obtain the frequency characteristics of the loss modulus as a function of homogenization pressure. For further study, the dependences were approximated using the statistical moments of the distributions, and approximation parameters were obtained for the distribution of fat globules and for the loss modulus versus homogenization pressure. The hypothesis of a relationship between these two sets of approximation parameters was then tested. Correlation analysis showed a clear dependence of the first and second statistical moments of the distributions on the homogenization pressure, consistent with the physical meaning of the first two moments of a statistical distribution. Correlation analysis was also carried out between the statistical moments of the distribution of the fat globules and the moments of the loss modulus. It is concluded that ultrasonic testing of the degree of homogenization and of the mass distribution of the fat globules of milk products is possible.
The Value of Distributed Generation under Different Tariff Structures
Firestone, Ryan; Magnus Maribu, Karl; Marnay, Chris
2006-01-01
Distributed generation (DG) may play a key role in a modern energy system because it can improve energy efficiency. Reductions in the energy bill, and therefore the attractiveness of DG, depend on the electricity tariff structure, a system created before the widespread adoption of distributed generation. Tariffs have been designed to recover costs equitably amongst customers with similar consumption patterns. Recently, electric utilities began to question the equity of this electricity pricing stru...
International Nuclear Information System (INIS)
Kwag, Shinyoung; Gupta, Abhinav
2017-01-01
Highlights: • This study presents the development of a Bayesian framework for probabilistic risk assessment (PRA) of structural systems under multiple hazards. • The concepts of Bayesian network and Bayesian inference are combined by mapping the traditionally used fault trees into a Bayesian network. • The proposed mapping allows for consideration of dependencies as well as correlations between events. • Incorporation of Bayesian inference permits a novel way to explore scenarios that are likely to result in a system-level “vulnerability.” - Abstract: Conventional probabilistic risk assessment (PRA) methodologies (USNRC, 1983; IAEA, 1992; EPRI, 1994; Ellingwood, 2001) conduct risk assessment for different external hazards by considering each hazard separately and independently of the others. The risk metric for a specific hazard is evaluated by a convolution of the fragility and the hazard curves. The fragility curve for a basic event is obtained by using empirical, experimental, and/or numerical simulation data for a particular hazard. Treating each hazard independently can be inappropriate in some cases, as certain hazards are statistically correlated or dependent. Examples of such correlated events include, but are not limited to, flooding-induced fire, seismically induced internal or external flooding, or even seismically induced fire. In current practice, system-level risk and consequence sequences are typically calculated using logic trees to express the causative relationship between events. In this paper, we present the results from a study on multi-hazard risk assessment that is conducted using a Bayesian network (BN) with Bayesian inference. The framework can consider statistical dependencies among risks from multiple hazards, allows updating by considering newly available data/information at any level, and provides a novel way to explore alternative failure scenarios that may exist due to vulnerabilities.
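The effect of modelling hazard dependence can be sketched with a tiny discrete Bayesian network. This is a toy example, not the paper's framework or numbers: an earthquake can induce flooding, and the system fails with a probability depending on both hazards.

```python
# Sketch: why treating correlated hazards as independent misstates risk.
# A minimal discrete Bayesian network E -> F -> failure: an earthquake (E)
# can induce flooding (F). All probabilities are illustrative only.

P_E = 0.01                                 # P(earthquake)
P_F_GIVEN_E = {True: 0.30, False: 0.02}    # seismically induced flooding
P_FAIL_GIVEN = {                           # P(system failure | E, F)
    (True, True): 0.50,
    (True, False): 0.10,
    (False, True): 0.05,
    (False, False): 0.001,
}

def p_fail_joint():
    """Exact marginal P(failure), respecting the E -> F dependency."""
    total = 0.0
    for e in (True, False):
        pe = P_E if e else 1.0 - P_E
        for f in (True, False):
            pf = P_F_GIVEN_E[e] if f else 1.0 - P_F_GIVEN_E[e]
            total += pe * pf * P_FAIL_GIVEN[(e, f)]
    return total

def p_fail_independent():
    """Same computation but wrongly assuming F is independent of E."""
    p_f = P_E * P_F_GIVEN_E[True] + (1.0 - P_E) * P_F_GIVEN_E[False]
    total = 0.0
    for e in (True, False):
        pe = P_E if e else 1.0 - P_E
        for f in (True, False):
            pf = p_f if f else 1.0 - p_f
            total += pe * pf * P_FAIL_GIVEN[(e, f)]
    return total

print(p_fail_joint(), p_fail_independent())
```

With these numbers the independence assumption understates the failure probability, because it ignores the coincidence of earthquake and induced flooding; the BN also supports Bayesian updating of any node as new evidence arrives.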
Energy Technology Data Exchange (ETDEWEB)
Kwag, Shinyoung [North Carolina State University, Raleigh, NC 27695 (United States); Korea Atomic Energy Research Institute, Daejeon 305-353 (Korea, Republic of); Gupta, Abhinav, E-mail: agupta1@ncsu.edu [North Carolina State University, Raleigh, NC 27695 (United States)
2017-04-15
Gerdes, Lars; Busch, Ulrich; Pecoraro, Sven
2014-12-14
According to Regulation (EU) No 619/2011, trace amounts of non-authorised genetically modified organisms (GMO) in feed are tolerated within the EU if certain prerequisites are met. Tolerable traces must not exceed the so-called 'minimum required performance limit' (MRPL), defined in the regulation to correspond to a 0.1% mass fraction per ingredient. Therefore, not-yet-authorised GMO (and some GMO whose approvals have expired) have to be quantified at very low levels following their qualitative detection in genomic DNA extracted from feed samples. As the results of quantitative analysis can imply severe legal and financial consequences for producers or distributors of feed, the quantification results need to be utterly reliable. We developed a statistical approach to investigate the experimental measurement variability within one 96-well PCR plate. This approach visualises the frequency distribution of the zygosity-corrected relative content of genetically modified material resulting from different combinations of transgene and reference gene Cq values. One application is the simulation of the consequences of varying parameters on measurement results; such parameters could be, for example, replicate numbers or baseline and threshold settings, and measurement results could be, for example, the median (class) and relative standard deviation (RSD). All calculations can be done using the built-in functions of Excel without any need for programming. The developed Excel spreadsheets are available (see section 'Availability of supporting data' for details). In most cases, the combination of four PCR replicates for each of the two DNA isolations already resulted in a relative standard deviation of 15% or less. The aims of the study are scientifically based suggestions for the minimisation of measurement uncertainty, especially in, but not limited to, the field of GMO quantification at low concentration levels. Four PCR replicates for each of the two DNA isolations
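The relative-content calculation behind such a frequency distribution can be sketched as follows. This uses the textbook 2^(ΔCq) relation, i.e. it assumes 100% PCR efficiency, and an illustrative zygosity factor; the exact formula in the published spreadsheets may differ, and the Cq values are hypothetical.

```python
# Sketch: zygosity-corrected relative GM content from paired Cq values,
# assuming 100% PCR efficiency: content = 100 * zf * 2^(Cq_ref - Cq_tg).
# The zygosity factor zf and all Cq values are illustrative.

def relative_gm_content(cq_transgene, cq_reference, zf=1.0):
    """Relative GM mass fraction in %, corrected by zygosity factor zf."""
    return 100.0 * zf * 2.0 ** (cq_reference - cq_transgene)

# Grid of hypothetical replicate Cq combinations near the 0.1% MRPL.
transgene_cqs = [33.0, 33.3, 33.6]
reference_cqs = [23.0, 23.1]

grid = [relative_gm_content(t, r)
        for t in transgene_cqs for r in reference_cqs]
print(grid)
```

Enumerating all transgene/reference Cq combinations in this way yields the frequency distribution of possible results for one plate, from which summary measures such as the median class and RSD can be read off.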
Aryan, Homayon; Yearby, Keith; Balikhin, Michael; Agapitov, Oleksiy; Krasnoselskikh, Vladimir; Boynton, Richard
2014-08-01
Energetic electrons within the Earth's radiation belts represent a serious hazard to geostationary satellites. The interactions of electrons with chorus waves play an important role in both the acceleration and loss of radiation belt electrons. The common approach is to present model wave distributions in the inner magnetosphere under different values of geomagnetic activity as expressed by the geomagnetic indices. However, it has been shown that only around 50% of geomagnetic storms increase the flux of relativistic electrons at geostationary orbit, while 20% cause a decrease and the remaining 30% have essentially no effect. This emphasizes the importance of including solar wind parameters such as bulk velocity (V), density (n), flow pressure (P), and the vertical interplanetary magnetic field component (Bz), which are known to be predominantly effective in the control of high-energy fluxes at geostationary orbit. Therefore, in the present study the set of parameters of the wave distributions is expanded to include the solar wind parameters in addition to geomagnetic activity. The present study examines almost 4 years (1 January 2004 to 29 September 2007) of Spatio-Temporal Analysis of Field Fluctuation data from Double Star TC1, combined with geomagnetic indices and solar wind parameters from the OMNI database, in order to present a comprehensive model of wave magnetic field intensities for the chorus waves as a function of magnetic local time, L shell (L), magnetic latitude (λm), geomagnetic activity, and solar wind parameters. Generally, the results indicate that the intensity of chorus emission is not only dependent upon geomagnetic activity but also on solar wind parameters, with velocity and southward interplanetary magnetic field Bs (Bz < 0) evidently the most influential solar wind parameters. The largest peak chorus intensities, on the order of 50 pT, are observed during active conditions, high solar wind velocities, low solar wind densities, high
International Nuclear Information System (INIS)
Zhukhlistov, A.A.; Avilov, A.S.; Ferraris, D.; Zvyagin, B.B.; Plotnikov, V.P.
1997-01-01
The method of improved automatic electron diffractometry for measuring and recording the intensities of two-dimensionally distributed reflections of texture-type electron diffraction patterns has been used for the analysis of the brucite Mg(OH)₂ structure. The experimental accuracy of the measured intensities proved to be sufficient for studying fine structural details of the statistical distribution of hydrogen atoms over three structure positions located around the threefold axis of the brucite structure
Simulation of concentration distribution of urban particles under wind
Chen, Yanghou; Yang, Hangsheng
2018-02-01
The concentration of particulate matter in the air can be too high, which seriously affects people's health, and particle concentrations in densely populated towns are also high. Understanding the distribution of particles in the air helps in avoiding and removing them. The concentration distribution of particles in urban streets is simulated using the FLUENT software; the simulation analysis is based on the Discrete Phase Model (DPM) of FLUENT. Simulation results show that the distribution of the particles depends on the layout of the buildings, and it is pointed out that the windward area of a building and the leeward sides of high-rise buildings are the areas with high particle concentrations. Understanding the particle concentration in different areas also helps people avoid high-concentration areas and reduce their exposure.
Studies in the statistical and thermal properties of hadronic matter under some extreme conditions
International Nuclear Information System (INIS)
Chase, K.C.; Mekjian, A.Z.; Bhattacharyya, P.
1997-01-01
The thermal and statistical properties of hadronic matter under some extreme conditions are investigated using an exactly solvable canonical ensemble model. A unified model describing both the fragmentation of nuclei and the thermal properties of hadronic matter is developed. Simple expressions are obtained for quantities such as the hadronic equation of state, specific heat, compressibility, entropy, and excitation energy as a function of temperature and density. These expressions encompass the fermionic aspect of nucleons, such as degeneracy pressure and Fermi energy at low temperatures and the ideal gas laws at high temperatures and low density. Expressions are developed which connect these two extremes with behavior that resembles an ideal Bose gas with its associated Bose condensation. In the thermodynamic limit, an infinite cluster exists below a certain critical condition in a manner similar to the sudden appearance of the infinite cluster in percolation theory. The importance of multiplicity fluctuations is discussed and some recent data from the EOS collaboration on critical point behavior of nuclei can be accounted for using simple expressions obtained from the model. copyright 1997 The American Physical Society
Statistical optimization for tannase production from Aspergillus niger under submerged fermentation.
Sharma, S; Agarwal, L; Saxena, R K
2007-06-01
Statistically based experimental design was employed for the optimization of fermentation conditions for maximum production of the enzyme tannase from Aspergillus niger. A central composite rotatable design (CCRD) under response surface methodology (RSM) was used. Based on the results of a 'one-at-a-time' approach in submerged fermentation, the most influential factors for tannase production from A. niger were the concentrations of tannic acid and sodium nitrate, the agitation rate and the incubation period. Hence, to achieve the maximum yield of tannase, the interaction of these factors was studied at the optimum production pH of 5.0 by RSM. The optimum values obtained through RSM were 5% tannic acid, 0.8% sodium nitrate, pH 5.0, an inoculum density of 5 × 10⁷ spores/50 mL, 150 rpm agitation and an incubation period of 48 h, which resulted in production of 19.7 U mL⁻¹ of the enzyme. This activity was almost double the amount obtained by the 'one-at-a-time' approach (9.8 U mL⁻¹).
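The RSM step in this record amounts to fitting a second-order response surface to designed-experiment data and reading the optimum off its stationary point. A minimal one-factor sketch with hypothetical activity data (not the study's measurements):

```python
import numpy as np

# hypothetical single-factor slice of a CCRD design:
# tannase activity (U/mL) versus tannic acid concentration (% w/v)
conc = np.array([1.0, 3.0, 5.0, 7.0, 9.0])
activity = np.array([8.1, 15.9, 19.5, 16.2, 8.4])

# second-order response surface in one factor: y = b0 + b1*x + b2*x^2
X = np.column_stack([np.ones_like(conc), conc, conc**2])
b0, b1, b2 = np.linalg.lstsq(X, activity, rcond=None)[0]

# stationary point of the fitted parabola = predicted optimum setting
x_opt = -b1 / (2 * b2)
y_opt = b0 + b1 * x_opt + b2 * x_opt**2
```

With several factors, the same least-squares fit extends to a full quadratic with interaction terms, and the optimum is the stationary point of that surface (checked to be a maximum).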
Robust DEA under discrete uncertain data: a case study of Iranian electricity distribution companies
Hafezalkotob, Ashkan; Haji-Sami, Elham; Omrani, Hashem
2015-06-01
Crisp input and output data are fundamentally indispensable in traditional data envelopment analysis (DEA). However, real-world problems often deal with imprecise or ambiguous data. In this paper, we propose a novel robust data envelopment analysis (RDEA) model to investigate the efficiencies of decision-making units (DMUs) when there are discrete uncertain input and output data. The method is based upon the discrete robust optimization approaches proposed by Mulvey et al. (1995), which utilize probable scenarios to capture the effect of ambiguous data in the case study. Our primary concern in this research is evaluating electricity distribution companies under uncertainty about input/output data. To illustrate the ability of the proposed model, a numerical example of 38 Iranian electricity distribution companies is investigated. There is a large amount of ambiguous data about these companies, as some electricity distribution companies may not report clear and accurate statistics to the government; thus, a sound approach is needed to deal with this uncertainty. The results reveal that the RDEA model is suitable and reliable for target setting based on decision makers' (DMs') preferences when there are uncertain input/output data.
Directory of Open Access Journals (Sweden)
E. Farg
2017-04-01
Traditional methods for center pivot evaluation depend on the water depth distribution along the pivot arm. Estimating and mapping the water depth under pivot irrigation systems using remote sensing data is essential for calculating the coefficient of uniformity (CU) of water distribution. This study focuses on estimating and mapping water depth using Landsat 8 OLI satellite data integrated with the Heerman and Hein (1968) modified equation for center pivot evaluation. The Landsat 8 OLI image was geometrically and radiometrically corrected to calculate the vegetation and water indices (NDVI and NDWI) in addition to land surface temperature (LST). Results of the statistical analysis showed that the water depth collected in catchment cans is highly negatively correlated with NDVI; on the other hand, water depth was positively correlated with NDWI and LST. Multi-linear regression analysis using a stepwise selection method was applied to estimate and map the water depth distribution. The results showed an R² of 0.93 and an adjusted R² of 0.88. Field-level verification of the estimation equation showed a correlation of 0.93 between the collected water depth and the estimated values.
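Forward stepwise selection of the kind used in this record can be sketched as follows; all predictor and depth values are synthetic, with signs chosen only to mimic the reported correlations (negative with NDVI, positive with NDWI and LST):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 40
# hypothetical per-catchment-can predictors derived from a Landsat 8 OLI scene
ndvi = rng.uniform(0.2, 0.8, n)
ndwi = rng.uniform(-0.3, 0.4, n)
lst = rng.uniform(295.0, 315.0, n)          # land surface temperature, K
# synthetic water depth consistent with the reported correlation signs
depth = (5.0 - 6.0 * ndvi + 4.0 * ndwi
         + 0.05 * (lst - 305) + rng.normal(0, 0.2, n))

def adj_r2(X, y):
    """Adjusted R^2 of an ordinary least-squares fit with intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    r2 = 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))
    n_, p = X1.shape
    return 1 - (1 - r2) * (n_ - 1) / (n_ - p)

# forward stepwise selection: add the variable that most improves adjusted R^2
candidates = {"NDVI": ndvi, "NDWI": ndwi, "LST": lst}
selected, best = [], -np.inf
improved = True
while improved:
    improved = False
    for name in set(candidates) - set(selected):
        cols = np.column_stack([candidates[k] for k in selected + [name]])
        score = adj_r2(cols, depth)
        if score > best:
            best, pick, improved = score, name, True
    if improved:
        selected.append(pick)
```

A full stepwise procedure would also allow dropping variables (backward steps) and use F-tests or p-value thresholds; the adjusted-R² forward pass above is the simplest variant.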
Ryazanov, V. V.
2007-01-01
By means of an information inequality and a parametrization of a family of probability distributions admitting an efficient estimator, distributions containing the time of first passage to a given level as an internal thermodynamic parameter are introduced and justified.
Distribution Line Parameter Estimation Under Consideration of Measurement Tolerances
DEFF Research Database (Denmark)
Prostejovsky, Alexander; Gehrke, Oliver; Kosek, Anna Magdalena
2016-01-01
conductance that the absolute compensated error is −1.05% and −1.07% for both representations, as opposed to the expected uncompensated error of −79.68%. Identification of a laboratory distribution line using real measurement data yields a deviation of 6.75% and 4.00%, respectively, from a calculation...
Distribution Grid Integration Costs Under High PV Penetrations Workshop
International Nuclear Information System (INIS)
Juang, K.-W.; Lee, D.-Y.; Teng, Y.-L.
2005-01-01
Correctly classifying 'contaminated' areas in soils, based on the threshold for a contaminated site, is important for determining effective clean-up actions. Pollutant mapping by means of kriging is increasingly being used for the delineation of contaminated soils. However, areas where the kriged pollutant concentrations are close to the threshold have a high probability of being misclassified. In order to reduce the misclassification due to over- or under-estimation from kriging, an adaptive sampling approach using the cumulative distribution function of order statistics (CDFOS) was developed to draw additional samples for delineating contaminated soils while kriging. A heavy-metal contaminated site in Hsinchu, Taiwan was used to illustrate this approach. The results showed that, compared with random sampling, adaptive sampling using CDFOS reduced the kriging estimation errors and misclassification rates, and thus would appear to be a better choice than random sampling when additional sampling is required for delineating the 'contaminated' areas. - A sampling approach was derived for drawing additional samples while kriging
Statistical distribution of resonance parameters for inelastic scattering of fast neutrons
International Nuclear Information System (INIS)
Radunovic, J.
1973-01-01
This paper deals with the application of statistical methods to the analysis of nuclear reactions involving complex nuclei. It is shown that inelastic neutron scattering, which proceeds via the creation of a compound nucleus in the higher energy range, can be treated by a statistical approach
Qu, Long; Nettleton, Dan; Dekkers, Jack C M
2012-12-01
Given a large number of t-statistics, we consider the problem of approximating the distribution of noncentrality parameters (NCPs) by a continuous density. This problem is closely related to the control of false discovery rates (FDR) in massive hypothesis testing applications, e.g., microarray gene expression analysis. Our methodology is similar to, but improves upon, the existing approach by Ruppert, Nettleton, and Hwang (2007, Biometrics, 63, 483-495). We provide parametric, nonparametric, and semiparametric estimators for the distribution of NCPs, as well as estimates of the FDR and local FDR. In the parametric situation, we assume that the NCPs follow a distribution that leads to an analytically available marginal distribution for the test statistics. In the nonparametric situation, we use convex combinations of basis density functions to estimate the density of the NCPs. A sequential quadratic programming procedure is developed to maximize the penalized likelihood. The smoothing parameter is selected with the approximate network information criterion. A semiparametric estimator is also developed to combine both parametric and nonparametric fits. Simulations show that, under a variety of situations, our density estimates are closer to the underlying truth and our FDR estimates are improved compared with alternative methods. Data-based simulations and the analyses of two microarray datasets are used to evaluate the performance in realistic situations. © 2012, The International Biometric Society.
Rectangular Shell Plating Under Uniformly Distributed Hydrostatic Pressure
Neubert, M; Sommer, A
1940-01-01
A check of the calculation methods used by Föppl and Hencky for investigating the reliability of shell plating under hydrostatic pressure has proved that the formulas yield practical results within the elastic range of the material. Föppl's approximate calculation leaves one on the safe side. It was further found, on the basis of the marked ductility of the shell plating under tensile stress, that the strength in the elastic range is from 50 to 100 percent higher than expected by either method.
Shabetia, Alexander; Rodichev, Yurii; Veer, F.A.; Soroka, Elena; Louter, Christian; Bos, Freek; Belis, Jan; Veer, Fred; Nijsse, Rob
An analytical approach based on the sequential partitioning of the data and the Weibull statistical distribution for inhomogeneous, defective materials is proposed. It allows assessing the guaranteed strength of glass structures at a low probability of fracture with a higher degree of
Khaemba, W.M.; Stein, A.
2001-01-01
This study illustrates the use of modern statistical procedures for better wildlife management by addressing three key issues: determination of abundance, modeling of animal distributions and variability of diversity in space and time. Prior information in Markov Chain Monte Carlo (MCMC) methods is
Mehta, Shraddha; Bastero-Caballero, Rowena F; Sun, Yijun; Zhu, Ray; Murphy, Diane K; Hardas, Bhushan; Koch, Gary
2018-04-29
Many published scale validation studies determine inter-rater reliability using the intra-class correlation coefficient (ICC). However, the use of this statistic must consider its advantages, limitations, and applicability. This paper evaluates how the interaction of subject distribution, sample size, and levels of rater disagreement affects the ICC and provides an approach for obtaining relevant ICC estimates under suboptimal conditions. Simulation results suggest that for a fixed number of subjects, the ICC for the convex distribution is smaller than the ICC for the uniform distribution, which in turn is smaller than the ICC for the concave distribution. The variance component estimates also show that the dissimilarity of ICC among distributions is attributed to the study design (i.e., distribution of subjects) component of subject variability and not the scale quality component of rater error variability. The dependency of the ICC on the distribution of subjects makes it difficult to compare results across reliability studies. Hence, it is proposed that reliability studies should be designed using a uniform distribution of subjects because of the standardization it provides for representing objective disagreement. In the absence of a uniform distribution, a sampling method is proposed to reduce the non-uniformity. In addition, as expected, high levels of disagreement result in low ICC, and when the type of distribution is fixed, any increase in the number of subjects beyond a moderately large specification such as n = 80 does not have a major impact on the ICC. Copyright © 2018 John Wiley & Sons, Ltd.
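The interplay of subject distribution, rater disagreement, and ICC can be explored with a small simulation. The sketch below computes ICC(2,1) (two-way random effects, absolute agreement, single rater) for two hypothetical raters scoring a uniform subject distribution; all variance components are assumed values, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(42)

def icc_two_way(ratings):
    """ICC(2,1), absolute agreement, from an (n_subjects x k_raters) array."""
    n, k = ratings.shape
    grand = ratings.mean()
    ms_rows = k * ((ratings.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    ms_cols = n * ((ratings.mean(axis=0) - grand) ** 2).sum() / (k - 1)
    resid = (ratings - ratings.mean(axis=1, keepdims=True)
             - ratings.mean(axis=0) + grand)
    ms_err = (resid ** 2).sum() / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

n = 80                                  # subjects (moderately large, as above)
true = rng.uniform(0, 10, n)            # uniform subject distribution
noise = 1.0                             # rater error SD = disagreement level
ratings = np.column_stack([true + rng.normal(0, noise, n) for _ in range(2)])
icc = icc_two_way(ratings)
```

Re-running with a different subject distribution (concentrated in the middle or at the extremes) or a larger rater error SD moves the ICC in the directions described in the abstract.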
Distributed Secure Coordinated Control for Multiagent Systems Under Strategic Attacks.
Feng, Zhi; Wen, Guanghui; Hu, Guoqiang
2017-05-01
This paper studies a distributed secure consensus tracking control problem for multiagent systems subject to strategic cyber attacks modeled by a random Markov process. A hybrid stochastic secure control framework is established for designing a distributed secure control law such that mean-square exponential consensus tracking is achieved. A connectivity restoration mechanism is considered, and the properties of attack frequency and attack length rate are investigated. Based on the solutions of an algebraic Riccati equation and an algebraic Riccati inequality, a procedure to select the control gains is provided, and stability analysis is carried out using Lyapunov's method. The effect of strategic attacks on discrete-time systems is also investigated. Finally, numerical examples are provided to illustrate the effectiveness of the theoretical analysis.
Optimal Power Flow for Distribution Systems under Uncertain Forecasts: Preprint
Energy Technology Data Exchange (ETDEWEB)
Dall' Anese, Emiliano; Baker, Kyri; Summers, Tyler
2016-12-01
The paper focuses on distribution systems featuring renewable energy sources and energy storage devices, and develops an optimal power flow (OPF) approach to optimize the system operation in spite of forecasting errors. The proposed method builds on a chance-constrained multi-period AC OPF formulation, where probabilistic constraints are utilized to enforce voltage regulation with a prescribed probability. To enable a computationally affordable solution approach, a convex reformulation of the OPF task is obtained by resorting to i) pertinent linear approximations of the power flow equations, and ii) convex approximations of the chance constraints. Particularly, the approximate chance constraints provide conservative bounds that hold for arbitrary distributions of the forecasting errors. An adaptive optimization strategy is then obtained by embedding the proposed OPF task into a model predictive control framework.
International Nuclear Information System (INIS)
Foray, G.; Descamps-Mandine, A.; R’Mili, M.; Lamon, J.
2012-01-01
The present paper investigates glass fibre flaw size distributions. Two commercial fibre grades (HP and HD) mainly used in cement-based composite reinforcement were studied. Glass fibre fractography is a difficult and time consuming exercise, and thus is seldom carried out. An approach based on tensile tests on multifilament bundles and examination of the fibre surface by atomic force microscopy (AFM) was used. Bundles of more than 500 single filaments each were tested. Thus a statistically significant database of failure data was built up for the HP and HD glass fibres. Gaussian flaw distributions were derived from the filament tensile strength data or extracted from the AFM images. The two distributions were compared. Defect sizes computed from raw AFM images agreed reasonably well with those derived from tensile strength data. Finally, the pertinence of a Gaussian distribution was discussed. The alternative Pareto distribution provided a fair approximation when dealing with AFM flaw size.
Hildreth, Laura A.; Robison-Cox, Jim; Schmidt, Jade
2018-01-01
This study examines the transferability of results from previous studies of simulation-based curriculum in introductory statistics using data from 3,500 students enrolled in an introductory statistics course at Montana State University from fall 2013 through spring 2016. During this time, four different curricula, a traditional curriculum and…
Phosphorus distribution in sandy soil profile under drip irrigation system
International Nuclear Information System (INIS)
El-Gendy, R.W.; Rizk, M.A.; Abd El Moniem, M.; Abdel-Aziz, H.A.; Fahmi, A.E.
2009-01-01
This work aims at studying the impact of irrigation water applied using a drip irrigation system in sandy soil cultivated with snap bean on phosphorus distribution. The experiment was carried out at the Soils and Water Research Department farm, Nuclear Research Center, Atomic Energy Authority, Cairo, Egypt. Snap bean was cultivated in sandy soil and irrigated with 50, 37.5 and 25 cm of water in three water treatments representing 100, 75 and 50% ETc. Phosphorus distribution and the direction of soil water movement were monitored at three sites along the dripper line (S1, S2 and S3, at 0, 12.5 and 25 cm distance from the dripper). Phosphorus fertilizer (superphosphate, 15.5% P₂O₅, at a rate of 300 kg/fed) was added before cultivation. A neutron probe was used to detect water distribution and movement at the three sites along the soil profile. Soil samples were collected before P addition and at the end of the developing, mid and late growth stages to determine residual available phosphorus. The obtained data showed that irrigation with 50 cm of water increased P concentration down to 75 cm depth at the three sites of the 100% ETc treatment and covered the P requirements of snap bean for all growth stages, whereas 37.5 and 25 cm of irrigation water could not cover the P requirements of snap bean for all growth stages. It could be concluded that the applied irrigation water drove the residual P down to 75 cm depth at the three sites. Crop yield was taken as an indicator and showed good response according to the water quantities and P transport within the soil profile
Statistical distribution of partial widths in the microscopic theory of nuclear reactions
International Nuclear Information System (INIS)
Bunakov, V.E.; Ogloblin, S.G.
1978-01-01
Using the microscopic theory of nuclear reaction the distribution function of neutron reduced partial widths is obtained. It is shown that the distribution of reduced partial widths of a radiative transition is of the same form. The distribution obtained differs from the Porter-Thomas law for neutron widths only in the presence of intermediate structures. It is noteworthy that the presence of an intermediate structure leads to a greater dispersion
DEFF Research Database (Denmark)
Missov, Trifon I.; Schöley, Jonas
to this criterion admissible distributions are, for example, the gamma, the beta, the truncated normal, the log-logistic and the Weibull, while distributions like the log-normal and the inverse Gaussian do not satisfy this condition. In this article we show that models with admissible frailty distributions...... and a Gompertz baseline provide a better fit to adult human mortality data than the corresponding models with non-admissible frailty distributions. We implement estimation procedures for mixture models with a Gompertz baseline and frailty that follows a gamma, truncated normal, log-normal, or inverse Gaussian...
Reliability analysis of water distribution systems under uncertainty
International Nuclear Information System (INIS)
Kansal, M.L.; Kumar, Arun; Sharma, P.B.
1995-01-01
In most developing countries, water distribution networks (WDN) are of an intermittent type because of the shortage of safe drinking water. Failure of a pipeline in such cases causes not only a fall in one or more nodal heads but also poor connectivity of the source with various demand nodes of the system. Most previous works have used a two-step algorithm based on a pathset or cutset approach for connectivity analysis; the computations become more cumbersome when the connectivity of all demand nodes taken together with the supply is evaluated. In the present paper, the concept of an Appended Spanning Tree (AST) is suggested for computing global network connectivity, defined as the probability of the source node being connected with all the demand nodes simultaneously. The AST concept has distinct advantages, as it attacks the problem directly rather than indirectly, as most studies so far have done. Since a water distribution system is repairable, a general expression for pipeline availability using the failure/repair rates is considered. Furthermore, the sensitivity of the global reliability estimates to likely errors in the estimation of the failure/repair rates of the various pipelines is also studied
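The global connectivity measure used in this record (probability that the source is connected with all demand nodes simultaneously) can be approximated by Monte Carlo simulation on a toy network; the topology and pipe availabilities below are hypothetical:

```python
import random
from collections import defaultdict

# hypothetical small network: (pipe endpoints) -> availability
pipes = {("S", "A"): 0.95, ("S", "B"): 0.90, ("A", "B"): 0.85,
         ("A", "C"): 0.92, ("B", "C"): 0.88}
demand_nodes = ["A", "B", "C"]

def all_connected(up_pipes, source, demands):
    """Depth-first search from the source over surviving pipes."""
    adj = defaultdict(list)
    for u, v in up_pipes:
        adj[u].append(v)
        adj[v].append(u)
    seen, stack = {source}, [source]
    while stack:
        for w in adj[stack.pop()]:
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return all(d in seen for d in demands)

def global_connectivity(pipes, source, demands, trials=20000, seed=1):
    """Monte Carlo estimate of P(source reaches every demand node)."""
    random.seed(seed)
    hits = 0
    for _ in range(trials):
        up = [e for e, a in pipes.items() if random.random() < a]
        hits += all_connected(up, source, demands)
    return hits / trials

p = global_connectivity(pipes, "S", demand_nodes)
```

The AST approach of the paper computes this probability analytically rather than by sampling; the simulation is only a brute-force cross-check that works on small networks.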
Schneider, Markus P. A.
This dissertation contributes to two areas in economics: the understanding of the distribution of earned income and Bayesian analysis of distributional data. Recently, physicists claimed that the distribution of earned income is exponential (see Yakovenko, 2009). The first chapter explores the perspective that the economy is a statistical mechanical system, and the implication for labor market outcomes is considered critically. The robustness of the empirical results that lead to the physicists' claims, the significance of the exponential distribution in statistical mechanics, and the case for a conservation law in economics are discussed. The conclusion reached is that the physicists' conception of the economy is too narrow even within their chosen framework, but that their overall approach is insightful. The dual labor market theory of segmented labor markets is invoked to understand why the observed distribution may be a mixture of distributional components, corresponding to different generating mechanisms described in Reich et al. (1973). The application of informational entropy in chapter II connects this work to Bayesian analysis and maximum entropy econometrics. The analysis follows E. T. Jaynes's treatment of Wolf's dice data, but is applied to the distribution of earned income based on CPS data. The results are calibrated to account for rounded survey responses using a simple simulation, and answer the graphical analyses by physicists. The results indicate that neither the income distribution of all respondents nor of the subpopulation used by physicists appears to be exponential. The empirics do support the claim that a mixture with exponential and log-normal distributional components fits the data. In the final chapter, a log-linear model is used to fit the exponential to the earned income distribution. Separating the CPS data by gender and marital status reveals that the exponential is only an appropriate model for a limited number of subpopulations, namely
Kleibergen, F.R.; Kleijn, R.; Paap, R.
2000-01-01
We propose a novel Bayesian test under a (noninformative) Jeffreys' prior specification. We check whether the fixed scalar value of the so-called Bayesian Score Statistic (BSS) under the null hypothesis is a plausible realization from its known and standardized distribution under the alternative. Unlike
Distributionally Robust Joint Chance Constrained Problem under Moment Uncertainty
Directory of Open Access Journals (Sweden)
Ke-wei Ding
2014-01-01
We discuss and develop the convex approximation for robust joint chance constraints under uncertainty of the first- and second-order moments. Robust chance constraints are approximated by worst-case CVaR constraints, which can be reformulated as a semidefinite program; the chance-constrained problem can then be presented as a semidefinite program. We also find that the approximation for robust joint chance constraints has an equivalent individual quadratic approximation form.
Dai, Qi; Yang, Yanchun; Wang, Tianming
2008-10-15
Many proposed statistical measures can efficiently compare biological sequences to further infer their structures, functions and evolutionary information. They are related in spirit because all the ideas for sequence comparison try to use information on the k-word distributions, a Markov model, or both. Motivated by adding k-word distributions to a Markov model directly, we investigated two novel statistical measures for sequence comparison, called wre.k.r and S2.k.r. The proposed measures were tested by similarity search, evaluation on functionally related regulatory sequences and phylogenetic analysis, offering a systematic and quantitative experimental assessment of our measures. Moreover, we compared our achievements with those based on alignment or alignment-free methods. We grouped our experiments into two sets. The first one, performed via ROC (receiver operating characteristic) analysis, aims at assessing the intrinsic ability of our statistical measures to search for similar sequences in a database and discriminate functionally related regulatory sequences from unrelated sequences. The second one aims at assessing how well our statistical measures can be used for phylogenetic analysis. The experimental assessment demonstrates that our similarity measures, intended to incorporate k-word distributions into a Markov model, are more efficient.
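The general idea of comparing sequences through their k-word distributions can be sketched as follows. The divergence used here is a plain symmetrised Kullback-Leibler measure over observed k-word frequencies, a simplified stand-in for (not an implementation of) the published wre.k.r and S2.k.r measures; the sequences are made up:

```python
from collections import Counter
from math import log

def kword_freqs(seq, k):
    """Observed k-word relative frequencies of a DNA string."""
    total = len(seq) - k + 1
    counts = Counter(seq[i:i + k] for i in range(total))
    return {w: c / total for w, c in counts.items()}

def sym_relative_entropy(p, q, eps=1e-9):
    """Symmetrised Kullback-Leibler divergence over the union of k-words;
    eps smooths words absent from one of the two sequences."""
    words = set(p) | set(q)
    d = 0.0
    for w in words:
        pw, qw = p.get(w, eps), q.get(w, eps)
        d += pw * log(pw / qw) + qw * log(qw / pw)
    return d / 2

s1 = "ACGTACGTACGGTACGATCGATCG"
s2 = "ACGTACGAACGGTACGATCGATCG"   # one substitution relative to s1
s3 = "TTTTTTGGGGGGCCCCCCAAAAAA"   # unrelated composition

d_close = sym_relative_entropy(kword_freqs(s1, 3), kword_freqs(s2, 3))
d_far = sym_relative_entropy(kword_freqs(s1, 3), kword_freqs(s3, 3))
```

The measures in the paper additionally weight the observed k-word frequencies against Markov-model expectations; the sketch keeps only the k-word frequency comparison step.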
Energy Technology Data Exchange (ETDEWEB)
Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung [Hanyang Univ., Seoul (Korea, Republic of); Noh, Jae Man [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)
2013-10-15
The uncertainty evaluation with the statistical method is performed by repeating the transport calculation with sampling of the directly perturbed nuclear data; a reliable uncertainty result can then be obtained by analyzing the results of the numerous transport calculations. One known problem in uncertainty analysis with the statistical approach is that sampling cross sections from a normal (Gaussian) distribution with a relatively large standard deviation leads to sampling errors such as the sampling of negative cross sections. Some correction methods have been noted; however, these methods can distort the distribution of the sampled cross sections. In this study, a sampling method for the nuclear data based on the lognormal distribution is proposed. Criticality calculations with the sampled nuclear data are then performed, and the results are compared with those from the normal distribution conventionally used in previous studies. The statistical sampling method with the lognormal distribution was proposed to increase the sampling accuracy without negative sampling errors, and a stochastic cross section sampling and writing program was developed. For the sensitivity and uncertainty analysis, the cross section sampling was performed with both the normal and lognormal distributions. The uncertainties caused by the covariance of the (n,γ) cross sections were evaluated by solving the GODIVA problem. The results show that the sampling method with the lognormal distribution can efficiently solve the negative sampling problem referred to in previous studies. It is expected that this study will contribute to increasing the accuracy of sampling-based uncertainty analysis.
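The moment-matched log-normal sampling described here can be sketched as follows; the 40% relative uncertainty and 1-barn mean are made-up values, chosen only so that normal sampling visibly produces negative cross sections:

```python
import numpy as np

rng = np.random.default_rng(7)

def lognormal_matched(mean, std, size, rng):
    """Lognormal samples whose mean and standard deviation match the
    given cross-section (mean, std); all samples are strictly positive."""
    sigma2 = np.log(1.0 + (std / mean) ** 2)
    mu = np.log(mean) - 0.5 * sigma2
    return rng.lognormal(mu, np.sqrt(sigma2), size)

mean, std = 1.0, 0.4        # barn; hypothetical 40% relative uncertainty
normal_s = rng.normal(mean, std, 100_000)
lognorm_s = lognormal_matched(mean, std, 100_000, rng)

neg_fraction_normal = (normal_s < 0).mean()    # nonzero negative samples
neg_fraction_lognorm = (lognorm_s < 0).mean()  # exactly zero
```

Because the first two moments are matched, the log-normal sampling preserves the evaluated mean and covariance-derived spread while eliminating the negative-sample problem entirely.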
International Nuclear Information System (INIS)
Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung; Noh, Jae Man
2013-01-01
The uncertainty evaluation with the statistical method is performed by repeating the transport calculation with sampling of the directly perturbed nuclear data; a reliable uncertainty result can then be obtained by analyzing the results of the numerous transport calculations. One known problem in uncertainty analysis with the statistical approach is that sampling cross sections from a normal (Gaussian) distribution with a relatively large standard deviation leads to sampling errors such as the sampling of negative cross sections. Some correction methods have been noted; however, these methods can distort the distribution of the sampled cross sections. In this study, a sampling method for the nuclear data based on the lognormal distribution is proposed. Criticality calculations with the sampled nuclear data are then performed, and the results are compared with those from the normal distribution conventionally used in previous studies. The statistical sampling method with the lognormal distribution was proposed to increase the sampling accuracy without negative sampling errors, and a stochastic cross section sampling and writing program was developed. For the sensitivity and uncertainty analysis, the cross section sampling was performed with both the normal and lognormal distributions. The uncertainties caused by the covariance of the (n,γ) cross sections were evaluated by solving the GODIVA problem. The results show that the sampling method with the lognormal distribution can efficiently solve the negative sampling problem referred to in previous studies. It is expected that this study will contribute to increasing the accuracy of sampling-based uncertainty analysis
Directory of Open Access Journals (Sweden)
Orgeta Gjermëni
2017-10-01
This article aims to provide new results about the intraday degree sequence distribution, considering the evolution in time of a phone call network graph. More specifically, it tackles the following problem: given a large amount of landline phone call data records, what is the best way to summarize the distinct number of calling partners per client per day? To answer this question, a series of undirected phone call network graphs is constructed from data provided by a local telecommunication source in Albania, and all network graphs of the series are simplified. A longitudinal temporal study of the degree distributions is then made on this series of network graphs. Power law and log-normal distribution fits of the degree sequence are compared on each of the network graphs of the series. The maximum likelihood method is used to estimate the parameters of the distributions, and a Kolmogorov–Smirnov test associated with a p-value is used to identify the plausible models. A direct distribution comparison is made through a Vuong test in the case that both distributions are plausible. Another goal was to describe the shape of the parameters' distributions: a Shapiro-Wilk test is used to test the normality of the data, and measures of shape are used to characterize the distributions. The findings suggest that the log-normal distribution better models the intraday degree sequence data of the network graphs, and that it is not possible to say that the distributions of the log-normal parameters are normal.
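The fitting-and-testing pipeline described in this abstract (maximum-likelihood fits, then a Kolmogorov-Smirnov check of plausibility) can be sketched as follows. The data here are a synthetic stand-in for a daily degree sequence, and the simple Hill-type continuous power-law estimator is an assumption for illustration, not the article's exact procedure:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic stand-in for a degree sequence (distinct calling partners
# per client per day); real data would replace this array.
degrees = rng.lognormal(mean=1.0, sigma=0.6, size=2000)

# Lognormal: maximum-likelihood fit, then KS test of the fitted model.
shape, loc, scale = stats.lognorm.fit(degrees, floc=0)
ks_logn = stats.kstest(degrees, "lognorm", args=(shape, loc, scale))

# Power law (continuous approximation): Hill/MLE exponent above x_min.
xmin = float(np.min(degrees))
alpha = 1.0 + len(degrees) / np.sum(np.log(degrees / xmin))
ks_pl = stats.kstest(degrees, "pareto", args=(alpha - 1.0, 0.0, xmin))

print(f"lognormal KS p = {ks_logn.pvalue:.3f}, "
      f"power-law KS p = {ks_pl.pvalue:.3g}")
```

On lognormal-generated data the lognormal fit is plausible while the power law is clearly rejected; note that KS p-values computed against fitted parameters are optimistic, which is why studies of this kind often add a Vuong test for a direct model comparison.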
Distributed Generation Investment by a Microgrid under Uncertainty
Energy Technology Data Exchange (ETDEWEB)
Siddiqui, Afzal; Marnay, Chris
2008-08-11
This paper examines a California-based microgrid's decision to invest in a distributed generation (DG) unit fuelled by natural gas. While the long-term natural gas generation cost is stochastic, we initially assume that the microgrid may purchase electricity at a fixed retail rate from its utility. Using the real options approach, we find a natural gas generation cost threshold that triggers DG investment. Furthermore, the consideration of operational flexibility by the microgrid increases DG investment, while the option to disconnect from the utility is not attractive. By allowing the electricity price to be stochastic, we next determine an investment threshold boundary and find that high electricity price volatility relative to that of natural gas generation cost delays investment while simultaneously increasing the value of the investment. We conclude by using this result to find the implicit option value of the DG unit when two sources of uncertainty exist.
Distributed generation investment by a microgrid under uncertainty
International Nuclear Information System (INIS)
Siddiqui, Afzal S.; Marnay, Chris
2008-01-01
This paper examines a California-based microgrid's decision to invest in a distributed generation (DG) unit fuelled by natural gas. While the long-term natural gas generation cost is stochastic, we initially assume that the microgrid may purchase electricity at a fixed retail rate from its utility. Using the real options approach, we find a natural gas generation cost threshold that triggers DG investment. Furthermore, the consideration of operational flexibility by the microgrid increases DG investment, while the option to disconnect from the utility is not attractive. By allowing the electricity price to be stochastic, we next determine an investment threshold boundary and find that high electricity price volatility relative to that of natural gas generation cost delays investment while simultaneously increasing the value of the investment. We conclude by using this result to find the implicit option value of the DG unit when two sources of uncertainty exist. (author)
Distributed generation investment by a microgrid under uncertainty
Energy Technology Data Exchange (ETDEWEB)
Siddiqui, Afzal S. [Department of Statistical Science, University College London, Gower Street, London WC1E 6BT (United Kingdom); Marnay, Chris [Ernest Orlando Lawrence Berkeley National Laboratory, 1 Cyclotron Road, MS90R4000, Berkeley, CA 94720-8163 (United States)
2008-12-15
This paper examines a California-based microgrid's decision to invest in a distributed generation (DG) unit fuelled by natural gas. While the long-term natural gas generation cost is stochastic, we initially assume that the microgrid may purchase electricity at a fixed retail rate from its utility. Using the real options approach, we find a natural gas generation cost threshold that triggers DG investment. Furthermore, the consideration of operational flexibility by the microgrid increases DG investment, while the option to disconnect from the utility is not attractive. By allowing the electricity price to be stochastic, we next determine an investment threshold boundary and find that high electricity price volatility relative to that of natural gas generation cost delays investment while simultaneously increasing the value of the investment. We conclude by using this result to find the implicit option value of the DG unit when two sources of uncertainty exist. (author)
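The volatility effect reported in these abstracts can be illustrated with the standard real-options quadratic under geometric Brownian motion. This is a generic sketch with hypothetical parameter values (r, delta, sigma), not the paper's calibrated model; the option markup beta1/(beta1-1) on the break-even level grows with volatility, which is the sense in which higher volatility delays investment while raising its value:

```python
import numpy as np

def beta_positive(r, delta, sigma):
    """Positive root of 0.5*sigma^2*b*(b-1) + (r - delta)*b - r = 0,
    the characteristic quadratic of real-options models under GBM."""
    a = 0.5 * sigma**2
    b = r - delta - 0.5 * sigma**2
    return (-b + np.sqrt(b**2 + 4.0 * a * r)) / (2.0 * a)

r, delta = 0.06, 0.03   # hypothetical discount rate and dividend yield
markup = {}
for sigma in (0.1, 0.2, 0.4):
    b1 = beta_positive(r, delta, sigma)
    markup[sigma] = b1 / (b1 - 1.0)   # investment-threshold multiplier
    print(f"sigma={sigma:.1f}: beta1={b1:.3f}, option markup={markup[sigma]:.2f}")
```

For sigma = 0.2 the root is exactly beta1 = 1.5, giving a markup of 3: the stochastic payoff must be worth three times the deterministic break-even level before immediate investment is optimal, and the markup rises further as volatility grows.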
Smooth conditional distribution function and quantiles under random censorship.
Leconte, Eve; Poiraud-Casanova, Sandrine; Thomas-Agnan, Christine
2002-09-01
We consider a nonparametric random design regression model in which the response variable is possibly right censored. The aim of this paper is to estimate the conditional distribution function and the conditional alpha-quantile of the response variable. We restrict attention to the case where the response variable as well as the explanatory variable are unidimensional and continuous. We propose and discuss two classes of estimators which are smooth with respect to the response variable as well as to the covariate. Some simulations demonstrate that the new methods have better mean square error performances than the generalized Kaplan-Meier estimator introduced by Beran (1981) and considered in the literature by Dabrowska (1989, 1992) and Gonzalez-Manteiga and Cadarso-Suarez (1994).
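A minimal sketch of Beran's kernel-weighted Kaplan-Meier estimator of the conditional distribution function, the baseline estimator that this paper smooths and compares against, might look like the following. The Gaussian kernel, the bandwidth, and the simulated censoring mechanism are illustrative assumptions:

```python
import numpy as np

def beran_cdf(x0, t_grid, X, T, delta, h):
    """Conditional CDF F(t | x0) under right censoring: Beran's
    kernel-weighted Kaplan-Meier with Nadaraya-Watson-type weights."""
    w = np.exp(-0.5 * ((X - x0) / h) ** 2)   # Gaussian kernel weights
    w = w / w.sum()
    order = np.argsort(T)
    T_s, d_s, w_s = T[order], delta[order], w[order]
    cum_w = np.concatenate(([0.0], np.cumsum(w_s)))
    surv = 1.0
    cdf = np.zeros_like(t_grid)
    for i in range(len(T_s)):
        if d_s[i]:  # uncensored observation: survival steps down here
            at_risk = max(1.0 - cum_w[i], 1e-12)
            surv *= max(1.0 - w_s[i] / at_risk, 0.0)
        cdf[t_grid >= T_s[i]] = 1.0 - surv
    return cdf

# Synthetic right-censored data whose lifetime depends on a covariate.
rng = np.random.default_rng(2)
n = 400
X = rng.uniform(0.0, 1.0, n)
T_true = rng.exponential(1.0 + X)      # true lifetimes, covariate-dependent
C = rng.exponential(2.0, n)            # independent censoring times
T = np.minimum(T_true, C)
delta = (T_true <= C).astype(float)

t_grid = np.linspace(0.0, 3.0, 50)
F = beran_cdf(0.5, t_grid, X, T, delta, h=0.1)
print(f"F(3.0 | x=0.5) ~ {F[-1]:.2f}")
```

The estimators proposed in the paper additionally smooth this step function in the response variable, which is where the mean square error gains over the raw Beran estimator come from.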
International Nuclear Information System (INIS)
Gomez, Miryam; Saldarriaga, Julio; Correa, Mauricio; Posada, Enrique; Castrillon M, Francisco Javier
2007-01-01
Sand fields, construction sites, coal-fired boilers, roads, and biological sources, among others, are factors contributing to air contamination in downtown Valle de Aburrá. The distribution of road contributions to total suspended particles, according to the MCF source-receptor (source correlation) model, is nearly a gamma distribution. A chi-square goodness-of-fit test is used for the statistical modeling; this test also allows estimating the parameters of the distribution by the maximum likelihood method, with the expectation-maximization algorithm used as the convergence criterion. The mean of the road contribution data to total suspended particles under the MCF source-receptor model is straightforward to obtain and validates the road contribution factor to the atmospheric pollution of the zone under study.
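The fit-then-test workflow (maximum-likelihood gamma fit followed by a chi-square goodness-of-fit check) can be sketched as follows. The data are a synthetic stand-in for road contribution measurements, and the equiprobable-bin construction is one common convention, not necessarily the authors':

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Synthetic stand-in for road contributions to total suspended particles.
data = rng.gamma(shape=2.0, scale=5.0, size=500)

# Maximum-likelihood fit of a gamma distribution (location fixed at 0).
a, loc, scale = stats.gamma.fit(data, floc=0)

# Chi-square goodness of fit on k equiprobable bins under the fitted model.
k = 10
edges = stats.gamma.ppf(np.linspace(0, 1, k + 1), a, loc, scale)
observed, _ = np.histogram(data, bins=edges)
expected = np.full(k, len(data) / k)
chi2 = float(np.sum((observed - expected) ** 2 / expected))
# Degrees of freedom: k - 1 bins minus the 2 fitted parameters.
p = stats.chi2.sf(chi2, df=k - 1 - 2)
print(f"shape={a:.2f}, scale={scale:.2f}, chi2={chi2:.1f}, p={p:.3f}")
```

Equiprobable bins keep every expected count equal, which avoids the usual rule-of-thumb problems with sparse tail bins in chi-square tests.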
Statistics and methodology of multiple cell upset characterization under heavy ion irradiation
International Nuclear Information System (INIS)
Zebrev, G.I.; Gorbunov, M.S.; Useinov, R.G.; Emeliyanov, V.V.; Ozerov, A.I.; Anashin, V.S.; Kozyukov, A.E.; Zemtsov, K.S.
2015-01-01
Mean and partial cross-section concepts and their connection to the multiplicity and statistics of multiple cell upsets (MCUs) in highly scaled digital memories are introduced and discussed. The important role of the experimental determination of the upset statistics is emphasized. It was found that MCUs may lead to a quasi-linear dependence of cross sections on linear energy transfer (LET). A new functional form for interpolating the dependence of mean cross sections on LET is proposed
Gyenge, N.; Ballai, I.; Baranyi, T.
2016-07-01
The aim of the present investigation is to study the spatio-temporal distribution of precursor flares during the 24 h interval preceding M- and X-class major flares, and the evolution of follower flares. Information on associated (precursor and follower) flares is provided by the Reuven Ramaty High Energy Solar Spectroscopic Imager (RHESSI) flare list, while the major flares are observed by the Geostationary Operational Environmental Satellite (GOES) system satellites between 2002 and 2014. There are distinct evolutionary differences between the spatio-temporal distributions of associated flares over the one-day period, depending on the type of the main flare. The spatial distribution was characterized by the normalized frequency distribution of the quantity δ (the distance between the major flare and its precursor flare, normalized by the sunspot group diameter) in four 6 h time intervals before the major event. The precursors of X-class flares have a double-peaked spatial distribution for more than half a day prior to the major flare, but it changes to a lognormal-like distribution roughly 6 h prior to the event. The precursors of M-class flares show a lognormal-like distribution in each 6 h subinterval. In each case, the most frequent sites of the precursors in the active region lie within a distance of about 0.1 sunspot group diameters from the site of the major flare. Our investigation shows that, during the precursor activity, the build-up of energy is more effective than the release of energy.
Analysis of statistical distributions of partial γ-widths of 98Mo neutron 3/2 resonances
International Nuclear Information System (INIS)
Knat'ko, V.A.; Rudak, Eh.A.; Shimanovich, E.A.
1978-01-01
Width distributions for E1 γ-transitions from the 98Mo neutron 3/2 resonances to the 99Mo low-lying levels with spins 1/2+, 3/2+ and 5/2+ are described. The sets of widths considered correspond to γ-transitions to the levels with spin 3/2 from resonances positioned in the energy range from 12 to 5268 eV. On the basis of the results obtained, it is concluded that the width distribution of γ-transitions to the 3/2+ level differs from the Porter-Thomas distribution
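For reference, the Porter-Thomas law against which such width distributions are compared is a chi-square distribution with one degree of freedom for the reduced widths Γ/⟨Γ⟩. A minimal sanity-check sketch, using synthetic widths rather than the paper's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Porter-Thomas: reduced widths x = Gamma/<Gamma> follow chi-square
# with one degree of freedom (mean 1, heavily right-skewed).
widths = stats.chi2.rvs(df=1, size=2000, random_state=rng)

# KS test of the sample against the Porter-Thomas (chi2, df=1) law.
ks = stats.kstest(widths, "chi2", args=(1,))
print(f"mean reduced width = {widths.mean():.3f}, "
      f"KS statistic = {ks.statistic:.3f}, p = {ks.pvalue:.3f}")
```

On experimental widths, a significantly large KS statistic against this law is the kind of evidence behind the abstract's conclusion that one set of transitions deviates from Porter-Thomas.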
Investment and upgrade in distributed generation under uncertainty
Energy Technology Data Exchange (ETDEWEB)
Siddiqui, Afzal S. [Department of Statistical Science, University College London, London WC1E 6BT (United Kingdom); Maribu, Karl [Centre d' Economie Industrielle, Ecole Nationale Superieure des Mines de Paris, Paris 75272 (France)
2009-01-15
The ongoing deregulation of electricity industries worldwide is providing incentives for microgrids to use small-scale distributed generation (DG) and combined heat and power (CHP) applications via heat exchangers (HXs) to meet local energy loads. Although the electric-only efficiency of DG is lower than that of central-station production, relatively high tariff rates and the potential for CHP applications increase the attraction of on-site generation. Nevertheless, a microgrid contemplating the installation of gas-fired DG has to be aware of the uncertainty in the natural gas price. Treatment of uncertainty via real options increases the value of the investment opportunity, which then delays the adoption decision as the opportunity cost of exercising the investment option increases as well. In this paper, we take the perspective of a microgrid that can proceed in a sequential manner with DG capacity and HX investment in order to reduce its exposure to risk from natural gas price volatility. In particular, with the availability of the HX, the microgrid faces a tradeoff between reducing its exposure to the natural gas price and maximising its cost savings. By varying the volatility parameter, we find that the microgrid prefers a direct investment strategy for low levels of volatility and a sequential one for higher levels of volatility. (author)
Investment and upgrade in distributed generation under uncertainty
International Nuclear Information System (INIS)
Siddiqui, Afzal S.; Maribu, Karl
2009-01-01
The ongoing deregulation of electricity industries worldwide is providing incentives for microgrids to use small-scale distributed generation (DG) and combined heat and power (CHP) applications via heat exchangers (HXs) to meet local energy loads. Although the electric-only efficiency of DG is lower than that of central-station production, relatively high tariff rates and the potential for CHP applications increase the attraction of on-site generation. Nevertheless, a microgrid contemplating the installation of gas-fired DG has to be aware of the uncertainty in the natural gas price. Treatment of uncertainty via real options increases the value of the investment opportunity, which then delays the adoption decision as the opportunity cost of exercising the investment option increases as well. In this paper, we take the perspective of a microgrid that can proceed in a sequential manner with DG capacity and HX investment in order to reduce its exposure to risk from natural gas price volatility. In particular, with the availability of the HX, the microgrid faces a tradeoff between reducing its exposure to the natural gas price and maximising its cost savings. By varying the volatility parameter, we find that the microgrid prefers a direct investment strategy for low levels of volatility and a sequential one for higher levels of volatility. (author)
Investment and Upgrade in Distributed Generation under Uncertainty
Energy Technology Data Exchange (ETDEWEB)
Siddiqui, Afzal; Maribu, Karl
2008-08-18
The ongoing deregulation of electricity industries worldwide is providing incentives for microgrids to use small-scale distributed generation (DG) and combined heat and power (CHP) applications via heat exchangers (HXs) to meet local energy loads. Although the electric-only efficiency of DG is lower than that of central-station production, relatively high tariff rates and the potential for CHP applications increase the attraction of on-site generation. Nevertheless, a microgrid contemplating the installation of gas-fired DG has to be aware of the uncertainty in the natural gas price. Treatment of uncertainty via real options increases the value of the investment opportunity, which then delays the adoption decision as the opportunity cost of exercising the investment option increases as well. In this paper, we take the perspective of a microgrid that can proceed in a sequential manner with DG capacity and HX investment in order to reduce its exposure to risk from natural gas price volatility. In particular, with the availability of the HX, the microgrid faces a tradeoff between reducing its exposure to the natural gas price and maximising its cost savings. By varying the volatility parameter, we find that the microgrid prefers a direct investment strategy for low levels of volatility and a sequential one for higher levels of volatility.
The McDonald exponentiated gamma distribution and its statistical properties
Al-Babtain, Abdulhakim A; Merovci, Faton; Elbatal, Ibrahim
2015-01-01
In this paper, we propose a five-parameter lifetime model called the McDonald exponentiated gamma distribution, which extends the beta exponentiated gamma, Kumaraswamy exponentiated gamma and exponentiated gamma distributions, among several other models. We provide a comprehensive mathematical treatment of this distribution. We derive the moment generating function and the rth moment. We discuss estimation of the parameters by maximum likelihood and provide the information matrix. AMS Subject Classificatio...
One of the main uses of biomarker measurements is to compare different populations to each other and to assess risk in comparison to established parameters. This is most often done using summary statistics such as central tendency, variance components, confidence intervals, excee...
Sample Size Requirements for Assessing Statistical Moments of Simulated Crop Yield Distributions
Lehmann, N.; Finger, R.; Klein, T.; Calanca, P.
2013-01-01
Mechanistic crop growth models are becoming increasingly important in agricultural research and are extensively used in climate change impact assessments. In such studies, statistics of crop yields are usually evaluated without the explicit consideration of sample size requirements. The purpose of
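The sample-size question raised in this abstract can be made concrete with a small Monte Carlo experiment: draw repeated samples from a skewed synthetic "yield" population (all values hypothetical) and watch how the spread of the estimated skewness shrinks with sample size, much more slowly than that of the mean:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# Skewed synthetic stand-in for a simulated crop-yield population.
population = rng.gamma(shape=4.0, scale=1.0, size=200_000)

# Monte Carlo spread of the sample skewness for increasing sample sizes.
sd_of_skew = {}
for n in (25, 100, 400, 1600):
    reps = np.array([stats.skew(rng.choice(population, size=n))
                     for _ in range(300)])
    sd_of_skew[n] = reps.std()
    print(f"n={n:5d}: sd of skewness estimate = {sd_of_skew[n]:.3f}")
```

The standard error of the skewness scales roughly like sqrt(6/n), so stabilizing higher moments of a yield distribution requires far more simulated years than stabilizing its mean, which is the kind of requirement the abstract argues is usually ignored.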
Distribution of Oxycephalidae (Hyperiidea-Amphipoda) in the Indian Ocean- A statistical study
Digital Repository Service at National Institute of Oceanography (India)
Nair, K.K.C.; Jayalakshmy, K.V.
Statistical analysis of oxycephalids on coexistence of the species showed two clusters of high affinity in the Arabian Sea, four in the Bay of Bengal, one in the South East Indian Ocean and three in the South West Indian Ocean. Species occurring...
Decoy-state quantum key distribution with both source errors and statistical fluctuations
International Nuclear Information System (INIS)
Wang Xiangbin; Yang Lin; Peng Chengzhi; Pan Jianwei
2009-01-01
We show how to calculate the fraction of single-photon counts of the 3-intensity decoy-state quantum cryptography faithfully with both statistical fluctuations and source errors. Our results rely only on the bound values of a few parameters of the states of pulses.
DEFF Research Database (Denmark)
Rodriguez, Pedro; Luna, Alvaro; Hermoso, Juan Ramon
2011-01-01
The operation of distributed power generation systems under grid fault conditions is a key issue for the massive integration of renewable energy systems. Several studies have been conducted to improve the response of such distributed generation systems under voltage dips. In spite of being less s...
Control of power converters in distributed generation applications under grid fault conditions
DEFF Research Database (Denmark)
Rodriguez, Pedro; Luna, Alvaro; Munoz-Aguilar, Raul
2011-01-01
The operation of distributed power generation systems under grid fault conditions is a key issue for the massive integration of renewable energy systems. Several studies have been conducted to improve the response of such distributed generation systems under voltage dips. In spite of being less s...
26 CFR 1.457-7 - Taxation of Distributions Under Eligible Plans.
2010-04-01
... 26 Internal Revenue 6 2010-04-01 2010-04-01 false Taxation of Distributions Under Eligible Plans. 1.457-7 Section 1.457-7 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY...-7 Taxation of Distributions Under Eligible Plans. (a) General rules for when amounts are included in...
Profile of State Prisoners under Age 18, 1985-97. Bureau of Justice Statistics Special Report.
Strom, Kevin J.
This report presents data on all individuals under age 18 in state prisons, whether under the original jurisdiction of the juvenile or adult criminal system. Most of the data are from the National Corrections Reporting Program. On December 31, 1997, less than 1% of inmates in state prisons were under age 18, a proportion that has remained stable…
Shang, Ce; Chaloupka, Frank J; Zahra, Nahleen; Fong, Geoffrey T
2013-01-01
Background: The distribution of cigarette prices has rarely been studied and compared under different tax structures. Descriptive evidence on price distributions by countries can shed light on opportunities for tax avoidance and brand switching under different tobacco tax structures, which could impact the effectiveness of increased taxation in reducing smoking. Objective: This paper aims to describe the distribution of cigarette prices by countries and to compare these distributions based on the tobacco tax structure in these countries. Methods: We employed data for 16 countries taken from the International Tobacco Control Policy Evaluation Project to construct survey-derived cigarette prices for each country. Self-reported prices were weighted by cigarette consumption and described using a comprehensive set of statistics. We then compared these statistics for cigarette prices under different tax structures. In particular, countries of similar income levels and countries that impose similar total excise taxes using different tax structures were paired and compared in mean and variance using a two-sample comparison test. Findings: Our investigation illustrates that, compared with specific uniform taxation, other tax structures, such as ad valorem uniform taxation, mixed (a tax system using ad valorem and specific taxes) uniform taxation, and tiered tax structures of specific, ad valorem and mixed taxation tend to have price distributions with greater variability. Countries that rely heavily on ad valorem and tiered taxes also tend to have greater price variability around the median. Among mixed taxation systems, countries that rely more heavily on the ad valorem component tend to have greater price variability than countries that rely more heavily on the specific component. In countries with tiered tax systems, cigarette prices are skewed more towards lower prices than are prices under uniform tax systems. The analyses presented here demonstrate that more opportunities
Shang, Ce; Chaloupka, Frank J; Zahra, Nahleen; Fong, Geoffrey T
2014-03-01
The distribution of cigarette prices has rarely been studied and compared under different tax structures. Descriptive evidence on price distributions by countries can shed light on opportunities for tax avoidance and brand switching under different tobacco tax structures, which could impact the effectiveness of increased taxation in reducing smoking. This paper aims to describe the distribution of cigarette prices by countries and to compare these distributions based on the tobacco tax structure in these countries. We employed data for 16 countries taken from the International Tobacco Control Policy Evaluation Project to construct survey-derived cigarette prices for each country. Self-reported prices were weighted by cigarette consumption and described using a comprehensive set of statistics. We then compared these statistics for cigarette prices under different tax structures. In particular, countries of similar income levels and countries that impose similar total excise taxes using different tax structures were paired and compared in mean and variance using a two-sample comparison test. Our investigation illustrates that, compared with specific uniform taxation, other tax structures, such as ad valorem uniform taxation, mixed (a tax system using ad valorem and specific taxes) uniform taxation, and tiered tax structures of specific, ad valorem and mixed taxation tend to have price distributions with greater variability. Countries that rely heavily on ad valorem and tiered taxes also tend to have greater price variability around the median. Among mixed taxation systems, countries that rely more heavily on the ad valorem component tend to have greater price variability than countries that rely more heavily on the specific component. In countries with tiered tax systems, cigarette prices are skewed more towards lower prices than are prices under uniform tax systems. The analyses presented here demonstrate that more opportunities exist for tax avoidance and brand
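The kind of pairing-and-testing described here, summary statistics plus two-sample comparisons of mean and variance, can be sketched as follows. The two price samples are hypothetical stand-ins for two countries' consumption-weighted prices, and Welch's t-test and Levene's test stand in for whatever specific two-sample tests the authors used:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
# Hypothetical price samples: a uniform specific tax tends to compress
# prices, while a tiered/ad valorem structure tends to spread them out.
prices_specific = rng.normal(5.0, 0.4, 800)
prices_tiered = np.abs(rng.normal(4.0, 1.5, 800)) + 0.5

def summary(p):
    """Compact descriptive summary of a price distribution."""
    return {"mean": p.mean(), "median": float(np.median(p)),
            "cv": p.std() / p.mean(), "skew": float(stats.skew(p))}

s_specific, s_tiered = summary(prices_specific), summary(prices_tiered)
mean_test = stats.ttest_ind(prices_specific, prices_tiered, equal_var=False)
var_test = stats.levene(prices_specific, prices_tiered)
print(s_specific)
print(s_tiered)
print(f"mean diff p = {mean_test.pvalue:.2g}, "
      f"variance diff p = {var_test.pvalue:.2g}")
```

The coefficient of variation and the variance test capture the paper's central comparison: greater price variability under tiered and ad valorem structures than under uniform specific taxation.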
International Nuclear Information System (INIS)
Grendel, M.
1981-01-01
Boundary conditions for distribution functions of quasiparticles scattered by an interface between two crystalline grains are presented. Contrary to former formulations, where Maxwell-Boltzmann statistics was considered, the present boundary conditions take into account the quantum statistics (Fermi-Dirac or Bose-Einstein) of the quasiparticles. Provided that only small deviations from thermodynamic equilibrium are present, the boundary conditions are linearized, and their ''renormalization'' is then investigated in the case of elastic scattering. The final results of the renormalization, obtained for a simplified model of an interface, suggest that the portion of Fermi (Bose) quasiparticles reflected or transmitted specularly is decreased (increased) in comparison with the case of quasiparticles obeying Maxwell-Boltzmann statistics. (author)
Jansma, J Martijn; de Zwart, Jacco A; van Gelderen, Peter; Duyn, Jeff H; Drevets, Wayne C; Furey, Maura L
2013-05-15
Technical developments in MRI have improved signal-to-noise, allowing the use of analysis methods such as finite impulse response (FIR) analysis of rapid event-related functional MRI (er-fMRI). FIR is one of the most informative analysis methods, as it determines the onset and full shape of the hemodynamic response function (HRF) without any a priori assumptions. FIR is, however, vulnerable to multicollinearity, which is directly related to the distribution of stimuli over time. Efficiency can be optimized by simplifying a design and restricting the stimulus distribution to specific sequences, while more design flexibility necessarily reduces efficiency. However, the actual effect of efficiency on fMRI results has never been tested in vivo. Thus, it is currently difficult to make an informed choice between protocol flexibility and statistical efficiency. The main goal of this study was to assign concrete fMRI signal-to-noise values to the abstract scale of FIR statistical efficiency. Ten subjects repeated a perception task with five random and m-sequence based protocols, with varying but, according to the literature, acceptable levels of multicollinearity. Results indicated substantial differences in signal standard deviation, the level being a function of multicollinearity: experiment protocols varied by up to 55.4% in standard deviation. The results confirm that the quality of fMRI in an FIR analysis can vary significantly and substantially with statistical efficiency. Our in vivo measurements can be used to aid in making an informed decision between freedom in protocol design and statistical efficiency. Published by Elsevier B.V.
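One common way to quantify the design efficiency this abstract refers to is 1/trace((X'X)^-1) computed on the FIR design matrix. The following is a minimal sketch comparing a jittered and a fixed-ISI protocol; the efficiency summary, protocol parameters, and onset sequences are illustrative assumptions, not the study's actual protocols:

```python
import numpy as np

def fir_design(onsets, n_scans, n_lags):
    """FIR design matrix: one indicator regressor per post-stimulus lag."""
    X = np.zeros((n_scans, n_lags))
    for t in onsets:
        for lag in range(n_lags):
            if t + lag < n_scans:
                X[t + lag, lag] = 1.0
    return X

def efficiency(X):
    """A common FIR efficiency summary: 1 / trace((X'X)^-1).
    Multicollinear designs inflate the trace and lower efficiency."""
    return 1.0 / np.trace(np.linalg.inv(X.T @ X))

rng = np.random.default_rng(7)
n_scans, n_lags = 400, 8
jittered = np.unique(rng.integers(0, n_scans - n_lags, 60))  # random ISIs
fixed = np.arange(0, n_scans - n_lags, 12)                   # constant ISI

eff_jittered = efficiency(fir_design(jittered, n_scans, n_lags))
eff_fixed = efficiency(fir_design(fixed, n_scans, n_lags))
print(f"jittered design: {eff_jittered:.3f}, fixed ISI design: {eff_fixed:.3f}")
```

With the constant 12-scan ISI the lag regressors never overlap, so X'X is diagonal and the efficiency is exactly (number of events)/(number of lags); jittered designs trade some regressor overlap for more events per run, which is the flexibility-versus-efficiency tradeoff the study measures in vivo.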
Kato, Takeyoshi; Sugimoto, Hiroyuki; Suzuoki, Yasuo
We established a procedure for estimating regional electricity demand and the regional potential capacity of distributed generators (DGs) by using a grid square statistics data set. A photovoltaic power system (PV system) for residential use and a co-generation system (CGS) for both residential and commercial use were taken into account. As an example, results for Aichi prefecture are presented in this paper. Statistical data on the number of households by family type and the number of employees by business category for about 4000 grid squares of 1 km × 1 km were used to estimate the floor space and the electricity demand distribution. The rooftop area available for installing PV systems was also estimated with the grid square statistics data set. Considering the relation between the capacity of an existing CGS and a scale index of the building where the CGS is installed, the potential capacity of CGS was estimated for three business categories: hotels, hospitals, and stores. In some regions, the potential capacity of PV systems was estimated to be about 10,000 kW/km2, which corresponds to the density of existing areas with intensive installation of PV systems. Finally, we discuss the ratio of the regional potential capacity of DGs to the regional maximum electricity demand, in order to deduce the appropriate capacity of DGs in a model of a future electricity distribution system.
Production-distribution of electric power in France: 1997-98 statistical data
International Nuclear Information System (INIS)
1999-01-01
This document has been realized using the annual inquiry carried out by the French direction of gas, electricity and coal (Digec). It brings together the main statistical data about the production, transport and consumption of electric power in France: 1997 and 1998 balance sheets, foreign exchanges, long-term evolutions, production with respect to the different energy sources, and consumption in the different departments and regions. (J.S.)
Yu, Xu; Yu, Miao; Xu, Li-xun; Yang, Jing; Xie, Zhi-qiang
2015-01-01
The assumption that the training and testing samples are drawn from the same distribution is violated under the covariate shift setting, and most algorithms for covariate shift try to first estimate distributions and then reweight samples based on the estimated distributions. Due to the difficulty of estimating a correct distribution, previous methods cannot achieve good classification performance. In this paper, we first present two types of covariate shift problems. Rather than estim...
Bakker, A.; Dierdorp, A.; Maanen, J.A. van; Eijkelhof, H.M.C.
2012-01-01
To stimulate students’ shuttling between contextual and statistical spheres, we based tasks on professional practices. This article focuses on two tasks to support reasoning about sampling by students aged 16-17. The purpose of the tasks was to find out which smaller sample size would have been
Farthouat, Juliane; Franco, Ana; Mary, Alison; Delpouve, Julie; Wens, Vincent; Op de Beeck, Marc; De Tiège, Xavier; Peigneux, Philippe
2017-03-01
Humans are highly sensitive to statistical regularities in their environment. This phenomenon, usually referred to as statistical learning, is most often assessed using post-learning behavioural measures that are limited by a lack of sensitivity and do not monitor the temporal dynamics of learning. In the present study, we used magnetoencephalographic frequency-tagged responses to investigate the neural sources and temporal development of the ongoing brain activity that supports the detection of regularities embedded in auditory streams. Participants passively listened to statistical streams in which tones were grouped as triplets, and to random streams in which tones were randomly presented. Results show that during exposure to statistical (vs. random) streams, tritone frequency-related responses reflecting the learning of regularities embedded in the stream increased in the left supplementary motor area and left posterior superior temporal sulcus (pSTS), whereas tone frequency-related responses decreased in the right angular gyrus and right pSTS. Tritone frequency-related responses rapidly developed to reach significance after 3 min of exposure. These results suggest that the incidental extraction of novel regularities is subtended by a gradual shift from rhythmic activity reflecting individual tone succession toward rhythmic activity synchronised with triplet presentation, and that these rhythmic processes are subtended by distinct neural sources.
Guala, M.; Liu, M.
2017-12-01
The kinematics of sediment particles is investigated by non-intrusive imaging methods to provide a statistical description of bedload transport in conditions near the threshold of motion. In particular, we focus on the cyclic transition between motion and rest regimes to quantify the waiting time statistics inferred to be responsible for anomalous diffusion, and so far elusive. Despite obvious limitations in the spatio-temporal domain of the observations, we are able to identify the probability distributions of the particle step time and length, velocity, acceleration, waiting time, and thus distinguish which quantities exhibit well converged mean values, based on the thickness of their respective tails. The experimental results shown here for four different transport conditions highlight the importance of the waiting time distribution and represent a benchmark dataset for the stochastic modeling of bedload transport.
Directory of Open Access Journals (Sweden)
Jean-Michel eHupé
2015-02-01
Published studies using functional and structural MRI include many errors in the way data are analyzed and conclusions are reported. This was observed while working on a comprehensive review of the neural bases of synesthesia, but these errors are probably endemic to neuroimaging studies. All the studies reviewed had based their conclusions on Null Hypothesis Significance Tests (NHST). NHST has been criticized since its inception because it is more appropriate for making decisions related to a null hypothesis (as in manufacturing) than for making inferences about behavioral and neuronal processes. Here I focus on a few key problems of NHST related to brain imaging techniques, and explain why or when we should not rely on significance tests. I also observed that, often, the ill-posed logic of NHST was not even correctly applied, and I describe what I identified as common mistakes, or at least problematic practices, in published papers, in light of what could be considered the very basics of statistical inference. MRI statistics also involve much more complex issues than standard statistical inference. Analysis pipelines vary a lot between studies, even for those using the same software, and there is no consensus on which pipeline is best. I propose a synthetic view of the logic behind the possible methodological choices, and warn against the usage and interpretation of two statistical methods popular in brain imaging studies: the false discovery rate (FDR) procedure and permutation tests. I suggest that current models for the analysis of brain imaging data suffer from serious limitations, and call for a revision taking into account the "new statistics" (confidence intervals) logic.
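Since the FDR procedure is one of the two methods the review warns about, a minimal sketch of the standard Benjamini-Hochberg step-up rule may help make the discussion concrete. This is a textbook implementation with example p-values, not anything specific to the review:

```python
import numpy as np

def benjamini_hochberg(pvals, q=0.05):
    """Benjamini-Hochberg step-up procedure: boolean mask of hypotheses
    rejected while controlling the false discovery rate at level q."""
    p = np.asarray(pvals, dtype=float)
    m = p.size
    order = np.argsort(p)
    # Compare sorted p-values against the BH line q*i/m, i = 1..m.
    passed = p[order] <= q * np.arange(1, m + 1) / m
    rejected = np.zeros(m, dtype=bool)
    if passed.any():
        k = np.nonzero(passed)[0].max()   # largest rank under the BH line
        rejected[order[: k + 1]] = True   # reject all smaller p-values too
    return rejected

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.060, 0.074, 0.205, 0.212, 0.216]
rejected = benjamini_hochberg(pvals, q=0.05)
print(rejected)
```

With these ten p-values only the two smallest fall under the BH line at q = 0.05; note that controlling the FDR says nothing about any individual test, which is one of the interpretation pitfalls the review discusses.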
HI column density distribution function at z = 0: Connection to damped Ly alpha statistics
Zwaan, Martin; Verheijen, MAW; Briggs, FH
We present a measurement of the HI column density distribution function f(N_HI) at the present epoch for column densities > 10^20 cm^-2. These high column densities compare to those measured in damped Ly alpha lines seen in absorption against background quasars. Although observationally rare, it
A statistical and distributed packet filter against DDoS attacks in ...
Indian Academy of Sciences (India)
VIKASH C PANDEY
2018-03-14
Mar 14, 2018 ... Distributed Denial of Service (DDoS) attacks are a serious threat to Cloud. These attacks ... packet filtering model is proposed against DDoS attacks in Cloud. The key idea of this .... generates alerts or logs. If a deviation from ...
Statistics for PV, wind and biomass generators and their impact on distribution grid planning
Nykamp, Stefan; Molderink, Albert; Hurink, Johann L.; Smit, Gerardus Johannes Maria
2012-01-01
The integration of renewable energy generation leads to major challenges for distribution grid operators. When the feed-in from photovoltaic (PV), biomass and wind generators significantly exceeds the local consumption, large investments are needed. To improve the knowledge on the interaction between
Directory of Open Access Journals (Sweden)
B. Azzouz
2007-01-01
The textile fibre mixture, as a multicomponent blend of variable fibres, requires a proper method to predict the characteristics of the final blend. The length diagram and the fibrogram of cotton are generated. Then the length distribution, the length diagram, and the fibrogram of a blend of different categories of cotton are determined. The length distributions by weight of five different categories of cotton (Egyptian, USA (Pima), Brazilian, USA (Upland), and Uzbekistani) are measured by AFIS. From these distributions, the length distribution, the length diagram, and the fibrogram by weight of four binary blends are expressed. The length parameters of these cotton blends are calculated and their variations are plotted against the mass fraction x of one component in the blend. These calculated parameters are compared to those of real blends. Finally, the selection of the optimal blends using the linear programming method, based on the hypothesis that the cotton blend parameters vary linearly as a function of the component ratios, is proved insufficient.
Craig's XY distribution and the statistics of Lagrangian power in two-dimensional turbulence
Bandi, Mahesh M.; Connaughton, Colm
2008-03-01
We examine the probability distribution function (PDF) of the energy injection rate (power) in numerical simulations of stationary two-dimensional (2D) turbulence in the Lagrangian frame. The simulation is designed to mimic an electromagnetically driven fluid layer, a well-documented system for generating 2D turbulence in the laboratory. In our simulations, the forcing and velocity fields are close to Gaussian. On the other hand, the measured PDF of injected power is very sharply peaked at zero, suggestive of a singularity there, with tails which are exponential but asymmetric. Large positive fluctuations are more probable than large negative fluctuations. It is this asymmetry of the tails which leads to a net positive mean value for the energy input despite the most probable value being zero. The main features of the power distribution are well described by Craig’s XY distribution for the PDF of the product of two correlated normal variables. We show that the power distribution should exhibit a logarithmic singularity at zero and decay exponentially for large absolute values of the power. We calculate the asymptotic behavior and express the asymmetry of the tails in terms of the correlation coefficient of the force and velocity. We compare the measured PDFs with the theoretical calculations and briefly discuss how the power PDF might change with other forcing mechanisms.
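As a quick illustration of the setting described above (a sketch, not taken from the paper), a Monte Carlo experiment with the product of two correlated standard normal variables reproduces the key qualitative features: a sharply peaked distribution at zero, asymmetric tails, and a positive mean set by the force-velocity correlation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Correlated standard normal pairs (f, v), a stand-in for force and
# velocity components; the injected power is their product f * v.
rho = 0.5                                   # assumed force-velocity correlation
n = 200_000
cov = [[1.0, rho], [rho, 1.0]]
f, v = rng.multivariate_normal([0.0, 0.0], cov, size=n).T
power = f * v

# For standard normals E[f*v] = rho, so a positive correlation gives a
# net positive mean power even though the most probable value is zero.
mean_power = power.mean()

# Asymmetric tails: large positive fluctuations are more probable than
# large negative ones when rho > 0.
pos_tail = (power > 2.0).mean()
neg_tail = (power < -2.0).mean()
```

The sample mean approaches rho, matching the paper's observation that the tail asymmetry, not the peak, carries the net energy input.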
Lee, L.; Helsel, D.
2007-01-01
Analysis of low concentrations of trace contaminants in environmental media often results in left-censored data that are below some limit of analytical precision. Interpretation of values becomes complicated when there are multiple detection limits in the data-perhaps as a result of changing analytical precision over time. Parametric and semi-parametric methods, such as maximum likelihood estimation and robust regression on order statistics, can be employed to model distributions of multiply censored data and provide estimates of summary statistics. However, these methods are based on assumptions about the underlying distribution of data. Nonparametric methods provide an alternative that does not require such assumptions. A standard nonparametric method for estimating summary statistics of multiply-censored data is the Kaplan-Meier (K-M) method. This method has seen widespread usage in the medical sciences within a general framework termed "survival analysis" where it is employed with right-censored time-to-failure data. However, K-M methods are equally valid for the left-censored data common in the geosciences. Our S-language software provides an analytical framework based on K-M methods that is tailored to the needs of the earth and environmental sciences community. This includes routines for the generation of empirical cumulative distribution functions, prediction or exceedance probabilities, and related confidence limits computation. Additionally, our software contains K-M-based routines for nonparametric hypothesis testing among an unlimited number of grouping variables. A primary characteristic of K-M methods is that they do not perform extrapolation and interpolation. Thus, these routines cannot be used to model statistics beyond the observed data range or when linear interpolation is desired. For such applications, the aforementioned parametric and semi-parametric methods must be used.
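A minimal sketch (with assumed details, not the authors' S-language software) of the standard trick that lets right-censoring Kaplan-Meier machinery handle the left-censored data described above: flip the data about a constant larger than every observation, run the product-limit estimator, then flip back.

```python
import numpy as np

def km_left_censored(values, censored, flip_at=None):
    """Kaplan-Meier ECDF for left-censored data (censored=True means '< value').

    Left-censored values are converted to right-censored ones by flipping
    about a constant M larger than every observation; the survival function
    of the flipped variable at t is then the CDF of the original at M - t.
    """
    values = np.asarray(values, float)
    censored = np.asarray(censored, bool)
    M = flip_at if flip_at is not None else values.max() + 1.0
    t = M - values                       # flipped: left-censored -> right-censored
    order = np.argsort(t)
    t, cens = t[order], censored[order]
    n = len(t)
    surv = 1.0
    xs, cdf = [], []
    for i in range(n):
        at_risk = n - i
        if not cens[i]:                  # uncensored (detected) observation
            surv *= (at_risk - 1) / at_risk
        xs.append(M - t[i])              # back-transform to the original scale
        cdf.append(surv)
    return np.array(xs[::-1]), np.array(cdf[::-1])

# Hypothetical concentrations with two detection limits (0.5 and 1.0)
vals = [0.5, 1.0, 1.2, 2.0, 3.1, 1.0, 0.7, 4.0]
cens = [True, True, False, False, False, True, False, False]
x, F = km_left_censored(vals, cens)
```

Note that, exactly as the abstract states, the estimate is defined only over the observed data range; no extrapolation below the smallest detection limit is produced.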
Directory of Open Access Journals (Sweden)
Krzysztof Józwikowska
2015-06-01
The main goal of this work is to determine a statistical non-equilibrium distribution function for electrons and holes in semiconductor heterostructures under steady-state conditions. Based on the postulates of local equilibrium, as well as on the integral form of the weighted Gyarmati's variational principle in the force representation, using an alternative method, we have derived general expressions which have the form of the Fermi–Dirac distribution function with four additional components. The physical interpretation of these components is carried out in this paper. Some numerical results of a non-equilibrium distribution function for electrons in HgCdTe structures are also presented.
Some statistical aspects of characterizing the distribution of radionuclides in the environment
International Nuclear Information System (INIS)
Hutchinson, S.W.; Miller, F.L. Jr.
1981-05-01
A radiological characterization was performed at the former Kellex site, the first pilot uranium-diffusion plant, in Jersey City, New Jersey, to determine if there were any contaminated regions containing 40 pCi/g or more of ²³⁸U in the top 20 cm of soil over a 400 m² area. As a result of this radiological survey, final decisions would be made about the need for remedial action (cleanup) and the suitability of the site for unrestricted use. This paper describes the development of and the statistical reasoning behind a sampling plan for the radiological characterization
Accounting providing of statistical analysis of intangible assets renewal under marketing strategy
Directory of Open Access Journals (Sweden)
I.R. Polishchuk
2016-12-01
The article analyzes the content of the Regulations on accounting policies of the surveyed enterprises in terms of operations concerning the amortization of intangible assets, on the following criteria: assessment on admission, determination of useful life, the period of depreciation, residual value, depreciation method, reflection in the financial statements, unit of account, revaluation, and formation of fair value. The factors affecting the accounting policies and determining the mechanism for evaluating the completeness and timeliness of intangible assets renewal are characterized. An algorithm for selecting the method of intangible assets amortization is proposed. The knowledge base for statistical analysis of the timeliness and completeness of intangible assets renewal, in terms of the developed internal reporting, is expanded. Statistical indicators to assess the effectiveness of the amortization policy for intangible assets are proposed. Marketing strategies depending on the condition and amount of intangible assets, in relation to increasing marketing potential for continuity of economic activity, are described.
Arce-Romero, Antonio Rafael; Monterroso-Rivas, Alejandro Ismael; Gómez-Díaz, Jesús David; Cruz-León, Artemio
2017-01-01
Plums (Spondias spp.) are species native to Mexico with adaptive, nutritional and ethnobotanical advantages. The aim of this study was to assess the current and potential distribution of two species of Mexican plum: Spondias purpurea L. and Spondias mombin L. The method applied was ecological niche modeling in Maxent software, which has been used in Mexico with good results. In fieldwork, information on the presence of these species in the country was collected. In addition, environm...
Binary star statistics: the mass ratio distribution for very wide systems
International Nuclear Information System (INIS)
Trimble, V.
1987-01-01
The distribution of mass ratios for a sample of common proper motion (CPM) binaries is determined and compared with that of 798 visual binaries (VB's) studied earlier, in hopes of answering the question: can the member stars of these systems have been drawn at random from the normal initial mass function for single stars? The observed distributions peak strongly toward q = 1.0 for both kinds of systems, but less strongly for the CPM's than for the VB's. Due allowance having been made for assorted observational selection effects, it seems quite probable that the CPM's represent the observed part of a population drawn at random from the normal IMF, while the VB's are much more difficult to interpret that way and could, perhaps, result from a formation mechanism that somewhat favors systems with roughly equal components. (author)
Statistics for Ratios of Rayleigh, Rician, Nakagami-m, and Weibull Distributed Random Variables
Directory of Open Access Journals (Sweden)
Dragana Č. Pavlović
2013-01-01
The distributions of ratios of random variables are of interest in many areas of the sciences. In this brief paper, we present the joint probability density function (PDF) and the PDF of the maximum of the ratios μ1=R1/r1 and μ2=R2/r2 for the cases where R1, R2, r1, and r2 are Rayleigh, Rician, Nakagami-m, and Weibull distributed random variables. Random variables R1 and R2, as well as random variables r1 and r2, are correlated. Given the suitability of the Weibull distribution for describing fading in both indoor and outdoor environments, special attention is dedicated to the case of Weibull random variables. For this case, analytical expressions for the joint PDF, PDF of the maximum, PDF of the minimum, and product moments of an arbitrary number of ratios μi=Ri/ri, i=1,…,L are obtained. The random variables in the numerator, Ri, as well as those in the denominator, ri, are exponentially correlated. To the best of the authors' knowledge, the analytical expressions for the PDF of the minimum and the product moments of {μi}i=1L are novel in the open technical literature. The proposed mathematical analysis is complemented by various numerical results. An application of the presented theoretical results is illustrated with respect to the performance assessment of wireless systems.
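For the simplest case in this family, a ratio of two independent Rayleigh variables, the CDF has a well-known closed form that a short simulation can verify (a sketch only; the paper treats the harder correlated case). With X ~ Rayleigh(a) and Y ~ Rayleigh(b), X² and Y² are exponential, so Z = X/Y satisfies P(Z ≤ z) = z²/(z² + r) with r = (a/b)².

```python
import numpy as np

rng = np.random.default_rng(1)

# Independent Rayleigh numerator and denominator (assumed scales a, b).
a, b = 2.0, 1.0
r = (a / b) ** 2
n = 100_000
z = rng.rayleigh(a, n) / rng.rayleigh(b, n)

# Compare the empirical CDF with the exact one, F(z) = z^2 / (z^2 + r),
# at a few grid points.
grid = np.array([0.5, 1.0, 2.0, 4.0])
empirical = [(z <= g).mean() for g in grid]
exact = grid ** 2 / (grid ** 2 + r)
```

Agreement at the grid points confirms the ratio-of-exponentials derivation behind the closed form.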
Boosting up quantum key distribution by learning statistics of practical single-photon sources
International Nuclear Information System (INIS)
Adachi, Yoritoshi; Yamamoto, Takashi; Koashi, Masato; Imoto, Nobuyuki
2009-01-01
We propose a simple quantum-key-distribution (QKD) scheme for practical single-photon sources (SPSs), which works even with a moderate suppression of the second-order correlation g(2) of the source. The scheme utilizes a passive preparation of a decoy state by monitoring a fraction of the signal via an additional beam splitter and a detector at the sender's side to monitor photon-number-splitting attacks. We show that the achievable distance increases with the precision with which the sub-Poissonian tendency is confirmed in the higher photon-number distribution of the source, rather than with the actual suppression of multiphoton emission events. We present an example of the secure key generation rate in the case of a poor SPS with g(2) = 0.19, in which no secure key is produced with the conventional QKD scheme, and show that learning the photon-number distribution up to several numbers is sufficient for achieving almost the same distance as that of an ideal SPS.
Statistical distribution of the local purity in a large quantum system
International Nuclear Information System (INIS)
De Pasquale, A; Pascazio, S; Facchi, P; Giovannetti, V; Parisi, G; Scardicchio, A
2012-01-01
The local purity of large many-body quantum systems can be studied by following a statistical mechanical approach based on a random matrix model. Restricting the analysis to the case of global pure states, this method proved to be successful, and a full characterization of the statistical properties of the local purity was obtained by computing the partition function of the problem. Here we generalize these techniques to the case of global mixed states. In this context, by uniformly sampling the phase space of states with assigned global mixedness, we determine the exact expression of the first two moments of the local purity and a general expression for the moments of higher order. This generalizes previous results obtained for globally pure configurations. Furthermore, through the introduction of a partition function for a suitable canonical ensemble, we compute the approximate expression of the first moment of the marginal purity in the high-temperature regime. In the process, we establish a formal connection with the theory of quantum twirling maps that provides an alternative, possibly fruitful, way of performing the calculation. (paper)
Statistical quality analysis of schedulers under soft-real-time constraints
Baarsma, H.E.; Hurink, Johann L.; Jansen, P.G.
2007-01-01
This paper describes an algorithm to determine the performance of real-time systems with tasks using stochastic processing times. Such an algorithm can be used for guaranteeing Quality of Service of periodic tasks with soft real-time constraints. We use a discrete distribution model of processing
Fouladi, Rachel T.
2000-01-01
Provides an overview of standard and modified normal theory and asymptotically distribution-free covariance and correlation structure analysis techniques and details Monte Carlo simulation results on Type I and Type II error control. Demonstrates through the simulation that robustness and nonrobustness of structure analysis techniques vary as a…
Occurrence and distribution of soil Fusarium species under wheat crop in zero tillage
Energy Technology Data Exchange (ETDEWEB)
Silvestro, L. B.; Stenglein, S. A.; Forjan, H.; Dinolfo, M. I.; Aramburri, A. M.; Manso, L.; Moreno, M. V.
2013-05-01
The presence of Fusarium species in cultivated soils is commonly associated with plant debris and plant roots. Fusarium species are also soil saprophytes. The aim of this study was to examine the occurrence and distribution of soil Fusarium spp. at different soil depths in a zero tillage system after the wheat was harvested. Soil samples were obtained at three depths (0-5 cm, 5-10 cm and 10-20 cm) from five crop rotations: I, conservationist agriculture (wheat-sorghum-soybean); II, mixed agriculture/livestock with pastures, without using winter or summer forages (wheat-sorghum-soybean-canola-pastures); III, winter agriculture in depth-limited soils (wheat-canola-barley-late soybean); IV, mixed with annual forage (wheat-oat/Vicia-sunflower); V, intensive agriculture (wheat-barley-canola, with alternation of soybean or late soybean). One hundred twenty-two isolates of Fusarium were obtained and identified as F. equiseti, F. merismoides, F. oxysporum, F. scirpi and F. solani. The most prevalent species was F. oxysporum, which was observed in all sequences and depths. Tukey's test showed that the relative frequency of F. oxysporum under intensive agricultural management was higher than in mixed traditional ones. The first 5 cm of soil showed statistically significant differences (p=0.05) with respect to the 5-10 cm and 10-20 cm depths. The ANOVA test for the relative frequencies of the other species (F. equiseti, F. merismoides, F. scirpi and F. solani) did not show statistically significant differences (p<0.05). We did not find significant differences (p<0.05) in the effect of crop rotations and depth on the Shannon and Simpson indexes and species richness. Therefore we conclude that the different sequences and the sampling depth did not affect the alpha diversity of the Fusarium community in this system. (Author) 51 refs.
International Nuclear Information System (INIS)
Hanot, C.; Riaud, P.; Absil, O.; Mennesson, B.; Martin, S.; Liewer, K.; Loya, F.; Mawet, D.; Serabyn, E.
2011-01-01
A new 'self-calibrated' statistical analysis method has been developed for the reduction of nulling interferometry data. The idea is to use the statistical distributions of the fluctuating null depth and beam intensities to retrieve the astrophysical null depth (or equivalently the object's visibility) in the presence of fast atmospheric fluctuations. The approach yields an accuracy much better (about an order of magnitude) than is presently possible with standard data reduction methods, because the astrophysical null depth accuracy is no longer limited by the magnitude of the instrumental phase and intensity errors but by uncertainties on their probability distributions. This approach was tested on the sky with the two-aperture fiber nulling instrument mounted on the Palomar Hale telescope. Using our new data analysis approach alone, and no observations of calibrators, we find that error bars on the astrophysical null depth as low as a few 10^-4 can be obtained in the near-infrared, which means that null depths lower than 10^-3 can be reliably measured. This statistical analysis is not specific to our instrument and may be applicable to other interferometers.
Quantum-like microeconomics: Statistical model of distribution of investments and production
Khrennikov, Andrei
2008-10-01
In this paper we demonstrate that probabilistic quantum-like (QL) behavior (Born's rule, interference of probabilities, violation of Bell's inequality, representation of variables by in general noncommutative self-adjoint operators, Schrödinger dynamics) can be exhibited not only by processes in the micro world, but also in economics. In our approach the QL behavior is not induced by properties of the systems themselves. Here the systems (commodities) are macroscopic; they could not be superpositions of two different states. In our approach the QL behavior of economic statistics is a consequence of the organization of the process of production as well as investments. In particular, the Hamiltonian ("financial energy") is determined by the rate of return.
Statistics analysis of distribution of Bradysia Ocellaris insect on Oyster mushroom cultivation
Sari, Kurnia Novita; Amelia, Ririn
2015-12-01
Bradysia ocellaris is an insect pest in oyster mushroom cultivation. The distribution of Bradysia ocellaris follows a spatial pattern that can be observed every week, under several assumptions such as independence, normality, and homogeneity. The number of Bradysia ocellaris for each week is first examined through descriptive analysis. Next, the distribution pattern of Bradysia ocellaris is described by the semivariogram, the plot of the variance of the differences between pairs of observations separated by a distance d. The semivariogram model that best fits the Bradysia ocellaris data is the isotropic spherical model.
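A minimal sketch (illustrative, not the authors' code) of the two ingredients named in the abstract: an empirical semivariogram over pairs of observations separated by a lag d, and the isotropic spherical model used to fit it.

```python
import numpy as np

def empirical_semivariogram(x, values, lags, tol):
    """gamma(h) = 0.5 * mean (v_i - v_j)^2 over pairs with |x_i - x_j| near h."""
    x, values = np.asarray(x, float), np.asarray(values, float)
    d = np.abs(x[:, None] - x[None, :])
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    gamma = []
    for h in lags:
        mask = (np.abs(d - h) <= tol) & (d > 0)
        gamma.append(sq[mask].mean() if mask.any() else np.nan)
    return np.array(gamma)

def spherical(h, sill, range_a):
    """Isotropic spherical model: 1.5(h/a) - 0.5(h/a)^3 up to the range a,
    then flat at the sill."""
    u = np.clip(np.asarray(h, float) / range_a, 0.0, 1.0)
    return sill * (1.5 * u - 0.5 * u ** 3)

# Spherical model at a few lags (assumed sill 2.0 and range 3.0)
g = spherical([0.0, 1.5, 3.0, 10.0], 2.0, 3.0)

# Empirical semivariogram of a perfectly alternating 1-D series: pairs at
# lag 1 always differ by 1, pairs at lag 2 are always equal.
obs = empirical_semivariogram(np.arange(6.0),
                              np.array([0., 1., 0., 1., 0., 1.]),
                              lags=[1.0, 2.0], tol=0.1)
```

The spherical model reaching its sill exactly at the range, and staying flat beyond it, is what distinguishes it from exponential or Gaussian variogram models.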
International Nuclear Information System (INIS)
Croce, R P; Demma, Th; Longo, M; Marano, S; Matta, V; Pierro, V; Pinto, I M
2003-01-01
The cumulative distribution of the supremum of a set (bank) of correlators is investigated in the context of maximum likelihood detection of gravitational wave chirps from coalescing binaries with unknown parameters. Accurate (lower-bound) approximants are introduced based on a suitable generalization of previous results by Mohanty. Asymptotic properties (in the limit where the number of correlators goes to infinity) are highlighted. The validity of numerical simulations made on small-size banks is extended to banks of any size, via a Gaussian correlation inequality
Portfolio selection problem with liquidity constraints under non-extensive statistical mechanics
International Nuclear Information System (INIS)
Zhao, Pan; Xiao, Qingxian
2016-01-01
In this study, we consider the optimal portfolio selection problem with liquidity limits. A portfolio selection model is proposed in which the risky asset price is driven by a process based on non-extensive statistical mechanics instead of the classic Wiener process. Using dynamic programming and Lagrange multiplier methods, we obtain the optimal policy and value function. Moreover, the numerical results indicate that this model differs considerably from the model based on the classic Wiener process: the optimal strategy is affected by the non-extensive parameter q, the investment in the risky asset increases faster for larger q, and the increase in wealth is similar.
Statistical analysis of traversal behavior under different types of traffic lights
Wang, Boran; Wang, Ziyang; Li, Zhiyin
2017-12-01
According to video observation, the type of traffic signal has a significant effect on the illegal crossing behavior of pedestrians at intersections. Using statistical analysis and analysis of variance, the differences in violation rate and pedestrian waiting position under different signal types are compared, and the influence of traffic signal type on pedestrian crossing behavior is evaluated. The results show that the violation rate at intersections with static pedestrian lights is significantly higher than at those with countdown signal lights. There are also significant differences in the waiting position at intersections with different signal lights.
Panagiotopoulou, Olga; Pataky, Todd C; Hill, Zoe; Hutchinson, John R
2012-05-01
Foot pressure distributions during locomotion have causal links with the anatomical and structural configurations of the foot tissues and the mechanics of locomotion. Elephant feet have five toes bound in a flexible pad of fibrous tissue (digital cushion). Does this specialized foot design control peak foot pressures in such giant animals? And how does body size, such as during ontogenetic growth, influence foot pressures? We addressed these questions by studying foot pressure distributions in elephant feet and their correlation with body mass and centre of pressure trajectories, using statistical parametric mapping (SPM), a neuro-imaging technology. Our results show a positive correlation between body mass and peak pressures, with the highest pressures dominated by the distal ends of the lateral toes (digits 3, 4 and 5). We also demonstrate that pressure reduction in the elephant digital cushion is a complex interaction of its viscoelastic tissue structure and its centre of pressure trajectories, because there is a tendency to avoid rear 'heel' contact as an elephant grows. Using SPM, we present a complete map of pressure distributions in elephant feet during ontogeny by performing statistical analysis at the pixel level across the entire plantar/palmar surface. We hope that our study will build confidence in the potential clinical and scaling applications of mammalian foot pressures, given our findings in support of a link between regional peak pressures and pathogenesis in elephant feet.
International Nuclear Information System (INIS)
Poudineh, Rahmatallah; Jamasb, Tooraj
2016-01-01
Investment in electricity networks, as regulated natural monopolies, is among the highest regulatory and energy policy priorities. The electricity sector regulators adopt different incentive mechanisms to ensure that the firms undertake sufficient investment to maintain and modernise the grid. Thus, an effective regulatory treatment of investment requires understanding the response of companies to the regulatory incentives. This study analyses the determinants of investment in electricity distribution networks using a panel dataset of 129 Norwegian companies observed from 2004 to 2010. A Bayesian Model Averaging approach is used to provide a robust statistical inference by taking into account the uncertainties around model selection and estimation. The results show that three factors drive nearly all network investments: investment rate in previous period, socio-economic costs of energy not supplied and finally useful life of assets. The results indicate that Norwegian companies have, to some degree, responded to the investment incentives provided by the regulatory framework. However, some of the incentives do not appear to be effective in driving the investments. - Highlights: • This paper investigates determinants of investment under incentive regulation. • We apply a Bayesian model averaging technique to deal with model uncertainty. • Dataset comprises 129 Norwegian electricity network companies from 2004 to 2010. • The results show that firms have generally responded to investment incentives. • However, some of the incentives do not appear to have been effective.
Zhu, Jian-Rong; Li, Jian; Zhang, Chun-Mei; Wang, Qin
2017-10-01
The decoy-state method has been widely used in commercial quantum key distribution (QKD) systems. In view of the practical decoy-state QKD with both source errors and statistical fluctuations, we propose a universal model of full parameter optimization in biased decoy-state QKD with phase-randomized sources. Besides, we adopt this model to carry out simulations of two widely used sources: weak coherent source (WCS) and heralded single-photon source (HSPS). Results show that full parameter optimization can significantly improve not only the secure transmission distance but also the final key generation rate. And when taking source errors and statistical fluctuations into account, the performance of decoy-state QKD using HSPS suffered less than that of decoy-state QKD using WCS.
Directory of Open Access Journals (Sweden)
E. E. Woodfield
2002-12-01
A statistical investigation of the Doppler spectral width parameter routinely observed by HF coherent radars has been conducted between the Northern and Southern Hemispheres for the nightside ionosphere. Data from the SuperDARN radars at Thykkvibær, Iceland and Syowa East, Antarctica have been employed for this purpose. Both radars frequently observe regions of high (> 200 m s^-1) spectral width polewards of low (< 200 m s^-1) spectral width. Three years of data from both radars have been analysed for both the spectral width and the line-of-sight velocity. The pointing direction of these two radars is such that the flow reversal boundary may be estimated from the velocity data, and therefore we have an estimate of the open/closed field line boundary location for comparison with the high spectral widths. Five key observations regarding the behaviour of the spectral width on the nightside have been made. These are: (i) the two radars observe similar characteristics on a statistical basis; (ii) a latitudinal dependence related to magnetic local time is found in both hemispheres; (iii) a seasonal dependence of the spectral width is observed by both radars, which shows a marked absence of latitudinal dependence during the summer months; (iv) in general, the Syowa East spectral width tends to be larger than that from Iceland East; and (v) the highest spectral widths seem to appear on both open and closed field lines. Points (i) and (ii) indicate that the cause of high spectral width is magnetospheric in origin. Point (iii) suggests that either the propagation of the HF radio waves to regions of high spectral width or the generating mechanism(s) for high spectral width is affected by solar illumination or other seasonal effects. Point (iv) suggests that the radar beams from each of the radars are subject either to different instrumental or propagation effects, or different geophysical conditions due to their locations, although we suggest that this result is more likely to
DEFF Research Database (Denmark)
Sørensen, John Dalsgaard; Enevoldsen, I.
1993-01-01
It has been observed and shown that, in some examples, a sensitivity analysis of the first-order reliability index results in an increasing reliability index when the standard deviation of a stochastic variable is increased while the expected value is fixed. This unfortunate behaviour can occur when a stochastic variable is modelled by an asymmetrical density function. For lognormally, Gumbel and Weibull distributed stochastic variables it is shown for which combinations of the β-point, the expected value and the standard deviation the weakness can occur. In relation to practical application the behaviour is probably rather infrequent. A simple example is shown as illustration and to exemplify that for second-order reliability methods and for exact calculations of the probability of failure this behaviour is much more infrequent.
Statistical inferences with jointly type-II censored samples from two Pareto distributions
Abu-Zinadah, Hanaa H.
2017-08-01
In several industries the product comes from more than one production line, which calls for comparative life tests. This requires sampling from the different production lines, giving rise to a joint censoring scheme. In this article we consider the lifetime Pareto distribution under a jointly type-II censoring scheme. The maximum likelihood estimators (MLEs) and the corresponding approximate confidence intervals, as well as the bootstrap confidence intervals of the model parameters, are obtained. Bayesian point estimates and credible intervals of the model parameters are also presented. A lifetime data set is analyzed for illustrative purposes. Monte Carlo results from simulation studies are presented to assess the performance of the proposed method.
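As background to the lifetime Pareto model above, a small sketch of the closed-form maximum likelihood estimator for the Pareto shape parameter, in the simplified uncensored case with the scale x_m assumed known (the paper's joint type-II censoring setting is more involved).

```python
import numpy as np

rng = np.random.default_rng(2)

# Classical Pareto(alpha, x_m): density alpha * x_m**alpha / x**(alpha + 1)
# for x >= x_m. With x_m known, the MLE of the shape parameter is
#   alpha_hat = n / sum(log(x_i / x_m)).
alpha, x_m = 3.0, 1.0
n = 50_000

# numpy's pareto() draws the Lomax (shifted) form; adding 1 and scaling
# by x_m gives the classical Pareto.
x = x_m * (1.0 + rng.pareto(alpha, n))

alpha_hat = len(x) / np.log(x / x_m).sum()
```

The estimator follows because log(x_i / x_m) is exponentially distributed with rate alpha, so the sample mean of the logs converges to 1/alpha.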
International Nuclear Information System (INIS)
Vardavas, I.M.
1992-01-01
A simple procedure is presented for the statistical analysis of measurement data where the primary concern is the determination of the value corresponding to a specified average exceedance probability. The analysis employs the normal and log-normal frequency distributions together with a χ²-test and an error analysis. The error analysis introduces the concept of a counting error criterion, or ζ-test, to test whether the data are sufficient to make the χ²-test reliable. The procedure is applied to the analysis of annual rainfall data recorded at stations in the tropical Top End of Australia where the Ranger uranium deposit is situated. 9 refs., 12 tabs., 9 figs
McArtor, Daniel B; Lubke, Gitta H; Bergeman, C S
2017-12-01
Person-centered methods are useful for studying individual differences in terms of (dis)similarities between response profiles on multivariate outcomes. Multivariate distance matrix regression (MDMR) tests the significance of associations of response profile (dis)similarities and a set of predictors using permutation tests. This paper extends MDMR by deriving and empirically validating the asymptotic null distribution of its test statistic, and by proposing an effect size for individual outcome variables, which is shown to recover true associations. These extensions alleviate the computational burden of permutation tests currently used in MDMR and render more informative results, thus making MDMR accessible to new research domains.
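A bare-bones sketch of the permutation version of MDMR described above (an illustrative pseudo-F formulation; the paper's contribution is precisely an asymptotic alternative to this computationally heavy permutation scheme). The distance matrix is Gower-centered and the predictor hat matrix partitions its trace.

```python
import numpy as np

def mdmr_pseudo_f(D, X):
    """Pseudo-F for an n x n distance matrix D and an n x p predictor matrix X."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    G = -0.5 * J @ (D ** 2) @ J                   # Gower-centered inner products
    Xc = np.column_stack([np.ones(n), X])         # add intercept
    H = Xc @ np.linalg.pinv(Xc.T @ Xc) @ Xc.T     # hat (projection) matrix
    I = np.eye(n)
    return np.trace(H @ G @ H) / np.trace((I - H) @ G @ (I - H))

def mdmr_permutation_p(D, X, n_perm=499, seed=0):
    """Permutation p-value: shuffle predictor rows, recompute pseudo-F."""
    rng = np.random.default_rng(seed)
    f_obs = mdmr_pseudo_f(D, X)
    n = D.shape[0]
    count = sum(mdmr_pseudo_f(D, X[rng.permutation(n)]) >= f_obs
                for _ in range(n_perm))
    return (count + 1) / (n_perm + 1)

# Hypothetical strong signal: three-variate response profiles driven by
# a single predictor, plus a little noise.
rng = np.random.default_rng(3)
xpred = rng.normal(size=40)
Y = np.outer(xpred, [1.0, 1.0, 1.0]) + 0.1 * rng.normal(size=(40, 3))
D = np.sqrt(((Y[:, None, :] - Y[None, :, :]) ** 2).sum(-1))  # Euclidean distances
p = mdmr_permutation_p(D, xpred[:, None])
```

Each permutation re-fits the full projection, which is why replacing this loop with an asymptotic null distribution, as the paper does, matters for large samples.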
Dabanlı, İsmail; Şen, Zekai
2018-04-01
The statistical climate downscaling model of the Turkish Water Foundation (TWF) is further developed and applied to a set of monthly precipitation records. The model is structured in two phases: spatial (regional) and temporal downscaling of global circulation model (GCM) scenarios. The TWF model takes into consideration the regional dependence function (RDF) for the spatial structure and a Markov whitening process (MWP) for the temporal characteristics of the records to set projections. The impact of climate change on monthly precipitation is studied by downscaling the Intergovernmental Panel on Climate Change-Special Report on Emission Scenarios (IPCC-SRES) A2 and B2 emission scenarios from the Max Planck Institute (EH40PYC) and the Hadley Centre (HadCM3). The main purposes are to explain the TWF statistical climate downscaling model procedures and to present the validation tests, which rate the model as "very good" for all stations except one (Suhut) in the Akarcay basin in the west-central part of Turkey. Even though the validation score is slightly lower at the Suhut station, the results there are still "satisfactory." It is, therefore, possible to say that the TWF model has reasonably acceptable skill for accurate estimation in terms of the standard deviation ratio (SDR), Nash-Sutcliffe efficiency (NSE), and percent bias (PBIAS) criteria. Based on the validated model, precipitation predictions are generated from 2011 to 2100 using the 30-year reference observation period (1981-2010). The precipitation arithmetic average and standard deviation have less than 5% error for the EH40PYC and HadCM3 SRES (A2 and B2) scenarios.
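The three validation criteria mentioned (SDR, NSE, PBIAS) can be computed as follows; this is one standard formulation, and the PBIAS sign convention varies between authors:

```python
import math

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is perfect; 0 means no better than the mean."""
    m = sum(obs) / len(obs)
    return 1.0 - (sum((o - s) ** 2 for o, s in zip(obs, sim))
                  / sum((o - m) ** 2 for o in obs))

def pbias(obs, sim):
    """Percent bias: 0 is perfect; positive values here mean underestimation."""
    return 100.0 * sum(o - s for o, s in zip(obs, sim)) / sum(obs)

def sdr(obs, sim):
    """Ratio of simulated to observed standard deviation; 1 is perfect."""
    def sd(v):
        m = sum(v) / len(v)
        return math.sqrt(sum((x - m) ** 2 for x in v) / len(v))
    return sd(sim) / sd(obs)

# hypothetical monthly precipitation (mm): observed vs. downscaled
obs = [52.0, 61.0, 38.0, 75.0, 44.0]
sim = [49.0, 65.0, 40.0, 70.0, 47.0]
scores = (sdr(obs, sim), nse(obs, sim), pbias(obs, sim))
```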
2012-01-01
Background: The bioavailability of metals in soils is commonly assessed by chemical extractions; however, a generally accepted method is not yet established. In this study, the effectiveness of the Diffusive Gradients in Thin-films (DGT) technique and of single extractions in the assessment of metal bioaccumulation in vegetables, and the influence of soil parameters on phytoavailability, were evaluated using multivariate statistics. Soil and plants grown in vegetable gardens from mining-affected rural areas of NW Romania were collected and analysed. Results: Pseudo-total contents of Cu, Zn and Cd in soil ranged between 17.3-146 mg kg-1, 141-833 mg kg-1 and 0.15-2.05 mg kg-1, respectively, showing enriched contents of these elements. High degrees of metal extractability in 1M HCl, and even in 1M NH4Cl, were observed. Despite the relatively high total metal concentrations in soil, those found in vegetables were comparable to values typically reported for agricultural crops, probably due to the low concentrations of metals in soil solution (Csoln) and the low effective concentrations (CE) assessed by the DGT technique. Among the analysed vegetables, the highest metal concentrations were found in carrot roots. By applying multivariate statistics, it was found that CE, Csoln and extraction in 1M NH4Cl were better predictors of metal bioavailability than the acid extractions applied in this study. Copper transfer to vegetables was strongly influenced by soil organic carbon (OC) and cation exchange capacity (CEC), while pH had a stronger influence on Cd transfer from soil to plants. Conclusions: The results showed that DGT can be used for a general evaluation of the risks associated with soil contamination with Cu, Zn and Cd in field conditions, although quantitative information on metal transfer from soil to vegetables was not obtained. PMID:23079133
Sohn, Illsoo; Lee, Byong Ok; Lee, Kwang Bok
Multimedia services are increasing with the widespread use of various wireless applications such as web browsers, real-time video, and interactive games, which results in traffic asymmetry between the uplink and downlink. Hence, time division duplex (TDD) systems, which provide advantages in efficient bandwidth utilization under asymmetric traffic environments, have become one of the most important issues in future mobile cellular systems. It is known that two types of intercell interference, referred to as crossed-slot interference, additionally arise in TDD systems: the performances of the uplink and downlink transmissions are degraded by BS-to-BS crossed-slot interference and MS-to-MS crossed-slot interference, respectively. The resulting performance imbalance between the uplink and downlink makes network deployment severely inefficient. Previous works have proposed intelligent time slot allocation algorithms to mitigate the crossed-slot interference problem. However, they require centralized control, which causes large signaling overhead in the network. In this paper, we propose to change the shape of the cellular structure itself. The conventional cellular structure is easily transformed into the proposed cellular structure with distributed receive antennas (DRAs). We set up a statistical Markov chain traffic model and analyze the bit error performances of the conventional and proposed cellular structures under asymmetric traffic environments. Numerical results show that the uplink and downlink performances of the proposed cellular structure become balanced with the proper number of DRAs, and thus the proposed structure is notably cost-effective in network deployment compared to the conventional one. As a result, extending the conventional cellular structure into the proposed cellular structure with DRAs is a remarkably cost-effective solution to support asymmetric traffic environments in future mobile cellular systems.
International Nuclear Information System (INIS)
Lewis, J.C.
2011-01-01
In a recent paper (Lewis, 2008) a class of models suitable for application to collision-sequence interference was introduced. In these models velocities are assumed to be completely randomized in each collision, and the distribution of velocities is assumed to be Gaussian. The integrated induced dipole moment μk (for vector interference), or the scalar modulation μk (for scalar interference), was assumed to be a function of the impulse (integrated force) fk, or its magnitude fk, experienced by the molecule in a collision. For most of (Lewis, 2008) it was assumed that μk ∝ fk and μk ∝ fk, but it proved possible to extend the models so that the magnitude of the induced dipole moment is equal to an arbitrary power, or sum of powers, of the intermolecular force. This allows estimates of the infilling of the interference dip caused by the disproportionality of the induced dipole moment and force. One such model, using data from (Herman and Lewis, 2006), leads to the most realistic estimate of the infilling of the vector interference dip yet obtained. In (Lewis, 2008) the drastic assumption was made that collisions occurred at equal time intervals. In the present paper that assumption is removed: the collision times are taken to form a Poisson process. This is much more realistic than the equal-intervals assumption. The interference dip is found to be a Lorentzian in this model.
Botbol, Joseph Moses; Evenden, Gerald Ian
1989-01-01
Tables, graphs, and maps are used to portray the frequency characteristics and spatial distribution of manganese oxide-rich phase geochemical data, to characterize the northern Pacific in terms of publicly available nodule geochemical data, and to develop data portrayal methods that will facilitate data analysis. Source data are a subset of the Scripps Institution of Oceanography's Sediment Data Bank. The study area is bounded by 0° N., 40° N., 120° E., and 100° W. and is arbitrarily subdivided into 14 20°×20° geographic subregions. Frequency distributions of trace metals characterized in the original raw data are graphed as ogives, and salient parameters are tabulated. All variables are transformed to enrichment values relative to the median concentration within their host subregions. Scatter plots of all pairs of original variables and their enrichment transforms are provided as an aid to the interpretation of correlations between variables. Gridded spatial distributions of all variables are portrayed as gray-scale maps. The use of tables and graphs to portray frequency statistics, and gray-scale maps to portray spatial distributions, is an effective way to prepare for and facilitate multivariate data analysis.
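The enrichment transform described (each value expressed relative to the median concentration of its host subregion) can be sketched as follows; the concentrations are hypothetical:

```python
from statistics import median

def enrichment(values_by_subregion):
    """Transform each observation to its ratio over the median of its
    host subregion; a value of 1.0 means 'typical for the subregion'."""
    out = {}
    for region, values in values_by_subregion.items():
        m = median(values)
        out[region] = [v / m for v in values]
    return out

# hypothetical Mn concentrations (wt%) in two subregions
grids = {"A": [2.0, 4.0, 8.0], "B": [10.0, 20.0, 40.0]}
enriched = enrichment(grids)
```

The transform puts subregions with very different absolute levels on a common relative scale, which is what makes the cross-region scatter plots and gray-scale maps comparable.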
Li, Ning; Liu, Xueqin; Xie, Wei; Wu, Jidong; Zhang, Peng
2013-01-01
New features of natural disasters have been observed over the last several years. The factors that influence the disasters' formation mechanisms, regularity of occurrence and main characteristics have been revealed to be more complicated and diverse in nature than previously thought. As the uncertainty involved increases, the variables need to be examined further. This article discusses the importance of, and the shortage of, multivariate analysis of natural disasters and presents a method to estimate the joint probability of the return periods and perform a risk analysis. Severe dust storms from 1990 to 2008 in Inner Mongolia were used as a case study to test this new methodology, as they are normal and recurring climatic phenomena on Earth. Based on the 79 investigated events and according to a bivariate definition of dust storms, the joint probability distribution of severe dust storms was established using the observed data of maximum wind speed and duration. The joint return periods of severe dust storms were calculated, and the relevant risk was analyzed according to the joint probability. The copula function is able to simulate severe dust storm disasters accurately. The joint return periods generated are closer to those observed in reality than the univariate return periods and thus have more value in severe dust storm disaster mitigation, strategy making, program design, and improvement of risk management. This research may prove useful in risk-based decision making. The exploration of multivariate analysis methods can also lay the foundation for further applications in natural disaster risk analysis. © 2012 Society for Risk Analysis.
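Joint return periods from a fitted copula can be sketched as below; the Gumbel copula and the parameter values are illustrative assumptions, not the copula family reported in the study:

```python
import math

def gumbel_copula(u, v, theta):
    """Gumbel copula C(u, v); theta >= 1, with theta = 1 giving independence."""
    return math.exp(-((-math.log(u)) ** theta
                      + (-math.log(v)) ** theta) ** (1.0 / theta))

def joint_return_periods(u, v, theta, mu=1.0):
    """'OR' period (either variable exceeds its quantile) and 'AND' period
    (both exceed), in units of the mean inter-event time mu (years)."""
    c = gumbel_copula(u, v, theta)
    t_or = mu / (1.0 - c)
    t_and = mu / (1.0 - u - v + c)
    return t_or, t_and

# e.g. maximum wind speed and duration both at their 0.99 quantiles
t_or, t_and = joint_return_periods(0.99, 0.99, theta=2.0)
```

Because C(u, v) never exceeds min(u, v), the "OR" event always recurs at least as often as the "AND" event, which is why univariate return periods alone can misstate the joint risk.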
Fusion of fuzzy statistical distributions for classification of thyroid ultrasound patterns.
Iakovidis, Dimitris K; Keramidas, Eystratios G; Maroulis, Dimitris
2010-09-01
This paper proposes a novel approach for thyroid ultrasound pattern representation. Considering that texture and echogenicity are correlated with thyroid malignancy, the proposed approach encodes these sonographic features via a noise-resistant representation. This representation is suitable for the discrimination of nodules of high malignancy risk from normal thyroid parenchyma. The material used in this study includes a total of 250 thyroid ultrasound patterns obtained from 75 patients in Greece. The patterns are represented by fused vectors of fuzzy features. Ultrasound texture is represented by fuzzy local binary patterns, whereas echogenicity is represented by fuzzy intensity histograms. The encoded thyroid ultrasound patterns are discriminated by support vector classifiers. The proposed approach was comprehensively evaluated using receiver operating characteristics (ROCs). The results show that the proposed fusion scheme outperforms previous thyroid ultrasound pattern representation methods proposed in the literature. The best classification accuracy was obtained with a polynomial kernel support vector machine, and reached 97.5% as estimated by the area under the ROC curve. The fusion of fuzzy local binary patterns and fuzzy grey-level histogram features is more effective than the state of the art approaches for the representation of thyroid ultrasound patterns and can be effectively utilized for the detection of nodules of high malignancy risk in the context of an intelligent medical system. Copyright (c) 2010 Elsevier B.V. All rights reserved.
International Nuclear Information System (INIS)
Corti, D.S.; Debenedetti, P.G.
1998-01-01
The rigorous statistical mechanics of metastability requires the imposition of internal constraints that prevent access to regions of phase space corresponding to inhomogeneous states. We derive exactly the Helmholtz energy and equation of state of the one-dimensional hard rod fluid under the influence of an internal constraint that places an upper bound on the distance between nearest-neighbor rods. This type of constraint is relevant to the suppression of boiling in a superheated liquid. We determine the effects of this constraint upon the thermophysical properties and internal structure of the hard rod fluid. By adding an infinitely weak and infinitely long-ranged attractive potential to the hard core, the fluid exhibits a first-order vapor-liquid transition. We determine exactly the equation of state of the one-dimensional superheated liquid and show that it exhibits metastable phase equilibrium. We also derive statistical mechanical relations for the equation of state of a fluid under the action of arbitrary constraints, and show the connection between the statistical mechanics of constrained and unconstrained ensembles. copyright 1998 The American Physical Society
Hansen, John P
2003-01-01
Healthcare quality improvement professionals need to understand and use inferential statistics to interpret sample data from their organizations. In quality improvement and healthcare research studies all the data from a population often are not available, so investigators take samples and make inferences about the population by using inferential statistics. This three-part series will give readers an understanding of the concepts of inferential statistics as well as the specific tools for calculating confidence intervals for samples of data. This article, Part 2, describes probability, populations, and samples. The uses of descriptive and inferential statistics are outlined. The article also discusses the properties and probability of normal distributions, including the standard normal distribution.
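The normal-distribution machinery introduced here, applied to a confidence interval for a mean, can be sketched as follows (the readings are hypothetical, and a t-interval would be preferred for samples this small):

```python
import math
from statistics import NormalDist, mean, stdev

def mean_confidence_interval(sample, conf=0.95):
    """Normal-approximation confidence interval for a population mean,
    using the standard normal quantile (e.g. 1.96 for 95%)."""
    m = mean(sample)
    se = stdev(sample) / math.sqrt(len(sample))
    z = NormalDist().inv_cdf((1.0 + conf) / 2.0)
    return m - z * se, m + z * se

readings = [7.1, 6.8, 7.4, 7.0, 6.9, 7.3, 7.2, 6.7]  # hypothetical lab values
lo, hi = mean_confidence_interval(readings)
```

Raising the confidence level widens the interval: more confidence that the population mean is covered costs precision.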
Kulikov, Mikhail Y.; Nechaev, Anton A.; Belikovich, Mikhail V.; Ermakova, Tatiana S.; Feigin, Alexander M.
2018-05-01
This Technical Note presents a statistical approach to evaluating simultaneous measurements of several atmospheric components under the assumption of photochemical equilibrium. We consider simultaneous measurements of OH, HO2, and O3 at the altitudes of the mesosphere as a specific example and their daytime photochemical equilibrium as an evaluating relationship. A simplified algebraic equation relating local concentrations of these components in the 50-100 km altitude range has been derived. The parameters of the equation are temperature, neutral density, local zenith angle, and the rates of eight reactions. We have performed a one-year simulation of the mesosphere and lower thermosphere using a 3-D chemical-transport model. The simulation shows that the discrepancy between the calculated evolution of the components and the equilibrium value given by the equation does not exceed 3-4 % in the full range of altitudes independent of season or latitude. We have developed a statistical Bayesian evaluation technique for simultaneous measurements of OH, HO2, and O3 based on the equilibrium equation taking into account the measurement error. The first results of the application of the technique to MLS/Aura data (Microwave Limb Sounder) are presented in this Technical Note. It has been found that the satellite data of the HO2 distribution regularly demonstrate lower altitudes of this component's mesospheric maximum. This has also been confirmed by model HO2 distributions and comparison with offline retrieval of HO2 from the daily zonal means MLS radiance.
Meynard, Christine N; Migeon, Alain; Navajas, Maria
2013-01-01
Many species are shifting their distributions due to climate change and to increasing international trade that allows dispersal of individuals across the globe. In the case of agricultural pests, such range shifts may heavily impact agriculture. Species distribution modelling may help to predict potential changes in pest distributions. However, these modelling strategies are subject to large uncertainties coming from different sources. Here we used the case of the tomato red spider mite (Tetranychus evansi), an invasive pest that affects some of the most important agricultural crops worldwide, to show how uncertainty may affect forecasts of the potential range of the species. We explored three aspects of uncertainty: (1) species prevalence; (2) modelling method; and (3) variability in environmental responses between mites belonging to two invasive clades of T. evansi. Consensus techniques were used to forecast the potential range of the species under current and two different climate change scenarios for 2080, and variance between model projections were mapped to identify regions of high uncertainty. We revealed large predictive variations linked to all factors, although prevalence had a greater influence than the statistical model once the best modelling strategies were selected. The major areas threatened under current conditions include tropical countries in South America and Africa, and temperate regions in North America, the Mediterranean basin and Australia. Under future scenarios, the threat shifts towards northern Europe and some other temperate regions in the Americas, whereas tropical regions in Africa present a reduced risk. Analysis of niche overlap suggests that the current differential distribution of mites of the two clades of T. evansi can be partially attributed to environmental niche differentiation. Overall this study shows how consensus strategies and analysis of niche overlap can be used jointly to draw conclusions on invasive threat
Statistical inference for the additive hazards model under outcome-dependent sampling.
Yu, Jichang; Liu, Yanyan; Sandler, Dale P; Zhou, Haibo
2015-09-01
Cost-effective study designs and proper inference procedures for data from such designs are always of particular interest to study investigators. In this article, we propose a biased sampling scheme, an outcome-dependent sampling (ODS) design, for survival data with right censoring under the additive hazards model. We develop a weighted pseudo-score estimator for the regression parameters for the proposed design and derive the asymptotic properties of the proposed estimator. We also provide some suggestions for using the proposed method by evaluating its relative efficiency against the simple random sampling design, and derive the optimal allocation of the subsamples for the proposed design. Simulation studies show that the proposed ODS design is more powerful than other existing designs and the proposed estimator is more efficient than other estimators. We apply our method to a cancer study conducted at NIEHS, the Cancer Incidence and Mortality of Uranium Miners Study, to examine the cancer risk associated with radon exposure.
Crouch, Daniel J M
2017-10-27
The prevalence of sexual reproduction remains mysterious, as it poses clear evolutionary drawbacks compared to reproducing asexually. Several possible explanations exist, with one of the most likely being that finite population size causes linkage disequilibria to randomly generate and impede the progress of natural selection, and that these are eroded by recombination via sexual reproduction. Previous investigations have either analysed this phenomenon in detail for small numbers of loci, or performed population simulations for many loci. Here we present a quantitative genetic model for fitness, based on the Price Equation, in order to examine the theoretical consequences of randomly generated linkage disequilibria when there are many loci. In addition, most previous work has been concerned with the long-term consequences of deleterious linkage disequilibria for population fitness. The expected change in mean fitness between consecutive generations, a measure of short-term evolutionary success, is shown under random environmental influences to be related to the autocovariance in mean fitness between the generations, capturing the effects of stochastic forces such as genetic drift. Interaction between genetic drift and natural selection, due to randomly generated linkage disequilibria, is demonstrated to be one possible source of mean fitness autocovariance. This suggests a possible role for sexual reproduction in reducing the negative effects of genetic drift, thereby improving the short-term efficacy of natural selection. Copyright © 2017 Elsevier Ltd. All rights reserved.
Gillis, D.M.; Rijnsdorp, A.D.; Poos, J.J.
2008-01-01
The objective identification of targeting behavior in multispecies fisheries is critical to the development and evaluation of management measures. Here, we illustrate how the statistical distribution of commercial catches can provide information on species preference that is consistent with economic
Statistical analysis of wind speed using two-parameter Weibull distribution in Alaçatı region
International Nuclear Information System (INIS)
Ozay, Can; Celiktas, Melih Soner
2016-01-01
Highlights: • Wind speed & direction data from September 2008 to March 2014 have been analyzed. • Mean wind speed for the whole data set has been found to be 8.11 m/s. • The highest wind speed is observed in July, with a monthly mean value of 9.10 m/s. • The wind speed carrying the most energy has been calculated as 12.77 m/s. • The observed data have been fit to a Weibull distribution, and the k and c parameters have been calculated as 2.05 and 9.16. - Abstract: The Weibull statistical distribution is a common method for analyzing wind speed measurements and determining wind energy potential. The Weibull probability density function can be used to forecast wind speed, wind density and wind energy potential. In this study a two-parameter Weibull statistical distribution is used to analyze the wind characteristics of the Alaçatı region, located in Çeşme, İzmir. The data used in the density function were acquired from a wind measurement station in Alaçatı. Measurements were gathered at three different heights (70, 50 and 30 m) at 10 min intervals for five and a half years. As a result of this study, the wind speed frequency distribution, wind direction trends, mean wind speed, and the shape and scale (k and c) Weibull parameters have been calculated for the region. The mean wind speed for the entire data set is found to be 8.11 m/s, and the k and c parameters are found to be 2.05 and 9.16, respectively. A wind direction analysis, along with a wind rose graph for the region, is also provided. The analysis suggests that higher wind speeds, ranging from 6 to 12 m/s, are prevalent in the 340–360° sector, while lower wind speeds, from 3 to 6 m/s, occur in the 10–29° sector. The results of this study contribute to the general knowledge of the region's wind energy potential and can be used as a source for investors and academics.
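The derived quantities reported above follow from standard two-parameter Weibull formulas; a sketch that reproduces the abstract's figures from the fitted k and c:

```python
import math

def weibull_mean(k, c):
    """Mean of a Weibull(k, c) distribution: c * Gamma(1 + 1/k)."""
    return c * math.gamma(1.0 + 1.0 / k)

def weibull_most_energetic(k, c):
    """Wind speed carrying the most energy: c * (1 + 2/k)**(1/k)."""
    return c * (1.0 + 2.0 / k) ** (1.0 / k)

k, c = 2.05, 9.16   # shape and scale fitted for Alaçatı
v_mean = weibull_mean(k, c)            # ≈ 8.11 m/s, matching the abstract
v_emax = weibull_most_energetic(k, c)  # ≈ 12.77 m/s, matching the abstract
```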
International Nuclear Information System (INIS)
Zheng, Feihu; An, Zhenlian; Zhang, Yewen; Liu, Chuandong; Lin, Chen; Lei, Qingquan
2013-01-01
The thermal pulse method is a powerful method to measure space charge and polarization distributions in thin dielectric films, but a complicated calibration procedure is necessary to obtain the real distribution. In addition, charge dynamic behaviour under an applied electric field cannot be observed by the classical thermal pulse method. In this work, an improved thermal pulse measuring system with a supplemental circuit for applying high voltage is proposed to realize the mapping of charge distribution in thin dielectric films under an applied field. The influence of the modified measuring system on the amplitude and phase of the thermal pulse response current are evaluated. Based on the new measuring system, an easy calibration approach is presented with some practical examples. The newly developed system can observe space charge evolution under an applied field, which would be very helpful in understanding space charge behaviour in thin films. (paper)
International Nuclear Information System (INIS)
Silva Junior, H.C. da.
1978-12-01
Reactor fuel elements generally consist of rod bundles with the coolant flowing axially through the region between the rods. The reliability of the thermohydraulic design of such elements depends on a detailed description of the velocity field. A two-equation statistical model (K-epsilon) of turbulence is applied to compute the main and secondary flow fields, wall shear stress distributions and friction factors of steady, fully developed turbulent flows, with an incompressible, temperature-independent fluid flowing axially through triangular or square arrays of rod bundles. The numerical procedure uses the vorticity and the stream function to describe the velocity field. Comparison with experimental and analytical data of several investigators is presented, and results are in good agreement. (Author)
Mwakanyamale, Kisa; Day-Lewis, Frederick D.; Slater, Lee D.
2013-01-01
Fiber-optic distributed temperature sensing (FO-DTS) is increasingly used to map zones of focused groundwater/surface-water exchange (GWSWE). Previous studies of GWSWE using FO-DTS identified zones of focused exchange based on arbitrary cutoffs of FO-DTS time-series statistics (e.g., variance, cross-correlation between temperature and stage, or spectral power). New approaches are needed to extract more quantitative information from large, complex FO-DTS data sets while concurrently providing an assessment of the uncertainty associated with mapping zones of focused GWSWE. Toward this end, we present a strategy combining discriminant analysis (DA) and spectral analysis (SA). We demonstrate the approach using field experimental data from a reach of the Columbia River adjacent to the Hanford 300 Area site. Results of the combined SA/DA approach are shown to be superior to previous results from qualitative interpretation of FO-DTS spectra alone.
Craven, Galen T.; Nitzan, Abraham
2018-01-01
Statistical properties of Brownian motion that arise by analyzing, separately, trajectories over which the system energy increases (upside) or decreases (downside) with respect to a threshold energy level are derived. This selective analysis is applied to examine transport properties of a nonequilibrium Brownian process that is coupled to multiple thermal sources characterized by different temperatures. Distributions, moments, and correlation functions of a free particle that occur during upside and downside events are investigated for energy activation and energy relaxation processes and also for positive and negative energy fluctuations from the average energy. The presented results are sufficiently general and can be applied without modification to the standard Brownian motion. This article focuses on the mathematical basis of this selective analysis. In subsequent articles in this series, we apply this general formalism to processes in which heat transfer between thermal reservoirs is mediated by activated rate processes that take place in a system bridging them.
Directory of Open Access Journals (Sweden)
Kurniasih Anis
2017-01-01
Full Text Available Analysis of foraminifera in geology is usually used to determine the age of rocks/sediments and the depositional environment. In this study, recent foraminifera were used not only to determine the sedimentary environment, but also to estimate the ecological condition of the water through a statistical approach. Analysis was performed quantitatively on 10 surface seabed sediment samples from Weda Bay, North Maluku. The analysis includes dominance (Simpson index), diversity and evenness (Shannon index), and the planktonic-benthic ratio. The results were plotted in an M-R-T (Miliolid-Rotalid-Textularid) diagram to determine the depositional environment. Quantitative analysis was performed using the PAST (PAleontological STatistics) software, version 1.29. The analysis showed no domination by any particular taxon, a moderate degree of evenness with stable communities, and moderate diversity. These results indicate that the research area has stable water conditions with optimum levels of carbonate content, oxygen supply, salinity, and temperature. The ratio of planktonic to benthic foraminifera indicates relative depth: the deeper the water, the higher the percentage of planktonic foraminifera. The M-R-T diagram shows the sediments were deposited in an exposed carbonate (carbonate platform) environment with normal salinity.
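The indices used (Simpson dominance, Shannon diversity, Pielou evenness, planktonic/benthic ratio) can be sketched as follows; the counts are hypothetical:

```python
import math

def simpson_dominance(counts):
    """Simpson dominance D = sum p_i**2; values near 1 mean one taxon dominates."""
    n = sum(counts)
    return sum((c / n) ** 2 for c in counts)

def shannon_diversity(counts):
    """Shannon index H' = -sum p_i * ln p_i."""
    n = sum(counts)
    return -sum((c / n) * math.log(c / n) for c in counts if c > 0)

def evenness(counts):
    """Pielou evenness J = H' / ln S; 1 means perfectly even abundances."""
    s = sum(1 for c in counts if c > 0)
    return shannon_diversity(counts) / math.log(s)

def pb_ratio(planktonic, benthic):
    """Percentage of planktonic tests, a rough proxy for relative depth."""
    return 100.0 * planktonic / (planktonic + benthic)

counts = [30, 30, 30, 30]  # four equally abundant taxa
```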
Spampinato, A.; Axinte, D. A.
2017-12-01
The mechanisms of interaction between bodies with statistically arranged features present characteristics common to different abrasive processes, such as dressing of abrasive tools. In contrast with the current empirical approach used to estimate the results of operations based on attritive interactions, the method we present in this paper allows us to predict the output forces and the topography of a simulated grinding wheel for a set of specific operational parameters (speed ratio and radial feed-rate), providing a thorough understanding of the complex mechanisms regulating these processes. In modelling the dressing mechanisms, the abrasive characteristics of both bodies (grain size, geometry, inter-space and protrusion) are first simulated; thus, their interaction is simulated in terms of grain collisions. Exploiting a specifically designed contact/impact evaluation algorithm, the model simulates the collisional effects of the dresser abrasives on the grinding wheel topography (grain fracture/break-out). The method has been tested for the case of a diamond rotary dresser, predicting output forces within less than 10% error and obtaining experimentally validated grinding wheel topographies. The study provides a fundamental understanding of the dressing operation, enabling the improvement of its performance in an industrial scenario, while being of general interest in modelling collision-based processes involving statistically distributed elements.
Kuss, Oliver; Hoyer, Annika; Solms, Alexander
2014-01-15
There are still challenges when meta-analyzing data from studies on diagnostic accuracy. This is mainly due to the bivariate nature of the response, where information on sensitivity and specificity must be summarized while accounting for their correlation within a single trial. In this paper, we propose a new statistical model for the meta-analysis of diagnostic accuracy studies. This model uses beta-binomial distributions for the marginal numbers of true positives and true negatives and links these margins by a bivariate copula distribution. The new model comes with all the features of the current standard model, a bivariate logistic regression model with random effects, but has the additional advantages of a closed-form likelihood and greater flexibility for the correlation structure of sensitivity and specificity. In a simulation study comparing three copula models and two implementations of the standard model, the Plackett and the Gauss copula rarely perform worse, and frequently perform better, than the standard model. For illustration, we use an example from a meta-analysis judging the diagnostic accuracy of telomerase (a urinary tumor marker) for the diagnosis of primary bladder cancer. Copyright © 2013 John Wiley & Sons, Ltd.
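The copula link between discrete beta-binomial margins described here can be sketched via rectangle probabilities of the joint CDF. The Plackett copula below uses its standard closed form; all parameter values are illustrative, not fitted to any study:

```python
import numpy as np
from scipy.stats import betabinom

def plackett_cdf(u, v, theta):
    # Plackett copula C(u, v; theta); theta = 1 gives independence.
    if abs(theta - 1.0) < 1e-12:
        return u * v
    s = 1.0 + (theta - 1.0) * (u + v)
    return (s - np.sqrt(s * s - 4.0 * theta * (theta - 1.0) * u * v)) / (2.0 * (theta - 1.0))

def joint_pmf(tp, tn, n_dis, n_ndis, a1, b1, a2, b2, theta):
    """P(TP = tp, TN = tn) with beta-binomial margins (sample sizes
    n_dis diseased, n_ndis non-diseased) linked by a Plackett copula,
    computed as a rectangle probability on the discrete CDFs."""
    F = lambda x: betabinom.cdf(x, n_dis, a1, b1) if x >= 0 else 0.0
    G = lambda y: betabinom.cdf(y, n_ndis, a2, b2) if y >= 0 else 0.0
    return (plackett_cdf(F(tp), G(tn), theta)
            - plackett_cdf(F(tp - 1), G(tn), theta)
            - plackett_cdf(F(tp), G(tn - 1), theta)
            + plackett_cdf(F(tp - 1), G(tn - 1), theta))
```

Because the margins and copula are in closed form, the full log-likelihood is a sum of logs of such rectangle probabilities, which is the "closed likelihood" advantage the abstract refers to.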
Haaser, R. A.
2011-12-01
The Ion Velocity Meter (IVM), part of the Coupled Ion Neutral Dynamics Investigation (CINDI) aboard the Communication/Navigation Outage Forecasting System (C/NOFS) satellite, is used to measure in situ ion densities and drifts at altitudes between 400 and 550 km during the nighttime hours from 2100 to 0300 local time. A new approach to detecting and classifying well-formed ionospheric plasma depletion and enhancement plumes (bubbles and blobs) of scale sizes between 50 and 500 km is used to develop geophysical statistics for the summer, winter and equinox seasons of the quiet solar conditions during 2009 and 2010. Some diurnal and seasonal geomagnetic distribution characteristics confirm previous work on irregularities and scintillations, while others reveal new behaviors that require additional observations and modeling to promote full understanding.
Energy Technology Data Exchange (ETDEWEB)
Chen, Hao [School of Tourism and Environment, Shaanxi Normal University, Xi'an 710062 (China); Lu, Xinwei, E-mail: luxinwei@snnu.edu.cn [School of Tourism and Environment, Shaanxi Normal University, Xi'an 710062 (China); Li, Loretta Y., E-mail: lli@civil.ubc.ca [Department of Civil Engineering, University of British Columbia, Vancouver V6T 1Z4 (Canada); Gao, Tianning; Chang, Yuyu [School of Tourism and Environment, Shaanxi Normal University, Xi'an 710062 (China)
2014-06-01
The concentrations of As, Ba, Co, Cr, Cu, Mn, Ni, Pb, V and Zn in campus dust from kindergartens, elementary schools, middle schools and universities of Xi'an, China were determined by X-ray fluorescence spectrometry. Correlation coefficient analysis, principal component analysis (PCA) and cluster analysis (CA) were used to analyze the data and to identify possible sources of these metals in the dust. The spatial distributions of metals in urban dust of Xi'an were analyzed based on the metal concentrations in campus dusts using the geostatistics method. The results indicate that dust samples from campuses have elevated metal concentrations, especially for Pb, Zn, Co, Cu, Cr and Ba, with the mean values of 7.1, 5.6, 3.7, 2.9, 2.5 and 1.9 times the background values for Shaanxi soil, respectively. The enrichment factor results indicate that Mn, Ni, V, As and Ba in the campus dust were deficiently to minimally enriched, mainly affected by nature and partly by anthropogenic sources, while Co, Cr, Cu, Pb and Zn in the campus dust and especially Pb and Zn were mostly affected by human activities. As and Cu, Mn and Ni, Ba and V, and Pb and Zn had similar distribution patterns. The southwest high-tech industrial area and south commercial and residential areas have relatively high levels of most metals. Three main sources were identified based on correlation coefficient analysis, PCA, CA, as well as spatial distribution characteristics. As, Ni, Cu, Mn, Pb, Zn and Cr have mixed sources — nature, traffic, as well as fossil fuel combustion and weathering of materials. Ba and V are mainly derived from nature, but partly also from industrial emissions, as well as construction sources, while Co principally originates from construction. - Highlights: • Metal content in dust from schools was determined by XRF. • Spatial distribution of metals in urban dust was focused on campus samples. • Multivariate statistic and spatial distribution were used to identify metal sources.
A novel stress distribution analytical model of O-ring seals under different properties of materials
International Nuclear Information System (INIS)
Wu, Di; Wang, Shao Ping; Wang, Xing Jian
2017-01-01
Elastomeric O-ring seals have been widely used as sealing elements in hydraulic systems. The sealing performance of O-ring seals is related to the stress distribution, which depends on the squeeze rate and internal pressure and varies with the properties of the O-ring material. Thus, in order to study the sealing performance of O-ring seals, it is necessary to describe the analytical relationship between the stress distribution and the material properties. For this purpose, a novel stress distribution analytical model (SDAM) is proposed in this paper. The model utilizes two complex stress functions to describe the stress distribution of O-ring seals. The proposed SDAM can express the analytical relationship between the stress distribution and not only Young's modulus but also Poisson's ratio. Finally, comparison between finite element analysis and the SDAM validates that the proposed model can effectively reveal the stress distribution under different O-ring material properties.
[Effects of sampling plot number on tree species distribution prediction under climate change].
Liang, Yu; He, Hong-Shi; Wu, Zhi-Wei; Li, Xiao-Na; Luo, Xu
2013-05-01
Based on the neutral landscapes under different degrees of landscape fragmentation, this paper studied the effects of sampling plot number on the prediction of tree species distribution at landscape scale under climate change. The tree species distribution was predicted by the coupled modeling approach which linked an ecosystem process model with a forest landscape model, and three contingent scenarios and one reference scenario of sampling plot numbers were assumed. The differences between the three scenarios and the reference scenario under different degrees of landscape fragmentation were tested. The results indicated that the effects of sampling plot number on the prediction of tree species distribution depended on the tree species life history attributes. For the generalist species, the prediction of their distribution at landscape scale needed more plots. Except for the extreme specialist, landscape fragmentation degree also affected the effects of sampling plot number on the prediction. With the increase of simulation period, the effects of sampling plot number on the prediction of tree species distribution at landscape scale could be changed. For generalist species, more plots are needed for the long-term simulation.
ATLAS, Collaboration
2013-01-01
Expected distributions of the test statistics q=log(L(0^+)/L(2^+)) for the spin-0 and spin-2 (produced by gluon fusion) hypotheses. The observed value is indicated by a vertical line. The coloured areas correspond to the integrals of the expected distributions used to compute the p-values for the rejection of each hypothesis.
Directory of Open Access Journals (Sweden)
Chi-Cheng Liao
2012-09-01
Treelines have been found to be lower on small isolated hilltops, but the specific dynamics behind this unique phenomenon are unknown. This study investigates the distribution patterns of woody plants in Yangmingshan National Park (YMSNP), northern Taiwan, in search of the limitation mechanisms unique to small isolated hills, and evaluates potential threats under global warming. Forests are distributed between 200 and 900 m above sea level (ASL). Remnant forest fragments between 400 and 900 m ASL have the highest species richness and should be protected to ensure future forest recovery from the former extensive artificial disturbance. The lower boundary is threatened by urban and agricultural development. The lack of native woody species in these low-elevation zones may create a gap susceptible to invasive species. A consistent forest line at 100 m below the mountain tops, regardless of elevation, suggests a topography-induced rather than an elevation-related limiting mechanism. Therefore, the upward shift of forests caused by global warming might be limited to 100 m below the hilltops of small isolated hills because of topography-related factors. The spatial range of woody plants along the altitudinal gradient is thus likely to become narrower under the combined pressures of global warming, limited elevation, exposure-related stress, and artificial disturbance. Suggested management priorities for forest recovery include preservation of remnant forest fragments, increasing forest connectivity, and increasing seedling establishment in the grasslands.
Hotspot detection using space-time scan statistics on children under five years of age in Depok
Verdiana, Miranti; Widyaningsih, Yekti
2017-03-01
Some problems that affect the health level in Depok are the high malnutrition rates from year to year and the increasing spread of infectious and non-communicable diseases in some areas. Children under five years old are a part of the population vulnerable to malnutrition and disease. For this reason, it is important to observe where and when malnutrition in Depok occurred with high intensity. To obtain the locations and times of hotspots of malnutrition and diseases affecting children under five, the space-time scan statistic can be used. The space-time scan statistic is a hotspot detection method in which area and time information are taken into account simultaneously. The method detects hotspots with a cylindrical scanning window: the cylinder base describes the area and the cylinder height describes the time. Each cylinder formed is a candidate hotspot, which requires hypothesis testing to conclude whether it is a hotspot. Hotspot detection in this study was carried out for several combinations of variables. Some combinations provide hotspot detection results that tend to be the same, and thus form groups (clusters). For the level of health of children under five in Depok city, the Beji health care center region in 2011-2012 was a hotspot. Across the combinations of variables used, the Beji health care center appeared most frequently as a hotspot. It is hoped that the local government can adopt the right policies to improve the health of children under five in the city of Depok.
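The cylinder-based scan described above can be sketched with Kulldorff's Poisson log-likelihood ratio for each candidate cylinder. The brute-force search below (one region over a contiguous run of periods) is a simplification of real scan-statistic software, and the case counts are invented for illustration:

```python
import math

def poisson_llr(c, e, C):
    """Kulldorff log-likelihood ratio for a cylinder with c observed
    and e expected cases out of C total; zero unless c exceeds e."""
    if c <= e or e <= 0:
        return 0.0
    llr = c * math.log(c / e)
    if C - c > 0:
        llr += (C - c) * math.log((C - c) / (C - e))
    return llr

def scan_cylinders(cases, expected):
    """Scan all (region, time-window) cylinders for the highest LLR.
    `cases`/`expected` are dicts keyed by (region, period)."""
    C = sum(cases.values())
    regions = {r for r, _ in cases}
    periods = sorted({t for _, t in cases})
    best = (0.0, None)
    for r in regions:
        for i in range(len(periods)):
            for j in range(i, len(periods)):
                win = periods[i:j + 1]
                c = sum(cases.get((r, t), 0) for t in win)
                e = sum(expected.get((r, t), 0.0) for t in win)
                llr = poisson_llr(c, e, C)
                if llr > best[0]:
                    best = (llr, (r, win[0], win[-1]))
    return best
```

In practice the significance of the best cylinder is then assessed by Monte Carlo replication under the null, which is the hypothesis-testing step the abstract mentions.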
Probabilistic accounting of uncertainty in forecasts of species distributions under climate change
Seth J. Wenger; Nicholas A. Som; Daniel C. Dauwalter; Daniel J. Isaak; Helen M. Neville; Charles H. Luce; Jason B. Dunham; Michael K. Young; Kurt D. Fausch; Bruce E. Rieman
2013-01-01
Forecasts of species distributions under future climates are inherently uncertain, but there have been few attempts to describe this uncertainty comprehensively in a probabilistic manner. We developed a Monte Carlo approach that accounts for uncertainty within generalized linear regression models (parameter uncertainty and residual error), uncertainty among competing...
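A minimal sketch of the kind of Monte Carlo propagation described: draw coefficient vectors from their estimated sampling distribution, add residual error, and push each draw through the model. A logistic GLM is assumed for concreteness, and the coefficient vector, covariance and residual scale are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_presence_forecast(beta_hat, cov_beta, x_future, sigma_resid, n_draws=10000):
    """Propagate parameter uncertainty (multivariate-normal draws around
    the fitted coefficients) and residual error through a logistic GLM,
    returning Monte Carlo draws of occurrence probability at x_future."""
    betas = rng.multivariate_normal(beta_hat, cov_beta, size=n_draws)
    eta = betas @ np.asarray(x_future) + rng.normal(0.0, sigma_resid, size=n_draws)
    return 1.0 / (1.0 + np.exp(-eta))
```

The spread of the returned draws (e.g. a 2.5-97.5 percentile interval) is the probabilistic statement of forecast uncertainty the abstract calls for.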
Void fraction distribution in a heated rod bundle under flow stagnation conditions
Energy Technology Data Exchange (ETDEWEB)
Herrero, V.A.; Guido-Lavalle, G.; Clausse, A. [Centro Atomico Bariloche and Instituto Balseiro, Bariloche (Argentina)
1995-09-01
An experimental study was performed to determine the axial void fraction distribution along a heated rod bundle under flow stagnation conditions. The development of the flow pattern was investigated for different heat flow rates. It was found that in general the void fraction is overestimated by the Zuber & Findlay model while the Chexal-Lellouche correlation produces a better prediction.
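The Zuber & Findlay drift-flux model referenced above predicts the void fraction as alpha = j_g / (C0*j + V_gj); a minimal sketch, with the distribution parameter C0 and drift velocity V_gj set to illustrative placeholder values rather than values from this study:

```python
def zuber_findlay_void_fraction(j_g, j, C0=1.13, v_gj=0.25):
    """Drift-flux estimate of void fraction:
        alpha = j_g / (C0 * j + v_gj)
    j_g: superficial gas velocity [m/s]; j: total superficial velocity [m/s].
    Default C0 and v_gj are hypothetical illustrative values."""
    return j_g / (C0 * j + v_gj)
```

Since C0 > 1 and v_gj > 0 both reduce alpha below the homogeneous value j_g/j, an overestimate such as the one reported implies the fitted C0 or v_gj for this bundle should be larger than the model's defaults.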
DEFF Research Database (Denmark)
Boe-Hansen, Rasmus; Martiny, Adam Camillo; Arvin, Erik
2003-01-01
In this study, the construction of a model distribution system suitable for studies of attached and suspended microbial activity in drinking water under controlled circumstances is outlined. The model system consisted of two loops connected in series with a total of 140 biofilm sampling points...
Robustness of the Drinking Water Distribution Network under Changing Future Demand
Agudelo-Vera, C.; Blokker, M.; Vreeburg, J.; Bongard, T.; Hillegers, S.; Van der Hoek, J.P.
2014-01-01
A methodology to determine the robustness of the drinking water distribution system is proposed. The performance of three networks under ten future demand scenarios was tested, using head loss and residence time as indicators. The scenarios consider technological and demographic changes. Daily...
The temporal distribution of directional gradients under selection for an optimum.
Chevin, Luis-Miguel; Haller, Benjamin C
2014-12-01
Temporal variation in phenotypic selection is often attributed to environmental change causing movements of the adaptive surface relating traits to fitness, but this connection is rarely established empirically. Fluctuating phenotypic selection can be measured by the variance and autocorrelation of directional selection gradients through time. However, the dynamics of these gradients depend not only on environmental changes altering the fitness surface, but also on evolution of the phenotypic distribution. Therefore, it is unclear to what extent variability in selection gradients can inform us about the underlying drivers of their fluctuations. To investigate this question, we derive the temporal distribution of directional gradients under selection for a phenotypic optimum that is either constant or fluctuates randomly in various ways in a finite population. Our analytical results, combined with population- and individual-based simulations, show that although some characteristic patterns can be distinguished, very different types of change in the optimum (including a constant optimum) can generate similar temporal distributions of selection gradients, making it difficult to infer the processes underlying apparent fluctuating selection. Analyzing changes in phenotype distributions together with changes in selection gradients should prove more useful for inferring the mechanisms underlying estimated fluctuating selection. © 2014 The Author(s). Evolution © 2014 The Society for the Study of Evolution.
Jeffrey H. Gove
2003-01-01
Many of the most popular sampling schemes used in forestry are probability proportional to size methods. These methods are also referred to as size biased because sampling is actually from a weighted form of the underlying population distribution. Length- and area-biased sampling are special cases of size-biased sampling where the probability weighting comes from a...
Yang, Hui-Feng; Zheng, Jiang-Hua; Jia, Xiao-Guang; Li, Xiao-Jin
2017-03-01
Apocynum venetum (Apocynaceae) is a perennial medicinal plant whose stem is an important textile raw material. Projecting the potential geographic distribution of A. venetum is important for the protection and sustainable utilization of the plant. This study was conducted to determine the potential geographic distribution of A. venetum and to project how climate change would affect it. The potential geographic distribution of A. venetum under current bioclimatic conditions in northern China was simulated using MaxEnt software based on species presence data at 44 locations and 19 bioclimatic parameters. Future distributions of A. venetum were also projected for 2050 and 2070 under the RCP2.6 and RCP8.5 climate change scenarios described in the 5th Assessment Report of the Intergovernmental Panel on Climate Change (IPCC). The results showed that the minimum air temperature of the coldest month, annual mean air temperature, precipitation of the coldest quarter and mean air temperature of the wettest quarter dominated the geographic distribution of A. venetum. Under the current climate, suitable habitat for A. venetum covers 11.94% of China, mainly located in central Xinjiang, northern Gansu, southern Neimeng, northern Ningxia, central and northern Shaanxi, southern Shanxi, central and northern Henan, central and southern Hebei, Shandong, Tianjin, southern Liaoning and part of Beijing. From 2050 to 2070, the model outputs indicated that the suitable habitat of A. venetum would decrease under both the RCP2.6 and RCP8.5 climate change scenarios. Copyright© by the Chinese Pharmaceutical Association.
Evaluation of 137Cs and 40K distribution in soil under tree crown
International Nuclear Information System (INIS)
Narmontas, A.; Butkus, D.
2003-01-01
This work analyses the vertical and horizontal distribution of 137Cs and 40K in soil under tree crowns. 137Cs and 40K are of different natures: 137Cs is an artificial radionuclide and 40K a natural one, so they have different migration properties. A major influence on the environment came from the accident at the Chernobyl nuclear power plant in 1986; in addition, the environment was polluted by radioactive elements during nuclear weapon tests. After the Chernobyl accident, 137Cs was distributed unevenly among the different components of forest soil. Its vertical migration depends on diffusion, convection and transport via tree roots. The vertical and horizontal distributions of radionuclides in the soil of birch and pine habitats are estimated, and the influences of the tree habitat environment, the tree crown and the dominant winds on the distribution of radionuclides in soil are determined. Soil sampling and measurement methods are also discussed. (author)
International Nuclear Information System (INIS)
Puangbut, D.; Vorasoot, N.
2018-01-01
Root length density and rooting depth have been established as drought-resistance traits, and these could be used as selection criteria for drought-resistant genotypes in many plant species. However, information on deep rooting and the root distribution pattern of Jerusalem artichoke under drought conditions is not well documented in the literature. The objective of this study was to investigate the root distribution pattern of Jerusalem artichoke genotypes under irrigated and drought conditions. The experiment was conducted in a greenhouse using rhizoboxes. Three Jerusalem artichoke genotypes were tested under two water regimes (irrigated and drought). A 2 × 3 factorial experiment was arranged in a randomized complete block design with three replications over two years. Data were recorded for root traits, photosynthesis and biomass at 30 days after imposing drought. Drought decreased root length, root surface area and root dry weight, while it increased the root:shoot ratio, root distribution in the deeper soil and the percentage of root length in the deeper soil layers, compared to the irrigated conditions. JA-5 and JA-60 showed high root length in the lower soil profile under drought conditions, indicating that these genotypes could be identified as drought resistant. The highest positive correlations were found between root length in the deeper soil layer and relative water content (RWC), net photosynthetic rate (Pn) and biomass. It is expected that selection of Jerusalem artichoke genotypes with high root length, coupled with maintenance of high RWC and its promotion of Pn, could improve biomass and tuber yield under drought conditions. (author)
Directory of Open Access Journals (Sweden)
SANKU DEY
2010-11-01
The generalized exponential (GE) distribution proposed by Gupta and Kundu (1999) is an important lifetime distribution in survival analysis. In this article, we propose to obtain Bayes estimators and their associated risk based on a class of non-informative priors under three loss functions, namely the quadratic loss function (QLF), the squared log-error loss function (SLELF) and the general entropy loss function (GELF). The motivation is to explore the most appropriate loss function among these three. The performances of the estimators are therefore compared on the basis of their risks obtained under QLF, SLELF and GELF separately. The relative efficiency of the estimators is also obtained. Finally, Monte Carlo simulations are performed to compare the performances of the Bayes estimates under different situations.
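Under the three loss functions named, the Bayes estimators of a positive parameter have standard closed forms in terms of posterior expectations: the posterior mean under QLF, exp(E[ln theta]) under SLELF, and (E[theta^-c])^(-1/c) under GELF with shape c. A sketch that evaluates them from posterior draws (the draws themselves are illustrative inputs, not tied to the GE model):

```python
import numpy as np

def bayes_estimates(posterior_draws, c=1.0):
    """Bayes estimators of a positive parameter from posterior draws:
    QLF -> posterior mean;
    SLELF -> exp(E[ln theta]);
    GELF(c) -> (E[theta^-c])^(-1/c)."""
    th = np.asarray(posterior_draws, dtype=float)
    qlf = th.mean()
    slelf = np.exp(np.log(th).mean())
    gelf = np.mean(th ** (-c)) ** (-1.0 / c)
    return qlf, slelf, gelf
```

Comparing these three point estimates across simulated datasets, and their risks under each loss, is the kind of Monte Carlo comparison the abstract describes.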
The flow distribution in the parallel tubes of the cavity receiver under variable heat flux
International Nuclear Information System (INIS)
Hao, Yun; Wang, Yueshe; Hu, Tian
2016-01-01
Highlights: • An experimental loop is built to find the flow distribution in the parallel tubes. • With the concentration of heat flux, two-phase flow makes distribution more uneven. • The total flow rate is chosen appropriately for a wider heat flux distribution. • A suitable system pressure is essential for the optimization of flow distribution. - Abstract: As an optical component of a tower solar thermal power station, the heliostat mirror reflects sunlight to one point on the heated surface in the solar cavity receiver, known as a one-point focusing system. The radiation heat flux concentrated in the cavity receiver is always non-uniform temporally and spatially, which may lead to extreme local overheating of the receiver evaporation panels. In this paper, an electrically heated evaporating experimental loop, including five parallel vertical tubes, is set up to evaluate the hydrodynamic characteristics of the evaporation panels of a solar cavity receiver under various non-uniform heat fluxes. The influence of the heat flux concentration ratio, total flow rate, and system pressure on the flow distribution among the parallel tubes is discussed. It is found that the flow distribution becomes significantly worse as the heat flux and concentration ratio increase, and improves as the system pressure decreases. These findings are important for the safe and stable operation of solar cavity receivers, and can also provide valuable references for the design and optimization of the operating parameters of solar tower power station systems.
Directory of Open Access Journals (Sweden)
E. Baümler
2014-06-01
In this study the kinetics of oil extraction from partially dehulled safflower seeds under two moisture conditions (7 and 9% dry basis) was investigated. The extraction assays were performed using a stirred batch system, thermostated at 50 ºC, with n-hexane as solvent. The data obtained were fitted to a modified diffusion model in order to represent the extraction kinetics. The model took into account a washing step and a diffusive step. Fitting parameters were compared statistically for both moisture conditions. The oil yield increased with extraction time in both cases, although the oil was released at different rates. A comparison of the parameters showed that both the portion extracted in the washing phase and the effective diffusion coefficient were moisture-dependent. The effective diffusivities were 2.81×10⁻¹² and 8.06×10⁻¹³ m² s⁻¹ for moisture contents of 7% and 9%, respectively.
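A one-term washing-plus-diffusion curve is a common simplification of such modified diffusion models: an instantaneous washing fraction followed by first-order approach to the extraction asymptote. The sketch below fits it by least squares; the times and parameter values are synthetic, not the safflower data:

```python
import numpy as np
from scipy.optimize import curve_fit

def washing_diffusion(t, y_w, y_inf, k):
    """Two-stage extraction curve: washing fraction y_w released at t = 0,
    then first-order diffusion toward the asymptote y_inf with rate k.
    A simplified one-term stand-in for the paper's diffusion model."""
    return y_w + (y_inf - y_w) * (1.0 - np.exp(-k * t))

# Synthetic noiseless data generated from known parameters, then refit.
t = np.array([0.0, 5.0, 10.0, 20.0, 40.0, 80.0])
y_obs = washing_diffusion(t, 0.2, 1.0, 0.05)
params, _ = curve_fit(washing_diffusion, t, y_obs, p0=[0.1, 0.9, 0.1])
```

For a slab geometry, the rate constant k relates to an effective diffusivity via k ≈ π²·D_eff/(4L²), which is how a fit like this yields the m² s⁻¹ values quoted above.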
International Nuclear Information System (INIS)
Bai, D.S.; Chun, Y.R.; Kim, J.G.
1995-01-01
This paper considers the design of life-test sampling plans based on failure-censored accelerated life tests. The lifetime distribution of products is assumed to be Weibull with a scale parameter that is a log linear function of a (possibly transformed) stress. Two levels of stress higher than the use condition stress, high and low, are used. Sampling plans with equal expected test times at high and low test stresses which satisfy the producer's and consumer's risk requirements and minimize the asymptotic variance of the test statistic used to decide lot acceptability are obtained. The properties of the proposed life-test sampling plans are investigated
Wang, Yongli; Wang, Gang; Zuo, Yi; Fan, Lisha; Ling, Yunpeng
2017-03-01
On March 15, 2015, the Central Office issued the "Opinions on Further Deepening the Reform of the Electric Power System" (Zhong Fa No. 9). This policy marks the central government's official opening of a new round of electricity reform. As a programmatic document for comprehensively promoting power system reform under the new situation, Document No. 9 names the separate approval of transmission and distribution electricity prices as the first task of the reform. Grid tariff reform involves not only the separate approval of transmission and distribution prices, but also deep adjustments to the grid companies' input-output relationships and many other aspects. Against the background of the transmission and distribution price reform, the main factors affecting the input-output relationship, such as the main business, electricity pricing, investment approval and financial accounting, have changed significantly. This paper designs a comprehensive evaluation index system for the investment benefits of power grid projects under the transmission and distribution price reform, to improve the investment efficiency of power grid projects after the power reform in China.
Wang, Yongli; Wang, Gang; Zuo, Yi; Fan, Lisha; Wei, Jiaxiang
2017-03-01
On March 15, 2015, the Central Office issued the "Opinions on Further Deepening the Reform of the Electric Power System" (2015 No. 9). This policy marks the central government's official opening of a new round of electricity reform. As a programmatic document for comprehensively promoting power system reform under the new situation, Document No. 9 names the separate approval of transmission and distribution electricity prices as the first task of the reform. Grid tariff reform involves not only the separate approval of transmission and distribution prices, but also deep adjustments to the grid companies' input-output relationships and many other aspects. Against the background of the transmission and distribution price reform, the main factors affecting the input-output relationship, such as the main business, electricity pricing, investment approval and financial accounting, have changed significantly. This paper designs a comprehensive evaluation index system for power grid enterprises' credit ratings under the transmission and distribution price reform, to reduce the impact of the reform on companies' international rating results and their ability to raise funds.
Duque-Lazo, Joaquín; Durka, Walter; Hauenschild, Frank; Schnitzler, Jan; Michalak, Ingo; Ogundipe, Oluwatoyin Temitayo; Muellner-Riehl, Alexandra Nora
2018-01-01
Climate change is predicted to impact species' genetic diversity and distribution. We used Senegalia senegal (L.) Britton, an economically important species distributed in the Sudano-Sahelian savannah belt of West Africa, to investigate the impact of climate change on intraspecific genetic diversity and distribution. We used ten nuclear and two plastid microsatellite markers to assess genetic variation, population structure and differentiation across thirteen sites in West Africa. We projected the suitable range, and the potential impact of climate change on genetic diversity, using a maximum entropy approach under four different climate change scenarios. We found higher genetic and haplotype diversity at both nuclear and plastid markers than previously reported. Genetic differentiation was strong for the chloroplast and moderate for the nuclear genome. Both genomes indicated three spatially structured genetic groups. The distribution of Senegalia senegal is strongly correlated with extractable nitrogen, coarse fragments, soil organic carbon stock, precipitation of the warmest and coldest quarters and mean temperature of the driest quarter. We predicted that 40.96 to 6.34 per cent of the current distribution would favourably support the species' ecological requirements under future climate scenarios. Our results suggest that climate change is going to affect the population genetic structure of Senegalia senegal, and that patterns of genetic diversity are going to influence the species' adaptive response to climate change. Our study contributes to the growing evidence predicting the loss of economically relevant plants in West Africa in the coming decades due to climate change. PMID:29659603
Zhang, Keliang; Yao, Linjun; Meng, Jiasong; Tao, Jun
2018-09-01
Paeonia (Paeoniaceae), an economically important plant genus, includes many popular ornamentals and medicinal plant species used in traditional Chinese medicine. Little is known about the properties of the habitat distribution and the important eco-environmental factors shaping the suitability. Based on high-resolution environmental data for current and future climate scenarios, we modeled the present and future suitable habitat for P. delavayi and P. rockii by Maxent, evaluated the importance of environmental factors in shaping their distribution, and identified distribution shifts under climate change scenarios. The results showed that the moderately and highly suitable areas for P. delavayi and P. rockii encompassed ca. 4.46×10^5 km^2 and 1.89×10^5 km^2, respectively. Temperature seasonality and isothermality were identified as the most critical factors shaping P. delavayi distribution, and UVB-4 and annual precipitation were identified as the most critical for shaping P. rockii distribution. Under the scenario with a low concentration of greenhouse gas emissions (RCP2.6), the range of both species increased as global warming intensified; however, under the scenario with higher concentrations of emissions (RCP8.5), the suitable habitat range of P. delavayi decreased while that of P. rockii increased. Overall, our prediction showed that a shift in distribution of suitable habitat to higher elevations would gradually become more significant. The information gained from this study should provide a useful reference for implementing long-term conservation and management strategies for these species. Copyright © 2018. Published by Elsevier B.V.
Potential distribution of pine wilt disease under future climate change scenarios.
Directory of Open Access Journals (Sweden)
Akiko Hirata
Full Text Available Pine wilt disease (PWD) constitutes a serious threat to pine forests. Since its development depends on temperature and drought, there is a concern that future climate change could lead to the spread of PWD infections. We evaluated the risk of PWD for 21 susceptible Pinus species on a global scale. The MB index, defined as the annual sum of the differences between the mean monthly temperature and 15°C over the months in which the mean monthly temperature exceeds 15°C, was used to determine current and future regions vulnerable to PWD (MB ≥ 22). For future climate conditions, we compared the difference in PWD risks among four representative concentration pathways (RCPs 2.6, 4.5, 6.0, and 8.5) and two time periods (2050s and 2070s). We also evaluated the impact of climate change on habitat suitability for each Pinus species using species distribution models. The findings were then integrated and the potential risk of PWD spread under climate change was discussed. Within the natural Pinus distribution area, southern parts of North America, Europe, and Asia were categorized as vulnerable regions (MB ≥ 22; 16% of the total Pinus distribution area). Representative provinces in which PWD has been reported at least once overlapped with the vulnerable regions. All RCP scenarios showed expansion of vulnerable regions in northern parts of Europe, Asia, and North America under future climate conditions. By the 2070s, under RCP 8.5, the area of vulnerable regions was estimated to increase to approximately 50% of the total Pinus distribution area. In addition, the habitat conditions of a large portion of the Pinus distribution areas in Europe and Asia were deemed unsuitable by the 2070s under RCP 8.5. Approximately 40% of these regions overlapped with regions deemed vulnerable to PWD, suggesting that Pinus forests in these areas are at risk of serious damage due to habitat shifts and spread of PWD.
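The MB index described in this abstract is simple enough to state in code. A minimal sketch, assuming temperatures are supplied as 12 monthly means in °C; function names and example values are illustrative, not from the paper:

```python
# Sketch of the MB index: the annual sum of (mean monthly temperature - 15)
# over months whose mean exceeds 15 degC. Regions with MB >= 22 are
# classified as vulnerable to pine wilt disease.

def mb_index(monthly_mean_temps):
    """Compute the MB index from 12 mean monthly temperatures (degC)."""
    return sum(t - 15.0 for t in monthly_mean_temps if t > 15.0)

def vulnerable_to_pwd(monthly_mean_temps, threshold=22.0):
    return mb_index(monthly_mean_temps) >= threshold

# Example: a warm-temperate climate (made-up values)
temps = [2, 4, 8, 13, 18, 22, 26, 27, 23, 17, 10, 4]
print(round(mb_index(temps), 1))   # 43.0
print(vulnerable_to_pwd(temps))    # True
```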
Directory of Open Access Journals (Sweden)
Farhad Yahgmaei
2013-01-01
Full Text Available This paper proposes different methods of estimating the scale parameter in the inverse Weibull distribution (IWD). Specifically, the maximum likelihood estimator of the scale parameter in the IWD is introduced. We then derive the Bayes estimators for the scale parameter in the IWD by considering quasi, gamma, and uniform prior distributions under the squared error, entropy, and precautionary loss functions. Finally, the different proposed estimators are compared through extensive simulation studies in terms of their mean squared errors and the evolution of their risk functions.
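As a hedged illustration of the maximum-likelihood part, here is the closed-form MLE of the scale parameter for one common parameterization of the IWD, f(x) = αβ x^-(β+1) exp(-α x^-β), with the shape β treated as known. The parameterization and the simulation check are our own sketch, not the paper's study:

```python
# Setting d(log-likelihood)/d(alpha) = 0 for the IWD above gives the
# closed form alpha_hat = n / sum(x_i ** -beta) when beta is known.

import math
import random

def iw_scale_mle(sample, beta):
    n = len(sample)
    return n / sum(x ** (-beta) for x in sample)

# Sanity check by simulation: draw from the IWD via the inverse CDF,
# F(x) = exp(-alpha * x**-beta)  =>  x = (alpha / -ln U) ** (1 / beta).
random.seed(1)
alpha_true, beta = 2.0, 1.5
sample = [(alpha_true / -math.log(random.random())) ** (1 / beta)
          for _ in range(50_000)]
print(round(iw_scale_mle(sample, beta), 2))  # close to 2.0
```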
Grid Voltage Synchronization for Distributed Generation Systems under Grid Fault Conditions
DEFF Research Database (Denmark)
Luna, Alvaro; Rocabert, J.; Candela, I.
2015-01-01
The actual grid code requirements for the grid connection of distributed generation systems, mainly wind and PV systems, are becoming very demanding. The Transmission System Operators (TSOs) are especially concerned about the Low Voltage Ride Through requirements. Solutions based on the installation of STATCOMs and DVRs, as well as on advanced control functionalities for the existing power converters of distributed generation plants, have contributed to enhance their response under faulty and distorted scenarios and, hence, to fulfill these requirements. In order to achieve satisfactory…
Distribution characteristics of 137Cs in soil profiles under different land uses and its implication
International Nuclear Information System (INIS)
Mian Li; Wenyi Yao; Jishan Yang; Zhenzhou Shen; Er Yang
2016-01-01
This paper presents a study of the distribution of 137Cs in soils under three different land uses in a semiarid watershed. The results showed that the average inventory of 137Cs in the cultivated land, woodland and grassland was 888, 1489 and 1650 Bq/m^2, respectively. The depth distributions of 137Cs in the soil profiles of the cultivated land, woodland and grassland corresponded to disturbed, eroding-and-aggrading, and normal profiles, respectively. The coefficient of variation of the 137Cs inventory varied from 8.9 to 38.8% for the different land uses. (author)
Secure Distributed Detection under Energy Constraint in IoT-Oriented Sensor Networks
Directory of Open Access Journals (Sweden)
Guomei Zhang
2016-12-01
Full Text Available We study the secure distributed detection problem under an energy constraint for IoT-oriented sensor networks. The conventional channel-aware encryption (CAE) is an efficient physical-layer secure distributed detection scheme in light of its energy efficiency, good scalability and robustness over diverse eavesdropping scenarios. However, in the CAE scheme, it remains an open problem how to optimize the key thresholds for the estimated channel gain, which are used to determine the sensor's reporting action. Moreover, the CAE scheme does not jointly consider the accuracy of local detection results in determining whether a sensor should stay dormant. To solve these problems, we first analyze the error probability and derive the optimal thresholds in the CAE scheme under a specified energy constraint. These results build a convenient mathematical framework for our further innovative design. Under this framework, we propose a hybrid secure distributed detection scheme. Our proposal satisfies the energy constraint by keeping some sensors inactive according to the local detection confidence level, which is characterized by the likelihood ratio. Meanwhile, security is guaranteed by randomly flipping the local decisions forwarded to the fusion center based on the channel amplitude. We further optimize the key parameters of our hybrid scheme, including two local decision thresholds and one channel comparison threshold. Performance evaluation results demonstrate that our hybrid scheme outperforms the CAE under stringent energy constraints, especially in the high signal-to-noise ratio scenario, while security is still assured.
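The dormancy-plus-flipping idea lends itself to a toy sketch. The following is an illustrative simplification under our own assumptions (fixed thresholds, a scalar channel gain known to both sensor and fusion center), not the authors' exact scheme:

```python
# Toy illustration: a sensor stays dormant when its local log-likelihood
# ratio is inconclusive (saving energy), and otherwise flips its binary
# decision whenever the channel amplitude falls below a threshold. The
# legitimate fusion center also observes the channel and can undo the
# flips; an eavesdropper without channel knowledge cannot.

import random

def sensor_report(llr, channel_gain, llr_lo=-1.0, llr_hi=1.0, flip_thresh=0.8):
    """Return None (dormant) or a possibly-flipped binary decision."""
    if llr_lo < llr < llr_hi:          # low confidence: stay dormant
        return None
    decision = 1 if llr >= llr_hi else 0
    if channel_gain < flip_thresh:     # channel-keyed flipping
        decision ^= 1
    return decision

def fusion_center(decision, channel_gain, flip_thresh=0.8):
    """Undo the flip using the shared channel knowledge."""
    if decision is None:
        return None
    return decision ^ 1 if channel_gain < flip_thresh else decision

random.seed(0)
g = random.random()
sent = sensor_report(2.3, g)           # confident "1"
print(fusion_center(sent, g))          # 1 (fusion recovers the decision either way)
```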
Taggart, T. P.; Endreny, T. A.; Nowak, D.
2014-12-01
Gray and green infrastructure in urban environments alters many natural hydrologic processes, creating an urban water balance unique to the developed environment. A common way to assess the consequences of impervious cover and grey infrastructure is by measuring runoff hydrographs. This focus on the watershed outlet masks the spatial variation of hydrologic process alterations across the urban environment in response to localized landscape characteristics. We attempt to represent this spatial variation in the urban environment using the statistically and spatially distributed i-Tree Hydro model, a scoping level urban forest effects water balance model. i-Tree Hydro has undergone expansion and modification to include the effect of green infrastructure processes, road network attributes, and urban pipe system leakages. These additions to the model are intended to increase the understanding of the altered urban hydrologic cycle by examining the effects of the location of these structures on the water balance. Specifically, the effect of these additional structures and functions on the spatially varying properties of interception, soil moisture and runoff generation. Differences in predicted properties and optimized parameter sets between the two models are examined and related to the recent landscape modifications. Datasets used in this study consist of watersheds and sewersheds within the Syracuse, NY metropolitan area, an urban area that has integrated green and gray infrastructure practices to alleviate stormwater problems.
Directory of Open Access Journals (Sweden)
M. Tabei
2016-02-01
Full Text Available Introduction: The limited amount of available water on the one hand, and the growing needs of the world population on the other, have forced an increase in crop production. For this reason, employing new irrigation methods and using new water resources, such as unconventional water (saline water, drainage water), are two main strategies for coping with water-shortage conditions. At the same time, salts inevitably accumulate on the soil surface in dry regions with low rainfall and high evaporation. As field experiments to determine the pattern of moisture distribution demand a lot of time and are costly, simulation models are suitable alternatives for addressing the problem of water movement and salinity distribution. Materials and Methods: In this research, soil salinity under drip irrigation was simulated with the SWAP model, and the capability of the model was evaluated by comparison with measured results. The SWAP model was run on data measured in a maize field equipped with a drip irrigation system during the 1391-92 (2012-13) growing season at research field No. 1 of the Faculty of Water Sciences Engineering, Shahid Chamran University of Ahvaz, and the soil hydraulic parameters were obtained from RETC. The experiment was laid out as a completely randomized design with four irrigation-water salinity treatments: S1 (Karoon River water with a salinity of 3 dS/m, as the control), S2 (S1 + 0.5), S3 (S1 + 1) and S4 (S1 + 1.5 dS/m), in three replications, sampled at distances of 10 and 20 cm from the emitters on the row, at depths of 0-90 cm (every 30 cm from the soil surface), and at 30, 60 and 90 days after planting, on which basis the modeling was carried out. Planting was done by hand in plots consisting of four 3 m rows spaced 75 cm apart, with a density of 80 plants per hectare. The drip irrigation system was of the tape type with 20 cm emitter spacing. Results and Discussion
International Nuclear Information System (INIS)
Das, Rabindra Nath; Kim, Jinseog; Park, Jeong-Soo
2015-01-01
In quality engineering, the most commonly used lifetime distributions are log-normal, exponential, gamma and Weibull. Experimental designs are useful for predicting the optimal operating conditions of the process in lifetime improvement experiments. In the present article, invariant robust first-order D-optimal designs are derived for correlated lifetime responses having the above four distributions. Robust designs are developed for some correlated error structures. It is shown that robust first-order D-optimal designs for these lifetime distributions are always robust rotatable but the converse is not true. Moreover, it is observed that these designs depend on the respective error covariance structure but are invariant to the above four lifetime distributions. This article generalizes the results of Das and Lin [7] for the above four lifetime distributions with general (intra-class, inter-class, compound symmetry, and tri-diagonal) correlated error structures. - Highlights: • This paper presents invariant robust first-order D-optimal designs under correlated lifetime responses. • The results of Das and Lin [7] are extended for the four lifetime (log-normal, exponential, gamma and Weibull) distributions. • This paper also generalizes the results of Das and Lin [7] to more general correlated error structures
Mimura, Yasuhiro; Takemoto, Satoko; Tachibana, Taro; Ogawa, Yutaka; Nishimura, Masaomi; Yokota, Hideo; Imamoto, Naoko
2017-11-24
Nuclear pore complexes (NPCs) maintain cellular homeostasis by mediating nucleocytoplasmic transport. Although cyclin-dependent kinases (CDKs) regulate NPC assembly in interphase, the location of NPC assembly on the nuclear envelope is not clear. CDKs also regulate the disappearance of pore-free islands, which are nuclear envelope subdomains; this subdomain gradually disappears with increase in homogeneity of the NPC in response to CDK activity. However, a causal relationship between pore-free islands and NPC assembly remains unclear. Here, we elucidated mechanisms underlying NPC assembly from a new perspective by focusing on pore-free islands. We proposed a novel framework for image-based analysis to automatically determine the detailed 'landscape' of pore-free islands from a large quantity of images, leading to the identification of NPC intermediates that appear in pore-free islands with increased frequency in response to CDK activity. Comparison of the spatial distribution between simulated and the observed NPC intermediates within pore-free islands showed that their distribution was spatially biased. These results suggested that the disappearance of pore-free islands is highly related to de novo NPC assembly and indicated the existence of specific regulatory mechanisms for the spatial arrangement of NPC assembly on nuclear envelopes.
Breine, Bastiaan; Malcolm, Philippe; Segers, Veerle; Gerlo, Joeri; Derie, Rud; Pataky, Todd; Frederick, Edward C; De Clercq, Dirk
2017-12-01
In running, foot contact patterns (rear-, mid-, or forefoot contact) influence impact intensity and initial ankle and foot kinematics. The aim of the study was to compare impact intensity and its spatial distribution under the foot between different foot contact patterns. Forty-nine subjects ran at 3.2 m·s^-1 over a level runway while ground reaction forces (GRF) and shoe-surface pressures were recorded and the foot contact pattern was determined. A 4-zone footmask (forefoot, midfoot, medial and lateral rearfoot) assessed the spatial distribution of the vertical GRF under the foot. We calculated the peak vertical instantaneous loading rate of the GRF (VILR) per foot zone as the impact intensity measure. Midfoot contact patterns were shown to have the lowest, and atypical rearfoot contact patterns the highest, impact intensities. The greatest local impact intensity was mainly situated under the rear- and midfoot for the typical rearfoot contact patterns, under the midfoot for the atypical rearfoot contact patterns, and under the mid- and forefoot for the midfoot contact patterns. These findings indicate that different foot contact patterns could benefit from cushioning in different shoe zones.
International Nuclear Information System (INIS)
Tokita, N.; Carpenter, S.G.; Raju, M.R.
1984-01-01
Cell-cycle distributions were measured by flow cytometry for Chinese hamster ovary (CHO) cells cultured continuously under hypoxic conditions. DNA histograms showed an accumulation of cells in the early S phase, followed by a traverse delay through the S phase and a G2 block. During hypoxic culturing, cell viability decreased rapidly, to less than 0.1% at 120 h. Radiation responses for cells cultured under these conditions showed an extreme radioresistance at 72 h. The results suggest that hypoxia induces a condition similar to cell synchrony, which itself changes the radioresistance of hypoxic cells. (author)
How old is this bird? The age distribution under some phase sampling schemes.
Hautphenne, Sophie; Massaro, Melanie; Taylor, Peter
2017-12-01
In this paper, we use a finite-state continuous-time Markov chain with one absorbing state to model an individual's lifetime. Under this model, the time of death follows a phase-type distribution, and the transient states of the Markov chain are known as phases. We then attempt to provide an answer to the simple question "What is the conditional age distribution of the individual, given its current phase"? We show that the answer depends on how we interpret the question, and in particular, on the phase observation scheme under consideration. We then apply our results to the computation of the age pyramid for the endangered Chatham Island black robin Petroica traversi during the monitoring period 2007-2014.
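The phase-type lifetime model described in this abstract can be sketched numerically. This is a hedged illustration with two invented phases and rates, not estimates for the black robin data:

```python
# Phase-type lifetime: a continuous-time Markov chain with transient
# "phases" and one absorbing ("death") state. With initial phase
# distribution a and sub-generator T (rates among transient states),
# the survival function is S(t) = a @ expm(T * t) @ 1.

import numpy as np

def expm(M):
    """Matrix exponential via eigendecomposition (fine for small,
    diagonalizable matrices like the 2x2 example below)."""
    w, V = np.linalg.eig(M)
    return ((V * np.exp(w)) @ np.linalg.inv(V)).real

a = np.array([1.0, 0.0])            # lifetime starts in phase 1
T = np.array([[-2.0, 1.5],          # phase 1 -> phase 2 at rate 1.5
              [0.0, -0.5]])         # absorption rate is 0.5 from each phase

def survival(t):
    return float(a @ expm(T * t) @ np.ones(2))

print(round(survival(0.0), 3))      # 1.0
print(round(survival(1.0), 3))      # 0.607 (= exp(-0.5) for these rates)
```

Because both phases here happen to absorb at rate 0.5, the lifetime reduces to an exponential distribution, which makes the result easy to verify by hand.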
Yan, Yonglian; Takáč, Tomáš; Li, Xiaoquan; Chen, Houbin; Wang, Yingying; Xu, Enfeng; Xie, Ling; Su, Zhaohua; Šamaj, Jozef; Xu, Chunxiang
2015-01-01
Information on the spatial distribution of arabinogalactan proteins (AGPs) in plant organs and tissues during plant reactions to low temperature (LT) is limited. In this study, the extracellular distribution of AGPs in banana leaves and roots, and their changes under LT stress were investigated in two genotypes differing in chilling tolerance, by immuno-techniques using 17 monoclonal antibodies against different AGP epitopes. Changes in total classical AGPs in banana leaves were also tested. The results showed that AGP epitopes recognized by JIM4, JIM14, JIM16, and CCRC-M32 antibodies were primarily distributed in leaf veins, while those recognized by JIM8, JIM13, JIM15, and PN16.4B4 antibodies exhibited predominant sclerenchymal localization. Epitopes recognized by LM2, LM14, and MAC207 antibodies were distributed in both epidermal and mesophyll cells. Both genotypes accumulated classical AGPs in leaves under LT treatment, and the chilling tolerant genotype contained higher classical AGPs at each temperature treatment. The abundance of JIM4 and JIM16 epitopes in the chilling-sensitive genotype decreased slightly after LT treatment, and this trend was opposite for the tolerant one. LT induced accumulation of LM2- and LM14-immunoreactive AGPs in the tolerant genotype compared to the sensitive one, especially in phloem and mesophyll cells. These epitopes thus might play important roles in banana LT tolerance. Different AGP components also showed differential distribution patterns in banana roots. In general, banana roots started to accumulate AGPs under LT treatment earlier than leaves. The levels of AGPs recognized by MAC207 and JIM13 antibodies in the control roots of the tolerant genotype were higher than in the chilling sensitive one. Furthermore, the chilling tolerant genotype showed high immuno-reactivity against JIM13 antibody. These results indicate that several AGPs are likely involved in banana tolerance to chilling injury.
Niu, Yu Jie; Yang, Si Wei; Wang, Gui Zhen; Liu, Li; Du, Guo Zhen; Hua, Li Min
2017-12-01
This research selected an alpine meadow on the northeastern margin of the Qinghai-Tibet Plateau to study the changes in the vegetation community and soil properties under different grazing intensities, as well as the quantitative relationship between the distribution patterns of plant species and the physical and chemical properties of the soil. The results showed that grazing caused differentiation of the initial vegetation community, in which Elymus nutans and Stipa grandis were dominant. In the plots with high and low grazing intensities, the dominant plants changed to Kobresia humilis and Melissitus ruthenica, and to E. nutans and Poa crymophila, respectively. With increasing grazing intensity, plant richness, importance values and biomass decreased significantly. The relationship between the importance values of plant species in each plot and grazing intensity could be fitted by a logarithmic model, and the number of plant species whose cumulative importance value accounted for 50% of the importance value of the whole vegetation community was reduced. Available P, available K, soil compaction, soil water content, stable infiltration rate and the large-aggregate index changed significantly with grazing intensity, although in different ways. CCA ordination showed that soil compaction was the key factor affecting the distribution pattern of the plant species under grazing. Variance decomposition indicated that the soil factors together explained 30.5% of the distribution of the plant species; in particular, the soil physical properties alone explained 22.8% of the distribution, the highest contribution to the plant species distribution. Thus, the soil physical properties affected the distribution pattern of plant species on the grazed alpine meadow.
Blakemore, J S
1962-01-01
Semiconductor Statistics presents statistics aimed at complementing existing books on the relationships between carrier densities and transport effects. The book is divided into two parts. Part I provides introductory material on the electron theory of solids, and then discusses carrier statistics for semiconductors in thermal equilibrium. Of course a solid cannot be in true thermodynamic equilibrium if any electrical current is passed; but when currents are reasonably small the distribution function is but little perturbed, and the carrier distribution for such a "quasi-equilibrium" co
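As a small aside on what "carrier statistics in thermal equilibrium" means concretely, the equilibrium occupation probability of a single electronic level is the Fermi-Dirac function; a minimal sketch with illustrative values:

```python
# Fermi-Dirac occupation f(E) = 1 / (1 + exp((E - mu) / kT)),
# the equilibrium statistics underlying carrier-density calculations.

import math

K_B = 8.617333262e-5  # Boltzmann constant, eV/K

def fermi_dirac(energy_ev, mu_ev, temp_k):
    return 1.0 / (1.0 + math.exp((energy_ev - mu_ev) / (K_B * temp_k)))

print(fermi_dirac(1.0, 1.0, 300.0))   # 0.5 exactly at E = mu
```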
International Nuclear Information System (INIS)
Postma, A.K.; Hilliard, R.K.
1983-03-01
This report reviews the current state of technology regarding hydrogen safety issues in light water reactor plants. Topics considered in this report include hydrogen generation, distribution in containment, and combustion characteristics. A companion report addresses hydrogen control. The objectives of the study were to identify the key safety issues related to hydrogen produced under severe accident conditions, to describe the state of technology for each issue, and to point out ongoing programs aimed at resolving the open issues
Time evolution of a Gaussian class of quasi-distribution functions under quadratic Hamiltonian.
Ginzburg, D; Mann, A
2014-03-10
A Lie algebraic method for propagation of the Wigner quasi-distribution function (QDF) under quadratic Hamiltonian was presented by Zoubi and Ben-Aryeh. We show that the same method can be used in order to propagate a rather general class of QDFs, which we call the "Gaussian class." This class contains as special cases the well-known Wigner, Husimi, Glauber, and Kirkwood-Rihaczek QDFs. We present some examples of the calculation of the time evolution of those functions.
International Nuclear Information System (INIS)
Purtymun, W.D.; Maes, M.N.; Peters, R.
1985-01-01
A study of the distribution of moisture, tritium, and plutonium in the Mortandad Canyon aquifer indicates some infiltration of water into the underlying tuff. This infiltration was accompanied by similar movement of tritium. The concentrations of plutonium on the sediments in the aquifer were low when compared with the high concentrations in solution in an ionic complex that does not readily exchange or is adsorbed by clay minerals in the alluvium. 2 references, 4 figures, 2 tables
Potential distribution of dengue fever under scenarios of climate change and economic development.
Aström, Christofer; Rocklöv, Joacim; Hales, Simon; Béguin, Andreas; Louis, Valerie; Sauerborn, Rainer
2012-12-01
Dengue fever is the most important viral vector-borne disease with ~50 million cases per year globally. Previous estimates of the potential effect of global climate change on the distribution of vector-borne disease have not incorporated the effect of socioeconomic factors, which may have biased the results. We describe an empirical model of the current geographic distribution of dengue, based on the independent effects of climate and gross domestic product per capita (GDPpc, a proxy for socioeconomic development). We use the model, along with scenario-based projections of future climate, economic development, and population, to estimate populations at risk of dengue in the year 2050. We find that both climate and GDPpc influence the distribution of dengue. If the global climate changes as projected but GDPpc remained constant, the population at risk of dengue is estimated to increase by about 0.28 billion in 2050. However, if both climate and GDPpc change as projected, we estimate a decrease of 0.12 billion in the population at risk of dengue in 2050. Empirically, the geographic distribution of dengue is strongly dependent on both climatic and socioeconomic variables. Under a scenario of constant GDPpc, global climate change results in a modest but important increase in the global population at risk of dengue. Under scenarios of high GDPpc, this adverse effect of climate change is counteracted by the beneficial effect of socioeconomic development.
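The paper's fitted model is not reproduced here, so the following is only a shape sketch under our own assumptions: a logistic model of dengue presence in one climate covariate and log GDP per capita, with invented coefficients (b0, b_climate and b_gdp are not the paper's values):

```python
# Illustrative logistic model: risk rises with a warm/humid climate
# covariate and falls with socioeconomic development (GDP per capita).

import math

def dengue_risk(vapour_pressure_hpa, gdp_pc_usd,
                b0=-10.0, b_climate=0.45, b_gdp=-0.8):
    """Probability of dengue presence under made-up coefficients."""
    z = b0 + b_climate * vapour_pressure_hpa + b_gdp * math.log(gdp_pc_usd / 1000.0)
    return 1.0 / (1.0 + math.exp(-z))

# Higher GDP per capita lowers risk; a warmer, more humid climate raises it.
print(dengue_risk(25.0, 2_000) > dengue_risk(25.0, 20_000))   # True
print(dengue_risk(28.0, 2_000) > dengue_risk(22.0, 2_000))    # True
```

The opposing signs on the two covariates mirror the abstract's finding that projected economic development can counteract the climate-driven increase in population at risk.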
Nagao, Kan; Kawano, Fumiaki; Ichikawa, Tetsuo
2004-12-01
When making complete dentures, we have to consider not only denture stability but also the restoration of aesthetics and of functions such as mastication and speech. These requirements are, however, theoretically contradictory from the point of view of denture stability, and it is very difficult to satisfy both in a patient who has poor upper and lower alveolar ridges. We investigated the effect of artificial posterior tooth form and occlusal scheme on the distribution of pressure on the supporting structures under complete dentures during mastication, using upper and lower edentulous simulators. In this report, a guideline for the selection of the occlusal scheme for complete dentures, based on our previous investigations, is described. The occlusal scheme remarkably affected the distribution of pressure under simulated complete dentures, as shown by comparing the distribution of pressure under two different occlusal schemes: fully balanced occlusion and lingualized occlusion. However, other factors, such as posterior tooth form and position, affect the distribution of pressure as well, and these factors are related to each other. Therefore, not only the occlusal scheme but also the form of the posterior artificial teeth has to be considered, and the form of the posterior teeth should be carefully and comprehensively decided when making complete dentures.
An under-aisle air distribution system facilitating humidification of commercial aircraft cabins
Energy Technology Data Exchange (ETDEWEB)
Zhang, Tengfei; Yin, Shi; Wang, Shugang [School of Civil and Hydraulic Engineering, Dalian University of Technology (DUT), 2 Linggong Road, Dalian 116024 (China)
2010-04-15
The air environment in aircraft cabins has long been criticized, especially for the dryness of the air within. Low moisture content in cabins is known to be responsible for headache, tiredness and many other non-specific symptoms. In addition, the current widely used air distribution systems on airplanes dilute internally generated pollutants by promoting air mixing and thus impose risks of infectious airborne disease transmission. To boost the air humidity level while simultaneously restricting air mixing, this investigation uses a validated computational fluid dynamics (CFD) program to design a new under-aisle air distribution system for wide-body aircraft cabins. The new system supplies fully outside, dry air at low momentum through a narrow channel passage along both side cabin walls to the middle height of the cabin just beneath the stowage bins, while humidified air is simultaneously supplied through both perforated under-aisles. By comparison with the current mixing air distribution system in terms of the distributions of relative humidity, CO2 concentration, velocity, temperature and draught risk, the new system is found to be able to improve the relative humidity from the existing 10% to a new level of 20% and to lessen the inhaled CO2 concentration by 30%, without causing moisture condensation on the cabin interior or inducing draught risks for passengers. The water consumption rate for air humidification is only around 0.05 kg/h per person, which should be affordable for airlines. (author)
Pricing American Asian options with higher moments in the underlying distribution
Lo, Keng-Hsin; Wang, Kehluh; Hsu, Ming-Feng
2009-01-01
We develop a modified Edgeworth binomial model with higher moment consideration for pricing American Asian options. With lognormal underlying distribution for benchmark comparison, our algorithm is as precise as that of Chalasani et al. [P. Chalasani, S. Jha, F. Egriboyun, A. Varikooty, A refined binomial lattice for pricing American Asian options, Rev. Derivatives Res. 3 (1) (1999) 85-105] if the number of the time steps increases. If the underlying distribution displays negative skewness and leptokurtosis as often observed for stock index returns, our estimates can work better than those in Chalasani et al. [P. Chalasani, S. Jha, F. Egriboyun, A. Varikooty, A refined binomial lattice for pricing American Asian options, Rev. Derivatives Res. 3 (1) (1999) 85-105] and are very similar to the benchmarks in Hull and White [J. Hull, A. White, Efficient procedures for valuing European and American path-dependent options, J. Derivatives 1 (Fall) (1993) 21-31]. The numerical analysis shows that our modified Edgeworth binomial model can value American Asian options with greater accuracy and speed given higher moments in their underlying distribution.
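For context only, here is the plain Cox-Ross-Rubinstein backbone that such refined lattices build on, pricing an American put by backward induction. It deliberately omits the Edgeworth moment adjustment and the running-average state variable needed for Asian payoffs; parameters are illustrative:

```python
# Plain CRR binomial lattice for an American put (not the paper's algorithm).
# At each node we take the maximum of the discounted continuation value and
# the immediate exercise value.

import math

def crr_american_put(S0, K, r, sigma, T, n):
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt))
    d = 1 / u
    p = (math.exp(r * dt) - d) / (u - d)      # risk-neutral up probability
    disc = math.exp(-r * dt)
    # terminal payoffs at step n; node j has had j up-moves
    values = [max(K - S0 * u**j * d**(n - j), 0.0) for j in range(n + 1)]
    for i in range(n - 1, -1, -1):            # backward induction
        values = [max(disc * (p * values[j + 1] + (1 - p) * values[j]),
                      K - S0 * u**j * d**(i - j))
                  for j in range(i + 1)]
    return values[0]

print(round(crr_american_put(100, 100, 0.05, 0.2, 1.0, 500), 2))
```

Extending this recursion to Asian options requires tracking the running average at each node, which is what makes refined lattices such as Chalasani et al.'s (and the Edgeworth-adjusted version above) necessary.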
Ren, Zhoupeng; Wang, Duoquan; Ma, Aimin; Hwang, Jimee; Bennett, Adam; Sturrock, Hugh J. W.; Fan, Junfu; Zhang, Wenjie; Yang, Dian; Feng, Xinyu; Xia, Zhigui; Zhou, Xiao-Nong; Wang, Jinfeng
2016-02-01
Projecting the distribution of malaria vectors under climate change is essential for planning integrated vector control activities for sustaining elimination and preventing reintroduction of malaria. In China, however, little knowledge exists on the possible effects of climate change on malaria vectors. Here we assess the potential impact of climate change on four dominant malaria vectors (An. dirus, An. minimus, An. lesteri and An. sinensis) using species distribution models for two future decades: the 2030s and the 2050s. Simulation-based estimates suggest that the environmentally suitable area (ESA) for An. dirus and An. minimus would increase by an average of 49% and 16%, respectively, under all three scenarios for the 2030s, but decrease by 11% and 16%, respectively, in the 2050s. By contrast, increases of 36% and 11%, respectively, in the ESA of An. lesteri and An. sinensis were estimated under the medium stabilizing (RCP4.5) and very heavy (RCP8.5) emission scenarios in the 2050s. In total, we predict a substantial net increase in the population exposed to the four dominant malaria vectors in the 2030s and 2050s, considering land use changes and urbanization simultaneously. Strategies to achieve and sustain malaria elimination in China will need to account for these potential changes in vector distributions and receptivity.
Predicting the distribution of commercially important invertebrate stocks under future climate.
Directory of Open Access Journals (Sweden)
Bayden D Russell
The future management of commercially exploited species is challenging because techniques used to predict the future distribution of stocks under climate change are currently inadequate. We projected the future distribution and abundance of two commercially harvested abalone species (blacklip abalone, Haliotis rubra, and greenlip abalone, H. laevigata) inhabiting coastal South Australia, using multiple species distribution models (SDMs) for decadal time slices through to 2100. Projections are based on two contrasting global greenhouse gas emissions scenarios. The SDMs identified August (winter) Sea Surface Temperature (SST) as the best descriptor of abundance and forecast that warming of winter temperatures under both scenarios may be beneficial to both species by allowing increased abundance and expansion into previously uninhabited coasts. This range expansion is unlikely to be realised, however, as warming of March SST is projected to exceed temperatures that cause up to 10-fold increases in juvenile mortality. By linking fine-resolution forecasts of sea surface temperature under different climate change scenarios to SDMs and physiological experiments, we provide a practical first approximation of the potential impact of climate-induced change on two species of marine invertebrates in the same fishery.
Zhang, Yaxin; Tian, Ye; Shen, Maocai; Zeng, Guangming
2018-03-03
Heavy metal contamination in soils/sediments and its impact on human health and the ecological environment have aroused wide concern. Our study investigated 30 samples of soils and sediments around Dongting Lake to analyze the concentrations of As, Cd, Cr, Cu, Fe, Mn, Ni, Pb, and Zn and to distinguish natural from anthropogenic sources. The relationship between heavy metals and the physicochemical properties of the samples was also studied by multivariate statistical analysis. Concentrations of Cd at most sampling sites were more than five times the national environmental quality standard for soil in China (GB 15618-1995), and Pb and Zn levels exceeded it by one to two times. Moreover, Cr in the soils exceeded the national standard by one to two times, while Cr in the sediments was below it. The investigation revealed that the accumulations of As, Cd, Mn, and Pb in the soils and sediments were apparently affected by anthropogenic activities, whereas Cr, Fe, and Ni levels were controlled by parent materials. Human activities around Dongting Lake mainly consisted of industrial activities, mining and smelting, sewage discharges, fossil fuel combustion, and agricultural chemicals. The spatial distribution of heavy metals in soils followed a geographical gradient, whereas in sediments it was significantly affected by the river basins and human activities. The results of principal component analysis (PCA) demonstrated that heavy metals in soils were associated with pH and total phosphorus (TP), while in sediments As, Cr, Fe, and Ni were closely associated with cation exchange capacity (CEC) and pH, whereas Pb, Zn, and Cd were associated with total nitrogen (TN), TP, total carbon (TC), moisture content (MC), soil organic matter (SOM), and ignition loss (IL). Our research provides comprehensive approaches to better understand the potential sources and the fate of contaminants in lakeshore soils and sediments.
Keita, Souleymane; Zhonghua, Tang
2017-10-01
Sustainable management of groundwater resources is a major issue for developing countries, and especially for Mali. The multiple uses of groundwater have led countries to promote sound management policies for sustainable use of groundwater resources. For this reason, each country needs data enabling it to monitor and predict changes in the resource. Moreover, given the importance of groundwater quality changes, often marked by recurrent droughts, the potential impacts of the regional and geological setting of groundwater resources require careful study. Unfortunately, recent decades have seen a considerable reduction in national capacities to ensure hydrogeological monitoring and the production of quality data for decision making. The purpose of this work is to use groundwater data and translate it into useful information that can improve water resources management capacity in Mali. In this paper, we used groundwater analytical data from accredited laboratories in Mali to carry out a national-scale assessment of the groundwater types and their distribution. We adapted multivariate statistical methods to classify 2035 groundwater samples into seven main groundwater types and built a national-scale map from the results. We used a two-level K-means clustering technique to examine the hydro-geochemical records as percentages of the total concentrations of major ions, namely sodium (Na), magnesium (Mg), calcium (Ca), chloride (Cl), bicarbonate (HCO3), and sulphate (SO4). The first step of clustering formed 20 groups, and these groups were then re-clustered to produce the final seven groundwater types. The results were verified and confirmed using Principal Component Analysis (PCA) and the RockWare Aq.QA software. We found that HCO3 was the dominant anion throughout the country and that Cl and SO4 were important only in some local zones. The dominant cations were Na and Mg. Also, major ion ratios changed with geographical location and geological and climatic
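The two-level clustering step described above (20 fine groups re-clustered into 7 water types) can be sketched as follows. This is a minimal illustration assuming NumPy and a plain Lloyd's-algorithm K-means, not the exact workflow applied to the Mali data set; function names are ours.

```python
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    """Plain Lloyd's algorithm; returns (labels, centers)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # squared Euclidean distance of every sample to every center
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)
        new = np.array([X[labels == j].mean(0) if (labels == j).any()
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers

def two_level_cluster(X, k1=20, k2=7):
    """Two-level K-means: fine clusters first, then re-cluster centroids."""
    lab1, cent1 = kmeans(X, k1)       # level 1: 20 fine groups
    lab2, cent2 = kmeans(cent1, k2)   # level 2: 7 water types
    return lab2[lab1], cent2          # map each sample to its final type
```

Here X would hold each sample's major-ion percentages; the returned labels assign every sample to one of the seven final water types.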
Studies on the temperature distribution of steel plates with different paints under solar radiation
International Nuclear Information System (INIS)
Liu, Hongbo; Chen, Zhihua; Chen, Binbin; Xiao, Xiao; Wang, Xiaodun
2014-01-01
Thermal effects on steel structures exposed to solar radiation are significant and complicated. Furthermore, the solar radiation absorption coefficient of the steel surface with different paint coatings is the main factor affecting the non-uniform temperature of spatial structures under solar radiation. In this paper, nearly two hundred steel specimens with different paint coatings were prepared and measured to obtain their solar radiation absorption coefficients using a spectrophotometer. Based on the test results, the effects of surface color, paint type and paint thickness on the solar radiation absorption coefficient were analyzed. The actual temperatures under solar radiation for all specimens were also measured in summer, not only to verify the absorption coefficients but also to provide insight into the temperature distribution of steel structures with different paint coatings. A numerical simulation and a simplified formula were also developed and verified against the tests, in order to study the temperature distribution of steel plates with different paints under solar radiation. The results provide an important reference for future research on the thermal effects of steel structures exposed to solar radiation. - Highlights: • Solar radiation absorption coefficients of steel with different paint coatings were measured. • The temperatures of all specimens under solar radiation were measured. • The effects of color, thickness and paint type on solar absorption were analyzed. • A numerical analysis was conducted and verified by test data. • A simplified formula was deduced and verified by test data
Directory of Open Access Journals (Sweden)
Yanlong Guo
2016-10-01
Climate change will significantly affect plant distribution as well as the quality of medicinal plants. Although numerous studies have analyzed the effects of climate change on future plant habitats through species distribution models (SDMs), few of them have incorporated changes in the effective content of medicinal plants. Schisandra sphenanthera Rehd. et Wils. is an endangered traditional Chinese medicinal plant mainly located in the Qinling Mountains. Combining fuzzy theory and a maximum entropy model, we obtained the current spatial distribution of quality assessment for S. sphenanthera. Moreover, the future quality and distribution of S. sphenanthera were projected for the 2020s, 2050s and 2080s under three climate change scenarios (the SRES-A1B, SRES-A2 and SRES-B1 emission scenarios described in the Special Report on Emissions Scenarios (SRES) of the Intergovernmental Panel on Climate Change, IPCC). The results showed that the moderately suitable habitat of S. sphenanthera under all climate change scenarios remained relatively stable in the study area. The highly suitable habitat of S. sphenanthera would gradually decrease in the future, and a higher decline rate of the highly suitable habitat area would occur under scenarios SRES-A1B and SRES-A2. The results suggest that in the study area there would be no more highly suitable habitat for S. sphenanthera where the annual mean temperature exceeds 20 °C or the annual precipitation exceeds 1,200 mm. Our results will be influential in the future ecological conservation and management of S. sphenanthera and can be taken as a reference for habitat suitability assessment research for other medicinal plants.
Under-Frequency Load Shedding Technique Considering Event-Based for an Islanded Distribution Network
Directory of Open Access Journals (Sweden)
Hasmaini Mohamad
2016-06-01
One of the biggest challenges for islanding operation is sustaining frequency stability. A large power imbalance following islanding causes under-frequency; hence an appropriate control is required to shed a certain amount of load. The main objective of this research is to develop an adaptive under-frequency load shedding (UFLS) technique for an islanded system. The technique is event-based, considering both the moment the system is islanded and the tripping of any DG unit during islanded operation. A disturbance magnitude is calculated to determine the amount of load to be shed. The technique is modeled using the PSCAD simulation tool. Simulation studies on a distribution network with mini-hydro generation are carried out to evaluate the UFLS model under different load conditions: peak and base load. Results show that the load shedding technique successfully sheds the required amount of load and stabilizes the system frequency.
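The disturbance-magnitude idea can be illustrated with the standard swing-equation estimate of the power imbalance, followed by priority-based load selection. This is a generic sketch under assumed parameters (inertia constant H, nominal frequency, hypothetical load names), not the paper's exact PSCAD implementation.

```python
def disturbance_magnitude(df_dt, inertia_h, f_nominal=50.0):
    """Per-unit power imbalance from the swing equation:
    dP = -(2 * H / f_n) * df/dt, evaluated at the islanding instant.
    A falling frequency (df/dt < 0) implies a generation deficit."""
    return -2.0 * inertia_h / f_nominal * df_dt

def select_loads_to_shed(loads, deficit):
    """Shed loads in priority order until the deficit is covered.
    loads: list of (name, p_per_unit), least critical first."""
    shed, total = [], 0.0
    for name, p in loads:
        if total >= deficit:
            break
        shed.append(name)
        total += p
    return shed, total
```

For example, a frequency decay of -1 Hz/s with H = 3 s on a 50 Hz system implies a deficit of 0.12 pu, which the selector then covers from the ranked load list.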
Study of Stand-Alone Microgrid under Condition of Faults on Distribution Line
Malla, S. G.; Bhende, C. N.
2014-10-01
The behavior of a stand-alone microgrid is analyzed under conditions of faults on distribution feeders. Since the battery is not able to maintain the dc-link voltage within limits during a fault, a resistive dump-load control is presented to do so. An inverter control is proposed to maintain balanced voltages at the PCC under unbalanced load conditions and to reduce the voltage unbalance factor (VUF) at load points. The proposed inverter control also has the facility to protect itself from high fault currents. The existing maximum power point tracking (MPPT) algorithm is modified to limit the speed of the generator during faults. Extensive simulation results using MATLAB/SIMULINK establish that the performance of the controllers is quite satisfactory under different fault conditions as well as unbalanced load conditions.
Energy Technology Data Exchange (ETDEWEB)
Tang, Yinjie; Martin, Hector Garcia; Deutschbauer, Adam; Feng, Xueyang; Huang, Rick; Llora, Xavier; Arkin, Adam; Keasling, Jay D.
2009-04-21
An environmentally important bacterium with versatile respiration, Shewanella oneidensis MR-1, displayed significantly different growth rates under three culture conditions: minimal medium (doubling time ~3 h), salt-stressed minimal medium (doubling time ~6 h), and minimal medium with amino acid supplementation (doubling time ~1.5 h). 13C-based metabolic flux analysis indicated that fluxes of central metabolic reactions remained relatively constant under the three growth conditions, which is in stark contrast to the reported significant changes in the transcript and metabolite profiles under various growth conditions. Furthermore, ten transposon mutants of S. oneidensis MR-1 were randomly chosen from a transposon library and their flux distributions through central metabolic pathways were revealed to be identical, even though such mutational processes altered the secondary metabolism, for example, glycine and C1 (5,10-Me-THF) metabolism.
Rizvi, Mohd Suhail; Pal, Anupam
2014-09-01
Fibrous matrices are widely used as scaffolds for the regeneration of load-bearing tissues due to their structural and mechanical similarities with the fibrous components of the extracellular matrix. These scaffolds not only provide the appropriate microenvironment for the residing cells but also act as a medium for the transmission of the mechanical stimuli, essential for tissue regeneration, from the macroscopic scale of the scaffold to the microscopic scale of the cells. The requirement of mechanical loading for tissue regeneration requires the fibrous scaffolds to be able to sustain complex three-dimensional mechanical loading conditions. In order to gain insight into the mechanical behavior of fibrous matrices under large amounts of elongation as well as shear, a statistical model has been formulated to study the macroscopic mechanical behavior of the electrospun fibrous matrix and the transmission of the mechanical stimuli from scaffold to cells via the constituent fibers. The study establishes the load-deformation relationships for the fibrous matrices for different structural parameters. It also quantifies the changes in fiber arrangement and in the tension generated in the fibers as the matrix deforms. The model reveals that the tension generated in the fibers on matrix deformation is not homogeneous, and hence cells located in different regions of the fibrous scaffold might experience different mechanical stimuli. The mechanical response of the fibrous matrices was also found to depend on the aspect ratio of the matrix. The model therefore establishes a structure-mechanics interdependence of fibrous matrices under large deformation, which can be utilized in identifying the appropriate structure and external mechanical loading conditions for the regeneration of load-bearing tissues. Copyright © 2014 Elsevier Ltd. All rights reserved.
Understanding the Sampling Distribution and the Central Limit Theorem.
Lewis, Charla P.
The sampling distribution is a common source of misuse and misunderstanding in the study of statistics. The sampling distribution, underlying distribution, and the Central Limit Theorem are all interconnected in defining and explaining the proper use of the sampling distribution of various statistics. The sampling distribution of a statistic is…
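The sampling distribution of the sample mean can be made concrete with a short simulation: repeatedly draw samples from a (skewed) population and record each sample's mean. This is a generic standard-library sketch; the function name is ours.

```python
import random
import statistics

def sampling_distribution_of_mean(pop_draw, n, reps, seed=0):
    """Empirical sampling distribution of the sample mean: draw `reps`
    samples of size `n` from the population generator `pop_draw`
    and record each sample's mean."""
    random.seed(seed)
    return [statistics.fmean(pop_draw() for _ in range(n))
            for _ in range(reps)]
```

For an exponential population with mean 1, the means of samples of size 50 cluster around 1 with standard deviation about 1/sqrt(50) ≈ 0.14 and look approximately normal, as the Central Limit Theorem predicts despite the population's skew.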
Narukawa, Takafumi; Yamaguchi, Akira; Jang, Sunghyon; Amaya, Masaki
2018-02-01
For estimating the fracture probability of fuel cladding tubes under loss-of-coolant accident conditions in light-water reactors, laboratory-scale integral thermal shock tests were conducted on non-irradiated Zircaloy-4 cladding tube specimens. The obtained binary data on fracture or non-fracture of the cladding tube specimens were then analyzed statistically. A method to obtain the fracture probability curve as a function of equivalent cladding reacted (ECR) was proposed using Bayesian inference for generalized linear models: probit, logit, and log-probit models. Model selection was then performed in terms of physical characteristics and information criteria, namely a widely applicable information criterion and a widely applicable Bayesian information criterion. As a result, the log-probit model was found to be the best of the three for estimating the fracture probability, in terms of prediction accuracy both for future data and with respect to the true model. Using the log-probit model, it was shown that 20% ECR corresponded to a 5% fracture probability level, with 95% confidence, for the cladding tube specimens.
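The log-probit model itself is easy to state and invert. The sketch below uses our own function names and illustrative parameters a and b, not the fitted values from the study; it only shows the model form P(fracture) = Phi(a + b·ln(ECR)) and how an ECR corresponding to a given probability level is recovered.

```python
import math
from statistics import NormalDist

_norm = NormalDist()  # standard normal

def logprobit_probability(ecr, a, b):
    """Log-probit model: P(fracture) = Phi(a + b * ln(ECR))."""
    return _norm.cdf(a + b * math.log(ecr))

def ecr_at_probability(p, a, b):
    """Invert the model: the ECR at which fracture probability equals p."""
    return math.exp((_norm.inv_cdf(p) - a) / b)
```

With b > 0 the fracture probability increases monotonically with ECR, so inverting at p = 0.05 yields the ECR associated with a 5% probability level under the assumed parameters.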
Tian, Guo-Liang; Li, Hui-Qiong
2017-08-01
Some existing confidence interval methods and hypothesis testing methods in the analysis of a contingency table with incomplete observations in both margins entirely depend on an underlying assumption that the sampling distribution of the observed counts is a product of independent multinomial/binomial distributions for complete and incomplete counts. However, it can be shown that this independency assumption is incorrect and can result in unreliable conclusions because of the under-estimation of the uncertainty. Therefore, the first objective of this paper is to derive the valid joint sampling distribution of the observed counts in a contingency table with incomplete observations in both margins. The second objective is to provide a new framework for analyzing incomplete contingency tables based on the derived joint sampling distribution of the observed counts by developing a Fisher scoring algorithm to calculate maximum likelihood estimates of parameters of interest, the bootstrap confidence interval methods, and the bootstrap testing hypothesis methods. We compare the differences between the valid sampling distribution and the sampling distribution under the independency assumption. Simulation studies showed that average/expected confidence-interval widths of parameters based on the sampling distribution under the independency assumption are shorter than those based on the new sampling distribution, yielding unrealistic results. A real data set is analyzed to illustrate the application of the new sampling distribution for incomplete contingency tables and the analysis results again confirm the conclusions obtained from the simulation studies.
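The percentile bootstrap underlying such confidence intervals can be sketched generically. This is a standard-library illustration of the resampling idea, not the authors' exact procedure for incomplete contingency tables; the function name is ours.

```python
import random
import statistics

def bootstrap_ci(data, stat, reps=2000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for statistic `stat`:
    resample `data` with replacement `reps` times, evaluate `stat`
    on each resample, and take the alpha/2 and 1 - alpha/2 quantiles."""
    random.seed(seed)
    n = len(data)
    boots = sorted(stat([random.choice(data) for _ in range(n)])
                   for _ in range(reps))
    lo = boots[int(reps * alpha / 2)]
    hi = boots[int(reps * (1 - alpha / 2)) - 1]
    return lo, hi
```

For a binary sample with 70 successes out of 100, the interval for the proportion should bracket 0.7 with a half-width of roughly 0.09.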
Wang, Wei; Wen, Changyun; Huang, Jiangshuai; Fan, Huijin
2017-11-01
In this paper, a backstepping-based distributed adaptive control scheme is proposed for multiple uncertain Euler-Lagrange systems under a directed graph condition. The common desired trajectory may be entirely unknown to some of the subsystems, and the linearly parameterized trajectory model assumed in currently available results is no longer needed. To compensate for the effects of unknown trajectory information, a smooth function of the consensus errors and certain positive integrable functions are introduced in designing the virtual control inputs. Besides, to overcome the difficulty of completely counteracting the coupling terms of the distributed consensus errors and parameter estimation errors in the presence of an asymmetric Laplacian matrix, extra transmission of local parameter estimates is introduced among linked subsystems, and an adaptive gain technique is adopted to generate the distributed torque inputs. It is shown that with the proposed distributed adaptive control scheme, global uniform boundedness of all closed-loop signals and asymptotic output consensus tracking can be achieved. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
Llope, W. J.; STAR Collaboration
2013-10-01
Specific products of the statistical moments of the multiplicity distributions of identified particles can be directly compared to susceptibility ratios obtained from lattice QCD calculations. They may also diverge for nuclear systems formed close to a possible QCD critical point due to the phenomenon of critical opalescence. Of particular interest are the moment products for net-protons, net-kaons, and net-charge, as these are considered proxies for conserved quantum numbers. The moment products have been measured by the STAR experiment for Au+Au collisions at seven beam energies ranging from 7.7 to 200 GeV. In this presentation, the experimental results are compared to data-based calculations in which the intra-event correlations of the numbers of positive and negative particles are broken by construction. The importance of intra-event correlations to the moment-product values for net-protons, net-kaons, and net-charge can thus be evaluated. Work supported by the U.S. Dept of Energy under grant DE-PS02-09ER09.
Energy Technology Data Exchange (ETDEWEB)
Radunovic, J [Institute of nuclear sciences Boris Kidric, Vinca, Beograd (Yugoslavia)
1973-07-01
This paper deals with the application of the statistical method to the analysis of nuclear reactions involving complex nuclei. It is shown that inelastic neutron scattering, which in the higher energy range proceeds through the formation of a compound nucleus, can be treated by a statistical approach.
Holland, Bart K.
2006-01-01
A generally-educated individual should have some insight into how decisions are made in the very wide range of fields that employ statistical and probabilistic reasoning. Also, students of introductory probability and statistics are often best motivated by specific applications rather than by theory and mathematical development, because most…
DEFF Research Database (Denmark)
Risager, Morten S.; Rudnick, Zeev
We study a variant of a problem considered by Dinaburg and Sinai on the statistics of the minimal solution to a linear Diophantine equation. We show that the signed ratio between the Euclidean norms of the minimal solution and the coefficient vector is uniformly distributed modulo one. We reduce ...
Lunsford, M. Leigh; Rowell, Ginger Holmes; Goodson-Espy, Tracy
2006-01-01
We applied a classroom research model to investigate student understanding of sampling distributions of sample means and the Central Limit Theorem in post-calculus introductory probability and statistics courses. Using a quantitative assessment tool developed by previous researchers and a qualitative assessment tool developed by the authors, we…
The Weibull distribution a handbook
Rinne, Horst
2008-01-01
The Most Comprehensive Book on the Subject. Chronicles the development of the Weibull distribution in statistical theory and applied statistics. Exploring one of the most important distributions in statistics, The Weibull Distribution: A Handbook focuses on its origin, statistical properties, and related distributions. The book also presents various approaches to estimate the parameters of the Weibull distribution under all possible situations of sampling data, as well as approaches to parameter and goodness-of-fit testing. Describes the Statistical Methods, Concepts, Theories, and Applications of T
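One of the estimation approaches such a handbook covers, maximum likelihood for the two-parameter Weibull, can be sketched with the standard estimating equations: a fixed-point iteration for the shape k, then a closed form for the scale lambda given k. This is a minimal standard-library sketch under our own naming, not code taken from the book.

```python
import math

def weibull_mle(x, iters=500, tol=1e-10):
    """Two-parameter Weibull MLE: damped fixed-point iteration for the
    shape k, then the closed-form scale lam given k."""
    n = len(x)
    lx = [math.log(v) for v in x]
    mean_lx = sum(lx) / n
    k = 1.0
    for _ in range(iters):
        xk = [v ** k for v in x]
        s1 = sum(xk)                                  # sum x_i^k
        s2 = sum(xi * li for xi, li in zip(xk, lx))   # sum x_i^k ln x_i
        k_new = 1.0 / (s2 / s1 - mean_lx)             # MLE score equation
        if abs(k_new - k) < tol:
            k = k_new
            break
        k = 0.5 * (k + k_new)  # damping stabilizes the iteration
    lam = (sum(v ** k for v in x) / n) ** (1.0 / k)
    return k, lam
```

On data drawn from a Weibull with shape 2 and scale 1, the estimates should land close to those true values.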
Distributed Consensus of Stochastic Delayed Multi-agent Systems Under Asynchronous Switching.
Wu, Xiaotai; Tang, Yang; Cao, Jinde; Zhang, Wenbing
2016-08-01
In this paper, the distributed exponential consensus of stochastic delayed multi-agent systems with nonlinear dynamics is investigated under asynchronous switching. The asynchronous switching considered here accounts for the time needed to identify the active modes of the multi-agent systems. After confirmation of a mode switch is received, the matched controller can be applied, which means that the switching time of the matched controller in each node usually lags behind that of the system switching. In order to handle the coexistence of switched signals and stochastic disturbances, a comparison principle for stochastic switched delayed systems is first proved. By means of this extended comparison principle, several easily verified conditions for the existence of an asynchronously switched distributed controller are derived such that stochastic delayed multi-agent systems with asynchronous switching and nonlinear dynamics can achieve global exponential consensus. Two examples are given to illustrate the effectiveness of the proposed method.
2010-04-01
... 21 Food and Drugs 9 2010-04-01 2010-04-01 false Reinstatement of exemption for drug products distributed under the Food, Drug and Cosmetic Act. 1310.11 Section 1310.11 Food and Drugs DRUG ENFORCEMENT... Reinstatement of exemption for drug products distributed under the Food, Drug and Cosmetic Act. (a) The...
2010-04-01
... 21 Food and Drugs 9 2010-04-01 2010-04-01 false Removal of the exemption of drugs distributed under the Food, Drug and Cosmetic Act. 1310.10 Section 1310.10 Food and Drugs DRUG ENFORCEMENT... Removal of the exemption of drugs distributed under the Food, Drug and Cosmetic Act. (a) The Administrator...
Dierker, Lisa; Alexander, Jalen; Cooper, Jennifer L.; Selya, Arielle; Rose, Jennifer; Dasgupta, Nilanjana
2016-01-01
Introductory statistics needs innovative, evidence-based teaching practices that support and engage diverse students. To evaluate the success of a multidisciplinary, project-based course, we compared the experiences of underrepresented-minority (URM) and non-URM students in 4 years of the course. While URM students considered the material more…
Density and spatial distribution of Parkia biglobosa pattern in Benin under climate change
Directory of Open Access Journals (Sweden)
Fafunkè Titilayo Dotchamou
2016-06-01
Parkia biglobosa is an indigenous species that traditionally contributes to the resilience of the agricultural production system in terms of food security, income, poverty reduction and ecosystem stability. It is therefore important to improve knowledge of its density and its current and future spatial distribution. The main objective of this study is to evaluate tree density and the effects of climate change on the future spatial distribution of the species, for better conservation. The modeling of the current and future geographical distribution of the species is based on the principle of maximum entropy (MaxEnt), using a total of 286 occurrence points from field work and the Global Biodiversity Information Facility (GBIF Data Portal, www.gbif.org). Two climatic models (HadGEM2_ES and Csiro_mk3_6_0) were used under two scenarios (RCP 2.6 and RCP 8.5) for the projection of the species distribution to the 2050 horizon. Correlation analyses and a jackknife test identified seven variables that are weakly correlated (r < 0.80) and have the highest contribution to the modeling. Soil, annual precipitation (BIO12) and temperature (diurnal average deviation) were the variables that contributed most to the performance of the models. Currently, 53% of the national territory, spread from north to south, is highly suitable for the cultivation of P. biglobosa. At the 2050 horizon, the scenarios predict a loss of habitats that are currently highly suitable for the cultivation and conservation of P. biglobosa, to the benefit of moderately and weakly suitable habitats. The highest proportions of this loss, 51% and 57%, would be registered with the HadGEM2_ES model under the two scenarios. These results reveal that the suitable habitat of the species is threatened by climate change in Benin. In order to limit damage such as decreased productivity and extinction of the species, appropriate solutions must be found.
Directory of Open Access Journals (Sweden)
Xuezhen Ge
As the primary pest of palm trees, Rhynchophorus ferrugineus (Olivier) (Coleoptera: Curculionidae) has caused serious harm to palms since it first invaded China. The present study used CLIMEX 1.1 to predict the potential distribution of R. ferrugineus in China according to both current climate data (1981-2010) and future climate warming estimates based on simulated climate data for the 2020s (2011-2040) provided by the Tyndall Centre for Climate Change Research (TYN SC 2.0). Additionally, the Ecoclimatic Index (EI) values calculated for the different climatic conditions (current and future, as simulated under the B2 scenario) were compared. Areas with a climate suitable for R. ferrugineus distribution were located primarily in central China according to the current climate data, with the northern boundary of the distribution reaching 40.1°N and including Tibet, north Sichuan, central Shaanxi, south Shanxi, and east Hebei. There was little difference among the potential distributions predicted by the four emission scenarios for future climate warming. The primary prediction under the future climate warming models was that, compared with the current climate model, the extent of highly favorable habitat would increase significantly and expand into northern China, whereas the extents of favorable and marginally favorable habitats would decrease. Contrast analysis of the EI values suggested that climate change and the density of site distribution were the main drivers of the changes in EI values. These results will help to improve control measures, prevent the spread of this pest, and revise the targeted quarantine areas.
Energy Technology Data Exchange (ETDEWEB)
Vallee, T.; Keller, Th. [Ecole Polytech Fed Lausanne, CCLab, CH-1015 Lausanne, (Switzerland); Fourestey, G. [Ecole Polytech Fed Lausanne, IACS, Chair Modeling and Sci Comp, CH-1015 Lausanne, (Switzerland); Fournier, B. [CEA SACLAY ENSMP, DEN, DANS, DMN, SRMA, LC2M, F-91191 Gif Sur Yvette, (France); Correia, J.R. [Univ Tecn Lisbon, Inst Super Tecn, Civil Engn and Architecture Dept, P-1049001 Lisbon, (Portugal)
2009-07-01
The Weibull distribution, used to describe the scaling of strength of materials, has been verified on a wide range of materials and geometries; however, the quality of the fit tended to be poorer towards the upper tail. Based on a previously developed probabilistic strength prediction method for adhesively bonded joints composed of pultruded glass fiber-reinforced polymer (GFRP) adherends, where it was verified that a two-parameter Weibull probabilistic distribution was not able to model accurately the upper tail of a material strength distribution, different improved probabilistic distributions were compared to enhance the quality of strength predictions. The following probabilistic distributions were examined: a two-parameter Weibull (as a reference), an m-fold Weibull, a Grafted Distribution, a Birnbaum-Saunders Distribution and a Generalized Lambda Distribution. The Generalized Lambda Distribution turned out to be the best analytical approximation for the strength data, providing a good fit to the experimental data and leading to more accurate joint strength predictions than the original two-parameter Weibull distribution. It was found that proper modeling of the upper tail leads to a noticeable increase in the quality of the predictions. (authors)
Stergiopoulos, Ch.; Stavrakas, I.; Triantis, D.; Vallianatos, F.; Stonham, J.
2015-02-01
Weak electric signals termed 'Pressure Stimulated Currents' (PSC) are generated and detected while cement-based materials are under mechanical load; they are related to the creation of cracks and the consequent evolution of the crack network in the bulk of the specimen. In the experiment, a set of cement mortar beams of rectangular cross-section was subjected to Three-Point Bending (3PB). For each specimen an abrupt mechanical load step was applied, increasing from a low load level (Lo) to a high final value (Lh), where Lh differed for each specimen and was maintained constant for a long time. The temporal behavior of the recorded PSC shows that during the load increase a spike-like PSC emission was recorded, followed by a relaxation of the PSC after it reached its final value. The relaxation process of the PSC was studied using non-extensive statistical physics (NESP) based on the Tsallis entropy. The behavior of the Tsallis q parameter was studied in relaxation PSCs in order to investigate its potential use as an index for monitoring the crack evolution process, with a potential application in non-destructive laboratory testing of cement-based specimens of unknown internal damage level. The dependence of the q parameter on Lh, where Lf represents the 3PB strength of the specimen, shows an increase in the q value when the specimens are subjected to gradually higher bending loads, reaching a maximum value close to 1.4 when the applied Lh exceeds 0.8Lf. When the applied Lh exceeds 0.9Lf, the value of the q parameter gradually decreases. This analysis of the experimental data demonstrates that the entropic index q exhibits a characteristic decrease as the ultimate strength of the specimen is approached, and thus could be used as a forerunner of the expected failure.
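The NESP relaxation analysis described above typically rests on the Tsallis q-exponential function, which generalizes exponential decay and recovers it as q → 1. A minimal sketch, with the functional form assumed from standard NESP usage rather than taken from this paper:

```python
import math


def q_exponential(t, i0, tau, q):
    """Tsallis q-exponential relaxation:

        I(t) = I0 * [1 + (q - 1) * t / tau] ** (1 / (1 - q))

    For q == 1 this reduces to the ordinary exponential I0 * exp(-t / tau).
    i0, tau and q here are illustrative fit parameters, not measured values.
    """
    if abs(q - 1.0) < 1e-9:
        return i0 * math.exp(-t / tau)
    return i0 * (1.0 + (q - 1.0) * t / tau) ** (1.0 / (1.0 - q))
```

Fitting this form to a recorded PSC relaxation curve yields the entropic index q that the abstract tracks against the applied load level Lh.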
Misnaza, Sandra Patricia; Roncancio, Claudia Patricia; Peña, Isabel Cristina; Prieto, Franklin Edwin
2016-09-01
During 2012, 13% of deaths worldwide in children under the age of 28 days were due to congenital malformations. In Colombia, congenital malformations are the second leading cause of infant mortality. Objective: To determine the geographical distribution of extended perinatal mortality due to congenital malformations in Colombia between 1999 and 2008. Materials and methods: We conducted a cross-sectional study. We revised all death certificates issued between 1999 and 2008. We defined perinatal mortality as fetal or non-fetal deaths within the first 28 days after delivery in children with body weight ≥500 grams, and congenital malformations according to ICD-10 diagnostic codes Q000-Q999. The annual birth projection was used as the denominator. We defined high-mortality areas due to congenital malformations as those in the 90th percentile. Results: We recorded 22,361 perinatal deaths due to congenital malformations. The following provinces exceeded the 90th perinatal mortality percentile: Antioquia, Caldas, Risaralda, Huila, Quindío, Bogotá, Valle del Cauca and Guainía. Among the municipalities, the highest perinatal mortality rates were found in Giraldo, Ciudad Bolívar, Riosucio, Liborina, Supía, Alejandría, Sopetrán, San Jerónimo, Santa Fe de Antioquia and Marmato (205.81 and 74.18 per 10,000 live births). The perinatal mortality rate due to malformations of the circulatory system was 28.1 per 10,000 live births, whereas the rates for central nervous system defects and chromosomal abnormalities were 13.7 and 7.0, respectively. The Andean region showed high perinatal mortality rates due to congenital malformations. There is an urgent need to identify possible risk factors of perinatal mortality and implement successive prevention programs in that particular region.
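The rates quoted above are expressed per 10,000 live births, and high-mortality areas are defined as those at or above the 90th percentile. Both calculations can be sketched directly; the area names and numbers below are hypothetical, not the study's data:

```python
from statistics import quantiles


def mortality_rate_per_10k(deaths, live_births):
    """Perinatal mortality rate per 10,000 live births."""
    return 10_000 * deaths / live_births


def high_mortality_areas(rates_by_area):
    """Flag areas at or above the 90th percentile of mortality rates."""
    p90 = quantiles(rates_by_area.values(), n=10)[-1]  # last cut point = 90th percentile
    return {area for area, rate in rates_by_area.items() if rate >= p90}
```

Applied province by province, this reproduces the "exceeded the 90th percentile" selection used in the abstract.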
Digital Repository Service at National Institute of Oceanography (India)
Jayalakshmy, K.V.; Rao, K.K.
A study of planktonic foraminiferal assemblages from 19 stations in the neritic and oceanic regions off the Coromandel Coast, Bay of Bengal has been made using a multivariate statistical method known as factor analysis. On the basis of abundance...
Directory of Open Access Journals (Sweden)
Xinwei Wang
2016-11-01
Sandwich structures are widely used in practice, and various engineering theories adopting simplifying assumptions are therefore available. However, most engineering theories of beams, plates and shells cannot recover all stresses accurately through their constitutive equations. Therefore, the soft core is directly modeled by two-dimensional (2D) elasticity theory without any pre-assumption on the displacement field. The top and bottom faces act like elastic supports on the top and bottom edges of the core. The differential equations of the 2D core are then solved by the harmonic differential quadrature method (HDQM). To circumvent the difficulties in dealing with locally distributed loads in point discrete methods such as the HDQM, a general and rigorous way is proposed to treat the locally distributed load. Detailed formulations are provided. The static behavior of sandwich panels under different locally distributed loads is investigated. For verification, results are compared with data obtained by ABAQUS with very fine meshes. A high degree of accuracy in both displacement and stress has been observed.
Allman, Elizabeth S; Degnan, James H; Rhodes, John A
2011-06-01
Gene trees are evolutionary trees representing the ancestry of genes sampled from multiple populations. Species trees represent populations of individuals, each with many genes, splitting into new populations or species. The coalescent process, which models the ancestry of gene copies within populations, is often used to model the probability distribution of gene trees given a fixed species tree. This multispecies coalescent model provides a framework for phylogeneticists to infer species trees from gene trees using maximum likelihood or Bayesian approaches. Because the coalescent models a branching process over time, all trees are typically assumed to be rooted in this setting. Often, however, gene trees inferred by traditional phylogenetic methods are unrooted. We investigate probabilities of unrooted gene trees under the multispecies coalescent model. We show that when there are four species with one gene sampled per species, the distribution of unrooted gene tree topologies identifies the unrooted species tree topology and some, but not all, of the information in the species tree edges (branch lengths). The location of the root on the species tree is not identifiable in this situation. However, for five or more species with one gene sampled per species, we show that the distribution of unrooted gene tree topologies identifies the rooted species tree topology and all its internal branch lengths. The length of any pendant branch leading to a leaf of the species tree is also identifiable for any species from which more than one gene is sampled.
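The number of tree topologies underlying such distributions grows very quickly with the number of species. As standard background (not a result of this paper), the counts of unrooted and rooted binary topologies on n labeled leaves follow the double-factorial formulas (2n-5)!! and (2n-3)!!:

```python
def num_unrooted_topologies(n):
    """Number of unrooted binary tree topologies on n labeled leaves: (2n-5)!!"""
    count = 1
    for k in range(3, 2 * n - 4, 2):  # product 3 * 5 * ... * (2n-5)
        count *= k
    return count


def num_rooted_topologies(n):
    """Rooted binary topologies: (2n-3)!!, since each unrooted tree can be
    rooted on any of its 2n-3 edges."""
    return num_unrooted_topologies(n) * (2 * n - 3)
```

For four species there are only 3 unrooted topologies, which is why the four-taxon case in the abstract carries limited information compared with five or more taxa.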
Fruit yield and root system distribution of 'Tommy Atkins' mango under different irrigation regimes
Directory of Open Access Journals (Sweden)
Marcelo R. dos Santos
2014-04-01
This study aimed to evaluate the fruit yield and the root system distribution of 'Tommy Atkins' mango under different irrigation regimes in the semiarid region of Bahia. The experimental design was completely randomized with five treatments and three replicates: 1 - irrigation supplying 100% of ETc in phases I, II and III; 2 - regulated deficit irrigation (RDI) supplying 50% of ETc in phase I (beginning of flowering to early fruit growth); 3 - RDI supplying 50% of ETc in phase II (start of fruit expansion until the beginning of physiological maturity); 4 - RDI supplying 50% of ETc in phase III (physiologically mature fruits); 5 - no irrigation during all three phases. Regulated deficit irrigation supplying 50% of ETc during phases I and II provided the largest root length density of 'Tommy Atkins' mango. Regardless of the management strategy, roots developed throughout the evaluated soil volume, with the highest density concentrated 0.50 to 1.50 m from the trunk and at 0.20 to 0.90 m depth in the soil, suggesting this region as the best place for fertilizer application as well as for soil water sensor placement. The application of RDI during fruit set influences neither root distribution nor production. Root system and crop production are significantly reduced under no-irrigation conditions.
The potential distribution of bioenergy crops in Europe under present and future climate
International Nuclear Information System (INIS)
Tuck, Gill; Glendining, Margaret J.; Smith, Pete; Wattenbach, Martin; House, Jo I.
2006-01-01
We have derived maps of the potential distribution of 26 promising bioenergy crops in Europe, based on simple rules for suitable climatic conditions and elevation. Crops suitable for temperate and Mediterranean climates were selected from four groups: oilseeds (e.g. oilseed rape, sunflower), starch crops (e.g. potatoes), cereals (e.g. barley) and solid biofuel crops (e.g. sorghum, Miscanthus). The impact of climate change under different scenarios and GCMs on the potential future distribution of these crops was determined, based on predicted future climatic conditions. Climate scenarios based on four IPCC SRES emission scenarios, A1FI, A2, B1 and B2, implemented by four global climate models, HadCM3, CSIRO2, PCM and CGCM2, were used. The potential distribution of temperate oilseeds, cereals, starch crops and solid biofuels is predicted to increase in northern Europe by the 2080s, due to increasing temperatures, and decrease in southern Europe (e.g. Spain, Portugal, southern France, Italy, and Greece) due to increased drought. Mediterranean oil and solid biofuel crops, currently restricted to southern Europe, are predicted to extend further north due to higher summer temperatures. Effects become more pronounced with time and are greatest under the A1FI scenario and for models predicting the greatest climate forcing. Different climate models produce different regional patterns. All models predict that bioenergy crop production in Spain is especially vulnerable to climate change, with many temperate crops predicted to decline dramatically by the 2080s. The choice of bioenergy crops in southern Europe will be severely reduced in future unless measures are taken to adapt to climate change. (author)
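The mapping approach described above applies simple climatic and elevation rules per grid cell. A minimal sketch of such a rule follows; every threshold here is an illustrative placeholder, not one of the paper's crop-specific limits:

```python
def crop_suitable(mean_temp_c, annual_precip_mm, elevation_m,
                  t_range=(6.0, 18.0), min_precip=500.0, max_elev=800.0):
    """Rule-based suitability for one grid cell.

    The temperature window, precipitation floor and elevation cap are
    hypothetical defaults standing in for a crop's real climatic limits.
    """
    return (t_range[0] <= mean_temp_c <= t_range[1]
            and annual_precip_mm >= min_precip
            and elevation_m <= max_elev)
```

Re-evaluating the same rule on climate-model output for the 2080s is then what shifts the predicted suitable area north, as the abstract describes.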
International Nuclear Information System (INIS)
Reed, Donald Timothy; Borkowski, Marian; Lucchini, Jean-Francois; Ams, David; Richmann, M.K.; Khaing, H.; Swanson, J.S.
2010-01-01
The fate and potential mobility of multivalent actinides in the subsurface is receiving increased attention as the DOE looks to clean up the many legacy nuclear waste sites and associated subsurface contamination. Plutonium, uranium and neptunium are the near-surface multivalent contaminants of concern and are also key contaminants for the deep geologic disposal of nuclear waste. Their mobility is highly dependent on their redox distribution at their contamination source as well as along their potential migration pathways. This redox distribution is often controlled, especially in the near-surface where organic/inorganic contaminants often coexist, by the direct and indirect effects of microbial activity. Under anoxic conditions, indirect and direct bioreduction mechanisms exist that promote the prevalence of lower-valent species for multivalent actinides. Oxidation-state-specific biosorption is also an important consideration for long-term migration and can influence oxidation state distribution. Results of ongoing studies to explore and establish the oxidation-state-specific interactions of soil bacteria (metal reducers and sulfate reducers) as well as halo-tolerant bacteria and Archaea for uranium, neptunium and plutonium will be presented. Enzymatic reduction is a key process in the bioreduction of plutonium and uranium, but co-enzymatic processes predominate in neptunium systems. Strong sorptive interactions can occur for most actinide oxidation states but are likely a factor in the stabilization of lower-valent species when more than one oxidation state can persist under anaerobic microbiologically-active conditions. These results for microbiologically active systems are interpreted in the context of their overall importance in defining the potential migration of multivalent actinides in the subsurface.
Vlad, Marcel Ovidiu; Tsuchiya, Masa; Oefner, Peter; Ross, John
2002-01-01
We investigate the statistical properties of systems with random chemical composition and try to obtain a theoretical derivation of the self-similar Dirichlet distribution, which is used empirically in molecular biology, environmental chemistry, and geochemistry. We consider a system made up of many chemical species and assume that the statistical distribution of the abundance of each chemical species in the system is the result of a succession of a variable number of random dilution events, which can be described by using the renormalization-group theory. A Bayesian approach is used for evaluating the probability density of the chemical composition of the system in terms of the probability densities of the abundances of the different chemical species. We show that for large cascades of dilution events, the probability density of the composition vector of the system is given by a self-similar probability density of the Dirichlet type. We also give an alternative formal derivation for the Dirichlet law based on the maximum entropy approach, by assuming that the average values of the chemical potentials of different species, expressed in terms of molar fractions, are constant. Although the maximum entropy approach leads formally to the Dirichlet distribution, it does not clarify the physical origin of the Dirichlet statistics and has serious limitations. The random theory of dilution provides a physical picture for the emergence of Dirichlet statistics and makes it possible to investigate its validity range. We discuss the implications of our theory in molecular biology, geochemistry, and environmental science.
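A Dirichlet composition vector of the kind discussed above can be generated by the standard construction of normalizing independent Gamma draws; the parameter values below are arbitrary examples:

```python
import random


def sample_dirichlet(alphas, rng=random):
    """Draw a composition vector from a Dirichlet distribution by normalizing
    independent Gamma(alpha_i, 1) variates. The result is a vector of
    non-negative fractions summing to one, as required of a chemical
    composition expressed in molar fractions."""
    gammas = [rng.gammavariate(a, 1.0) for a in alphas]
    total = sum(gammas)
    return [g / total for g in gammas]
```

Repeated sampling with fixed alphas gives the kind of abundance ensembles whose limiting behavior the renormalization-group argument above characterizes.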
Nick, Todd G
2007-01-01
Statistics is defined by the Medical Subject Headings (MeSH) thesaurus as the science and art of collecting, summarizing, and analyzing data that are subject to random variation. The two broad categories of summarizing and analyzing data are referred to as descriptive and inferential statistics. This chapter considers the science and art of summarizing data where descriptive statistics and graphics are used to display data. In this chapter, we discuss the fundamentals of descriptive statistics, including describing qualitative and quantitative variables. For describing quantitative variables, measures of location and spread, for example the standard deviation, are presented along with graphical presentations. We also discuss distributions of statistics, for example the variance, as well as the use of transformations. The concepts in this chapter are useful for uncovering patterns within the data and for effectively presenting the results of a project.
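The measures of location and spread discussed above can be computed directly with the standard library; the data here are hypothetical:

```python
from statistics import mean, median, stdev, variance

# A small hypothetical sample of a quantitative variable.
data = [4.2, 3.9, 5.1, 4.7, 4.4, 6.0, 3.8]

summary = {
    "n": len(data),
    "mean": mean(data),          # measure of location
    "median": median(data),      # robust measure of location
    "sd": stdev(data),           # sample standard deviation (spread)
    "variance": variance(data),  # sample variance (spread)
}
```

Such a summary table, often paired with a histogram or box plot, is the basic descriptive-statistics output the chapter describes.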
Wijngaarden, R.J.; Westerterp, K.R.
1992-01-01
Pellet heat and mass transfer coefficients inside packed beds do not have definite deterministic values, but are stochastic quantities with a certain distribution. Here, a method is presented to incorporate the stochastic distribution of pellet properties in reactor design and operation models. The
International Nuclear Information System (INIS)
Fanelli, Pierluigi; Ubertini, Stefano; Biscarini, Chiara; Jannelli, Elio; Ubertini, Filippo
2017-01-01
Various mechanical, ocean, aerospace and civil engineering problems involve solid bodies impacting the water surface and often result in complex coupled dynamics, characterized by impulsive loading conditions, high amplitude vibrations and large local deformations. Monitoring in such problems for purposes such as remaining fatigue life estimation and real time damage detection is a technical and scientific challenge of primary concern in this context. Open issues include the need for developing distributed sensing systems able to operate at very high acquisition frequencies, to be utilized to study rapidly varying strain fields, with high resolution and very low noise, while scientific challenges mostly relate to the definition of appropriate signal processing and modeling tools enabling the extraction of useful information from distributed sensing signals. Building on previous work by some of the authors, we propose an enhanced method for real time deformed shape reconstruction using distributed FBG strain measurements in curved bodies subjected to impulsive loading and we establish a new framework for applying this method for structural health monitoring purposes, as the main focus of the work. Experiments are carried out on a cylinder impacting the water at various speeds, proving improved performance in displacement reconstruction of the enhanced method compared to its previous version. A numerical study is then carried out considering the same physical problem with different delamination damages affecting the body. The potential for detecting, localizing and quantifying this damage using the reconstruction algorithm is thoroughly investigated. Overall, the results presented in the paper show the potential of distributed FBG strain measurements for real time structural health monitoring of curved bodies under impulsive hydrodynamic loading, defining damage sensitive features in terms of strain or displacement reconstruction errors at selected locations along
Energy statistics yearbook 2002
International Nuclear Information System (INIS)
2005-01-01
The Energy Statistics Yearbook 2002 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-sixth in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities
Energy statistics yearbook 2001
International Nuclear Information System (INIS)
2004-01-01
The Energy Statistics Yearbook 2001 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-fifth in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities
Energy statistics yearbook 2000
International Nuclear Information System (INIS)
2002-01-01
The Energy Statistics Yearbook 2000 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-third in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities
Paliwal, Bhasker
brittle materials. The model incorporates pre-existing defect distributions and a crack growth law. The damage is defined as a scalar parameter which is a function of the micro-crack density, the evolution of which is a function of the existing defect distribution and the crack growth dynamics. A specific case of uniaxial compressive loading under a constant strain rate has been studied to predict the effects of the strain rate, defect distribution and crack growth dynamics on the constitutive response and failure behavior of brittle materials. Finally, the effects of crack growth dynamics on the strain-rate sensitivity of brittle materials are studied with the help of the micro-mechanical damage model. The results are compared with the experimentally observed damage evolution and the rate-sensitive behavior of the compressive strength of several engineering ceramics. The dynamic failure of armor-grade hot-pressed boron carbide (B4C) under loading rates of ~5×10⁻⁶ to 200 MPa/μs is also discussed.
Kauweloa, Kevin I; Gutierrez, Alonso N; Stathakis, Sotirios; Papanikolaou, Niko; Mavroidis, Panayiotis
2016-07-01
A toolkit has been developed for calculating the 3-dimensional biological effective dose (BED) distributions in multi-phase, external beam radiotherapy treatments such as those applied in liver stereotactic body radiation therapy (SBRT) and in multi-prescription treatments. This toolkit also provides a wide range of statistical results related to dose and BED distributions. MATLAB 2010a, version 7.10 was used to create this GUI toolkit. The input data consist of the dose distribution matrices, organ contour coordinates, and treatment planning parameters from the treatment planning system (TPS). The toolkit has the capability of calculating the multi-phase BED distributions using different formulas (denoted as true and approximate). Following the calculations of the BED distributions, the dose and BED distributions can be viewed in different projections (e.g. coronal, sagittal and transverse). The different elements of this toolkit are presented and the important steps for the execution of its calculations are illustrated. The toolkit is applied on brain, head & neck and prostate cancer patients, who received primary and boost phases in order to demonstrate its capability in calculating BED distributions, as well as measuring the inaccuracy and imprecision of the approximate BED distributions. Finally, the clinical situations in which the use of the present toolkit would have a significant clinical impact are indicated.
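BED calculations of this kind are conventionally based on the linear-quadratic model, BED = n·d·(1 + d/(α/β)). A minimal per-phase sketch under that assumption (the toolkit's exact "true" and "approximate" multi-phase formulas may differ):

```python
def bed(n_fractions, dose_per_fraction, alpha_beta):
    """Biologically effective dose for one phase (linear-quadratic model):

        BED = n * d * (1 + d / (alpha/beta))

    with dose in Gy and alpha_beta the tissue's alpha/beta ratio in Gy."""
    return n_fractions * dose_per_fraction * (1.0 + dose_per_fraction / alpha_beta)


def bed_multiphase(phases, alpha_beta):
    """Simple multi-phase approximation: sum the per-phase BEDs for a voxel.

    `phases` is a list of (n_fractions, dose_per_fraction) pairs."""
    return sum(bed(n, d, alpha_beta) for n, d in phases)
```

Applying `bed_multiphase` voxel by voxel over the TPS dose matrices yields a 3-D BED distribution of the kind the toolkit visualizes.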
International Nuclear Information System (INIS)
Ochiai, S; Matsubayashi, H; Okuda, H; Osamura, K; Otto, A; Malozemoff, A
2009-01-01
Distributions of local and overall critical currents and the correlation of the n value to the critical current of bent Bi2223 composite tape were studied from a statistical viewpoint. Data on the local and overall transport critical currents and n values of the Bi2223 composite tape specimens were collected experimentally over a wide range of bending strain (0-1.1%), using specimens designed to characterize the local and overall critical currents and n values. The measured local and overall critical currents were analyzed with various types of Weibull distribution function. The analysis revealed which of the Weibull distribution functions is suitable for describing the distribution of local and overall critical currents at each bending strain, and how much the Weibull parameter values characterizing the distribution vary with bending strain. We then attempted to reproduce the overall critical current distribution and the correlation of the overall n value to the overall critical current from the distribution of local critical currents and the correlation of the local n value to the local critical current by a Monte Carlo simulation. The measured average values of the critical current and n value at each bending strain and the correlation of the n value to the critical current were reproduced well by the present simulation, while the distribution of critical current values was reproduced fairly well but not fully. The reason for this is discussed.
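A Monte Carlo reproduction of the kind described above can be sketched by sampling local critical currents from a fitted Weibull distribution. Here a weakest-link rule (overall current limited by the worst local segment) is assumed purely for illustration, and the shape and scale values are placeholders, not fitted tape parameters:

```python
import math
import random


def sample_weibull(shape, scale, rng):
    """Inverse-transform sampling from a two-parameter Weibull distribution."""
    return scale * (-math.log(1.0 - rng.random())) ** (1.0 / shape)


def overall_critical_current(n_segments, shape, scale, rng):
    """Assumed weakest-link rule: the overall critical current is limited by
    the worst of the tape's local segments."""
    return min(sample_weibull(shape, scale, rng) for _ in range(n_segments))


rng = random.Random(42)
samples = [overall_critical_current(10, shape=8.0, scale=100.0, rng=rng)
           for _ in range(2000)]
```

Repeating this at each bending strain, with the locally fitted Weibull parameters, builds a simulated overall distribution to compare against the measured one.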
Privacy-Preserving k-Means Clustering under Multiowner Setting in Distributed Cloud Environments
Directory of Open Access Journals (Sweden)
Hong Rong
2017-01-01
With the advent of big data era, clients who lack computational and storage resources tend to outsource data mining tasks to cloud service providers in order to improve efficiency and reduce costs. It is also increasingly common for clients to perform collaborative mining to maximize profits. However, due to the rise of privacy leakage issues, the data contributed by clients should be encrypted using their own keys. This paper focuses on privacy-preserving k-means clustering over the joint datasets encrypted under multiple keys. Unfortunately, existing outsourcing k-means protocols are impractical because not only are they restricted to a single key setting, but also they are inefficient and nonscalable for distributed cloud computing. To address these issues, we propose a set of privacy-preserving building blocks and outsourced k-means clustering protocol under Spark framework. Theoretical analysis shows that our scheme protects the confidentiality of the joint database and mining results, as well as access patterns under the standard semihonest model with relatively small computational overhead. Experimental evaluations on real datasets also demonstrate its efficiency improvements compared with existing approaches.
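The computation being protected is ordinary k-means clustering. A plaintext Lloyd's-algorithm sketch of that baseline (not the paper's encrypted protocol; initialization is deliberately simplified to the first k points):

```python
import math


def kmeans(points, k, iters=20):
    """Plain Lloyd's k-means on small 2-D datasets -- the computation the
    privacy-preserving protocol performs over encrypted, multi-key data."""
    centers = list(points[:k])  # simple deterministic initialization
    for _ in range(iters):
        # Assignment step: each point joins its nearest center's cluster.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: math.dist(p, centers[i]))
            clusters[nearest].append(p)
        # Update step: each center moves to its cluster's mean.
        centers = [
            tuple(sum(coord) / len(cl) for coord in zip(*cl)) if cl else centers[i]
            for i, cl in enumerate(clusters)
        ]
    return centers


# Two well-separated blobs should yield centers near each blob's mean.
centers = kmeans([(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)], k=2)
```

Both the distance comparisons of the assignment step and the averaging of the update step are what the paper's building blocks must evaluate over ciphertexts.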
Predicting the Potential Distribution of Polygala tenuifolia Willd. under Climate Change in China.
Directory of Open Access Journals (Sweden)
Hongjun Jiang
Global warming has created opportunities and challenges for the survival and development of species. Determining how climate change may impact multiple ecosystem levels and lead to various species adaptations is necessary for both biodiversity conservation and sustainable biological resource utilization. In this study, we employed Maxent to predict changes in the habitat range and altitude of Polygala tenuifolia Willd. under current and future climate scenarios in China. Four representative concentration pathways (RCP2.6, RCP4.5, RCP6.0, and RCP8.5) were modeled for two time periods (2050 and 2070). The model inputs included 732 presence points and nine sets of environmental variables under the current conditions and the four RCPs in 2050 and 2070. The area under the receiver-operating characteristic (ROC) curve (AUC) was used to evaluate model performance. All of the AUCs were greater than 0.80, thereby placing these models in the "very good" category. Using a jackknife analysis, the precipitation in the warmest quarter, annual mean temperature, and altitude were found to be the top three variables that affect the range of P. tenuifolia. Additionally, we found that the predicted highly suitable habitat was in reasonable agreement with its actual distribution. Furthermore, the highly suitable habitat area was slowly reduced over time.
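The AUC used above equals the probability that a randomly chosen presence point receives a higher suitability score than a randomly chosen background point. A brute-force sketch of that rank interpretation (adequate for small score lists; the example scores are hypothetical):

```python
def auc(pos_scores, neg_scores):
    """ROC AUC via the Mann-Whitney statistic: the fraction of
    (presence, background) pairs in which the presence point outscores
    the background point, counting ties as half."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))
```

An AUC above 0.80, as reported for the Maxent models, means a presence point outranks a background point in over 80% of such pairs.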
Directory of Open Access Journals (Sweden)
Qing Shuang
2016-01-01
The stability of water service is a central concern in industrial production, public safety, and academic research. This paper establishes a service evaluation model for the water distribution network (WDN). Serviceability is measured in three aspects: (1) the functionality of structural components under a disaster environment; (2) the recognition of the cascading failure process; and (3) the calculation of system reliability. Node and edge failures in the WDN are interrelated under seismic excitations. The cascading failure process is modeled through the balance of water supply and demand. The matrix-based system reliability (MSR) method is used to represent the system events and calculate the non-failure probability. An example is used to illustrate the proposed method. Cascading failure processes with different node failures are simulated, the serviceability is analyzed, and the critical node can be identified. The results show that an aged network has a greater influence on system service under a seismic scenario. Maintenance could improve the disaster resistance of the WDN. Priority should be given to controlling the time between the initial failure and the first secondary failure, since taking post-disaster emergency measures within this time period can largely cut down the spread of the cascade effect in the whole WDN.
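For very small networks, the system non-failure probability that the MSR method computes can be obtained by direct enumeration of component states; a minimal sketch (the MSR method itself is a more scalable matrix formulation, and the failure probabilities here are hypothetical):

```python
from itertools import product


def system_nonfailure_probability(fail_probs, system_fails):
    """Exact system reliability by enumerating all component failure states.

    `fail_probs[i]` is component i's failure probability; `system_fails(state)`
    returns True if the system fails for a tuple of 0/1 failure indicators."""
    total = 0.0
    for state in product([0, 1], repeat=len(fail_probs)):
        p = 1.0
        for s, pf in zip(state, fail_probs):
            p *= pf if s else (1.0 - pf)
        if not system_fails(state):
            total += p
    return total


# Example: a two-pipe parallel supply fails only if both pipes fail.
rel = system_nonfailure_probability([0.1, 0.2], lambda s: all(s))
```

Swapping in a `system_fails` predicate that encodes the supply-demand balance turns the same enumeration into a (brute-force) serviceability check.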
International Nuclear Information System (INIS)
Zhixiang, Z.
1983-01-01
A least-squares fit using the chi-squared distribution function has been performed for all available evaluated data on s-wave reduced neutron widths of several nuclei, yielding the number of degrees of freedom and the average value. Missing levels of weak s-wave resonances and extra p-wave levels have been taken into account where present. For 75As and 103Rh, the s-wave population was separated by Bayes' theorem before making the fit. The results thus obtained are consistent with the Porter-Thomas distribution, i.e. a chi-squared distribution with ν = 1, as one would expect. This work does not confirm the report by H.C. Sharma et al. (1976), at the international conference on interactions of neutrons with nuclei, that the number of degrees of freedom for the distribution of s-wave reduced neutron widths might be greater than one. (Auth.)
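The degrees-of-freedom parameter of a scaled chi-squared family can be recovered from the first two moments: for mean m the variance is 2m²/ν, so ν̂ = 2m²/var. A sketch on synthetic Porter-Thomas widths (squared standard normals, so the true ν is 1); this moment estimator is a simple stand-in for the least-squares fit used in the paper:

```python
import random

# Sketch: moment estimate of the chi-squared degrees of freedom nu from a set
# of reduced neutron widths. For a chi-squared family scaled to mean m, the
# variance is 2*m**2/nu, so nu_hat = 2*mean**2/variance. Synthetic
# Porter-Thomas (nu = 1) widths are generated as squared standard normals.

random.seed(42)
widths = [random.gauss(0.0, 1.0) ** 2 for _ in range(20000)]

mean = sum(widths) / len(widths)
var = sum((w - mean) ** 2 for w in widths) / len(widths)
nu_hat = 2.0 * mean ** 2 / var
print(nu_hat)  # should land near 1, as Porter-Thomas predicts
```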
Tygert, Mark
2010-09-21
We discuss several tests for determining whether a given set of independent and identically distributed (i.i.d.) draws does not come from a specified probability density function. The most commonly used are Kolmogorov-Smirnov tests, particularly Kuiper's variant, which focus on discrepancies between the cumulative distribution function for the specified probability density and the empirical cumulative distribution function for the given set of i.i.d. draws. Unfortunately, variations in the probability density function often get smoothed over in the cumulative distribution function, making it difficult to detect discrepancies in regions where the probability density is small in comparison with its values in surrounding regions. We discuss tests without this deficiency, complementing the classical methods. The tests of the present paper are based on the plain fact that it is unlikely to draw a random number whose probability is small, provided that the draw is taken from the same distribution used in calculating the probability (thus, if we draw a random number whose probability is small, then we can be confident that we did not draw the number from the same distribution used in calculating the probability).
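The classical statistics mentioned above are straightforward to compute: the Kolmogorov-Smirnov statistic is the largest gap between the empirical and specified CDFs, and Kuiper's variant sums the largest gaps above and below. A self-contained sketch against the uniform CDF, with an illustrative sample:

```python
# Sketch: Kolmogorov-Smirnov and Kuiper statistics for i.i.d. draws against a
# specified CDF. The uniform CDF and the sample below are illustrative only.

def ks_and_kuiper(draws, cdf):
    xs = sorted(draws)
    n = len(xs)
    d_plus = max((i + 1) / n - cdf(x) for i, x in enumerate(xs))   # empirical above CDF
    d_minus = max(cdf(x) - i / n for i, x in enumerate(xs))        # empirical below CDF
    return max(d_plus, d_minus), d_plus + d_minus  # (KS statistic, Kuiper statistic)

sample = [0.05, 0.1, 0.15, 0.2, 0.9]            # clustered low: suspicious under U(0,1)
ks, kuiper = ks_and_kuiper(sample, lambda x: x)  # U(0,1) CDF on [0, 1]
print(ks, kuiper)  # prints 0.6 0.7
```

As the abstract notes, both statistics can miss a density that is locally too small or too large while the CDF stays close to its target, which motivates the complementary tests of the paper.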
Distribution of light in the human retina under natural viewing conditions
Gibert, Jorge C.
Age-related macular degeneration (AMD) is the leading cause of blindness in America. The fact that AMD wreaks most of its damage in the center of the retina raises the question of whether light, integrated over long periods, is more concentrated in the macula. A method, based on eye-tracking, was developed to measure the distribution of light in the retina under natural viewing conditions. The hypothesis was that, integrated over time, retinal illumination peaks in the macula. Additionally, a possible relationship between age and retinal illumination was investigated. The eye tracker superimposed the subject's gaze position on a video recorded by a scene camera. Five informed subjects were employed in feasibility tests, and 58 naive subjects participated in 5 phases. In phase 1 the subjects viewed a gray-scale image. In phase 2, they observed a sequence of photographic images. In phase 3 they viewed a video. In phase 4, they worked on a computer; in phase 5, the subjects walked around freely. The informed subjects were instructed to gaze at bright objects in the field of view and then at dark objects. Naive subjects were allowed to gaze freely for all phases. Using the subject's gaze coordinates and the video provided by the scene camera, the cumulative light distribution on the retina was calculated for ~15° around the fovea. As expected for control subjects, cumulative retinal light distributions peaked and dipped in the fovea when they gazed at bright or dark objects, respectively. The light distribution maps obtained from the naive subjects presented a tendency to peak in the macula for phases 1, 2, and 3, a consistent tendency in phase 4, and a variable tendency in phase 5. The feasibility of using an eye-tracker system to measure the distribution of light in the retina was demonstrated, thus helping to understand the role played by light exposure in the etiology of AMD. Results showed that a tendency for light to peak in the macula is a characteristic of some
International Nuclear Information System (INIS)
Dučić, Tanja; Borchert, Manuela; Savić, Aleksandar; Kalauzi, Aleksandar; Mitrović, Aleksandra; Radotić, Ksenija
2013-01-01
Synchrotron-radiation-based X-ray microfluorescence has been used for in situ investigation of the distribution of micronutrient and macronutrient elements in an unstained cross section of a stem of the monocotyledonous liana plant Dioscorea balcanica Košanin. The elemental allocation has been quantified and the grouping/co-localization in straight and twisted stem internodes has been analysed. Synchrotron-based X-ray microfluorescence (µSXRF) is an analytical method suitable for in situ investigation of the distribution of micronutrient and macronutrient elements in several-micrometres-thick unstained biological samples, e.g. single cells and tissues. Elements are mapped and quantified at sub-p.p.m. concentrations. In this study the quantity, distribution and grouping/co-localization of various elements have been identified in straight and twisted internodes of the stems of the monocotyledonous climber D. balcanica Košanin. Three different statistical methods were employed to analyse the macronutrient and micronutrient distributions and co-localization. Macronutrient elements (K, P, Ca, Cl) are distributed homogeneously in both straight and twisted internodes. Micronutrient elements are mostly grouped in the vasculature and in the sclerenchyma cell layer. In addition, co-localization of micronutrient elements is much more prominent in twisted than in straight internodes. These image analyses and statistical methods provided very similar outcomes and could be applied to various types of biological samples imaged by µSXRF.
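One common co-localization measure for paired element maps is the Pearson correlation of pixel intensities. A minimal sketch; the element choice (Fe vs. Zn) and the pixel values are hypothetical, not data from the study:

```python
import math

# Sketch: Pearson correlation between two element maps as a simple
# co-localization measure of the kind used to compare micronutrient
# distributions. The pixel intensities below are hypothetical.

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    return cov / math.sqrt(sum((x - ma) ** 2 for x in a)
                           * sum((y - mb) ** 2 for y in b))

fe_map = [1.0, 2.0, 8.0, 9.0, 1.5]  # e.g. Fe counts along a line of pixels
zn_map = [1.2, 2.1, 7.5, 8.8, 1.4]  # e.g. Zn counts at the same pixels
print(pearson(fe_map, zn_map))  # near 1 indicates strong co-localization
```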
Flexible voltage support control for three-phase distributed generation inverters under grid fault
DEFF Research Database (Denmark)
Camacho, Antonio; Castilla, Miguel; Miret, Jaume
2013-01-01
Operators describe the behavior of the energy source, regulating voltage limits and reactive power injection to remain connected and support the grid under fault. On the basis that different kinds of voltage sags require different voltage support strategies, a flexible control scheme for three-phase grid-connected inverters is proposed. In three-phase balanced voltage sags, the inverter should inject reactive power in order to raise the voltage in all phases. In one- or two-phase faults, the main concern of the distributed generation inverter is to equalize voltages by reducing the negative symmetric sequence and clearing the phase jump. Due to system limitations, a balance between these two extreme policies is mandatory. Thus, overvoltage and undervoltage can be avoided, and the proposed control scheme prevents disconnection while achieving the desired voltage support service. The main contribution ...
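The positive- and negative-sequence quantities that such a controller acts on come from the standard Fortescue decomposition of the three phase voltages. A minimal sketch; the sag (phase a dropped to 0.5 p.u.) is illustrative, not a case from the paper:

```python
import cmath

# Sketch: Fortescue decomposition of three-phase voltages into positive- and
# negative-sequence components. The sag below (phase a at 0.5 p.u.) is
# illustrative; unbalance shows up as a nonzero negative-sequence magnitude.

a = cmath.exp(2j * cmath.pi / 3)  # 120-degree rotation operator

def sequences(va, vb, vc):
    v_pos = (va + a * vb + a * a * vc) / 3
    v_neg = (va + a * a * vb + a * vc) / 3
    return v_pos, v_neg

va = 0.5                            # faulted phase, 0.5 p.u.
vb = cmath.exp(-2j * cmath.pi / 3)  # healthy phase
vc = cmath.exp(2j * cmath.pi / 3)   # healthy phase
v_pos, v_neg = sequences(va, vb, vc)
print(abs(v_pos), abs(v_neg))  # prints 0.8333... 0.1666...
```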
Directory of Open Access Journals (Sweden)
Manna S.K.
2008-01-01
In this paper, we consider the problem of simultaneous determination of retail price and lot size (RPLS) under the assumption that the supplier offers a fixed credit period to the retailer. It is assumed that the item in stock deteriorates over time at a rate that follows a two-parameter Weibull distribution and that the price-dependent demand is represented by a constant-price-elasticity function of retail price. The RPLS decision model is developed and solved analytically. Results are illustrated with the help of a base example. Computational results show that the supplier earns more profit when the credit period is greater than the replenishment cycle length. Sensitivity of the solution to changes in the values of the input parameters of the base example is also discussed.
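The two building blocks named above are a Weibull deterioration rate, θ(t) = αβt^(β-1), and constant-price-elasticity demand, D(p) = k·p^(-e). A minimal sketch with hypothetical parameter values (not the paper's base example):

```python
# Sketch of the two RPLS model ingredients: a two-parameter Weibull
# deterioration rate theta(t) = alpha * beta * t**(beta - 1), and a
# constant-price-elasticity demand D(p) = k * p**(-e).
# All parameter values are hypothetical.

def deterioration_rate(t, alpha=0.02, beta=1.5):
    """Instantaneous fraction of stock deteriorating at time t."""
    return alpha * beta * t ** (beta - 1)

def demand(price, k=1000.0, elasticity=1.8):
    """Demand rate as a constant-elasticity function of retail price."""
    return k * price ** (-elasticity)

print(deterioration_rate(4.0))  # rate grows with time when beta > 1
print(demand(10.0))             # demand falls as price rises
```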
Hosting Capacity of Solar Photovoltaics in Distribution Grids under Different Pricing Schemes
DEFF Research Database (Denmark)
Carollo, Riccardo; Chaudhary, Sanjay Kumar; Pillai, Jayakrishnan Radhakrishna
2015-01-01
Most of the solar photovoltaic (SPV) installations are connected to distribution networks. The majority of these systems are represented by single-phase rooftop SPVs connected to residential low voltage (LV) grids. The large SPV shares lead to grid integration issues such as voltage rise, overloading of the network components, voltage phase unbalance etc. A rapid expansion of Electric Vehicles (EVs) technology is estimated, whose connection is also expected to take place in the LV networks. EVs might represent a possible solution to the SPV integration issues as they can be used as fast ... The results show that with the present TOU tariffs the EV integration in LV networks does not ease the grid bottlenecks for large PV penetration. Under the Net metering and DLMP, the EV integration in LV grids tends to increase the PV hosting capacity.
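Voltage rise, the first bottleneck named above, is often screened with the first-order feeder estimate ΔV ≈ (R·P + X·Q)/V. A minimal sketch; the feeder impedance and injected power are hypothetical:

```python
# Sketch: first-order feeder voltage-rise estimate dV ≈ (R*P + X*Q) / V,
# commonly used to screen PV hosting capacity in LV grids.
# Impedance and power values below are hypothetical.

def voltage_rise(p_w, q_var, r_ohm, x_ohm, v_nom=230.0):
    """Approximate voltage rise (V) at the injection point."""
    return (r_ohm * p_w + x_ohm * q_var) / v_nom

dv = voltage_rise(p_w=5000.0, q_var=0.0, r_ohm=0.4, x_ohm=0.25)
print(dv, "V rise,", 100.0 * dv / 230.0, "% of nominal")
```

A hosting-capacity screen of this kind flags the PV level at which ΔV exceeds the allowed band; absorbing reactive power (negative Q) or shifting EV charging into PV hours reduces the rise.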