Using historical vital statistics to predict the distribution of under-five mortality by cause.
Rao, Chalapati; Adair, Timothy; Kinfu, Yohannes
2011-06-01
Cause-specific mortality data is essential for planning intervention programs to reduce mortality among children under five years of age (under-five). However, there is a critical paucity of such information for most of the developing world, particularly where progress towards the United Nations Millennium Development Goal 4 (MDG 4) has been slow. This paper presents a predictive cause of death model for under-five mortality based on historical vital statistics and discusses the utility of the model in generating information that could accelerate progress towards MDG 4. Over 1400 country-years of vital statistics from 34 countries collected over a period of nearly a century were analyzed to develop relationships between levels of under-five mortality, related mortality ratios, and proportionate mortality from four cause groups: perinatal conditions; diarrhea and lower respiratory infections; congenital anomalies; and all other causes of death. A system of multiple equations with cross-equation parameter restrictions and correlated error terms was developed to predict proportionate mortality by cause based on given measures of under-five mortality. The strength of the predictive model was tested through internal and external cross-validation techniques. Modeled cause-specific mortality estimates for major regions in Africa, Asia, Central America, and South America are presented to illustrate its application across a range of under-five mortality rates. Consistent and plausible trends and relationships are observed from historical data. High mortality rates are associated with increased proportions of deaths from diarrhea and lower respiratory infections. Perinatal conditions assume importance as a proportionate cause at under-five mortality rates below 60 per 1000 live births. Internal and external validation confirms strength and consistency of the predictive model. Model application at regional level demonstrates heterogeneity and non-linearity in cause
Statistical modeling of urban air temperature distributions under different synoptic conditions
Beck, Christoph; Breitner, Susanne; Cyrys, Josef; Hald, Cornelius; Hartz, Uwe; Jacobeit, Jucundus; Richter, Katja; Schneider, Alexandra; Wolf, Kathrin
2015-04-01
Within urban areas air temperature may vary distinctly between different locations. These intra-urban air temperature variations partly reach magnitudes that are relevant with respect to human thermal comfort. Therefore, and furthermore taking into account potential interrelations with other health-related environmental factors (e.g. air quality), it is important to estimate spatial patterns of intra-urban air temperature distributions that may be incorporated into urban planning processes. In this contribution we present an approach to estimate spatial temperature distributions in the urban area of Augsburg (Germany) by means of statistical modeling. At 36 locations in the urban area of Augsburg air temperatures have been measured at high temporal resolution (4 min) since December 2012. These 36 locations represent different typical urban land use characteristics in terms of varying percentage coverages of different land cover categories (e.g. impervious, built-up, vegetated). Percentage coverages of these land cover categories have been extracted from different sources (Open Street Map, European Urban Atlas, Urban Morphological Zones) for regular grids of varying size (50, 100, 200 meter horizontal resolution) for the urban area of Augsburg. It is well known from numerous studies that land use characteristics have a distinct influence on air temperature, as well as on other climatic variables, at a certain location. Therefore air temperatures at the 36 locations are modeled utilizing land use characteristics (percentage coverages of land cover categories) as predictor variables in Stepwise Multiple Regression models and in Random Forest based model approaches. After model evaluation via cross-validation appropriate statistical models are applied to gridded land use data to derive spatial urban air temperature distributions. Varying models are tested and applied for different seasons and times of the day and also for different synoptic conditions (e.g. clear and calm
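The modeling step described above can be sketched in miniature. Ordinary least squares with leave-one-out cross-validation stands in for the paper's stepwise regression and random forest models, and all numbers below (land-cover fractions, coefficients, noise level) are invented stand-ins, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in data: 36 stations (matching the paper's network),
# each described by percentage coverages of three land-cover classes
# (impervious, built-up, vegetated) in the surrounding grid cell.
n = 36
cover = rng.dirichlet([2.0, 2.0, 2.0], size=n) * 100.0

# Assumed relation for illustration only: sealed surfaces warm a site,
# vegetation cools it, plus measurement noise.
temp = (0.23 * cover[:, 0] + 0.22 * cover[:, 1]
        + 0.17 * cover[:, 2] + rng.normal(0.0, 0.2, n))

# Multiple linear regression of temperature on land-cover fractions.
beta, *_ = np.linalg.lstsq(cover, temp, rcond=None)

# Leave-one-out cross-validation, mirroring the model-evaluation step.
resid = []
for i in range(n):
    mask = np.arange(n) != i
    b, *_ = np.linalg.lstsq(cover[mask], temp[mask], rcond=None)
    resid.append(temp[i] - cover[i] @ b)
rmse = float(np.sqrt(np.mean(np.square(resid))))
print("LOOCV RMSE [K]:", round(rmse, 3))

# The fitted coefficients would then be applied to gridded land-cover
# fractions to map the intra-urban air temperature field.
```

The same fit-evaluate-apply loop carries over unchanged when the linear model is swapped for a random forest.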
Quinn, Kevin Martin
The total amount of precipitation integrated across a precipitation cluster (contiguous precipitating grid cells exceeding a minimum rain rate) is a useful measure of the aggregate size of the disturbance, expressed as the rate of water mass lost or latent heat released, i.e. the power of the disturbance. Probability distributions of cluster power are examined during boreal summer (May-September) and winter (January-March) using satellite-retrieved rain rates from the Tropical Rainfall Measuring Mission (TRMM) 3B42 and Special Sensor Microwave Imager and Sounder (SSM/I and SSMIS) programs, model output from the High Resolution Atmospheric Model (HIRAM, roughly 0.25-0.5° resolution), seven 1-2° resolution members of the Coupled Model Intercomparison Project Phase 5 (CMIP5) experiment, and National Center for Atmospheric Research Large Ensemble (NCAR LENS). Spatial distributions of precipitation-weighted centroids are also investigated in observations (TRMM-3B42) and climate models during winter as a metric for changes in mid-latitude storm tracks. Observed probability distributions for both seasons are scale-free from the smallest clusters up to a cutoff scale at high cluster power, after which the probability density drops rapidly. When low rain rates are excluded by choosing a minimum rain rate threshold in defining clusters, the models accurately reproduce observed cluster power statistics and winter storm tracks. Changes in behavior in the tail of the distribution, above the cutoff, are important for impacts since these quantify the frequency of the most powerful storms. End-of-century cluster power distributions and storm track locations are investigated in these models under a "business as usual" global warming scenario. The probability of high cluster power events increases by end-of-century across all models, by up to an order of magnitude for the highest-power events for which statistics can be computed. For the three models in the suite with continuous
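The cluster definition used in this abstract, contiguous precipitating grid cells exceeding a minimum rain rate, maps directly onto connected-component labeling. A sketch on a synthetic rain field, with an assumed threshold (the paper's thresholds and data are not reproduced):

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(1)

# Synthetic rain-rate field [mm/h] on a regular grid, standing in for a
# satellite-retrieved snapshot (e.g. TRMM 3B42); values are invented.
rain = rng.gamma(shape=0.3, scale=4.0, size=(60, 90))

# A cluster is a set of contiguous precipitating grid cells exceeding a
# minimum rain rate (threshold value assumed here).
threshold = 3.0
labels, n_clusters = ndimage.label(rain > threshold)

# Cluster "power": rain rate integrated over each cluster.  A real
# analysis would weight by grid-cell area and convert to watts via the
# latent heat of condensation.
power = ndimage.sum(rain, labels, index=np.arange(1, n_clusters + 1))

# The survival function of cluster power is scale-free up to a cutoff,
# beyond which the probability density drops rapidly.
print("clusters:", n_clusters)
print("largest cluster powers:", np.sort(power)[-3:])
```

Changing `threshold` reproduces the sensitivity the abstract describes: excluding low rain rates changes which clusters exist and hence the whole power distribution.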
Limit distributions for the terms of central order statistics under power normalization
El Sayed M. Nigm
2007-01-01
In this paper the limiting distributions for sequences of central terms under power nonrandom normalization are obtained. The classes of limit types having domains of L-attraction are investigated.
Statistical distribution of quantum particles
Khasare, S. B.; Khasare, Shashank S.
2018-03-01
In this work, the statistical distribution functions for boson, fermions and their mixtures have been derived and it is found that distribution functions follow the symmetry features of β distribution. If occupation index is greater than unity, then it is easy in the present approach to visualise condensations in terms of intermediate values of mixing parameters. There are some applications of intermediate values of mixing parameters.
Statistical properties of galaxy distributions
Directory of Open Access Journals (Sweden)
F. Sylos Labini
1996-01-01
The recent availability of complete three-dimensional samples of galaxies and clusters permits a direct study of their spatial properties. We present a brief review of galaxy correlations based on the methods of modern statistical physics. These methods, which are able to identify self-similar and non-analytical properties, allow us to test the usual homogeneity assumption of the luminous matter distribution. We conclude that both the three-dimensional properties and the angular log N - log S relation point out that the distribution of galaxies and clusters is fractal with D ≈ 2 up to the deepest scales probed by luminous matter (≳ 1000 h⁻¹ Mpc). This result has important implications for the theoretical framework that should be adopted.
Statistical Physics for Adaptive Distributed Control
Wolpert, David H.
2005-01-01
A viewgraph presentation on statistical physics for distributed adaptive control is shown. The topics include: 1) The Golden Rule; 2) Advantages; 3) Roadmap; 4) What is Distributed Control? 5) Review of Information Theory; 6) Iterative Distributed Control; 7) Minimizing L(q) Via Gradient Descent; and 8) Adaptive Distributed Control.
Distributions with given marginals and statistical modelling
Fortiana, Josep; Rodriguez-Lallena, José
2002-01-01
This book contains a selection of the papers presented at the meeting `Distributions with given marginals and statistical modelling', held in Barcelona (Spain), July 17-20, 2000. In 24 chapters, this book covers topics such as the theory of copulas and quasi-copulas, the theory and compatibility of distributions, models for survival distributions and other well-known distributions, time series, categorical models, definition and estimation of measures of dependence, monotonicity and stochastic ordering, shape and separability of distributions, hidden truncation models, diagonal families, orthogonal expansions, tests of independence, and goodness of fit assessment. These topics share the use and properties of distributions with given marginals, this being the fourth specialised text on this theme. The innovative aspect of the book is the inclusion of statistical aspects such as modelling, Bayesian statistics, estimation, and tests.
Predicting Statistical Distributions of Footbridge Vibrations
DEFF Research Database (Denmark)
Pedersen, Lars; Frier, Christian
2009-01-01
The paper considers vibration response of footbridges to pedestrian loading. Employing Newmark and Monte Carlo simulation methods, a statistical distribution of bridge vibration levels is calculated modelling walking parameters such as step frequency and stride length as random variables...
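The Monte Carlo procedure described in this abstract can be illustrated with a single-mode bridge model. The modal properties, walking-parameter distributions, and dynamic-load-factor relation below are assumptions for illustration, and a closed-form steady-state response stands in for the paper's Newmark time integration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical footbridge modal properties (the paper's actual bridge
# parameters are not given here).
f_n, zeta, m = 2.0, 0.005, 40e3   # natural freq [Hz], damping ratio, modal mass [kg]

n_sim = 20000
# Walking parameters modelled as random variables, as in the paper's
# Monte Carlo approach (means and spreads assumed for illustration).
f_step = rng.normal(1.8, 0.1, n_sim)        # step frequency [Hz]
weight = rng.normal(750.0, 100.0, n_sim)    # pedestrian weight [N]
alpha = 0.4 * f_step - 0.35                 # assumed dynamic load factor model

# Closed-form steady-state acceleration of a SDOF mode under a harmonic
# walking load (replacing Newmark time stepping for brevity).
r = f_step / f_n
amp = (alpha * weight / m) * r**2 / np.sqrt((1 - r**2)**2 + (2 * zeta * r)**2)

# The Monte Carlo sample yields a statistical distribution of vibration
# levels rather than a single deterministic value.
print("median acceleration [m/s^2]:", round(float(np.median(amp)), 4))
print("95th percentile [m/s^2]:", round(float(np.percentile(amp, 95)), 4))
```

Reading off a high percentile of `amp`, rather than a single design value, is exactly the payoff of treating the walking parameters as random variables.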
Distributional Properties of Order Statistics and Record Statistics
Directory of Open Access Journals (Sweden)
Abdul Hamid Khan
2012-07-01
Distributional properties of the order statistics, upper and lower records have been utilized to characterize distributions of interest. Further, one-sided random dilation and contraction are utilized to obtain the distribution of non-adjacent ordered statistics and also their important deductions are discussed.
Statistical distributions applications and parameter estimates
Thomopoulos, Nick T
2017-01-01
This book gives a description of the group of statistical distributions that have ample application to studies in statistics and probability. Understanding statistical distributions is fundamental for researchers in almost all disciplines. The informed researcher will select the statistical distribution that best fits the data in the study at hand. Some of the distributions are well known to the general researcher and are in use in a wide variety of ways. Other useful distributions are less understood and are not in common use. The book describes when and how to apply each of the distributions in research studies, with a goal to identify the distribution that best applies to the study. The distributions are for continuous, discrete, and bivariate random variables. In most studies, the parameter values are not known a priori, and sample data is needed to estimate parameter values. In other scenarios, no sample data is available, and the researcher seeks some insight that allows the estimate of ...
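The selection task the book addresses, picking the distribution that best fits the data at hand, can be sketched with SciPy's maximum-likelihood fitting. AIC is used below as one common selection rule (the book's own procedures differ, and the data are synthetic):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
data = rng.weibull(1.5, size=500) * 10.0   # synthetic stand-in for field data

# Candidate continuous distributions, fitted by maximum likelihood and
# compared by AIC = 2k - 2 log L (lower is better).
candidates = {
    "normal":  stats.norm,
    "lognorm": stats.lognorm,
    "weibull": stats.weibull_min,
    "gamma":   stats.gamma,
}
results = {}
for name, dist in candidates.items():
    params = dist.fit(data)
    loglik = np.sum(dist.logpdf(data, *params))
    results[name] = 2 * len(params) - 2 * loglik

best = min(results, key=results.get)
print("best-fitting distribution:", best)
```

In practice the numeric ranking would be supplemented by the visual checks (probability plots, goodness-of-fit tests) that the book also covers.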
Characterization through distributional properties of dual generalized order statistics
Directory of Open Access Journals (Sweden)
A.H. Khan
2012-10-01
Distributional properties of two non-adjacent dual generalized order statistics have been used to characterize distributions. Further, one-sided contraction and dilation for the dual generalized order statistics are discussed and then the results are deduced for generalized order statistics, order statistics, lower record statistics, upper record statistics and adjacent dual generalized order statistics.
Improvement of Statistical Decisions under Parametric Uncertainty
Nechval, Nicholas A.; Nechval, Konstantin N.; Purgailis, Maris; Berzins, Gundars; Rozevskis, Uldis
2011-10-01
A large number of problems in production planning and scheduling, location, transportation, finance, and engineering design require that decisions be made in the presence of uncertainty. Decision-making under uncertainty is a central problem in statistical inference, and has been formally studied in virtually all approaches to inference. The aim of the present paper is to show how the invariant embedding technique, the idea of which belongs to the authors, may be employed in the particular case of finding the improved statistical decisions under parametric uncertainty. This technique represents a simple and computationally attractive statistical method based on the constructive use of the invariance principle in mathematical statistics. Unlike the Bayesian approach, an invariant embedding technique is independent of the choice of priors. It allows one to eliminate unknown parameters from the problem and to find the best invariant decision rule, which has smaller risk than any of the well-known decision rules. To illustrate the proposed technique, application examples are given.
Device for flattening statistically distributed pulses
International Nuclear Information System (INIS)
Il'kanaev, G.I.; Iskenderov, V.G.; Rudnev, O.V.; Teller, V.S.
1976-01-01
The description is given of a device that converts a series of statistically distributed pulses into a pseudo-uniform one. The inlet pulses switch over the first counter, and the second one is switched over by clock pulses each time the uniformity of the counters' states is violated. This violation is recorded by a logic circuit, which passes to the output clock pulses in an amount equal to the number of pulses that reached the device inlet. At ratios of the input pulse rate to the clock rate of up to 0.3, losses do not exceed 0.7 per cent for a pulse-counter memory of 3, and 0.035 per cent for a memory of 7
Soil nuclide distribution coefficients and their statistical distributions
International Nuclear Information System (INIS)
Sheppard, M.I.; Beals, D.I.; Thibault, D.H.; O'Connor, P.
1984-12-01
Environmental assessments of the disposal of nuclear fuel waste in plutonic rock formations require analysis of the migration of nuclides from the disposal vault to the biosphere. Analyses of nuclide migration via groundwater through the disposal vault, the buffer and backfill, the plutonic rock, and the consolidated and unconsolidated overburden use models requiring distribution coefficients (Ksub(d)) to describe the interaction of the nuclides with the geological and man-made materials. This report presents element-specific soil distribution coefficients and their statistical distributions, based on a detailed survey of the literature. Radioactive elements considered were actinium, americium, bismuth, calcium, carbon, cerium, cesium, iodine, lead, molybdenum, neptunium, nickel, niobium, palladium, plutonium, polonium, protactinium, radium, samarium, selenium, silver, strontium, technetium, terbium, thorium, tin, uranium and zirconium. Stable elements considered were antimony, boron, cadmium, tellurium and zinc. Where sufficient data were available, distribution coefficients and their distributions are given for sand, silt, clay and organic soils. Our values are recommended for use in assessments for the Canadian Nuclear Fuel Waste Management Program
Empirical distribution function under heteroscedasticity
Czech Academy of Sciences Publication Activity Database
Víšek, Jan Ámos
2011-01-01
Roč. 45, č. 5 (2011), s. 497-508 ISSN 0233-1888 Grant - others:GA UK(CZ) GA402/09/0557 Institutional research plan: CEZ:AV0Z10750506 Keywords : Robustness * Convergence * Empirical distribution * Heteroscedasticity Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.724, year: 2011 http://library.utia.cas.cz/separaty/2011/SI/visek-0365534.pdf
The Multivariate Order Statistics for Exponential and Weibull Distributions
Directory of Open Access Journals (Sweden)
Mariyam Hafeez
2014-09-01
In this paper we have derived the distribution of multivariate order statistics for the multivariate exponential and multivariate Weibull distributions. The moment expression for multivariate order statistics has also been derived.
Handbook of Fitting Statistical Distributions with R
Karian, Zaven A
2010-01-01
Strengthened by examples taken from the scientific literature, this handbook provides statisticians and researchers across the physical and social sciences with cutting-edge methods for fitting continuous probability distributions. It presents families with wide-ranging applicability, including Johnson's system, kappa distribution, and generalized lambda distribution. By providing the necessary R programs, the book enables practitioners to implement the techniques using R computer code. To cover distribution method combinations not included in the book's extensive tables, the authors delve int
Large Statistics Study Of QCD Topological Charge Distribution
Giusti, Leonardo; Taglienti, Bruno
2006-01-01
We present preliminary results for a high statistics study of the topological charge distribution in the SU(3) Yang-Mills theory, obtained by using the definition of the charge suggested by Neuberger fermions. We find statistical evidence for deviations from a Gaussian distribution. The large statistics required has been obtained by using PCs of the INFN-GRID.
Statistical distribution analysis of rubber fatigue data
DeRudder, J. L.
1981-10-01
Average rubber fatigue resistance has previously been related to such factors as elastomer type, cure system, cure temperature, and stress history. This paper extends this treatment to a full statistical analysis of rubber fatigue data. Analyses of laboratory fatigue data are used to predict service life. Particular emphasis is given to the prediction of early tire splice failures, and to adaptations of statistical fatigue analysis for the particular service conditions of the rubber industry.
Statistical Distribution of Energization Overvoltages of EHV Cables
DEFF Research Database (Denmark)
Ohno, Teruo; Bak, Claus Leth; Ametani, Akihiro
2013-01-01
Statistical distributions of switching overvoltages have been used for decades for the determination of insulation levels of extremely high voltage (EHV) systems. Existing statistical distributions obtained in the 1970s are for overhead lines, and statistical distributions of switching overvoltages of EHV cables are not available to date. This paper derives the statistical distribution of energization overvoltages of EHV cables. Through the comparison of the statistical distributions of EHV cables and overhead lines, it has been found that line energization overvoltages on the cables are lower than those on the overhead lines with respect to maximum, 2%, and mean values. As the minimum value is almost at the same level, standard deviations are smaller for the cables. The statistical distributions obtained in this paper are of great importance in considering insulation levels of cable systems.
Statistical Distributions of Electron Avalanches and Streamers
Directory of Open Access Journals (Sweden)
T. Ficker
2010-01-01
A new theoretical concept of fractal multiplication of electron avalanches has resulted in forming a generalized distribution function whose multiparameter character has been subjected to detailed discussion.
Comparative Descriptive Statistics of Skewed Probability Distributions
National Research Council Canada - National Science Library
Fewell, M
2004-01-01
Increasingly, OA studies involve simulations of varying levels of sophistication. A feature of all simulations is the use of random variables, and this immediately raises the question of what distribution to employ...
Statistical distribution for generalized ideal gas of fractional-statistics particles
International Nuclear Information System (INIS)
Wu, Y.
1994-01-01
We derive the occupation-number distribution in a generalized ideal gas of particles obeying fractional statistics, including mutual statistics, by adopting a state-counting definition. When there is no mutual statistics, the statistical distribution interpolates between bosons and fermions, and respects a fractional exclusion principle (except for bosons). Anyons in a strong magnetic field at low temperatures constitute such a physical system. Applications to the thermodynamic properties of quasiparticle excitations in the Laughlin quantum Hall fluid are discussed
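For the case without mutual statistics, the interpolating occupation-number distribution described here is commonly written as an implicit equation parameterized by a statistics parameter g. A numerical sketch of that formulation, in which g = 0 recovers Bose-Einstein and g = 1 recovers Fermi-Dirac (the specific form below follows the standard textbook presentation of fractional exclusion statistics, not a line-by-line reproduction of the paper):

```python
import numpy as np
from scipy.optimize import brentq

def occupation(x, g):
    """Mean occupation number for exclusion-statistics parameter g in [0, 1].

    x = (eps - mu) / kT.  The auxiliary variable w solves
        w**g * (1 + w)**(1 - g) = exp(x),
    and the mean occupation is n = 1 / (w + g).
    g = 0 gives Bose-Einstein, g = 1 gives Fermi-Dirac.
    """
    if g == 0.0:
        return 1.0 / np.expm1(x)            # Bose-Einstein limit (x > 0)
    f = lambda w: g * np.log(w) + (1.0 - g) * np.log1p(w) - x
    w = brentq(f, 1e-12, 1e12)              # f is increasing in w
    return 1.0 / (w + g)

x = 1.0
bose   = occupation(x, 0.0)
fermi  = occupation(x, 1.0)
semion = occupation(x, 0.5)   # intermediate "semion" statistics
print(bose, semion, fermi)
```

At fixed x the occupation interpolates monotonically between the Fermi and Bose values as g runs from 1 to 0, which is the interpolation the abstract describes.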
Statistics of matter distribution from halo dynamics
International Nuclear Information System (INIS)
Bonometto, S.A.; Borgani, S.; Persic, M.; Salucci, P.
1990-01-01
The galaxy-background correlation function at short distances is explored by means of the observed disk dynamics of spiral galaxies. Using both the sample of galaxies and the analytical method for dark-to-luminous mass decomposition from optical rotation curves presented by Persic and Salucci (1988, 1990), individual sizes and mean densities are worked out for 42 extended halos as functions of both the intensity and the gradient of the central velocity field. The statistics for the expected density enhancements within given distances from galactic centers shows simple properties which strongly tie galaxy-background and galaxy-galaxy correlation functions. 42 refs
Statistical decisions under nonparametric a priori information
International Nuclear Information System (INIS)
Chilingaryan, A.A.
1985-01-01
The basic module of an applied program package for statistical analysis of the ANI experiment data is described. By means of this module, the tasks of choosing the theoretical model most adequately fitting the experimental data, selecting events of a definite type, and identifying elementary particles are carried out. For solving the mentioned problems, Bayesian rules, the leave-one-out test and KNN (K Nearest Neighbour) adaptive density estimation are utilized.
ON DISTRIBUTIONS OF ORDER STATISTICS FROM NONIDENTICAL DISCRETE VARIABLES
Directory of Open Access Journals (Sweden)
Mehmet GÜNGÖR
2011-04-01
In this study, the distributions of the X_{r:n} order statistic of innid (independent, non-identically distributed) discrete random variables are obtained. In addition, the distributions are also expressed in the form of an integral. Then, the results related to the pf and df of minimum and maximum order statistics of innid discrete random variables are given.
Prototyping a Distributed Information Retrieval System That Uses Statistical Ranking.
Harman, Donna; And Others
1991-01-01
Built using a distributed architecture, this prototype distributed information retrieval system uses statistical ranking techniques to provide better service to the end user. Distributed architecture was shown to be a feasible alternative to centralized or CD-ROM information retrieval, and user testing of the ranking methodology showed both…
Ultrarelativistic transverse momentum distribution of the Tsallis statistics
Energy Technology Data Exchange (ETDEWEB)
Parvan, A.S. [Joint Institute for Nuclear Research, Bogoliubov Laboratory of Theoretical Physics, Dubna (Russian Federation); Horia Hulubei National Institute of Physics and Nuclear Engineering, Department of Theoretical Physics, Bucharest-Magurele (Romania); Moldova Academy of Sciences, Institute of Applied Physics, Chisinau (Moldova, Republic of)
2017-03-15
The analytical expressions for the ultrarelativistic transverse momentum distributions of the Tsallis and the Tsallis-2 statistics were obtained. We found that the transverse momentum distribution of the Tsallis-factorized statistics, which is now largely used to describe the experimental transverse momentum spectra of hadrons measured in pp collisions at LHC and RHIC energies, in the ultrarelativistic case is not equivalent to the transverse momentum distributions of the Tsallis and the Tsallis-2 statistics. However, we revealed that this distribution exactly coincides with the transverse momentum distribution of the Tsallis-2 statistics in the zeroth term approximation and is transformed to the transverse momentum distribution of the Tsallis statistics in the zeroth term approximation by changing the parameter q to 1/q_c. We demonstrated analytically on the basis of the ultrarelativistic ideal gas that the Tsallis-factorized statistics is not equivalent to the Tsallis and the Tsallis-2 statistics. In the present paper the Tsallis statistics corresponds to the standard expectation values. (orig.)
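The Tsallis-factorized spectrum the abstract refers to is commonly written, in the massless (ultrarelativistic) limit, as a power-law-tailed pT distribution. The sketch below uses that commonly quoted form with assumed parameter values, contrasting its tail with the q -> 1 Boltzmann limit; it is not the paper's exact Tsallis or Tsallis-2 expression:

```python
import numpy as np
from scipy import integrate

# Assumed nonextensivity parameter and temperature [GeV] for illustration.
q, T = 1.1, 0.07

def spectrum(pT):
    """Tsallis-factorized dN/dpT (massless limit), up to normalization."""
    return pT * (1.0 + (q - 1.0) * pT / T) ** (-q / (q - 1.0))

def boltz(pT):
    """Boltzmann (q -> 1) limit, up to normalization."""
    return pT * np.exp(-pT / T)

# Normalize numerically and compare the tails at an assumed pT.
norm, _ = integrate.quad(spectrum, 0.0, np.inf)
norm_b, _ = integrate.quad(boltz, 0.0, np.inf)

pT = 2.0   # GeV
print("Tsallis tail:  ", spectrum(pT) / norm)
print("Boltzmann tail:", boltz(pT) / norm_b)
```

The power-law tail dominating the exponential at large pT is why this form fits measured pp spectra where a pure Boltzmann shape fails.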
Portfolio return distributions: Sample statistics with non-stationary correlations
Chetalova, Desislava; Schmitt, Thilo A.; Schäfer, Rudi; Guhr, Thomas
2013-01-01
We consider random vectors drawn from a multivariate normal distribution and compute the sample statistics in the presence of non-stationary correlations. For this purpose, we construct an ensemble of random correlation matrices and average the normal distribution over this ensemble. The resulting distribution contains a modified Bessel function of the second kind whose behavior differs significantly from the multivariate normal distribution, in the central part as well as in the tails. This ...
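The ensemble average described in the abstract can be emulated by sampling: draw, for each observation, a random covariance matrix from a Wishart ensemble with a fixed mean, then a normal vector with that covariance. The dimensions, degrees of freedom, and mean correlation below are assumptions for illustration, not the paper's choices:

```python
import numpy as np

rng = np.random.default_rng(5)

K, N = 2, 20000     # dimension, sample size
n_w = 5             # Wishart degrees of freedom (fluctuation strength)

C0 = np.array([[1.0, 0.3], [0.3, 1.0]])   # mean correlation matrix
L = np.linalg.cholesky(C0 / n_w)

samples = np.empty((N, K))
for i in range(N):
    W = L @ rng.standard_normal((K, n_w))
    C = W @ W.T                      # Wishart draw with mean C0
    samples[i] = np.linalg.cholesky(C) @ rng.standard_normal(K)

# Fluctuating correlations fatten the tails relative to a fixed-covariance
# normal: the marginal excess kurtosis is positive.
x = samples[:, 0]
kurt = float(np.mean((x - x.mean()) ** 4) / np.var(x) ** 2 - 3.0)
print("excess kurtosis:", round(kurt, 2))
```

The positive excess kurtosis is the sampling counterpart of the Bessel-function-of-the-second-kind density the paper derives analytically: a normal averaged over fluctuating covariances is heavier-tailed than any fixed-covariance normal.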
Statistical distributions as applied to environmental surveillance data
International Nuclear Information System (INIS)
Speer, D.R.; Waite, D.A.
1975-09-01
Application of normal, log normal, and Weibull distributions to environmental surveillance data was investigated for approximately 300 nuclide-medium-year-location combinations. Corresponding W test calculations were made to determine the probability of a particular data set falling within the distribution of interest. Conclusions are drawn as to the fit of any data group to the various distributions. The significance of fitting statistical distributions to the data is discussed
Statistical Inference for a Class of Multivariate Negative Binomial Distributions
DEFF Research Database (Denmark)
Rubak, Ege H.; Møller, Jesper; McCullagh, Peter
This paper considers statistical inference procedures for a class of models for positively correlated count variables called α-permanental random fields, which can be viewed as a family of multivariate negative binomial distributions. Their appealing probabilistic properties have earlier been...
Statistical test for the distribution of galaxies on plates
International Nuclear Information System (INIS)
Garcia Lambas, D.
1985-01-01
A statistical test for the distribution of galaxies on plates is presented. We apply the test to synthetic astronomical plates obtained by means of numerical simulation (Garcia Lambas and Sersic 1983) with three different models for the 3-dimensional distribution. Comparison with an observational plate suggests the presence of filamentary structure. (author)
Multiple defect distributions on Weibull statistical analysis of fatigue ...
African Journals Online (AJOL)
By relaxing the assumptions of a single cast defect distribution, of uniformity throughout the material and of uniformity from specimen to specimen, Weibull statistical analysis for multiple defect distributions has been applied to correctly describe the fatigue life data of aluminium alloy castings having multiple cast defects ...
Statistical exponential distribution function as distance indicator to stellar groups
Abdel Rahman, H.; Sabry, M. A.; Issa, I. A.
2012-12-01
In this paper, statistical distribution functions are developed for distance determination of stellar groups. The method depends on the assumption that absolute magnitudes and apparent magnitudes follow an exponential distribution function. The developed approaches have been implemented to determine the distances of some clusters and stellar associations. The comparison with the distances derived by different authors revealed good agreement.
On the limit distribution of lower extreme generalized order statistics
Indian Academy of Sciences (India)
m−gOs (as well as the classical extreme value theory of ordinary order statistics) yields three types of limit distributions that are possible in case of linear normalization. In this paper a similar classification of limit distributions holds for extreme gOs, where the parameters γj , j = 1,..., n, are assumed to be pairwise different.
Length-Biased Weighted Lomax Distribution: Statistical Properties and Application
Directory of Open Access Journals (Sweden)
Afaq Ahmad
2016-06-01
The concept of a length-biased distribution can be employed in the development of proper models for lifetime data. A length-biased distribution is a special case of the more general form known as a weighted distribution. In this paper we introduce a new class of length-biased weighted Lomax distribution (LBWLD). The statistical properties of this distribution are derived, the model parameters are estimated by maximum likelihood estimation, and the observed information matrix is determined. An application to a real data set is finally presented for illustration.
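The length-biasing construction mentioned in this abstract weights a base density f by x and renormalizes by the mean, f_LB(x) = x f(x) / E[X]. A numerical sketch with a Lomax base density and assumed parameters (the paper's data and estimates are not reproduced):

```python
import numpy as np
from scipy import integrate

# Lomax (Pareto II) base density; the mean exists for alpha > 1.
alpha, lam = 3.0, 2.0     # assumed shape and scale

def lomax_pdf(x):
    return (alpha / lam) * (1.0 + x / lam) ** (-(alpha + 1.0))

mean = lam / (alpha - 1.0)                 # E[X] for the Lomax

def lbwl_pdf(x):
    """Length-biased weighted Lomax: f_LB(x) = x f(x) / E[X]."""
    return x * lomax_pdf(x) / mean

# Sanity checks: the weighted density integrates to one, and length
# biasing shifts probability mass toward larger x.
total, _ = integrate.quad(lbwl_pdf, 0.0, np.inf)
p_tail_base, _ = integrate.quad(lomax_pdf, mean, np.inf)
p_tail_lb, _ = integrate.quad(lbwl_pdf, mean, np.inf)
print("integral of f_LB:", round(total, 6))
print("P(X > mean):", round(p_tail_base, 3), "->", round(p_tail_lb, 3))
```

The upward shift of tail mass is exactly why length-biased models suit sampling schemes in which longer lifetimes are more likely to be observed.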
Statistical distributions as applied to environmental surveillance data
International Nuclear Information System (INIS)
Speer, D.R.; Waite, D.A.
1976-01-01
Application of normal, lognormal, and Weibull distributions to radiological environmental surveillance data was investigated for approximately 300 nuclide-medium-year-location combinations. The fit of data to distributions was compared through probability plotting (special graph paper provides a visual check) and W test calculations. Results show that 25% of the data fit the normal distribution, 50% fit the lognormal, and 90% fit the Weibull. Demonstration of how to plot each distribution shows that the normal and lognormal distributions are comparatively easy to use, while the Weibull distribution is complicated and difficult to use. Although current practice is to use normal distribution statistics, the normal distribution fit the fewest of the data groups considered in this study
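The W test referred to here is the Shapiro-Wilk statistic. A sketch on synthetic lognormal "surveillance" data (the study's actual nuclide data are not reproduced), checking the normal fit before and after a log transform and fitting a Weibull by maximum likelihood:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Synthetic stand-in for surveillance concentrations: lognormal by design.
data = rng.lognormal(mean=0.0, sigma=0.8, size=80)

# W test (Shapiro-Wilk) for normality, applied to the raw data and to the
# log-transformed data; the latter checks the lognormal fit.
w_norm, p_norm = stats.shapiro(data)
w_lognorm, p_lognorm = stats.shapiro(np.log(data))
print("normal fit:    W = %.3f, p = %.4f" % (w_norm, p_norm))
print("lognormal fit: W = %.3f, p = %.4f" % (w_lognorm, p_lognorm))

# Weibull fit by maximum likelihood (location fixed at zero).
shape, loc, scale = stats.weibull_min.fit(data, floc=0.0)
print("Weibull shape %.2f, scale %.2f" % (shape, scale))
```

A much larger p-value on the log scale reproduces the study's pattern of skewed surveillance data fitting the lognormal far better than the normal.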
Concomitants of Order Statistics from Bivariate Inverse Rayleigh Distribution
Directory of Open Access Journals (Sweden)
Muhammad Aleem
2006-01-01
The probability density function (pdf) of the rth concomitant, 1 ≤ r ≤ n, and the joint pdf of the rth and sth concomitants, 1 ≤ r < s ≤ n, of order statistics from the Bivariate Inverse Rayleigh Distribution are derived, and their moments and product moments are obtained. Its percentiles are also obtained.
Fisher information and statistical inference for phase-type distributions
DEFF Research Database (Denmark)
Bladt, Mogens; Esparza, Luz Judith R; Nielsen, Bo Friis
2011-01-01
This paper is concerned with statistical inference for both continuous and discrete phase-type distributions. We consider maximum likelihood estimation, where traditionally the expectation-maximization (EM) algorithm has been employed. Certain numerical aspects of this method are revised and we … and Newton-Raphson approach. The inverse of the Fisher information matrix provides the variances and covariances of the estimated parameters.
Statistical inference for a class of multivariate negative binomial distributions
DEFF Research Database (Denmark)
Rubak, Ege Holger; Møller, Jesper; McCullagh, Peter
This paper considers statistical inference procedures for a class of models for positively correlated count variables called α-permanental random fields, and which can be viewed as a family of multivariate negative binomial distributions. Their appealing probabilistic properties have earlier been...
Log-concave Probability Distributions: Theory and Statistical Testing
DEFF Research Database (Denmark)
An, Mark Yuing
1996-01-01
This paper studies the broad class of log-concave probability distributions that arise in the economics of uncertainty and information. For univariate, continuous, and log-concave random variables we prove useful properties without imposing differentiability of the density functions. Discrete and multivariate distributions are also discussed. We propose simple non-parametric testing procedures for log-concavity. The test statistics are constructed to test one of the two implications of log-concavity: increasing hazard rates and the new-is-better-than-used (NBU) property. The tests for increasing hazard rates are based on normalized spacings of the sample order statistics. The tests for the NBU property fall into the category of Hoeffding's U-statistics…
Statistical distribution of thermal vacancies close to the melting point
Energy Technology Data Exchange (ETDEWEB)
José Pozo, María, E-mail: mariaj.pozom@gmail.com [Grupo de Nanomateriales, Departamento de Física, Facultad de Ciencias, Universidad de Chile, Casilla 653, Santiago (Chile); Davis, Sergio, E-mail: sdavis@gnm.cl [Grupo de Nanomateriales, Departamento de Física, Facultad de Ciencias, Universidad de Chile, Casilla 653, Santiago (Chile); Peralta, Joaquín, E-mail: joaquin.peralta@unab.cl [Departamento de Ciencias Físicas, Facultad de Ciencias Exactas, Universidad Andrés Bello, Santiago (Chile)
2015-01-15
A detailed description of the statistical distribution of thermal vacancies in a homogeneous crystal near its melting point is presented, using the embedded atom model for copper as an example. As the temperature increases, the average number of thermal vacancies generated by atoms migrating to neighboring sites increases according to Arrhenius' law. We present for the first time a model for the statistical distribution of thermal vacancies, which, according to our atomistic computer simulations, follows a Gamma distribution. All the simulations are carried out by classical molecular dynamics, and the recognition of vacancies is achieved via a recently developed algorithm. Our results could be useful in the further development of a theory explaining the mechanism of homogeneous melting, which seems to be mediated by the accumulation of thermal vacancies near the melting point.
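A Gamma distribution like the one reported above can be fitted to vacancy-count data by simple moment matching (shape k = mean²/variance, scale θ = variance/mean). The sketch below uses synthetic draws as a stand-in for MD snapshot counts; the parameter values are hypothetical, not from the paper.

```python
import random
from statistics import fmean, pvariance

# Hypothetical vacancy counts per MD snapshot (synthetic stand-ins, not the paper's data).
random.seed(42)
counts = [random.gammavariate(4.0, 2.5) for _ in range(20000)]

# Method-of-moments Gamma fit: shape k = mean**2 / variance, scale theta = variance / mean.
m = fmean(counts)
v = pvariance(counts)
shape_hat = m * m / v
scale_hat = v / m
```

Moment matching is a quick diagnostic; a maximum-likelihood fit (or a goodness-of-fit test against the fitted Gamma) would be the natural next step on real simulation output.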
Karian, Zaven A
2000-01-01
Throughout the physical and social sciences, researchers face the challenge of fitting statistical distributions to their data. Although the study of statistical modelling has made great strides in recent years, the number and variety of distributions to choose from, all with their own formulas, tables, diagrams, and general properties, continue to create problems. For a specific application, which of the dozens of distributions should one use? What if none of them fit well? Fitting Statistical Distributions helps answer those questions. Focusing on techniques used successfully across many fields, the authors present all of the relevant results related to the Generalized Lambda Distribution (GLD), the Generalized Bootstrap (GB), and Monte Carlo simulation (MC). They provide the tables, algorithms, and computer programs needed for fitting continuous probability distributions to data in a wide variety of circumstances, covering bivariate as well as univariate distributions, and including situations where moments do...
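For orientation, the GLD at the heart of this book is defined through its quantile function, Q(u) = λ1 + (u^λ3 − (1−u)^λ4)/λ2 in the Ramberg-Schmeiser form, which makes sampling by inverse transform trivial. A sketch using a parameter set often quoted as a GLD approximation to the standard normal (treat the exact values as illustrative):

```python
import random
from statistics import fmean, stdev

def gld_quantile(u, l1, l2, l3, l4):
    """Ramberg-Schmeiser GLD quantile function: Q(u) = l1 + (u**l3 - (1-u)**l4) / l2."""
    return l1 + (u ** l3 - (1.0 - u) ** l4) / l2

# (lambda1, lambda2, lambda3, lambda4) often cited as a GLD fit to N(0, 1).
params = (0.0, 0.1975, 0.1349, 0.1349)

# Inverse-transform sampling: feed uniform variates through the quantile function.
random.seed(7)
sample = [gld_quantile(random.random(), *params) for _ in range(50000)]
mean_hat, sd_hat = fmean(sample), stdev(sample)
```

Because λ3 = λ4 here, the distribution is symmetric about λ1; fitting a GLD to data usually means choosing the four λ's by moment or percentile matching, as the book's tables support.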
Experimental investigation of statistical models describing distribution of counts
International Nuclear Information System (INIS)
Salma, I.; Zemplen-Papp, E.
1992-01-01
The binomial, Poisson and modified Poisson models, which are used for describing the statistical nature of the distribution of counts, are compared theoretically, and conclusions for application are considered. The validity of the Poisson and the modified Poisson statistical distribution for observing k events in a short time interval is investigated experimentally for various measuring times. The experiments to measure the influence of significant radioactive decay were performed with 89mY (T1/2 = 16.06 s), using a multichannel analyser (4096 channels) in the multiscaling mode. According to the results, Poisson statistics describe the counting experiment for short measuring times (up to T = 0.5 T1/2) and its application is recommended. However, analysis of the data demonstrated, with confidence, that for long measurements (T ≥ T1/2) the Poisson distribution is not valid and the modified Poisson function is preferable. The practical implications in calculating uncertainties and in optimizing the measuring time are discussed. Differences between the standard deviations evaluated on the basis of the Poisson and binomial models are especially significant for experiments with long measuring times (T/T1/2 ≥ 2) and/or large detection efficiency (ε > 0.30). Optimization of the measuring time for paired observations yields the same solution for either the binomial or the Poisson distribution. (orig.)
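The binomial-versus-Poisson discrepancy discussed above can be reproduced from first principles: for a decaying source with no replenishment, each of the N0 atoms is detected with probability p = ε(1 − e^(−λT)), so the binomial standard deviation is √(N0·p(1−p)) versus the Poisson √(N0·p), a ratio of √(1−p). A sketch with illustrative numbers (the half-life matches the quoted 89mY value; N0 and ε are made up):

```python
import math

def count_sigmas(n0, eff, t_half, t):
    """Standard deviation of detected counts under the binomial and Poisson models
    for a decaying source that is not replenished during the measurement."""
    lam = math.log(2.0) / t_half
    p = eff * (1.0 - math.exp(-lam * t))   # P(one atom decays AND is detected)
    mu = n0 * p
    return math.sqrt(n0 * p * (1.0 - p)), math.sqrt(mu)

# Short measurement, low efficiency: the two models nearly agree.
sb_short, sp_short = count_sigmas(n0=10**6, eff=0.05, t_half=16.06, t=0.5 * 16.06)

# Long measurement, high efficiency: the binomial sigma is clearly smaller.
sb_long, sp_long = count_sigmas(n0=10**6, eff=0.35, t_half=16.06, t=2 * 16.06)
```

This matches the abstract's observation that the deviation matters chiefly for T/T1/2 ≥ 2 and ε > 0.30, where p is no longer small.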
New advances in the statistical parton distributions approach*
Directory of Open Access Journals (Sweden)
Soffer Jacques
2016-01-01
The quantum statistical parton distributions approach proposed more than a decade ago is revisited by considering a larger set of recent and accurate Deep Inelastic Scattering experimental results. It enables us to improve the description of the data by means of a new determination of the parton distributions. This global next-to-leading-order QCD analysis leads to a good description of several structure functions, involving unpolarized parton distributions and helicity distributions, in terms of a rather small number of free parameters. Several serious challenging issues remain. The predictions of this theoretical approach will be tested for single-jet production and charge asymmetry in W± production in p̄p and pp collisions up to LHC energies, using recent data and also forthcoming experimental results.
Maximum likelihood estimation of exponential distribution under ...
African Journals Online (AJOL)
Maximum likelihood estimation of exponential distribution under type-ii censoring from imprecise data. ... Journal of Fundamental and Applied Sciences ... This paper deals with the estimation of exponential mean parameter under Type-II censoring scheme when the lifetime observations are fuzzy and are assumed to be ...
Use of the Digamma Function in Statistical Astrophysics Distributions
Cahill, Michael
2017-06-01
Relaxed astrophysical statistical distributions may be constructed by using the inverse of a most-probable energy distribution equation giving the energy ei of each particle in cell i in terms of the cell's particle population Ni. The digamma-mediated equation is A + B·ei = Ψ(1 + Ni), where the constants A and B are Lagrange multipliers and Ψ is the digamma function, given by Ψ(1 + x) = d ln(x!)/dx. Results are discussed for a monatomic ideal gas, atmospheres of spherical planets or satellites, and spherical globular clusters. These distributions are self-terminating even if other factors do not cause a cutoff. The examples are discussed classically, but relativistic extensions are possible.
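Since the digamma function is strictly increasing, the relation A + B·ei = Ψ(1 + Ni) can be inverted numerically for Ni. The sketch below approximates Ψ by a central difference of the log-gamma function and inverts by bisection; the values of A and B are arbitrary illustrations, not from the paper.

```python
import math

def digamma(x, h=1e-5):
    """Numerical digamma via a central difference of log-gamma (sketch-level accuracy)."""
    return (math.lgamma(x + h) - math.lgamma(x - h)) / (2.0 * h)

def population_from_energy(e, a, b, n_max=1e9):
    """Invert A + B*e = digamma(1 + N) for N by bisection (digamma is increasing)."""
    target = a + b * e
    lo, hi = 0.0, n_max
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if digamma(1.0 + mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Round-trip check with hypothetical multipliers A and B (illustrative values only).
A, B = 0.0, 0.5
n = population_from_energy(e=10.0, a=A, b=B)
```

For large N the digamma behaves like ln(N), so the inversion reduces to the familiar Boltzmann exponential, while at small N the distribution self-terminates, as the abstract notes.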
Statistical distribution of solar soft X-ray bursts
International Nuclear Information System (INIS)
Kaufmann, P.; Piazza, L.R.; Schaal, R.E.
1979-01-01
Nearly 1000 solar events with fluxes measured in the 0.5-3 Å, 1-8 Å and 8-20 Å bands by the Explorer 37 (US NRL Solrad) satellite are statistically analysed. The differential distributions of peak fluxes can be represented by power laws with exponents -1.4, -2.2 and -2.9 respectively, which are compared to 2-12 Å results. In the 0.5-3 Å band there is a suggested peak in the distribution. Autocorrelation analysis of the distribution has shown that in the harder band (0.5-3 Å) there is a concentration of events at preferred values, multiples of about 10×10⁻⁵ erg cm⁻² s⁻¹, of unknown origin.
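Power-law exponents like those quoted above are often estimated from peak-flux samples with the continuous maximum-likelihood (Hill-type) estimator α̂ = 1 + n / Σ ln(xi/xmin). This is a generic sketch on synthetic fluxes, not the paper's binned fitting procedure:

```python
import math
import random

def powerlaw_mle_exponent(fluxes, xmin):
    """Continuous power-law MLE: for p(x) proportional to x**(-alpha), x >= xmin,
    alpha_hat = 1 + n / sum(ln(x_i / xmin))."""
    tail = [x for x in fluxes if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

# Synthetic peak fluxes from a pure power law with alpha = 2.2, via inverse transform:
# X = xmin * U**(-1/(alpha - 1)) for U uniform on (0, 1].
random.seed(0)
alpha_true, xmin = 2.2, 1.0
sample = [xmin * (1.0 - random.random()) ** (-1.0 / (alpha_true - 1.0))
          for _ in range(20000)]
alpha_hat = powerlaw_mle_exponent(sample, xmin)
```

Unlike fitting a straight line to a binned log-log histogram, the MLE avoids bin-choice bias, which matters when comparing exponents across bands as this study does.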
Statistical modelling of a new global potential vegetation distribution
Levavasseur, G.; Vrac, M.; Roche, D. M.; Paillard, D.
2012-12-01
The potential natural vegetation (PNV) distribution is required for several studies in environmental sciences. Most of the available databases are quite subjective or depend on vegetation models. We have built a new high-resolution world-wide PNV map using an objective statistical methodology based on multinomial logistic models. Our method appears to be a fast and robust alternative in vegetation modelling, independent of any vegetation model. In comparison with other databases, our method provides a realistic PNV distribution in good agreement with BIOME 6000 data. Among several advantages, the use of probabilities allows us to estimate the uncertainty, bringing some confidence in the modelled PNV, and to highlight the regions needing more data to improve the PNV modelling. Although our PNV map is highly dependent on the distribution of data points, it is easily updatable as soon as additional data are available and provides very useful additional information for further applications.
A method for statistically comparing spatial distribution maps
Directory of Open Access Journals (Sweden)
Reynolds Mary G
2009-01-01
Background: Ecological niche modeling is a method for estimation of species distributions based on certain ecological parameters. Thus far, empirical determination of significant differences between independently generated distribution maps for a single species (maps which are created through equivalent processes, but with different ecological input parameters) has been challenging. Results: We describe a method for comparing model outcomes which allows a statistical evaluation of whether the strength of prediction and breadth of predicted areas is measurably different between projected distributions. To create ecological niche models for statistical comparison, we utilized GARP (Genetic Algorithm for Rule-Set Production) software to generate ecological niche models of human monkeypox in Africa. We created several models, keeping constant the case location input records for each model but varying the ecological input data. In order to assess the relative importance of each ecological parameter included in the development of the individual predicted distributions, we performed pixel-to-pixel comparisons between model outcomes and calculated the mean difference in pixel scores. We used a two-sample Student's t-test (with the null hypothesis that both maps were identical to each other, regardless of which input parameters were used) to examine whether the mean difference in corresponding pixel scores from one map to another was greater than would be expected by chance alone. We also utilized weighted kappa statistics, frequency distributions, and percent difference to look at the disparities in pixel scores. Multiple independent statistical tests indicated precipitation as the single most important independent ecological parameter in the niche model for human monkeypox disease. Conclusion: In addition to improving our understanding of the natural factors influencing the distribution of human monkeypox disease, such pixel-to-pixel comparison
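The pixel-to-pixel comparison step can be sketched generically: flatten the two maps to score lists and compute a two-sample t statistic on the mean difference. The arrays below are random stand-ins for GARP output maps (the paper's actual pixel scores are not reproduced here), and a Welch-type statistic is used rather than whatever pooling the authors chose:

```python
import math
import random
from statistics import fmean, variance

def welch_t(a, b):
    """Welch two-sample t statistic for the difference in mean pixel score."""
    va, vb = variance(a), variance(b)
    return (fmean(a) - fmean(b)) / math.sqrt(va / len(a) + vb / len(b))

# Random stand-ins for flattened pixel scores from two model outputs;
# map_b is given a slightly higher mean, as if one input parameter mattered.
random.seed(3)
map_a = [random.gauss(5.0, 2.0) for _ in range(2000)]
map_b = [random.gauss(5.4, 2.0) for _ in range(2000)]

t_stat = welch_t(map_a, map_b)   # expected to be clearly negative here
```

One caveat the paper's kappa analysis addresses and this sketch does not: neighbouring pixels are spatially correlated, so treating every pixel as an independent observation overstates the effective sample size.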
Wu, Hao
2018-05-01
In structural equation modelling (SEM), a robust adjustment to the test statistic or to its reference distribution is needed when its null distribution deviates from a χ² distribution, which usually arises when data do not follow a multivariate normal distribution. Unfortunately, existing studies on this issue typically focus on only a few methods and neglect the majority of alternative methods in statistics. Existing simulation studies typically consider only non-normal distributions of data that either satisfy asymptotic robustness or lead to an asymptotic scaled χ² distribution. In this work we conduct a comprehensive study that involves both typical methods in SEM and less well-known methods from the statistics literature. We also propose the use of several novel non-normal data distributions that are qualitatively different from the non-normal distributions widely used in existing studies. We found that several under-studied methods give the best performance under specific conditions, but the Satorra-Bentler method remains the most viable method for most situations. © 2017 The British Psychological Society.
Inverted rank distributions: Macroscopic statistics, universality classes, and critical exponents
Eliazar, Iddo; Cohen, Morrel H.
2014-01-01
An inverted rank distribution is an infinite sequence of positive sizes ordered in a monotone increasing fashion. Interlacing together Lorenzian and oligarchic asymptotic analyses, we establish a macroscopic classification of inverted rank distributions into five “socioeconomic” universality classes: communism, socialism, criticality, feudalism, and absolute monarchy. We further establish that: (i) communism and socialism are analogous to a “disordered phase”, feudalism and absolute monarchy are analogous to an “ordered phase”, and criticality is the “phase transition” between order and disorder; (ii) the universality classes are characterized by two critical exponents, one governing the ordered phase, and the other governing the disordered phase; (iii) communism, criticality, and absolute monarchy are characterized by sharp exponent values, and are inherently deterministic; (iv) socialism is characterized by a continuous exponent range, is inherently stochastic, and is universally governed by continuous power-law statistics; (v) feudalism is characterized by a continuous exponent range, is inherently stochastic, and is universally governed by discrete exponential statistics. The results presented in this paper yield a universal macroscopic socioeconophysical perspective of inverted rank distributions.
DEFF Research Database (Denmark)
Jurado-Navas, Antonio
2015-01-01
Recently, a new and generalized statistical model, called Málaga or simply M distribution, has been proposed to characterize the irradiance fluctuations of an unbounded optical wavefront (plane and spherical waves) propagating through a turbulent medium under all irradiance fluctuation conditions...
Yuan, Ke-Hai
2008-01-01
In the literature of mean and covariance structure analysis, noncentral chi-square distribution is commonly used to describe the behavior of the likelihood ratio (LR) statistic under alternative hypothesis. Due to the inaccessibility of the rather technical literature for the distribution of the LR statistic, it is widely believed that the…
Hayslett, H T
1991-01-01
Statistics covers the basic principles of statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses, and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population, are explained. The text the...
Structure Learning and Statistical Estimation in Distribution Networks - Part II
Energy Technology Data Exchange (ETDEWEB)
Deka, Deepjyoti [Univ. of Texas, Austin, TX (United States); Backhaus, Scott N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Chertkov, Michael [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2015-02-13
Limited placement of real-time monitoring devices in the distribution grid, recent trends notwithstanding, has prevented the easy implementation of demand-response and other smart grid applications. Part I of this paper discusses the problem of learning the operational structure of the grid from nodal voltage measurements. In this work (Part II), the learning of the operational radial structure is coupled with the problem of estimating nodal consumption statistics and inferring the line parameters in the grid. Based on a Linear-Coupled (LC) approximation of the AC power flow equations, polynomial-time algorithms are designed to identify the structure and estimate nodal load characteristics and/or line parameters in the grid using the available nodal voltage measurements. The structure learning algorithm is then extended to cases with missing data, where available observations are limited to a fraction of the grid nodes. The efficacy of the presented algorithms is demonstrated through simulations on several distribution test cases.
Radio resource allocation over fading channels under statistical delay constraints
Le-Ngoc, Tho
2017-01-01
This SpringerBrief presents radio resource allocation schemes for buffer-aided communications systems over fading channels under statistical delay constraints, in terms of an upper-bounded average delay or a delay-outage probability. The Brief starts by considering a source-destination communications link with data arriving at the source transmission buffer. In the first scenario, the joint optimal data admission control and power allocation problem for throughput maximization is considered, where the source is assumed to have maximum power and average delay constraints. In the second scenario, optimal power allocation problems for energy harvesting (EH) communications systems under average delay or delay-outage constraints are explored, where the EH source harvests random amounts of energy from renewable energy sources and stores the harvested energy in a battery during data transmission. Online resource allocation algorithms are developed when the statistical knowledge of the random channel fading, data arrivals...
Robustness of Two-Level Testing Procedures under Distortions of First Level Statistics
Kostevich, A. L.; Nikitina, I. S.
2007-01-01
We investigate robustness of some two-level testing procedures under distortions induced by using an asymptotic distribution of first level statistics instead of an exact one. We demonstrate that ignoring the distortions results in unreliable conclusions and we propose robustness conditions for the two-level procedures.
Estimating Predictive Variance for Statistical Gas Distribution Modelling
International Nuclear Information System (INIS)
Lilienthal, Achim J.; Asadi, Sahar; Reggente, Matteo
2009-01-01
Recent publications in statistical gas distribution modelling have proposed algorithms that model the mean and variance of a distribution. This paper argues that estimating the predictive concentration variance entails not merely a gradual improvement but rather a significant step to advance the field. This is, first, because such models much better fit the particular structure of gas distributions, which exhibit strong fluctuations with considerable spatial variations as a result of the intermittent character of gas dispersal. Second, because estimating the predictive variance makes it possible to evaluate the model quality in terms of the data likelihood. This offers a solution to the problem of ground truth evaluation, which has always been a critical issue for gas distribution modelling. It also enables solid comparisons of different modelling approaches, and provides the means to learn meta parameters of the model, to determine when the model should be updated or re-initialised, or to suggest new measurement locations based on the current model. We also point out directions of related ongoing or potential future research work.
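Evaluating "model quality in terms of the data likelihood", as argued above, amounts to scoring held-out measurements under the model's predictive mean and variance. A minimal Gaussian sketch (synthetic readings; the models discussed in the paper are spatial, not plain Gaussians) shows why an overconfident variance is penalized:

```python
import math
import random
from statistics import fmean

def gauss_logpdf(x, mean, sd):
    # Log-density of N(mean, sd), written out to avoid pdf underflow for large z.
    return -0.5 * ((x - mean) / sd) ** 2 - math.log(sd * math.sqrt(2.0 * math.pi))

def avg_predictive_loglik(observations, mean, sd):
    """Average log predictive density of held-out readings: higher is better."""
    return fmean(gauss_logpdf(x, mean, sd) for x in observations)

# Hypothetical held-out concentration readings (illustrative, not the paper's data).
random.seed(5)
readings = [random.gauss(2.0, 0.8) for _ in range(1000)]

ll_calibrated = avg_predictive_loglik(readings, mean=2.0, sd=0.8)     # honest variance
ll_overconfident = avg_predictive_loglik(readings, mean=2.0, sd=0.1)  # variance too small
```

Both models share the same predictive mean; only the calibrated variance survives the likelihood comparison, which is exactly the lever the paper proposes for ground-truth-free evaluation.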
Bayesian estimations in the Kumaraswamy distribution under ...
African Journals Online (AJOL)
... the parameter of the Kumaraswamy distribution, a Bayes approach under the squared error loss function for the reliability function has been suggested based on previous observations; this approach can be used for both progressive type-II censoring schemes. The study is useful for researchers and practitioners in reliability theory ...
Statistical Distributions of Optical Flares from Gamma-Ray Bursts
Energy Technology Data Exchange (ETDEWEB)
Yi, Shuang-Xi [College of Physics and Engineering, Qufu Normal University, Qufu 273165 (China); Yu, Hai; Wang, F. Y.; Dai, Zi-Gao, E-mail: fayinwang@nju.edu.cn [School of Astronomy and Space Science, Nanjing University, Nanjing 210093 (China)
2017-07-20
We statistically study gamma-ray burst (GRB) optical flares from the Swift/UVOT catalog. We compile 119 optical flares, including 77 flares with redshift measurements. Some tight correlations among the timescales of optical flares are found. For example, the rise time is correlated with the decay time, and the duration is correlated with the peak time of optical flares. These two tight correlations indicate that longer rise times are associated with longer decay times of optical flares, and also suggest that broader optical flares peak at later times, consistent with the corresponding correlations of X-ray flares. We also study the frequency distributions of optical flare parameters, including the duration, rise time, decay time, peak time, and waiting time. Similar power-law distributions for optical and X-ray flares are found. Our statistical results imply that GRB optical flares and X-ray flares may share a similar physical origin, and both of them are possibly related to central engine activities.
Frequency distributions in population genetics parallel those in statistical physics
Higgs, Paul G.
1995-01-01
A class of problems from statistical physics is discussed that is shown to be identical to a class of problems in population genetics. The mathematical treatment of these problems has arisen independently in the two subjects. The important results of both literatures are presented here, together with cross references. In each case there is a stochastic process generating a set of variables x_i that satisfy Σ_i x_i = 1. For example, the x_i may represent the weights of valleys in a spin glass, the sizes of attractors in dynamical systems, the frequencies of different alleles in a population, or the sizes of different families in a genealogical tree. The frequency distributions f(x) of the valleys or alleles are calculated, together with the distribution Π(Y) of the quantity Y = Σ_i x_i². The distribution Π(Y) can be written as a sum of universal functions Π_k(Y) that are independent of the parameters of the problem. It is shown that the rather abstract concepts in the physical models are directly related to observables that are experimentally measurable in biology.
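The quantity Y = Σ_i x_i² is the homozygosity of population genetics. Under the stick-breaking (GEM/Poisson-Dirichlet) construction with parameter θ, the classical result E[Y] = 1/(1 + θ) can be checked by simulation; this is a generic sketch of that construction, not of the paper's spin-glass examples.

```python
import random
from statistics import fmean

def gem_homozygosity(theta, tol=1e-12):
    """One draw of Y = sum_i x_i**2 under GEM(theta) stick-breaking:
    x_k = v_k * prod_{j<k}(1 - v_j) with v_k ~ Beta(1, theta).
    The unbroken remainder shrinks geometrically, so truncation at `tol` is safe."""
    remaining, y = 1.0, 0.0
    while remaining > tol:
        v = random.betavariate(1.0, theta)
        x = v * remaining
        y += x * x
        remaining *= 1.0 - v
    return y

random.seed(11)
theta = 2.0
y_mean = fmean(gem_homozygosity(theta) for _ in range(10000))
# Classical population-genetics result: E[Y] = 1/(1 + theta).
```

The same Y appears in the physics literature as the participation ratio of valley weights, which is the identification the paper develops.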
Directory of Open Access Journals (Sweden)
Chaeyoung Lee
2012-11-01
Epistasis, which may explain a large portion of the phenotypic variation for complex economic traits of animals, has been ignored in many genetic association studies. A Bayesian method was introduced to draw inferences about multilocus genotypic effects based on their marginal posterior distributions obtained by a Gibbs sampler. A simulation study was conducted to provide statistical powers under various unbalanced designs by using this method. Data were simulated by combined designs of number of loci, within-genotype variance, and sample size in unbalanced designs with or without null combined-genotype cells. Mean empirical statistical power was estimated for testing the posterior mean estimate of the combined genotype effect. A practical example for obtaining empirical statistical power estimates with a given sample size was provided under unbalanced designs. The empirical statistical powers would be useful for determining an optimal design when interactive associations of multiple loci with complex phenotypes are examined.
Statistical Distribution Analysis of Lineated Bands on Europa
Chen, T.; Phillips, C. B.; Pappalardo, R. T.
2016-12-01
Europa's surface is covered with intriguing linear and disrupted features, including lineated bands that range in scale and size. Previous studies have shown the possibility of an icy shell at the surface that may be concealing a liquid ocean with the potential to harbor life (Pappalardo et al., 1999). Utilizing the high-resolution imaging data from the Galileo spacecraft, we examined bands through a morphometric and morphologic approach. Greeley et al. (2000) and Prockter et al. (2002) have defined bands as wide, hummocky to lineated features that have distinctive surface texture and albedo compared to their surrounding terrain. We took morphometric measurements of lineated bands to find correlations in properties such as size, location, and orientation, and to shed light on formation models. We will present our measurements of over 100 bands on Europa that were mapped on the USGS Europa Global Mosaic Base Map (2002). We also conducted a statistical analysis to understand the distribution of lineated bands globally, and whether the widths of the bands differ by location. Our preliminary statistical distribution analysis, combined with the morphometric measurements, supports a uniform ice-shell thickness for Europa rather than one that varies geographically. References: Greeley, Ronald, et al. "Geologic mapping of Europa." Journal of Geophysical Research: Planets 105.E9 (2000): 22559-22578.; Pappalardo, R. T., et al. "Does Europa have a subsurface ocean? Evaluation of the geological evidence." Journal of Geophysical Research: Planets 104.E10 (1999): 24015-24055.; Prockter, Louise M., et al. "Morphology of Europan bands at high resolution: A mid-ocean ridge-type rift mechanism." Journal of Geophysical Research: Planets 107.E5 (2002).; U.S. Geological Survey, 2002, Controlled photomosaic map of Europa, Je 15M CMN: U.S. Geological Survey Geologic Investigations Series I-2757, available at http
Structure Learning and Statistical Estimation in Distribution Networks - Part I
Energy Technology Data Exchange (ETDEWEB)
Deka, Deepjyoti [Univ. of Texas, Austin, TX (United States); Backhaus, Scott N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Chertkov, Michael [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2015-02-13
Traditionally, power distribution networks are either not observable or only partially observable. This complicates the development and implementation of new smart grid technologies, such as those related to demand response, outage detection and management, and improved load-monitoring. In this two-part paper, inspired by the proliferation of metering technology, we discuss estimation problems in structurally loopy but operationally radial distribution grids from measurements, e.g. voltage data, which are either already available or can be made available with a relatively minor investment. In Part I, the objective is to learn the operational layout of the grid. Part II of this paper presents algorithms that estimate load statistics or line parameters in addition to learning the grid structure. Further, Part II discusses the problem of structure estimation for systems with incomplete measurement sets. Our newly suggested algorithms apply to a wide range of realistic scenarios. The algorithms are also computationally efficient (polynomial in time), which is proven theoretically and illustrated computationally on a number of test cases. The technique developed can be applied to detect line failures in real time as well as to understand the scope of possible adversarial attacks on the grid.
Garrido, Marta Isabel; Teng, Chee Leong James; Taylor, Jeremy Alexander; Rowe, Elise Genevieve; Mattingley, Jason Brett
2016-06-01
The ability to learn about regularities in the environment and to make predictions about future events is fundamental for adaptive behaviour. We have previously shown that people can implicitly encode statistical regularities and detect violations therein, as reflected in neuronal responses to unpredictable events that carry a unique prediction error signature. In the real world, however, learning about regularities will often occur in the context of competing cognitive demands. Here we asked whether learning of statistical regularities is modulated by concurrent cognitive load. We compared electroencephalographic metrics associated with responses to pure-tone sounds with frequencies sampled from narrow or wide Gaussian distributions. We showed that outliers evoked a larger response than those in the centre of the stimulus distribution (i.e., an effect of surprise) and that this difference was greater for physically identical outliers in the narrow than in the broad distribution. These results demonstrate an early neurophysiological marker of the brain's ability to implicitly encode complex statistical structure in the environment. Moreover, we manipulated concurrent cognitive load by having participants perform a visual working memory task while listening to these streams of sounds. We again observed greater prediction error responses in the narrower distribution under both low and high cognitive load. Furthermore, there was no reliable reduction in prediction error magnitude under high- relative to low-cognitive load. Our findings suggest that statistical learning is not a capacity-limited process, and that it proceeds automatically even when cognitive resources are taxed by concurrent demands.
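The finding of greater surprise for physically identical outliers in the narrow distribution has a simple information-theoretic reading: surprise is the negative log-density, which for a Gaussian grows as the squared z-score. A sketch with illustrative tone-frequency numbers (not the study's actual stimulus parameters):

```python
import math
from statistics import NormalDist

def surprise(x, dist):
    """Shannon surprise: negative log-density of a tone frequency under the
    listener's inferred Gaussian stimulus distribution."""
    return -math.log(dist.pdf(x))

# Illustrative spreads only; same centre, different widths.
narrow = NormalDist(mu=500.0, sigma=50.0)    # Hz
wide = NormalDist(mu=500.0, sigma=150.0)

outlier = 700.0   # the same physical tone presented in both contexts
s_narrow = surprise(outlier, narrow)
s_wide = surprise(outlier, wide)
```

The identical 700 Hz tone sits 4 standard deviations out in the narrow context but only about 1.3 in the wide one, so its surprise, and hence the predicted prediction-error response, is larger under the narrow distribution.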
The statistical distribution of aerosol properties in sourthern West Africa
Haslett, Sophie; Taylor, Jonathan; Flynn, Michael; Bower, Keith; Dorsey, James; Crawford, Ian; Brito, Joel; Denjean, Cyrielle; Bourrianne, Thierry; Burnet, Frederic; Batenburg, Anneke; Schulz, Christiane; Schneider, Johannes; Borrmann, Stephan; Sauer, Daniel; Duplissy, Jonathan; Lee, James; Vaughan, Adam; Coe, Hugh
2017-04-01
The population and economy in southern West Africa have been growing at an exceptional rate in recent years and this trend is expected to continue, with the population projected to more than double to 800 million by 2050. This will result in a dramatic increase in anthropogenic pollutants, already estimated to have tripled between 1950 and 2000 (Lamarque et al., 2010). It is known that aerosols can modify the radiative properties of clouds. As such, the entrainment of anthropogenic aerosol into the large banks of clouds forming during the onset of the West African Monsoon could have a substantial impact on the region's response to climate change. Such projections, however, are greatly limited by the scarcity of observations in this part of the world. As part of the Dynamics-Aerosol-Chemistry-Cloud Interactions in West Africa (DACCIWA) project, three research aircraft were deployed, each carrying equipment capable of measuring aerosol properties in situ. Instrumentation included Aerosol Mass Spectrometers (AMS), Single Particle Soot Photometers (SP2), Condensation Particle Counters (CPC) and Scanning Mobility Particle Sizers (SMPS). Throughout the intensive aircraft campaign, 155 hours of scientific flights covered an area including large parts of Benin, Togo, Ghana and parts of Côte d'Ivoire. Approximately 70 hours were dedicated to the measurement of cloud-aerosol interactions, with many other flights producing data contributing towards this objective. Using datasets collected during this campaign period, it is possible to build a robust statistical understanding of aerosol properties in this region for the first time, including size distributions and optical and chemical properties. Here, we describe preliminary results from aerosol measurements on board the three aircraft. These have been used to describe aerosol properties throughout the region and time period encompassed by the DACCIWA aircraft campaign. Such statistics will be invaluable for improving future
Pleil, Joachim D; Sobus, Jon R; Stiegel, Matthew A; Hu, Di; Oliver, Karen D; Olenick, Cassandra; Strynar, Mark; Clark, Mary; Madden, Michael C; Funk, William E
2014-01-01
The progression of science is driven by the accumulation of knowledge and builds upon the published work of others. Another important feature is placing current results into the context of previous observations. The published literature, however, often does not provide sufficient direct information for the reader to interpret the results beyond the scope of that particular article. First, authors tend to provide only summary statistics in various forms, such as means and standard deviations, median and range, quartiles, 95% confidence intervals, and so on, rather than providing the measurement data themselves. Second, essentially all environmental and biomonitoring measurements have an underlying lognormal distribution, so certain published statistical characterizations may be inappropriate for comparisons. The aim of this study was to review and develop direct conversions of different descriptions of data into a standard format comprised of the geometric mean (GM) and the geometric standard deviation (GSD), and then demonstrate how, under the assumption of a lognormal distribution, these parameters are used to answer questions of confidence intervals, exceedance levels, and statistical differences among distributions. A wide variety of real-world measurement data sets was reviewed, and it was demonstrated that these data sets are indeed of lognormal character, thus making them amenable to these methods. Potential errors incurred from making retrospective estimates from disparate summary statistics are described. In addition to providing tools to interpret "other people's data," this review should also be seen as a cautionary tale for publishing one's own data to make it as useful as possible for other researchers.
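The standard conversion can be sketched directly from the lognormal moment relations: an arithmetic mean and SD map to GM and GSD, which then yield exceedance probabilities. The numbers below are illustrative only, not drawn from the paper's data sets.

```python
import math

def lognormal_from_mean_sd(mean, sd):
    """Convert arithmetic mean and SD to (GM, GSD) under a lognormal assumption."""
    sigma2 = math.log(1.0 + (sd / mean) ** 2)
    mu = math.log(mean) - 0.5 * sigma2
    return math.exp(mu), math.exp(math.sqrt(sigma2))

def exceedance(level, gm, gsd):
    """P(X > level) for a lognormal X with geometric mean gm and geometric SD gsd."""
    z = (math.log(level) - math.log(gm)) / math.log(gsd)
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# Illustrative numbers only: a biomarker reported as mean 10, SD 8.
gm, gsd = lognormal_from_mean_sd(10.0, 8.0)
p_exceed_20 = exceedance(20.0, gm, gsd)
```

For mean 10 and SD 8 this gives GM roughly 7.8 and GSD roughly 2.0; by construction the exceedance probability at the GM itself is exactly one half.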
Investigation of the statistical distance to reach stationary distributions
International Nuclear Information System (INIS)
Nicholson, S.B.; Kim, Eun-jin
2015-01-01
The thermodynamic length gives a Riemannian metric to a system's phase space. Here we extend the traditional thermodynamic length to the information length (L) out of equilibrium and examine its properties. We utilise L as a methodology for analysing non-equilibrium systems without invoking conventional assumptions such as Gaussian statistics, detailed balance, a priori known constraints, or ergodicity, and numerically examine how L evolves in time for the logistic map in the chaotic regime depending on initial conditions. To this end, we propose a discrete version of L which is mathematically well defined by taking a set-theoretic approach. We identify the areas of phase space where the loss of information of the system takes place most rapidly. In particular, we present an interesting result that the unstable fixed points turn out to most efficiently drive the logistic map towards a stationary distribution through L. - Highlights: • Define a set-theoretic version of the discrete thermodynamic length. • These sets allow one to analyse systems having zero probabilities in their evolution. • Numerically analyse the logistic map using the thermodynamic length. • Show how the unstable fixed points most efficiently lead the system to equilibrium.
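A toy discretisation of the idea (one common form of the statistical distance, not the paper's set-theoretic construction) can be sketched by evolving an ensemble of logistic-map trajectories and accumulating the distance between successive histograms:

```python
import numpy as np

rng = np.random.default_rng(1)
r, nbins, N = 4.0, 50, 100000

# Ensemble of logistic-map trajectories from a narrow initial distribution.
x = 0.2 + 0.001 * rng.random(N)
edges = np.linspace(0.0, 1.0, nbins + 1)

def hist(x):
    p, _ = np.histogram(x, bins=edges)
    return p / p.sum()

# Accumulate L as the sum of statistical distances between successive
# histograms; bins with p = 0 are simply skipped here, whereas the paper's
# set-theoretic construction treats zero probabilities more carefully.
L = 0.0
p = hist(x)
for _ in range(30):
    x = r * x * (1.0 - x)
    q = hist(x)
    mask = p > 0
    L += np.sqrt(np.sum((q[mask] - p[mask]) ** 2 / p[mask]))
    p = q
```

As the ensemble relaxes towards the invariant distribution of the map, the per-step increments shrink and L saturates, which is the behaviour the information length is designed to quantify.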
International Nuclear Information System (INIS)
EI-Shanshoury, G.I.
2011-01-01
Several statistical distributions are used to model various reliability and maintainability parameters. The applied distribution depends on the nature of the data being analyzed. This paper deals with the analysis of several statistical distributions used in reliability in order to identify the best-fitting distribution. The calculations rely on circuit quantity parameters obtained by using the Relex 2009 computer program. The statistical analysis of ten different distributions indicated that the Weibull distribution gives the best fit for modeling the reliability of the data set of a Temperature Alarm Circuit (TAC). However, the Exponential distribution is found to be the best fit for modeling the failure rate.
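A hedged sketch of this kind of distribution-fitting analysis, using synthetic failure times in place of the Relex-derived TAC data and a Kolmogorov-Smirnov distance to rank candidate fits:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Hypothetical failure times (hours); the paper's data came from Relex 2009
# output for a Temperature Alarm Circuit and are not reproduced here.
times = stats.weibull_min.rvs(1.8, scale=1000.0, size=200, random_state=rng)

# Fit candidate distributions and rank them by Kolmogorov-Smirnov distance.
fits = {}
for name, dist in [("weibull", stats.weibull_min), ("exponential", stats.expon)]:
    params = dist.fit(times, floc=0.0)
    fits[name] = stats.kstest(times, dist.name, args=params).statistic

best = min(fits, key=fits.get)   # smallest KS distance = best fit
```

Because the synthetic data have a Weibull shape parameter well away from 1, the Weibull fit wins here; with real data the ranking is, of course, an empirical question.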
Alimi, Jean-Michel; de Fromont, Paul
2018-04-01
The statistical properties of cosmic structures are well known to be strong probes for cosmology. In particular, several studies have tried to use the cosmic void counting number to obtain tight constraints on dark energy. In this paper, we model the statistical properties of these regions using the CoSphere formalism (de Fromont & Alimi) in both the primordial and the non-linearly evolved Universe in the standard Λ cold dark matter model. This formalism applies similarly for minima (voids) and maxima (such as DM haloes), which are here considered symmetrically. We first derive the full joint Gaussian distribution of CoSphere's parameters in the Gaussian random field. We recover the results of Bardeen et al. only in the limit where the compensation radius becomes very large, i.e. when the central extremum decouples from its cosmic environment. We compute the probability distribution of the compensation size in this primordial field. We show that this distribution is redshift independent and can be used to model cosmic voids size distribution. We also derive the statistical distribution of the peak parameters introduced by Bardeen et al. and discuss their correlation with the cosmic environment. We show that small central extrema with low density are associated with narrow compensation regions with deep compensation density, while higher central extrema are preferentially located in larger but smoother over/under massive regions.
On the Limit Distribution of Lower Extreme Generalized Order Statistics
Indian Academy of Sciences (India)
In a wide subclass of generalized order statistics (gOs), which contains most of the known and important models of ordered random variables, weak convergence of lower extremes is developed. A recent result of extreme value theory of m-gOs (as well as the classical extreme value theory of ordinary order statistics) ...
Larwin, Karen H.; Larwin, David A.
2011-01-01
Bootstrapping methods and random distribution methods are increasingly recommended as better approaches for teaching students about statistical inference in introductory-level statistics courses. The authors examined the effect of teaching undergraduate business statistics students using random distribution and bootstrapping simulations. It is the…
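A classroom-style percentile-bootstrap simulation of the sort being recommended might look like the following; the sample values are invented for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Invented classroom sample (e.g. exam scores out of 100).
sample = np.array([72, 85, 61, 90, 78, 66, 83, 74, 88, 69], dtype=float)

# Percentile bootstrap: resample with replacement, recompute the statistic,
# and read the 95% confidence interval off the resampling distribution.
boot_means = np.array([
    rng.choice(sample, size=sample.size, replace=True).mean()
    for _ in range(10000)
])
lo, hi = np.percentile(boot_means, [2.5, 97.5])
```

Students can see the sampling distribution of the mean emerge directly from resampling, with no appeal to normal-theory formulas.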
Fitting the Statistical Distribution for Daily Rainfall in Ibadan, Based ...
African Journals Online (AJOL)
PROF. O. E. OSUAGWU
2013-06-01
Jun 1, 2013 ... followed by the normal and Poisson models that have the same estimated rainfall amount for describing the daily rainfall in Ibadan metropolis. Keywords: scale parameter, asymptotically, exponential distribution, gamma distribution, Poisson and Kolmogorov-Smirnov. .... Equation (7) can be simply written as.
Statistical Tests for Frequency Distribution of Mean Gravity Anomalies
African Journals Online (AJOL)
The hypothesis that a very large number of 1° x 1° mean gravity anomalies are normally distributed has been rejected at the 5% significance level based on the χ² and the unit normal deviate tests. However, the 50 equal area mean anomalies derived from the 1° x 1° data, have been found to be normally distributed at the same ...
Baijal, Shruti; Nakatani, Chie; van Leeuwen, Cees; Srinivasan, Narayanan
2013-06-07
Human observers show remarkable efficiency in statistical estimation; they are able, for instance, to estimate the mean size of visual objects, even if their number exceeds the capacity limits of focused attention. This ability has been understood as the result of a distinct mode of attention, i.e. distributed attention. Compared to the focused attention mode, working memory representations under distributed attention are proposed to be more compressed, leading to reduced working memory loads. An alternate proposal is that distributed attention uses less structured, feature-level representations. These would fill up working memory (WM) more, even when target set size is low. Using event-related potentials, we compared WM loading in a typical distributed attention task (mean size estimation) to that in a corresponding focused attention task (object recognition), using a measure called contralateral delay activity (CDA). Participants performed both tasks on 2, 4, or 8 different-sized target disks. In the recognition task, CDA amplitude increased with set size; notably, however, in the mean estimation task the CDA amplitude was high regardless of set size. In particular for set-size 2, the amplitude was higher in the mean estimation task than in the recognition task. The result showed that the task involves full WM loading even with a low target set size. This suggests that in the distributed attention mode, representations are not compressed, but rather less structured than under focused attention conditions. Copyright © 2012 Elsevier Ltd. All rights reserved.
Statistical Control Paradigm for Aerospace Structures Under Impulsive Disturbances
National Research Council Canada - National Science Library
Pham, Khanh D; Robertson, Lawrence M
2006-01-01
In this paper, the newly developed statistical control theory is revisited to autonomously control the satellite attitude as well as to provide a means of actively attenuating impulsive disturbances...
Asymptotic distribution of ∆AUC, NRIs, and IDI based on theory of U-statistics.
Demler, Olga V; Pencina, Michael J; Cook, Nancy R; D'Agostino, Ralph B
2017-09-20
The change in area under the curve (∆AUC), the integrated discrimination improvement (IDI), and the net reclassification index (NRI) are commonly used measures of risk prediction model performance. Some authors have reported good validity of associated methods of estimating their standard errors (SE) and constructing confidence intervals, whereas others have questioned their performance. To address these issues, we unite the ∆AUC, IDI, and three versions of the NRI under the umbrella of the U-statistics family. We rigorously show that the asymptotic behavior of ∆AUC, NRIs, and IDI fits the asymptotic distribution theory developed for U-statistics. We prove that the ∆AUC, NRIs, and IDI are asymptotically normal, unless they compare nested models under the null hypothesis. In the latter case, asymptotic normality and existing SE estimates cannot be applied to ∆AUC, NRIs, or IDI. In the former case, SE formulas proposed in the literature are equivalent to SE formulas obtained from U-statistics theory if we ignore adjustment for estimated parameters. We use the Sukhatme-Randles-deWet condition to determine when adjustment for estimated parameters is necessary. We show that adjustment is not necessary for SEs of the ∆AUC and two versions of the NRI when added predictor variables are significant and normally distributed. The SEs of the IDI and three-category NRI should always be adjusted for estimated parameters. These results allow us to define when existing formulas for SE estimates can be used and when resampling methods such as the bootstrap should be used instead when comparing nested models. We also use U-statistic theory to develop a new SE estimate of ∆AUC. Copyright © 2017 John Wiley & Sons, Ltd.
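The U-statistic view of the AUC is easy to state concretely: the AUC equals the Mann-Whitney probability that a randomly chosen case outscores a randomly chosen control, with ties counted as one half. A sketch on synthetic scores (not the authors' data):

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical risk scores for controls (y = 0) and cases (y = 1).
x0 = rng.normal(0.0, 1.0, 300)   # controls
x1 = rng.normal(1.0, 1.0, 200)   # cases

# The AUC is a two-sample U-statistic: the Mann-Whitney estimate of
# P(case score > control score), counting ties as 1/2.
diff = x1[:, None] - x0[None, :]
auc = np.mean(diff > 0) + 0.5 * np.mean(diff == 0)
```

For unit-variance normal scores separated by one SD the true AUC is about 0.76, and the U-statistic estimate lands close to that value.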
Statistical analysis of rockfall volume distributions: Implications for rockfall dynamics
Dussauge, Carine; Grasso, Jean-Robert; Helmstetter, Agnès
2003-06-01
We analyze the volume distribution of natural rockfalls in different geological settings (i.e., calcareous cliffs in the French Alps, Grenoble area, and granite Yosemite cliffs, California Sierra) and different volume ranges (i.e., regional and worldwide catalogs). Contrary to previous studies that included several types of landslides, we restrict our analysis to rockfall sources which originated on subvertical cliffs. For the three data sets, we find that the rockfall volumes follow a power law distribution with a similar exponent value, within error bars. This power law distribution was also proposed for rockfall volumes that occurred along road cuts. All these results argue for a recurrent power law distribution of rockfall volumes on subvertical cliffs, for a large range of rockfall sizes (10²-10¹⁰ m³), regardless of the geological settings and of the preexisting geometry of fracture patterns that are drastically different in the three studied areas. The power law distribution for rockfall volumes could emerge from two types of processes. First, the observed power law distribution of rockfall volumes is similar to the one reported for both fragmentation experiments and fragmentation models. This argues for the geometry of rock mass fragment sizes possibly controlling the rockfall volumes. In this way neither cascade nor avalanche processes would influence the rockfall volume distribution. Second, without any requirement of scale-invariant quenched heterogeneity patterns, the rock mass dynamics can arise from avalanche processes driven by fluctuations of the rock mass properties, e.g., cohesion or friction angle. This model may also explain the power law distribution reported for landslides involving unconsolidated materials. We find that the exponent value of rockfall volumes on subvertical cliffs, 0.5 ± 0.2, is significantly smaller than the 1.2 ± 0.3 value reported for mixed landslide types. This change of exponents can be driven by the material strength, which
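A power-law exponent of the kind reported can be estimated by maximum likelihood (the Hill estimator); the sketch below checks the estimator on synthetic volumes drawn with a rockfall-like cumulative exponent b = 0.5. The sample itself is simulated, not observed.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic "rockfall volumes" drawn from a power law whose cumulative
# distribution has exponent b = 0.5, the value reported for subvertical cliffs.
vmin, b_true = 100.0, 0.5
u = 1.0 - rng.random(2000)                 # uniform on (0, 1]
v = vmin * u ** (-1.0 / b_true)            # inverse-transform sampling

# Maximum-likelihood (Hill) estimate of the cumulative exponent.
b_hat = v.size / np.sum(np.log(v / vmin))
```

With 2000 samples the estimate recovers b to within a few percent; the choice of the lower cutoff vmin is the delicate step with real catalogs.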
On the limit distribution of lower extreme generalized order statistics
Indian Academy of Sciences (India)
In testing the strength of materials, reliability analysis, lifetime studies, etc., the realizations of experiments arise in nondecreasing order and therefore we need to consider several models of ascendingly ordered rv's. Theoretically, many of these models are contained in the gOs model, such as ordinary order statistics ...
Optimal skill distribution under convex skill costs
Directory of Open Access Journals (Sweden)
Tin Cheuk Leung
2018-03-01
This paper studies optimal distribution of skills in an optimal income tax framework with convex skill constraints. The problem is cast as a social planning problem where a redistributive planner chooses how to distribute a given amount of aggregate skills across people. We find that optimal skill distribution is either perfectly equal or perfectly unequal, but an interior level of skill inequality is never optimal.
Optimization of Distributed Crawler under Hadoop
Zhang Xiaochen; Xian Ming
2015-01-01
Web crawler is an important link in the data acquisition of the World Wide Web. It is necessary to optimize traditional methods so as to meet the current needs in the face of the explosive growth of data. This paper introduces the process and the model of the current distributed crawler based on Hadoop, analyzes reasons for influencing the crawling efficiency, and points out defects of the parameter setting, the URL distribution and the operating model of distributed crawler. The working eff...
Dynamic Response to Pedestrian Loads with Statistical Frequency Distribution
DEFF Research Database (Denmark)
Krenk, Steen
2012-01-01
on the magnitude of the resulting response. A frequency representation of vertical pedestrian load is developed, and a compact explicit formula is developed for the magnitude of the resulting response, in terms of the damping ratio of the structure, the bandwidth of the pedestrian load, and the mean footfall...... frequency. The accuracy of the formula is verified by a statistical moment analysis using the Lyapunov equations....
statistical tests for frequency distribution of mean gravity anomalies
African Journals Online (AJOL)
ES Obe
1980-03-01
Mar 1, 1980 ... ABSTRACT. The hypothesis that a very large number of 1° x 1° mean gravity anomalies are normally distributed has been rejected at the 5% significance level based on the χ² and the unit normal deviate tests. However, the 50 equal area mean anomalies derived from the 1° x 1° data, have been found to be ...
Xylitol production by Candida tropicalis under different statistically ...
African Journals Online (AJOL)
Nutritional and environmental conditions of the xylose utilizing yeast Candida tropicalis were optimized on a shake-flask scale using a statistical factorial design to maximize the production of xylitol. Effects of the three growth medium components (rice bran, ammonium sulfate and xylose) on the xylitol production were ...
Lu, Y.; Qin, X. S.; Xie, Y. J.
2016-02-01
An integrated statistical and data-driven (ISD) framework was proposed for analyzing river flows and flood frequencies in the Duhe River Basin, China, under climate change. The proposed framework involved four major components: (i) a hybrid model based on ASD (Automated regression-based Statistical Downscaling tool) and KNN (K-nearest neighbor) was used for downscaling rainfall and CDEN (Conditional Density Estimate Network) was applied for downscaling minimum temperature and relative humidity from global circulation models (GCMs) to local weather stations; (ii) Bayesian neural network (BNN) was used for simulating monthly river flows based on projected weather information; (iii) KNN was applied for converting monthly flow to daily time series; (iv) Generalized Extreme Value (GEV) distribution was adopted for flood frequency analysis. In this study, the variables from CGCM3 A2 and HadCM3 A2 scenarios were employed as the large-scale predictors. The results indicated that the maximum monthly and annual runoffs would both increase under CGCM3 and HadCM3 A2 emission scenarios at the middle and end of this century. The flood risk in the study area would generally increase with a widening uncertainty range. Compared with traditional approaches, the proposed framework takes the full advantages of a series of statistical and data-driven methods and offers a parsimonious way of projecting flood risks under climatic change conditions.
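Component (iv), GEV-based flood frequency analysis, can be sketched as follows on synthetic annual maxima standing in for the simulated flow series; all parameter values are invented for the example.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

# Hypothetical annual-maximum flow series (m^3/s), standing in for the daily
# flows produced by the BNN/KNN steps of the framework.
ams = stats.genextreme.rvs(-0.1, loc=500.0, scale=120.0, size=60, random_state=rng)

# Step (iv): fit a GEV distribution to the annual maxima; the T-year flood
# is then the (1 - 1/T) quantile of the fitted distribution.
shape, loc, scale = stats.genextreme.fit(ams)
q10 = stats.genextreme.ppf(1.0 - 1.0 / 10.0, shape, loc=loc, scale=scale)
q100 = stats.genextreme.ppf(1.0 - 1.0 / 100.0, shape, loc=loc, scale=scale)
```

Repeating the fit across ensembles of downscaled scenarios, as the framework does, is what produces the widening uncertainty range in the projected flood risk.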
Metal Distribution and Mobility under alkaline conditions
International Nuclear Information System (INIS)
Dario, Maarten
2004-01-01
The adsorption of an element, expressed as its distribution between liquid (aquatic) and solid phases in the bio geosphere, largely determines its mobility and transport properties. This is of fundamental importance in the assessment of the performance of e.g. geologic repositories for hazardous elements like radionuclides. Geologic repositories for low and intermediate level nuclear waste will most likely be based on concrete constructions in a suitable bedrock, leading to a local chemical environment with pH well above 12. At this pH metal adsorption is very high, and thus the mobility is hindered. Organic complexing agents, such as natural humic matter from the ground and in the groundwater, as well as components in the waste (cleaning agents, degradation products from ion exchange resins and cellulose, cement additives etc.) would affect the sorption properties of the various elements in the waste. Trace element migration from a cementitious repository through the pH- and salinity gradient created around the repository would be affected by the presence and creation of particulate matter (colloids) that may serve as carriers that enhance the mobility. The objective of this thesis was to describe and quantify the sorption of some selected elements representative of spent nuclear fuel (Eu, Am) and other heavy metals (Zn, Cd, Hg) in a clay/cement environment (pH 10-13) and in the pH-gradient outside this environment. The potential of organic complexing agents and colloids to enhance metal migration was also investigated. It was shown that many organic ligands are able to reduce trace metal sorption under these conditions. It was not possible to calculate the effect of well-defined organic ligands on the metal sorption in a cement environment by using stability constants from the literature. A simple method for comparing the effect of different complexing agents on metal sorption is, however, suggested. The stability in terms of the particle size of suspended
Optimization of Distributed Crawler under Hadoop
Directory of Open Access Journals (Sweden)
Zhang Xiaochen
2015-01-01
Web crawler is an important link in the data acquisition of the World Wide Web. It is necessary to optimize traditional methods so as to meet the current needs in the face of the explosive growth of data. This paper introduces the process and the model of the current distributed crawler based on Hadoop, analyzes reasons for influencing the crawling efficiency, and points out defects of the parameter setting, the URL distribution and the operating model of distributed crawler. The working efficiency is improved through the optimization of parameter configuration and the optimizing effect is further enhanced through the model modification. The experiment indicates that the working efficiency of the distributed crawler after the optimization is increased by 23%, which achieves the expected result.
Statistical mechanics of the distribution of charge on particles in complex plasmas
International Nuclear Information System (INIS)
Sodha, M S; Mishra, S K; Misra, Shikha
2011-01-01
This paper presents an analytical study of the distribution of charge on the particles in a complex plasma; the study is based on statistical mechanics and ensures that the charge on the particles is an integral multiple of the electronic charge. The formulation incorporates both the number and energy balance of electrons/ions. Three specific cases of charging of particles have been considered, namely (i) in a plasma in the absence of electron emission from the particles, (ii) in a complex plasma in thermal equilibrium and (iii) in a complex plasma irradiated by monochromatic radiation, causing photoelectric emission of electrons from the particles. The effect of various parameters on the charge distribution has also been investigated. The results are in reasonably good agreement with the fluctuation theory for large values of Z (Ze is the charge on a particle). It is seen that under certain conditions, a significant number of oppositely charged particles occur in the complex plasma.
CDFTBL: A statistical program for generating cumulative distribution functions from data
International Nuclear Information System (INIS)
Eslinger, P.W.
1991-06-01
This document describes the theory underlying the CDFTBL code and gives details for using the code. The CDFTBL code provides an automated tool for generating a statistical cumulative distribution function that describes a set of field data. The cumulative distribution function is written in the form of a table of probabilities, which can be used in a Monte Carlo computer code. As a specific application, CDFTBL can be used to analyze field data collected for parameters required by the PORMC computer code. Section 2.0 discusses the mathematical basis of the code. Section 3.0 discusses the code structure. Section 4.0 describes the free-format input command language, while Section 5.0 describes in detail the commands to run the program. Section 6.0 provides example program runs, and Section 7.0 provides references. The Appendix provides a program source listing. 11 refs., 2 figs., 19 tabs
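The core idea, a tabulated empirical CDF that a Monte Carlo code samples by inverse transform, can be sketched as follows (a generic reimplementation, not the CDFTBL source; the data are synthetic):

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic field data standing in for measurements of a PORMC-type parameter.
data = rng.gamma(2.0, 50.0, size=500)

# Build the CDF table: sorted values paired with cumulative probabilities.
values = np.sort(data)
probs = np.arange(1, values.size + 1) / values.size

# A Monte Carlo code samples the parameter by inverse transform on the table.
def sample(n):
    return np.interp(rng.random(n), probs, values)

draws = sample(10000)
```

Samples drawn this way are confined to the observed range and reproduce the empirical distribution of the field data, which is exactly what a table of probabilities feeds into a Monte Carlo run.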
New distributions of the statistical time delay of electrical breakdown in nitrogen
International Nuclear Information System (INIS)
Markovic, V Lj; Gocic, S R; Stamenkovic, S N
2006-01-01
Two new distributions of the statistical time delay of electrical breakdown in nitrogen are reported in this paper. The Gaussian and Gauss-exponential distributions of statistical time delay have been obtained on the basis of thousands of time delay measurements on a gas tube with a plane-parallel electrode system. Distributions of the statistical time delay are theoretically founded on the binomial distribution for the occurrence of initiating electrons and described by using simple analytical and numerical models. The shapes of the distributions depend on the electron yields in the interelectrode space originating from residual states. It is shown that a distribution of the statistical time delay changes from exponential and Gauss-exponential to Gaussian due to the influence of residual ionization.
Statistical Processes Under Change: Enhancing Data Quality with Pretests
Radermacher, Walter; Sattelberger, Sabine
Statistical offices in Europe, in particular the Federal Statistical Office in Germany, are meeting users’ ever more demanding requirements with innovative and appropriate responses, such as the multiple sources mixed-mode design model. This combines various objectives: reducing survey costs and the burden on interviewees, and maximising data quality. The same improvements are also being sought by way of the systematic use of pretests to optimise survey documents. This paper provides a first impression of the many procedures available. An ideal pretest combines both quantitative and qualitative test methods. Quantitative test procedures can be used to determine how often particular input errors arise. The questionnaire is tested in the field in the corresponding survey mode. Qualitative test procedures can find the reasons for input errors. Potential interviewees are included in the questionnaire tests, and their feedback on the survey documentation is systematically analysed and used to upgrade the questionnaire. This was illustrated in our paper by an example from business statistics (“Umstellung auf die Wirtschaftszweigklassifikation 2008” - Change-over to the 2008 economic sector classification). This pretest not only gave important clues about how to improve the contents, but also helped to realistically estimate the organisational cost of the main survey.
Distribution, Statistics, and Resurfacing of Large Impact Basins on Mercury
Fassett, Caleb I.; Head, James W.; Baker, David M. H.; Chapman, Clark R.; Murchie, Scott L.; Neumann, Gregory A.; Oberst, Juergen; Prockter, Louise M.; Smith, David E.; Solomon, Sean C.;
2012-01-01
The distribution and geological history of large impact basins (diameter D greater than or equal to 300 km) on Mercury are important to understanding the planet's stratigraphy and surface evolution. It is also informative to compare the density of impact basins on Mercury with that of the Moon to understand similarities and differences in their impact crater and basin populations [1, 2]. A variety of impact basins were proposed on the basis of geological mapping with Mariner 10 data [e.g. 3]. This basin population can now be re-assessed and extended to the full planet, using data from the MErcury Surface, Space ENvironment, GEochemistry, and Ranging (MESSENGER) spacecraft. Note that small- to medium-sized peak-ring basins on Mercury are being examined separately [4, 5]; only the three largest peak-ring basins on Mercury overlap with the size range we consider here. In this study, we (1) re-examine the large basins suggested on the basis of Mariner 10 data, (2) suggest additional basins from MESSENGER's global coverage of Mercury, (3) assess the size-frequency distribution of mercurian basins on the basis of these global observations and compare it to the Moon, and (4) analyze the implications of these observations for the modification history of basins on Mercury.
Modeling Malaria Vector Distribution under Climate Change Scenarios in Kenya
Ngaina, J. N.
2017-12-01
Projecting the distribution of malaria vectors under climate change is essential for planning integrated vector control strategies for sustaining elimination and preventing reintroduction of malaria. However, in Kenya, little knowledge exists on the possible effects of climate change on malaria vectors. Here we assess the potential impact of future climate change on the locally dominant Anopheles vectors, including Anopheles gambiae, Anopheles arabiensis, Anopheles merus, Anopheles funestus, Anopheles pharoensis and Anopheles nili. Environmental data (climate, land cover and elevation) and primary empirical geo-located species-presence data were identified. The principle of maximum entropy (Maxent) was used to model each species' potential distribution area under paleoclimate, current and future climates. The Maxent model was highly accurate, with a statistically significant AUC value. Simulation-based estimates suggest that the environmentally suitable area (ESA) for Anopheles gambiae, An. arabiensis, An. funestus and An. pharoensis would increase under both scenarios for mid-century (2016-2045), but decrease for end century (2071-2100). An increase in the ESA of An. funestus was estimated under the medium stabilizing (RCP4.5) and very heavy (RCP8.5) emission scenarios for mid-century. Our findings can be applied in various ways, such as the identification of additional localities where Anopheles malaria vectors may already exist but have not yet been detected, and the recognition of localities to which they are likely to spread. Moreover, they will help guide future sampling location decisions, help with the planning of vector control suites nationally, and encourage broader research inquiry into vector species niche modeling.
Charged-particle thermonuclear reaction rates: I. Monte Carlo method and statistical distributions
International Nuclear Information System (INIS)
Longland, R.; Iliadis, C.; Champagne, A.E.; Newton, J.R.; Ugalde, C.; Coc, A.; Fitzgerald, R.
2010-01-01
A method based on Monte Carlo techniques is presented for evaluating thermonuclear reaction rates. We begin by reviewing commonly applied procedures and point out that reaction rates that have been reported up to now in the literature have no rigorous statistical meaning. Subsequently, we associate each nuclear physics quantity entering in the calculation of reaction rates with a specific probability density function, including Gaussian, lognormal and chi-squared distributions. Based on these probability density functions the total reaction rate is randomly sampled many times until the required statistical precision is achieved. This procedure results in a median (Monte Carlo) rate which agrees under certain conditions with the commonly reported recommended 'classical' rate. In addition, we present at each temperature a low rate and a high rate, corresponding to the 0.16 and 0.84 quantiles of the cumulative reaction rate distribution. These quantities are in general different from the statistically meaningless 'minimum' (or 'lower limit') and 'maximum' (or 'upper limit') reaction rates which are commonly reported. Furthermore, we approximate the output reaction rate probability density function by a lognormal distribution and present, at each temperature, the lognormal parameters μ and σ. The values of these quantities will be crucial for future Monte Carlo nucleosynthesis studies. Our new reaction rates, appropriate for bare nuclei in the laboratory, are tabulated in the second paper of this issue (Paper II). The nuclear physics input used to derive our reaction rates is presented in the third paper of this issue (Paper III). In the fourth paper of this issue (Paper IV) we compare our new reaction rates to previous results.
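A stripped-down version of the Monte Carlo procedure, with invented uncertainties standing in for real nuclear physics inputs: each input is sampled from its assigned distribution, the rate is formed, and the 0.16/0.50/0.84 quantiles and lognormal summary parameters are extracted.

```python
import numpy as np

rng = np.random.default_rng(8)
N = 20000

# Invented inputs: a resonance strength with lognormal uncertainty and an
# S-factor with Gaussian uncertainty; the toy rate is their product.
strength = np.exp(rng.normal(np.log(1.0e-3), 0.2, N))  # lognormal sample
sfactor = rng.normal(5.0, 0.5, N)                      # Gaussian sample
rate = strength * sfactor

# Report the 0.16, 0.50 (median) and 0.84 quantiles of the rate distribution,
# then summarise it with lognormal parameters mu and sigma.
low, med, high = np.quantile(rate, [0.16, 0.50, 0.84])
mu, sigma = np.mean(np.log(rate)), np.std(np.log(rate))
```

For a near-lognormal output the median (the recommended rate) is close to exp(mu), and (low, high) play the role of the statistically meaningful low and high rates described above.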
Distribution-level electricity reliability: Temporal trends using statistical analysis
International Nuclear Information System (INIS)
Eto, Joseph H.; LaCommare, Kristina H.; Larsen, Peter; Todd, Annika; Fisher, Emily
2012-01-01
This paper helps to address the lack of comprehensive, national-scale information on the reliability of the U.S. electric power system by assessing trends in U.S. electricity reliability based on the information reported by the electric utilities on power interruptions experienced by their customers. The research analyzes up to 10 years of electricity reliability information collected from 155 U.S. electric utilities, which together account for roughly 50% of total U.S. electricity sales. We find that reported annual average duration and annual average frequency of power interruptions have been increasing over time at a rate of approximately 2% annually. We find that, independent of this trend, installation or upgrade of an automated outage management system is correlated with an increase in the reported annual average duration of power interruptions. We also find that reliance on IEEE Standard 1366-2003 is correlated with higher reported reliability compared to reported reliability not using the IEEE standard. However, we caution that we cannot attribute reliance on the IEEE standard as having caused or led to higher reported reliability because we could not separate the effect of reliance on the IEEE standard from other utility-specific factors that may be correlated with reliance on the IEEE standard. - Highlights: ► We assess trends in electricity reliability based on the information reported by the electric utilities. ► We use rigorous statistical techniques to account for utility-specific differences. ► We find modest declines in reliability when analyzing interruption duration and frequency experienced by utility customers. ► Installation or upgrade of an OMS is correlated with an increase in reported duration of power interruptions. ► We find reliance on IEEE Standard 1366 is correlated with higher reported reliability.
Investigating the Statistical Distribution of Learning Coverage in MOOCs
Directory of Open Access Journals (Sweden)
Xiu Li
2017-11-01
Full Text Available Learners participating in Massive Open Online Courses (MOOCs) have a wide range of backgrounds and motivations. Many MOOC learners enroll in the courses to take a brief look; only a few go through the entire content, and even fewer are able to eventually obtain a certificate. We discovered this phenomenon after having examined 92 courses on both the xuetangX and edX platforms. More specifically, we found that the learning coverage in many courses, one of the metrics used to estimate the learners' active engagement with the online courses, follows a Zipf distribution. We apply the maximum likelihood estimation method to fit Zipf's law and test our hypothesis using a chi-square test. In the xuetangX dataset, the learning coverage in 53 of 76 courses fits Zipf's law, whereas in all 16 courses on the edX platform the learning coverage rejects Zipf's law. The result from our study is expected to bring insight into the unique learning behavior in MOOCs.
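A minimal version of the fitting step (maximum likelihood for the Zipf exponent followed by a Pearson chi-square check) might look like the sketch below; the rank range, sample size and the "true" exponent are made-up stand-ins for the per-course coverage data.

```python
import math
import random

random.seed(0)

R = 30          # number of coverage ranks (hypothetical)
true_s = 1.2    # Zipf exponent used to generate fake counts

# Draw 5000 "learners" whose coverage rank follows a Zipf law.
weights = [r ** -true_s for r in range(1, R + 1)]
total_w = sum(weights)
counts = [0] * R
for _ in range(5000):
    u, acc = random.random() * total_w, 0.0
    for i, w in enumerate(weights):
        acc += w
        if u <= acc:
            counts[i] += 1
            break
    else:
        counts[-1] += 1   # guard against floating-point round-off

def zipf_loglik(s):
    norm = sum(r ** -s for r in range(1, R + 1))
    return sum(c * (-s * math.log(r) - math.log(norm))
               for r, c in zip(range(1, R + 1), counts))

# Maximum likelihood estimate of the exponent by grid search
# (a real analysis would use a proper optimizer).
s_hat = max((s / 100 for s in range(50, 300)), key=zipf_loglik)

# Pearson chi-square statistic of the observed counts against the fit.
n = sum(counts)
norm = sum(r ** -s_hat for r in range(1, R + 1))
expected = [n * r ** -s_hat / norm for r in range(1, R + 1)]
chi2 = sum((o - e) ** 2 / e for o, e in zip(counts, expected))
```

Comparing `chi2` to the chi-square critical value for R − 2 degrees of freedom gives the accept/reject decision the abstract reports per course.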
Derivation of some new distributions in statistical mechanics using maximum entropy approach
Directory of Open Access Journals (Sweden)
Ray Amritansu
2014-01-01
Full Text Available The maximum entropy principle has earlier been used to derive the Bose-Einstein (B.E.), Fermi-Dirac (F.D.) and Intermediate Statistics (I.S.) distributions of statistical mechanics. The central idea of these distributions is to predict the distribution of the microstates, which are the particles of the system, on the basis of the knowledge of some macroscopic data. The latter information is specified in the form of some simple moment constraints. One distribution differs from the other in the way in which the constraints are specified. In the present paper, we have derived some new distributions similar to the B.E. and F.D. distributions of statistical mechanics by using the maximum entropy principle. Some proofs of the B.E. and F.D. distributions are shown, and at the end some new results are discussed.
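For orientation, the textbook form of the result the abstract builds on: maximizing the appropriate entropy functional subject to the particle-number and energy constraints,

```latex
\max_{\{n_i\}} S[\{n_i\}]
\quad\text{subject to}\quad
\sum_i n_i = N,
\qquad
\sum_i n_i\,\varepsilon_i = E,
\qquad\Longrightarrow\qquad
\bar{n}_i = \frac{1}{e^{\alpha + \beta\varepsilon_i} \mp 1}
\quad (-:\ \text{B.E.},\ \ +:\ \text{F.D.})
```

where $\alpha$ and $\beta$ are the Lagrange multipliers of the two constraints ($\beta = 1/kT$ at equilibrium). The new distributions of the paper arise from changing how these moment constraints are specified.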
International Nuclear Information System (INIS)
Zambra, M.; Favre, M.; Moreno, J.; Wyndham, E.; Chuaqui, H.; Choi, P.
1998-01-01
The charge formation processes in the hollow cathode region (HCR) of a transient hollow cathode discharge have been studied in the final phase. The statistical distributions that describe the different ionization processes have been represented by Gaussian distributions. Nevertheless, a better representation of these distributions was observed when the pressure is near a minimum value, just before breakdown
ERROR DISTRIBUTION EVALUATION OF THE THIRD VANISHING POINT BASED ON RANDOM STATISTICAL SIMULATION
Directory of Open Access Journals (Sweden)
C. Li
2012-07-01
Full Text Available POS, integrated by GPS/INS (Inertial Navigation Systems), has allowed rapid and accurate determination of position and attitude of remote sensing equipment for MMS (Mobile Mapping Systems). However, not only does INS have systematic error, but it is also very expensive. Therefore, in this paper the error distributions of vanishing points are studied and tested in order to substitute INS for MMS in some special land-based scenes, such as ground façades where usually only two vanishing points can be detected. Thus, the traditional calibration approach based on three orthogonal vanishing points is being challenged. In this article, firstly, the line clusters, which are parallel to each other in object space and correspond to the vanishing points, are detected based on RANSAC (Random Sample Consensus) and a parallelism geometric constraint. Secondly, condition adjustment with parameters is utilized to estimate the nonlinear error equations of the two vanishing points (VX, VY); how to set initial weights for the adjustment solution of single-image vanishing points is presented. The vanishing points are solved and their error distributions estimated based on an iteration method with variable weights, the co-factor matrix and error ellipse theory. Thirdly, under the condition of known error ellipses of the two vanishing points (VX, VY) and on the basis of the triangle geometric relationship of the three vanishing points, the error distribution of the third vanishing point (VZ) is calculated and evaluated by random statistical simulation, ignoring camera distortion. Moreover, the Monte Carlo methods utilized for random statistical estimation are presented. Finally, experimental results for the vanishing point coordinates and their error distributions are shown and analyzed.
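The third step (Monte Carlo propagation of the VX, VY error ellipses to VZ) can be sketched as follows. The geometric relation used, that with zero distortion the principal point is the orthocenter of the triangle formed by three orthogonal vanishing points, is the standard single-view calibration fact; the coordinates, the isotropic 1-pixel ellipses and the principal point below are invented for the sketch.

```python
import random
import statistics

random.seed(1)

def third_vp(vx, vy, pp):
    # With no lens distortion and a known principal point pp, pp is the
    # orthocenter of the triangle VX-VY-VZ, so VZ is the intersection of
    #   L1: the line through pp perpendicular to (VY - VX), and
    #   L2: the line through VY perpendicular to (pp - VX).
    d1 = (-(vy[1] - vx[1]), vy[0] - vx[0])      # direction of L1
    d2 = (-(pp[1] - vx[1]), pp[0] - vx[0])      # direction of L2
    bx, by = vy[0] - pp[0], vy[1] - pp[1]
    det = -d1[0] * d2[1] + d2[0] * d1[1]        # solve pp + t*d1 = vy + s*d2
    t = (-bx * d2[1] + d2[0] * by) / det
    return (pp[0] + t * d1[0], pp[1] + t * d1[1])

# Hypothetical detected vanishing points with isotropic 1-pixel error
# ellipses; the principal point is taken as known and error-free.
VX, VY, PP = (-100.0, -50.0), (100.0, -50.0), (0.0, 0.0)
SIGMA = 1.0

zs = [third_vp((VX[0] + random.gauss(0, SIGMA), VX[1] + random.gauss(0, SIGMA)),
               (VY[0] + random.gauss(0, SIGMA), VY[1] + random.gauss(0, SIGMA)),
               PP)
      for _ in range(20000)]

mean_x = statistics.fmean(z[0] for z in zs)
mean_y = statistics.fmean(z[1] for z in zs)
std_x = statistics.stdev(z[0] for z in zs)
std_y = statistics.stdev(z[1] for z in zs)
```

The sample covariance of the simulated VZ points then yields its error ellipse, which is what the paper evaluates.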
Higher Moments of Underlying Event Distributions
Xu, Zhen
2017-01-01
We perform an Underlying Event analysis for real data sets from pp collisions at center-of-mass energies $\sqrt{s}=5$ and 13 TeV and pPb collisions at $\sqrt{s}=7$ TeV at the LHC, together with Monte Carlo data sets generated with Pythia8 and EPOS under the same conditions. The analysis is focused on the transverse region, which is most sensitive to the Underlying Event, and is performed as a function of the leading track transverse momentum $p_t$ in each event. In our work, not only the average underlying event activity but also its fluctuations, namely its root mean square (RMS), skewness and kurtosis, are analyzed. We find that the particle density, energy density and their fluctuation magnitude (RMS) are suppressed at leading $p_t\approx$ 5 GeV/c in all these cases, with EPOS showing an evident deviation of 10\%-25\%. The higher moments skewness and kurtosis decrease rapidly in the low leading $p_t$ region, and follow an interesting Gaussian-like peak centered at leading $p_t\approx$ 15 GeV/c.
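The fluctuation measures named in the abstract (RMS, skewness, kurtosis) are ordinary central-moment ratios; a small helper, exercised here on a Gaussian stand-in sample rather than real event multiplicities, might read:

```python
import math
import random
import statistics

random.seed(7)

def higher_moments(xs):
    """Mean, RMS (std about the mean), skewness and excess kurtosis."""
    n = len(xs)
    mean = statistics.fmean(xs)
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m3 = sum((x - mean) ** 3 for x in xs) / n
    m4 = sum((x - mean) ** 4 for x in xs) / n
    # Excess kurtosis convention: 0 for a Gaussian.
    return mean, math.sqrt(m2), m3 / m2 ** 1.5, m4 / m2 ** 2 - 3.0

# Stand-in for a per-event transverse-region observable; a Gaussian
# sample should give skewness ~ 0 and excess kurtosis ~ 0.
sample = [random.gauss(10.0, 2.0) for _ in range(50000)]
mean, rms, skew, kurt = higher_moments(sample)
```

In the analysis these four numbers would be computed per leading-$p_t$ bin from the event-by-event activity in that bin.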
On a New Class of Univariate Continuous Distributions that are Closed Under Inversion
Directory of Open Access Journals (Sweden)
Saleha Naghmi Habibullah
2006-07-01
Full Text Available Inverted probability distributions find applications in various real-life situations including econometrics, survey sampling, biological sciences and life-testing. Closure under inversion implies that the reciprocal of a continuous random variable X has the same probability function as the original random variable, allowing for a possible change in parameter values. To date, only a very few probability distributions have been found to possess the closure property. In this paper, an attempt has been made to generate a class of distributions that are closed under inversion, and to develop some statistical properties of this class of distributions.
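Closure under inversion is easy to check numerically for a known member of the class: the log-normal with median 1, for which X and 1/X are identically distributed. The sample size and sigma below are arbitrary.

```python
import random

random.seed(3)

# The log-normal with mu = 0 is closed under inversion in the strongest
# sense: if X ~ LogNormal(0, sigma), then 1/X has exactly the same
# distribution. (For mu != 0 the reciprocal stays in the family with mu
# changing sign, the "possible change in parameter values" allowed by
# the definition.)
xs = sorted(random.lognormvariate(0.0, 0.8) for _ in range(100000))
inv = sorted(1.0 / x for x in xs)

def quantile(sorted_xs, q):
    return sorted_xs[int(q * (len(sorted_xs) - 1))]

# Quantiles of X and of 1/X should agree up to sampling noise.
pairs = [(quantile(xs, q), quantile(inv, q))
         for q in (0.1, 0.25, 0.5, 0.75, 0.9)]
max_rel_err = max(abs(a - b) / b for a, b in pairs)
```

A formal test would replace the quantile comparison with a two-sample goodness-of-fit statistic, but the idea is the same.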
International Nuclear Information System (INIS)
Gupta, S.S.; Panchapakesan, S.
1975-01-01
A quantile selection procedure in reliability problems pertaining to a restricted family of probability distributions is discussed. This family is assumed to be star-ordered with respect to the standard normal distribution folded at the origin. Motivation for this formulation of the problem is described. Both exact and asymptotic results dealing with the distribution of the maximum of ratios of order statistics from such a family are obtained and tables of the appropriate constants, percentiles of this statistic, are given in order to facilitate the use of the selection procedure
Directory of Open Access Journals (Sweden)
Naim L. Braha
2019-10-01
Full Text Available Let $(x_k)$, for $k\in \mathbb{N}\cup \{0\}$, be a sequence of real or complex numbers and set $(EC)_{n}^{1}=\frac{1}{2^n}\sum_{j=0}^{n}\binom{n}{j}\frac{1}{j+1}\sum_{v=0}^{j}x_v$, $n\in \mathbb{N}\cup \{0\}$. We present necessary and sufficient conditions under which $st\text{-}\lim x_k = L$ follows from $st\text{-}\lim (EC)_{n}^{1} = L$, where $L$ is a finite number. If $(x_k)$ is a sequence of real numbers, then these are one-sided Tauberian conditions. If $(x_k)$ is a sequence of complex numbers, then these are two-sided Tauberian conditions.
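The Euler-Cesàro means $(EC)_n^1$ in the abstract can be computed directly. The sketch below checks regularity (if $x_k \to L$, then $(EC)_n^1 \to L$) on a toy sequence; the paper's contribution is the Tauberian converse, which needs the extra conditions.

```python
import math

def euler_cesaro(x):
    """The (EC)_n^1 means of x_0, ..., x_N."""
    # Inner Cesaro-type averages c_j = (1/(j+1)) * sum_{v<=j} x_v,
    # then the binomially weighted Euler mean of the c_j.
    c, s = [], 0.0
    for j, xv in enumerate(x):
        s += xv
        c.append(s / (j + 1))
    return [sum(math.comb(n, j) * c[j] for j in range(n + 1)) / 2 ** n
            for n in range(len(x))]

# Regularity check on a convergent toy sequence x_k -> 2.
x = [2.0 + (-0.5) ** k for k in range(40)]
ec = euler_cesaro(x)
```

The means converge noticeably more slowly than the sequence itself, which is why the converse direction requires Tauberian conditions at all.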
Derivation of reference distribution functions for Tokamak-plasmas by statistical thermodynamics
International Nuclear Information System (INIS)
Sonnino, G.; Peeters, P.; Nardone, P.; Cardinali, A.; Steinbrecher, G.; Sonnino, A.
2014-01-01
A general approach for deriving the expression of reference distribution functions by statistical thermodynamics is illustrated and applied to the case of a magnetically confined plasma. The local equilibrium is defined by imposing the minimum entropy production principle, which applies only in the linear regime near a stationary thermodynamically non-equilibrium state, and the maximum entropy principle under scale-invariance restrictions. This procedure may be adopted for a system subject to an arbitrary number of thermodynamic forces; for concreteness, however, we analyze a magnetically confined plasma subject to three thermodynamic forces and three energy sources: (i) the total Ohmic heat, supplied by the transformer coil; (ii) the energy supplied by neutral beam injection (NBI); and (iii) the RF energy supplied by the ion cyclotron resonant heating (ICRH) system, which heats the minority population. In this limit case, we show that the derived expression of the distribution function is more general than the one currently used for fitting the numerical steady-state solutions obtained by simulating the plasma with gyro-kinetic codes. An application to a simple model of fully ionized plasmas subject to an external source is discussed. Through kinetic theory, we fixed the values of the free parameters by linking them with the external power supplies. The singularity at low energy in the proposed distribution function is related to intermittency in the turbulent plasma. (authors)
Project management under uncertainty beyond beta: The generalized bicubic distribution
Directory of Open Access Journals (Sweden)
José García Pérez
2016-01-01
Full Text Available The beta distribution has traditionally been employed in the PERT methodology and is generally used for modeling bounded continuous random variables based on expert judgment. The impossibility of estimating four parameters from the three values provided by the expert when the beta distribution is assumed to be the underlying distribution has been widely debated. This paper presents the generalized bicubic distribution as a good alternative to the beta distribution since, when the variance depends on the mode, the generalized bicubic distribution approximates the kurtosis of the Gaussian distribution better than the beta distribution does. In addition, this distribution presents good properties in the PERT methodology in relation to moderation and conservatism criteria. Two empirical applications are presented to demonstrate the adequacy of this new distribution.
Recurrence relations for higher moments of order statistics from doubly truncated Burr distribution
Directory of Open Access Journals (Sweden)
Narinder Pushkarna
2014-01-01
Full Text Available In this paper, we have obtained recurrence relations for higher moments of order statistics from the doubly truncated Burr distribution, which enable one to obtain all the single, double (product) and higher moments of any order of all order statistics for any sample size from the doubly truncated Burr distribution in a simple recursive manner, thus generalizing the earlier work done by Khan and Khan (1987) and also by Pushkarna, Saran and Tiwari (2012).
Bellera, Carine A.; Julien, Marilyse; Hanley, James A.
2010-01-01
The Wilcoxon statistics are usually taught as nonparametric alternatives for the 1- and 2-sample Student-"t" statistics in situations where the data appear to arise from non-normal distributions, or where sample sizes are so small that we cannot check whether they do. In the past, critical values, based on exact tail areas, were…
STATISTICAL DISTRIBUTION PATTERNS IN MECHANICAL AND FATIGUE PROPERTIES OF METALLIC MATERIALS
Tatsuo, SAKAI; Masaki, NAKAJIMA; Keiro, TOKAJI; Norihiko, HASEGAWA; Department of Mechanical Engineering, Ritsumeikan University; Department of Mechanical Engineering, Toyota College of Technology; Department of Mechanical Engineering, Gifu University; Department of Mechanical Engineering, Gifu University
1997-01-01
Many papers on the statistical aspect of materials strength have been collected and reviewed by The Research Group for Statistical Aspects of Materials Strength. A book, "Statistical Aspects of Materials Strength", was written by this group and published in 1992. Based on the experimental data compiled in this book, distribution patterns of mechanical properties are systematically surveyed, paying attention to metallic materials. Thus one can obtain the fundamental knowledge for a reliabilit...
Jiang, Quan; Zhong, Shan; Cui, Jie; Feng, Xia-Ting; Song, Leibo
2016-12-01
We investigated the statistical characteristics and probability distributions of the mechanical parameters of natural rock using triaxial compression tests. Twenty cores of Jinping marble were tested at each of five levels of confining stress (i.e., 5, 10, 20, 30, and 40 MPa). From these full stress-strain data, we summarized the numerical characteristics and determined the probability distribution forms of several important mechanical parameters, including deformational parameters, characteristic strengths, characteristic strains, and the failure angle. The statistical proofs relating to the mechanical parameters of rock presented new information about the marble's probabilistic distribution characteristics. The normal and log-normal distributions were appropriate for describing the random strengths of rock; the coefficients of variation of the peak strengths had no relationship to the confining stress; the only acceptable random distribution for both Young's elastic modulus and Poisson's ratio was the log-normal function; and the cohesive strength had a different probability distribution pattern than the frictional angle. The triaxial tests and statistical analysis also provided experimental evidence for deciding the minimum reliable number of experimental samples and for picking appropriate parameter distributions to use in reliability calculations for rock engineering.
Zhang, Yu; Li, Fei; Zhang, Shengkai; Zhu, Tingting
2017-04-01
Synthetic Aperture Radar (SAR) is significantly important for polar remote sensing since it can provide continuous observations on all days and in all weather. SAR can be used for extracting the surface roughness information characterized by the variance of dielectric properties and different polarization channels, which makes it possible to observe different ice types and surface structure for deformation analysis. In November 2016, the Chinese National Antarctic Research Expedition (CHINARE) 33rd cruise set sail into the sea ice zone in the Antarctic. An accurate spatial distribution of leads in the sea ice zone is essential for the routine planning of ship navigation. In this study, the semantic relationship between leads and sea ice categories has been described by the Conditional Random Fields (CRF) model, and lead characteristics have been modeled by statistical distributions in SAR imagery. In the proposed algorithm, a mixture statistical distribution based CRF is developed by considering the contextual information and the statistical characteristics of sea ice for improving leads detection in Sentinel-1A dual polarization SAR imagery. The unary potential and pairwise potential in the CRF model are constructed by integrating the posterior probabilities estimated from statistical distributions. For mixture statistical distribution parameter estimation, the Method of Logarithmic Cumulants (MoLC) is exploited for single statistical distribution parameter estimation. The iteration based Expectation Maximization (EM) algorithm is investigated to calculate the parameters in the mixture statistical distribution based CRF model. In the posterior probability inference, a graph-cut energy minimization method is adopted in the initial leads detection. Post-processing procedures including an aspect ratio constraint and spatial smoothing approaches are utilized to improve the visual result. The proposed method is validated on Sentinel-1A SAR C-band Extra Wide Swath (EW) Ground Range Detected (GRD) imagery with a
Statistical distribution of building lot frontage: application for Tokyo downtown districts
Usui, Hiroyuki
2018-03-01
The frontage of a building lot is a determinant factor of the residential environment. The statistical distribution of building lot frontages shows how the perimeters of urban blocks are shared by building lots for a given density of buildings and roads. For practitioners in urban planning, this is indispensable for identifying potential districts which comprise a high percentage of building lots with narrow frontage after subdivision, and for reconsidering the appropriate criteria for the density of buildings and roads as residential environment indices. In the literature, however, the relationship between the statistical distribution of building lot frontages and the density of buildings and roads has not been fully researched. In this paper, based on an empirical study of the downtown districts of Tokyo, it is found that (1) a log-normal distribution fits the observed distribution of building lot frontages better than a gamma distribution, which is the model of the size distribution of Poisson Voronoi cells on closed curves; (2) the statistical distribution of building lot frontages follows a log-normal distribution whose parameters are the gross building density, road density, average road width, the coefficient of variation of building lot frontage, and the ratio of the number of building lot frontages to the number of buildings; and (3) the values of the coefficient of variation of building lot frontages and of the ratio of the number of building lot frontages to that of buildings are approximately equal to 0.60 and 1.19, respectively.
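Finding (1), that a log-normal beats a moment-matched gamma on frontage data, can be reproduced on synthetic data. The 5 m mean frontage below is an arbitrary stand-in; the coefficient of variation of 0.60 is the value the abstract reports.

```python
import math
import random
import statistics

random.seed(5)

# Synthetic frontages (metres): log-normal with CV = 0.60.
cv = 0.60
sigma = math.sqrt(math.log(1.0 + cv ** 2))
mu = math.log(5.0) - sigma ** 2 / 2.0      # mean frontage ~ 5 m
data = [random.lognormvariate(mu, sigma) for _ in range(20000)]

# Log-normal fit: the MLE is the mean/std of the log data.
logs = [math.log(x) for x in data]
mu_hat, sig_hat = statistics.fmean(logs), statistics.pstdev(logs)

# Gamma fit by the method of moments: k = mean^2/var, theta = var/mean.
m, v = statistics.fmean(data), statistics.pvariance(data)
k, theta = m * m / v, v / m

def ll_lognorm(x):
    z = (math.log(x) - mu_hat) / sig_hat
    return -math.log(x * sig_hat * math.sqrt(2.0 * math.pi)) - z * z / 2.0

def ll_gamma(x):
    return ((k - 1.0) * math.log(x) - x / theta
            - k * math.log(theta) - math.lgamma(k))

# Total log-likelihoods: the log-normal should win on log-normal data.
ll_ln = sum(ll_lognorm(x) for x in data)
ll_g = sum(ll_gamma(x) for x in data)
```

On real frontage data the comparison would of course use the observed sample in place of the synthetic one, with an information criterion or goodness-of-fit test for the final call.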
Comparing simulated and theoretical sampling distributions of the U3 person-fit statistic
Emons, Wilco H.M.; Meijer, R.R.; Sijtsma, Klaas
2002-01-01
The accuracy with which the theoretical sampling distribution of van der Flier’s person-fit statistic U3 approaches the empirical U3 sampling distribution is affected by the item discrimination. A simulation study showed that for tests with a moderate or a strong mean item discrimination, the Type I
The Effect of Distributed Practice in Undergraduate Statistics Homework Sets: A Randomized Trial
Crissinger, Bryan R.
2015-01-01
Most homework sets in statistics courses are constructed so that students concentrate or "mass" their practice on a certain topic in one problem set. Distributed practice homework sets include review problems in each set so that practice on a topic is distributed across problem sets. There is a body of research that points to the…
Current state of the art for statistical modeling of species distributions [Chapter 16
Troy M. Hegel; Samuel A. Cushman; Jeffrey Evans; Falk Huettmann
2010-01-01
Over the past decade the number of statistical modelling tools available to ecologists to model species' distributions has increased at a rapid pace (e.g. Elith et al. 2006; Austin 2007), as have the number of species distribution models (SDM) published in the literature (e.g. Scott et al. 2002). Ten years ago, basic logistic regression (Hosmer and Lemeshow 2000)...
Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.
DEFF Research Database (Denmark)
Hansen, Kurt Schaldemose
2007-01-01
The statistical distribution of extreme wind excursions above a mean level, for a specified recurrence period, is of crucial importance in relation to design of wind sensitive structures. This is particularly true for wind turbine structures. Based on an assumption of a Gaussian "mother" distribution, Cartwright and Longuet-Higgins [1] derived an asymptotic expression for the distribution of the largest excursion from the mean level during an arbitrary recurrence period. From its inception, this celebrated expression has been widely used in wind engineering (as well as in off-shore engineering). However, the more extreme turbulence excursions (i.e. the upper tail of the turbulence PDF) seem to follow an Exponential-like distribution rather than a Gaussian distribution, and a Gaussian estimate may under-predict the probability of large turbulence excursions associated with large excursions from the mean [2].
Statistical distribution of blood serotonin as a predictor of early autistic brain abnormalities
Directory of Open Access Journals (Sweden)
Janušonis Skirmantas
2005-07-01
Full Text Available Abstract Background A wide range of abnormalities has been reported in autistic brains, but these abnormalities may be the result of an earlier underlying developmental alteration that may no longer be evident by the time autism is diagnosed. The most consistent biological finding in autistic individuals has been their statistically elevated levels of 5-hydroxytryptamine (5-HT, serotonin) in blood platelets (platelet hyperserotonemia). The early developmental alteration of the autistic brain and the autistic platelet hyperserotonemia may be caused by the same biological factor expressed in the brain and outside the brain, respectively. Unlike the brain, blood platelets are short-lived and continue to be produced throughout the life span, suggesting that this factor may continue to operate outside the brain years after the brain is formed. The statistical distributions of the platelet 5-HT levels in normal and autistic groups have characteristic features and may contain information about the nature of this yet unidentified factor. Results The identity of this factor was studied by using a novel, quantitative approach that was applied to published distributions of the platelet 5-HT levels in normal and autistic groups. It was shown that the published data are consistent with the hypothesis that a factor that interferes with brain development in autism may also regulate the release of 5-HT from gut enterochromaffin cells. Numerical analysis revealed that this factor may be non-functional in autistic individuals. Conclusion At least some biological factors, the abnormal function of which leads to the development of the autistic brain, may regulate the release of 5-HT from the gut years after birth. If the present model is correct, it will allow future efforts to be focused on a limited number of gene candidates, some of which have not been suspected to be involved in autism (such as the 5-HT4 receptor gene based on currently available clinical and
Bulk stress distributions in the pore space of sphere-packed beds under Darcy flow conditions.
Pham, Ngoc H; Voronov, Roman S; Tummala, Naga Rajesh; Papavassiliou, Dimitrios V
2014-03-01
In this paper, bulk stress distributions in the pore space of columns packed with spheres are numerically computed with lattice Boltzmann simulations. Three different ideally packed and one randomly packed configuration of the columns are considered under Darcy flow conditions. The stress distributions change when the packing type changes. In the Darcy regime, the normalized stress distribution for a particular packing type is independent of the pressure difference that drives the flow and presents a common pattern. The three parameter (3P) log-normal distribution is found to describe the stress distributions in the randomly packed beds within statistical accuracy. In addition, the 3P log-normal distribution is still valid when highly porous scaffold geometries rather than sphere beds are examined. It is also shown that the 3P log-normal distribution can describe the bulk stress distribution in consolidated reservoir rocks like Berea sandstone.
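A common way to fit the three-parameter (3P) log-normal mentioned here is to profile the location threshold gamma: for each candidate gamma, the remaining mu and sigma follow in closed form from the logs of the shifted data. The synthetic parameters below are arbitrary; this is a sketch of the fitting idea, not of the paper's lattice Boltzmann pipeline.

```python
import math
import random
import statistics

random.seed(9)

# Synthetic "bulk stress" data from a shifted (3P) log-normal:
# x = gamma + exp(N(mu, sigma)).
gamma_true, mu_true, sigma_true = 2.0, 0.0, 0.5
data = [gamma_true + random.lognormvariate(mu_true, sigma_true)
        for _ in range(20000)]
x_min = min(data)

def profile_loglik(gamma):
    # For fixed gamma, the MLEs of mu and sigma are the mean/std of
    # log(x - gamma); substituting them back gives the profiled
    # log-likelihood  -sum(logs) - n*log(sigma*sqrt(2*pi)) - n/2.
    logs = [math.log(x - gamma) for x in data]
    sig = statistics.pstdev(logs)
    n = len(data)
    return (-sum(logs) - n * math.log(sig * math.sqrt(2.0 * math.pi))
            - n / 2.0)

# Grid-search the threshold strictly below the sample minimum.
grid = [x_min * i / 100.0 for i in range(1, 100)]
gamma_hat = max(grid, key=profile_loglik)
logs = [math.log(x - gamma_hat) for x in data]
mu_hat, sigma_hat = statistics.fmean(logs), statistics.pstdev(logs)
```

The profile likelihood of shifted log-normals can misbehave as gamma approaches the sample minimum, so production fits usually constrain or regularize the threshold; the grid above simply stops short of it.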
International Nuclear Information System (INIS)
2005-01-01
For the years 2004 and 2005 the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics published in the Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time-series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004.) The applied energy units and conversion coefficients are shown in the back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity GWh, Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes, precautionary stock fees and oil pollution fees
Distribution Line Parameter Estimation Under Consideration of Measurement Tolerances
DEFF Research Database (Denmark)
Prostejovsky, Alexander; Gehrke, Oliver; Kosek, Anna Magdalena
2016-01-01
State estimation and control approaches in electric distribution grids rely on precise electric models that may be inaccurate. This work presents a novel method of estimating distribution line parameters using only root mean square voltage and power measurements under consideration of measurement...
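Stripped of the measurement-tolerance machinery, the core estimation step can be illustrated with the standard linearized voltage-drop model, dV ≈ (R·P + X·Q)/V, and ordinary least squares. The line constants, noise level and measurement count below are invented, and the same approximation is used both to generate and to fit the data, so this is only a sketch of the estimation idea.

```python
import random

random.seed(11)

R_true, X_true, V = 0.5, 1.0, 400.0   # ohms, ohms, volts (hypothetical)

# Synthetic RMS measurements: active/reactive power and the noisy
# voltage drop along the line.
meas = []
for _ in range(200):
    p = random.uniform(5e3, 50e3)     # W
    q = random.uniform(1e3, 20e3)     # var
    dv = (R_true * p + X_true * q) / V + random.gauss(0.0, 0.2)
    meas.append((p / V, q / V, dv))   # regressors and response

# Ordinary least squares via the 2x2 normal equations.
spp = sum(a * a for a, b, y in meas)
sqq = sum(b * b for a, b, y in meas)
spq = sum(a * b for a, b, y in meas)
spy = sum(a * y for a, b, y in meas)
sqy = sum(b * y for a, b, y in meas)
det = spp * sqq - spq * spq
R_hat = (spy * sqq - spq * sqy) / det
X_hat = (spp * sqy - spq * spy) / det
```

The paper's contribution concerns how such estimates degrade under realistic measurement tolerances, which the fixed noise term above only caricatures.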
International Nuclear Information System (INIS)
2000-01-01
For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g., Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products
International Nuclear Information System (INIS)
1999-01-01
For the years 1998 and 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products
International Nuclear Information System (INIS)
2001-01-01
For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the annually issued publication Energiatilastot - Energy Statistics, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes and precautionary stock fees on oil products
Handbook of tables for order statistics from lognormal distributions with applications
Balakrishnan, N
1999-01-01
Lognormal distributions are one of the most commonly studied models in the statistical literature while being most frequently used in the applied literature. The lognormal distributions have been used in problems arising from such diverse fields as hydrology, biology, communication engineering, environmental science, reliability, agriculture, medical science, mechanical engineering, material science, and pharmacology. Though the lognormal distributions have been around from the beginning of this century (see Chapter 1), much of the work concerning inferential methods for the parameters of lognormal distributions has been done in the recent past. Most of these methods of inference, particularly those based on censored samples, involve extensive use of numerical methods to solve some nonlinear equations. Order statistics and their moments have been discussed quite extensively in the literature for many distributions. It is very well known that the moments of order statistics can be derived explicitly only...
International Nuclear Information System (INIS)
2003-01-01
For the year 2002, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supply and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Excise taxes, precautionary stock fees and oil pollution fees on energy products
International Nuclear Information System (INIS)
2000-01-01
For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 2000, Energy exports by recipient country in January-June 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes and precautionary stock fees on oil products
International Nuclear Information System (INIS)
2004-01-01
For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-March 2004, Energy exports by recipient country in January-March 2004, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Excise taxes, precautionary stock fees and oil pollution fees
Directory of Open Access Journals (Sweden)
Tianhao Wu
2016-09-01
Full Text Available While firm-level and micro-issue analysis has become an important part of research in international trade, few works are concerned with the goodness of fit of the size distribution of firms. In this paper, we revisit the statistical aspects of firm productivity and sales revenue in order to compare different definitions of statistical distance. We first deduce the exact form of the size distribution of firms by implementing only the assumptions on productivity and the demand function, and then introduce the well-known g-divergence together with its statistical implications. We also carry out simulation and calibration to compare these different divergences and to test the combined assumptions. We conclude that minimizing the Pearson χ2 and Neyman χ2 divergences produces similar results, while minimizing the Kullback-Leibler divergence is likely to come at the expense of the other distance measures. Additionally, the selection among different statistical distances matters much more than the choice of demand function
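The divergence comparison described in the abstract above is easy to illustrate. The sketch below uses two made-up discrete firm-size histograms, an empirical `p` and a model `q` (illustrative values, not data from the paper), and computes the three distances it names:

```python
import numpy as np

# Two hypothetical discrete firm-size distributions: an empirical
# histogram p and a model q (illustrative, not data from the paper).
p = np.array([0.40, 0.30, 0.15, 0.10, 0.05])
q = np.array([0.35, 0.30, 0.18, 0.12, 0.05])

pearson_chi2 = np.sum((p - q) ** 2 / q)   # misfit weighted by model mass
neyman_chi2 = np.sum((p - q) ** 2 / p)    # misfit weighted by observed mass
kl_div = np.sum(p * np.log(p / q))        # Kullback-Leibler divergence
```

Minimizing any of these over model parameters yields a minimum-distance estimator; the abstract's finding is that the two χ2 criteria tend to agree while the KL criterion can trade off against them.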
Zhang, Y.; Li, F.; Zhang, S.; Hao, W.; Zhu, T.; Yuan, L.; Xiao, F.
2017-09-01
In this paper, a Statistical Distribution based Conditional Random Fields (STA-CRF) algorithm is exploited to improve marginal ice-water classification. Pixel-level ice concentration is presented for the comparison of CRF-based methods. Furthermore, in order to explore the most effective statistical distribution model to integrate into STA-CRF, five statistical distribution models are investigated. The STA-CRF methods are tested on two scenes around Prydz Bay and Adélie Depression, which contain a variety of ice types during the melt season. Experimental results indicate that the proposed method can resolve the sea ice edge well in the Marginal Ice Zone (MIZ) and shows a robust distinction of ice and water.
Directory of Open Access Journals (Sweden)
Y. Zhang
2017-09-01
Full Text Available In this paper, a Statistical Distribution based Conditional Random Fields (STA-CRF) algorithm is exploited to improve marginal ice-water classification. Pixel-level ice concentration is presented for the comparison of CRF-based methods. Furthermore, in order to explore the most effective statistical distribution model to integrate into STA-CRF, five statistical distribution models are investigated. The STA-CRF methods are tested on two scenes around Prydz Bay and Adélie Depression, which contain a variety of ice types during the melt season. Experimental results indicate that the proposed method can resolve the sea ice edge well in the Marginal Ice Zone (MIZ) and shows a robust distinction of ice and water.
Hardpan and maize root distribution under conservation and ...
African Journals Online (AJOL)
Hardpan and maize root distribution under conservation and conventional tillage in agro-ecological zone IIa, Zambia. ... There is no scientific basis for the recommendation given to farmers by agricultural extension workers to “break the hardpan” in fields under manual or animal tillage in the study areas. Key Words: Soil ...
Huang, N. E.; Long, S. R.
1980-01-01
Laboratory experiments were performed to measure the surface elevation probability density function and associated statistical properties for a wind-generated wave field. The laboratory data were compared with some limited field data. The statistical properties of the surface elevation were processed for comparison with the results derived from the Longuet-Higgins (1963) theory. It is found that, even for the highly non-Gaussian cases, the distribution function proposed by Longuet-Higgins still gives good approximations.
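Models of this type express the surface-elevation density as a Gaussian plus Hermite-polynomial corrections. A minimal sketch of such a Gram-Charlier-type density with an assumed skewness coefficient (our simplification, not the exact form used in the study; the truncated series can dip slightly negative in the far tails):

```python
import numpy as np

def gram_charlier_pdf(x, skew):
    """Gaussian density plus the leading (third-Hermite) skewness
    correction, the general form of Longuet-Higgins-type non-Gaussian
    surface-elevation models; skew is an assumed skewness coefficient."""
    phi = np.exp(-x ** 2 / 2.0) / np.sqrt(2.0 * np.pi)
    h3 = x ** 3 - 3.0 * x          # third Hermite polynomial
    return phi * (1.0 + (skew / 6.0) * h3)

x = np.linspace(-6.0, 6.0, 2001)
f = gram_charlier_pdf(x, skew=0.2)
area = np.sum(f) * (x[1] - x[0])   # the correction integrates to zero, so ~1
```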
Directory of Open Access Journals (Sweden)
Farztdinov Vadim
2012-11-01
Full Text Available Abstract Background Because of the large volume of data and the intrinsic variation of data intensity observed in microarray experiments, different statistical methods have been used to systematically extract biological information and to quantify the associated uncertainty. The simplest method to identify differentially expressed genes is to evaluate the ratio of average intensities in two different conditions and consider all genes that differ by more than an arbitrary cut-off value to be differentially expressed. This filtering approach is not a statistical test, and there is no associated value that can indicate the level of confidence in the designation of genes as differentially expressed or not differentially expressed. At the same time, the fold change by itself provides valuable information, and it is important to find unambiguous ways of using this information in expression data treatment. Results A new method of finding differentially expressed genes, called the distributional fold change (DFC) test, is introduced. The method is based on an analysis of the intensity distribution of all microarray probe sets mapped to a three-dimensional feature space composed of average expression level, average difference of gene expression, and total variance. The proposed method allows one to rank each feature based on the signal-to-noise ratio and to ascertain for each feature the confidence level and power for being differentially expressed. The performance of the new method was evaluated using the total and partial area under receiver operating curves and tested on 11 data sets from the Gene Omnibus Database with independently verified differentially expressed genes, and compared with the t-test and shrinkage t-test. Overall the DFC test performed the best – on average it had higher sensitivity and partial AUC, and its elevation was most prominent in the low range of differentially expressed features, typical for formalin-fixed paraffin-embedded sample sets
Statistical γ-ray multiplicity distributions in Dy and Yb nuclei
International Nuclear Information System (INIS)
Tveter, T.S.; Bergholt, L.; Guttormsen, M.; Rekstad, J.
1994-03-01
The statistical γ-ray multiplicity distributions following the reactions 163Dy(3He,αxn)162-xDy and 173Yb(3He,αxn)172-xYb have been studied. The mean value and standard deviation have been extracted as functions of excitation energy. The method is based on the probability distribution of k-fold events, where an α-particle is observed in coincidence with signals in k γ-ray detectors. Techniques for isolating statistical γ-rays and subtracting random background, cross-talk and neutron contributions are discussed. 22 refs., 10 figs., 3 tabs
International Nuclear Information System (INIS)
Shnoll, S E; Zenchenko, T A; Zenchenko, K I; Pozharskii, E V; Kolombet, V A; Konradov, Alexander A
2000-01-01
We consider the statistical grounds for the certainty of cosmophysical effects on the fine structure of the distributions governing the results of measurements in various physical processes. We show that the previously discussed effects of synchronous variation of histogram shapes in independent processes, and the periodic occurrence of histograms of a particular shape, do not depend on the form of the integral distribution. The adequacy of visual (expert) estimation when comparing the shapes of histograms, as an alternative to standard statistical methods, is justified. (letters to the editors)
Poppe, L.J.; Eliason, A.H.; Hastings, M.E.
2004-01-01
Measures that describe and summarize sediment grain-size distributions are important to geologists because of the large amount of information contained in textural data sets. Statistical methods are usually employed to simplify the necessary comparisons among samples and quantify the observed differences. The two statistical methods most commonly used by sedimentologists to describe particle distributions are mathematical moments (Krumbein and Pettijohn, 1938) and inclusive graphics (Folk, 1974). The choice of which of these statistical measures to use is typically governed by the amount of data available (Royse, 1970). If the entire distribution is known, the method of moments may be used; if the next-to-last accumulated percent is greater than 95, inclusive graphics statistics can be generated. Unfortunately, earlier programs designed to describe sediment grain-size distributions statistically do not run in a Windows environment, do not allow extrapolation of the distribution's tails, or do not generate both moment and graphic statistics (Kane and Hubert, 1963; Collias et al., 1963; Schlee and Webster, 1967; Poppe et al., 2000). Owing to analytical limitations, electro-resistance multichannel particle-size analyzers, such as Coulter Counters, commonly truncate the tails of the fine-fraction part of grain-size distributions. These devices do not detect fine clay in the 0.6–0.1 μm range (part of the 11-phi and all of the 12-phi and 13-phi fractions). Although size analyses performed down to 0.6 μm are adequate for most freshwater and nearshore marine sediments, samples from many deeper-water marine environments (e.g. rise and abyssal plain) may contain significant material in the fine clay fraction, and these analyses benefit from extrapolation. The program (GSSTAT) described herein generates statistics to characterize sediment grain-size distributions and can extrapolate the fine-grained end of the particle distribution. It is written in Microsoft
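The method of moments mentioned above reduces to weighted sums over the phi-scale size classes. A minimal sketch with hypothetical class midpoints and weight percentages (not GSSTAT's actual code or data):

```python
import numpy as np

# Hypothetical grain-size data: phi-scale class midpoints and the
# weight percent retained in each class (illustrative values only).
phi = np.array([-1.0, 0.0, 1.0, 2.0, 3.0, 4.0])
weight_pct = np.array([5.0, 15.0, 30.0, 25.0, 15.0, 10.0])

f = weight_pct / weight_pct.sum()                 # class fractions
mean_phi = np.sum(f * phi)                        # 1st moment: mean (phi)
var_phi = np.sum(f * (phi - mean_phi) ** 2)
sorting = np.sqrt(var_phi)                        # 2nd moment: std. dev.
skewness = np.sum(f * (phi - mean_phi) ** 3) / sorting ** 3
kurtosis = np.sum(f * (phi - mean_phi) ** 4) / var_phi ** 2
```

Inclusive graphics statistics, by contrast, are computed from selected percentiles of the cumulative curve rather than from these full-distribution sums.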
Bigot, Jérémie; Cazelles, Elsa; Papadakis, Nicolas
2017-01-01
The notion of Sinkhorn divergence has recently gained popularity in machine learning and statistics, as it makes feasible the use of smoothed optimal transportation distances for data analysis. The Sinkhorn divergence allows the fast computation of an entropically regularized Wasserstein distance between two probability distributions supported on a finite metric space of (possibly) high-dimension. For data sampled from one or two unknown probability distributions, we derive central limit theo...
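The entropically regularized Wasserstein distance described above is computed with the standard Sinkhorn scaling iterations. A minimal sketch on a toy 5-point metric space (the distributions, cost matrix, and regularization strength are our illustrative choices, not the paper's):

```python
import numpy as np

def sinkhorn_plan(a, b, C, eps=0.1, n_iter=1000):
    # Alternating scaling of the Gibbs kernel K = exp(-C/eps) so that the
    # coupling P = diag(u) K diag(v) matches the marginals a and b.
    K = np.exp(-C / eps)
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]

# Two toy distributions on a 5-point line with squared-distance cost.
xs = np.arange(5.0)
C = (xs[:, None] - xs[None, :]) ** 2
a = np.array([0.5, 0.5, 1e-9, 1e-9, 1e-9]); a /= a.sum()
b = np.array([1e-9, 1e-9, 1e-9, 0.5, 0.5]); b /= b.sum()
P = sinkhorn_plan(a, b, C)
cost = float(np.sum(P * C))   # regularized transport cost, ~9 here
```

Here the mass at points 0 and 1 is shifted to points 3 and 4, so the cost approaches the unregularized optimum of 9; shrinking `eps` tightens the approximation at the price of slower convergence.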
Directory of Open Access Journals (Sweden)
Fangling Pu
2013-05-01
Full Text Available This paper proposes the mixture of Alpha-stable (MAS) distributions for modeling the statistical properties of Synthetic Aperture Radar (SAR) images in a supervised Markovian classification algorithm. Our work is motivated by the fact that natural scenes consist of various reflectors of different types that are typically concentrated within a small area, and SAR images generally exhibit sharp peaks, heavy tails, and even multimodal statistical properties, especially at high resolution. Unimodal distributions do not fit such statistical properties well, and thus a multimodal approach is necessary. Driven by the multimodality and impulsiveness of high-resolution SAR image histograms, we utilize the mixture of Alpha-stable distributions to describe such characteristics. A pseudo-simulated annealing (PSA) estimator based on Markov chain Monte Carlo (MCMC) is presented to efficiently estimate the model parameters of the mixture of Alpha-stable distributions. To validate the proposed PSA estimator, we apply it to simulated data and compare its performance to that of a state-of-the-art estimator. Finally, we exploit the MAS distributions and a Markovian context for SAR image classification. The effectiveness of the proposed classifier is demonstrated by experiments on TerraSAR-X images, which verifies the validity of the MAS distributions for the modeling and classification of SAR images.
Fissure formation in coke. 3: Coke size distribution and statistical analysis
Energy Technology Data Exchange (ETDEWEB)
D.R. Jenkins; D.E. Shaw; M.R. Mahoney [CSIRO, North Ryde, NSW (Australia). Mathematical and Information Sciences
2010-07-15
A model of coke stabilization, based on a fundamental model of fissuring during carbonisation, is used to demonstrate the applicability of the fissuring model to actual coke size distributions. The results indicate that the degree of stabilization is important in determining the size distribution. A modified form of the Weibull distribution is shown to provide a better representation of the whole coke size distribution compared to the Rosin-Rammler distribution, which is generally only fitted to the lump coke. A statistical analysis of a large number of experiments in a pilot-scale coke oven shows reasonably good prediction of the coke mean size, based on parameters related to blend rank, amount of low-rank coal, fluidity and ash. However, the prediction of measures of the spread of the size distribution is more problematic. The fissuring model, the size distribution representation and the statistical analysis together provide a comprehensive capability for understanding and predicting the mean size and distribution of coke lumps produced during carbonisation. 12 refs., 16 figs., 4 tabs.
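Fitting a Weibull-type size distribution (the Rosin-Rammler form is itself a Weibull) can be sketched by linearizing the two-parameter Weibull CDF. The sieve data below are illustrative, not the pilot-oven measurements:

```python
import numpy as np

# Hypothetical sieve data for lump coke: mesh size (mm) and cumulative
# fraction passing (illustrative values, not from the experiments).
size = np.array([10.0, 20.0, 30.0, 40.0, 50.0, 60.0])
cum_frac = np.array([0.05, 0.20, 0.45, 0.70, 0.88, 0.97])

# Two-parameter Weibull CDF F(x) = 1 - exp(-(x/lam)**k), linearized as
# ln(-ln(1 - F)) = k*ln(x) - k*ln(lam), then fitted by least squares.
y = np.log(-np.log(1.0 - cum_frac))
k, intercept = np.polyfit(np.log(size), y, 1)
lam = np.exp(-intercept / k)   # characteristic lump size (mm)
```

The shape parameter `k` controls the spread of the distribution and `lam` its characteristic size, which is why spread measures are harder to predict than the mean.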
International Nuclear Information System (INIS)
Kawasaki, Keiichi; Ishii, Kenji; Saito, Yoko; Oda, Keiichi; Kimura, Yuichi; Ishiwata, Kiichi
2008-01-01
In clinical cerebral 2-[18F]fluoro-2-deoxy-D-glucose positron emission tomography (FDG-PET) studies, we sometimes encounter hyperglycemic patients with diabetes mellitus or patients who have not adhered to the fasting requirement. The objective of this study was to investigate the influence of mild hyperglycemia (plasma glucose range 110-160 mg/dl) on the cerebral FDG distribution patterns calculated by statistical parametric mapping (SPM). We studied 19 healthy subjects (mean age 66.2 years). First, all the subjects underwent FDG-PET scans in the fasting condition. Then, 9 of the 19 subjects (mean age 64.3 years) underwent second FDG-PET scans in the mild hyperglycemic condition. The alterations in the FDG-PET scans were investigated using SPM- and region of interest (ROI)-based analyses. We used three reference regions: SPM global brain (SPMgb), used for the SPM global mean calculation; the gray and white matter region computed from magnetic resonance images (MRIgw); and the cerebellar cortex (Cbll). The FDG uptake calculated as the standardized uptake value (average) in the SPMgb, MRIgw, and Cbll regions in the mild hyperglycemic condition was 42.7%, 41.3%, and 40.0%, respectively, of that observed in the fasting condition. In SPM analysis, mild hyperglycemia was found to affect the cerebral distribution patterns of FDG. The FDG uptake was relatively decreased in the gray matter, mainly in the frontal, temporal, and parietal association cortices, posterior cingulate, and precuneus in both SPMgb- and MRIgw-reference-based analyses. When Cbll was adopted as the reference region, those decrease patterns disappeared. The FDG uptake was relatively increased in the white matter, mainly in the centrum semiovale, in all the reference-based analyses. It is noteworthy that the FDG distribution patterns were altered under mild hyperglycemia in SPM analysis. The decreased uptake patterns in SPMgb- (SPM default) and MRIgw-reference-based analyses resembled those observed in
Changes in tropical precipitation cluster size distributions under global warming
Neelin, J. D.; Quinn, K. M.
2016-12-01
The total amount of precipitation integrated across a tropical storm or other precipitation feature (contiguous clusters of precipitation exceeding a minimum rain rate) is a useful measure of the aggregate size of the disturbance. To establish baseline behavior in the current climate, the probability distribution of cluster sizes from multiple satellite retrievals and National Centers for Environmental Prediction (NCEP) reanalysis is compared to those from Coupled Model Intercomparison Project (CMIP5) models and the Geophysical Fluid Dynamics Laboratory high-resolution atmospheric model (HIRAM-360 and -180). With the caveat that a minimum rain rate threshold is important in the models (which tend to overproduce low rain rates), the models agree well with observations in leading properties. In particular, scale-free power law ranges in which the probability drops slowly with increasing cluster size are well modeled, followed by a rapid drop in probability of the largest clusters above a cutoff scale. Under the RCP 8.5 global warming scenario, the models indicate substantial increases in probability (up to an order of magnitude) of the largest clusters by the end of the century. For models with continuous time series of high-resolution output, there is substantial spread in when these probability increases for the largest precipitation clusters should become detectable, ranging from detectable within the observational period to statistically significant trends emerging only in the second half of the century. Examination of NCEP reanalysis and the SSMI/SSMIS series of satellite retrievals from 1979 to present does not yield reliable evidence of trends at this time. The results suggest improvements in inter-satellite calibration of the SSMI/SSMIS retrievals could aid future detection.
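The shape described above, a scale-free range followed by a rapid drop beyond a cutoff, is commonly modeled as a power law with an exponential cutoff. A minimal sketch in which raising the cutoff stands in for the warming response (the exponent and cutoff values are illustrative, not fitted values from the models):

```python
import numpy as np

def cluster_size_pmf(s, tau=1.5, s_c=100.0):
    """Power law with exponential cutoff, p(s) ~ s**-tau * exp(-s/s_c);
    tau and s_c are illustrative, not values from the study."""
    w = s ** (-tau) * np.exp(-s / s_c)
    return w / w.sum()

s = np.arange(1, 1001, dtype=float)
p_now = cluster_size_pmf(s)                # baseline cutoff scale
p_warm = cluster_size_pmf(s, s_c=200.0)    # larger cutoff as a warming proxy
ratio = p_warm[-1] / p_now[-1]             # boost for the largest clusters
```

Because the cutoff acts multiplicatively as exp(-s/s_c), even a doubled cutoff raises the probability of the very largest clusters by far more than it changes the bulk of the distribution.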
Directory of Open Access Journals (Sweden)
Gökhan Gökdere
2014-05-01
Full Text Available In this paper, closed form expressions for the moments of the truncated Pareto order statistics are obtained by using conditional distribution. We also derive some results for the moments which will be useful for moment computations based on ordered data.
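Closed-form moment expressions of this kind can be checked numerically. A sketch of the mean of a truncated Pareto law, verified by inverse-CDF Monte Carlo sampling with illustrative parameters (this is the parent-distribution mean, not one of the paper's order-statistic moments):

```python
import numpy as np

def truncated_pareto_mean(alpha, a, b):
    """Closed-form mean of a Pareto law truncated to [a, b] (alpha != 1)."""
    c = alpha * a ** alpha / (1.0 - (a / b) ** alpha)
    return c * (b ** (1.0 - alpha) - a ** (1.0 - alpha)) / (1.0 - alpha)

# Monte Carlo check by inverting the truncated-Pareto CDF.
alpha, a, b = 2.5, 1.0, 10.0           # illustrative parameters
rng = np.random.default_rng(0)
u = rng.uniform(size=200_000)
x = a * (1.0 - u * (1.0 - (a / b) ** alpha)) ** (-1.0 / alpha)
```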
Directory of Open Access Journals (Sweden)
Wenzhi Wang
2016-07-01
Full Text Available Modeling the random fiber distribution of a fiber-reinforced composite is of great importance for studying the progressive failure behavior of the material on the micro scale. In this paper, we develop a new algorithm for generating random representative volume elements (RVEs) with a statistically equivalent fiber distribution against the actual material microstructure. Realistic statistical data are utilized as inputs to the new method, which is achieved through implementation of the probability equations. Extensive statistical analysis is conducted to examine the capability of the proposed method and to compare it with existing methods. It is found that the proposed method presents a good match with experimental results in all aspects, including the nearest neighbor distance, nearest neighbor orientation, Ripley's K function, and the radial distribution function. Finite element analysis is presented to predict the effective elastic properties of a carbon/epoxy composite, to validate the generated random representative volume elements, and to provide insights into the effect of fiber distribution on the elastic properties. The present algorithm is shown to be highly accurate and can be used to generate statistically equivalent RVEs for not only fiber-reinforced composites but also other materials such as foam materials and particle-reinforced composites.
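A common baseline for this kind of RVE generation is random sequential placement of fibre centres subject to a minimum spacing; the paper's algorithm additionally matches measured neighbour statistics, which this sketch does not attempt (all parameters below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)

def generate_rve(n_fibres=50, size=1.0, min_dist=0.08, max_tries=100_000):
    """Random sequential placement of fibre centres with a hard minimum
    spacing inside a square unit cell (illustrative parameters)."""
    centres = []
    for _ in range(max_tries):
        c = rng.uniform(0.0, size, 2)
        # Accept the candidate only if it keeps the minimum spacing.
        if all(np.linalg.norm(c - p) >= min_dist for p in centres):
            centres.append(c)
            if len(centres) == n_fibres:
                break
    return np.array(centres)

rve = generate_rve()
```

Statistical equivalence would then be assessed exactly as in the abstract: by comparing nearest-neighbour distances, orientations, Ripley's K, and the radial distribution function against measured micrographs.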
Directory of Open Access Journals (Sweden)
Chu He
2017-11-01
Full Text Available This paper proposes an innovative Mixture Statistical Distribution Based Multiple Component (MSDMC) model for target detection in high-spatial-resolution Synthetic Aperture Radar (SAR) images. Traditional detection algorithms usually ignore the spatial relationship among the target's components. In the presented method, however, both the structural information and the statistical distribution are considered to better recognize the target. Firstly, a method based on compressed sensing reconstruction is used to recover the SAR image. Then, a multiple component model composed of a root filter and some corresponding part filters is applied to describe the structural information of the target. In the following step, mixture statistical distributions are utilised to discriminate the target from the background, and the Method of Logarithmic Cumulants (MoLC) based Expectation Maximization (EM) approach is adopted to estimate the parameters of the mixture statistical distribution model, which are finally merged into the proposed MSDMC framework together with the multiple component model. In the experiment, aeroplanes and electrical power towers in TerraSAR-X SAR images are detected at three spatial resolutions. The results indicate that the presented MSDMC model has potential for improving detection performance compared with state-of-the-art SAR target detection methods.
Scan statistics with local vote for target detection in distributed system
Luo, Junhai; Wu, Qi
2017-12-01
Target detection occupies a pivotal position in distributed systems. Scan statistics, one of the most efficient detection methods, has been applied to a variety of anomaly detection problems and significantly improves the probability of detection. However, scan statistics cannot achieve the expected performance when the noise intensity is strong or the signal emitted by the target is weak. The local vote algorithm can also achieve a higher target detection rate. After the local vote, the counting rule is usually adopted for decision fusion. However, the counting rule does not use information about the contiguity of sensors but takes all sensors' data into consideration, which makes the result undesirable. In this paper, we propose a scan statistics with local vote (SSLV) method. This method combines scan statistics with local vote decisions. Before the scan statistics, each sensor executes a local vote decision according to its own data and that of its neighbors. By combining the advantages of both, our method obtains a higher detection rate in low signal-to-noise-ratio environments than scan statistics alone. After the local vote decision, the distribution of sensors that have detected the target becomes more concentrated. To make full use of the local vote decision, we introduce a variable step parameter for the SSLV, which significantly shortens the scan period, especially when the target is absent. Analysis and simulations are presented to demonstrate the performance of our method.
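The local-vote-then-scan idea can be sketched on a 1-D line of sensors. The noise level, neighbourhood, window width, and threshold below are illustrative choices, not the paper's parameters:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical line of 100 sensors: sparse false alarms plus a cluster
# of true detections around index 50 (all values illustrative).
n = 100
detections = (rng.uniform(size=n) < 0.1).astype(int)   # 10% false alarms
detections[48:53] = 1                                  # target neighbourhood

# Local vote: each sensor re-decides by majority over itself and its
# two immediate neighbours, concentrating true detections.
padded = np.pad(detections, 1)
votes = padded[:-2] + padded[1:-1] + padded[2:]
local = (votes >= 2).astype(int)

# Scan statistic: maximum detection count in a sliding window of width 5.
w = 5
scan = max(int(local[i:i + w].sum()) for i in range(n - w + 1))
declare = scan >= 4   # illustrative decision threshold
```

Isolated false alarms rarely win a local majority, while the contiguous target cluster survives the vote, so the scan window over the voted map separates the two regimes more cleanly than counting raw detections.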
Statistical Analysis of Video Frame Size Distribution Originating from Scalable Video Codec (SVC)
Directory of Open Access Journals (Sweden)
Sima Ahmadpour
2017-01-01
Full Text Available Designing an effective and high-performance network requires an accurate characterization and modeling of network traffic. The modeling of video frame sizes is normally applied in simulation studies and mathematical analysis, and in generating streams for testing and compliance purposes. Besides, video traffic is assumed to be a major source of multimedia traffic in future heterogeneous networks. Therefore, the statistical distribution of video data can be used as an input for performance modeling of networks. The findings of this paper comprise the theoretical definition of the distribution that seems most relevant to the video trace in terms of its statistical properties, and the identification of the best-fitting distribution using both a graphical method and a hypothesis test. The data set used in this article consists of layered video traces generated from the Scalable Video Codec (SVC) video compression technique applied to three different movies.
Statistical Modelling of Synaptic Vesicles Distribution and Analysing their Physical Characteristics
DEFF Research Database (Denmark)
Khanmohammadi, Mahdieh
This Ph.D. thesis deals with mathematical and statistical modeling of synaptic vesicle distribution, shape, orientation and interactions. The first major part of this thesis treats the problem of determining the effect of stress on synaptic vesicle distribution and interactions. Serial section...... transmission electron microscopy is used to acquire images from two experimental groups of rats: 1) rats subjected to a behavioral model of stress and 2) rats subjected to sham stress as the control group. The synaptic vesicle distribution and interactions are modeled by employing a point process approach...... on differences of statistical measures in section and the same measures in between sections. Three-dimensional (3D) datasets are reconstructed by using image registration techniques and estimated thicknesses. We distinguish the effect of stress by estimating the synaptic vesicle densities and modeling...
Directory of Open Access Journals (Sweden)
Cabello Daniel R
1998-01-01
Full Text Available A statistical evaluation of the population dynamics of Panstrongylus geniculatus is based on a cohort experiment conducted under controlled laboratory conditions. Animals were fed on hen every 15 days. Egg incubation took 21 days; the mean duration of 1st, 2nd, 3rd, 4th, and 5th instar nymphs was 25, 30, 58, 62, and 67 days, respectively; mean nymphal development time was 39 weeks and adult longevity was 72 weeks. Females reproduced during 30 weeks, producing an average of 61.6 eggs per female over its lifetime; the average number of eggs/female/week was 2.1. The total number of eggs produced by the cohort was 1379. Average hatch for the cohort was 88.9%; it was not affected by the age of the mother. Age-specific survival and reproduction tables were constructed. The following population parameters were evaluated: generation time was 36.1 weeks; net reproduction rate was 89.4; intrinsic rate of natural increase was 0.125; instantaneous birth and death rates were 0.163 and 0.039, respectively; finite rate of increase was 1.13; total reproductive value was 1196; and the stable age distribution was 31.2% eggs, 64.7% nymphs and 4.1% adults. Finally, the population characteristics of P. geniculatus lead to the conclusion that this species is a K-strategist.
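The life-table parameters reported above (net reproduction rate, generation time, intrinsic rate of increase, finite rate of increase) follow from standard cohort formulas. A sketch with a toy life table (illustrative values, not the P. geniculatus data):

```python
import numpy as np

# Toy cohort life table: age class x (weeks), survivorship l(x),
# and age-specific fecundity m(x). Values are illustrative only.
x = np.array([0.0, 10.0, 20.0, 30.0, 40.0])
lx = np.array([1.0, 0.8, 0.6, 0.3, 0.1])
mx = np.array([0.0, 0.0, 20.0, 30.0, 10.0])

R0 = np.sum(lx * mx)              # net reproduction rate
T = np.sum(x * lx * mx) / R0      # cohort generation time (weeks)
r = np.log(R0) / T                # intrinsic rate of increase (1st estimate)
finite_rate = np.exp(r)           # finite rate of increase per week
```

The `r` here is the usual first approximation; an exact value solves the Euler-Lotka equation, sum of exp(-r·x)·l(x)·m(x) = 1, by iteration.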
Spatial statistical analysis of basal stem root disease under natural field epidemic of oil palm
Kamu, Assis; Phin, Chong Khim; Seman, Idris Abu; Wan, Hoong Hak; Mun, Ho Chong
2015-02-01
Oil palm, scientifically known as Elaeis guineensis Jacq., is the most important commodity crop in Malaysia and has contributed greatly to the economic growth of the country. As far as disease is concerned in the industry, basal stem rot (BSR), caused by Ganoderma boninense, remains the most important disease, and it is the most widely studied oil palm disease in Malaysia. However, there are still few studies of the spatial and temporal pattern of the disease, especially under natural field epidemic conditions in oil palm plantations. The objective of this study is to spatially identify the pattern of BSR disease under a natural field epidemic using two geospatial analytical techniques: quadrat analysis for the first-order properties of point pattern analysis, and nearest-neighbour analysis (NNA) for the second-order properties. Two study sites with trees of different ages were selected. Both sites are located in Tawau, Sabah and are managed by the same company. The results showed that at least one of the point pattern analyses used, NNA (i.e. the second-order properties of point pattern analysis), confirmed that the disease exhibits complete spatial randomness. This suggests that the spread of the disease is not from tree to tree and that the age of the palms does not play a significant role in determining the spatial pattern of the disease. Knowledge of the spatial pattern of the disease should help disease management programs and the industry in the future. Statistical modelling is expected to help identify the right model to estimate the yield loss of oil palm due to BSR disease.
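The nearest-neighbour test for complete spatial randomness used above is commonly computed as the Clark–Evans index: the observed mean nearest-neighbour distance divided by its expectation under a homogeneous Poisson process. A rough sketch on simulated points — the plot size, point count and brute-force search are illustrative assumptions, not the study's data:

```python
import math, random

random.seed(42)

# Simulate n hypothetical diseased-palm locations in a 100 m x 100 m plot
n, side = 200, 100.0
pts = [(random.uniform(0, side), random.uniform(0, side)) for _ in range(n)]

def mean_nn_distance(points):
    """Mean distance from each point to its nearest neighbour (brute force)."""
    total = 0.0
    for i, (xi, yi) in enumerate(points):
        d = min(math.hypot(xi - xj, yi - yj)
                for j, (xj, yj) in enumerate(points) if j != i)
        total += d
    return total / len(points)

observed = mean_nn_distance(pts)
density = n / (side * side)
expected = 1.0 / (2.0 * math.sqrt(density))   # CSR expectation: 1 / (2 sqrt(lambda))
R = observed / expected                        # Clark-Evans index

# R ~ 1: complete spatial randomness; R < 1: clustering; R > 1: regularity
print(round(R, 2))
```

A real analysis would add an edge correction and a z-test on R; this sketch omits both.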
Large deflection analysis of skew plates under uniformly distributed ...
African Journals Online (AJOL)
The present paper deals with large deflection static behaviour of thin isotropic skew plates under uniformly distributed load for various mixed flexural boundary conditions. A variational method based on the principle of minimization of total potential energy has been used through assumed displacement fields. The results are ...
Water and nitrogen distribution in uncropped ridgetilled soil under ...
African Journals Online (AJOL)
A ridge-tillage configuration, with placement of nitrate nitrogen (NO3--N) or its source in the elevated portion of the ridge, can potentially isolate fertilizer from downward water flow and minimize nitrate leaching. In the experiment, the simultaneous distribution of water, nitrate, and ammonium under three ridge widths was ...
Coolant rate distribution in horizontal steam generator under natural circulation
Energy Technology Data Exchange (ETDEWEB)
Blagovechtchenski, A.; Leontieva, V.; Mitrioukhin, A. [St. Petersburg State Technical Univ. (Russian Federation)
1997-12-31
In this presentation, the major factors determining the conditions of natural coolant circulation (NCC) in the primary circuit, and in particular the coolant rate distribution over the horizontal tubes of the PGV-1000 steam generator in NPPs with VVER-1000 under NCC, are considered. 5 refs.
International Nuclear Information System (INIS)
Pinotti, E.; Brenna, M.; Puppin, E.
2008-01-01
In magneto-optical Kerr measurements of Barkhausen noise, a magnetization jump ΔM due to a domain reversal produces a variation ΔI in the intensity of a laser beam reflected by the sample, which is the physical quantity actually measured. Because of the non-uniform beam intensity profile, the magnitude of ΔI depends both on ΔM and on its position within the laser spot. This could distort the statistical distribution p(ΔI) of the measured ΔI with respect to the true distribution p(ΔM) of the magnetization jumps ΔM. In this work the exact relationship between the two distributions is derived in a general form and applied to some possible beam profiles. It is shown that in most cases the usual Gaussian beam produces a negligible statistical distortion. Moreover, for small ΔI the noise of the experimental setup can also distort the statistical distribution p(ΔI), by erroneously rejecting small ΔI as noise. This effect has been calculated for white noise, and it is shown to be relatively small but not totally negligible as the measured ΔI approaches the detection limit.
International Nuclear Information System (INIS)
Gao, Li-Na; Liu, Fu-Hu; Lacey, Roy A.
2016-01-01
Experimental results of the transverse-momentum distributions of φ mesons and Ω hyperons produced in gold-gold (Au-Au) collisions with different centrality intervals, measured by the STAR Collaboration at different energies (7.7, 11.5, 19.6, 27, and 39 GeV) in the beam energy scan (BES) program at the relativistic heavy-ion collider (RHIC), are approximately described by the single Erlang distribution and the two-component Schwinger mechanism. Moreover, the STAR experimental transverse-momentum distributions of negatively charged particles, produced in Au-Au collisions at RHIC BES energies, are approximately described by the two-component Erlang distribution and the single Tsallis statistics. The excitation functions of free parameters are obtained from the fit to the experimental data. A weak softest point in the string tension in Ω hyperon spectra is observed at 7.7 GeV. (orig.)
Assessing Regional Scale Variability in Extreme Value Statistics Under Altered Climate Scenarios
Energy Technology Data Exchange (ETDEWEB)
Brunsell, Nathaniel [Univ. of Kansas, Lawrence, KS (United States); Mechem, David [Univ. of Kansas, Lawrence, KS (United States); Ma, Chunsheng [Wichita State Univ., KS (United States)
2015-02-20
Recent studies have suggested that low-frequency modes of climate variability can significantly influence regional climate. The climatology associated with extreme events has been shown to be particularly sensitive. This has profound implications for droughts, heat waves, and food production. We propose to examine regional climate simulations conducted over the continental United States by applying a recently developed technique which combines wavelet multi-resolution analysis with information theory metrics. This research is motivated by two fundamental questions concerning the spatial and temporal structure of extreme events: 1) which temporal scales of the extreme value distributions are most sensitive to alteration by low-frequency climate forcings, and 2) what is the nature of the spatial structure of variation in these timescales? The primary objective is to assess to what extent information theory metrics can be useful in characterizing the nature of extreme weather phenomena. Specifically, we hypothesize that (1) changes in the nature of extreme events will impact the temporal probability density functions, and that information theory metrics will be sensitive to these changes, and (2) via a wavelet multi-resolution analysis, we will be able to characterize the relative contribution of different timescales to the stochastic nature of extreme events. In order to address these hypotheses, we propose a unique combination of an established regional climate modeling approach and advanced statistical techniques to assess the effects of low-frequency modes on climate extremes over North America. The behavior of climate extremes in RCM simulations for the 20th century will be compared with statistics calculated from the United States Historical Climatology Network (USHCN) and simulations from the North American Regional Climate Change Assessment Program (NARCCAP). This effort will serve to establish the baseline behavior of climate extremes, the
Aiba, Masahiro; Katabuchi, Masatoshi; Takafumi, Hino; Matsuzaki, Shin-Ichiro S; Sasaki, Takehiro; Hiura, Tsutom
2013-12-01
Numerous studies have revealed the existence of nonrandom trait distribution patterns as a sign of environmental filtering and/or biotic interactions in a community assembly process. A number of metrics with various algorithms have been used to detect these patterns without any clear guidelines. Although some studies have compared their statistical powers, the differences in performance among the metrics under the conditions close to actual studies are not clear. Therefore, the performances of five metrics of convergence and 16 metrics of divergence under alternative conditions were comparatively analyzed using a suite of simulated communities. We focused particularly on the robustness of the performances to conditions that are often uncertain and uncontrollable in actual studies; e.g., atypical trait distribution patterns stemming from the operation of multiple assembly mechanisms, a scaling of trait-function relationships, and a sufficiency of analyzed traits. Most tested metrics, for either convergence or divergence, had sufficient statistical power to distinguish nonrandom trait distribution patterns without uncertainty. However, the performances of the metrics were considerably influenced by both atypical trait distribution patterns and other uncertainties. Influences from these uncertainties varied among the metrics of different algorithms and their performances were often complementary. Therefore, under the uncertainties of an assembly process, the selection of appropriate metrics and the combined use of complementary metrics are critically important to reliably distinguish nonrandom patterns in a trait distribution. We provide a tentative list of recommended metrics for future studies.
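A common algorithm behind the convergence metrics compared above is a randomization null model: the observed trait variance of a community is compared against communities of equal richness drawn at random from the regional species pool, and convergence is flagged when the observed variance is unusually low. A hedged sketch on a synthetic pool — all numbers illustrative:

```python
import random, statistics

random.seed(1)

# Hypothetical regional species pool: one trait value per species
pool = [random.gauss(10.0, 3.0) for _ in range(100)]

# Observed community: species filtered to a narrow trait range (convergence)
community = sorted(pool)[40:55]          # 15 species with similar trait values

obs = statistics.pvariance(community)    # observed trait variance

# Null model: draw communities of equal richness at random from the pool
null = [statistics.pvariance(random.sample(pool, len(community)))
        for _ in range(999)]

# One-tailed p-value: how often is a random community at least as convergent?
p = (1 + sum(v <= obs for v in null)) / (1 + len(null))
print(p < 0.05)   # convergence detected when the variance is unusually low
```

Divergence metrics work the same way with the opposite tail (unusually high spacing or variance).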
International Nuclear Information System (INIS)
Ballini, J.-P.; Cazes, P.; Turpin, P.-Y.
1976-01-01
Analysing the histogram of anode pulse amplitudes allows a discussion of the hypothesis that has been proposed to account for the statistical processes of secondary multiplication in a photomultiplier. In an earlier work, good agreement was obtained between experimental and reconstructed spectra, assuming a first dynode distribution including two Poisson distributions of distinct mean values. This first approximation led to a search for a method which could give the weights of several Poisson distributions of distinct mean values. Three methods have been briefly exposed: classical linear regression, constraint regression (d'Esopo's method), and regression on variables subject to error. The use of these methods gives an approach of the frequency function which represents the dispersion of the punctual mean gain around the whole first dynode mean gain value. Comparison between this function and the one employed in Polya distribution allows the statement that the latter is inadequate to describe the statistical process of secondary multiplication. Numerous spectra obtained with two kinds of photomultiplier working under different physical conditions have been analysed. Then two points are discussed: - Does the frequency function represent the dynode structure and the interdynode collection process. - Is the model (the multiplication process of all dynodes but the first one, is Poissonian) valid whatever the photomultiplier and the utilization conditions. (Auth.)
Estimation of current density distribution under electrodes for external defibrillation
Directory of Open Access Journals (Sweden)
Papazov Sava P
2002-12-01
Background: Transthoracic defibrillation is the most common life-saving technique for restoring the heart rhythm of cardiac arrest victims. The procedure requires adequate application of large electrodes on the patient's chest to ensure low-resistance electrical contact. The current density distribution under the electrodes is non-uniform, leading to muscle contraction and pain, or risk of burns. The recent introduction of automatic external defibrillators, and even wearable defibrillators, places new and demanding requirements on electrode structure. Method and Results: Using a pseudo-elliptic differential equation of Laplace type with appropriate boundary conditions and finite element modeling, electrodes of various shapes and structures were studied. The non-uniformity of the current density distribution was shown to be moderately improved by adding a low-resistivity layer between the metal and the tissue and by a ring around the electrode perimeter. The inclusion of openings in long-term wearable electrodes additionally disturbs the current density profile; however, a number of small perforations may result in an acceptable current density distribution. Conclusion: The current density distribution non-uniformity of circular electrodes is about 30% less than that of square-shaped electrodes. The use of an interface layer of intermediate resistivity, comparable to that of the underlying tissues, and a high-resistivity perimeter ring can further improve the distribution. Skin aeration openings disturb the current paths, but an appropriate selection of their number and size provides a reasonable compromise.
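The Laplace-type boundary-value problem described above can be illustrated, very crudely, with finite-difference Jacobi relaxation in place of the authors' finite element model. The grid size, electrode placement and boundary treatment below are all simplifying assumptions; the sketch only reproduces the qualitative effect of current crowding at the electrode edge:

```python
# 2D Laplace solver by Jacobi iteration on a uniform grid, with a
# fixed-potential "electrode" patch on the top boundary (Dirichlet condition)
N = 40                         # grid points per side (illustrative)
V = [[0.0] * N for _ in range(N)]

def electrode(i, j):
    """Hypothetical electrode held at 1 V on part of the top boundary."""
    return i == 0 and N // 4 <= j < 3 * N // 4

for _ in range(800):           # Jacobi sweeps until roughly converged
    new = [[0.0] * N for _ in range(N)]
    for i in range(N):
        for j in range(N):
            if electrode(i, j):
                new[i][j] = 1.0            # electrode potential
            elif i == N - 1:
                new[i][j] = 0.0            # grounded bottom boundary
            elif i == 0:
                new[i][j] = V[1][j]        # insulating: zero normal gradient
            elif j == 0:
                new[i][j] = V[i][1]
            elif j == N - 1:
                new[i][j] = V[i][N - 2]
            else:
                new[i][j] = 0.25 * (V[i-1][j] + V[i+1][j]
                                    + V[i][j-1] + V[i][j+1])
    V = new

# Normal current density just under the electrode ~ potential drop to row 1;
# the electrode edge shows the expected current crowding
j_centre = V[0][N // 2] - V[1][N // 2]
j_edge   = V[0][N // 4] - V[1][N // 4]
print(j_edge > j_centre)       # edge crowding: expect a larger edge drop
```

A real model would use conductivity contrasts and a proper mesh; this only shows why perimeter treatments matter.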
Czech Academy of Sciences Publication Activity Database
Netopilík, Miloš; Kratochvíl, Pavel
2006-01-01
Roč. 55, č. 2 (2006), s. 196-203 ISSN 0959-8103 R&D Projects: GA AV ČR IAA100500501; GA AV ČR IAA4050403; GA AV ČR IAA4050409; GA ČR GA203/03/0617 Institutional research plan: CEZ:AV0Z40500505 Keywords : statistical branching * tetrafunctional branch points * molecular-weight distribution Subject RIV: CD - Macromolecular Chemistry Impact factor: 1.475, year: 2006
Statistical analysis of secondary particle distributions in relativistic nucleus-nucleus collisions
Mcguire, Stephen C.
1987-01-01
The use is described of several statistical techniques to characterize structure in the angular distributions of secondary particles from nucleus-nucleus collisions in the energy range 24 to 61 GeV/nucleon. The objective of this work was to determine whether there are correlations between emitted particle intensity and angle that may be used to support the existence of the quark gluon plasma. The techniques include chi-square null hypothesis tests, the method of discrete Fourier transform analysis, and fluctuation analysis. We have also used the method of composite unit vectors to test for azimuthal asymmetry in a data set of 63 JACEE-3 events. Each method is presented in a manner that provides the reader with some practical detail regarding its application. Of those events with relatively high statistics, Fe approaches 0 at 55 GeV/nucleon was found to possess an azimuthal distribution with a highly non-random structure. No evidence of non-statistical fluctuations was found in the pseudo-rapidity distributions of the events studied. It is seen that the most effective application of these methods relies upon the availability of many events or single events that possess very high multiplicities.
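The discrete Fourier transform test for azimuthal asymmetry mentioned above amounts to computing harmonic amplitudes of the angle sample; low-order amplitudes much larger than their isotropic expectation signal non-random structure. A sketch on synthetic angles — the modulation strength and sample sizes are arbitrary choices, not the JACEE data:

```python
import math, random

random.seed(7)

def harmonic_amplitudes(angles, kmax=3):
    """Normalised Fourier amplitudes a_k of an azimuthal angle sample."""
    n = len(angles)
    amps = []
    for k in range(1, kmax + 1):
        c = sum(math.cos(k * a) for a in angles) / n
        s = sum(math.sin(k * a) for a in angles) / n
        amps.append(math.hypot(c, s))
    return amps

# Isotropic sample versus a sample with a first-harmonic modulation;
# the rejection step keeps angles preferentially near phi = 0
iso = [random.uniform(0, 2 * math.pi) for _ in range(5000)]
mod = [a for a in iso if random.random() < 0.5 * (1 + 0.6 * math.cos(a))]

a1_iso = harmonic_amplitudes(iso)[0]   # should be ~ 1/sqrt(n), i.e. small
a1_mod = harmonic_amplitudes(mod)[0]   # should sit near the modulation / 2
print(a1_iso < 0.05 < a1_mod)
```

The chi-square and composite-unit-vector tests in the abstract probe the same question with different statistics.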
Galaxies distribution in the universe: large-scale statistics and structures
International Nuclear Information System (INIS)
Maurogordato, Sophie
1988-01-01
This research thesis addresses the distribution of galaxies in the Universe, and more particularly large-scale statistics and structures. Based on an assessment of the main statistical techniques in use, the author outlines the need to develop additional tools to correlation functions in order to characterise the distribution. She introduces a new indicator: the probability that a volume randomly placed in the distribution is empty. This allows a characterisation of void properties at the scales studied (up to 10 h⁻¹ Mpc) in the Harvard-Smithsonian Center for Astrophysics Redshift Survey (CfA catalog). A systematic analysis of the statistical properties of different sub-samples has then been performed with respect to size and location, luminosity class, and morphological type. This analysis is then extended to different scenarios of structure formation. A program of radial velocity measurements based on observations allows the determination of possible relationships between apparent structures. The author also presents results of the search for southern extensions of the Perseus supercluster.
Shasha, Dennis
2010-01-01
Statistics is the activity of inferring results about a population given a sample. Historically, statistics books assume an underlying distribution to the data (typically, the normal distribution) and derive results under that assumption. Unfortunately, in real life, one cannot normally be sure of the underlying distribution. For that reason, this book presents a distribution-independent approach to statistics based on a simple computational counting idea called resampling. This book explains the basic concepts of resampling, then systematically presents the standard statistical measures along
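The resampling idea the book is built on can be shown in a few lines: a percentile bootstrap confidence interval that assumes nothing about the underlying distribution. The sample, statistic and replicate count below are illustrative:

```python
import random, statistics

random.seed(0)

# Sample from an unknown, possibly non-normal population (here: skewed data)
sample = [random.expovariate(1 / 5.0) for _ in range(60)]

def bootstrap_ci(data, stat=statistics.mean, reps=2000, alpha=0.05):
    """Percentile bootstrap confidence interval, no distributional assumption."""
    n = len(data)
    stats = sorted(stat([random.choice(data) for _ in range(n)])
                   for _ in range(reps))
    lo = stats[int(reps * alpha / 2)]
    hi = stats[int(reps * (1 - alpha / 2)) - 1]
    return lo, hi

lo, hi = bootstrap_ci(sample)
print(lo < statistics.mean(sample) < hi)
```

The same resampling loop yields hypothesis tests and regression intervals by swapping the statistic.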
Using a Statistical Approach to Anticipate Leaf Wetness Duration Under Climate Change in France
Huard, F.; Imig, A. F.; Perrin, P.
2014-12-01
Leaf wetness plays a major role in the development of fungal plant diseases. Leaf wetness duration (LWD) above a threshold value is determinant for infection and can be seen as a good indicator of the impact of climate on infection occurrence and risk. As LWD is not widely measured, several methods, based on physical and empirical approaches, have been developed to estimate it from weather data. Many statistical LWD models exist, but the lack of a measurement standard requires reassessment. A new empirical LWD model, called MEDHI (Modèle d'Estimation de la Durée d'Humectation à l'Inra), was developed for the French configuration of wetness sensors (angle: 90°, height: 50 cm). This deployment differs from what is usually recommended by manufacturers or authors in other countries (angles from 10 to 60°, heights from 10 to 150 cm…). MEDHI is a decision support system based on hourly climatic conditions at time steps n and n-1, taking into account relative humidity, rainfall and previously simulated LWD. Air temperature, relative humidity, wind speed, rain and LWD data from several sensors in two configurations were measured during 6 months in Toulouse and Avignon (south-west and south-east France) to calibrate MEDHI. A comparison of empirical models — NHRH (RH threshold), DPD (dew point depression), CART (classification and regression tree analysis, dependent on RH, wind speed and dew point depression) and MEDHI — using meteorological and LWD measurements obtained during 5 months in Toulouse, showed that this new model was definitely better adapted to French conditions. In the context of climate change, MEDHI was used to map the evolution of leaf wetness duration in France from 1950 to 2100 with the French regional climate model ALADIN under different Representative Concentration Pathways (RCPs), using a quantile-mapping (QM) statistical downscaling method. The results give information on the spatial distribution of infection risks.
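The simplest of the models compared above, the RH-threshold approach (NHRH), combined with the persistence idea used by decision-rule models such as MEDHI, can be sketched as follows. The threshold values and hourly records are hypothetical, not the calibrated French parameters:

```python
# Hourly weather records: (relative humidity %, rain mm) -- hypothetical data
hours = [(82, 0.0), (88, 0.0), (93, 0.2), (96, 0.0), (91, 0.0),
         (85, 0.0), (78, 0.0), (90, 1.1), (94, 0.0), (89, 0.0)]

RH_THRESHOLD = 90.0   # hypothetical threshold; real models calibrate this

def leaf_wet(rh, rain, prev_wet):
    """One decision step: wet if rain fell or RH is above threshold;
    persistence: a wet leaf stays wet while RH remains near the threshold."""
    if rain > 0 or rh >= RH_THRESHOLD:
        return True
    return prev_wet and rh >= RH_THRESHOLD - 3.0

wet, lwd = False, 0
for rh, rain in hours:
    wet = leaf_wet(rh, rain, wet)
    lwd += wet                 # count wet hours
print(lwd)                     # estimated leaf wetness duration in hours
```

MEDHI adds rainfall history and the previously simulated state at step n-1 in a similar rule cascade.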
Yang, Jaw-Yen; Yan, Chih-Yuan; Diaz, Manuel; Huang, Juan-Chen; Li, Zhihui; Zhang, Hanxin
2014-01-08
The ideal quantum gas dynamics as manifested by the semiclassical ellipsoidal-statistical (ES) equilibrium distribution derived in Wu et al. (Wu et al . 2012 Proc. R. Soc. A 468 , 1799-1823 (doi:10.1098/rspa.2011.0673)) is numerically studied for particles of three statistics. This anisotropic ES equilibrium distribution was derived using the maximum entropy principle and conserves the mass, momentum and energy, but differs from the standard Fermi-Dirac or Bose-Einstein distribution. The present numerical method combines the discrete velocity (or momentum) ordinate method in momentum space and the high-resolution shock-capturing method in physical space. A decoding procedure to obtain the necessary parameters for determining the ES distribution is also devised. Computations of two-dimensional Riemann problems are presented, and various contours of the quantities unique to this ES model are illustrated. The main flow features, such as shock waves, expansion waves and slip lines and their complex nonlinear interactions, are depicted and found to be consistent with existing calculations for a classical gas.
Audio analysis of statistically instantaneous signals with mixed Gaussian probability distributions
Naik, Ganesh R.; Wang, Wenwu
2012-10-01
In this article, a novel method is proposed to measure the separation qualities of statistically instantaneous audio signals with mixed Gaussian probability distributions. This study evaluates the impact of the Probability Distribution Function (PDF) of the mixed signals on the outcomes of both sub- and super-Gaussian distributions. Different Gaussian measures are evaluated by using various spectral-distortion measures. It aims to compare the different audio mixtures from both super-Gaussian and sub-Gaussian perspectives. Extensive computer simulation confirms that the separated sources always have super-Gaussian characteristics irrespective of the PDF of the signals or mixtures. The result based on the objective measures demonstrates the effectiveness of source separation in improving the quality of the separated audio sources.
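Super- and sub-Gaussianity, as used above, are usually diagnosed by the sign of the excess kurtosis: positive for super-Gaussian (peaky, heavy-tailed) signals and negative for sub-Gaussian ones. A quick numerical check on synthetic signals, with Laplace and uniform variates standing in for the two classes:

```python
import random, statistics

random.seed(3)

def excess_kurtosis(x):
    """Fourth standardised moment minus 3 (zero for a Gaussian)."""
    m = statistics.fmean(x)
    s2 = statistics.pvariance(x)
    return sum((v - m) ** 4 for v in x) / (len(x) * s2 ** 2) - 3.0

n = 20000
gauss   = [random.gauss(0, 1) for _ in range(n)]
laplace = [random.expovariate(1) * random.choice([-1, 1])  # super-Gaussian
           for _ in range(n)]
uniform = [random.uniform(-1, 1) for _ in range(n)]        # sub-Gaussian

k_g = excess_kurtosis(gauss)     # near 0
k_l = excess_kurtosis(laplace)   # clearly positive (theory: +3)
k_u = excess_kurtosis(uniform)   # clearly negative (theory: -1.2)
print(round(k_g, 1), round(k_l, 1), round(k_u, 1))
```

The article's spectral-distortion measures complement this purely moment-based view.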
Statistics of TRMM Data Archive and Distribution at the Goddard DAAC
Rui, H.; Teng, B.; Chiu, L.; Serafino, G.; Hrubiak, P.; Bonk, J.
2001-12-01
The Tropical Rainfall Measuring Mission (TRMM) is a joint mission of the National Aeronautics and Space Administration (NASA) and the National Space Development Agency (NASDA) of Japan to monitor and study tropical and subtropical rainfall systems. TRMM has been acquiring data from shortly after its launch on November 28, 1997 to the present. All TRMM standard products are processed by the TRMM Science Data and Information System (TSDIS) and archived and distributed by the Goddard Distributed Active Archive Center (GDAAC). In addition to the standard products (accessible via http://lake.nascom.nasa.gov/data/dataset/TRMM/index.html), the GDAAC generates and/or maintains a set of derived TRMM products (e.g., satellite coincidence subsets, parameter subsets, resampled gridded subsets, GIS-compatible files) to facilitate use of TRMM data by the general public. TRMM data are reprocessed with improved science algorithms approximately once per year, currently at version 5. The average operating altitude for TRMM was moved from 350 kilometers to 403 kilometers during the period from August 7 to 24, 2001, which will significantly extend the mission lifetime for TRMM. The GDAAC stores archive and distribution information on TRMM standard and derived products in a database. In order to better understand the data usage patterns and requirements of TRMM users, statistics are routinely derived from the database for the entire TRMM data set or for specific groups of data products. For example, the total cumulative distribution and archive of TRMM satellite standard products (as of August 2001) are 2,722,479 and 420,573, respectively, in terms of file numbers; and 64.5 TB and 11.4 TB, respectively, in terms of file volumes. The Utilization Rate (UR), defined as the ratio of the number of distributed files to the number of archived files, of these satellite products is 6.5 (not including anonymous ftp distribution). Overall, the UR has increased steadily as TRMM progressed, and the
Statistical distributions of earthquakes and related non-linear features in seismic waves
International Nuclear Information System (INIS)
Apostol, B.-F.
2006-01-01
A few basic facts in the science of earthquakes are briefly reviewed. An accumulation, or growth, model is put forward for the focal mechanisms and the critical focal zone of earthquakes, which relates the earthquake average recurrence time to the released seismic energy. The temporal statistical distribution of the average recurrence time is introduced for earthquakes and, on this basis, the Omori-type distribution in energy is derived, as well as the distribution in magnitude, by making use of the semi-empirical Gutenberg-Richter law relating seismic energy to earthquake magnitude. On geometric grounds, the accumulation model suggests the value r = 1/3 for the Omori parameter in the power-law energy distribution, which leads to β = 1.17 for the coefficient in the Gutenberg-Richter recurrence law, in fair agreement with the statistical analysis of the empirical data. Making use of this value, the empirical Bath's law for the average magnitude of the aftershocks (which is 1.2 less than the magnitude of the main seismic shock) is discussed by assuming that the aftershocks are relaxation events of the seismic zone. The time distribution of earthquakes with a fixed average recurrence time is also derived, the prediction of earthquake occurrence by means of the average recurrence time and the seismicity rate is discussed, and the application of this discussion to the seismic region of Vrancea, Romania, is outlined. Finally, a special effect of the non-linear behaviour of seismic waves is discussed, by describing an exact solution derived recently for the elastic wave equation with cubic anharmonicities, its relevance, and its connection to the approximate quasi-plane-wave picture. The properties of the seismic activity accompanying a main seismic shock, both as foreshocks and as aftershocks, are deferred to forthcoming publications. (author)
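The Gutenberg-Richter law invoked above implies an exponential magnitude distribution above the completeness threshold, whose b-value has a closed-form maximum-likelihood estimate (Aki's formula, b = log10(e) / (mean magnitude - Mc)). A sketch on simulated magnitudes — the b-value, completeness magnitude and sample size are arbitrary, and this is the standard estimator, not the author's accumulation model:

```python
import math, random

random.seed(5)

# Simulate magnitudes from a Gutenberg-Richter law with b = 1.0, completeness Mc = 2.0
b_true, Mc = 1.0, 2.0
beta = b_true * math.log(10)          # exponential rate in natural log units
mags = [Mc + random.expovariate(beta) for _ in range(5000)]

# Aki maximum-likelihood estimate of the b-value
b_hat = math.log10(math.e) / (sum(mags) / len(mags) - Mc)
print(round(b_hat, 2))                # should recover a value near b_true
```

Uncertainty scales roughly as b_hat / sqrt(n), so several thousand events are needed for two-digit precision.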
Fitting Statistical Distributions Functions on Ozone Concentration Data at Coastal Areas
International Nuclear Information System (INIS)
Muhammad Yazid Nasir; Nurul Adyani Ghazali; Muhammad Izwan Zariq Mokhtar; Norhazlina Suhaimi
2016-01-01
Ozone is known as one of the pollutants that contribute to the air pollution problem. Therefore, it is important to carry out studies on ozone. The objective of this study is to find the best statistical distribution for ozone concentration. Three distributions, namely Inverse Gaussian, Weibull and Lognormal, were chosen to fit one year of hourly average ozone concentration data recorded in 2010 at Port Dickson and Port Klang. The maximum likelihood estimation (MLE) method was used to estimate the parameters needed to develop the probability density function (PDF) and cumulative density function (CDF) graphs. Three performance indicators (PI), namely normalized absolute error (NAE), prediction accuracy (PA), and the coefficient of determination (R²), were used to determine the goodness-of-fit of each distribution. The results show that the Weibull distribution is the best distribution, with the smallest error measure (NAE) values of 0.08 at Port Klang and 0.31 at Port Dickson. Both sites scored the highest adequacy measure (PA: 0.99), with R² values of 0.98 (Port Klang) and 0.99 (Port Dickson). These results provide useful information to local authorities for prediction purposes. (author)
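Two of the performance indicators used above have simple standard forms: NAE as the summed absolute error normalised by the summed observations, and R² as one minus the residual-to-total sum-of-squares ratio (the exact definitions in the paper may differ slightly). A sketch on made-up observed/predicted pairs:

```python
# Observed vs. distribution-predicted values (hypothetical, not the ozone data)
obs  = [0.010, 0.022, 0.031, 0.045, 0.052, 0.063, 0.070, 0.081]
pred = [0.012, 0.020, 0.033, 0.044, 0.055, 0.060, 0.072, 0.079]

n = len(obs)
mean_obs = sum(obs) / n

# Normalised absolute error: smaller is better
nae = sum(abs(p - o) for p, o in zip(pred, obs)) / sum(obs)

# Coefficient of determination: closer to 1 is better
ss_res = sum((o - p) ** 2 for o, p in zip(obs, pred))
ss_tot = sum((o - mean_obs) ** 2 for o in obs)
r2 = 1.0 - ss_res / ss_tot

print(round(nae, 3), round(r2, 3))
```

Computing these for each candidate distribution is what ranks Weibull against the Inverse Gaussian and Lognormal fits.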
Directory of Open Access Journals (Sweden)
Borovik A.V.
2017-03-01
An electronic database has been created for 123801 solar flares that occurred on the Sun over the period from 1972 to 2010. It is based on the catalogs of the Solar Geophysical Data (SGD) and the Quarterly Bulletin on Solar Activity. A software package has been used for statistical data preprocessing. The first results revealed a number of new features in the distribution of parameters of solar flares, which differ from those obtained previously. We have found that more than 90% of all solar flares are low-power. The most numerous class comprises SF flares (64%). Flare activity shows a pronounced cyclicity and a high correlation with Wolf numbers. The highest correlation coefficients are shown by S and 1 solar flares. There is also a high correlation between individual flare classes: S and 1, and 1 and (2-4). The results obtained in [Mitra et al., 1972], which provide evidence of the prevalence of SN solar flares (47%) and the existence of significant peaks for SN and 1N flares, have not been confirmed. The distribution of the number of solar flares with increasing optical importance decreases smoothly, without significant deviations. With increasing optical importance, solar flares are gradually redistributed toward higher brightness classes. The excess of SN and 1N solar flares present in the distributions obtained in [Mitra et al., 1972] is most likely associated with poor statistics.
Distribution patterns of Mimbres ceramics using INAA and multivariate statistical methods
International Nuclear Information System (INIS)
Dahlin, E.S.; Carlson, D.L.; Shafer, H.J.; James, W.D.
2007-01-01
The distribution patterns of Classic Mimbres black-on-white bowls and jars were determined by instrumental neutron activation analysis to identify vessel movement between geographically defined regions and between villages within individual regions of southwestern New Mexico. The data set produced and utilized by the various multivariate statistical treatments included multielement neutron activation analysis results for 288 ceramic and clay samples from 15 sites in the Gila, Mimbres and Rio Grande valleys of southwest New Mexico. The results indicate that bowls were more frequently exchanged than jars and distribution frequencies between regions were lower than between villages. Two statistical approaches to the data were compared. In one, cluster analysis of the compositional data was used to form homogeneous groups and the distribution of those groups across sites and regions was examined. In the second, discriminant analysis was used to look for significant differences in composition between regions and sites. The significance of predetermining groups based on collection location as opposed to blind group formation from hierarchical cluster analysis was evaluated in terms of its potential to lead to different interpretations of the data. (author)
Study on Shale’s Dynamic Damage Constitutive Model Based on Statistical Distribution
Directory of Open Access Journals (Sweden)
Jianjun Liu
2015-01-01
The dynamic constitutive model of shale is fundamental to shale gas reservoir reforming. In order to investigate the dynamic mechanism of shale, a new dynamic damage constitutive model of shale under uniaxial impact load was established based on statistical damage theory and laboratory test results on deformation and damage characteristics under split Hopkinson pressure bar (SHPB) impact. Compared with the theoretical results, the model can describe shale's mechanical attributes and reveal the fracture damage mechanism as well. The results will provide a theoretical basis for hydraulic fracturing of shale and other dynamic reforming techniques.
International Nuclear Information System (INIS)
Ayodele, T.R.; Ogunjuyigbe, A.S.O.
2015-01-01
In this paper, a probability distribution of the clearness index is proposed for the prediction of global solar radiation. First, the clearness index is obtained from past data of global solar radiation; then, the parameters of the distribution that best fits the clearness index are determined. The global solar radiation is thereafter predicted from the clearness index using inverse transformation of the cumulative distribution function. To validate the proposed method, eight years of global solar radiation data (2000–2007) for Ibadan, Nigeria are used to determine the parameters of the appropriate probability distribution for the clearness index. The calculated parameters are then used to predict the future monthly average global solar radiation for the following year (2008). The predicted values are compared with the measured values using four statistical tests: the Root Mean Square Error (RMSE), the Mean Absolute Error (MAE), the Mean Absolute Percentage Error (MAPE) and the coefficient of determination (R²). The proposed method is also compared to existing regression models. The results show that the logistic distribution provides the best fit for the clearness index of Ibadan and that the proposed method is effective in predicting the monthly average global solar radiation, with an overall RMSE of 0.383 MJ/m²/day, MAE of 0.295 MJ/m²/day, MAPE of 2% and R² of 0.967. - Highlights: • Distribution of clearness index is proposed for prediction of global solar radiation. • The clearness index is obtained from the past data of global solar radiation. • The parameters of distribution that best fit the clearness index are determined. • Solar radiation is predicted from the clearness index using inverse transformation. • The method is effective in predicting the monthly average global solar radiation.
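The prediction step described above, drawing a clearness index from a fitted distribution by inverse transformation of the CDF, can be sketched as follows. The logistic parameters, the extraterrestrial radiation value, and the clamping to [0, 1] are illustrative assumptions of this sketch, not figures from the paper:

```python
import math
import random

def logistic_inv_cdf(u, mu, s):
    """Logistic quantile function: x = mu + s * ln(u / (1 - u))."""
    return mu + s * math.log(u / (1.0 - u))

def predict_radiation(mu, s, h0, rng):
    """Draw a clearness index by inverse transformation of the fitted
    CDF and convert it to global radiation H = k_t * H0."""
    u = min(max(rng.random(), 1e-12), 1.0 - 1e-12)   # keep u strictly in (0, 1)
    kt = min(max(logistic_inv_cdf(u, mu, s), 0.0), 1.0)  # k_t is physically bounded
    return kt * h0

rng = random.Random(42)
mu, s = 0.55, 0.04   # hypothetical logistic parameters for the clearness index
h0 = 35.0            # illustrative extraterrestrial radiation, MJ/m^2/day
samples = [predict_radiation(mu, s, h0, rng) for _ in range(20000)]
mean_h = sum(samples) / len(samples)
print(round(mean_h, 1))
```

Averaging many such draws recovers a monthly mean; in the paper the same inverse-transform idea is applied with parameters fitted to each month's historical clearness index.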
Directory of Open Access Journals (Sweden)
M.M. Mohie El-Din
2011-10-01
In this paper, two-sample Bayesian prediction intervals for order statistics (OS) are obtained. This prediction is based on a certain class of inverse exponential-type distributions using a right-censored sample. A general class of prior density functions is used and the predictive cumulative function is obtained in the two-sample case. The class of inverse exponential-type distributions includes several important distributions such as the inverse Weibull distribution, the inverse Burr distribution, the log-logistic distribution, the inverse Pareto distribution and the inverse paralogistic distribution. Special cases of the inverse Weibull model such as the inverse exponential model and the inverse Rayleigh model are considered.
Relative mass distributions of neutron-rich thermally fissile nuclei within a statistical model
Kumar, Bharat; Kannan, M. T. Senthil; Balasubramaniam, M.; Agrawal, B. K.; Patra, S. K.
2017-09-01
We study the binary mass distribution for the recently predicted thermally fissile neutron-rich uranium and thorium nuclei using a statistical model. The level density parameters needed for the study are evaluated from the excitation energies of the temperature-dependent relativistic mean field formalism. The excitation energy and the level density parameter for a given temperature are employed in the convolution integral method to obtain the probability of a particular fragmentation. As representative cases, we present the results for the binary yields of ²⁵⁰U and ²⁵⁴Th. The relative yields are presented for three different temperatures: T = 1, 2, and 3 MeV.
Directory of Open Access Journals (Sweden)
Jihye Ryu
2018-04-01
The field of enacted/embodied cognition has emerged as a contemporary attempt to connect the mind and body in the study of cognition. However, there has been a paucity of methods that enable a multi-layered approach tapping into different levels of functionality within the nervous system (e.g., continuously capturing in tandem multi-modal biophysical signals in naturalistic settings). The present study introduces a new theoretical and statistical framework to characterize the influences of cognitive demands on biophysical rhythmic signals harnessed from deliberate, spontaneous and autonomic activities. In this study, nine participants performed a basic pointing task to communicate a decision while they were exposed to different levels of cognitive load. Within these decision-making contexts, we examined the moment-by-moment fluctuations in the peak amplitude and timing of the biophysical time series data (e.g., continuous waveforms extracted from hand kinematics and heart signals). These spike-train data offered high statistical power for personalized empirical statistical estimation and were well characterized by a Gamma process. Our approach enabled the identification of different empirically estimated families of probability distributions to facilitate inference regarding the continuous physiological phenomena underlying cognitively driven decision-making. We found that the same pointing task revealed shifts in the probability distribution functions (PDFs) of the hand kinematic signals under study, accompanied by shifts in the signatures of the heart inter-beat-interval timings. Within the time scale of an experimental session, marked changes in the skewness and dispersion of the distributions were tracked on the Gamma parameter plane with 95% confidence. The results suggest that traditional theoretical assumptions of stationarity and normality in biophysical data from the nervous systems are incongruent with the true statistical nature of
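The Gamma characterization of inter-event data described above can be illustrated with a moment-based fit; the study presumably used maximum-likelihood estimation with confidence regions on the Gamma parameter plane, so the simpler method-of-moments estimator and the synthetic inter-beat intervals below are assumptions of this sketch:

```python
import random
import statistics

def gamma_moment_fit(data):
    """Method-of-moments Gamma fit: shape = mean^2 / var, scale = var / mean.
    (A stand-in for the maximum-likelihood fit likely used in the study.)"""
    m = statistics.fmean(data)
    v = statistics.pvariance(data)
    return m * m / v, v / m

rng = random.Random(0)
true_shape, true_scale = 8.0, 0.1   # illustrative inter-beat-interval process
intervals = [rng.gammavariate(true_shape, true_scale) for _ in range(20000)]
shape, scale = gamma_moment_fit(intervals)
print(round(shape), round(scale, 1))
```

Tracking the fitted (shape, scale) pair per session is what lets changes in skewness and dispersion be followed on the Gamma parameter plane.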
Monitoring And Analyzing Distributed Cluster Performance And Statistics Of Atlas Job Flow
Ramprakash, S
2005-01-01
The ATLAS experiment is a High Energy Physics experiment that utilizes the services of Grid3, now migrating to the Open Science Grid (OSG). This thesis provides monitoring and analysis of performance and statistical data from the individual distributed clusters that combine to form the ATLAS Grid and will ultimately be used to make scheduling decisions on this Grid. The system developed in this thesis uses a layered architecture such that predicted future developments or changes brought to the existing Grid infrastructure can easily utilize this work with minimal or no changes. The starting point of the system is based on the existing scheduling that is being done manually for ATLAS job flow. We have provided additional functionality based on the requirements of the High Energy Physics ATLAS team of physicists at UTA. The system developed in this thesis has successfully monitored and analyzed distributed cluster performance at three sites and is awaiting access to monitor data from three more sites. (Abstract s...
Energy Technology Data Exchange (ETDEWEB)
Palmintier, Bryan S [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Bugbee, Bruce [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Gotseff, Peter [National Renewable Energy Laboratory (NREL), Golden, CO (United States)
2017-10-03
Capturing the technical and economic impacts of solar photovoltaics (PV) and other distributed energy resources (DERs) on electric distribution systems can require high-time-resolution (e.g. 1 minute), long-duration (e.g. 1 year) simulations. However, such simulations can be computationally prohibitive, particularly when including complex control schemes in quasi-steady-state time series (QSTS) simulation. Various approaches have been used in the literature to down-select representative time segments (e.g. days), but typically these are best suited for lower time resolutions or consider only a single data stream (e.g. PV production) for selection. We present a statistical approach that combines stratified sampling and bootstrapping to select representative days while also providing a simple method to reassemble annual results. We describe the approach in the context of a recent study with a utility partner. This approach enables much faster QSTS analysis by simulating only a subset of days, while maintaining accurate annual estimates.
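The stratify-sample-reassemble idea can be sketched as below. The stratification metric, stratum count, and weights are assumptions of this sketch; the paper's bootstrapping step, which resamples to quantify the error of the reassembled annual estimate, is omitted for brevity:

```python
import random

def select_representative_days(metric, n_strata=4, per_stratum=3, rng=None):
    """Stratify days by a scalar summary metric (e.g. daily PV energy),
    sample days uniformly within each stratum, and return (day indices,
    weights). Weighting each sampled day by stratum_size / per_stratum
    lets the subset be reassembled into an annual estimate."""
    rng = rng or random.Random()
    order = sorted(range(len(metric)), key=lambda d: metric[d])
    size = len(order) // n_strata
    days, weights = [], []
    for k in range(n_strata):
        stratum = order[k * size:] if k == n_strata - 1 else order[k * size:(k + 1) * size]
        days += rng.sample(stratum, per_stratum)
        weights += [len(stratum) / per_stratum] * per_stratum
    return days, weights

rng = random.Random(1)
daily_pv = [10 + 5 * rng.random() for _ in range(365)]  # synthetic daily PV energy
days, w = select_representative_days(daily_pv, rng=rng)
annual_true = sum(daily_pv)
annual_est = sum(daily_pv[d] * wd for d, wd in zip(days, w))
rel_err = abs(annual_est - annual_true) / annual_true
print(len(days), rel_err < 0.05)
```

Only the selected days would be run through the full QSTS simulation; the weights then scale their results back up to an annual total.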
Statistical distribution of EMIC wave spectra: Observations from Van Allen Probes
Zhang, X.-J.; Li, W.; Thorne, R. M.; Angelopoulos, V.; Bortnik, J.; Kletzing, C. A.; Kurth, W. S.; Hospodarsky, G. B.
2016-12-01
It has been known that electromagnetic ion cyclotron (EMIC) waves can precipitate ultrarelativistic electrons through cyclotron resonant scattering. However, the overall effectiveness of this mechanism has yet to be quantified, because it is difficult to obtain the global distribution of EMIC waves that usually exhibit limited spatial presence. We construct a statistical distribution of EMIC wave frequency spectra and their intensities based on Van Allen Probes measurements from September 2012 to December 2015. Our results show that as the ratio of plasma frequency over electron gyrofrequency increases, EMIC wave power becomes progressively dominated by the helium band. There is a pronounced dawn-dusk asymmetry in the wave amplitude and the frequency spectrum. The frequency spectrum does not follow the commonly used single-peak Gaussian function. Incorporating these realistic EMIC wave frequency spectra into radiation belt models is expected to improve the quantification of EMIC wave scattering effects in ultrarelativistic electron dynamics.
Antiproton production of propagating cosmic rays under distributed reacceleration
International Nuclear Information System (INIS)
Simon, M.; Heinbach, U.; Koch, C.
1987-01-01
The available measurements of the cosmic-ray p̄/p ratio show an excess of antiprotons above predictions derived in the framework of the standard picture of cosmic-ray origin and propagation. We calculated the p̄ production from collisions of cosmic rays with the interstellar gas under the condition of distributed reacceleration. It could be shown that the calculated p̄/p ratio is enhanced compared to that derived from the 'leaky box' model, but it remains difficult to bring it into agreement with the data by reasonable astrophysical assumptions. (orig.)
Kim, Michael T; Chen, Yan; Marhoul, Joseph; Jacobson, Fred
2014-07-16
Trastuzumab emtansine (Kadcyla) is a recently approved antibody-drug conjugate produced by attachment of the anti-tubulin drug, DM1, to lysine amines via the SMCC linker. The resulting product exhibits a drug load distribution from 0 to 8 drugs per antibody that can be quantified using mass spectrometry. Different statistical models were tested against the experimental data derived from samples produced during process characterization studies to determine best fit. The Poisson distribution gives the best correlation for samples manufactured using the target process conditions (yielding the target average drug to antibody ratio (DAR) of 3.5) as well as those produced under conditions that exceed the allowed manufacturing ranges and yield products with average DAR values that are significantly different from the target (i.e., ≤3.0 or ≥4.0). The Poisson distribution establishes a link between average DAR values and drug load distributions, implying that measurement and control of the former (i.e., via a simple UV spectrophotometric method) could be used to indirectly control the latter in trastuzumab emtansine.
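The link the abstract draws between the average DAR and the drug-load distribution can be sketched with the Poisson model. The truncation to the observable 0-8 load range and its renormalization are modeling choices of this sketch, not details taken from the paper:

```python
import math

def poisson_load_distribution(mean_dar, max_load=8):
    """Poisson probabilities for 0..max_load drugs per antibody,
    renormalized over the observable 0-8 range."""
    p = [math.exp(-mean_dar) * mean_dar ** k / math.factorial(k)
         for k in range(max_load + 1)]
    total = sum(p)
    return [q / total for q in p]

dist = poisson_load_distribution(3.5)  # target average DAR of 3.5
mode = max(range(len(dist)), key=dist.__getitem__)       # most common drug load
mean = sum(k * pk for k, pk in enumerate(dist))          # mean of truncated model
print(mode)
```

Under this model, a UV-measured average DAR pins down the entire load distribution, which is what allows indirect control of the latter.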
International Nuclear Information System (INIS)
Luneva, K.V.; Kryshev, A.I.; Nikitin, A.I.; Kryshev, I.I.
2010-01-01
The article presents the results of a statistical analysis of radiation monitoring data on contamination of the river system Techa-Iset'-Tobol-Irtysh. A short description of the analyzed data and the territory under consideration is given. The distribution-free statistical methods used for the comparative analysis are described, along with the reasons for selecting these methods and features of their application. A comparative data analysis with traditional statistical methods is presented. A reliable decrease of ⁹⁰Sr specific activity from object to object was determined, which is evidence of radionuclide transport in the river system Techa-Iset'-Tobol-Irtysh. [ru]
Directory of Open Access Journals (Sweden)
Hsueh-Hsien Chang
2017-04-01
This paper proposes statistical feature extraction methods combined with artificial intelligence (AI) approaches for fault location in non-intrusive single-line-to-ground fault (SLGF) detection of low voltage distribution systems. The input features of the AI algorithms are extracted using statistical moment transformation to reduce the dimensions of the power signature inputs measured by non-intrusive fault monitoring (NIFM) techniques. The data required to develop the network are generated by simulating SLGF using the Electromagnetic Transient Program (EMTP) in a test system. To enhance the identification accuracy, these features are normalized and then given to the AI algorithms presented and evaluated in this paper. Different AI techniques are then compared to determine which identification algorithms are suitable for diagnosing the SLGF for various power signatures in a NIFM system. The simulation results show that the proposed method is effective and can identify the fault locations by using non-intrusive monitoring techniques for low voltage distribution systems.
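A statistical moment transformation of the kind described, reducing a long power signature to a handful of moments used as classifier features, can be sketched as follows. The example signals and the min-max normalization scheme are assumptions of this sketch, not the paper's exact pipeline:

```python
import math

def moment_features(signal):
    """Mean, standard deviation, skewness and kurtosis of a sampled
    power signature: a 4-dimensional feature vector."""
    n = len(signal)
    mean = sum(signal) / n
    var = sum((x - mean) ** 2 for x in signal) / n
    sd = math.sqrt(var)
    skew = sum((x - mean) ** 3 for x in signal) / (n * sd ** 3)
    kurt = sum((x - mean) ** 4 for x in signal) / (n * sd ** 4)
    return [mean, sd, skew, kurt]

def normalize(vectors):
    """Min-max normalization of each feature across samples, as is
    common before feeding features to an AI classifier."""
    cols = list(zip(*vectors))
    lo, hi = [min(c) for c in cols], [max(c) for c in cols]
    return [[(v - l) / (h - l) if h > l else 0.0
             for v, l, h in zip(vec, lo, hi)] for vec in vectors]

sig_a = [math.sin(2 * math.pi * t / 64) for t in range(256)]       # nominal signature
sig_b = [math.sin(2 * math.pi * t / 64) ** 3 for t in range(256)]  # distorted signature
feats = normalize([moment_features(sig_a), moment_features(sig_b)])
print(len(feats), len(feats[0]))
```

Whatever the signature length, each input to the classifier is reduced to the same small feature vector, which is the dimensionality reduction the abstract refers to.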
International Nuclear Information System (INIS)
Heinrich, S.
2006-01-01
The nuclear fission process is a very complex phenomenon and, even nowadays, no realistic models describing the overall process are available. The work presented here deals with a theoretical description of fission fragment distributions in mass, charge, energy and deformation. We have reconsidered and updated the B.D. Wilkins scission-point model. Our purpose was to test whether this statistical model, applied at the scission point and incorporating new results of modern microscopic calculations, allows a quantitative description of the fission fragment distributions. We calculate the surface energy available at the scission point as a function of the fragment deformations. This surface is obtained from a Hartree-Fock-Bogoliubov microscopic calculation, which guarantees a realistic description of the potential dependence on the deformation for each fragment. The statistical balance is described by the level densities of the fragments. We have tried to avoid as much as possible the input of empirical parameters in the model. Our only parameter, the distance between the fragments at the scission point, is discussed by comparison with scission configurations obtained from fully dynamical microscopic calculations. The comparison between our results and experimental data is very satisfying and allows us to discuss the successes and limitations of our approach. We finally propose ideas to improve the model, in particular by applying dynamical corrections. (author)
Strong field line shapes and photon statistics from a single molecule under anomalous noise.
Sanda, Frantisek
2009-10-01
We revisit the line-shape theory of a single molecule with anomalous stochastic spectral diffusion. Waiting time profiles for bath-induced spectral jumps in the ground and excited states become different when a molecule, probed by a continuous-wave laser field, reaches the steady state. This effect is studied for the stationary dichotomic continuous-time-random-walk spectral diffusion of a single two-level chromophore with power-law distributions of waiting times. Correlated waiting time distributions, line shapes, the two-point fluorescence correlation function, and the Mandel Q parameter are calculated for arbitrary magnitude of the laser field. We extend previous weak-field results and examine the breakdown of the central limit theorem in photon statistics, indicated by asymptotic power-law growth of the Mandel Q parameter. The frequency profile of the Mandel Q parameter identifies the peaks of the spectrum, which are related to the anomalous spectral diffusion dynamics.
Distributed and organized decision making under resource boundedness
International Nuclear Information System (INIS)
Sawaragi, Tetsuo
1994-01-01
The coming bottleneck to be overcome in the era of the distributed and open-architectured environment will be the establishment of rational design and coordination of the total system, where multiple decision makers, problem solvers and automated machinery components coexist, interacting with each other. In such an environment, they are not achieving some absolute standard of performance with unlimited amounts of resources or with simple algorithms, but are doing as well as possible given what resources they have. In this article, we focus on the potential of decision theory as a tool for tackling limited rationality under resource boundedness. First, the bottlenecks in establishing organized and distributed decision making are summarized, and the importance of formalizing the decision activities of intelligent agents is stressed in order to establish efficient and effective cooperation by distributed and organized decision making and/or problem solving. Some practical systems developed on this principle are reviewed briefly with respect to real-time man-machine collaboration and a cooperative computational framework for intelligent mobile robots. (author)
Han, Fang; Liu, Han
2016-01-01
Correlation matrices play a key role in many multivariate methods (e.g., graphical model estimation and factor analysis). The current state-of-the-art in estimating large correlation matrices focuses on the use of Pearson's sample correlation matrix. Although Pearson's sample correlation matrix enjoys various good properties under Gaussian models, it is not an effective estimator when facing heavy-tailed distributions. As a robust alternative, Han and Liu [J. Am. Stat. Assoc. 109 (2015) 275-2...
Han, Fang; Liu, Han
2017-02-01
The correlation matrix plays a key role in many multivariate methods (e.g., graphical model estimation and factor analysis). The current state of the art in estimating large correlation matrices focuses on the use of Pearson's sample correlation matrix. Although Pearson's sample correlation matrix enjoys various good properties under Gaussian models, it is not an effective estimator when facing heavy-tailed distributions with possible outliers. As a robust alternative, Han and Liu (2013b) advocated the use of a transformed version of the Kendall's tau sample correlation matrix for estimating the high dimensional latent generalized correlation matrix under the transelliptical distribution family (or elliptical copula). The transelliptical family assumes that, after unspecified marginal monotone transformations, the data follow an elliptical distribution. In this paper, we study the theoretical properties of the Kendall's tau sample correlation matrix and its transformed version proposed in Han and Liu (2013b) for estimating the population Kendall's tau correlation matrix and the latent Pearson's correlation matrix under both the spectral and restricted spectral norms. With regard to the spectral norm, we highlight the role of "effective rank" in quantifying the rate of convergence. With regard to the restricted spectral norm, we for the first time present a "sign subgaussian condition" which is sufficient to guarantee that the rank-based correlation matrix estimator attains the optimal rate of convergence. In both cases, we do not need any moment condition.
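The transformed estimator has a compact entrywise form, R = sin((pi/2) * tau), which recovers the latent Pearson correlation even after an unknown monotone marginal transform. A minimal single-pair sketch, with an illustrative sample size and an exponential transform standing in for the "unspecified marginal monotone transformation":

```python
import math
import random

def sign(v):
    return (v > 0) - (v < 0)

def kendall_tau(x, y):
    """Kendall's tau-a: mean sign-concordance over all pairs (O(n^2))."""
    n, s = len(x), 0
    for i in range(n):
        for j in range(i + 1, n):
            s += sign(x[i] - x[j]) * sign(y[i] - y[j])
    return s / (n * (n - 1) / 2)

def latent_correlation(x, y):
    """Rank-based estimator of the latent Pearson correlation under the
    transelliptical family: R = sin(pi/2 * tau)."""
    return math.sin(math.pi / 2 * kendall_tau(x, y))

rng = random.Random(3)
rho = 0.6
x, y = [], []
for _ in range(500):
    u = rng.gauss(0, 1)
    v = rho * u + math.sqrt(1 - rho ** 2) * rng.gauss(0, 1)
    x.append(u)
    y.append(math.exp(v))  # an unspecified monotone marginal transform
est = latent_correlation(x, y)
print(round(est, 1))
```

Because tau depends only on ranks, the exponential transform applied to one margin leaves the estimate of rho intact, which a Pearson correlation on the raw data would not.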
A statistical analysis of North East Atlantic (submicron) aerosol size distributions
Directory of Open Access Journals (Sweden)
M. Dall'Osto
2011-12-01
The Global Atmospheric Watch research station at Mace Head (Ireland) offers the possibility to sample some of the cleanest air masses being imported into Europe as well as some of the most polluted being exported out of Europe. We present a statistical cluster analysis of the physical characteristics of aerosol size distributions in air ranging from the cleanest to the most polluted for the year 2008. Data coverage achieved was 75% throughout the year. By applying the Hartigan-Wong k-means method, 12 clusters were identified as systematically occurring. These 12 clusters could be further combined into 4 categories of aerosol size distributions with similar characteristics, namely: a coastal nucleation category (occurring 21.3% of the time), an open ocean nucleation category (occurring 32.6% of the time), a background clean marine category (occurring 26.1% of the time) and an anthropogenic category (occurring 20% of the time). The coastal nucleation category is characterised by a clear and dominant nucleation mode at sizes less than 10 nm, while the open ocean nucleation category is characterised by a dominant Aitken mode between 15 nm and 50 nm. The background clean marine aerosol exhibited a clear bimodality in the sub-micron size distribution, although it should be noted that either the Aitken mode or the accumulation mode may dominate the number concentration. However, peculiar background clean marine size distributions with coarser accumulation modes are also observed during winter months. By contrast, the continentally-influenced size distributions are generally more monomodal (accumulation), albeit with traces of bimodality. The open ocean category occurs more often during May, June and July, corresponding with the North East (NE) Atlantic high biological period. Combined with the relatively high percentage frequency of occurrence (32.6%), this suggests that the marine biota is an important source of new nano-aerosol particles in NE Atlantic air.
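Clustering binned size distributions into recurring spectral shapes can be sketched as below. The paper used the Hartigan-Wong k-means algorithm; for brevity this sketch uses plain Lloyd iterations, and the size bins and two synthetic mode shapes (nucleation-like vs accumulation-like) are invented for illustration:

```python
import math
import random

def kmeans(points, k, iters=20):
    """Lloyd's k-means with deterministic, evenly spaced seeding
    (a simplification of the Hartigan-Wong method used in the paper)."""
    step = (len(points) - 1) // (k - 1)
    centers = [list(points[i * step]) for i in range(k)]
    labels = [0] * len(points)
    for _ in range(iters):
        for i, p in enumerate(points):
            labels[i] = min(range(k),
                            key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
        for c in range(k):
            members = [p for p, l in zip(points, labels) if l == c]
            if members:
                centers[c] = [sum(col) / len(members) for col in zip(*members)]
    return labels

def mode_spectrum(center_nm, size_bins):
    """Log-normal-shaped number distribution peaking near center_nm."""
    return [math.exp(-(math.log(s / center_nm)) ** 2 / 0.18) for s in size_bins]

rng = random.Random(5)
bins = [10, 20, 40, 80, 160, 320]  # illustrative diameter bins, nm
spectra = ([[v + 0.05 * rng.random() for v in mode_spectrum(12, bins)] for _ in range(30)] +
           [[v + 0.05 * rng.random() for v in mode_spectrum(150, bins)] for _ in range(30)])
labels = kmeans(spectra, 2)
print(sorted([sum(labels[:30]), sum(labels[30:])]))
```

Each hourly size distribution becomes one point in bin-space; recurring clusters of points then correspond to categories such as "nucleation" or "background clean marine".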
Pitch-class distribution modulates the statistical learning of atonal chord sequences.
Daikoku, Tatsuya; Yatomi, Yutaka; Yumoto, Masato
2016-10-01
The present study investigated whether neural responses could demonstrate the statistical learning of chord sequences and how the perception underlying a pitch class can affect the statistical learning of chord sequences. Neuromagnetic responses to two chord sequences of augmented triads that were presented every 0.5s were recorded from fourteen right-handed participants. One sequence was a series of 360 chord triplets, each of which consisted of three chords in the same pitch class (clustered pitch-classes sequences). The other sequence was a series of 360 chord triplets, each of which consisted of three chords in different pitch classes (dispersed pitch-classes sequences). The order of the triplets was constrained by a first-order Markov stochastic model such that a forthcoming triplet was statistically defined by the most recent triplet (80% for one; 20% for the other two). We performed a repeated-measures ANOVA with the peak amplitude and latency of the P1m, N1m and P2m. In the clustered pitch-classes sequences, the P1m responses to the triplets that appeared with higher transitional probability were significantly reduced compared with those with lower transitional probability, whereas no significant result was detected in the dispersed pitch-classes sequences. Neuromagnetic significance was concordant with the results of familiarity interviews conducted after each learning session. The P1m response is a useful index for the statistical learning of chord sequences. Domain-specific perception based on the pitch class may facilitate the domain-general statistical learning of chord sequences. Copyright © 2016 Elsevier Inc. All rights reserved.
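The first-order Markov constraint on the triplet stream can be sketched as follows. The specific triplet labels, the choice of high-probability successors, and the even 10%/10% split of the remaining 20% are assumptions of this sketch; the abstract states only "80% for one; 20% for the other two":

```python
import random

LIKELY = {"A": "B", "B": "C", "C": "A"}  # hypothetical high-probability successors

def generate_stream(n, rng):
    """First-order Markov chain over three chord triplets: from each
    triplet, one successor follows with probability 0.8 and each of
    the other two with probability 0.1."""
    seq = ["A"]
    for _ in range(n - 1):
        prev = seq[-1]
        if rng.random() < 0.8:
            seq.append(LIKELY[prev])
        else:
            seq.append(rng.choice(sorted(set("ABC") - {LIKELY[prev]})))
    return seq

rng = random.Random(11)
seq = generate_stream(360, rng)
hits = sum(b == LIKELY[a] for a, b in zip(seq, seq[1:]))
p_hat = hits / (len(seq) - 1)  # empirical probability of the likely successor
print(round(p_hat, 2))
```

Statistical learning in the experiment amounts to listeners implicitly picking up the 0.8 transitions, which is why responses to high-probability triplets differ from those to low-probability ones.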
Statistical modelling of the snow depth distribution in open alpine terrain
Directory of Open Access Journals (Sweden)
T. Grünewald
2013-08-01
The spatial distribution of alpine snow covers is characterised by large variability. Taking this variability into account is important for many tasks, including hydrology, glaciology, ecology and natural hazards. Statistical modelling is frequently applied to assess the spatial variability of the snow cover. For this study, we assembled seven data sets of high-resolution snow-depth measurements from different mountain regions around the world. All data were obtained from airborne laser scanning near the time of maximum seasonal snow accumulation. Topographic parameters were used to model the snow depth distribution at the catchment scale by applying multiple linear regressions. We found that by averaging out the substantial spatial heterogeneity at the metre scale (i.e. individual drifts) and aggregating snow accumulation at the landscape or hydrological response unit scale (cell size 400 m), 30 to 91% of the snow depth variability can be explained by models that are calibrated to local conditions at the single study areas. As all sites were sparsely vegetated, only a few topographic variables were included as explanatory variables, including elevation, slope, the deviation of the aspect from north (northing), and a wind sheltering parameter. In most cases, elevation, slope and northing are very good predictors of snow distribution. A comparison of the models showed that the importance of parameters and their coefficients differed among the catchments. A "global" model, combining all the data from all areas investigated, could only explain 23% of the variability. It appears that local statistical models cannot be transferred to different regions. However, models developed for one peak snow season are good predictors for other peak snow seasons.
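A multiple linear regression of snow depth on topographic predictors can be sketched with ordinary least squares via the normal equations. The predictors below (elevation, slope, northing) follow the abstract, but all coefficients, value ranges and the synthetic depth model are invented for illustration:

```python
import random

def ols(X, y):
    """Ordinary least squares via the normal equations (X'X) b = X'y,
    solved by Gaussian elimination with partial pivoting."""
    n, p = len(X), len(X[0])
    A = [[sum(X[i][r] * X[i][c] for i in range(n)) for c in range(p)] for r in range(p)]
    b = [sum(X[i][r] * y[i] for i in range(n)) for r in range(p)]
    for col in range(p):
        piv = max(range(col, p), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, p):
            f = A[r][col] / A[col][col]
            for c in range(col, p):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * p
    for r in range(p - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, p))) / A[r][r]
    return beta

def r_squared(X, y, beta):
    """Fraction of variance explained by the fitted model."""
    yhat = [sum(bc * xc for bc, xc in zip(beta, row)) for row in X]
    ybar = sum(y) / len(y)
    ss_res = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))
    ss_tot = sum((yi - ybar) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

rng = random.Random(2)
X, y = [], []
for _ in range(400):                      # 400 synthetic 400 m grid cells
    elev = rng.uniform(1500, 3000)        # m
    slope = rng.uniform(0, 45)            # degrees
    northing = rng.uniform(0, 180)        # deviation of aspect from north
    depth = 0.002 * elev + 0.01 * northing - 0.03 * slope + rng.gauss(0, 0.4)
    X.append([1.0, elev, slope, northing])
    y.append(depth)
beta = ols(X, y)
r2 = r_squared(X, y, beta)
print(round(r2, 2))
```

The R² of such a locally calibrated fit is the quantity the study reports in the 30 to 91% range per catchment.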
Toward the detection of gravitational waves under non-Gaussian noises I. Locally optimal statistic.
Yokoyama, Jun'ichi
2014-01-01
After reviewing the standard hypothesis test and the matched filter technique to identify gravitational waves under Gaussian noises, we introduce two methods to deal with non-Gaussian stationary noises. We formulate the likelihood ratio function under weakly non-Gaussian noises through the Edgeworth expansion, and under strongly non-Gaussian noises in terms of a new method we call Gaussian mapping, where the observed marginal distribution and the two-body correlation function are fully taken into account. We then apply these two approaches to Student's t-distribution, which has larger tails than the Gaussian. It is shown that while both methods work well when the non-Gaussianity is small, only the latter works well in the highly non-Gaussian case.
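A toy sketch of the likelihood-ratio detection statistic under Gaussian versus Student's t noise models follows. This is not the paper's Edgeworth or Gaussian-mapping construction; the template, noise parameters and the injected glitch are all illustrative:

```python
import math
import random

def gaussian_loglike(res, sigma):
    """Sum of log N(0, sigma^2) densities over the residuals."""
    return sum(-0.5 * (r / sigma) ** 2 - 0.5 * math.log(2 * math.pi * sigma ** 2)
               for r in res)

def student_t_loglike(res, nu, sigma):
    """Sum of log densities of a scaled Student's t with nu d.o.f."""
    c = (math.lgamma((nu + 1) / 2) - math.lgamma(nu / 2)
         - 0.5 * math.log(nu * math.pi * sigma ** 2))
    return sum(c - (nu + 1) / 2 * math.log(1 + (r / sigma) ** 2 / nu) for r in res)

def detection_statistic(data, template, loglike):
    """Log-likelihood ratio of 'signal present' vs 'noise only'; under
    Gaussian noise this reduces to the classic matched filter."""
    return loglike([d - s for d, s in zip(data, template)]) - loglike(data)

rng = random.Random(9)
template = [0.8 * math.sin(0.3 * t) for t in range(200)]
noise = [rng.gauss(0, 1) for _ in range(200)]
noise[50] += 12.0  # one loud non-Gaussian glitch
data = [n + s for n, s in zip(noise, template)]

gauss_stat = detection_statistic(data, template, lambda r: gaussian_loglike(r, 1.0))
t_stat = detection_statistic(data, template, lambda r: student_t_loglike(r, 3.0, 1.0))
print(gauss_stat > 0, t_stat > 0)
```

Both statistics flag the injected signal here; the heavy-tailed t likelihood additionally down-weights the single glitch sample, which is the motivation for moving beyond the Gaussian matched filter.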
The statistical fluctuation study of quantum key distribution in means of uncertainty principle
Liu, Dunwei; An, Huiyao; Zhang, Xiaoyu; Shi, Xuemei
2018-03-01
Defects of lasers in emitting single photons, photon signal attenuation and propagation of error have long caused serious difficulties in practical long-distance quantum key distribution (QKD) experiments. In this paper, we study the uncertainty principle in metrology and use this tool to analyze the statistical fluctuation of the number of received single photons, the yield of single photons and the quantum bit error rate (QBER). After that we calculate the error between the measured value and the real value of every parameter, and account for the propagation of error among all the measured values. We paraphrase the Gottesman-Lo-Lütkenhaus-Preskill (GLLP) formula in consideration of those parameters and generate the QKD simulation result. In this study, the safe distribution distance increases with the coding photon length. When the coding photon length is N = 10^11, the safe distribution distance can be almost 118 km. This gives a lower bound on the safe transmission distance, compared with the 127 km obtained without the uncertainty principle. Our study is thus in line with established theory, but makes it more realistic.
Directory of Open Access Journals (Sweden)
Seifu Hagos
Understanding the spatial distribution of stunting and the underlying factors operating at meso-scale is of paramount importance for intervention design and implementation. Yet little is known about the spatial distribution of stunting, and some discrepancies are documented on the relative importance of reported risk factors. Therefore, the present study aims at exploring the spatial distribution of stunting at the meso (district) scale, and evaluates the effect of spatial dependency on the identification of risk factors and their relative contribution to the occurrence of stunting and severe stunting in a rural area of Ethiopia. A community-based cross-sectional study was conducted to measure the occurrence of stunting and severe stunting among children aged 0-59 months. Additionally, we collected relevant information on anthropometric measures, dietary habits, and parent- and child-related demographic and socio-economic status. The latitude and longitude of surveyed households were also recorded. The local Anselin Moran's I was calculated to investigate the spatial variation of stunting prevalence and identify potential local pockets (hotspots) of high prevalence. Finally, we employed a Bayesian geo-statistical model, which accounted for the spatial dependency structure in the data, to identify potential risk factors for stunting in the study area. Overall, the prevalence of stunting and severe stunting in the district was 43.7% [95%CI: 40.9, 46.4] and 21.3% [95%CI: 19.5, 23.3] respectively. We identified statistically significant clusters of high prevalence of stunting (hotspots) in the eastern part of the district and clusters of low prevalence (cold spots) in the western part. We found that the inclusion of the spatial structure of the data into the Bayesian model improved the fit of the stunting model. The Bayesian geo-statistical model indicated that the risk of stunting increased as the child's age increased (OR 4.74; 95% Bayesian credible interval [BCI]:3
NDVI statistical distribution of pasture areas at different times in the Community of Madrid (Spain)
Martín-Sotoca, Juan J.; Saa-Requejo, Antonio; Díaz-Ambrona, Carlos G. H.; Tarquis, Ana M.
2015-04-01
The severity of drought has many implications for society, including its impacts on the water supply, water pollution, reservoir management and ecosystems. However, its impacts on rain-fed agriculture are especially direct. Because of the importance of drought, there have been many attempts to characterize its severity, resulting in the numerous drought indices that have been developed (Niemeyer, 2008). A 'biomass index' based on the satellite-image-derived Normalized Difference Vegetation Index (NDVI) has been used in countries like the United States of America, Canada and Spain for pasture and forage crops for some years (Rao, 2010). This type of agricultural insurance is known as 'index-based insurance' (IBI). IBI is perceived to be substantially less costly to operate and manage than multiple-peril insurance. IBI contracts pay indemnities based not on the actual yield (or revenue) losses experienced by the insurance purchaser but rather on realized NDVI values (historical data) that are correlated with farm-level losses (Deng et al., 2008). The definition of when a drought event occurs is based on NDVI threshold values, mainly derived from the statistical parameters, average and standard deviation, that characterize a normal distribution. In this work a pasture area in the north of the Community of Madrid (Spain) has been delimited. NDVI historical data were then reconstructed based on MODIS remote sensing imagery, with 500x500 m² resolution. A statistical analysis of the NDVI histograms at 46 consecutive intervals of that area was applied to search for the best statistical distribution based on the maximum likelihood criterion. The results show that the normal distribution is not the optimal representation when IBI is available; the implications in the context of crop insurance are discussed (Martín-Sotoca, 2014). References Rao, K.N. 2010. Index based Crop Insurance. Agriculture and Agricultural Science Procedia 1, 193-203. Martín-Sotoca, J.J. (2014) Estructura Espacial
Perneger, Thomas V; Combescure, Christophe
2017-07-01
Published P-values provide a window into the global enterprise of medical research. The aim of this study was to use the distribution of published P-values to estimate the relative frequencies of null and alternative hypotheses and to seek irregularities suggestive of publication bias. This cross-sectional study included P-values published in 120 medical research articles in 2016 (30 each from the BMJ, JAMA, Lancet, and New England Journal of Medicine). The observed distribution of P-values was compared with expected distributions under the null hypothesis (i.e., uniform between 0 and 1) and the alternative hypothesis (strictly decreasing from 0 to 1). P-values were categorized according to conventional levels of statistical significance and in one-percent intervals. Among 4,158 recorded P-values, 26.1% were highly significant (P […]). The findings suggest that P-values published in these journals are not a random sample of null and alternative hypotheses but that selective reporting is prevalent. In particular, significant results are about twice as likely to be reported as nonsignificant results. Copyright © 2017 Elsevier Inc. All rights reserved.
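The null/alternative mixture the authors test against can be sketched with a small simulation; the mixing fraction and the Beta(0.2, 1) alternative density below are illustrative assumptions, not the paper's fitted values:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
pi0 = 0.5          # assumed fraction of true null hypotheses (illustrative)

# Null p-values are uniform on (0, 1); alternative p-values are modeled
# here as Beta(0.2, 1), a strictly decreasing density on (0, 1).
null_p = rng.uniform(0.0, 1.0, int(n * pi0))
alt_p = rng.beta(0.2, 1.0, n - len(null_p))
p_values = np.concatenate([null_p, alt_p])

frac_significant = float(np.mean(p_values < 0.05))
print(f"fraction with p < 0.05: {frac_significant:.3f}")
```

Comparing the observed histogram of published P-values against such a mixture is what allows the relative frequencies of the two hypothesis classes to be estimated.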
DEFF Research Database (Denmark)
Conradsen, Knut; Nielsen, Allan Aasbjerg; Schou, Jesper
2003-01-01
[…] Based on this distribution, a test statistic for equality of two such matrices and an associated asymptotic probability for obtaining a smaller value of the test statistic are derived and applied successfully to change detection in polarimetric SAR data. In a case study, EMISAR L-band data from April 17 […] to HH, VV, or HV data alone, the derived test statistic reduces to the well-known gamma likelihood-ratio test statistic. The derived test statistic and the associated significance value can also be applied as a line or edge detector in fully polarimetric SAR data.
Moigne, Le N.; Oever, van den M.J.A.; Budtova, T.
2011-01-01
Using high resolution optical microscopy coupled with image analysis software and statistical methods, fibre length and aspect ratio distributions in polypropylene composites were characterized. Three types of fibres, flax, sisal and wheat straw, were studied. Number and surface weighted
Node vulnerability of water distribution networks under cascading failures
International Nuclear Information System (INIS)
Shuang, Qing; Zhang, Mingyuan; Yuan, Yongbo
2014-01-01
Water distribution networks (WDNs) are important components of modern lifeline systems. Their stability and reliability are critical for guaranteeing a high quality of life and the continuous operation of urban functions. The aim of this paper is to evaluate the nodal vulnerability of WDNs under cascading failures. Vulnerability is defined to analyze the effects of the consequent failures. A cascading failure is a step-by-step process which is quantitatively investigated by numerical simulation with intentional attack. Monitored pressures at different nodes and flows in different pipes have been used to estimate the network topological structure and the consequences of nodal failure. Based on the connectivity loss of the topological structure, the nodal vulnerability has been evaluated. A load variation function is established to record the reason for nodal failure and describe the relative differences between the load and the capacity. The proposed method is validated by an illustrative example. The results revealed that network vulnerability should be evaluated with consideration of both hydraulic analysis and network topology. In the case study, 70.59% of the node failures trigger cascading failures with different failure processes. It is shown that cascading failures result in severe consequences in WDNs. - Highlights: • The aim of this paper is to evaluate the nodal vulnerability of water distribution networks under cascading failures. • Monitored pressures and flows have been used to estimate the network topological structure and the consequences of nodal failure. • Based on the connectivity loss of topological structure, the nodal vulnerability has been evaluated. • A load variation function is established to record the failure reason and describe the relative differences between load and capacity. • The results show that 70.59% of the node failures trigger the cascading failures with different failure processes
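The connectivity-loss idea can be sketched on a toy graph; the six-node network and the reachable-pairs metric below are hypothetical stand-ins for the paper's hydraulic model:

```python
from collections import deque

def reachable_pairs(adj, removed=frozenset()):
    """Count ordered pairs (u, v), u != v, with a path from u to v."""
    nodes = [n for n in adj if n not in removed]
    pairs = 0
    for src in nodes:
        seen, queue = {src}, deque([src])
        while queue:            # breadth-first search from src
            u = queue.popleft()
            for v in adj[u]:
                if v not in removed and v not in seen:
                    seen.add(v)
                    queue.append(v)
        pairs += len(seen) - 1
    return pairs

# Hypothetical 6-node looped network (node 0 plays the role of the source).
adj = {0: [1], 1: [0, 2, 3], 2: [1, 4], 3: [1, 4], 4: [2, 3, 5], 5: [4]}

baseline = reachable_pairs(adj)
for node in (1, 2):
    degraded = reachable_pairs(adj, removed=frozenset({node}))
    loss = 1 - degraded / baseline
    print(f"removing node {node}: connectivity loss = {loss:.2f}")
```

Removing the articulation node 1 disconnects the source and produces a much larger connectivity loss than removing node 2, which sits on a loop, which is the kind of asymmetry the nodal vulnerability ranking captures.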
Discharge current distribution in stratified soil under impulse discharge
Eniola Fajingbesi, Fawwaz; Shahida Midi, Nur; Elsheikh, Elsheikh M. A.; Hajar Yusoff, Siti
2017-06-01
The mobility of charge particles traversing a material defines its electrical properties. Soil (earth) has long been the universal grounding medium, before and after the inception of active grounding systems for electrical appliances, owing to its semi-conductive properties. The soil can thus be modelled as a single material exhibiting a semi-complex inductive-reactive impedance. Under impulse discharges such as lightning strikes to soil, this property could result in electric potential fluctuations ranging from ground potential rise/fall to electromagnetic pulse coupling that could ultimately fail connected electrical appliances. In this work we experimentally modelled the soil and lightning discharge using a point-to-plane electrode setup to observe the current distribution characteristics over a range of soil conductivities [mS/m]. The results presented from this research indicate an above 5% shift in conductivity before and after discharge, which is significant for consideration when dealing with grounding designs. The current distribution in soil has also been successfully observed and analysed from the experimental results, using the mean current magnitude in relation to electrode distance and location and the current density variation with depth, all showing strong correlation with theoretical assumptions of a semi-complex impedance material.
Statistical theory on the analytical form of cloud particle size distributions
Wu, Wei; McFarquhar, Greg
2017-11-01
Several analytical forms of cloud particle size distributions (PSDs) have been used in numerical modeling and remote sensing retrieval studies of clouds and precipitation, including exponential, gamma, lognormal, and Weibull distributions. However, there is no satisfying physical explanation as to why certain distribution forms preferentially occur instead of others. Theoretically, the analytical form of a PSD can be derived by directly solving the general dynamic equation, but no analytical solutions have been found yet. Instead of using a process-level approach, the use of the principle of maximum entropy (MaxEnt) for determining the analytical form of PSDs from a system perspective is examined here. The issue of variability under coordinate transformations that arises when using the Gibbs/Shannon definition of entropy is identified, and the use of the concept of relative entropy to avoid these problems is discussed. Focusing on cloud physics, the four-parameter generalized gamma distribution is proposed as the analytical form of a PSD using the principle of maximum (relative) entropy, with assumptions on power-law relations between state variables, scale invariance, and a further constraint on the expectation of one state variable (e.g. bulk water mass). DOE ASR.
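As a rough illustration, the generalized gamma density named above can be written down and checked numerically; the parameter values are arbitrary and the location parameter is fixed at zero, so this is a sketch rather than the paper's four-parameter form:

```python
import math
import numpy as np

def gen_gamma_pdf(x, a, d, p):
    """Generalized gamma density
    f(x) = p * x**(d-1) * exp(-(x/a)**p) / (a**d * Gamma(d/p)), x > 0."""
    return p * x**(d - 1) * np.exp(-(x / a) ** p) / (a ** d * math.gamma(d / p))

# Sanity check: the density integrates to ~1. With p = 1 it reduces to the
# gamma distribution, and with d = p it reduces to the Weibull distribution.
x = np.linspace(1e-9, 200.0, 400_001)
f = gen_gamma_pdf(x, a=10.0, d=3.0, p=1.0)   # gamma special case
area = float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(x)))  # trapezoid rule
print(f"integral of pdf: {area:.4f}")
```

The extra shape parameter p is what lets this one family cover the exponential, gamma and Weibull forms that appear in the modeling literature as special cases.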
2014-09-24
Stereo under Sequential Optimal Sampling: A Statistical Analysis Framework for Search Space Reduction. Yilin Wang, Ke Wang, Enrique Dunn, Jan-Michael […]. [Figure residue removed: patch-size vs. redundancy and sampling-ratio axis ticks.]
Pérez, Darío G; Funes, Gustavo
2012-12-03
Under the geometric optics approximation it is possible to estimate the covariance between the displacements of two thin beams after they have propagated through a turbulent medium. Previous works have concentrated on long propagation distances to provide models for the wandering statistics. These models are useful when the separation between beams is smaller than the propagation path, regardless of the characteristic scales of the turbulence. In this work we give a complete model for the behavior of these covariances, introducing absolute limits to the validity of the former approximations. Moreover, these generalizations are established for non-Kolmogorov atmospheric models.
hardpan and maize root distribution under conservation and ...
African Journals Online (AJOL)
ACSS
2016-08-23
Bray-1) according to UNZA (1998). Mid-row samples were selected to reduce impacts of variable lime and fertiliser applications. Statistical analysis. Data were tested statistically using pairwise Student's t-tests, with confidence.
Wu, John Z; Herzog, Walter; Federico, Salvatore
2016-04-01
The distribution of collagen fibers across articular cartilage layers is statistical in nature. Based on the concepts proposed in previous models, we developed a methodology to include the statistically distributed fibers across the cartilage thickness in the commercial FE software COMSOL which avoids extensive routine programming. The model includes many properties that are observed in real cartilage: finite hyperelastic deformation, depth-dependent collagen fiber concentration, depth- and deformation-dependent permeability, and statistically distributed collagen fiber orientation distribution across the cartilage thickness. Numerical tests were performed using confined and unconfined compressions. The model predictions on the depth-dependent strain distributions across the cartilage layer are consistent with the experimental data in the literature.
Distributed Monitoring of the R² Statistic for Linear Regression
Bhaduri, Kanishka; Das, Kamalika; Giannella, Chris R.
2011-01-01
The problem of monitoring a multivariate linear regression model is relevant in studying the evolving relationship between a set of input variables (features) and one or more dependent target variables. This problem becomes challenging for large-scale data in a distributed computing environment when only a subset of instances is available at individual nodes and the local data changes frequently. Data centralization and periodic model recomputation can add high overhead to tasks like anomaly detection in such dynamic settings. Therefore, the goal is to develop techniques for monitoring and updating the model over the union of all nodes' data in a communication-efficient fashion. Correctness guarantees on such techniques are also often highly desirable, especially in safety-critical application scenarios. In this paper we develop DReMo, a distributed algorithm with very low resource overhead, for monitoring the quality of a regression model in terms of its coefficient of determination (R² statistic). When the nodes collectively determine that R² has dropped below a fixed threshold, the linear regression model is recomputed via a network-wide convergecast and the updated model is broadcast back to all nodes. We show empirically, using both synthetic and real data, that our proposed method is highly communication-efficient and scalable, and also provide theoretical guarantees on correctness.
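A single-machine sketch of the monitored quantity (not the distributed DReMo protocol itself): fit a linear model, compute its R², then watch R² collapse when the data drift so that a recompute would be triggered. The threshold and synthetic data are illustrative:

```python
import numpy as np

def r_squared(X, y, coef):
    """Coefficient of determination of a fitted linear model."""
    resid = y - X @ coef
    ss_res = float(resid @ resid)
    ss_tot = float(((y - y.mean()) ** 2).sum())
    return 1.0 - ss_res / ss_tot

rng = np.random.default_rng(1)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([2.0, 3.0]) + rng.normal(scale=0.5, size=n)

coef, *_ = np.linalg.lstsq(X, y, rcond=None)
r2 = r_squared(X, y, coef)
print(f"R^2 on training data: {r2:.3f}")

# Concept drift: the true relationship changes, so the stale model's R^2 drops.
y_drift = X @ np.array([2.0, -3.0]) + rng.normal(scale=0.5, size=n)
r2_drift = r_squared(X, y_drift, coef)
THRESHOLD = 0.8
if r2_drift < THRESHOLD:
    # Stand-in for the network-wide convergecast and model broadcast.
    coef, *_ = np.linalg.lstsq(X, y_drift, rcond=None)
```

The point of the distributed algorithm is to detect the threshold crossing collectively without shipping all the raw data to one place.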
Directory of Open Access Journals (Sweden)
Helen eBuckler
2016-05-01
Morphophonological alternations, such as the voicing alternation that arises in a morphological paradigm due to final devoicing in Dutch, are notoriously difficult for children to acquire. This has previously been attributed to their unpredictability. In fact, the presence or absence of a voicing alternation is partly predictable if the phonological context of the word is taken into account, and adults have been shown to use this information (Ernestus & Baayen, 2003). This study investigates whether voicing alternations are predictable from the child's input, and whether children can make use of this information. A corpus study of child-directed speech establishes that the likelihood of a stem-final obstruent alternating is somewhat predictable on the basis of the phonological properties of the stem. In Experiment 1, Dutch 3-year-olds' production accuracy in a plural-elicitation task is shown to be sensitive to the distributional statistics. However, distributional properties do not play a role in children's sensitivity to mispronunciations of voicing in a Preferential Looking Task in Experiment 2.
Particle size distribution in ambient air of Delhi and its statistical analysis.
Chelani, A B; Gajghate, D G; Chalapatirao, C V; Devotta, S
2010-07-01
Particle size distribution in ambient air has been studied in an urban city, Delhi. Different activity sites, namely kerbside, industrial and residential, were selected for the study. A statistical analysis was carried out to study the frequency distribution and sources of different particle size fractions. The dominance of coarse particles, attributed to local activities, was observed at all the sites. It was observed that at kerbside sites, up to 52% of the particles were lower respiratory tract particles and up to 47% were upper respiratory tract particles. At residential and industrial sites, up to 40% and 31% were lower and upper respiratory tract particles, respectively. Factor analysis results indicated auto-exhaust as the dominant source of particulate matter at two of the kerbside sites. Resuspended dust was dominant at the remaining two kerbside sites and the residential sites. It was inferred using the geometric standard deviation of particle size fractions that these were from different sources at the residential and industrial sites and from similar sources at three of the kerbside sites.
Statistics on Near Wall Structures and Shear Stress Distribution from 3D Holographic Measurement.
Sheng, J.; Malkiel, E.; Katz, J.
2007-11-01
Digital holographic microscopy performs 3D velocity measurement in the near-wall region of a turbulent boundary layer in a square channel over a smooth wall at Re_τ=1,400. Resolution of ~1 μm over a sample volume of 1.5×2×1.5 mm (x^+=50, y^+=60, z^+=50) is sufficient for resolving buffer-layer and lower log-layer structures, and for measuring instantaneous wall shear stress distributions from velocity gradients in the viscous sublayer. Results, based on 700 instantaneous realizations, provide detailed statistics on the spatial distribution of both wall stress components along with characteristic flow structures. Conditional sampling based on maxima and minima of wall shear stresses, as well as examination of instantaneous flow structures, leads to the development of a conceptual model for a characteristic flow phenomenon that seems to generate extreme stress events. This structure develops as an initially spanwise vortex element rises away from the surface, due to a local disturbance, causing a local stress minimum. Due to increasing velocity with elevation, this element bends downstream, forming a pair of inclined streamwise vortices, aligned at 45° to the freestream, with ejection-like flow between them. Entrainment of high streamwise momentum on the outer sides of this vortex pair generates streamwise shear stress maxima, 70 δν downstream, which are displaced laterally by 35 δν from the local minimum.
Mealing, Nicole; Hayen, Andrew; Newall, Anthony T
2016-06-08
It is important to assess the impact a vaccination programme has on the burden of disease after it is implemented. For example, this may reveal herd immunity effects or vaccine-induced shifts in the incidence of disease or in circulating strains or serotypes of the pathogen. In this article we summarise the key features of infectious diseases that need to be considered when trying to detect any changes in the burden of diseases at a population level as a result of vaccination efforts. We outline the challenges of using routine surveillance databases to monitor infectious diseases, such as the identification of diseased cases and the availability of vaccination status for cases. We highlight the complexities in modelling the underlying patterns in infectious disease rates (e.g. presence of autocorrelation) and discuss the main statistical methods that can be used to control for periodicity (e.g. seasonality) and autocorrelation when assessing the impact of vaccination programmes on burden of disease (e.g. cosinor terms, generalised additive models, autoregressive processes and moving averages). For some analyses, there may be multiple methods that can be used, but it is important for authors to justify the method chosen and discuss any limitations. We present a case study review of the statistical methods used in the literature to assess the rotavirus vaccination programme impact in Australia. The methods used varied and included generalised linear models and descriptive statistics. Not all studies accounted for autocorrelation and seasonality, which can have a major influence on results. We recommend that future analyses consider the strength and weakness of alternative statistical methods and justify their choice. Copyright © 2016 Elsevier Ltd. All rights reserved.
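A minimal sketch of the cosinor idea mentioned above, using ordinary least squares on synthetic monthly counts rather than the full GLM machinery; all numbers are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
months = np.arange(120)                       # 10 years of monthly counts
# Hypothetical disease counts with annual seasonality (cosinor structure).
true_rate = 50 + 20 * np.cos(2 * np.pi * months / 12)
counts = rng.poisson(true_rate)

# Cosinor terms: regress counts on a sine/cosine pair with a 12-month period.
X = np.column_stack([
    np.ones_like(months, dtype=float),
    np.sin(2 * np.pi * months / 12),
    np.cos(2 * np.pi * months / 12),
])
beta, *_ = np.linalg.lstsq(X, counts.astype(float), rcond=None)
amplitude = float(np.hypot(beta[1], beta[2]))
print(f"estimated mean level {beta[0]:.1f}, seasonal amplitude {amplitude:.1f}")
```

In an impact analysis the same design matrix would gain a pre/post-vaccination term, so that the programme effect is estimated after seasonality is accounted for; residual autocorrelation would still need separate handling (e.g. autoregressive terms).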
An, Z. L.; Chen, T.; Cheng, D. L.; Chen, T. H.; Y Wang, Z.
2017-12-01
In this work, the average tensile strength of diffusion-bonded 316L stainless steel is statistically analyzed by the Weibull distribution method. Direct diffusion bonding of 316L-SS was performed at a high temperature of 550°C, and eight tension tests were carried out. The results obtained vary between 87.8 MPa and 160.8 MPa. The probability distribution of material failure is obtained by using the Weibull distribution.
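As an illustration of the Weibull method (not the paper's actual eight data points), a probability-plot fit recovers the Weibull modulus and scale from simulated strengths:

```python
import numpy as np

rng = np.random.default_rng(3)
true_shape, true_scale = 3.0, 120.0           # hypothetical values, in MPa
strengths = true_scale * rng.weibull(true_shape, size=200)

# Weibull probability plot: ln(-ln(1 - F_i)) is linear in ln(x), with
# slope = shape (Weibull modulus) and intercept = -shape * ln(scale).
x = np.sort(strengths)
F = (np.arange(1, len(x) + 1) - 0.3) / (len(x) + 0.4)   # median-rank positions
slope, intercept = np.polyfit(np.log(x), np.log(-np.log(1.0 - F)), 1)
shape_hat = float(slope)
scale_hat = float(np.exp(-intercept / slope))
print(f"Weibull modulus ~ {shape_hat:.2f}, scale ~ {scale_hat:.1f} MPa")
```

The fitted two-parameter model then gives the failure probability at any stress level via F(s) = 1 - exp(-(s/scale)^shape).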
International Nuclear Information System (INIS)
Zhang Yu; Wang Guangyi; Lu Xinmiao; Hu Yongcai; Xu Jiangtao
2016-01-01
The random telegraph signal noise in the pixel source follower MOSFET is the principal component of the noise in a CMOS image sensor under low light. In this paper, a physical and statistical model of the random telegraph signal noise in the pixel source follower, based on the binomial distribution, is set up. The number of electrons captured or released by the oxide traps in unit time is described by random variables which obey a binomial distribution. As a result, the output states and the corresponding probabilities of the first and second samples of the correlated double sampling circuit are acquired. The standard deviation of the output states after the correlated double sampling circuit can be obtained accordingly. In the simulation section, one hundred thousand samples of the source follower MOSFET have been simulated, and the simulation results show that the proposed model has similar statistical characteristics to the existing models under the effect of the channel length and the density of the oxide traps. Moreover, the noise histogram of the proposed model has been evaluated at different environmental temperatures.
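A much-simplified sketch of the binomial building block: each sample counts binomially occupied traps, and the CDS output is the difference of the two samples. Unlike the paper's model, the two samples here are assumed independent (no trap-state correlation between samples), and the trap count and capture probability are invented:

```python
import numpy as np

rng = np.random.default_rng(4)
n_traps, p_capture = 4, 0.3      # assumed per-pixel trap count and capture prob.
n_pixels = 100_000               # matches the paper's simulation scale

# Each CDS sample sees a binomially distributed number of occupied traps;
# the CDS output is the difference of the two samples.
sample1 = rng.binomial(n_traps, p_capture, n_pixels)
sample2 = rng.binomial(n_traps, p_capture, n_pixels)
cds_output = sample2 - sample1

sigma = float(cds_output.std())
theory = float(np.sqrt(2 * n_traps * p_capture * (1 - p_capture)))
print(f"simulated sigma {sigma:.3f} vs. independent-sample theory {theory:.3f}")
```

A histogram of cds_output is the kind of noise histogram the paper evaluates; modeling the correlation between the two CDS samples is exactly what the full model adds over this sketch.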
International Nuclear Information System (INIS)
Nam, Cheol; Choi, Byeong Kwon; Jeong, Yong Hwan; Jung, Youn Ho
2001-01-01
During the last decade, the failure behavior of high-burnup fuel rods under RIA has been an extensive concern since observations of fuel rod failures at low enthalpy. Great importance is placed on the failure prediction of fuel rods from the standpoint of licensing criteria and of safety in extending burnup achievement. To address the issue, a statistics-based methodology is introduced to predict the failure probability of irradiated fuel rods. Based on RIA simulation results in the literature, a failure enthalpy correlation for irradiated fuel rods is constructed as a function of oxide thickness, fuel burnup, and pulse width. From the failure enthalpy correlation, a single damage parameter, the equivalent enthalpy, is defined to reflect the effects of the three primary factors as well as peak fuel enthalpy. Moreover, the failure distribution function with equivalent enthalpy is derived by applying a two-parameter Weibull statistical model. Using these equations, a sensitivity analysis is carried out to estimate the effects of burnup, corrosion, peak fuel enthalpy, pulse width and cladding materials used
Morgenthaler, George W.
1989-01-01
The ability to launch on time and to send payloads into space has progressed dramatically since the days of the earliest missile and space programs. Causes for delay during launch, i.e., unplanned 'holds', are attributable to several sources: weather, range activities, vehicle conditions, human performance, etc. Recent developments in space programs, particularly the need for highly reliable logistic support of space construction and the subsequent planned operation of space stations, large unmanned space structures, lunar and Mars bases, and the necessity of providing 'guaranteed' commercial launches, have placed increased emphasis on understanding and mastering every aspect of launch vehicle operations. The Center for Space Construction has acquired historical launch vehicle data and is applying these data to the analysis of space launch vehicle logistic support of space construction. This analysis includes development of a better understanding of launch-on-time capability and simulation of the required support systems for vehicle assembly and launch which are necessary to support national space program construction schedules. In this paper, the author presents actual launch data on unscheduled 'hold' distributions of various launch vehicles. The data have been supplied by industrial associate companies of the Center for Space Construction. The paper seeks to determine suitable probability models that describe these historical data and that can be used for several purposes, such as inputs to broader simulations of launch vehicle logistic space construction support processes and the determination of which launch operations sources cause the majority of the unscheduled 'holds', and hence to suggest changes which might improve launch-on-time performance. In particular, the paper investigates the ability of a compound distribution probability model to fit actual data, versus alternative models, and recommends the most productive avenues for future statistical work.
Statistical Analysis of Wave Climate Data Using Mixed Distributions and Extreme Wave Prediction
Directory of Open Access Journals (Sweden)
Wei Li
2016-05-01
The investigation of various aspects of the wave climate at a wave energy test site is essential for the development of reliable and efficient wave energy conversion technology. This paper presents studies of the wave climate based on nine years of wave observations from the 2005–2013 period, measured with a wave measurement buoy at the Lysekil wave energy test site located off the west coast of Sweden. A detailed analysis of the wave statistics is carried out to reveal the characteristics of the wave climate at this specific test site. The long-term extreme waves are estimated by applying the Peak over Threshold (POT) method to the measured wave data. The significant wave height and the maximum wave height at the test site for different return periods are also compared. In this study, a new approach using a mixed-distribution model is proposed to describe the long-term behavior of the significant wave height, and it shows an impressive goodness of fit to wave data from the test site. The mixed-distribution model is also applied to measured wave data from four other sites, providing an illustration of the general applicability of the proposed model. The methodologies used in this paper can be applied to general wave climate analysis of wave energy test sites to estimate extreme waves for the survivability assessment of wave energy converters, and to characterize the long-term wave climate to forecast the wave energy resource of the test sites and the energy production of the wave energy converters.
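The POT step can be sketched as follows; the synthetic wave heights, threshold choice, method-of-moments GPD fit, and exceedance rate are all illustrative assumptions rather than the Lysekil analysis:

```python
import numpy as np

rng = np.random.default_rng(5)
hs = 2.0 * rng.weibull(1.5, size=20_000)      # synthetic significant wave heights (m)

threshold = float(np.quantile(hs, 0.95))
excess = hs[hs > threshold] - threshold       # peaks-over-threshold exceedances

# Method-of-moments fit of the generalized Pareto distribution (GPD):
#   xi = (1 - m^2/v) / 2,   sigma = m * (1 + m^2/v) / 2
m, v = float(excess.mean()), float(excess.var())
xi = 0.5 * (1.0 - m * m / v)
sigma = 0.5 * m * (1.0 + m * m / v)

# Return level exceeded once in N "years", assuming lam threshold
# exceedances per year (both numbers are illustrative); the xi -> 0
# limit would instead use threshold + sigma * log(lam * N).
lam, N = 10.0, 100.0
return_level = threshold + (sigma / xi) * ((lam * N) ** xi - 1.0)
print(f"xi = {xi:.2f}, sigma = {sigma:.2f}, {N:.0f}-year level ~ {return_level:.1f} m")
```

In practice a maximum-likelihood fit and a declustering step (to keep exceedance peaks independent) would replace the moment estimates used here.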
Directory of Open Access Journals (Sweden)
Emmanouil Styvaktakis
2007-01-01
This paper presents the two main types of classification methods for power quality disturbances based on underlying causes: deterministic classification, giving an expert system as an example, and statistical classification, with support vector machines (a novel method) as an example. An expert system is suitable when one has a limited amount of data and sufficient power system expert knowledge; however, its application requires a set of threshold values. Statistical methods are suitable when a large amount of data is available for training. Two important issues that determine the effectiveness of a classifier, data segmentation and feature extraction, are discussed. Segmentation of a sequence of data recordings is a preprocessing step to partition the data into segments, each representing a duration containing either an event or a transition between two events. Extraction of features is applied to each segment individually. Some useful features and their effectiveness are then discussed. Some experimental results are included to demonstrate the effectiveness of both systems. Finally, conclusions are given together with a discussion of some future research directions.
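The deterministic, threshold-based style of classification described above can be sketched with a toy per-segment RMS classifier; the thresholds and event labels are illustrative, not the paper's expert system:

```python
import numpy as np

def classify_segment(v, nominal_rms=230.0):
    """Toy expert-system-style classifier: the RMS feature of a segment is
    compared against fixed thresholds. Thresholds are illustrative,
    loosely following common sag/swell magnitude conventions."""
    rms = np.sqrt(np.mean(v ** 2))
    pu = rms / nominal_rms          # per-unit magnitude
    if pu < 0.1:
        return "interruption"
    if pu < 0.9:
        return "sag"
    if pu > 1.1:
        return "swell"
    return "normal"

t = np.linspace(0, 0.1, 5000)                       # 0.1 s segment, 50 Hz system
wave = np.sqrt(2) * 230.0 * np.sin(2 * np.pi * 50 * t)
print(classify_segment(wave))                       # full-magnitude segment
print(classify_segment(0.5 * wave))                 # 50% voltage sag
```

A statistical classifier such as an SVM would instead learn the decision boundaries from labeled feature vectors, which is exactly the trade-off the paper discusses: hand-set thresholds versus training data.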
Li, Xiaojun; Yu, Benxu; Ji, Yucheng; Lu, Jiaxin; Yuan, Shouqi
2017-02-01
Centrifugal pumps are often used in operating conditions where they can be susceptible to premature failure. The cavitation phenomenon is a common fault in centrifugal pumps and is associated with undesired effects. Among the numerous cavitation detection methods, the measurement of suction pressure fluctuation is one of the most used methods to detect or diagnose the degree of cavitation in a centrifugal pump. In this paper, a closed loop was established to investigate the pump cavitation phenomenon, and the statistical parameters PDF (probability density function), variance and RMS (root mean square) were used to analyze the relationship between the cavitation performance and the suction pressure signals during the development of cavitation. It is found that the statistical parameters used in this research are able to capture the critical cavitation condition and the cavitation breakdown condition, but have difficulty detecting incipient cavitation in the pump. At part-load conditions, the pressure fluctuations at the impeller inlet show more complexity than at the best efficiency point (BEP). The amplitude of the PDF values of the suction pressure increased steeply when the flow rate dropped to 40 m3/h (the design flow rate was 60 m3/h). One possible reason is that the flow structure in the impeller channel promotes an increase in cavitation intensity when the flow rate is reduced to a certain degree. This shows that it is necessary to find the relationship between cavitation instabilities and flow instabilities when centrifugal pumps operate under part-load flow rates.
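The three statistics used in the paper are straightforward to compute from a pressure record; the signal below is synthetic, and its mean level, tone frequency and noise level are arbitrary placeholders:

```python
import numpy as np

rng = np.random.default_rng(6)
fs, T = 10_000, 1.0                     # sample rate (Hz) and duration (s)
t = np.arange(int(fs * T)) / fs

# Hypothetical suction-pressure signal: a mean level, a periodic tone,
# plus broadband noise (which grows as cavitation develops).
pressure = 40.0 + 2.0 * np.sin(2 * np.pi * 145 * t) + rng.normal(0, 0.5, t.size)

fluct = pressure - pressure.mean()
variance = float(np.var(fluct))
rms = float(np.sqrt(np.mean(fluct ** 2)))  # RMS of the fluctuating component
hist, edges = np.histogram(fluct, bins=50, density=True)  # empirical PDF

print(f"variance = {variance:.3f}, RMS = {rms:.3f}")
```

Tracking how the empirical PDF spreads and how variance/RMS grow as the inlet pressure is lowered is the basic detection logic; note that for a mean-removed signal the variance is simply the square of the RMS.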
Tight finite-key analysis for passive decoy-state quantum key distribution under general attacks
Zhou, Chun; Bao, Wan-Su; Li, Hong-Wei; Wang, Yang; Li, Yuan; Yin, Zhen-Qiang; Chen, Wei; Han, Zheng-Fu
2014-05-01
For quantum key distribution (QKD) using spontaneous parametric-down-conversion sources (SPDCSs), the passive decoy-state protocol has been proved to be efficiently close to the theoretical limit of an infinite decoy-state protocol. In this paper, we apply a tight finite-key analysis for the passive decoy-state QKD using SPDCSs. Combining the security bound based on the uncertainty principle with the passive decoy-state protocol, a concise and stringent formula for calculating the key generation rate for QKD using SPDCSs is presented. The simulation shows that the secure distance under our formula can reach up to 182 km when the number of sifted data is 10^10. Our results also indicate that, under the same deviation of statistical fluctuation due to finite-size effects, the passive decoy-state QKD with SPDCSs can perform as well as the active decoy-state QKD with a weak coherent source.
Under the hood of IRIS's Distributed REU Site
Hubenthal, M.; Taber, J.
2014-12-01
Since 1998 the IRIS Undergraduate Internship Program has provided research experiences for up to 15 students each summer. Through this 9- to 11-week internship program, students take part in an intensive week-long preparatory course and work with leaders in seismological research, in both lab-based and field-based settings, to produce research products worthy of presentation and recognition at large professional conferences. The IRIS internship program employs a distributed REU model that has been demonstrated to bond students into a cohort and maintain group cohesion despite students conducting their research at geographically distributed sites. Over the past 16 years the program has encountered numerous anticipated and unanticipated challenges. The primary challenges have involved exploring how to modify the REU system to produce outcomes that are better aligned with our programmatic goals. For example, some questions we have attempted to address include: How can the success of an REU site be measured? How do you find, interest, and recruit under-represented minorities into a geophysics program? Can the program increase the probability of interns receiving some minimal level of mentoring across the program? While it is likely that no single answer to these questions exists, we have developed and piloted a number of strategies. These approaches have been developed by identifying relevant research results from other REUs and combining this information with data from our own programmatic evaluations. These data inform the development of specific changes within our program, which are then measured as feedback. We will present our current strategies to address each question, along with measures of their effectiveness. In addition to broad-scale systematic issues, we have also placed significant effort into responding to smaller process challenges that all REU sites face. These range from simple logistical issues (e.g. liability) to educational
Statistical inference on censored data for targeted clinical trials under enrichment design.
Chen, Chen-Fang; Lin, Jr-Rung; Liu, Jen-Pei
2013-01-01
For traditional clinical trials, inclusion and exclusion criteria are usually based on clinical endpoints; the genetic or genomic variability of the trial participants is not fully utilized in the criteria. After completion of the Human Genome Project, disease targets at the molecular level can be identified and utilized for the treatment of diseases. However, the accuracy of diagnostic devices for identification of such molecular targets is usually not perfect. Some of the patients enrolled in targeted clinical trials with a positive result for the molecular target might not actually have the specific molecular target. As a result, the treatment effect may be underestimated in the patient population truly with the molecular target. To resolve this issue, under the exponential distribution, we develop inferential procedures for the treatment effects of the targeted drug based on the censored endpoints in the patients truly with the molecular targets. Under an enrichment design, we propose using the expectation-maximization algorithm in conjunction with the bootstrap technique to incorporate the inaccuracy of the diagnostic device for detection of the molecular targets into the inference of the treatment effects. A simulation study was conducted to empirically investigate the performance of the proposed methods. Simulation results demonstrate that under the exponential distribution, the proposed estimator is nearly unbiased with adequate precision, and the confidence interval can provide adequate coverage probability. In addition, the proposed testing procedure can adequately control the size with sufficient power. On the other hand, when the proportional hazards assumption is violated, additional simulation studies show that the type I error rate is not controlled at the nominal level and is an increasing function of the positive predictive value. A numerical example illustrates the proposed procedures. Copyright © 2013 John Wiley & Sons, Ltd.
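A sketch of the censored-exponential building block only (not the paper's EM-with-bootstrap procedure for diagnostic misclassification): under an exponential model, the MLE of the hazard rate from right-censored data has a closed form. All numbers are invented:

```python
import numpy as np

rng = np.random.default_rng(7)
n, true_rate = 400, 0.2                 # hypothetical hazard (events per month)

event_times = rng.exponential(1.0 / true_rate, n)
censor_times = rng.exponential(1.0 / 0.1, n)      # independent right-censoring

observed = np.minimum(event_times, censor_times)  # follow-up time per patient
is_event = event_times <= censor_times            # True if the event was seen

# Exponential MLE: (number of observed events) / (total follow-up time).
rate_hat = float(is_event.sum() / observed.sum())
mean_survival_hat = 1.0 / rate_hat
print(f"estimated rate {rate_hat:.3f} (true {true_rate})")
```

The enrichment-design problem layers onto this the fact that some "target-positive" patients do not truly carry the target, which is why the paper needs the EM algorithm and bootstrap on top of this simple estimator.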
On the Distribution of the Peña Rodríguez Portmanteau Statistic
Directory of Open Access Journals (Sweden)
Serge B. Provost
2012-07-01
Full Text Available Peña and Rodríguez (2002) introduced a portmanteau test for time series which turns out to be more powerful than those proposed by Ljung and Box (1978) and Monti (1994), and approximated its distribution by means of a two-parameter gamma random variable. A polynomially adjusted beta approximation is proposed in this paper. This approximant is based on the moments of the statistic, which can be estimated by simulation or determined by symbolic computations or numerical integration. Various types of time series processes, such as AR(1), MA(1) and ARMA(2,2), are considered. The proposed approximation turns out to be nearly exact.
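The moment-matching idea can be sketched with the simpler Ljung-Box statistic as a stand-in, using a two-moment gamma fit rather than the paper's polynomially adjusted beta; all sample sizes and lag counts below are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Ljung-Box portmanteau statistic (illustrative stand-in for the
# Pena-Rodriguez statistic) for a white-noise series.
def ljung_box_stat(x, m=10):
    n = len(x)
    x = x - x.mean()
    acf = np.array([np.dot(x[:-k], x[k:]) for k in range(1, m + 1)]) / np.dot(x, x)
    return n * (n + 2) * np.sum(acf**2 / (n - np.arange(1, m + 1)))

# Estimate the statistic's moments by simulation.
samples = np.array([ljung_box_stat(rng.standard_normal(200)) for _ in range(2000)])

# Two-parameter gamma approximation: match mean and variance.
mean, var = samples.mean(), samples.var()
shape = mean**2 / var   # gamma shape
scale = var / mean      # gamma scale
approx = stats.gamma(a=shape, scale=scale)

# Compare an upper-tail quantile of the fitted gamma with the empirical one.
q95_emp = np.quantile(samples, 0.95)
q95_fit = approx.ppf(0.95)
```

For white noise the statistic is approximately chi-squared with m degrees of freedom, so the moment-matched gamma tail quantile lands close to the empirical one.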
Riandry, M. A.; Ismet, I.; Akhsan, H.
2017-09-01
This study aims to produce a valid and practical statistical physics course handout on distribution function materials based on STEM. The Rowntree development model is used to produce this handout. The model consists of three stages: planning, development and evaluation. In this study, the evaluation stage used Tessmer formative evaluation, which consists of five stages: self-evaluation, expert review, one-to-one evaluation, small group evaluation and field test. However, the handout was tested only on validity and practicality aspects, so the field test stage was not implemented. The data collection techniques used were walkthroughs and questionnaires. Subjects of this study are students of the 6th and 8th semester of academic year 2016/2017 of the Physics Education Study Program of Sriwijaya University. The average result of expert review is 87.31% (very valid category). One-to-one evaluation obtained an average result of 89.42%. The result of small group evaluation is 85.92%. From the one-to-one and small group evaluation stages, the average student response to this handout is 87.67% (very practical category). Based on the results of the study, it can be concluded that the handout is valid and practical.
Aryan, H.; Yearby, K.; Balikhin, M. A.; Krasnoselskikh, V.; Agapitov, O. V.
2013-12-01
The interaction of gyroresonant particles with chorus waves largely determines the dynamics of the Earth's radiation belts, affecting the acceleration and loss of radiation belt electrons. The common approach is to present model wave distributions in the inner magnetosphere under different values of geomagnetic activity as expressed by the geomagnetic indices. However, it is known that solar wind parameters such as bulk velocity (V) and density (n) are more effective in the control of high energy fluxes at geostationary orbit. Therefore, in the present study the set of parameters of the wave distribution is expanded to include the solar wind parameters in addition to the geomagnetic indices. The present study examines almost four years (1 January 2004 to 29 September 2007) of Cluster STAFF-SA, Double Star TC1 and OMNI data in order to present a combined model of wave magnetic field intensities for the chorus waves as a function of magnetic local time (MLT), L-shell (L*), geomagnetic activity, and solar wind velocity and density. Generally, the largest wave intensities are observed during average solar wind velocities (around 300 km s-1) and moderate densities (around 6 cm-3). On the other hand, the wave intensity is lower and limited to between 06:00 and 18:00 MLT for solar wind velocities above 700 km s-1.
Lee, Chieh-Han; Yu, Hwa-Lung
2014-05-01
Dengue fever has been recognized as the most important widespread vector-borne infectious disease in recent decades. Over 40% of the world's population is at risk from dengue, and about 50-100 million people are infected worldwide annually. Previous studies have found that dengue fever is highly correlated with climate covariates. Thus, the potential effects of global climate change on dengue fever are crucial to epidemic concern, in particular the transmission of the disease. The present study investigated the nonlinearity of the time-delayed impact of climate on spatio-temporal variations of dengue fever in southern Taiwan during 1998 to 2011. A distributed lag nonlinear model (DLNM) is used to assess the nonlinear lagged effects of meteorology. The statistically significant meteorological factors are considered, including weekly minimum temperature and maximum 24-hour rainfall. The relative risk and the distribution of dengue fever are then predicted under various climate change scenarios. The results show that the relative risk is similar for different scenarios. In addition, the impact of rainfall on the incidence risk is higher than that of temperature. Moreover, the incidence risk is associated with the spatial population distribution. The results can serve as a practical reference for environmental regulators for epidemic prevention under climate change scenarios.
Asymptotic distribution for goodness-of-fit statistics in a sequence of multinomial models
Czech Academy of Sciences Publication Activity Database
Vajda, Igor; Gyorfi, L.
2002-01-01
Roč. 56, č. 1 (2002), s. 57-67 ISSN 0167-7152 R&D Projects: GA AV ČR IAA1075101 Institutional research plan: CEZ:AV0Z1075907 Keywords: goodness-of-fit statistics * disparity statistics * goodness-of-fit tests Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.364, year: 2002
Understanding the Sampling Distribution and Its Use in Testing Statistical Significance.
Breunig, Nancy A.
Despite increasing criticism of statistical significance testing by researchers, particularly since the publication of the 1994 American Psychological Association style manual, statistical significance test results are still popular in journal articles. For this reason, it remains important to understand the logic of inferential statistics. A…
Voltage Control in Distributed Generation under Measurement Falsification Attacks
Ma, M.; Herdeiro Teixeira, A.M.; van den Berg, J.; Palensky, P.
2017-01-01
Low-voltage distribution grids experience a rising penetration of inverter-based, distributed generation. In order not only to avoid contributing to voltage problems but also to help solve them, these inverters are increasingly asked to participate in intelligent grid controls. Communicating inverters implement
Directory of Open Access Journals (Sweden)
V. E. Merzlikin
2015-01-01
Full Text Available The article deals with the search for optimal estimation of the parameters of the homogenization process for dairy products. It provides a theoretical basis for the relationship between the relaxation time of the fat globules and the attenuation coefficient of ultrasonic oscillations in dairy products, and suggests calculating the mass distribution of fat globules from the measured acoustic properties of milk. Studies were carried out to test this hypothesis. A morphological analysis procedure was applied to milk samples homogenized at different pressures, as well as to non-homogenized samples. As a result, histograms of the fat-globule distribution as a function of homogenization pressure were obtained. Acoustic studies were also performed to obtain the frequency characteristics of the loss modulus as a function of homogenization pressure. For further analysis, the dependences were approximated using the statistical moments of the distributions. The parameters for approximating the distribution of fat globules and the loss modulus versus homogenization pressure were obtained, and the hypothesized relationship between the two sets of approximation parameters was tested. Correlation analysis showed a clear dependence of the first and second statistical moments of the distributions on the homogenization pressure, consistent with the physical meaning of the first two moments of a statistical distribution. Correlation analysis was also carried out between the statistical moments of the fat-globule distribution and the moments of the loss modulus. It is concluded that ultrasonic testing of the degree of homogenization and of the mass distribution of the fat globules of milk products is possible.
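The first two statistical moments on which the correlation analysis relies can be computed directly from a binned globule-size distribution; the histogram below (bin centres in micrometres, relative frequencies) is hypothetical.

```python
import numpy as np

# Hypothetical fat-globule diameter histogram at one homogenization pressure:
# bin centres (micrometres) and relative frequencies.
d = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
f = np.array([0.10, 0.30, 0.30, 0.18, 0.08, 0.04])

# First statistical moment: mean diameter.
m1 = np.sum(d * f) / np.sum(f)
# Second central moment: variance of the diameter distribution.
m2 = np.sum((d - m1) ** 2 * f) / np.sum(f)
```

Tracking how m1 and m2 shift with homogenization pressure is exactly the kind of correlation the abstract describes.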
Girling, Alan J; Hemming, Karla
2016-06-15
In stepped cluster designs the intervention is introduced into some (or all) clusters at different times and persists until the end of the study. Instances include traditional parallel cluster designs and the more recent stepped-wedge designs. We consider the precision offered by such designs under mixed-effects models with fixed time and random subject and cluster effects (including interactions with time), and explore the optimal choice of uptake times. The results apply both to cross-sectional studies where new subjects are observed at each time-point, and longitudinal studies with repeat observations on the same subjects. The efficiency of the design is expressed in terms of a 'cluster-mean correlation' which carries information about the dependency-structure of the data, and two design coefficients which reflect the pattern of uptake-times. In cross-sectional studies the cluster-mean correlation combines information about the cluster-size and the intra-cluster correlation coefficient. A formula is given for the 'design effect' in both cross-sectional and longitudinal studies. An algorithm for optimising the choice of uptake times is described and specific results obtained for the best balanced stepped designs. In large studies we show that the best design is a hybrid mixture of parallel and stepped-wedge components, with the proportion of stepped wedge clusters equal to the cluster-mean correlation. The impact of prior uncertainty in the cluster-mean correlation is considered by simulation. Some specific hybrid designs are proposed for consideration when the cluster-mean correlation cannot be reliably estimated, using a minimax principle to ensure acceptable performance across the whole range of unknown values. © 2016 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
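A sketch of the cluster-mean correlation for the cross-sectional case, assuming the standard form m·ICC / (1 + (m − 1)·ICC) for the correlation between two means of the same cluster (the paper's exact definition may differ), together with the stated optimal stepped-wedge proportion for a large hybrid design:

```python
def cluster_mean_correlation(icc, m):
    """Correlation between two means of the same cluster in a cross-sectional
    design; the standard form m*icc / (1 + (m - 1)*icc) is assumed here."""
    return m * icc / (1.0 + (m - 1.0) * icc)

# Hypothetical design: 20 subjects per cluster-period, ICC of 0.05.
R = cluster_mean_correlation(icc=0.05, m=20)

# Per the abstract, the optimal large hybrid design assigns this
# proportion of clusters to the stepped-wedge component.
prop_stepped = R
```

With these illustrative numbers roughly half the clusters would follow the stepped-wedge schedule, the rest a parallel design.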
Energy Technology Data Exchange (ETDEWEB)
Kwag, Shinyoung [North Carolina State University, Raleigh, NC 27695 (United States); Korea Atomic Energy Research Institute, Daejeon 305-353 (Korea, Republic of); Gupta, Abhinav, E-mail: agupta1@ncsu.edu [North Carolina State University, Raleigh, NC 27695 (United States)
2017-04-15
Highlights: • This study presents the development of a Bayesian framework for probabilistic risk assessment (PRA) of structural systems under multiple hazards. • The concepts of Bayesian network and Bayesian inference are combined by mapping the traditionally used fault trees into a Bayesian network. • The proposed mapping allows for consideration of dependencies as well as correlations between events. • Incorporation of Bayesian inference permits a novel way for exploration of a scenario that is likely to result in a system level “vulnerability.” - Abstract: Conventional probabilistic risk assessment (PRA) methodologies (USNRC, 1983; IAEA, 1992; EPRI, 1994; Ellingwood, 2001) conduct risk assessment for different external hazards by considering each hazard separately and independent of each other. The risk metric for a specific hazard is evaluated by a convolution of the fragility and the hazard curves. The fragility curve for a basic event is obtained by using empirical, experimental, and/or numerical simulation data for a particular hazard. Treating each hazard independently can be inappropriate in some cases, as certain hazards are statistically correlated or dependent. Examples of such correlated events include but are not limited to flooding-induced fire, seismically induced internal or external flooding, or even seismically induced fire. In the current practice, system level risk and consequence sequences are typically calculated using logic trees to express the causative relationship between events. In this paper, we present the results from a study on multi-hazard risk assessment that is conducted using a Bayesian network (BN) with Bayesian inference. The framework can consider statistical dependencies among risks from multiple hazards, allows updating by considering the newly available data/information at any level, and provides a novel way to explore alternative failure scenarios that may exist due to vulnerabilities.
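The contrast between independent and dependent hazards can be illustrated with a two-event OR gate, encoding the dependence as a minimal flood-to-fire Bayesian network; all probabilities below are hypothetical.

```python
import itertools

# Hypothetical basic-event failure probabilities for two hazards.
p_flood, p_fire = 0.01, 0.02

# Conventional PRA: treat the hazards as independent, OR-gate top event.
p_sys_indep = 1 - (1 - p_flood) * (1 - p_fire)

# Dependent case: flooding raises the conditional fire probability
# (flooding-induced fire), encoded as a two-node network flood -> fire.
p_fire_given_flood = 0.30
# Choose the other conditional so the fire marginal stays at p_fire.
p_fire_given_no_flood = (p_fire - p_flood * p_fire_given_flood) / (1 - p_flood)

# Top-event probability by enumerating the joint distribution.
p_sys_dep = 0.0
for flood, fire in itertools.product([0, 1], [0, 1]):
    pf = p_flood if flood else 1 - p_flood
    pc = p_fire_given_flood if flood else p_fire_given_no_flood
    pc = pc if fire else 1 - pc
    if flood or fire:
        p_sys_dep += pf * pc
```

Here the positive dependence increases the overlap of the two events, so the OR-gate probability drops relative to the independence assumption; with other gate structures the bias can go the other way.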
Directory of Open Access Journals (Sweden)
Shouxiang Wang
2017-12-01
Full Text Available The estimation of losses of distribution feeders plays a crucial guiding role in the planning, design, and operation of a distribution system. This paper proposes a novel method for estimating the statistical line loss of distribution feeders using a feeder cluster technique and a modified eXtreme Gradient Boosting (XGBoost) algorithm, based on the characteristic data of feeders collected in the smart power distribution and utilization system. In order to enhance the applicability and accuracy of the estimation model, a k-medoids algorithm with weighted distance for clustering distribution feeders is proposed. Meanwhile, a variable selection method for clustering distribution feeders is discussed, considering the correlation and validity of variables. This paper then modifies the XGBoost algorithm by adding a penalty function, in consideration of the effect of the theoretical value, to the loss function for the estimation of statistical line loss of distribution feeders. The validity of the proposed methodology is verified on 762 distribution feeders in the Shanghai distribution system. The results show that the XGBoost method has higher accuracy than decision trees, neural networks, and random forests by comparison of the Root Mean Square Error (RMSE), Mean Absolute Percentage Error (MAPE), and Absolute Percentage Error (APE) indexes. In particular, the theoretical value can significantly improve the reasonability of the estimated results.
Gerdes, Lars; Busch, Ulrich; Pecoraro, Sven
2014-12-14
According to Regulation (EU) No 619/2011, trace amounts of non-authorised genetically modified organisms (GMO) in feed are tolerated within the EU if certain prerequisites are met. Tolerable traces must not exceed the so-called 'minimum required performance limit' (MRPL), defined in the regulation to correspond to a 0.1% mass fraction per ingredient. Therefore, not-yet-authorised GMO (and some GMO whose approvals have expired) have to be quantified at very low levels following qualitative detection in genomic DNA extracted from feed samples. As the results of quantitative analysis can imply severe legal and financial consequences for producers or distributors of feed, the quantification results need to be utterly reliable. We developed a statistical approach to investigate the experimental measurement variability within one 96-well PCR plate. This approach visualises the frequency distribution of the zygosity-corrected relative content of genetically modified material resulting from different combinations of transgene and reference gene Cq values. One application is simulating the consequences of varying parameters on measurement results. Such parameters include, for example, replicate numbers or baseline and threshold settings; measurement results include, for example, the median (class) and the relative standard deviation (RSD). All calculations can be done using the built-in functions of Excel without any need for programming. The developed Excel spreadsheets are available (see section 'Availability of supporting data' for details). In most cases, the combination of four PCR replicates for each of the two DNA isolations already resulted in a relative standard deviation of 15% or less. The aims of the study are scientifically based suggestions for minimisation of uncertainty of measurement especially in, but not limited to, the field of GMO quantification at low concentration levels. Four PCR replicates for each of the two DNA isolations
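The frequency-distribution idea can be sketched outside Excel as well. The measurement model below (perfect PCR efficiency, i.e. a factor of 2 per cycle, and a known zygosity factor) and all Cq values are assumptions for illustration, not the authors' spreadsheet logic.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed measurement model: relative GM content from a Cq difference,
# content = 2**(Cq_ref - Cq_transgene) / zygosity, with perfect efficiency.
def relative_content(cq_transgene, cq_ref, zygosity=1.0):
    return 2.0 ** (cq_ref - cq_transgene) / zygosity

# Simulate four replicate Cq values per target around a true copy ratio
# of 0.001 (~0.1%); the transgene appears ~log2(1/ratio) cycles later.
true_ratio = 0.001
cq_ref = rng.normal(25.0, 0.15, size=4)
cq_tr = rng.normal(25.0 - np.log2(true_ratio), 0.15, size=4)

# Every transgene x reference Cq combination contributes one result,
# giving the frequency distribution of measured relative contents.
grid = np.array([relative_content(t, r) for t in cq_tr for r in cq_ref])
median = np.median(grid)
rsd = grid.std(ddof=1) / grid.mean() * 100  # relative standard deviation, %
```

Varying the replicate count or the Cq noise in this sketch mimics the parameter studies the abstract describes.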
Aryan, Homayon; Yearby, Keith; Balikhin, Michael; Agapitov, Oleksiy; Krasnoselskikh, Vladimir; Boynton, Richard
2014-08-01
Energetic electrons within the Earth's radiation belts represent a serious hazard to geostationary satellites. The interactions of electrons with chorus waves play an important role in both the acceleration and loss of radiation belt electrons. The common approach is to present model wave distributions in the inner magnetosphere under different values of geomagnetic activity as expressed by the geomagnetic indices. However, it has been shown that only around 50% of geomagnetic storms increase the flux of relativistic electrons at geostationary orbit, while 20% cause a decrease and the remaining 30% have relatively no effect. This emphasizes the importance of including solar wind parameters such as bulk velocity (V), density (n), flow pressure (P), and the vertical interplanetary magnetic field component (Bz), which are known to be predominantly effective in the control of high energy fluxes at geostationary orbit. Therefore, in the present study the set of parameters of the wave distributions is expanded to include the solar wind parameters in addition to the geomagnetic activity. The present study examines almost 4 years (1 January 2004 to 29 September 2007) of Spatio-Temporal Analysis of Field Fluctuation data from Double Star TC1, combined with geomagnetic indices and solar wind parameters from the OMNI database, in order to present a comprehensive model of wave magnetic field intensities for the chorus waves as a function of magnetic local time, L shell (L), magnetic latitude (λm), geomagnetic activity, and solar wind parameters. Generally, the results indicate that the intensity of chorus emission is not only dependent upon geomagnetic activity but also on solar wind parameters, with velocity and southward interplanetary magnetic field Bs (Bz < 0) evidently the most influential solar wind parameters. The largest peak chorus intensities, in the order of 50 pT, are observed during active conditions, high solar wind velocities, low solar wind densities, high
Distributed Multisensor Data Fusion under Unknown Correlation and Data Inconsistency
Directory of Open Access Journals (Sweden)
Muhammad Abu Bakr
2017-10-01
Full Text Available The paradigm of multisensor data fusion has evolved from a centralized architecture to a decentralized or distributed architecture along with the advancement in sensor and communication technologies. These days, distributed state estimation and data fusion have been widely explored in diverse fields of engineering and control due to their superior performance over the centralized approach in terms of flexibility, robustness to failure, and cost effectiveness in infrastructure and communication. However, distributed multisensor data fusion is not without technical challenges to overcome: namely, dealing with cross-correlation and inconsistency among state estimates and sensor data. In this paper, we review the key theories and methodologies of distributed multisensor data fusion available to date, with a specific focus on handling unknown correlation and data inconsistency. We aim at providing readers with a unifying view of individual theories and methodologies by presenting a formal analysis of their implications. Finally, several directions of future research are highlighted.
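One classical remedy for unknown cross-correlation is covariance intersection, which fuses two estimates through a convex combination of their information matrices and is consistent for any actual correlation. The sketch below uses illustrative estimates and finds the trace-minimizing weight by a crude grid search.

```python
import numpy as np

def covariance_intersection(x1, P1, x2, P2, omega):
    """Fuse two estimates with unknown cross-correlation via covariance
    intersection: P^-1 = w*P1^-1 + (1-w)*P2^-1."""
    I1, I2 = np.linalg.inv(P1), np.linalg.inv(P2)
    P = np.linalg.inv(omega * I1 + (1 - omega) * I2)
    x = P @ (omega * I1 @ x1 + (1 - omega) * I2 @ x2)
    return x, P

# Two hypothetical local estimates of the same 2-D state.
x1, P1 = np.array([1.0, 0.0]), np.diag([2.0, 1.0])
x2, P2 = np.array([1.2, 0.1]), np.diag([1.0, 3.0])

# Choose omega minimizing the trace of the fused covariance (grid search).
omegas = np.linspace(0.01, 0.99, 99)
traces = [np.trace(covariance_intersection(x1, P1, x2, P2, w)[1])
          for w in omegas]
w_best = omegas[int(np.argmin(traces))]
x_f, P_f = covariance_intersection(x1, P1, x2, P2, w_best)
```

The fused covariance stays positive definite and is never worse (in trace) than the better of the two inputs, which is what makes the method safe when the correlation is unknown.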
Statistical Mechanics and the Climatology of the Arctic Sea Ice Thickness Distribution
Toppaladoddi, Srikanth; Wettlaufer, J. S.
2017-05-01
We study the seasonal changes in the thickness distribution of Arctic sea ice, g(h), under climate forcing. Our analytical and numerical approach is based on a Fokker-Planck equation for g(h) (Toppaladoddi and Wettlaufer in Phys Rev Lett 115(14):148501, 2015), in which the thermodynamic growth rates are determined using observed climatology. In particular, the Fokker-Planck equation is coupled to the observationally consistent thermodynamic model of Eisenman and Wettlaufer (Proc Natl Acad Sci USA 106:28-32, 2009). We find that due to the combined effects of thermodynamics and mechanics, g(h) spreads during winter and contracts during summer. This behavior is in agreement with recent satellite observations from CryoSat-2 (Kwok and Cunningham in Philos Trans R Soc A 373(2045):20140157, 2015). Because g(h) is a probability density function, we quantify all of the key moments (e.g., mean thickness, fraction of thin/thick ice, mean albedo, relaxation time scales) as the greenhouse-gas radiative forcing, ΔF_0, increases. The mean ice thickness decays exponentially with ΔF_0, but much more slowly than in solely thermodynamic models. This exhibits the crucial role that ice mechanics plays in maintaining the ice cover, by redistributing thin ice to thick ice, far more rapidly than thermal growth alone can.
Kutsumi, M.; Terada, K.; Tajima, F.; Kitano, T.
2012-12-01
In order to identify the physical and chemical environmental factors related to fish fauna distribution, we investigated temporal and spatial changes in water quality and fish distribution in the Kaname River, Japan. We measured fish distribution, physical water parameters (temperature, salinity, dissolved oxygen, Chl-a and turbidity) and chemical water parameters (nitrate, nitrite, ammonia, orthophosphate and suspended solids). We conducted multivariate analyses using these observational data and discussed the relationship between water environment parameters and fish habitat distribution.
Wiegel, F.W.; Perelson, Alan S.
1982-01-01
When placed in suspension, red blood cells adhere face-to-face and form long, cylindrical, and sometimes branched structures called rouleaux. We use methods developed in statistical mechanics to compute various statistical properties describing the size and shape of rouleaux in thermodynamic
Qu, Long; Nettleton, Dan; Dekkers, Jack C M
2012-12-01
Given a large number of t-statistics, we consider the problem of approximating the distribution of noncentrality parameters (NCPs) by a continuous density. This problem is closely related to the control of false discovery rates (FDR) in massive hypothesis testing applications, e.g., microarray gene expression analysis. Our methodology is similar to, but improves upon, the existing approach by Ruppert, Nettleton, and Hwang (2007, Biometrics, 63, 483-495). We provide parametric, nonparametric, and semiparametric estimators for the distribution of NCPs, as well as estimates of the FDR and local FDR. In the parametric situation, we assume that the NCPs follow a distribution that leads to an analytically available marginal distribution for the test statistics. In the nonparametric situation, we use convex combinations of basis density functions to estimate the density of the NCPs. A sequential quadratic programming procedure is developed to maximize the penalized likelihood. The smoothing parameter is selected with the approximate network information criterion. A semiparametric estimator is also developed to combine both parametric and nonparametric fits. Simulations show that, under a variety of situations, our density estimates are closer to the underlying truth and our FDR estimates are improved compared with alternative methods. Data-based simulations and the analyses of two microarray datasets are used to evaluate the performance in realistic situations. © 2012, The International Biometric Society.
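The parametric situation the authors describe can be sketched with a normal NCP prior for z-statistics, which yields an analytically available marginal under the alternative and a closed-form local FDR. The parameters below are illustrative and this two-groups sketch is not the authors' estimator.

```python
import numpy as np
from scipy import stats

# Parametric two-groups sketch: a fraction pi0 of statistics are null,
# z ~ N(0, 1); the rest have a noncentrality drawn from N(mu, tau^2),
# so their marginal density is N(mu, 1 + tau^2).
pi0, mu, tau = 0.8, 3.0, 1.0
f0 = stats.norm(0.0, 1.0)
f1 = stats.norm(mu, np.sqrt(1.0 + tau**2))

def local_fdr(z):
    """Posterior probability that a statistic observed at z is null."""
    num = pi0 * f0.pdf(z)
    return num / (num + (1.0 - pi0) * f1.pdf(z))
```

Near z = 0 the local FDR is close to one, while far in the tail it collapses towards zero; nonparametric and semiparametric variants replace the normal NCP prior with a flexible density.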
Khaemba, W.M.; Stein, A.
2001-01-01
This study illustrates the use of modern statistical procedures for better wildlife management by addressing three key issues: determination of abundance, modeling of animal distributions and variability of diversity in space and time. Prior information in Markov Chain Monte Carlo (MCMC) methods is
Directory of Open Access Journals (Sweden)
E. Farg
2017-04-01
Full Text Available Traditional methods for center pivot evaluation depend on the water depth distribution along the pivot arm. Estimating and mapping the water depth under pivot irrigation systems using remote sensing data is essential for calculating the coefficient of uniformity (CU) of water distribution. This study focuses on estimating and mapping water depth using Landsat OLI 8 satellite data integrated with the Heerman and Hein (1968) modified equation for center pivot evaluation. The Landsat OLI 8 image was geometrically and radiometrically corrected to calculate the vegetation and water indices (NDVI and NDWI) in addition to land surface temperature (LST). Results of the statistical analysis showed that the water depth collected in catchment cans is highly negatively correlated with NDVI. On the other hand, water depth was positively correlated with NDWI and LST. Multi-linear regression analysis using the stepwise selection method was applied to estimate and map the water depth distribution. The results showed an R2 and adjusted R2 of 0.93 and 0.88, respectively. Field-level verification of the estimation equation gave a correlation of 0.93 between the collected water depth and the estimated values.
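The index computation and multi-linear estimation step can be sketched with synthetic data whose correlation signs match those reported (negative with NDVI, positive with NDWI and LST); all reflectances and coefficients below are hypothetical, and ordinary least squares stands in for the stepwise selection.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50

# Hypothetical per-can band reflectances and land surface temperature (K).
red = rng.uniform(0.05, 0.3, n)
nir = rng.uniform(0.2, 0.6, n)
swir = rng.uniform(0.1, 0.4, n)
lst = rng.uniform(290, 320, n)

ndvi = (nir - red) / (nir + red)
ndwi = (nir - swir) / (nir + swir)

# Synthetic catch-can depths consistent with the reported signs:
# negative with NDVI, positive with NDWI and LST.
depth = 5 - 8 * ndvi + 6 * ndwi + 0.05 * (lst - 300) + rng.normal(0, 0.3, n)

# Multi-linear least-squares fit (stand-in for stepwise selection).
X = np.column_stack([np.ones(n), ndvi, ndwi, lst])
coef, *_ = np.linalg.lstsq(X, depth, rcond=None)
pred = X @ coef
r2 = 1 - np.sum((depth - pred) ** 2) / np.sum((depth - depth.mean()) ** 2)
```

The fitted coefficients recover the sign structure, and the coefficient of determination plays the role of the R2 reported in the abstract.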
International Nuclear Information System (INIS)
Foray, G.; Descamps-Mandine, A.; R’Mili, M.; Lamon, J.
2012-01-01
The present paper investigates glass fibre flaw size distributions. Two commercial fibre grades (HP and HD) mainly used in cement-based composite reinforcement were studied. Glass fibre fractography is a difficult and time consuming exercise, and thus is seldom carried out. An approach based on tensile tests on multifilament bundles and examination of the fibre surface by atomic force microscopy (AFM) was used. Bundles of more than 500 single filaments each were tested. Thus a statistically significant database of failure data was built up for the HP and HD glass fibres. Gaussian flaw distributions were derived from the filament tensile strength data or extracted from the AFM images. The two distributions were compared. Defect sizes computed from raw AFM images agreed reasonably well with those derived from tensile strength data. Finally, the pertinence of a Gaussian distribution was discussed. The alternative Pareto distribution provided a fair approximation when dealing with AFM flaw size.
Mehrvand, Masoud; Baghanam, Aida Hosseini; Razzaghzadeh, Zahra; Nourani, Vahid
2017-04-01
Since statistical downscaling methods are the most widely used models in hydrologic impact studies under climate change scenarios, nonlinear regression models known as Artificial Intelligence (AI)-based models, such as the Artificial Neural Network (ANN) and Support Vector Machine (SVM), have been used to spatially downscale the precipitation outputs of Global Climate Models (GCMs). The study has been carried out using GCM and station data over GCM grid points located around the Peace-Tampa Bay watershed weather stations. Before downscaling with the AI-based model, correlation coefficients were computed between a few selected large-scale predictor variables and local-scale predictands to select the most effective predictors. The selected predictors were then assessed considering the grid location for the site in question. In order to increase the accuracy of the AI-based downscaling model, pre-processing was applied to the precipitation time series. In this way, the precipitation data derived from various GCMs were analyzed thoroughly to find the highest correlation coefficient between GCM-based historical data and station precipitation data. Both GCM and station precipitation time series have been assessed by comparing means and variances over specific intervals. Results indicated that there is a similar trend between GCM and station precipitation data; however, the station data form a non-stationary time series while the GCM data do not. Finally, the AI-based downscaling model has been applied to several GCMs with selected predictors, targeting the local precipitation time series as the predictand. The outcomes of this step have been used to produce multiple ensembles of downscaled AI-based models.
DEFF Research Database (Denmark)
Missov, Trifon I.; Schöley, Jonas
Missov and Finkelstein (2011) prove an Abelian and its corresponding Tauberian theorem regarding distributions for modeling unobserved heterogeneity in fixed-frailty mixture models. The main property of such distributions is the regular variation at zero of their densities. According to this criterion admissible distributions are, for example, the gamma, the beta, the truncated normal, the log-logistic and the Weibull, while distributions like the log-normal and the inverse Gaussian do not satisfy this condition. In this article we show that models with admissible frailty distributions and a Gompertz baseline provide a better fit to adult human mortality data than the corresponding models with non-admissible frailty distributions. We implement estimation procedures for mixture models with a Gompertz baseline and frailty that follows a gamma, truncated normal, log-normal, or inverse Gaussian distribution.
SAMDIST A Computer Code for Calculating Statistical Distributions for R-Matrix Resonance Parameters
Leal, L C
1995-01-01
The SAMDIST computer code has been developed to calculate distributions of resonance parameters of the Reich-Moore R-matrix type. The program assumes the parameters are in a format compatible with that of the multilevel R-matrix code SAMMY. SAMDIST calculates the energy-level spacing distribution, the resonance width distribution, and the long-range correlation of the energy levels. Results of these calculations are presented in both graphic and tabular forms.
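The core of the level-spacing calculation SAMDIST performs can be sketched as follows (the energies are synthetic; SAMDIST would read real Reich-Moore parameters from a SAMMY-format file, and the Wigner surmise is shown only as the standard reference curve):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic resonance energies (eV); SAMDIST would read real Reich-Moore
# parameters from a SAMMY-format file instead.
energies = np.sort(rng.uniform(0.0, 1000.0, size=500))

# Nearest-neighbour level spacings, normalised to unit mean: the quantity
# whose distribution SAMDIST tabulates and plots.
s = np.diff(energies)
s = s / s.mean()
hist, edges = np.histogram(s, bins=25, range=(0.0, 4.0), density=True)

# Reference curve for comparison: the GOE Wigner surmise
#   P(s) = (pi/2) * s * exp(-pi * s**2 / 4), which also has unit mean.
centers = (edges[:-1] + edges[1:]) / 2.0
wigner = (np.pi / 2.0) * centers * np.exp(-np.pi * centers**2 / 4.0)

print(round(float(s.mean()), 6))  # 1.0 by construction
```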
Statistical distribution of partial widths in the microscopic theory of nuclear reactions
International Nuclear Information System (INIS)
Bunakov, V.E.; Ogloblin, S.G.
1978-01-01
Using the microscopic theory of nuclear reactions, the distribution function of reduced neutron partial widths is obtained. It is shown that the distribution of reduced partial widths of a radiative transition is of the same form. The distribution obtained differs from the Porter-Thomas law for neutron widths only in the presence of intermediate structures. It is noteworthy that the presence of an intermediate structure leads to a greater dispersion.
Distributed Secure Coordinated Control for Multiagent Systems Under Strategic Attacks.
Feng, Zhi; Wen, Guanghui; Hu, Guoqiang
2017-05-01
This paper studies a distributed secure consensus tracking control problem for multiagent systems subject to strategic cyber attacks modeled by a random Markov process. A hybrid stochastic secure control framework is established for designing a distributed secure control law such that mean-square exponential consensus tracking is achieved. A connectivity restoration mechanism is considered, and the properties of attack frequency and attack length rate are investigated, respectively. Based on the solutions of an algebraic Riccati equation and an algebraic Riccati inequality, a procedure to select the control gains is provided, and stability analysis is carried out using Lyapunov's method. The effect of strategic attacks on discrete-time systems is also investigated. Finally, numerical examples are provided to illustrate the effectiveness of the theoretical analysis.
Optimal Power Flow for Distribution Systems under Uncertain Forecasts: Preprint
Energy Technology Data Exchange (ETDEWEB)
Dall' Anese, Emiliano; Baker, Kyri; Summers, Tyler
2016-12-01
The paper focuses on distribution systems featuring renewable energy sources and energy storage devices, and develops an optimal power flow (OPF) approach to optimize the system operation in spite of forecasting errors. The proposed method builds on a chance-constrained multi-period AC OPF formulation, where probabilistic constraints are utilized to enforce voltage regulation with a prescribed probability. To enable a computationally affordable solution approach, a convex reformulation of the OPF task is obtained by resorting to i) pertinent linear approximations of the power flow equations, and ii) convex approximations of the chance constraints. Particularly, the approximate chance constraints provide conservative bounds that hold for arbitrary distributions of the forecasting errors. An adaptive optimization strategy is then obtained by embedding the proposed OPF task into a model predictive control framework.
Kleibergen, F.R.; Kleijn, R.; Paap, R.
2000-01-01
We propose a novel Bayesian test under a (noninformative) Jeffreys' prior specification. We check whether the fixed scalar value of the so-called Bayesian Score Statistic (BSS) under the null hypothesis is a plausible realization from its known and standardized distribution under the alternative. Unlike ...
Spatial distributions at equilibrium under heterogeneous transient subdiffusion.
Berry, Hugues; Soula, Hédi A
2014-01-01
Experimental measurements of the mobility of macromolecules, especially proteins, in cells and their membranes consistently report transient subdiffusion with possibly position-dependent-non-homogeneous-properties. However, the spatiotemporal dynamics of protein mobility when transient subdiffusion is restricted to a subregion of space is still unclear. Here, we investigated the spatial distribution at equilibrium of proteins undergoing transient subdiffusion due to continuous-time random walks (CTRW) in a restricted subregion of a two-dimensional space. Our Monte-Carlo simulations suggest that this process leads to a non-homogeneous spatial distribution of the proteins at equilibrium, where proteins increasingly accumulate in the CTRW subregion as its anomalous properties are increasingly marked. In the case of transient CTRW, we show that this accumulation is dictated by the asymptotic Brownian regime and not by the initial anomalous transient dynamics. Moreover, our results also show that this dominance of the asymptotic Brownian regime cannot be simply generalized to other scenarios of transient subdiffusion. In particular, non-homogeneous transient subdiffusion due to hindrance by randomly-located immobile obstacles does not lead to such a strong local accumulation. These results suggest that, even though they exhibit the same time-dependence of the mean-squared displacement, the different scenarios proposed to account for subdiffusion in the cell lead to different protein distribution in space, even at equilibrium and without coupling with reaction.
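A minimal 1-D sketch of the CTRW mechanism described above (the paper works in 2-D with a restricted CTRW subregion; this strips the process down to its core, and the parameter values are illustrative):

```python
import random

random.seed(42)

# Continuous-time random walk in 1-D. Waiting times are Pareto-tailed:
# w = u**(-1/alpha) with u uniform on (0, 1]. For alpha < 1 the mean
# waiting time diverges, which is what produces subdiffusion.
alpha = 0.5
n_steps = 10_000

t, x = 0.0, 0
for _ in range(n_steps):
    u = 1.0 - random.random()            # uniform on (0, 1]
    t += u ** (-1.0 / alpha)             # heavy-tailed waiting time, >= 1
    x += random.choice((-1, 1))          # unit jump

# The walker makes n_steps jumps, but the elapsed time vastly exceeds
# n_steps because every waiting time is at least 1 and many are huge.
print(abs(x) <= n_steps, t > n_steps)    # True True
```

In the paper's setting, jumps inside the CTRW subregion would use this heavy-tailed waiting-time law while jumps outside would use a finite-mean law, and the equilibrium occupation of each region would be measured over an ensemble of walkers.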
Spatial distributions at equilibrium under heterogeneous transient subdiffusion
Directory of Open Access Journals (Sweden)
Hugues eBerry
2014-11-01
Full Text Available Experimental measurements of the mobility of macromolecules, especially proteins, in cells and their membranes consistently report transient subdiffusion with possibly position-dependent -- nonhomogeneous -- properties. However, the spatiotemporal dynamics of protein mobility when transient subdiffusion is restricted to a subregion of space is still unclear. Here, we investigated the spatial distribution at equilibrium of proteins undergoing transient subdiffusion due to continuous-time random walks (CTRW) in a restricted subregion of a two-dimensional space. Our Monte-Carlo simulations suggest that this process leads to a nonhomogeneous spatial distribution of the proteins at equilibrium, where proteins increasingly accumulate in the CTRW subregion as its anomalous properties are increasingly marked. In the case of transient CTRW, we show that this accumulation is dictated by the asymptotic Brownian regime and not by the initial anomalous transient dynamics. Moreover, our results also show that this dominance of the asymptotic Brownian regime cannot be simply generalized to other scenarios of transient subdiffusion. In particular, nonhomogeneous transient subdiffusion due to hindrance by randomly-located immobile obstacles does not lead to such a strong local accumulation. These results suggest that, even though they exhibit the same time-dependence of the mean-squared displacement, the different scenarios proposed to account for subdiffusion in the cell lead to different protein distribution in space, even at equilibrium and without coupling with reaction.
Schneider, Markus P. A.
This dissertation contributes to two areas in economics: the understanding of the distribution of earned income and the Bayesian analysis of distributional data. Recently, physicists claimed that the distribution of earned income is exponential (see Yakovenko, 2009). The first chapter explores the perspective that the economy is a statistical mechanical system, and the implication for labor market outcomes is considered critically. The robustness of the empirical results that led to the physicists' claims, the significance of the exponential distribution in statistical mechanics, and the case for a conservation law in economics are discussed. The conclusion reached is that the physicists' conception of the economy is too narrow even within their chosen framework, but that their overall approach is insightful. The dual labor market theory of segmented labor markets is invoked to understand why the observed distribution may be a mixture of distributional components, corresponding to the different generating mechanisms described in Reich et al. (1973). The application of informational entropy in chapter II connects this work to Bayesian analysis and maximum entropy econometrics. The analysis follows E. T. Jaynes's treatment of Wolf's dice data, but is applied to the distribution of earned income based on CPS data. The results are calibrated to account for rounded survey responses using a simple simulation, and answer the graphical analyses by the physicists. The results indicate that neither the income distribution of all respondents nor that of the subpopulation used by the physicists appears to be exponential. The empirics do support the claim that a mixture with exponential and log-normal distributional components fits the data. In the final chapter, a log-linear model is used to fit the exponential to the earned income distribution. Separating the CPS data by gender and marital status reveals that the exponential is only an appropriate model for a limited number of subpopulations, namely
The Profit Distribution of Supply Chain under E-Commerce
Directory of Open Access Journals (Sweden)
Jiang-Hua Zhang
2014-01-01
Full Text Available With the development of e-commerce, its influence on the supply chain and supply chain management is becoming increasingly significant. In this paper, the literature on supply chain profit is reviewed first, and then a two-level, four-party supply chain consisting of a supplier, an e-commerce platform, third-party logistics, and a demander is taken into consideration. The profit function of the supply chain under e-commerce is formulated by taking the price of the product and the maximum supply amount under certain investment as decision-making variables and taking the expected value of the random price variable as the planned sales quantity. Finally, the existence of a maximum profit for the supply chain is proved in the model, and coordination of the supply chain under the e-commerce environment can be achieved by setting coordination parameters when the relevant cost parameters of the supply chain members satisfy certain conditions.
International Nuclear Information System (INIS)
Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung; Noh, Jae Man
2013-01-01
The uncertainty evaluation with the statistical method is performed by repeating the transport calculation while sampling the directly perturbed nuclear data; a reliable uncertainty result can then be obtained by analyzing the results of the numerous transport calculations. One known problem in uncertainty analysis with the statistical approach is that sampling cross sections from a normal (Gaussian) distribution with a relatively large standard deviation leads to sampling errors, such as the sampling of negative cross sections. Some correction methods have been noted; however, these methods can distort the distribution of the sampled cross sections. In this study, a sampling method for the nuclear data using the lognormal distribution is proposed. The criticality calculations with the sampled nuclear data are then performed, and the results are compared with those from the normal distribution conventionally used in previous studies. The statistical sampling method with the lognormal distribution was proposed to increase the sampling accuracy without negative sampling errors, and a stochastic cross-section sampling and writing program was developed. For the sensitivity and uncertainty analysis, the cross-section sampling was pursued with both the normal and lognormal distributions. The uncertainties caused by the covariance of the (n,γ) cross sections were evaluated by solving the GODIVA problem. The results show that the sampling method with the lognormal distribution can efficiently solve the negative sampling problem referred to in previous studies. It is expected that this study will contribute to increasing the accuracy of sampling-based uncertainty analysis.
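A minimal sketch of the moment-matched lognormal sampling idea (the barn-scale numbers are illustrative; the study samples actual evaluated nuclear data together with their covariances):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical cross section: mean 1.0 barn with a 60% relative standard
# deviation, large enough that normal sampling frequently goes negative.
mean, std = 1.0, 0.6

# Match the first two moments of the lognormal to (mean, std):
#   sigma^2 = ln(1 + (std/mean)^2),   mu = ln(mean) - sigma^2 / 2.
sigma2 = np.log(1.0 + (std / mean) ** 2)
mu = np.log(mean) - sigma2 / 2.0

lognormal = rng.lognormal(mean=mu, sigma=np.sqrt(sigma2), size=200_000)
normal = rng.normal(mean, std, size=200_000)

print(lognormal.min() > 0.0)          # no negative cross sections by construction
print(bool((normal < 0.0).any()))     # the normal sampler does produce negatives
print(round(float(lognormal.mean()), 2), round(float(lognormal.std()), 2))
```

The lognormal samples are strictly positive yet reproduce the prescribed mean and standard deviation, which is the property the study exploits.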
Tsui, Po-Hsiang; Wan, Yung-Liang; Tai, Dar-In; Shu, Yu-Chen
2015-08-01
Ultrasound Nakagami imaging has recently attracted interest as an imaging technique for analyzing envelope statistics. Because the presence of structures has a strong effect on estimation of the Nakagami parameter, previous studies have indicated that Nakagami imaging should be used specifically for characterization of soft tissues with fewer structures, such as liver tissues. Typically, changes in the properties of the liver parenchyma cause the backscattered statistics to transform from a Rayleigh distribution to a pre-Rayleigh distribution, and this transformation can be visualized using a Nakagami imaging technique. However, different estimators result in different estimated values; thus, the performance of a Nakagami image may depend on the type of estimator used. This study explored the effects of various estimators on ultrasound Nakagami imaging to describe the backscattered statistics as they change from a Rayleigh distribution to a pre-Rayleigh distribution. Simulations and clinical measurements involving patients with liver fibrosis (n = 85) yielded image data that were used to construct B-mode and conventional Nakagami images based on the moment estimator (denoted as mINV images) and maximum-likelihood estimator (denoted as mML images). In addition, novel window-modulated compounding Nakagami images based on the moment estimator (denoted as mWMC images) were also obtained. The means and standard deviations of the Nakagami parameters were examined as a function of the backscattered statistics. The experimental results indicate that the mINV, mML and mWMC images enabled quantitative visualization of the change in backscattered statistics from a Rayleigh distribution to a pre-Rayleigh distribution. Importantly, the mWMC image is superior to both mINV and mML images because it simultaneously realizes sensitive detection of the backscattered statistics and a reduction of estimation variance for image smoothness improvement. We therefore recommend using m
Phosphorus distribution in sandy soil profile under drip irrigation system
International Nuclear Information System (INIS)
El-Gendy, R.W.; Rizk, M.A.; Abd El Moniem, M.; Abdel-Aziz, H.A.; Fahmi, A.E.
2009-01-01
This work aims at studying the impact of irrigation water applied using a drip irrigation system in sandy soil cultivated with snap bean on phosphorus distribution. The experiment was carried out at the Soils and Water Research Department farm, Nuclear Research Center, Atomic Energy Authority, Cairo, Egypt. Snap bean was cultivated in sandy soil and irrigated with 50, 37.5 and 25 cm of water in three treatments representing 100, 75 and 50% of ETc. Phosphorus distribution and the direction of soil water movement were detected at three sites along the dripper line (S1, S2 and S3, at 0, 12.5 and 25 cm distance from the dripper). Phosphorus fertilizer (superphosphate, 15.5% P2O5, at a rate of 300 kg/fed) was added before cultivation. A neutron probe was used to detect the water distribution and movement at the three sites along the soil profile. Soil samples were collected before P-addition and at the end of the developing, mid, and late growth stages to determine residual available phosphorus. The obtained data showed that using 50 cm of irrigation water caused an increase in P-concentration down to 75 cm depth at the three sites of the 100% ETc treatment, and covered the P-requirements of snap bean for all growth stages. The 37.5 and 25 cm irrigation treatments could not cover the P-requirements of snap bean for all growth stages. It could be concluded that the applied irrigation water drove the residual P down to 75 cm depth at the three sites. Yield of the crop was taken as an indicator and showed good response according to the water quantities and the P-transportation within the soil profile.
Li, Qizhai; Hu, Jiyuan; Ding, Juan; Zheng, Gang
2014-04-01
A classical approach to combining independent test statistics is Fisher's combination of $p$-values, which follows the $\chi^2$ distribution. When the test statistics are dependent, the gamma distribution (GD) is commonly used for the Fisher's combination test (FCT). We propose to use two generalizations of the GD: the generalized and the exponentiated GDs. We study some properties of mis-using the GD for the FCT to combine dependent statistics when one of the two proposed distributions is true. Our results show that both generalizations have better control of type I error rates than the GD, which tends to have inflated type I error rates at more extreme tails. In practice, common model selection criteria (e.g. the Akaike information criterion/Bayesian information criterion) can be used to help select a better distribution for the FCT. A simple strategy for applying the two generalizations of the GD in genome-wide association studies is discussed. Applications of the results to genetic pleiotropic associations are described, where multiple traits are tested for association with a single marker.
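A runnable sketch of the two ingredients above: the classical chi-square combination for independent p-values, and a moment-matched gamma for the dependent case (the p-values, the equicorrelated-normal dependence structure, and rho = 0.5 are all made-up illustrations, not the paper's data):

```python
import math
import numpy as np

rng = np.random.default_rng(3)

def chi2_sf_even(x, dof):
    """Chi-square survival function for even dof (closed-form series)."""
    k = dof // 2
    term = math.exp(-x / 2.0)
    total, acc = term, term
    for i in range(1, k):
        acc *= (x / 2.0) / i
        total += acc
    return total

# Classical Fisher combination of k independent p-values: T ~ chi2(2k).
pvals = [0.01, 0.20, 0.03, 0.40]
T = -2.0 * sum(math.log(p) for p in pvals)
p_indep = chi2_sf_even(T, 2 * len(pvals))
print(round(p_indep, 4))  # ≈ 0.0065

# Under dependence, fit a gamma to T's null moments (the GD route the
# abstract generalises); the null here is simulated from equicorrelated
# normals, an assumed dependence structure.
rho, k = 0.5, len(pvals)
cov = np.full((k, k), rho) + (1.0 - rho) * np.eye(k)
z = rng.multivariate_normal(np.zeros(k), cov, size=50_000)
p_null = np.vectorize(math.erfc)(np.abs(z) / math.sqrt(2.0))  # two-sided p
T_null = -2.0 * np.log(p_null).sum(axis=1)
m, v = T_null.mean(), T_null.var()
shape, scale = m * m / v, v / m        # moment-matched gamma parameters
p_dep = float((T_null >= T).mean())    # empirical combined p under dependence
print(0.0 < shape < 4.0, 0.0 < p_dep < 1.0)  # True True
```

Positive dependence inflates the variance of T, so the fitted shape drops below the independent value k = 4 and the combined p-value becomes less extreme, which is exactly the type I error issue the abstract addresses.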
International Nuclear Information System (INIS)
Gomez, Miryam; Saldarriaga, Julio; Correa, Mauricio; Posada, Enrique; Castrillon M, Francisco Javier
2007-01-01
Sand fields, constructions, carbon boilers, roads, and biologic sources, among others, are factors contributing air contaminants in downtown Valle de Aburra. The distribution of road-contribution data to total suspended particles, according to the source-receptor model MCF (source correlation modeling), is nearly a gamma distribution. A chi-square goodness-of-fit test is used for the statistical modeling. This test also allows estimating the parameters of the distribution using the maximum likelihood method; as the convergence criterion, the expectation-maximization algorithm is used. The mean of the road-contribution data to total suspended particles according to the source-receptor model MCF is straightforward to obtain and validates the road contribution factor to the atmospheric pollution of the zone under study.
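A sketch of the distribution-fitting step with synthetic data (the article uses maximum likelihood within an EM scheme; method-of-moments estimates are used below as a simpler stand-in, and the gamma parameters chosen are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic stand-in for the road-contribution data (µg/m^3); real values
# would come from the MCF source-receptor model for the monitoring sites.
data = rng.gamma(shape=2.0, scale=15.0, size=5_000)

# Method-of-moments gamma estimates:
#   shape k = mean^2 / var,   scale theta = var / mean.
m, v = data.mean(), data.var(ddof=1)
k_hat, theta_hat = m * m / v, v / m

# A chi-square goodness-of-fit test would then bin `data` and compare the
# observed counts with expected counts from the fitted gamma CDF.
print(abs(k_hat - 2.0) < 0.3, abs(theta_hat - 15.0) < 2.0)  # True True
```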
Assessing mechanical vulnerability in water distribution networks under multiple failures
Berardi, Luigi; Ugarelli, Rita; Røstum, Jon; Giustolisi, Orazio
2014-03-01
Understanding the mechanical vulnerability of water distribution networks (WDN) is of direct relevance for water utilities since it serves two different purposes. On the one hand, it might support the identification of severe failure scenarios due to external causes (e.g., natural or intentional events) which result in the most critical consequences for WDN supply capacity. On the other hand, it aims at identifying the WDN portions which are more prone to be affected by asset disruptions. The complexity of such analysis stems from the number of possible scenarios with single and multiple simultaneous shutdowns of asset elements leading to modifications of network topology and insufficient water supply to customers. In this work, the search for the most disruptive combinations of multiple asset failure events is formulated and solved as a multiobjective optimization problem. The most vulnerable failure scenarios are detected as those causing the lowest supplied demand with the fewest simultaneous failures. The automatic detection of WDN topology, subsequent to the detachment of failed elements, is combined with pressure-driven analysis. The methodology is demonstrated on a real water distribution network. Results show that, besides the failures causing the detachment of reservoirs, tanks, or pumps, there are other topological modifications which may cause severe WDN service disruptions. Such information is of direct relevance to support the planning of asset enhancement works and improve preparedness for extreme events.
Distributionally Robust Joint Chance Constrained Problem under Moment Uncertainty
Directory of Open Access Journals (Sweden)
Ke-wei Ding
2014-01-01
Full Text Available We discuss and develop the convex approximation for robust joint chance constraints under uncertainty of the first- and second-order moments. Robust chance constraints are approximated by worst-case CVaR constraints, which can be reformulated as a semidefinite program; the chance-constrained problem can then be presented as a semidefinite program. We also find that the approximation for robust joint chance constraints has an equivalent individual quadratic approximation form.
Smooth conditional distribution function and quantiles under random censorship.
Leconte, Eve; Poiraud-Casanova, Sandrine; Thomas-Agnan, Christine
2002-09-01
We consider a nonparametric random design regression model in which the response variable is possibly right censored. The aim of this paper is to estimate the conditional distribution function and the conditional alpha-quantile of the response variable. We restrict attention to the case where the response variable as well as the explanatory variable are unidimensional and continuous. We propose and discuss two classes of estimators which are smooth with respect to the response variable as well as to the covariate. Some simulations demonstrate that the new methods have better mean square error performances than the generalized Kaplan-Meier estimator introduced by Beran (1981) and considered in the literature by Dabrowska (1989, 1992) and Gonzalez-Manteiga and Cadarso-Suarez (1994).
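The benchmark the abstract compares against, the generalized Kaplan-Meier (Beran) estimator, reduces to the ordinary Kaplan-Meier product-limit estimator when the covariate is ignored. A minimal sketch of that building block (the sample values are made up):

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate.

    times: observed times; events: 1 = event observed, 0 = right-censored.
    Returns (time, S(time)) at each observed event time.
    """
    # Sort by time; at ties, process events before censorings (the usual
    # Kaplan-Meier convention).
    pairs = sorted(zip(times, events), key=lambda p: (p[0], -p[1]))
    at_risk = len(pairs)
    surv = 1.0
    steps = []
    for t, d in pairs:
        if d == 1:
            surv *= (at_risk - 1) / at_risk
            steps.append((t, round(surv, 3)))
        at_risk -= 1
    return steps

# Tiny made-up sample: one tie at t=3 with one event and one censoring.
print(kaplan_meier([2, 3, 3, 5, 8], [1, 1, 0, 1, 0]))
# → [(2, 0.8), (3, 0.6), (5, 0.3)]
```

The Beran generalization replaces the equal weights 1/n with kernel weights in the covariate, and the paper's smooth estimators additionally smooth in the response direction.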
Distributed Generation Investment by a Microgrid under Uncertainty
Energy Technology Data Exchange (ETDEWEB)
Siddiqui, Afzal; Marnay, Chris
2008-08-11
This paper examines a California-based microgrid's decision to invest in a distributed generation (DG) unit fuelled by natural gas. While the long-term natural gas generation cost is stochastic, we initially assume that the microgrid may purchase electricity at a fixed retail rate from its utility. Using the real options approach, we find a natural gas generation cost threshold that triggers DG investment. Furthermore, the consideration of operational flexibility by the microgrid increases DG investment, while the option to disconnect from the utility is not attractive. By allowing the electricity price to be stochastic, we next determine an investment threshold boundary and find that high electricity price volatility relative to that of natural gas generation cost delays investment while simultaneously increasing the value of the investment. We conclude by using this result to find the implicit option value of the DG unit when two sources of uncertainty exist.
Distributed generation investment by a microgrid under uncertainty
International Nuclear Information System (INIS)
Siddiqui, Afzal S.; Marnay, Chris
2008-01-01
This paper examines a California-based microgrid's decision to invest in a distributed generation (DG) unit fuelled by natural gas. While the long-term natural gas generation cost is stochastic, we initially assume that the microgrid may purchase electricity at a fixed retail rate from its utility. Using the real options approach, we find a natural gas generation cost threshold that triggers DG investment. Furthermore, the consideration of operational flexibility by the microgrid increases DG investment, while the option to disconnect from the utility is not attractive. By allowing the electricity price to be stochastic, we next determine an investment threshold boundary and find that high electricity price volatility relative to that of natural gas generation cost delays investment while simultaneously increasing the value of the investment. We conclude by using this result to find the implicit option value of the DG unit when two sources of uncertainty exist. (author)
One of the main uses of biomarker measurements is to compare different populations to each other and to assess risk in comparison to established parameters. This is most often done using summary statistics such as central tendency, variance components, confidence intervals, excee...
Statistical PERT: Improvements in the Determination of the Project Completion Time Distribution
1978-08-01
Sample Size Requirements for Assessing Statistical Moments of Simulated Crop Yield Distributions
Lehmann, N.; Finger, R.; Klein, T.; Calanca, P.
2013-01-01
Mechanistic crop growth models are becoming increasingly important in agricultural research and are extensively used in climate change impact assessments. In such studies, statistics of crop yields are usually evaluated without the explicit consideration of sample size requirements. The purpose of
Investment and Upgrade in Distributed Generation under Uncertainty
Energy Technology Data Exchange (ETDEWEB)
Siddiqui, Afzal; Maribu, Karl
2008-08-18
The ongoing deregulation of electricity industries worldwide is providing incentives for microgrids to use small-scale distributed generation (DG) and combined heat and power (CHP) applications via heat exchangers (HXs) to meet local energy loads. Although the electric-only efficiency of DG is lower than that of central-station production, relatively high tariff rates and the potential for CHP applications increase the attraction of on-site generation. Nevertheless, a microgrid contemplating the installation of gas-fired DG has to be aware of the uncertainty in the natural gas price. Treatment of uncertainty via real options increases the value of the investment opportunity, which then delays the adoption decision as the opportunity cost of exercising the investment option increases as well. In this paper, we take the perspective of a microgrid that can proceed in a sequential manner with DG capacity and HX investment in order to reduce its exposure to risk from natural gas price volatility. In particular, with the availability of the HX, the microgrid faces a tradeoff between reducing its exposure to the natural gas price and maximising its cost savings. By varying the volatility parameter, we find that the microgrid prefers a direct investment strategy for low levels of volatility and a sequential one for higher levels of volatility.
Investment and upgrade in distributed generation under uncertainty
International Nuclear Information System (INIS)
Siddiqui, Afzal S.; Maribu, Karl
2009-01-01
The ongoing deregulation of electricity industries worldwide is providing incentives for microgrids to use small-scale distributed generation (DG) and combined heat and power (CHP) applications via heat exchangers (HXs) to meet local energy loads. Although the electric-only efficiency of DG is lower than that of central-station production, relatively high tariff rates and the potential for CHP applications increase the attraction of on-site generation. Nevertheless, a microgrid contemplating the installation of gas-fired DG has to be aware of the uncertainty in the natural gas price. Treatment of uncertainty via real options increases the value of the investment opportunity, which then delays the adoption decision as the opportunity cost of exercising the investment option increases as well. In this paper, we take the perspective of a microgrid that can proceed in a sequential manner with DG capacity and HX investment in order to reduce its exposure to risk from natural gas price volatility. In particular, with the availability of the HX, the microgrid faces a tradeoff between reducing its exposure to the natural gas price and maximising its cost savings. By varying the volatility parameter, we find that the microgrid prefers a direct investment strategy for low levels of volatility and a sequential one for higher levels of volatility. (author)
Some aspects of statistical distribution of trace element concentrations in biomedical samples
Energy Technology Data Exchange (ETDEWEB)
Majewska, U. E-mail: majewska@pu.kielce.pl; Braziewicz, J.; Banas, D.; Kubala-Kukus, A.; Gozdz, S.; Pajek, M.; Zadrozna, M.; Jaskola, M.; Czyzewski, T
1999-04-02
Concentrations of trace elements in biomedical samples were studied using X-ray fluorescence (XRF), total reflection X-ray fluorescence (TRXRF) and particle-induced X-ray emission (PIXE) methods. The analytical methods used were compared in terms of their detection limits and applicability for studying trace elements in large populations of biomedical samples. As a result, the XRF and TRXRF methods were selected for the trace element concentration measurements in the urine and full-term placenta samples from women. The measured trace element concentration distributions were found to be strongly asymmetric and described by the logarithmic-normal distribution. Such a distribution is expected for a random sequential process, which realistically models the level of trace elements in the studied biomedical samples. The importance and consequences of this finding are discussed, especially in the context of comparing concentration measurements in different populations of biomedical samples.
Control of power converters in distributed generation applications under grid fault conditions
DEFF Research Database (Denmark)
Rodriguez, Pedro; Luna, Alvaro; Munoz-Aguilar, Raul
2011-01-01
The operation of distributed power generation systems under grid fault conditions is a key issue for the massive integration of renewable energy systems. Several studies have been conducted to improve the response of such distributed generation systems under voltage dips. In spite of being less s...
DEFF Research Database (Denmark)
Rodriguez, Pedro; Luna, Alvaro; Hermoso, Juan Ramon
2011-01-01
The operation of distributed power generation systems under grid fault conditions is a key issue for the massive integration of renewable energy systems. Several studies have been conducted to improve the response of such distributed generation systems under voltage dips. In spite of being less s...
Differential spatial distribution of arthropods under epiphytic lichens on trees
Directory of Open Access Journals (Sweden)
Jean-Jacques Itzhak Martinez
2014-08-01
Full Text Available Epiphytic lichen thalli on trees may shelter arthropods, both herbivores and their natural enemies. Although the relationships between lichens on the forest floor and arthropods have been widely studied in boreal regions, those between epiphytic lichens and the arboreal arthropod fauna in temperate and Mediterranean climates are poorly investigated. In particular, it is unknown whether the animals make different use of lichens located on different parts of the trees. Our results indicate that numerous arthropods, herbivores and predators, may live in the epiphytic lichen cover, and that more of them are found on the trunk than on old or young branches: an average of 2000 individuals were found under each square meter of thallus covering the trunk of 20 trees, but fewer on branches. In particular, more insects from more orders were detected on trunks than on branches. We propose that this issue should be investigated further to clarify the exact status of epiphytic lichens in arthropod biodiversity conservation.
Production-distribution of electric power in France: 1997-98 statistical data
International Nuclear Information System (INIS)
1999-01-01
This document was prepared using the annual inquiry carried out by the French direction of gas, electricity and coal (Digec). It brings together the main statistical data about the production, transport and consumption of electric power in France: 1997 and 1998 balance sheets, foreign exchanges, long-term evolutions, production with respect to the different energy sources, and consumption in the different departments and regions. (J.S.)
Temperature distribution in the human body under various conditions of induced hyperthermia
Korobko, O. V.; Perelman, T. L.; Fradkin, S. Z.
1977-01-01
A mathematical model based on heat balance equations was developed for studying temperature distribution in the human body under deep hyperthermia which is often induced in the treatment of malignant tumors. The model yields results which are in satisfactory agreement with experimental data. The distribution of temperature under various conditions of induced hyperthermia, i.e. as a function of water temperature and supply rate, is examined on the basis of temperature distribution curves in various body zones.
Guala, M.; Liu, M.
2017-12-01
The kinematics of sediment particles is investigated by non-intrusive imaging methods to provide a statistical description of bedload transport in conditions near the threshold of motion. In particular, we focus on the cyclic transition between motion and rest regimes to quantify the waiting time statistics inferred to be responsible for anomalous diffusion, and so far elusive. Despite obvious limitations in the spatio-temporal domain of the observations, we are able to identify the probability distributions of the particle step time and length, velocity, acceleration, waiting time, and thus distinguish which quantities exhibit well converged mean values, based on the thickness of their respective tails. The experimental results shown here for four different transport conditions highlight the importance of the waiting time distribution and represent a benchmark dataset for the stochastic modeling of bedload transport.
Directory of Open Access Journals (Sweden)
B. Azzouz
2007-01-01
Full Text Available A textile fibre mixture, being a multicomponent blend of variable fibres, requires a proper method to predict the characteristics of the final blend. The length diagram and the fibrogram of cotton are generated. Then the length distribution, the length diagram, and the fibrogram of a blend of different categories of cotton are determined. The length distributions by weight of five different categories of cotton (Egyptian, USA (Pima), Brazilian, USA (Upland), and Uzbekistani) are measured by AFIS. From these distributions, the length distribution, the length diagram, and the fibrogram by weight of four binary blends are expressed. The length parameters of these cotton blends are calculated and their variations are plotted against the mass fraction x of one component in the blend. These calculated parameters are compared to those of real blends. Finally, the selection of optimal blends by the linear programming method, based on the hypothesis that the cotton blend parameters vary linearly as a function of the component ratios, is shown to be insufficient.
Townsend, J T
1990-11-01
A theory is presented that establishes a dominance hierarchy of potential distinctions (order relations) between two distributions. It is proposed that it is worthwhile for researchers to ascertain the strongest possible distinction, because all weaker distinctions are logically implied. Implications of the theory for hypothesis testing, theory construction, and scales of measurement are considered. Open problems for future research are outlined.
HI column density distribution function at z=0 : Connection to damped Ly alpha statistics
Zwaan, Martin; Verheijen, MAW; Briggs, FH
We present a measurement of the HI column density distribution function f(N-HI) at the present epoch for column densities > 10(20) cm(-2). These high column densities compare to those measured in damped Ly alpha lines seen in absorption against background quasars. Although observationally rare, it
Distribution of Oxycephalidae (Hyperiidea-Amphipoda) in the Indian Ocean- A statistical study
Digital Repository Service at National Institute of Oceanography (India)
Nair, K.K.C.; Jayalakshmy, K.V.
the species total abundance. The H(S) and E distributions indicate that there was high dominance of a few species in the four areas, leading to low species diversity. Significant seasonal variation in species composition was noticed in the SE and SW Indian Ocean...
Directory of Open Access Journals (Sweden)
Jean-Michel eHupé
2015-02-01
Full Text Available Published studies using functional and structural MRI include many errors in the way data are analyzed and conclusions are reported. This was observed while working on a comprehensive review of the neural bases of synesthesia, but these errors are probably endemic to neuroimaging studies. All of the studies reviewed based their conclusions on Null Hypothesis Significance Tests (NHST). NHST have been criticized since their inception because they are more appropriate for making decisions about a null hypothesis (as in manufacturing) than for making inferences about behavioral and neuronal processes. Here I focus on a few key problems of NHST related to brain imaging techniques, and explain why or when we should not rely on significance tests. I also observed that the ill-posed logic of NHST was often not even correctly applied, and I describe what I identified as common mistakes, or at least problematic practices, in published papers, in light of what could be considered the very basics of statistical inference. MRI statistics also involve much more complex issues than standard statistical inference. Analysis pipelines vary greatly between studies, even among those using the same software, and there is no consensus on which pipeline is best. I propose a synthetic view of the logic behind the possible methodological choices, and warn against the usage and interpretation of two statistical methods popular in brain imaging studies: the false discovery rate (FDR) procedure and permutation tests. I suggest that current models for the analysis of brain imaging data suffer from serious limitations, and I call for a revision that takes into account the "new statistics" (confidence intervals) logic.
Lee, L.; Helsel, D.
2007-01-01
Analysis of low concentrations of trace contaminants in environmental media often results in left-censored data that are below some limit of analytical precision. Interpretation of values becomes complicated when there are multiple detection limits in the data, perhaps as a result of changing analytical precision over time. Parametric and semi-parametric methods, such as maximum likelihood estimation and robust regression on order statistics, can be employed to model distributions of multiply censored data and provide estimates of summary statistics. However, these methods are based on assumptions about the underlying distribution of data. Nonparametric methods provide an alternative that does not require such assumptions. A standard nonparametric method for estimating summary statistics of multiply-censored data is the Kaplan-Meier (K-M) method. This method has seen widespread usage in the medical sciences within a general framework termed "survival analysis" where it is employed with right-censored time-to-failure data. However, K-M methods are equally valid for the left-censored data common in the geosciences. Our S-language software provides an analytical framework based on K-M methods that is tailored to the needs of the earth and environmental sciences community. This includes routines for the generation of empirical cumulative distribution functions, prediction or exceedance probabilities, and related confidence limits computation. Additionally, our software contains K-M-based routines for nonparametric hypothesis testing among an unlimited number of grouping variables. A primary characteristic of K-M methods is that they do not perform extrapolation and interpolation. Thus, these routines cannot be used to model statistics beyond the observed data range or when linear interpolation is desired. For such applications, the aforementioned parametric and semi-parametric methods must be used.
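One standard way to apply the K-M estimator to left-censored concentration data (a sketch of the general technique, not the authors' S-language routines; the data values, flag names, and FLIP constant are illustrative) is to subtract every value from a constant larger than the maximum, so nondetects become right-censored survival times, and then flip the estimated survival curve back:

```python
def kaplan_meier(times, observed):
    """Kaplan-Meier survival curve S(t) for right-censored data.

    times    -- observation times
    observed -- True for an observed event, False for a censored value
    Returns a list of (event_time, S(t)) pairs.
    """
    data = sorted(zip(times, observed))
    event_times = sorted({t for t, obs in data if obs})
    surv, curve, i = 1.0, [], 0
    for t in event_times:
        while i < len(data) and data[i][0] < t:
            i += 1                    # drop subjects that left the risk set before t
        at_risk = len(data) - i
        deaths = sum(1 for tt, obs in data if tt == t and obs)
        surv *= 1.0 - deaths / at_risk
        curve.append((t, surv))
    return curve

# flipping trick for left-censored (nondetect) concentrations:
FLIP = 100.0                          # any constant above every data value
concs    = [0.5, 1.2, 2.0, 3.5, 6.0]
detected = [False, True, True, False, True]   # False marks "<DL" nondetects
km = kaplan_meier([FLIP - c for c in concs], detected)
cdf = sorted((FLIP - t, s) for t, s in km)    # F(c) is estimated by S(FLIP - c)
```

Because S(FLIP - c) = P(C < c) for the flipped variable, the flipped-back curve is an empirical CDF of concentration obtained without any distributional assumption, which is the point of the K-M approach described above.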
Annunziata, Mario Alberto; Petri, Alberto; Pontuale, Giorgio; Zaccaria, Andrea
2016-10-01
We have considered the statistical distributions of the volumes of 1131 products exported by 148 countries. We have found that the form of these distributions is not unique but heavily depends on the level of development of the nation, as expressed by macroeconomic indicators like GDP, GDP per capita, total export and a recently introduced measure for countries' economic complexity called fitness. We have identified three major classes: a) an incomplete log-normal shape, truncated on the left side, for the less developed countries, b) a complete log-normal, with a wider range of volumes, for nations characterized by intermediate economy, and c) a strongly asymmetric shape for countries with a high degree of development. Finally, the log-normality hypothesis has been checked for the distributions of all the 148 countries through different tests, Kolmogorov-Smirnov and Cramér-Von Mises, confirming that it cannot be rejected only for the countries of intermediate economy.
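The log-normality check reported above can be sketched as a one-sample Kolmogorov-Smirnov distance between the log-volumes and a normal distribution fitted to them. This is a minimal, dependency-free illustration, not the authors' code; note that because mu and sigma are estimated from the same data, Lilliefors-type critical values rather than standard KS tables would be needed for a formal test:

```python
import math

def normal_cdf(x, mu, sigma):
    """CDF of the normal distribution via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def ks_lognormal(volumes):
    """KS distance between log-volumes and a normal fitted by moments."""
    logs = sorted(math.log(v) for v in volumes)
    n = len(logs)
    mu = sum(logs) / n
    sigma = math.sqrt(sum((x - mu) ** 2 for x in logs) / n)
    d = 0.0
    for i, x in enumerate(logs):
        f = normal_cdf(x, mu, sigma)
        # compare the fitted CDF with the ECDF just before and after each jump
        d = max(d, abs((i + 1) / n - f), abs(f - i / n))
    return d
```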
Directory of Open Access Journals (Sweden)
Krzysztof Józwikowska
2015-06-01
Full Text Available The main goal of this work is to determine a statistical non-equilibrium distribution function for electrons and holes in semiconductor heterostructures under steady-state conditions. Based on the postulates of local equilibrium, as well as on the integral form of the weighted Gyarmati variational principle in the force representation, and using an alternative method, we have derived general expressions which have the form of the Fermi-Dirac distribution function with four additional components. The physical interpretation of these components is carried out in this paper. Some numerical results for a non-equilibrium distribution function for electrons in HgCdTe structures are also presented.
Barry, Brendan
2015-11-01
Studies over the last decade have reported power law distributions for the sizes of terrestrial lakes & Arctic melt ponds, as well as relationships between their area & the fractal dimension of their contours. These systems are important for the climate system, in terms of carbon cycling & ice-albedo feedback, respectively; these distributions offer promise for improved quantification & description of their influence. However, a mechanistic explanation of their distribution is lacking, & both systems remain logistically difficult to observe. Here we report 1) a simple mechanistic model for the distribution of lakes & melt ponds, based on statistical topography, which neatly predicts their distribution & the relationship between area & fractal dimension, as well as 2) the existence of a similar phenomenon in tidal mud flats. Data were collected at low tide in a tidal bed near Damariscotta, Maine, which reveal a power law size distribution over a large dynamic range & a well-defined compatible fractal dimension. This data set significantly extends the observed spatiotemporal range of such phenomena, & suggests this easily observable system may be an ideal model for lakes & melt ponds. MIT-WHOI Joint Program, Physical Oceanography.
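A power-law size distribution like the one reported here is commonly quantified with the continuous maximum-likelihood (Hill-type) estimator of the exponent. A generic sketch, not the authors' analysis; the lower cutoff xmin is assumed to be known:

```python
import math

def powerlaw_alpha(sizes, xmin):
    """MLE of the exponent of p(x) ~ x**(-alpha) for x >= xmin:
    alpha = 1 + n / sum(ln(x / xmin)), computed over the tail only."""
    tail = [x for x in sizes if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)
```

In practice xmin itself must be chosen carefully (for example by minimizing the KS distance between the fitted tail and the data), since including sub-cutoff values biases the exponent.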
Some statistical aspects of characterizing the distribution of radionuclides in the environment
International Nuclear Information System (INIS)
Hutchinson, S.W.; Miller, F.L. Jr.
1981-05-01
A radiological characterization was performed at the former Kellex site, the first pilot uranium-diffusion plant, in Jersey City, New Jersey, to determine if there were any contaminated regions containing 40 pCi/g or more of 238U in the top 20 cm of soil over a 400 m^2 area. As a result of this radiological survey, final decisions would be made about the need for remedial action (cleanup) and the suitability of the site for unrestricted use. This paper describes the development of, and the statistical reasoning behind, a sampling plan for the radiological characterization.
Accounting providing of statistical analysis of intangible assets renewal under marketing strategy
Directory of Open Access Journals (Sweden)
I.R. Polishchuk
2016-12-01
Full Text Available The article analyzes the content of the Regulations on accounting policies of the surveyed enterprises in terms of operations concerning the amortization of intangible assets, using the following criteria: assessment on admission, determination of useful life, the period of depreciation, residual value, depreciation method, reflection in the financial statements, unit of account, revaluation, and formation of fair value. The factors affecting the accounting policy and determining the mechanism for evaluating the completeness and timeliness of intangible assets renewal are characterized. An algorithm for selecting the method of intangible assets amortization is proposed. The knowledge base for statistical analysis of the timeliness and completeness of intangible assets renewal is expanded in terms of the developed internal reporting. Statistical indicators to assess the effectiveness of the amortization policy for intangible assets are proposed. Marketing strategies, depending on the condition and amount of intangible assets, aimed at increasing marketing potential for the continuity of economic activity are described.
Exponentiated Weibull distribution family under aperture averaging for Gaussian beam waves.
Barrios, Ricardo; Dios, Federico
2012-06-04
Nowadays, the search for a distribution capable of modeling the probability density function (PDF) of irradiance data under all conditions of atmospheric turbulence in the presence of aperture averaging still continues. Here, a family of PDFs alternative to the widely accepted Log-Normal and Gamma-Gamma distributions is proposed to model the PDF of the received optical power in free-space optical communications, namely, the Weibull and the exponentiated Weibull (EW) distribution. Particularly, it is shown how the proposed EW distribution offers an excellent fit to simulation and experimental data under all aperture averaging conditions, under weak and moderate turbulence conditions, as well as for point-like apertures. Another very attractive property of these distributions is the simple closed form expression of their respective PDF and cumulative distribution function.
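The simple closed-form PDF and CDF highlighted above are easy to state. A sketch with assumed parameter names (Weibull shape beta, scale eta, and extra shape alpha; alpha = 1 recovers the ordinary Weibull), illustrative rather than the authors' fitting code:

```python
import math

def ew_cdf(x, beta, eta, alpha):
    """Exponentiated Weibull CDF: the Weibull CDF raised to the power alpha."""
    return (1.0 - math.exp(-(x / eta) ** beta)) ** alpha

def ew_pdf(x, beta, eta, alpha):
    """Corresponding density, the derivative of ew_cdf with respect to x."""
    w = math.exp(-(x / eta) ** beta)
    return (alpha * beta / eta) * (x / eta) ** (beta - 1.0) * w * (1.0 - w) ** (alpha - 1.0)
```

The extra shape parameter alpha is what gives the family enough flexibility to track irradiance statistics across aperture-averaging conditions while keeping both the PDF and CDF in closed form.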
International Nuclear Information System (INIS)
Hanot, C.; Riaud, P.; Absil, O.; Mennesson, B.; Martin, S.; Liewer, K.; Loya, F.; Mawet, D.; Serabyn, E.
2011-01-01
A new 'self-calibrated' statistical analysis method has been developed for the reduction of nulling interferometry data. The idea is to use the statistical distributions of the fluctuating null depth and beam intensities to retrieve the astrophysical null depth (or equivalently the object's visibility) in the presence of fast atmospheric fluctuations. The approach yields an accuracy much better (about an order of magnitude) than is presently possible with standard data reduction methods, because the astrophysical null depth accuracy is no longer limited by the magnitude of the instrumental phase and intensity errors but by uncertainties on their probability distributions. This approach was tested on the sky with the two-aperture fiber nulling instrument mounted on the Palomar Hale telescope. Using our new data analysis approach alone, and no observations of calibrators, we find that error bars on the astrophysical null depth as low as a few 10^-4 can be obtained in the near-infrared, which means that null depths lower than 10^-3 can be reliably measured. This statistical analysis is not specific to our instrument and may be applicable to other interferometers.
International Nuclear Information System (INIS)
Croce, R P; Demma, Th; Longo, M; Marano, S; Matta, V; Pierro, V; Pinto, I M
2003-01-01
The cumulative distribution of the supremum of a set (bank) of correlators is investigated in the context of maximum likelihood detection of gravitational wave chirps from coalescing binaries with unknown parameters. Accurate (lower-bound) approximants are introduced based on a suitable generalization of previous results by Mohanty. Asymptotic properties (in the limit where the number of correlators goes to infinity) are highlighted. The validity of numerical simulations made on small-size banks is extended to banks of any size, via a Gaussian correlation inequality.
Statistics analysis of distribution of Bradysia Ocellaris insect on Oyster mushroom cultivation
Sari, Kurnia Novita; Amelia, Ririn
2015-12-01
The Bradysia ocellaris insect is a pest in Oyster mushroom cultivation. The distribution of Bradysia ocellaris shows a particular pattern that can be observed every week under several assumptions such as independence, normality and homogeneity. We can analyze the number of Bradysia ocellaris for each week through descriptive analysis. Next, the distribution pattern of Bradysia ocellaris is described by a semivariogram, that is, a diagram of the variance of the differences between pairs of observations separated by a distance d. The semivariogram model that suits the Bradysia ocellaris data is the spherical isotropic model.
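The spherical isotropic semivariogram named above has a standard closed form. A sketch with assumed nugget, partial-sill, and range parameters (illustrative values, not those fitted to the insect-count data):

```python
def spherical_semivariogram(h, nugget, psill, a):
    """Spherical model: gamma(0) = 0, rises as 1.5(h/a) - 0.5(h/a)**3,
    and levels off at nugget + partial sill once the lag h reaches the range a."""
    if h == 0:
        return 0.0
    if h >= a:
        return nugget + psill
    r = h / a
    return nugget + psill * (1.5 * r - 0.5 * r ** 3)
```

Fitting such a model to the empirical semivariogram of weekly counts is what lets the spatial dependence of the pest be interpolated by kriging between observation points.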
Quantum-like microeconomics: Statistical model of distribution of investments and production
Khrennikov, Andrei
2008-10-01
In this paper we demonstrate that probabilistic quantum-like (QL) behavior (the Born rule, interference of probabilities, violation of Bell's inequality, representation of variables by in general noncommutative self-adjoint operators, Schrödinger dynamics) can be exhibited not only by processes in the micro world, but also in economics. In our approach the QL behavior is not induced by properties of systems. Here the systems (commodities) are macroscopic. They cannot be superpositions of two different states. In our approach the QL behavior of economic statistics is a consequence of the organization of the process of production as well as investments. In particular, the Hamiltonian ("financial energy") is determined by the rate of return.
Portfolio selection problem with liquidity constraints under non-extensive statistical mechanics
International Nuclear Information System (INIS)
Zhao, Pan; Xiao, Qingxian
2016-01-01
In this study, we consider the optimal portfolio selection problem with liquidity limits. A portfolio selection model is proposed in which the risky asset price is driven by the process based on non-extensive statistical mechanics instead of the classic Wiener process. Using dynamic programming and Lagrange multiplier methods, we obtain the optimal policy and value function. Moreover, the numerical results indicate that this model is considerably different from the model based on the classic Wiener process, the optimal strategy is affected by the non-extensive parameter q, the increase in the investment in the risky asset is faster at a larger parameter q and the increase in wealth is similar.
Xiong, Peng; Chen, Quan; Liu, Haiyan
2017-01-01
An important objective of computational protein design is to identify amino acid sequences that stably fold into a given backbone structure. A general approach to this problem is to minimize an energy function in the sequence space. We have previously reported a method to derive statistical energies for fixed-backbone protein design and showed that it led to de novo proteins that fold as expected. Here, we present the usage of the program that implements this method, which we now name as ABACUS (A Backbone-based Amino aCid Usage Survey).
Rosser, B. J.; O'Connor, M. D.
2003-12-01
The status of fish habitat in cold water streams in western North America has, by most accounts, been degraded significantly by sedimentation. In particular, land management activities induce erosion that contributes excess sand-size and finer sediment to stream systems, which is believed to have caused increases in the proportion of fine sediment in spawning gravels. Many watershed studies and regulatory programs have, drawing on previous scientific investigations, set thresholds for fine sediment concentrations in spawning beds. This study examines data from gravel bed streams collected with a McNeil sampler in northern California (typically 25 kg), as well as bulk sediment samples from the Waipaoa River in New Zealand (typically 50 kg). Confidence intervals for various percentiles of the grain size distributions were computed from these data using a two-stage sampling approach. Accuracy and precision of data from these sampling programs were considered in relation to the biological/regulatory thresholds as well as the effort required to obtain, process and analyze grain size distributions. Typically, very large samples are required to obtain data with high precision, suggesting that in many circumstances, it may be difficult to assess whether regulatory thresholds are exceeded.
Panagiotopoulou, Olga; Pataky, Todd C; Hill, Zoe; Hutchinson, John R
2012-05-01
Foot pressure distributions during locomotion have causal links with the anatomical and structural configurations of the foot tissues and the mechanics of locomotion. Elephant feet have five toes bound in a flexible pad of fibrous tissue (digital cushion). Does this specialized foot design control peak foot pressures in such giant animals? And how does body size, such as during ontogenetic growth, influence foot pressures? We addressed these questions by studying foot pressure distributions in elephant feet and their correlation with body mass and centre of pressure trajectories, using statistical parametric mapping (SPM), a neuro-imaging technology. Our results show a positive correlation between body mass and peak pressures, with the highest pressures dominated by the distal ends of the lateral toes (digits 3, 4 and 5). We also demonstrate that pressure reduction in the elephant digital cushion is a complex interaction of its viscoelastic tissue structure and its centre of pressure trajectories, because there is a tendency to avoid rear 'heel' contact as an elephant grows. Using SPM, we present a complete map of pressure distributions in elephant feet during ontogeny by performing statistical analysis at the pixel level across the entire plantar/palmar surface. We hope that our study will build confidence in the potential clinical and scaling applications of mammalian foot pressures, given our findings in support of a link between regional peak pressures and pathogenesis in elephant feet.
Directory of Open Access Journals (Sweden)
E. E. Woodfield
2002-12-01
Full Text Available A statistical investigation of the Doppler spectral width parameter routinely observed by HF coherent radars has been conducted between the Northern and Southern Hemispheres for the nightside ionosphere. Data from the SuperDARN radars at Thykkvibær, Iceland and Syowa East, Antarctica have been employed for this purpose. Both radars frequently observe regions of high (>200 m s-1) spectral width polewards of low (<200 m s-1) spectral width. Three years of data from both radars have been analysed for both the spectral width and the line-of-sight velocity. The pointing direction of these two radars is such that the flow reversal boundary may be estimated from the velocity data, and therefore we have an estimate of the open/closed field line boundary location for comparison with the high spectral widths. Five key observations regarding the behaviour of the spectral width on the nightside have been made. These are: (i) the two radars observe similar characteristics on a statistical basis; (ii) a latitudinal dependence related to magnetic local time is found in both hemispheres; (iii) a seasonal dependence of the spectral width is observed by both radars, which shows a marked absence of latitudinal dependence during the summer months; (iv) in general, the Syowa East spectral width tends to be larger than that from Iceland East; and (v) the highest spectral widths seem to appear on both open and closed field lines. Points (i) and (ii) indicate that the cause of high spectral width is magnetospheric in origin. Point (iii) suggests that either the propagation of the HF radio waves to regions of high spectral width or the generating mechanism(s) for high spectral width is affected by solar illumination or other seasonal effects. Point (iv) suggests that the radar beams from each of the radars are subject either to different instrumental or propagation effects, or different geophysical conditions due to their locations, although we suggest that this result is more likely to
Occurrence and distribution of soil Fusarium species under wheat crop in zero tillage
Energy Technology Data Exchange (ETDEWEB)
Silvestro, L. B.; Stenglein, S. A.; Forjan, H.; Dinolfo, M. I.; Aramburri, A. M.; Manso, L.; Moreno, M. V.
2013-05-01
The presence of Fusarium species in cultivated soils is commonly associated with plant debris and plant roots. Fusarium species are also soil saprophytes. The aim of this study was to examine the occurrence and distribution of soil Fusarium spp. at different soil depths in a zero tillage system after the wheat was harvested. Soil samples were obtained at three depths (0-5 cm, 5-10 cm and 10-20 cm) from five crop rotations: I, conservationist agriculture (wheat-sorghum-soybean); II, mixed agriculture/livestock with pastures, without using winter or summer forages (wheat-sorghum-soybean-canola-pastures); III, winter agriculture in depth-limited soils (wheat-canola-barley-late soybean); IV, mixed with annual forage (wheat-oat/Vicia-sunflower); V, intensive agriculture (wheat-barley-canola, with alternation of soybean or late soybean). One hundred twenty-two isolates of Fusarium were obtained and identified as F. equiseti, F. merismoides, F. oxysporum, F. scirpi and F. solani. The most prevalent species was F. oxysporum, which was observed in all sequences and depths. Tukey's test showed that the relative frequency of F. oxysporum under intensive agricultural management was higher than under traditional mixed ones. The first 5 cm of soil showed statistically significant differences (p=0.05) with respect to the 5-10 cm and 10-20 cm depths. The ANOVA test for the relative frequencies of the other species (F. equiseti, F. merismoides, F. scirpi and F. solani) did not show statistically significant differences (p<0.05). We did not find significant differences (p<0.05) in the effect of crop rotations and depth on the Shannon and Simpson indexes and species richness. Therefore we conclude that the different sequences and the sampling depth did not affect the alpha diversity of the Fusarium community in this system. (Author) 51 refs.
Statistical inferences with jointly type-II censored samples from two Pareto distributions
Abu-Zinadah, Hanaa H.
2017-08-01
In several fields of industry, products come from more than one production line, and comparative life tests are required. This calls for sampling from the different production lines, which gives rise to a joint censoring scheme. In this article we consider the Pareto lifetime distribution under a jointly type-II censoring scheme. The maximum likelihood estimators (MLEs) and the corresponding approximate confidence intervals, as well as bootstrap confidence intervals, of the model parameters are obtained. Bayesian point estimates and credible intervals of the model parameters are also presented. A lifetime data set is analyzed for illustrative purposes, and Monte Carlo simulation results are presented to assess the performance of the proposed method.
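For a single Pareto sample under ordinary type-II censoring, the shape MLE has a closed form via the exponential transform Y = ln(X/xm) ~ Exponential(alpha). The article's joint two-sample scheme is more involved; this sketch, with an assumed known scale xm and illustrative function name, only shows the basic building block:

```python
import math

def pareto_shape_mle_type2(x_obs, n, xm):
    """Shape MLE from the r smallest of n Pareto(alpha, xm) lifetimes.

    x_obs -- the r observed (smallest) failure values, each >= xm
    n     -- total units on test; n - r units are censored at max(x_obs)
    """
    r = len(x_obs)
    x_obs = sorted(x_obs)
    # total time on test for the transformed exponential sample ln(X / xm)
    ttt = sum(math.log(x / xm) for x in x_obs) + (n - r) * math.log(x_obs[-1] / xm)
    return r / ttt
```

This is the familiar r / (total time on test) estimator for censored exponential data, re-expressed on the Pareto scale.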
A new statistical tool to predict phenology under climate change scenarios
Gienapp, P.; Hemerik, L.; Visser, M.E.
2005-01-01
Climate change will likely affect the phenology of trophic levels differently and thereby disrupt the phenological synchrony between predators and prey. To predict this disruption of the synchrony under different climate change scenarios, good descriptive models for the phenology of the different
Dabanlı, İsmail; Şen, Zekai
2018-04-01
The statistical climate downscaling model by the Turkish Water Foundation (TWF) is further developed and applied to a set of monthly precipitation records. The model is structured in two phases, spatial (regional) and temporal downscaling of global circulation model (GCM) scenarios. The TWF model takes into consideration the regional dependence function (RDF) for spatial structure and the Markov whitening process (MWP) for temporal characteristics of the records to set projections. The impact of climate change on monthly precipitation is studied by downscaling Intergovernmental Panel on Climate Change-Special Report on Emission Scenarios (IPCC-SRES) A2 and B2 emission scenarios from the Max Planck Institute (EH40PYC) and the Hadley Centre (HadCM3). The main purposes are to explain the TWF statistical climate downscaling model procedures and to present the validation tests, which are rated as "very good" for all stations except one (Suhut) in the Akarcay basin in the west-central part of Turkey. Even though the validation score is slightly lower at the Suhut station, the results are still "satisfactory." It is, therefore, possible to say that the TWF model has reasonably acceptable skill for accurate estimation with respect to the standard deviation ratio (SDR), Nash-Sutcliffe efficiency (NSE), and percent bias (PBIAS) criteria. Based on the validated model, precipitation predictions are generated for 2011 to 2100 using the 30-year reference observation period (1981-2010). The precipitation arithmetic average and standard deviation have less than 5% error for the EH40PYC and HadCM3 SRES (A2 and B2) scenarios.
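The three validation criteria named (SDR, NSE, PBIAS) are standard goodness-of-fit measures. A generic sketch, not the TWF model code; PBIAS here follows the common convention in which positive values indicate underestimation by the model:

```python
def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; 0 means no better
    than always predicting the observed mean."""
    m = sum(obs) / len(obs)
    return 1.0 - sum((o - s) ** 2 for o, s in zip(obs, sim)) / sum((o - m) ** 2 for o in obs)

def pbias(obs, sim):
    """Percent bias of the simulation relative to the observations."""
    return 100.0 * sum(o - s for o, s in zip(obs, sim)) / sum(obs)

def sdr(obs, sim):
    """Standard deviation ratio, simulated over observed."""
    def std(xs):
        m = sum(xs) / len(xs)
        return (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5
    return std(sim) / std(obs)
```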
Demography and the statistics of lifetime economic transfers under individual stochasticity
Directory of Open Access Journals (Sweden)
Hal Caswell
2015-02-01
Full Text Available Background: As individuals progress through the life cycle, they receive income and consume goods and services. The age schedules of labor income, consumption, and life cycle deficit reflect the economic roles played at different ages. Lifetime accumulation of economic variables has been less well studied, and our goal here is to rectify that. Objective: To derive and apply a method to compute the lifetime accumulated labor income, consumption, and life cycle deficit, and to go beyond the calculation of mean lifetime accumulation to calculate statistics of variability among individuals in lifetime accumulation. Methods: To quantify variation among individuals, we calculate the mean, standard deviation, coefficient of variation, and skewness of lifetime accumulated transfers, using the theory of Markov chains with rewards (Caswell 2011), applied to National Transfer Account data for Germany for 1978 and 2003. Results: The age patterns of lifetime accumulated labor income are relatively stable over time. Both the mean and the standard deviation of remaining lifetime labor income decline with age; the coefficient of variation, measuring variation relative to the mean, increases dramatically with age. The skewness becomes large and positive at older ages. Education level affects all the statistics. About 30% of the variance in lifetime income is due to variance in age-specific income, and about 70% is contributed by the mortality schedule. Lifetime consumption is less variable (as measured by the CV) than lifetime labor income. Conclusions: We conclude that demographic Markov chains with rewards can add a potentially valuable perspective to studies of the economic lifecycle. The variation among individuals in lifetime accumulations in our results reflects individual stochasticity, not heterogeneity among individuals. Incorporating heterogeneity remains an important problem.
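The backward recursion behind such lifetime accumulations can be sketched for a toy age-classified chain. This is a hypothetical simplification of Caswell's Markov-chains-with-rewards machinery: `p` and `r` below are illustrative survival probabilities and fixed age-specific rewards, not the National Transfer Account data.

```python
def lifetime_reward_stats(p, r):
    """Mean and variance of remaining lifetime accumulated reward in a
    simple age chain: an individual at age i accrues reward r[i] and
    survives to age i+1 with probability p[i] (toy model, not Caswell's
    full formulation, which handles stochastic rewards and stage structure).
    """
    n = len(r)
    m = [0.0] * (n + 1)   # m[i] = E[remaining lifetime reward from age i]
    s = [0.0] * (n + 1)   # s[i] = E[(remaining lifetime reward)^2]
    for i in range(n - 1, -1, -1):
        # R_i = r_i + (R_{i+1} with prob p_i, else 0)
        m[i] = r[i] + p[i] * m[i + 1]
        s[i] = r[i] ** 2 + 2 * r[i] * p[i] * m[i + 1] + p[i] * s[i + 1]
    var = [s[i] - m[i] ** 2 for i in range(n + 1)]
    return m, var
```

One backward pass yields the mean and variance of remaining lifetime accumulation at every age, which is what allows age profiles of the CV and higher moments to be traced.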
International Nuclear Information System (INIS)
Vardavas, I.M.
1992-01-01
A simple procedure is presented for the statistical analysis of measurement data where the primary concern is the determination of the value corresponding to a specified average exceedance probability. The analysis employs the normal and log-normal frequency distributions together with a χ²-test and an error analysis. The error analysis introduces the concept of a counting error criterion, or ζ-test, to test whether the data are sufficient to make the χ²-test reliable. The procedure is applied to the analysis of annual rainfall data recorded at stations in the tropical Top End of Australia where the Ranger uranium deposit is situated. 9 refs., 12 tabs., 9 figs
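The core fit-then-invert step for the log-normal case might look like the following sketch. The function name and interface are hypothetical; the paper's full procedure additionally applies the χ² goodness-of-fit test and the ζ counting-error criterion before trusting the fitted value.

```python
import math
from statistics import NormalDist

def exceedance_value(data, p_exceed):
    """Value exceeded with probability p_exceed, assuming the data
    (e.g. annual rainfall totals) follow a log-normal distribution.
    Fits mu and sigma on the log scale, then inverts the CDF."""
    logs = [math.log(x) for x in data]
    mu = sum(logs) / len(logs)
    sd = math.sqrt(sum((v - mu) ** 2 for v in logs) / (len(logs) - 1))
    z = NormalDist().inv_cdf(1.0 - p_exceed)  # standard normal quantile
    return math.exp(mu + sd * z)
```

For example, the 1-in-100-year rainfall estimate under this assumption is `exceedance_value(annual_totals, 0.01)`.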
Directory of Open Access Journals (Sweden)
Wang Liang
2016-01-01
Full Text Available Operational modal analysis (OMA) is prevalent in modal identification of large structures because it requires only output measurements. To guarantee identification accuracy, OMA data should, in theory, be a random process of Gaussian white noise (GWN). Although numerous OMA applications are found in practice, few have specifically examined the data distribution and the extent to which departures from it can blur modal judgement. This paper presents a method to sieve out of a recording the segments that most closely obey the GWN distribution. Using a windowing technique, the data segments are evaluated by a modified Kurtosis value. The process has been demonstrated on the monitoring data of two case-study structures: one a laboratory truss bridge excited by artificial forces, the other a real cable-stayed bridge subject to environmental loads. The results show that data with weak randomness may produce false peaks that can mislead non-parametric modal identification, such as the Frequency Domain Decomposition method. To overcome this, care must be exercised in selecting the optimal segment. The proposed method is verified to be effective in finding the most suitable data for modal identification in structural health monitoring systems.
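A minimal version of the windowed screening idea can be sketched as follows, using ordinary excess kurtosis (zero for a Gaussian) as a stand-in for the paper's modified Kurtosis value; all names are hypothetical.

```python
import math

def excess_kurtosis(x):
    """Sample excess kurtosis; ~0 for Gaussian data."""
    n = len(x)
    m = sum(x) / n
    m2 = sum((v - m) ** 2 for v in x) / n
    m4 = sum((v - m) ** 4 for v in x) / n
    return m4 / (m2 ** 2) - 3.0

def best_segment(signal, win):
    """Slide a window over the record and return the start index of the
    segment whose |excess kurtosis| is smallest, i.e. closest to Gaussian
    (simplified stand-in for the paper's modified-Kurtosis screening)."""
    best_i, best_k = 0, float("inf")
    for i in range(0, len(signal) - win + 1):
        k = abs(excess_kurtosis(signal[i:i + win]))
        if k < best_k:
            best_i, best_k = i, k
    return best_i
```

The selected segment would then be passed to the modal-identification stage (e.g. Frequency Domain Decomposition) in place of the raw recording.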
Statistical characteristics of raindrop size distribution in the Tibetan Plateau and southern China
Wu, Yahao; Liu, Liping
2017-06-01
The characteristics of raindrop size distribution (DSD) over the Tibetan Plateau and southern China are studied in this paper, using DSD data from April to August 2014 collected by HSC-PS32 disdrometers in Nagqu and Yangjiang, comprising a total of 9430 and 6366 1-min raindrop spectra, respectively. The raindrop spectra, the variation of parameters with rainfall rate, and the relationships between reflectivity factor (Z) and rainfall rate (R) are analyzed, as well as DSD changes with precipitation type and rainfall rate. The results show that the average raindrop spectra appear as one-peak curves, the number concentration of larger drops increases significantly with rainfall rate, and its value over southern China is much higher, especially in convective rain. Standardized Gamma distributions better describe the DSD for larger drops, especially for convective rain in southern China. All three Gamma parameters for stratiform precipitation over the Tibetan Plateau are much higher, while its shape parameter (μ) and mass-weighted mean diameter (Dm) for convective precipitation are smaller. In terms of parameter variation with rainfall rate, the normalized intercept parameter (Nw) over the Tibetan Plateau for stratiform rain increases with rainfall rate, which is opposite to the situation in convective rain. The μ over the Tibetan Plateau for both stratiform and convective precipitation decreases with increasing rainfall rate, which is opposite to the behavior of Dm. In Z-R relationships of the form Z = A·R^b, the coefficient A over the Tibetan Plateau is smaller, while its b is higher, when the rain type transitions from stratiform to convective. Furthermore, with an increase in rainfall rate, the parameters A and b over southern China increase gradually, while A over the Tibetan Plateau decreases substantially, which differs from the findings of previous studies. In terms of geographic location and climate over the Tibetan Plateau
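The Z-R coefficients A and b in Z = A·R^b are conventionally obtained by a least-squares fit in log-log space; a sketch under that assumption (the paper derives Z and R pairs from the disdrometer DSDs, which are not reproduced here):

```python
import math

def fit_zr(R, Z):
    """Least-squares fit of the power law Z = A * R**b in log-log space.
    R: rainfall rates (mm/h), Z: reflectivity factors (mm^6/m^3)."""
    x = [math.log(r) for r in R]
    y = [math.log(z) for z in Z]
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    # slope of the log-log regression is the exponent b
    b = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / \
        sum((xi - xbar) ** 2 for xi in x)
    A = math.exp(ybar - b * xbar)  # intercept back-transformed to A
    return A, b
```

Comparing fitted (A, b) pairs between stratiform and convective subsets is exactly the kind of contrast the abstract reports.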
Seo, Changwan; Thorne, James H; Hannah, Lee; Thuiller, Wilfried
2009-02-23
Predictions of future species' ranges under climate change are needed for conservation planning, for which species distribution models (SDMs) are widely used. However, global climate model (GCM) output grids can bias the area identified as suitable when these are used as SDM predictor variables, because GCM outputs, typically at least 50×50 km, are biologically coarse. We tested the assumption that species ranges can be equally well portrayed in SDMs operating on base data of different grid sizes by comparing SDM performance statistics and area selected by four SDMs run at seven grid sizes, for nine species of contrasting range size. Area selected was disproportionately larger for SDMs run on larger grid sizes, indicating a cut-off point above which model results were less reliable. Up to 2.89 times more species range area was selected by SDMs operating on grids above 50×50 km, compared to SDMs operating at 1 km². Spatial congruence between areas selected as range also diverged as grid size increased, particularly for species with ranges between 20,000 and 90,000 km². These results indicate the need for caution when using such data to plan future protected areas, because an overly large predicted range could lead to inappropriate reserve location selection.
International Nuclear Information System (INIS)
Poudineh, Rahmatallah; Jamasb, Tooraj
2016-01-01
Investment in electricity networks, as regulated natural monopolies, is among the highest regulatory and energy policy priorities. The electricity sector regulators adopt different incentive mechanisms to ensure that the firms undertake sufficient investment to maintain and modernise the grid. Thus, an effective regulatory treatment of investment requires understanding the response of companies to the regulatory incentives. This study analyses the determinants of investment in electricity distribution networks using a panel dataset of 129 Norwegian companies observed from 2004 to 2010. A Bayesian Model Averaging approach is used to provide a robust statistical inference by taking into account the uncertainties around model selection and estimation. The results show that three factors drive nearly all network investments: investment rate in previous period, socio-economic costs of energy not supplied and finally useful life of assets. The results indicate that Norwegian companies have, to some degree, responded to the investment incentives provided by the regulatory framework. However, some of the incentives do not appear to be effective in driving the investments. - Highlights: • This paper investigates determinants of investment under incentive regulation. • We apply a Bayesian model averaging technique to deal with model uncertainty. • Dataset comprises 129 Norwegian electricity network companies from 2004 to 2010. • The results show that firms have generally responded to investment incentives. • However, some of the incentives do not appear to have been effective.
International Nuclear Information System (INIS)
Lewis, J.C.
2011-01-01
In a recent paper (Lewis, 2008) a class of models suitable for application to collision-sequence interference was introduced. In these models velocities are assumed to be completely randomized in each collision, and the distribution of velocities was assumed to be Gaussian. The integrated induced dipole moment μk, for vector interference, or the scalar modulation μk, for scalar interference, was assumed to be a function of the impulse (integrated force) fk, or its magnitude fk, experienced by the molecule in a collision. For most of (Lewis, 2008) it was assumed that μk ∝ fk and μk ∝ fk, but it proved possible to extend the models so that the magnitude of the induced dipole moment is equal to an arbitrary power, or sum of powers, of the intermolecular force. This allows estimates of the infilling of the interference dip caused by the disproportionality of the induced dipole moment and force. One particular such model, using data from (Herman and Lewis, 2006), leads to the most realistic estimate for the infilling of the vector interference dip yet obtained. In (Lewis, 2008) the drastic assumption was made that collision times occurred at equal intervals. In the present paper that assumption is removed: the collision times are taken to form a Poisson process. This is much more realistic than the equal-intervals assumption. The interference dip is found to be a Lorentzian in this model
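Replacing equal-interval collision times by a Poisson process amounts to drawing exponentially distributed inter-collision gaps; a small illustrative sketch (the rate, function name, and seed are hypothetical, chosen only to make the simulation reproducible):

```python
import random

def poisson_collision_times(rate, t_max, seed=0):
    """Collision times on [0, t_max) as a Poisson process of the given
    rate: successive gaps are exponential with mean 1/rate, which is the
    assumption replacing the equal-intervals model."""
    rng = random.Random(seed)
    t, times = 0.0, []
    while True:
        t += rng.expovariate(rate)
        if t >= t_max:
            return times
        times.append(t)
```

Such a sequence of times would then drive the collision-sequence model instead of a regular grid of collision instants.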
Isotopic phonon effects in β-rhombohedral boron--non-statistical isotope distribution.
Werheit, H; Filipov, V; Kuhlmann, U; Schwarz, U; Armbrüster, M; Antadze, M
2012-05-02
On the basis of the spectra of IR- and Raman-active phonons, the isotopic phonon effects in β-rhombohedral boron are analysed for polycrystalline ¹⁰B- and ¹¹B-enriched samples of different origin and high-purity natural-abundance boron single crystals. Intra- and inter-icosahedral B-B vibrations are harmonic, hence meeting the virtual crystal approximation (VCA) requirements. Deviations from the phonon shift expected according to the VCA are attributed to the anharmonic share of the lattice vibrations. In the case of icosahedral vibrations, the agreement with calculations on α-rhombohedral boron by Shirai and Katayama-Yoshida is quite satisfactory. Phonon shifts due to isotopic disorder in natural-abundance boron are separated and determined. Some phonon frequencies are sensitive to impurities. The isotopic phonon effects yield valuable specific information on the nature of the different phonon modes. The occupation of regular boron sites by isotopes deviates significantly from the random distribution. © 2012 IOP Publishing Ltd
A new statistical tool to predict phenology under climate change scenarios
Gienapp, P.; Hemerik, L.; Visser, M.E.
2005-01-01
Climate change will likely affect the phenology of trophic levels differently and thereby disrupt the phenological synchrony between predators and prey. To predict this disruption of the synchrony under different climate change scenarios, good descriptive models for the phenology of the different species are necessary. Many phenological models are based on regressing the observed phenological event against temperatures measured over a fixed period. This is problematic, especially when used fo...
Explicit expressions for European option pricing under a generalized skew normal distribution
Doostparast, Mahdi
2017-01-01
Under a generalized skew normal distribution we consider the problem of European option pricing. Existence of the martingale measure is proved. An explicit expression for a given European option price is presented in terms of the cumulative distribution function of the univariate skew normal and the bivariate standard normal distributions. Some special cases are investigated in a greater detail. To carry out the sensitivity of the option price to the skew parameters, numerical methods are app...
Cardoso, I.M.; Boddington, C.L.; Janssen, B.H.; Oenema, O.; Kuyper, T.W.
2003-01-01
Deep-rooting trees in agroforestry systems may promote distribution of spores of arbuscular mycorrhizal fungi (AMF) at deeper soil levels. We investigated the vertical distribution of AMF spores in Oxisols under agroforestry and monocultural (unshaded) coffee systems in on-farm experiments (
Sohn, Illsoo; Lee, Byong Ok; Lee, Kwang Bok
Recently, multimedia services are increasing with the widespread use of various wireless applications such as web browsers, real-time video, and interactive games, which results in traffic asymmetry between the uplink and downlink. Hence, time division duplex (TDD) systems which provide advantages in efficient bandwidth utilization under asymmetric traffic environments have become one of the most important issues in future mobile cellular systems. It is known that two types of intercell interference, referred to as crossed-slot interference, additionally arise in TDD systems; the performances of the uplink and downlink transmissions are degraded by BS-to-BS crossed-slot interference and MS-to-MS crossed-slot interference, respectively. The resulting performance unbalance between the uplink and downlink makes network deployment severely inefficient. Previous works have proposed intelligent time slot allocation algorithms to mitigate the crossed-slot interference problem. However, they require centralized control, which causes large signaling overhead in the network. In this paper, we propose to change the shape of the cellular structure itself. The conventional cellular structure is easily transformed into the proposed cellular structure with distributed receive antennas (DRAs). We set up statistical Markov chain traffic model and analyze the bit error performances of the conventional cellular structure and proposed cellular structure under asymmetric traffic environments. Numerical results show that the uplink and downlink performances of the proposed cellular structure become balanced with the proper number of DRAs and thus the proposed cellular structure is notably cost-effective in network deployment compared to the conventional cellular structure. As a result, extending the conventional cellular structure into the proposed cellular structure with DRAs is a remarkably cost-effective solution to support asymmetric traffic environments in future mobile cellular
Zhang, Yonggen; Schaap, Marcel G.
2017-04-01
Pedotransfer functions (PTFs) have been widely used to predict soil hydraulic parameters in place of expensive laboratory or field measurements. Rosetta (Schaap et al., 2001; denoted here as Rosetta1) is one of many PTFs and is based on artificial neural network (ANN) analysis coupled with the bootstrap re-sampling method, which allows the estimation of van Genuchten water retention parameters (van Genuchten, 1980; abbreviated here as VG), saturated hydraulic conductivity (Ks), and their uncertainties. In this study, we present an improved set of hierarchical pedotransfer functions (Rosetta3) that unify the water retention and Ks submodels into one. Parameter uncertainty of the fit of the VG curve to the original retention data is used in the ANN calibration procedure to reduce bias of parameters predicted by the new PTF. One thousand bootstrap replicas were used to calibrate the new models, compared to 60 or 100 in Rosetta1, thus allowing the univariate and bivariate probability distributions of predicted parameters to be quantified in greater detail. We determined the optimal weights for VG parameters and Ks, the optimal number of hidden nodes in the ANN, and the number of bootstrap replicas required for statistically stable estimates. Results show that matric-potential-dependent bias was reduced significantly, while the root mean square error (RMSE) for water content was reduced modestly; the RMSE for Ks increased by 0.9% (H3w) to 3.3% (H5w) in the new models on the log scale of Ks compared with the Rosetta1 model. It was found that estimated distributions of parameters were mildly non-Gaussian and could instead be described rather well with heavy-tailed α-stable distributions. On the other hand, arithmetic means had only a small estimation bias for most textures when compared with the mean-like "shift" parameter of the α-stable distributions. Arithmetic means and (co-)variances are therefore still recommended as summary statistics of the estimated distributions. However, it
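The bootstrap principle used here to attach uncertainty to predictions can be illustrated in heavily simplified form with a percentile bootstrap for a sample mean. This is only an analogy: Rosetta applies the same resampling idea to full ANN calibrations, not to a mean, and all names below are hypothetical.

```python
import random
import statistics

def bootstrap_mean_ci(values, n_replicas=1000, alpha=0.05, seed=0):
    """Percentile-bootstrap confidence interval for the mean: resample
    the data with replacement n_replicas times and take the empirical
    alpha/2 and 1-alpha/2 quantiles of the replica means."""
    rng = random.Random(seed)
    n = len(values)
    means = sorted(
        statistics.fmean(rng.choices(values, k=n)) for _ in range(n_replicas)
    )
    lo = means[int(alpha / 2 * n_replicas)]
    hi = means[int((1 - alpha / 2) * n_replicas) - 1]
    return lo, hi
```

With many replicas (1000 in Rosetta3 versus 60-100 in Rosetta1) the replica distribution itself, not just an interval, can be characterized, which is what allows the α-stable shape analysis described above.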
Directory of Open Access Journals (Sweden)
Ernesto eIacucci
2012-02-01
Full Text Available High-throughput molecular biology studies, such as microarray assays of gene expression, two-hybrid experiments for detecting protein interactions, or ChIP-Seq experiments for transcription factor binding, often result in an interesting set of genes, for example genes that are co-expressed or bound by the same factor. One way of understanding the biological meaning of such a set is to consider what processes or functions, as defined in an ontology, are over-represented (enriched) or under-represented (depleted) among genes in the set. Usually, the significance of enrichment or depletion scores is based on simple statistical models and on the membership of genes in different classifications. We consider the more general problem of computing p-values for arbitrary integer additive statistics, or weighted membership functions. Such membership functions can be used to represent, for example, prior knowledge on the role of certain genes or classifications, differential importance of different classifications or genes to the experimenter, hierarchical relationships between classifications, or different degrees of interestingness or evidence for specific genes. We describe a generic dynamic programming algorithm that can compute exact p-values for arbitrary integer additive statistics. We also describe several optimizations for important special cases, which can provide orders-of-magnitude speed-up in the computations. We apply our methods to datasets describing oxidative phosphorylation and parturition and compare p-values based on computations of several different statistics for measuring enrichment. We find major differences between p-values resulting from these statistics, and that some statistics recover gold-standard annotations of the data better than others. Our work establishes a theoretical and algorithmic basis for far richer notions of enrichment or depletion of gene sets with respect to gene ontologies than has previously been available.
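The dynamic-programming idea for exact p-values of an integer additive statistic can be sketched for the simple null model of a uniformly drawn fixed-size gene subset. This is a hypothetical simplification: the paper's algorithm covers more general settings and adds the special-case optimizations.

```python
from math import comb

def exact_pvalue(weights, k, observed):
    """Exact upper-tail p-value of T = sum of integer weights over a
    uniformly drawn size-k subset, by counting subsets with each total.
    dp[j] maps a total weight t to the number of size-j subsets with sum t."""
    dp = [{} for _ in range(k + 1)]
    dp[0][0] = 1
    for w in weights:
        # descend in j so each gene is used at most once per subset
        for j in range(k, 0, -1):
            for t, c in dp[j - 1].items():
                dp[j][t + w] = dp[j].get(t + w, 0) + c
    total = comb(len(weights), k)
    tail = sum(c for t, c in dp[k].items() if t >= observed)
    return tail / total
```

With unit weights this reduces to the hypergeometric test; non-unit integer weights encode the graded membership functions the abstract describes.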
François, Clément; Schön, Daniele
2014-02-01
There is increasing evidence that humans and other nonhuman mammals are sensitive to the statistical structure of auditory input. Indeed, neural sensitivity to statistical regularities seems to be a fundamental biological property underlying auditory learning. In the case of speech, statistical regularities play a crucial role in the acquisition of several linguistic features, from phonotactic to more complex rules such as morphosyntactic rules. Interestingly, a similar sensitivity has been shown with non-speech streams: sequences of sounds changing in frequency or timbre can be segmented on the sole basis of conditional probabilities between adjacent sounds. We recently ran a set of cross-sectional and longitudinal experiments showing that merging music and speech information in song facilitates stream segmentation and, further, that musical practice enhances sensitivity to statistical regularities in speech at both neural and behavioral levels. Based on recent findings showing the involvement of a fronto-temporal network in speech segmentation, we defend the idea that enhanced auditory learning observed in musicians originates via at least three distinct pathways: enhanced low-level auditory processing, enhanced phono-articulatory mapping via the left Inferior Frontal Gyrus and Pre-Motor cortex and increased functional connectivity within the audio-motor network. Finally, we discuss how these data predict a beneficial use of music for optimizing speech acquisition in both normal and impaired populations. Copyright © 2013 Elsevier B.V. All rights reserved.
Crouch, Daniel J M
2017-10-27
The prevalence of sexual reproduction remains mysterious, as it poses clear evolutionary drawbacks compared to reproducing asexually. Several possible explanations exist, with one of the most likely being that finite population size causes linkage disequilibria to randomly generate and impede the progress of natural selection, and that these are eroded by recombination via sexual reproduction. Previous investigations have either analysed this phenomenon in detail for small numbers of loci, or performed population simulations for many loci. Here we present a quantitative genetic model for fitness, based on the Price Equation, in order to examine the theoretical consequences of randomly generated linkage disequilibria when there are many loci. In addition, most previous work has been concerned with the long-term consequences of deleterious linkage disequilibria for population fitness. The expected change in mean fitness between consecutive generations, a measure of short-term evolutionary success, is shown under random environmental influences to be related to the autocovariance in mean fitness between the generations, capturing the effects of stochastic forces such as genetic drift. Interaction between genetic drift and natural selection, due to randomly generated linkage disequilibria, is demonstrated to be one possible source of mean fitness autocovariance. This suggests a possible role for sexual reproduction in reducing the negative effects of genetic drift, thereby improving the short-term efficacy of natural selection. Copyright © 2017 Elsevier Ltd. All rights reserved.
Statistical inference for the additive hazards model under outcome-dependent sampling.
Yu, Jichang; Liu, Yanyan; Sandler, Dale P; Zhou, Haibo
2015-09-01
Cost-effective study designs and proper inference procedures for data from such designs are always of particular interest to study investigators. In this article, we propose a biased sampling scheme, an outcome-dependent sampling (ODS) design, for survival data with right censoring under the additive hazards model. We develop a weighted pseudo-score estimator for the regression parameters for the proposed design and derive the asymptotic properties of the proposed estimator. We also provide some suggestions for using the proposed method by evaluating its relative efficiency against the simple random sampling design and derive the optimal allocation of the subsamples for the proposed design. Simulation studies show that the proposed ODS design is more powerful than other existing designs and the proposed estimator is more efficient than other estimators. We apply our method to a cancer study conducted at NIEHS, the Cancer Incidence and Mortality of Uranium Miners Study, to investigate the cancer risk associated with radon exposure.
Meynard, Christine N; Migeon, Alain; Navajas, Maria
2013-01-01
Many species are shifting their distributions due to climate change and to increasing international trade that allows dispersal of individuals across the globe. In the case of agricultural pests, such range shifts may heavily impact agriculture. Species distribution modelling may help to predict potential changes in pest distributions. However, these modelling strategies are subject to large uncertainties coming from different sources. Here we used the case of the tomato red spider mite (Tetranychus evansi), an invasive pest that affects some of the most important agricultural crops worldwide, to show how uncertainty may affect forecasts of the potential range of the species. We explored three aspects of uncertainty: (1) species prevalence; (2) modelling method; and (3) variability in environmental responses between mites belonging to two invasive clades of T. evansi. Consensus techniques were used to forecast the potential range of the species under current and two different climate change scenarios for 2080, and variance between model projections were mapped to identify regions of high uncertainty. We revealed large predictive variations linked to all factors, although prevalence had a greater influence than the statistical model once the best modelling strategies were selected. The major areas threatened under current conditions include tropical countries in South America and Africa, and temperate regions in North America, the Mediterranean basin and Australia. Under future scenarios, the threat shifts towards northern Europe and some other temperate regions in the Americas, whereas tropical regions in Africa present a reduced risk. Analysis of niche overlap suggests that the current differential distribution of mites of the two clades of T. evansi can be partially attributed to environmental niche differentiation. Overall this study shows how consensus strategies and analysis of niche overlap can be used jointly to draw conclusions on invasive threat
Statistical analysis of wind speed using two-parameter Weibull distribution in Alaçatı region
International Nuclear Information System (INIS)
Ozay, Can; Celiktas, Melih Soner
2016-01-01
Highlights: • Wind speed and direction data from September 2008 to March 2014 have been analyzed. • The mean wind speed for the whole data set is 8.11 m/s. • The highest wind speed is observed in July, with a monthly mean value of 9.10 m/s. • The wind speed carrying the most energy has been calculated as 12.77 m/s. • The observed data have been fit to a Weibull distribution, and the k and c parameters have been calculated as 2.05 and 9.16. - Abstract: The Weibull statistical distribution is a common method for analyzing wind speed measurements and determining wind energy potential. The Weibull probability density function can be used to forecast wind speed, wind density and wind energy potential. In this study a two-parameter Weibull statistical distribution is used to analyze the wind characteristics of the Alaçatı region, located in Çeşme, İzmir. The data used in the density function are acquired from a wind measurement station in Alaçatı. Measurements were gathered at three heights (70, 50 and 30 m) at 10-min intervals over five and a half years. As a result of this study, the wind speed frequency distribution, wind direction trends, mean wind speed, and the shape and scale (k and c) Weibull parameters have been calculated for the region. The mean wind speed for the entire data set is found to be 8.11 m/s, and the k and c parameters are found to be 2.05 and 9.16, respectively. A wind direction analysis, along with a wind rose graph for the region, is also provided. The analysis suggests that higher wind speeds, ranging from 6 to 12 m/s, are prevalent in the sectors 340-360°, while lower wind speeds, from 3 to 6 m/s, occur in the sectors 10-29°. The results of this study contribute to general knowledge of the region's wind energy potential and can be used as a source by investors and academics.
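The two derived quantities quoted in the abstract follow from standard Weibull wind-energy formulas: the mean speed is c·Γ(1+1/k) and the speed carrying the most energy is c·((k+2)/k)^(1/k). A quick check against the reported values:

```python
import math

def weibull_wind_stats(k, c):
    """Mean wind speed and most-energetic wind speed (m/s) for a
    two-parameter Weibull distribution with shape k and scale c (m/s),
    using the standard wind-energy formulas."""
    v_mean = c * math.gamma(1.0 + 1.0 / k)          # mean of Weibull
    v_max_energy = c * ((k + 2.0) / k) ** (1.0 / k)  # mode of v^3 * pdf
    return v_mean, v_max_energy
```

With the fitted k = 2.05 and c = 9.16, this reproduces the reported 8.11 m/s mean and 12.77 m/s most-energetic speed, which is a useful internal consistency check on the fit.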
International Nuclear Information System (INIS)
Silva Junior, H.C. da.
1978-12-01
Reactor fuel elements generally consist of rod bundles with the coolant flowing axially through the region between the rods. The reliability of the thermohydraulic design of such elements depends on a detailed description of the velocity field. A two-equation statistical model (K-epsilon) of turbulence is applied to compute the main and secondary flow fields, wall shear stress distributions and friction factors of steady, fully developed turbulent flows, with an incompressible, temperature-independent fluid flowing axially through triangular or square arrays of rod bundles. The numerical procedure uses the vorticity and the stream function to describe the velocity field. Comparison with experimental and analytical data of several investigators is presented, and the results are in good agreement. (Author) [pt
Craven, Galen T.; Nitzan, Abraham
2018-01-01
Statistical properties of Brownian motion that arise by analyzing, separately, trajectories over which the system energy increases (upside) or decreases (downside) with respect to a threshold energy level are derived. This selective analysis is applied to examine transport properties of a nonequilibrium Brownian process that is coupled to multiple thermal sources characterized by different temperatures. Distributions, moments, and correlation functions of a free particle that occur during upside and downside events are investigated for energy activation and energy relaxation processes and also for positive and negative energy fluctuations from the average energy. The presented results are sufficiently general and can be applied without modification to the standard Brownian motion. This article focuses on the mathematical basis of this selective analysis. In subsequent articles in this series, we apply this general formalism to processes in which heat transfer between thermal reservoirs is mediated by activated rate processes that take place in a system bridging them.
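The selective upside/downside bookkeeping can be illustrated by splitting an energy time series at a threshold and collecting the durations of each kind of segment. This is only a toy sketch of the selection step; the paper derives distributions, moments, and correlation functions of the two classes analytically.

```python
def upside_downside_durations(energy, threshold):
    """Split an energy trajectory into upside (>= threshold) and downside
    (< threshold) segments; return the lists of segment durations, in
    samples, for each class."""
    up, down = [], []
    cur, above = 0, energy[0] >= threshold
    for e in energy:
        now = e >= threshold
        if now == above:
            cur += 1                      # segment continues
        else:
            (up if above else down).append(cur)  # segment closes
            cur, above = 1, now
    (up if above else down).append(cur)   # close the final segment
    return up, down
```

Statistics computed separately over the `up` and `down` segments correspond to the upside and downside quantities analyzed in the abstract.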
Mwakanyamale, Kisa; Day-Lewis, Frederick D.; Slater, Lee D.
2013-01-01
Fiber-optic distributed temperature sensing (FO-DTS) increasingly is used to map zones of focused groundwater/surface-water exchange (GWSWE). Previous studies of GWSWE using FO-DTS involved identification of zones of focused GWSWE based on arbitrary cutoffs of FO-DTS time-series statistics (e.g., variance, cross-correlation between temperature and stage, or spectral power). New approaches are needed to extract more quantitative information from large, complex FO-DTS data sets while concurrently providing an assessment of the uncertainty associated with mapping zones of focused GWSWE. Toward this end, we present a strategy combining discriminant analysis (DA) and spectral analysis (SA). We demonstrate the approach using field experimental data from a reach of the Columbia River adjacent to the Hanford 300 Area site. Results of the combined SA/DA approach are shown to be superior to previous results from qualitative interpretation of FO-DTS spectra alone.
Directory of Open Access Journals (Sweden)
Kurniasih Anis
2017-01-01
In geology, foraminifera are commonly used to determine the age of rocks and sediments and the depositional environment. In this study, recent foraminifera were used not only to determine the depositional environment but also to estimate the ecological condition of the water through a statistical approach. Quantitative analysis was performed on 10 surface seabed sediment samples from Weda Bay, North Maluku. The analysis included dominance (Simpson index), diversity and evenness (Shannon index), and the planktonic-benthic ratio. The results were plotted on an M-R-T (Miliolid-Rotalid-Textularid) diagram to determine the depositional environment. Quantitative analysis was performed using the PAST (PAleontological STatistics) software, version 1.29. The results showed no dominance of any single taxon, a moderate degree of evenness with stable communities, and moderate diversity, indicating that the study area had stable water conditions with optimum levels of carbonate content, oxygen supply, salinity, and temperature. The planktonic-benthic ratio indicates relative depth: the deeper the water, the higher the percentage of planktonic foraminifera. The M-R-T diagram showed that the sediments were deposited in a carbonate platform environment with normal salinity.
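The dominance, diversity, and evenness measures named in this abstract follow standard definitions; a minimal sketch (the specimen counts are hypothetical, and Pielou's form is assumed for evenness):

```python
import math

def simpson_dominance(counts):
    """Simpson dominance index D = sum(p_i^2); higher D means stronger dominance."""
    n = sum(counts)
    return sum((c / n) ** 2 for c in counts)

def shannon_diversity(counts):
    """Shannon diversity H' = -sum(p_i * ln p_i)."""
    n = sum(counts)
    return -sum((c / n) * math.log(c / n) for c in counts if c > 0)

def shannon_evenness(counts):
    """Pielou evenness J = H' / ln(S), where S is the number of taxa present."""
    s = sum(1 for c in counts if c > 0)
    return shannon_diversity(counts) / math.log(s)

def planktonic_benthic_ratio(planktonic, benthic):
    """Percentage of planktonic specimens, %P = 100 * P / (P + B)."""
    return 100.0 * planktonic / (planktonic + benthic)

counts = [30, 25, 20, 15, 10]          # hypothetical specimen counts per taxon
print(simpson_dominance(counts))       # ≈ 0.225: no single taxon dominates
print(shannon_diversity(counts))
print(shannon_evenness(counts))        # close to 1: frequencies fairly even
```

A high %P from `planktonic_benthic_ratio` would then indicate relatively deep water, as the abstract notes.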
Yan, Wang-Ji; Ren, Wei-Xin
2018-01-01
This study applies the theoretical findings on the circularly-symmetric complex normal ratio distribution of Yan and Ren (2016) [1,2] to transmissibility-based modal analysis from a statistical viewpoint. A probabilistic model of the transmissibility function in the vicinity of the resonant frequency is formulated in the modal domain, and some insightful comments are offered. It theoretically reveals that the statistics of the transmissibility function around the resonant frequency depend solely on the 'noise-to-signal' ratio and the mode shapes. As a sequel to this probabilistic model, the study poses the process of modal identification in a Bayesian framework by borrowing a novel paradigm. Implementation issues unique to the proposed approach are resolved by the Lagrange multiplier approach. The study also explores the possibility of applying Bayesian analysis to distinguish harmonic components from structural ones. The approaches are verified through simulated data and experimental test data. The uncertainty behaviour due to the variation of different factors is also discussed in detail.
Siderius, Daniel W; Mahynski, Nathan A; Shen, Vincent K
2017-05-01
Measurement of the pore-size distribution (PSD) via gas adsorption and the so-called "kernel method" is a widely used characterization technique for rigid adsorbents. Yet, standard techniques and analytical equipment are not appropriate to characterize the emerging class of flexible adsorbents that deform in response to the stress imparted by an adsorbate gas, as the PSD is a characteristic of the material that varies with the gas pressure and any other external stresses. Here, we derive the PSD for a flexible adsorbent using statistical mechanics in the osmotic ensemble to draw analogy to the kernel method for rigid materials. The resultant PSD is a function of the ensemble constraints including all imposed stresses and, most importantly, the deformation free energy of the adsorbent material. Consequently, a pressure-dependent PSD is a descriptor of the deformation characteristics of an adsorbent and may be the basis of future material characterization techniques. We discuss how, given a technique for resolving pressure-dependent PSDs, the present statistical mechanical theory could enable a new generation of analytical tools that measure and characterize certain intrinsic material properties of flexible adsorbents via otherwise simple adsorption experiments.
Validity of the formal Edgeworth expansion when the underlying distribution is partly discrete
DEFF Research Database (Denmark)
Jensen, J.L.
1989-01-01
Validity of the formal Edgeworth expansion for the distribution of the statistic √n g(Xn/n, Yn/n) is considered. Here Xn is a continuous variate and Yn is a discrete variate. In general, if (Xn, Yn) resemble the sum of i.i.d. variables and the partial derivative of g with respect to the first variable has full rank, it is possible to establish an Edgeworth expansion. © 1989 Springer-Verlag.
PERFORMANCE ANALYSIS OF A MODIFIED CFAR BASED RADAR DETECTOR UNDER PEARSON DISTRIBUTED CLUTTER
Amritakar Mandal; Rajesh Mishra; Brajesh Kumar Kaushik
2014-01-01
An adaptive target detector in a radar system is used to extract targets from the background in a noisy environment of unknown statistics. The constant false alarm rate (CFAR) detector is a well-known detection algorithm used in almost every modern radar. The cell-averaging (CA) CFAR is the optimum detector in a homogeneous clutter environment when the reference cells contain independent, identically and exponentially distributed signals. The performance of CA-CFAR degrades seriously when the clutter power substan...
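For exponentially distributed reference-cell power, the cell-averaging CFAR scaling factor that holds the false-alarm probability constant has a standard closed form; a sketch (the abstract gives no detector parameters, so the reference-window size and Pfa below are illustrative):

```python
def ca_cfar_threshold(reference_cells, pfa):
    """Cell-averaging CFAR: threshold = alpha * mean reference-cell power.
    For exponential (square-law-detected Gaussian) clutter with N reference
    cells, alpha = N * (pfa**(-1/N) - 1) keeps the false-alarm rate constant."""
    n = len(reference_cells)
    alpha = n * (pfa ** (-1.0 / n) - 1.0)
    return alpha * sum(reference_cells) / n

def detect(cell_under_test, reference_cells, pfa=1e-4):
    """Declare a target when the cell under test exceeds the adaptive threshold."""
    return cell_under_test > ca_cfar_threshold(reference_cells, pfa)
```

With 16 reference cells and Pfa = 1e-4, alpha is about 12.45, so a target must exceed roughly 12.5 times the estimated clutter power to be declared.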
International Nuclear Information System (INIS)
Zheng, Feihu; An, Zhenlian; Zhang, Yewen; Liu, Chuandong; Lin, Chen; Lei, Qingquan
2013-01-01
The thermal pulse method is a powerful method for measuring space charge and polarization distributions in thin dielectric films, but a complicated calibration procedure is necessary to obtain the real distribution. In addition, charge dynamic behaviour under an applied electric field cannot be observed by the classical thermal pulse method. In this work, an improved thermal pulse measuring system with a supplemental circuit for applying high voltage is proposed to realize the mapping of charge distribution in thin dielectric films under an applied field. The influence of the modified measuring system on the amplitude and phase of the thermal pulse response current is evaluated. Based on the new measuring system, an easy calibration approach is presented with some practical examples. The newly developed system can observe space charge evolution under an applied field, which would be very helpful in understanding space charge behaviour in thin films. (paper)
Kuss, Oliver; Hoyer, Annika; Solms, Alexander
2014-01-15
There are still challenges when meta-analyzing data from studies on diagnostic accuracy. This is mainly due to the bivariate nature of the response, where information on sensitivity and specificity must be summarized while accounting for their correlation within a single trial. In this paper, we propose a new statistical model for the meta-analysis of diagnostic accuracy studies. This model uses beta-binomial distributions for the marginal numbers of true positives and true negatives and links these margins by a bivariate copula distribution. The new model comes with all the features of the current standard model, a bivariate logistic regression model with random effects, but has the additional advantages of a closed likelihood function and greater flexibility for the correlation structure of sensitivity and specificity. In a simulation study comparing three copula models and two implementations of the standard model, the Plackett and the Gauss copula rarely perform worse and frequently perform better than the standard model. For illustration, we use an example from a meta-analysis judging the diagnostic accuracy of telomerase (a urinary tumor marker) for the diagnosis of primary bladder cancer. Copyright © 2013 John Wiley & Sons, Ltd.
Demin, V F; Pal'tsev, M A; Chaban, E A
2013-01-01
The current European standard (CES) and the World population age distribution standard are widely used in medical and demographic studies performed by international (e.g., WHO) and national organizations. The Russian Federal State Statistics Service (RosStat) uses the CES in demographic yearbooks and other publications. The standard is applied in calculating the standardized mortality rate (SMR) of populations in different countries and territories, and the CES is also used in risk assessment. The standards are based on the idea of assessing mortality against a uniform standard, making it possible to compare mortality rates across countries and regions, between genders, and across calendar years. Analysis of the results of test calculations of SMR values for the populations of Russia and other countries using the current standards has revealed serious shortcomings of the latter and set the task of improving them. A new concept for developing standards is proposed, based on the notion of a stable equilibrium age distribution of the population and the survivorship function.
Thériault Lauzier, Pascal; Tang, Jie; Chen, Guang-Hong
2012-03-01
Myocardial perfusion scans are an important tool in the assessment of myocardial viability following an infarction. Cardiac perfusion analysis using CT datasets is limited by the presence of so-called partial scan artifacts. These artifacts are due to variations in beam hardening and scatter between different short-scan angular ranges. In this research, another angular range dependent effect is investigated: non-uniform noise spatial distribution. Images reconstructed using filtered backprojection (FBP) are subject to this effect. Statistical image reconstruction (SIR) is proposed as a potential solution. A numerical phantom with added Poisson noise was simulated and two swines were scanned in vivo to study the effect of FBP and SIR on the spatial uniformity of the noise distribution. It was demonstrated that images reconstructed using FBP often show variations in noise on the order of 50% between different time frames. This variation is mitigated to about 10% using SIR. The noise level is also reduced by a factor of 2 in SIR images. Finally, it is demonstrated that the measurement of quantitative perfusion metrics are generally more accurate when SIR is used instead of FBP.
Distribution of the two-sample t-test statistic following blinded sample size re-estimation.
Lu, Kaifeng
2016-05-01
We consider the blinded sample size re-estimation based on the simple one-sample variance estimator at an interim analysis. We characterize the exact distribution of the standard two-sample t-test statistic at the final analysis. We describe a simulation algorithm for the evaluation of the probability of rejecting the null hypothesis at given treatment effect. We compare the blinded sample size re-estimation method with two unblinded methods with respect to the empirical type I error, the empirical power, and the empirical distribution of the standard deviation estimator and final sample size. We characterize the type I error inflation across the range of standardized non-inferiority margin for non-inferiority trials, and derive the adjusted significance level to ensure type I error control for given sample size of the internal pilot study. We show that the adjusted significance level increases as the sample size of the internal pilot study increases. Copyright © 2016 John Wiley & Sons, Ltd.
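The simulation idea described — a blinded variance estimate from the pooled internal pilot, sample size re-computation, then the final two-sample test — can be sketched as follows. This is not the paper's exact algorithm: normal critical values stand in for exact t quantiles, and all constants are illustrative:

```python
import math
import random
import statistics

def blinded_reestimate_n(pilot, delta, z_alpha=1.96, z_power=0.84):
    """Blinded re-estimation: the variance is computed from the pooled pilot
    data with arm labels ignored (the simple one-sample variance estimator),
    then plugged into the usual per-arm sample size formula."""
    s2 = statistics.variance(pilot)
    n = 2.0 * s2 * (z_alpha + z_power) ** 2 / delta ** 2
    return max(len(pilot) // 2, math.ceil(n))

def simulate_rejection(mu_diff, sigma, n_pilot, delta, reps=2000, seed=1):
    """Empirical probability that the final two-sample test rejects H0."""
    rng = random.Random(seed)
    rejections = 0
    for _ in range(reps):
        a = [rng.gauss(0.0, sigma) for _ in range(n_pilot // 2)]
        b = [rng.gauss(mu_diff, sigma) for _ in range(n_pilot // 2)]
        n = blinded_reestimate_n(a + b, delta)      # blinded interim look
        a += [rng.gauss(0.0, sigma) for _ in range(n - len(a))]
        b += [rng.gauss(mu_diff, sigma) for _ in range(n - len(b))]
        se = math.sqrt(statistics.variance(a) / len(a)
                       + statistics.variance(b) / len(b))
        t = (statistics.mean(b) - statistics.mean(a)) / se
        rejections += abs(t) > 1.96                 # normal approximation
    return rejections / reps
```

`simulate_rejection(0.0, 1.0, 40, 0.5)` approximates the empirical type I error; setting `mu_diff` equal to `delta` approximates the empirical power.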
Lin, Y. Q.; Ren, W. X.; Fang, S. E.
2011-11-01
Although most vibration-based damage detection methods can acquire satisfactory verification on analytical or numerical structures, most of them may encounter problems when applied to real-world structures under varying environments. The damage detection methods that directly extract damage features from the periodically sampled dynamic time history response measurements are desirable but relevant research and field application verification are still lacking. In this second part of a two-part paper, the robustness and performance of the statistics-based damage index using the forward innovation model by stochastic subspace identification of a vibrating structure proposed in the first part have been investigated against two prestressed reinforced concrete (RC) beams tested in the laboratory and a full-scale RC arch bridge tested in the field under varying environments. Experimental verification is focused on temperature effects. It is demonstrated that the proposed statistics-based damage index is insensitive to temperature variations but sensitive to the structural deterioration or state alteration. This makes it possible to detect the structural damage for the real-scale structures experiencing ambient excitations and varying environmental conditions.
Liu, Daijun; Peñuelas, Josep; Ogaya, Romà; Estiarte, Marc; Tielbörger, Katja; Slowik, Fabian; Yang, Xiaohong; Bilton, Mark C
2018-03-01
Global warming and reduced precipitation may trigger large-scale species losses and vegetation shifts in ecosystems around the world. However, currently lacking are practical ways to quantify the sensitivity of species and community composition to these often-confounded climatic forces. Here we conducted long-term (16 yr) nocturnal-warming (+0.6°C) and reduced precipitation (-20% soil moisture) experiments in a Mediterranean shrubland. Climatic niche groups (CNGs) - species ranked or classified by similar temperature or precipitation distributions - informatively described community responses under experimental manipulations. Under warming, CNGs revealed that only those species distributed in cooler regions decreased. Correspondingly, under reduced precipitation, a U-shaped treatment effect observed in the total community was the result of an abrupt decrease in wet-distributed species, followed by a delayed increase in dry-distributed species. Notably, while partially correlated, CNG explanations of community response were stronger for their respective climate parameter, suggesting some species possess specific adaptations to either warming or drought that may lead to independent selection to the two climatic variables. Our findings indicate that when climatic distributions are combined with experiments, the resulting incorporation of local plant evolutionary strategies and their changing dynamics over time leads to predictable and informative shifts in community structure under independent climate change scenarios. © 2017 The Authors. New Phytologist © 2017 New Phytologist Trust.
Discontinuous pore fluid distribution under microgravity--KC-135 flight investigations
Reddi, Lakshmi N.; Xiao, Ming; Steinberg, Susan L.
2005-01-01
Designing a reliable plant growth system for crop production in space requires the understanding of pore fluid distribution in porous media under microgravity. The objective of this experimental investigation, which was conducted aboard NASA KC-135 reduced gravity flight, is to study possible particle separation and the distribution of discontinuous wetting fluid in porous media under microgravity. KC-135 aircraft provided gravity conditions of 1, 1.8, and 10^-2 g. Glass beads of a known size distribution were used as porous media; and Hexadecane, a petroleum compound immiscible with and lighter than water, was used as wetting fluid at residual saturation. Nitrogen freezer was used to solidify the discontinuous Hexadecane ganglia in glass beads to preserve the ganglia size changes during different gravity conditions, so that the blob-size distributions (BSDs) could be measured after flight. It was concluded from this study that microgravity has little effect on the size distribution of pore fluid blobs corresponding to residual saturation of wetting fluids in porous media. The blobs showed no noticeable breakup or coalescence during microgravity. However, based on the increase in bulk volume of samples due to particle separation under microgravity, groups of particles, within which pore fluid blobs were encapsulated, appeared to have rearranged themselves under microgravity.
Texeira, Antonio; Rosa, Alvaro; Calapez, Teresa
2009-01-01
This article presents statistical power analysis (SPA) based on the normal distribution using Excel, adopting textbook and SPA approaches. The objective is to present the latter in a comparative way within a framework that is familiar to textbook level readers, as a first step to understand SPA with other distributions. The analysis focuses on the…
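A textbook-style power calculation based on the normal distribution, of the kind the article carries out in Excel, can equally be sketched in Python (the one-sample two-sided z-test case; all numbers are illustrative):

```python
from statistics import NormalDist

def power_one_sample_z(mu0, mu1, sigma, n, alpha=0.05):
    """Power of the two-sided one-sample z-test of H0: mu = mu0 when the
    true mean is mu1, with known sigma and sample size n."""
    z = NormalDist()
    crit = z.inv_cdf(1.0 - alpha / 2.0)       # rejection threshold on z-scale
    shift = (mu1 - mu0) * n ** 0.5 / sigma    # non-centrality of the test statistic
    return z.cdf(-crit - shift) + 1.0 - z.cdf(crit - shift)
```

For example, detecting a shift of half a standard deviation with n = 32 gives power of roughly 0.81; with mu1 = mu0 the function returns exactly the significance level.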
ATLAS, Collaboration
2013-01-01
Expected distributions of the test statistics q=log(L(0^+)/L(2^+)) for the spin-0 and spin-2 (produced by gluon fusion) hypotheses. The observed value is indicated by a vertical line. The coloured areas correspond to the integrals of the expected distributions used to compute the p-values for the rejection of each hypothesis.
Hotspot detection using space-time scan statistics on children under five years of age in Depok
Verdiana, Miranti; Widyaningsih, Yekti
2017-03-01
Among the problems affecting health levels in Depok are persistently high malnutrition rates and the growing spread of infectious and non-communicable diseases in some areas. Children under five years old are a part of the population vulnerable to malnutrition and disease. It is therefore important to identify where and when malnutrition in Depok occurred with high intensity. To locate such hotspots of malnutrition and of diseases that attack children under five, the space-time scan statistic can be used. The space-time scan statistic is a hotspot detection method in which spatial and temporal information are taken into account simultaneously. The method scans with a cylindrical window: the cylinder's base describes the area and its height the time period. Each cylinder formed is a candidate hotspot, and hypothesis testing determines whether a cylinder can be declared a hotspot. Hotspot detection in this study was carried out for several combinations of variables. Some combinations of variables gave similar detection results and thus formed groups (clusters). For the level of child health in Depok city, the Beji health care center region was a hotspot in 2011-2012; across the variable combinations used, Beji was most frequently detected as a hotspot. It is hoped that the local government can adopt the right policies to improve the health of children under five in the city of Depok.
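A minimal sketch of a space-time scan in the spirit described (cylindrical windows scored with the Poisson likelihood ratio, as in Kulldorff's scan statistic). The Monte Carlo significance step the method requires is omitted, and single-area cylinder bases are a simplification — real scans grow circular bases over many areas:

```python
import math

def poisson_llr(c, e_c, c_total):
    """Log likelihood ratio for a cylinder with c observed and e_c expected
    cases, out of c_total cases overall (Kulldorff's Poisson scan statistic);
    only excesses (c > e_c) count as hotspot candidates."""
    if c <= e_c:
        return 0.0
    rest_obs, rest_exp = c_total - c, c_total - e_c
    return c * math.log(c / e_c) + rest_obs * math.log(rest_obs / rest_exp)

def scan(cases, expected):
    """cases[a][t] and expected[a][t] index areas a and time periods t.
    Scan every (area, time-window) cylinder and return the best-scoring one."""
    c_total = sum(map(sum, cases))
    best = (0.0, None)
    n_t = len(cases[0])
    for a in range(len(cases)):
        for t0 in range(n_t):
            for t1 in range(t0, n_t):
                c = sum(cases[a][t0:t1 + 1])
                e = sum(expected[a][t0:t1 + 1])
                llr = poisson_llr(c, e, c_total)
                if llr > best[0]:
                    best = (llr, (a, t0, t1))
    return best
```

Feeding in, say, malnutrition counts per health-center region per year would return the region and time window with the strongest case excess, such as the Beji region in 2011-2012.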
A novel stress distribution analytical model of O-ring seals under different properties of materials
Energy Technology Data Exchange (ETDEWEB)
Wu, Di; Wang, Shao Ping; Wang, Xing Jian [School of Automation Science and Electrical Engineering, Beihang University, Beijing (China)
2017-01-15
Elastomeric O-ring seals are widely used as sealing elements in hydraulic systems. The sealing performance of O-ring seals is related to the stress distribution, which depends on the squeeze rate and internal pressure and varies with the material properties of the O-ring. Thus, in order to study the sealing performance of O-ring seals, it is necessary to describe the analytical relationship between the stress distribution and the material properties. For this purpose, a novel stress distribution analytical model (SDAM) is proposed in this paper. The analytical model utilizes two stress complex functions to describe the stress distribution of O-ring seals. The proposed SDAM can express the analytical relationship between the stress distribution and not only Young's modulus but also Poisson's ratio. Finally, comparison between finite element analysis results and the SDAM validates that the proposed model can effectively reveal the stress distribution under different material properties of O-rings.
Pattison, D
1994-01-01
The 1993 Omnibus Budget Reconciliation Act raised the proportion of benefits includable in income for the Federal personal income tax. This article presents estimates of the income-distributional effects of the new provision in 1994, the first year for which it is effective. Under the pre-1993 law, up to 50 percent of benefits were included in taxable income for certain high-income beneficiaries. Under the new law, some of these beneficiaries are required to include an even higher proportion of benefits--up to 85 percent. Only 11 percent of beneficiary families, concentrated in the top three deciles by family income, include more of their benefits in taxable income under the new law than they would have under the old law. Another 8 percent include the same amount of benefits under either. The remaining beneficiary families, more than 80 percent, include no benefits in taxable income under either the old law or the new.
DEFF Research Database (Denmark)
Boe-Hansen, Rasmus; Martiny, Adam Camillo; Arvin, Erik
2003-01-01
In this study, the construction of a model distribution system suitable for studies of attached and suspended microbial activity in drinking water under controlled circumstances is outlined. The model system consisted of two loops connected in series with a total of 140 biofilm sampling points...
Robustness of the Drinking Water Distribution Network under Changing Future Demand
Agudelo-Vera, C.; Blokker, M.; Vreeburg, J.; Bongard, T.; Hillegers, S.; Van der Hoek, J.P.
2014-01-01
A methodology to determine the robustness of the drinking water distribution system is proposed. The performance of three networks under ten future demand scenarios was tested, using head loss and residence time as indicators. The scenarios consider technological and demographic changes. Daily
Probabilistic accounting of uncertainty in forecasts of species distributions under climate change
Seth J. Wenger; Nicholas A. Som; Daniel C. Dauwalter; Daniel J. Isaak; Helen M. Neville; Charles H. Luce; Jason B. Dunham; Michael K. Young; Kurt D. Fausch; Bruce E. Rieman
2013-01-01
Forecasts of species distributions under future climates are inherently uncertain, but there have been few attempts to describe this uncertainty comprehensively in a probabilistic manner. We developed a Monte Carlo approach that accounts for uncertainty within generalized linear regression models (parameter uncertainty and residual error), uncertainty among competing...
Testing the robustness of two water distribution system layouts under changing drinking water demand
Agudelo-Vera, Claudia; Blokker, M; Vreeburg, J; Vogelaar, H.; Hillegers, S; van der Hoek, J.P.
2016-01-01
A drinking water distribution system (DWDS) is a critical and a costly asset with a long lifetime. Drinking water demand is likely to change in the coming decades. Quantifying these changes involves large uncertainties. This paper proposes a stress test on the robustness of existing DWDS under
Void fraction distribution in a heated rod bundle under flow stagnation conditions
Energy Technology Data Exchange (ETDEWEB)
Herrero, V.A.; Guido-Lavalle, G.; Clausse, A. [Centro Atomico Bariloche and Instituto Balseiro, Bariloche (Argentina)
1995-09-01
An experimental study was performed to determine the axial void fraction distribution along a heated rod bundle under flow stagnation conditions. The development of the flow pattern was investigated for different heat flow rates. It was found that in general the void fraction is overestimated by the Zuber & Findlay model while the Chexal-Lellouche correlation produces a better prediction.
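The Zuber & Findlay model against which the measurements were compared is the drift-flux relation alpha = jg / (C0·(jg + jf) + Vgj); a sketch with illustrative parameters (the abstract does not give the values used):

```python
def zuber_findlay_void_fraction(jg, jf, c0=1.13, vgj=0.23):
    """Drift-flux void fraction: alpha = jg / (C0*(jg + jf) + Vgj), where
    jg and jf are the gas and liquid superficial velocities (m/s).
    C0 (distribution parameter) and Vgj (drift velocity, m/s) are
    flow-regime dependent; the defaults here are illustrative churn-turbulent
    values, not taken from the paper."""
    return jg / (c0 * (jg + jf) + vgj)
```

Because C0 > 1 and Vgj > 0, the predicted alpha is always below the homogeneous value jg/(jg + jf); the study found that even this prediction generally overestimated the measured void fraction under flow stagnation.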
MHC allele frequency distributions under parasite-driven selection: A simulation model
Directory of Open Access Journals (Sweden)
Radwan Jacek
2010-10-01
Abstract Background The extreme polymorphism observed in major histocompatibility complex (MHC) genes, which code for proteins involved in the recognition of non-self oligopeptides, is thought to result from pressure exerted by parasites, because parasite antigens are more likely to be recognized by MHC heterozygotes (heterozygote advantage) and/or by rare MHC alleles (negative frequency-dependent selection). The Ewens-Watterson test (EW) is often used to detect selection acting on MHC genes over the recent history of a population. EW is based on the expectation that allele frequencies under balancing selection should be more even than under neutrality. We used computer simulations to investigate whether this expectation holds for selection exerted by parasites on host MHC genes under conditions of heterozygote advantage and negative frequency-dependent selection acting either simultaneously or separately. Results In agreement with simple models of symmetrical overdominance, we found that heterozygote advantage acting alone in populations does, indeed, result in more even allele frequency distributions than expected under neutrality, and this is easily detectable by EW. However, under negative frequency-dependent selection, or under the joint action of negative frequency-dependent selection and heterozygote advantage, distributions of allele frequencies were less predictable: the majority of distributions were indistinguishable from neutral expectations, while the remaining runs resulted in either more even or more skewed distributions than under neutrality. Conclusions Our results indicate that, as long as negative frequency-dependent selection is an important force maintaining MHC variation, the EW test has limited utility in detecting selection acting on these genes.
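The heterozygote-advantage mechanism can be illustrated with a toy multi-allele Wright-Fisher simulation (not the authors' model; all parameters are arbitrary): under symmetric overdominance, rare alleles have higher marginal fitness, which keeps frequencies even and homozygosity F = sum(p_i^2) low relative to neutral drift — the pattern the EW test looks for.

```python
import random

def wright_fisher(n_alleles, pop_size, gens, s, rng):
    """Multi-allele Wright-Fisher sketch with symmetric overdominance:
    the marginal fitness of allele i, w_i = 1 + s*(1 - p_i), is higher
    when p_i is small, so selection evens out allele frequencies.
    s = 0 gives pure neutral drift."""
    p = [1.0 / n_alleles] * n_alleles
    for _ in range(gens):
        w = [1.0 + s * (1.0 - pi) for pi in p]
        wbar = sum(pi * wi for pi, wi in zip(p, w))
        probs = [pi * wi / wbar for pi, wi in zip(p, w)]
        # resample 2N allele copies for the next generation
        draws = rng.choices(range(n_alleles), weights=probs, k=2 * pop_size)
        p = [draws.count(i) / (2 * pop_size) for i in range(n_alleles)]
    return p

def homozygosity(p):
    """Expected homozygosity F = sum(p_i^2); lower F = more even frequencies."""
    return sum(pi ** 2 for pi in p)
```

Running both regimes from the same starting point, the selected population stays near the minimum F = 1/k while the neutral one drifts toward skewed frequencies and higher F.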
Fronstin, Paul
2015-07-01
The Employee Benefit Research Institute (EBRI) maintains a wealth of data collected from various health savings account (HSA) providers. The EBRI HSA Database contains 2.9 million accounts with total assets of $5 billion as of Dec. 31, 2014. This Issue Brief is the second annual report drawing on cross-sectional data from the EBRI HSA Database. It examines account balances, individual and employer contributions, annual distributions, investment accounts, and account-owner demographics for 2014. Enrollment in HSA-eligible health plans is estimated to be about 17 million policyholders and their dependents, and it has also been estimated that there are 13.8 million accounts holding $24.2 billion in assets as of Dec. 31, 2014. Almost 4 in 5 HSAs have been opened since the beginning of 2011. The average HSA balance at the end of 2014 was $1,933, up from $1,408 at the beginning of the year. Average account balances increased with the age of the owner of the account. Account balances averaged $655 for owners under age 25 and $5,016 for owners ages 65 and older. About 6 percent of HSAs had an associated investment account. End-of-year 2014 balance averages were higher in accounts with investment assets. Thirty-seven percent of HSAs with investment assets ended 2014 with a balance of $10,000 or more, whereas only 4 percent of HSAs without investment assets had such a balance. Among HSAs with investment assets, accounts opened in 2014 ended the year with an average balance of $6,544; whereas those opened in 2005 had an average balance of $19,269 at the end of 2014. HSAs with either individual or employer contributions accounted for 70 percent of all accounts and 86 percent of the assets in 2014. Four percent of these accounts ended the year with a zero balance. On a yearly average, individuals who made contributions deposited $2,096 to their account. HSAs receiving employer contributions received $1,021 a year, on average. Four-fifths of HSAs with a contribution also had a
Davis, Joe M; Arriaga, Edgar A
2010-01-01
The separation of organelles by capillary electrophoresis (CE) produces large numbers of narrow peaks, which commonly are assumed to originate from single particles. In this paper, we show this is not always true. Here, we use established methods to partition simulated and real organelle CEs into regions of constant peak density and then use statistical-overlap theory to calculate the number of peaks (single particles) in each region. The only required measurements are the number of observed peaks (maxima) and peak standard deviation in the regions and the durations of the regions. Theory is developed for the precision of the estimated peak number and the threshold saturation above which the calculation is not advisable due to fluctuation of peak numbers. Theory shows that the relative precision is good when the saturation lies between 0.2 and 1.0 and is optimal when the saturation is slightly greater than 0.5. It also shows the threshold saturation depends on the peak standard deviation divided by the region's duration. The accuracy and precision of peak numbers estimated in different regions of organelle CEs are verified by computer simulations having both constant and nonuniform peak densities. The estimates are accurate to 6%. The estimated peak numbers in different regions are used to calculate migration-time and electrophoretic-mobility distributions. These distributions are less biased by peak overlap than ones determined by counting maxima and provide more correct measures of the organelle properties. The procedure is applied to a mitochondrial CE, in which over 20% of peaks are hidden by peak overlap.
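The inversion step described — recovering the number of underlying single-particle peaks from the observed maxima — can be sketched with the Davis-Giddings statistical-overlap relation p = m·exp(-alpha), with saturation alpha = m·x0/X. The choice x0 = 4·sigma for the minimum resolvable spacing is an assumption made here for illustration, not a value taken from the paper:

```python
import math

def components_from_maxima(p_obs, sigma, duration, x0_factor=4.0):
    """Invert p = m * exp(-m * x0 / X) for the number of underlying
    single-particle peaks m, given p_obs observed maxima in a region of
    length `duration` (X) with peak standard deviation `sigma`.
    x0 = x0_factor * sigma is the assumed minimum resolvable spacing;
    the low-saturation branch (alpha <= 1) is taken, matching the range
    where the estimate is reported to be precise."""
    x0 = x0_factor * sigma
    f = lambda m: m * math.exp(-m * x0 / duration)
    lo, hi = p_obs, duration / x0           # f is increasing on [0, X/x0]
    if f(hi) < p_obs:
        raise ValueError("observed maxima exceed the low-saturation branch")
    for _ in range(100):                    # bisection on the increasing branch
        mid = (lo + hi) / 2.0
        lo, hi = (mid, hi) if f(mid) < p_obs else (lo, mid)
    m = (lo + hi) / 2.0
    return m, m * x0 / duration             # estimate and its saturation alpha
```

For example, 50 observed maxima in a region of 400 s with sigma = 0.5 s yields roughly 72 underlying peaks at a saturation near 0.36, inside the 0.2-1.0 window where the theory's precision is good.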
Directory of Open Access Journals (Sweden)
SANKU DEY
2010-11-01
The generalized exponential (GE) distribution proposed by Gupta and Kundu (1999) is an important lifetime distribution in survival analysis. In this article, we propose to obtain Bayes estimators and their associated risks based on a class of non-informative priors under three loss functions, namely, the quadratic loss function (QLF), the squared log-error loss function (SLELF), and the general entropy loss function (GELF). The motivation is to explore the most appropriate loss function among these three. The performances of the estimators are therefore compared on the basis of their risks obtained under QLF, SLELF, and GELF separately. The relative efficiency of the estimators is also obtained. Finally, Monte Carlo simulations are performed to compare the performances of the Bayes estimates under different situations.
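Under the three loss functions, the Bayes estimators have standard closed forms in terms of posterior expectations: QLF gives the posterior mean, SLELF gives exp(E[log theta]), and GELF with shape k gives (E[theta^-k])^(-1/k). A sketch that evaluates all three from posterior draws (the Gamma posterior below is purely illustrative, not the GE posterior of the paper):

```python
import math
import random
import statistics

def bayes_estimates(posterior_draws, k=1.0):
    """Bayes estimators for a positive parameter from posterior draws:
    QLF   -> posterior mean,
    SLELF -> exp(E[log theta]),
    GELF  -> (E[theta**-k])**(-1/k), with k the GELF shape (k=1 assumed)."""
    qlf = statistics.mean(posterior_draws)
    slelf = math.exp(statistics.mean(math.log(t) for t in posterior_draws))
    gelf = statistics.mean(t ** -k for t in posterior_draws) ** (-1.0 / k)
    return qlf, slelf, gelf
```

By Jensen's inequality the three always order as GELF <= SLELF <= QLF (for k = 1), which is why the choice of loss function materially shifts the point estimate.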
The flow distribution in the parallel tubes of the cavity receiver under variable heat flux
International Nuclear Information System (INIS)
Hao, Yun; Wang, Yueshe; Hu, Tian
2016-01-01
Highlights: • An experimental loop is built to find the flow distribution in the parallel tubes. • With the concentration of heat flux, two-phase flow makes distribution more uneven. • The total flow rate is chosen appropriately for a wider heat flux distribution. • A suitable system pressure is essential for the optimization of flow distribution. - Abstract: As an optical component of a tower solar thermal power station, the heliostat mirror reflects sunlight onto one point of the heated surface in the solar cavity receiver, known as a one-point focusing system. The radiation heat flux concentrated in the cavity receiver is always non-uniform temporally and spatially, which may lead to extreme local overheating on the receiver evaporation panels. In this paper, an electrically heated evaporating experimental loop, including five parallel vertical tubes, is set up to evaluate the hydrodynamic characteristics of evaporation panels in a solar cavity receiver under various non-uniform heat fluxes. The influence of the heat flux concentration ratio, total flow rate, and system pressure on the flow distribution among the parallel tubes is discussed. It is found that the flow distribution becomes significantly worse with increasing heat flux and concentration ratio, and that the flow distribution improves as the system pressure decreases. These findings are important for the safe and stable operation of the solar cavity receiver and can also provide valuable references for the design and optimization of the operating parameters of a solar tower power station system.
Taggart, T. P.; Endreny, T. A.; Nowak, D.
2014-12-01
Gray and green infrastructure in urban environments alters many natural hydrologic processes, creating an urban water balance unique to the developed environment. A common way to assess the consequences of impervious cover and gray infrastructure is by measuring runoff hydrographs. This focus on the watershed outlet masks the spatial variation of hydrologic process alterations across the urban environment in response to localized landscape characteristics. We attempt to represent this spatial variation in the urban environment using the statistically and spatially distributed i-Tree Hydro model, a scoping-level urban forest effects water balance model. i-Tree Hydro has undergone expansion and modification to include the effects of green infrastructure processes, road network attributes, and urban pipe system leakages. These additions to the model are intended to increase understanding of the altered urban hydrologic cycle by examining the effects of the location of these structures on the water balance, specifically on the spatially varying properties of interception, soil moisture and runoff generation. Differences in predicted properties and optimized parameter sets between the two models are examined and related to the recent landscape modifications. Datasets used in this study consist of watersheds and sewersheds within the Syracuse, NY metropolitan area, an urban area that has integrated green and gray infrastructure practices to alleviate stormwater problems.
Directory of Open Access Journals (Sweden)
Stefanov Valeri T
2002-05-01
Full Text Available Abstract Background Pairs of related individuals are widely used in linkage analysis. Most of the tests for linkage analysis are based on statistics associated with identity by descent (IBD) data. The current biotechnology provides data on very densely packed loci, and therefore, it may provide almost continuous IBD data for pairs of closely related individuals. Therefore, the distribution theory for statistics on continuous IBD data is of interest. In particular, distributional results which allow the evaluation of p-values for relevant tests are of importance. Results A technology is provided for numerical evaluation, with any given accuracy, of the cumulative probabilities of some statistics on continuous genome data for pairs of closely related individuals. In the case of a pair of full-sibs, the following statistics are considered: (i) the proportion of genome with 2 (at least 1) haplotypes shared identical-by-descent (IBD) on a chromosomal segment, (ii) the number of distinct pieces (subsegments) of a chromosomal segment, on each of which exactly 2 (at least 1) haplotypes are shared IBD. The natural counterparts of these statistics for the other relationships are also considered. Relevant Maple codes are provided for a rapid evaluation of the cumulative probabilities of such statistics. The genomic continuum model, with Haldane's model for the crossover process, is assumed. Conclusions A technology, together with relevant software codes for its automated implementation, is provided for exact evaluation of the distributions of relevant statistics associated with continuous genome data on closely related individuals.
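The paper's statistics are defined on the genomic continuum model with Haldane's crossover process. A quick way to build intuition for statistic (i) — not the paper's exact numerical Maple evaluation, but a Monte Carlo sketch under the same model assumptions — is to simulate the four grandparental-origin processes of a full-sib pair along a chromosome, switching origin with the Haldane recombination fraction per step:

```python
import math
import random

def sib_ibd_proportion(length_morgans=1.0, nloci=200, rng=None):
    """Proportion of a chromosome on which full sibs share >= 1 haplotype IBD.

    Each grandparental-origin indicator is a two-state Markov chain along the
    chromosome; under Haldane's model the recombination fraction for a step of
    d Morgans is (1 - exp(-2d)) / 2.
    """
    rng = rng or random.Random()
    step = length_morgans / (nloci - 1)
    r = 0.5 * (1.0 - math.exp(-2.0 * step))
    # Four independent origin processes: (sib1, sib2) x (paternal, maternal)
    origins = [rng.randint(0, 1) for _ in range(4)]
    shared = 0
    for _ in range(nloci):
        # Sibs share a parent's haplotype IBD iff they inherited the same origin
        if origins[0] == origins[1] or origins[2] == origins[3]:
            shared += 1
        origins = [o ^ (rng.random() < r) for o in origins]
    return shared / nloci

rng = random.Random(7)
mean_prop = sum(sib_ibd_proportion(rng=rng) for _ in range(2000)) / 2000
```

At any single locus, P(full sibs share at least 1 haplotype IBD) = 3/4, so the simulated mean proportion should sit near 0.75; the distribution *around* that mean is exactly what the paper evaluates.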
Wang, Yongli; Wang, Gang; Zuo, Yi; Fan, Lisha; Wei, Jiaxiang
2017-03-01
On March 15, 2015, the central office issued the "Opinions on Further Deepening the Reform of the Electric Power System" (Document No. 9 of 2015). This policy marks the official launch of a new round of electricity reform by the central government. As a programmatic document for comprehensively promoting power system reform under the new situation, Document No. 9 identifies the separate approval of transmission and distribution electricity prices as the first task in promoting the reform of the power system. Grid tariff reform involves not only the separate approval of transmission and distribution prices, but also deep adjustments to grid companies' input-output relationships and many other aspects. Against the background of the reform of transmission and distribution prices, the main factors affecting the input-output relationship, such as the main business, electricity pricing, investment approval, and financial accounting, have changed significantly. This paper designs a comprehensive evaluation index system for the credit rating of power grid enterprises under the reform of transmission and distribution prices, in order to reduce the impact of the reform on companies' international rating results and their ability to raise funds.
Ge, Xuezhen; Jiang, Chao; Chen, Linghong; Qiu, Shuang; Zhao, Yuxiang; Wang, Tao; Zong, Shixiang
2017-04-19
Euwallacea fornicatus (Eichhoff) is an important forest pest that has caused serious damage in America and Vietnam. In 2014, it attacked forests of Acer trialatum in the Yunnan province of China, creating concern in China's Forestry Bureau. We used the CLIMEX model to predict and compare the potential distribution of E. fornicatus in China under current (1981-2010) and projected climate conditions (2011-2040) using one scenario (RCP8.5) and one global climate model (GCM), CSIRO-Mk3-6-0. Under both current and future climate conditions, the model predicted E. fornicatus to be mainly distributed in the south of China. Comparing distributions under both climate conditions showed that the area of potential distribution was projected to increase (mainly because of an increase in favourable habitat) and shift to the north. Our results help clarify the potential effect of climate change on the range of this forest pest and provide a reference and guide to facilitate its control in China.
Directory of Open Access Journals (Sweden)
Yanlong Chen
2018-01-01
Full Text Available In this research, the particle size distribution and permeability of saturated crushed sandstone under variable axial stresses (0, 2, 4, 8, 12, and 16 MPa) were studied. X-ray computed tomography results revealed that considerable particle crushing is likely to occur as the axial stress approaches 4 MPa, which greatly changes the pore structure. During compression, the particle size distribution satisfies the fractal condition well, and the fractal dimension of the particle size distribution is an effective means of describing the particle crushing state of saturated crushed sandstone. When the axial stress increases from 0 MPa to 4 MPa, the fractal dimension of the particle size distribution increases rapidly by over 60% of the total increase (0–16 MPa), and the permeability decreases sharply by about 85% of the total decrease. These results indicate that 4 MPa is a key value in controlling the particle size distribution and the permeability of saturated crushed sandstone under axial compression. The permeability is influenced by the initial gradation of the specimens, and a larger Talbot exponent corresponds to a larger permeability.
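A fractal particle size distribution is commonly characterized through the mass-based relation M(<d)/M_T = (d/d_max)^(3−D), so D can be estimated from the slope of a log-log regression on sieve data. A sketch on synthetic sieve data with a known dimension (the mass-based formulation is a standard choice, assumed here; the abstract does not state which estimator the study used):

```python
import math

def fractal_dimension(sieve_sizes, mass_finer_frac):
    """Estimate fractal dimension D from M(<d)/M_T = (d/d_max)^(3-D).

    In log-log coordinates the relation is linear with slope 3 - D,
    so D = 3 - (least-squares slope of log fraction vs log size).
    """
    xs = [math.log(d) for d in sieve_sizes]
    ys = [math.log(f) for f in mass_finer_frac]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    return 3.0 - slope

# Synthetic sieve data generated from an exact fractal PSD with D = 2.5
d_max = 16.0
sizes = [0.5, 1.0, 2.0, 4.0, 8.0, 16.0]          # sieve openings, mm
fracs = [(d / d_max) ** 0.5 for d in sizes]      # slope 0.5 -> D = 2.5
D = fractal_dimension(sizes, fracs)
```

On real sieve data the log-log points scatter around the line, and the fitted D increasing with axial stress is exactly the trend the abstract reports.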
Chen, Youhua
2008-09-01
Changes to the Earth's climate may affect the distribution of countless species. Understanding the potential distribution of known invasive species under an altered climate is vital to predicting impacts and developing management policy. The present study employs ecological niche modeling to construct the global potential distribution range of the yellow crazy ant (Anoplolepis gracilipes) using past, current and future climate scenarios. Three modeling algorithms, GARP, BioClim and Environmental Distance, were used in a comparative analysis. Output from the models suggests, firstly, that this insect originated from south Asia, expanded into Europe and then into Afrotropical regions, after which it formed its current distribution. Secondly, the invasive risk of A. gracilipes under future climatic change scenarios will become greater because of an extension of suitable environmental conditions into higher latitudes. Thirdly, when compared to the GARP model, the BioClim and Environmental Distance models were better at modeling a species' ancestral distribution. These findings are discussed in light of the predictive accuracy of these models. © 2008 ISZS, Blackwell Publishing and IOZ/CAS.
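Of the three algorithms, BioClim is the simplest to sketch: a site is classified as climatically suitable when every climate variable falls inside a percentile envelope of the values observed at occurrence records. A minimal sketch with invented toy occurrence data (two climate variables; the 5th–95th percentile bounds are a common convention, not necessarily the study's settings):

```python
def bioclim_envelope(occurrences, lo=0.05, hi=0.95):
    """Per-variable percentile envelope (lo..hi) over occurrence records."""
    env = []
    for j in range(len(occurrences[0])):
        vals = sorted(rec[j] for rec in occurrences)
        n = len(vals)
        env.append((vals[int(lo * (n - 1))], vals[int(hi * (n - 1))]))
    return env

def suitable(site, env):
    """A site is suitable iff every climate variable lies inside the envelope."""
    return all(a <= v <= b for v, (a, b) in zip(site, env))

# Toy occurrence records: (annual mean temperature degC, annual precipitation mm)
occ = [(24.0 + 0.1 * i, 1500.0 + 20.0 * i) for i in range(41)]
env = bioclim_envelope(occ)
```

Projecting under a future climate then amounts to re-evaluating `suitable` on the shifted climate grids, which is how the higher-latitude range extension in the abstract arises.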
International Nuclear Information System (INIS)
Bai, D.S.; Chun, Y.R.; Kim, J.G.
1995-01-01
This paper considers the design of life-test sampling plans based on failure-censored accelerated life tests. The lifetime distribution of products is assumed to be Weibull with a scale parameter that is a log-linear function of a (possibly transformed) stress. Two levels of stress higher than the use-condition stress, high and low, are used. Sampling plans with equal expected test times at the high and low test stresses, which satisfy the producer's and consumer's risk requirements and minimize the asymptotic variance of the test statistic used to decide lot acceptability, are obtained. The properties of the proposed life-test sampling plans are investigated.
Bajar, Somvir; Singh, Anita; Kaushik, C P; Kaushik, Anubha
2017-05-01
Biocovers are considered the most effective and efficient way to treat methane (CH₄) emissions from dumpsites and landfills. Active methanotrophs in the biocovers play a crucial role in the reduction of emissions through microbiological methane oxidation. Several factors affecting methane bio-oxidation (MOX) have been well documented; however, their interactive effect on the oxidation process needs to be explored. Therefore, the present study was undertaken to investigate the suitability of a dumpsite soil to be employed as a biocover, under the influence of substrate concentrations (CH₄ and O₂) and temperature at variable incubation periods. A statistical design matrix from Response Surface Methodology (RSM) revealed that a MOX rate of up to 69.58 μg CH₄ g⁻¹ dw h⁻¹ could be achieved under optimum conditions. MOX was found to be more dependent on CH₄ concentration at higher levels (30-40%, v/v) than on O₂ concentration. However, unlike in other studies, MOX was found to be directly proportional to temperature within the range of 25-35°C. The results obtained with the dumpsite soil biocover open up a new possibility of providing improved, sustained and environmentally friendly systems to control even high CH₄ emissions from the waste sector. Copyright © 2017 Elsevier Ltd. All rights reserved.
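RSM works by fitting a second-order polynomial to the designed experiments and locating its stationary point. A one-factor sketch of that machinery on synthetic data (the quadratic and its optimum at 30 °C are invented for illustration; the study's actual surface is multi-factor and its coefficients are not given in the abstract):

```python
def solve3(A, b):
    """Solve a 3x3 linear system by Gauss-Jordan elimination with pivoting."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(3):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * c for a, c in zip(M[r], M[col])]
    return [M[i][3] / M[i][i] for i in range(3)]

def fit_quadratic(xs, ys):
    """Least-squares fit of y = b0 + b1*x + b2*x^2 via the normal equations."""
    S = lambda p: sum(x ** p for x in xs)
    Sy = lambda p: sum((x ** p) * y for x, y in zip(xs, ys))
    A = [[S(0), S(1), S(2)], [S(1), S(2), S(3)], [S(2), S(3), S(4)]]
    return solve3(A, [Sy(0), Sy(1), Sy(2)])

# Hypothetical MOX-rate-vs-temperature data lying on y = -0.1*(x - 30)^2 + 70
xs = [20.0, 25.0, 30.0, 35.0, 40.0]
ys = [-0.1 * (x - 30.0) ** 2 + 70.0 for x in xs]
b0, b1, b2 = fit_quadratic(xs, ys)
x_opt = -b1 / (2.0 * b2)   # stationary point of the fitted response surface
```

With several factors the same normal-equations fit includes cross terms (e.g. CH₄ × O₂), which is where RSM exposes the interactive effects the abstract mentions.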
Potential distribution of pine wilt disease under future climate change scenarios.
Directory of Open Access Journals (Sweden)
Akiko Hirata
Full Text Available Pine wilt disease (PWD) constitutes a serious threat to pine forests. Since development depends on temperature and drought, there is a concern that future climate change could lead to the spread of PWD infections. We evaluated the risk of PWD in 21 susceptible Pinus species on a global scale. The MB index, which represents the sum of the difference between the mean monthly temperature and 15 when the mean monthly temperature exceeds 15°C, was used to determine current and future regions vulnerable to PWD (MB ≥ 22). For future climate conditions, we compared the difference in PWD risks among four different representative concentration pathways (RCPs 2.6, 4.5, 6.0, and 8.5) and two time periods (2050s and 2070s). We also evaluated the impact of climate change on habitat suitability for each Pinus species using species distribution models. The findings were then integrated and the potential risk of PWD spread under climate change was discussed. Within the natural Pinus distribution area, southern parts of North America, Europe, and Asia were categorized as vulnerable regions (MB ≥ 22; 16% of the total Pinus distribution area). Representative provinces in which PWD has been reported at least once overlapped with the vulnerable regions. All RCP scenarios showed expansion of vulnerable regions in northern parts of Europe, Asia, and North America under future climate conditions. By the 2070s, under RCP 8.5, an estimated increase in the area of vulnerable regions to approximately 50% of the total Pinus distribution area was revealed. In addition, the habitat conditions of a large portion of the Pinus distribution areas in Europe and Asia were deemed unsuitable by the 2070s under RCP 8.5. Approximately 40% of these regions overlapped with regions deemed vulnerable to PWD, suggesting that Pinus forests in these areas are at risk of serious damage due to habitat shifts and spread of PWD.
Potential distribution of pine wilt disease under future climate change scenarios.
Hirata, Akiko; Nakamura, Katsunori; Nakao, Katsuhiro; Kominami, Yuji; Tanaka, Nobuyuki; Ohashi, Haruka; Takano, Kohei Takenaka; Takeuchi, Wataru; Matsui, Tetsuya
2017-01-01
Pine wilt disease (PWD) constitutes a serious threat to pine forests. Since development depends on temperature and drought, there is a concern that future climate change could lead to the spread of PWD infections. We evaluated the risk of PWD in 21 susceptible Pinus species on a global scale. The MB index, which represents the sum of the difference between the mean monthly temperature and 15 when the mean monthly temperature exceeds 15°C, was used to determine current and future regions vulnerable to PWD (MB ≥ 22). For future climate conditions, we compared the difference in PWD risks among four different representative concentration pathways (RCPs 2.6, 4.5, 6.0, and 8.5) and two time periods (2050s and 2070s). We also evaluated the impact of climate change on habitat suitability for each Pinus species using species distribution models. The findings were then integrated and the potential risk of PWD spread under climate change was discussed. Within the natural Pinus distribution area, southern parts of North America, Europe, and Asia were categorized as vulnerable regions (MB ≥ 22; 16% of the total Pinus distribution area). Representative provinces in which PWD has been reported at least once overlapped with the vulnerable regions. All RCP scenarios showed expansion of vulnerable regions in northern parts of Europe, Asia, and North America under future climate conditions. By the 2070s, under RCP 8.5, an estimated increase in the area of vulnerable regions to approximately 50% of the total Pinus distribution area was revealed. In addition, the habitat conditions of a large portion of the Pinus distribution areas in Europe and Asia were deemed unsuitable by the 2070s under RCP 8.5. Approximately 40% of these regions overlapped with regions deemed vulnerable to PWD, suggesting that Pinus forests in these areas are at risk of serious damage due to habitat shifts and spread of PWD.
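The MB index as defined above is straightforward to compute from monthly climate normals. A sketch (the example monthly temperatures are invented; only the index definition and the MB ≥ 22 threshold come from the abstract):

```python
def mb_index(monthly_mean_temps):
    """MB index: sum of (mean monthly temperature - 15) over months whose
    mean temperature exceeds 15 degC."""
    return sum(t - 15.0 for t in monthly_mean_temps if t > 15.0)

def vulnerable_to_pwd(monthly_mean_temps, threshold=22.0):
    """A location is classed as vulnerable to pine wilt disease if MB >= 22."""
    return mb_index(monthly_mean_temps) >= threshold

# Hypothetical warm-temperate climate normals, Jan..Dec (degC)
temps = [5, 8, 12, 16, 20, 25, 27, 26, 22, 17, 10, 6]
mb = mb_index(temps)  # contributions: 1+5+10+12+11+7+2 = 48
```

Re-running the same two functions on projected monthly temperatures for each RCP and period is, in essence, how the expanding vulnerable regions are mapped.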
Davis, Joe M; Arriaga, Edgar A
2009-08-28
Organelles commonly are separated by capillary electrophoresis (CE) with laser-induced-fluorescence detection. Usually, it is assumed that peaks observed in the CE originate from single organelles, with negligible occurrence of peak overlap. Under this assumption, migration-time and mobility distributions are obtained by partitioning the CE into different regions and counting the number of observed peaks in each region. In this paper, criteria based on statistical-overlap theory (SOT) are developed to test the assumption of negligible peak overlap and to predict conditions for its validity. For regions of the CE having constant peak density, the numbers of peaks (i.e., intensity profiles of single organelles) and observed peaks (i.e., maxima) are modeled by probability distributions. For minor peak overlap, the distributions partially merge, and their mergence is described by an analogy to the Type-II error of hypothesis testing. Criteria are developed for the amount of peak overlap, at which the number of observed peaks has an 85% or 90% probability of lying within the 95% confidence interval of the number of peaks of single organelles. For this or smaller amounts of peak overlap, the number of observed peaks is a good approximation to the number of peaks. A simple procedure is developed for evaluating peak overlap, requiring determination of only the peak standard deviation, the duration of the region occupied by peaks, and the number of observed peaks in the region. The procedure can be applied independently to each region of the partitioned CE. The procedure is applied to a mitochondrial CE.
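The premise being tested — that peak overlap makes the count of observed maxima fall below the true number of single-organelle peaks — can be illustrated with a small simulation (the window width, peak width and peak count are invented; this is a sketch of the phenomenon, not of the paper's SOT criteria themselves):

```python
import math
import random

def count_observed_peaks(centers, sigma, window, npts=4000):
    """Sum unit-height Gaussian peaks and count local maxima of the composite
    signal, i.e. the 'observed peaks' an analyst would see."""
    xs = [window * i / (npts - 1) for i in range(npts)]
    sig = [sum(math.exp(-0.5 * ((x - c) / sigma) ** 2) for c in centers)
           for x in xs]
    return sum(1 for i in range(1, npts - 1) if sig[i - 1] < sig[i] >= sig[i + 1])

random.seed(1)
m, sigma, window = 40, 0.5, 100.0
# Poisson-like placement: peak centers uniform over the migration window
centers = [random.uniform(0.0, window) for _ in range(m)]
p = count_observed_peaks(centers, sigma, window)
```

Because closely spaced peaks merge into single maxima, p ≤ m; the paper's criteria quantify how small the discrepancy must be for the observed count to be a trustworthy estimate of m.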
Blakemore, J S
1962-01-01
Semiconductor Statistics presents statistics aimed at complementing existing books on the relationships between carrier densities and transport effects. The book is divided into two parts. Part I provides introductory material on the electron theory of solids, and then discusses carrier statistics for semiconductors in thermal equilibrium. Of course a solid cannot be in true thermodynamic equilibrium if any electrical current is passed; but when currents are reasonably small the distribution function is but little perturbed, and the carrier distribution for such a "quasi-equilibrium" co…
Secure Distributed Detection under Energy Constraint in IoT-Oriented Sensor Networks.
Zhang, Guomei; Sun, Hao
2016-12-16
We study the secure distributed detection problem under energy constraints for IoT-oriented sensor networks. Conventional channel-aware encryption (CAE) is an efficient physical-layer secure distributed detection scheme in light of its energy efficiency, good scalability and robustness over diverse eavesdropping scenarios. However, in the CAE scheme, how to optimize the key thresholds for the estimated channel gain, which are used to determine the sensor's reporting action, remains an open problem. Moreover, the CAE scheme does not jointly consider the accuracy of local detection results in determining whether a sensor should stay dormant. To solve these problems, we first analyze the error probability and derive the optimal thresholds in the CAE scheme under a specified energy constraint. These results build a convenient mathematical framework for our further innovative design. Under this framework, we propose a hybrid secure distributed detection scheme. Our proposal can satisfy the energy constraint by keeping some sensors inactive according to the local detection confidence level, which is characterized by the likelihood ratio. Meanwhile, security is guaranteed by randomly flipping the local decisions forwarded to the fusion center based on the channel amplitude. We further optimize the key parameters of our hybrid scheme, including two local decision thresholds and one channel comparison threshold. Performance evaluation results demonstrate that our hybrid scheme outperforms the CAE under stringent energy constraints, especially in the high signal-to-noise ratio scenario, while security is still assured.
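The per-sensor reporting rule in a channel-aware scheme reduces to threshold tests on the estimated channel gain. A schematic sketch with hypothetical threshold values (the whole point of the paper is optimizing these thresholds; the values below are placeholders, and the rule shown is a simplified rendering of channel-aware reporting, not the paper's exact hybrid scheme):

```python
def cae_report(local_decision, channel_gain, t_sleep, t_flip):
    """Channel-aware reporting sketch for one sensor.

    - gain < t_sleep          : stay dormant (saves energy, sends nothing)
    - t_sleep <= gain < t_flip: send the FLIPPED decision (an eavesdropper
                                without channel knowledge cannot undo this)
    - gain >= t_flip          : send the decision as-is
    The fusion center, knowing the channel, inverts the flips before fusing.
    """
    if channel_gain < t_sleep:
        return None
    if channel_gain < t_flip:
        return 1 - local_decision
    return local_decision

# Hypothetical thresholds: sleep below gain 0.2, flip between 0.2 and 0.5
reports = [cae_report(1, g, 0.2, 0.5) for g in (0.1, 0.3, 0.9)]
```

The hybrid scheme adds a likelihood-ratio test on top of this, so that sensors with low-confidence local decisions are the ones chosen to stay dormant.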
International Nuclear Information System (INIS)
Das, Rabindra Nath; Kim, Jinseog; Park, Jeong-Soo
2015-01-01
In quality engineering, the most commonly used lifetime distributions are log-normal, exponential, gamma and Weibull. Experimental designs are useful for predicting the optimal operating conditions of the process in lifetime improvement experiments. In the present article, invariant robust first-order D-optimal designs are derived for correlated lifetime responses having the above four distributions. Robust designs are developed for some correlated error structures. It is shown that robust first-order D-optimal designs for these lifetime distributions are always robust rotatable but the converse is not true. Moreover, it is observed that these designs depend on the respective error covariance structure but are invariant to the above four lifetime distributions. This article generalizes the results of Das and Lin [7] for the above four lifetime distributions with general (intra-class, inter-class, compound symmetry, and tri-diagonal) correlated error structures. - Highlights: • This paper presents invariant robust first-order D-optimal designs under correlated lifetime responses. • The results of Das and Lin [7] are extended for the four lifetime (log-normal, exponential, gamma and Weibull) distributions. • This paper also generalizes the results of Das and Lin [7] to more general correlated error structures
Effect of Overhead Ground Wire Installing under Distribution Lines on Surge Arrester Failures
Sugimoto, Hitoshi
Distribution surge arresters are often damaged by lightning strokes, in particular winter lightning. An overhead ground wire (OGW) is one of the effective measures against surge arrester failures. However, adding a conventional OGW to existing overhead power distribution lines requires a power interruption for construction as well as high costs, because it is installed above the phase conductors. Experimental results show that a covered conductor for distribution lines attracts lightning less readily than a bare conductor. Moreover, lightning strokes to distribution pole heads accounted for over 90% of all lightning strokes observed on actual distribution lines without a conventional OGW, and lightning strokes to power lines were hardly observed. These results indicate that the pole heads shield the power lines from direct lightning strokes. Therefore the author studies the application of an OGW under the distribution lines (UGW) for reducing surge arrester failures. The lightning performance of the UGW is estimated by the Electro-Magnetic Transients Program (EMTP) and its effectiveness is demonstrated. The measure is expected to cut the costs of construction and maintenance for lightning protection.
Directory of Open Access Journals (Sweden)
M. Tabei
2016-02-01
Full Text Available Introduction: Limited available water on the one hand and the increasing needs of the world's population on the other have driven an increase in crop cultivation. For this reason, employing new irrigation methods and using new water resources, such as unconventional water (saline water, drainage water), are two main strategies for coping with water shortage conditions. At the same time, the accumulation of salts on the soil surface in dry regions with low rainfall and high evaporation is an unavoidable problem. As experiments to determine the pattern of moisture distribution demand a lot of time, and field experiments are costly, simulation models are suitable alternatives for addressing questions of water movement and salinity distribution. Materials and Methods: In this research, soil salinity under drip irrigation was simulated with the SWAP model, and the capability of the model was evaluated by comparison with the relevant measured results. The SWAP model was run with data measured in a corn field equipped with a drip irrigation system in the farming year 1391-92 at research field number one of the Faculty of Water Science Engineering, Shahid Chamran University of Ahvaz, with the hydraulic parameters of the soil obtained from RETC. The experiment was a completely randomized design with four irrigation water salinity treatments: S1 (Karoon River water with salinity 3 dS/m) as the control, S2 (S1 + 0.5), S3 (S1 + 1) and S4 (S1 + 1.5 dS/m), in 3 replications, sampled at distances of 10 and 20 cm from the emitters, at depths of 0-90 cm (every 30 cm from the soil surface), and at 30, 60 and 90 days after cultivation. Planting was done by hand in plots of four 3-m rows spaced 75 cm apart, with a density of 80 bushes per hectare. The drip irrigation system was of the tape type with 20 cm emitter spacing. Results and Discussion
Stress distribution in 45° Y-type fitting under internal pressure
International Nuclear Information System (INIS)
Aono, Muneshige; Itoh, Kenji; Kikuchi, Masatoshi; Iezawa, Tohru.
1983-01-01
The stress distribution in a 45° Y-type fitting under internal pressure was obtained by the strain gauge method and FEM. The results for the stress distribution on the outer surface obtained by both methods are in good agreement. From the FEM analysis, the stress concentration zones of this fitting are situated on the outer surface of the side of the fitting where the mother and branch pipes cross and on the inner surface of both crotches of the fitting, where large tensile stresses are generated. However, the maximum principal stress occurring under the designed internal pressure is approximately 2.9 kgf/mm², while the allowable stress of this material (SUS 304) is 12.8 kgf/mm², and therefore the safety factor of this fitting is found to be above 4. (author)
Yan, Yonglian; Takáč, Tomáš; Li, Xiaoquan; Chen, Houbin; Wang, Yingying; Xu, Enfeng; Xie, Ling; Su, Zhaohua; Šamaj, Jozef; Xu, Chunxiang
2015-01-01
Information on the spatial distribution of arabinogalactan proteins (AGPs) in plant organs and tissues during plant reactions to low temperature (LT) is limited. In this study, the extracellular distribution of AGPs in banana leaves and roots, and their changes under LT stress were investigated in two genotypes differing in chilling tolerance, by immuno-techniques using 17 monoclonal antibodies against different AGP epitopes. Changes in total classical AGPs in banana leaves were also tested. The results showed that AGP epitopes recognized by JIM4, JIM14, JIM16, and CCRC-M32 antibodies were primarily distributed in leaf veins, while those recognized by JIM8, JIM13, JIM15, and PN16.4B4 antibodies exhibited predominant sclerenchymal localization. Epitopes recognized by LM2, LM14, and MAC207 antibodies were distributed in both epidermal and mesophyll cells. Both genotypes accumulated classical AGPs in leaves under LT treatment, and the chilling tolerant genotype contained higher classical AGPs at each temperature treatment. The abundance of JIM4 and JIM16 epitopes in the chilling-sensitive genotype decreased slightly after LT treatment, and this trend was opposite for the tolerant one. LT induced accumulation of LM2- and LM14-immunoreactive AGPs in the tolerant genotype compared to the sensitive one, especially in phloem and mesophyll cells. These epitopes thus might play important roles in banana LT tolerance. Different AGP components also showed differential distribution patterns in banana roots. In general, banana roots started to accumulate AGPs under LT treatment earlier than leaves. The levels of AGPs recognized by MAC207 and JIM13 antibodies in the control roots of the tolerant genotype were higher than in the chilling sensitive one. Furthermore, the chilling tolerant genotype showed high immuno-reactivity against JIM13 antibody. These results indicate that several AGPs are likely involved in banana tolerance to chilling injury.
Xiang, T X; Anderson, B D
1994-03-01
A mean-field statistical mechanical theory has been developed to describe molecular distributions in interphases. The excluded volume interaction has been modeled in terms of the reversible work required to create a cavity of the solute's size against a pressure tensor exerted by the surrounding interphase molecules. The free energy change associated with this compression process includes the configurational entropy as well as the change in conformational energy of the surrounding chain molecules. The lateral pressure profile in a model lipid bilayer (30.5 Å²/chain molecule) has been calculated as a function of depth in the bilayer interior by molecular dynamics simulation. The lateral pressure has a plateau value of 309 +/- 48 bar in the highly ordered region and decreases abruptly in the center of the bilayer. Model calculations have shown that for solute molecules with ellipsoidal symmetry, the orientational order increases with the ratio of the long to short molecular axes at a given solute volume and increases with solute volume at a given axial ratio, in accordance with recent experimental data. Increased lateral pressure (p⊥) results in higher local order and exclusion of solute from the interphase, in parallel with the effect of surface density on partitioning and local order. The logarithm of the interphase/water partition coefficient for spherical solutes decreases linearly with solute volume. This is also an excellent approximation for elongated solutes because of the relatively weak dependence of solute partitioning on molecular shape. The slope is equal to (2p⊥ − p∥)/3kBT, where p∥ is the normal pressure component, and differs from that predicted by the mean-field lattice theory. Finally, the lattice theory has been extended herein to incorporate an additional constraint on chain packing in the interphase and to account for the effect of solute size on partitioning.
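In the abstract's notation, the stated linear dependence of the log partition coefficient on solute volume can be restated (with the sign made explicit for a decreasing trend; the intercept C is unspecified in the abstract):

```latex
\ln K_{\mathrm{interphase/water}} \;\approx\; C \;-\; \frac{\bigl(2p_{\perp} - p_{\parallel}\bigr)\,V}{3\,k_{B}T}
```

where $V$ is the solute volume, $p_{\perp}$ the lateral pressure, $p_{\parallel}$ the normal pressure component, and $k_{B}T$ the thermal energy, so the magnitude of the slope $(2p_{\perp} - p_{\parallel})/3k_{B}T$ grows with the lateral pressure anisotropy of the interphase.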
Keita, Souleymane; Zhonghua, Tang
2017-10-01
Sustainable management of groundwater resources is a major issue for developing countries, especially Mali. The multiple uses of groundwater have led countries to promote sound management policies for sustainable use of groundwater resources. For this reason, each country needs data enabling it to monitor and predict changes in the resource. Also, given the importance of groundwater quality changes, often marked by recurrent droughts, the potential impacts of the regional and geological setting on groundwater resources require careful study. Unfortunately, recent decades have seen a considerable reduction in national capacities to ensure hydrogeological monitoring and the production of quality data for decision making. The purpose of this work is to use groundwater data and translate it into useful information that can improve water resources management capacity in Mali. In this paper, we used groundwater analytical data from accredited laboratories in Mali to carry out a national-scale assessment of the groundwater types and their distribution. We adapted multivariate statistical methods to classify 2035 groundwater samples into seven main groundwater types and built a national-scale map from the results. We used a two-level K-means clustering technique to examine the hydro-geochemical records as percentages of the total concentrations of major ions, namely sodium (Na), magnesium (Mg), calcium (Ca), chloride (Cl), bicarbonate (HCO3), and sulphate (SO4). The first step of clustering formed 20 groups, and these groups were then re-clustered to produce the final seven groundwater types. The results were verified and confirmed using Principal Component Analysis (PCA) and RockWare (Aq.QA) software. We found that HCO3 was the most dominant anion throughout the country and that Cl and SO4 were only important in some local zones. The dominant cations were Na and Mg. Major ion ratios also changed with geographical location and geological, and climatic
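The two-level procedure described above — cluster the samples into 20 groups, then re-cluster those group centroids into the final 7 water types — can be sketched with a plain k-means (a pure-Python Lloyd's-algorithm sketch on random stand-in data; the study's actual implementation, ion normalization and sample values are not given in the abstract):

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Minimal Lloyd's k-means on tuples; returns the k centroids."""
    rng = random.Random(seed)
    cents = rng.sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            # assign each point to its nearest centroid (squared Euclidean)
            j = min(range(k),
                    key=lambda i: sum((a - b) ** 2 for a, b in zip(p, cents[i])))
            groups[j].append(p)
        for i, g in enumerate(groups):
            if g:
                cents[i] = tuple(sum(c) / len(g) for c in zip(*g))
    return cents

# Stand-in samples: 6 features per sample (percentages of Na, Mg, Ca, Cl, HCO3, SO4)
random.seed(42)
samples = [tuple(random.random() for _ in range(6)) for _ in range(200)]
level1 = kmeans(samples, 20)        # first level: 20 intermediate groups
water_types = kmeans(level1, 7)     # second level: re-cluster centroids into 7 types
```

The two-level pass is a pragmatic choice: the first level absorbs sample-level noise, so the second level partitions stable group centroids rather than raw analyses.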
Limiting behavior of delayed sums under a non-identically distribution setup
Chen Pingyan
2008-01-01
We present an accurate description of the limiting behavior of delayed sums under a non-identically distributed setup, and deduce Chover-type laws of the iterated logarithm for them. These results complement and extend those of Vasudeva and Divanji (Theory of Probability and its Applications, 37 (1992), 534-542).
Nagao, Kan; Kawano, Fumiaki; Ichikawa, Tetsuo
2004-12-01
When making complete dentures, we must consider not only denture stability but also the restoration of aesthetics and of functions such as mastication and speech. However, these requirements theoretically conflict with denture stability, and it is very difficult to satisfy both in the case of a patient who has poor upper and lower alveolar ridges. We investigated the effect of artificial posterior tooth form and occlusal scheme on the distribution of pressure on supporting structures under complete dentures during mastication, using upper and lower edentulous simulators. In this report, a guideline for the selection of the occlusal scheme for complete dentures, based on our previous investigations, is described. The occlusal scheme remarkably affected the distribution of pressure under simulated complete dentures, as shown by comparing the distribution of pressure under two different occlusal schemes: fully balanced occlusion and lingualized occlusion. However, other factors such as posterior tooth form and position affect the distribution of pressure as well, and are related to each other. Therefore, not only the occlusal scheme but also the posterior artificial tooth form has to be considered, and the form of the posterior teeth should be decided carefully and comprehensively when making complete dentures.
Ren, Zhoupeng; Wang, Duoquan; Ma, Aimin; Hwang, Jimee; Bennett, Adam; Sturrock, Hugh J. W.; Fan, Junfu; Zhang, Wenjie; Yang, Dian; Feng, Xinyu; Xia, Zhigui; Zhou, Xiao-Nong; Wang, Jinfeng
2016-02-01
Projecting the distribution of malaria vectors under climate change is essential for planning integrated vector control activities for sustaining elimination and preventing reintroduction of malaria. In China, however, little knowledge exists on the possible effects of climate change on malaria vectors. Here we assess the potential impact of climate change on four dominant malaria vectors (An. dirus, An. minimus, An. lesteri and An. sinensis) using species distribution models for two future decades: the 2030s and the 2050s. Simulation-based estimates suggest that the environmentally suitable area (ESA) for An. dirus and An. minimus would increase by an average of 49% and 16%, respectively, under all three scenarios for the 2030s, but decrease by 11% and 16%, respectively, in the 2050s. By contrast, an increase of 36% and 11%, respectively, in the ESA of An. lesteri and An. sinensis was estimated under the medium stabilizing (RCP4.5) and very heavy (RCP8.5) emission scenarios in the 2050s. In total, we predict a substantial net increase in the population exposed to the four dominant malaria vectors in the 2030s and 2050s, considering land use changes and urbanization simultaneously. Strategies to achieve and sustain malaria elimination in China will need to account for these potential changes in vector distributions and receptivity.
Rizvi, Mohd Suhail; Pal, Anupam
2014-09-01
Fibrous matrices are widely used as scaffolds for the regeneration of load-bearing tissues due to their structural and mechanical similarities with the fibrous components of the extracellular matrix. These scaffolds not only provide the appropriate microenvironment for the residing cells but also act as a medium for the transmission of mechanical stimuli, essential for tissue regeneration, from the macroscopic scale of the scaffold to the microscopic scale of the cells. Because tissue regeneration requires mechanical loading, fibrous scaffolds must be able to sustain complex three-dimensional loading conditions. To gain insight into the mechanical behavior of fibrous matrices under large amounts of elongation as well as shear, a statistical model has been formulated to study the macroscopic mechanical behavior of the electrospun fibrous matrix and the transmission of mechanical stimuli from scaffold to cells via the constituting fibers. The study establishes the load-deformation relationships of fibrous matrices for different structural parameters. It also quantifies the changes in fiber arrangement and in the tension generated in the fibers as the matrix deforms. The model reveals that the tension generated in the fibers on matrix deformation is not homogeneous, and hence cells located in different regions of a fibrous scaffold might experience different mechanical stimuli. The mechanical response of fibrous matrices was also found to depend on the aspect ratio of the matrix. The model therefore establishes a structure-mechanics interdependence of fibrous matrices under large deformation, which can be utilized in identifying the appropriate structure and external mechanical loading conditions for the regeneration of load-bearing tissues. Copyright © 2014 Elsevier Ltd. All rights reserved.
PREDICTION OF CHANGES IN VEGETATION DISTRIBUTION UNDER CLIMATE CHANGE SCENARIOS USING MODIS DATASET
Directory of Open Access Journals (Sweden)
H. Hirayama
2016-06-01
Full Text Available The distribution of vegetation is expected to change under the influence of climate change. This study utilizes vegetation maps derived from Terra/MODIS data to generate a model of the current climate conditions suitable for beech-dominated deciduous forests, which are the typical vegetation of Japan's cool temperate zone. This model is then coordinated with future climate change scenarios to predict the future distribution of beech forests. The model was developed using the presence or absence of beech forest as the dependent variable. Four climatic variables: mean minimum daily temperature of the coldest month (TMC), warmth index (WI), winter precipitation (PRW) and summer precipitation (PRS); and five geophysical variables: topography (TOPO), surface geology (GEOL), soil (SOIL), slope aspect (ASP) and inclination (INCL) were adopted as independent variables. Previous vegetation distribution studies used point data derived from field surveys. The remote sensing data utilized in this study, however, should permit collection of greater amounts of data, and also frequent updating of data and distribution maps. These results will hopefully show that use of remote sensing data can provide new insights into our understanding of how vegetation distribution will be influenced by climate change.
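A presence/absence habitat model of this kind can be sketched with an ordinary logistic regression (a simplification of the study's approach). Everything below is synthetic: the predictor ranges, the "beech favours cold winters" rule, and the coefficients are illustrative assumptions, not values from the study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500

# Synthetic stand-ins for the four climatic predictors named above:
# TMC (mean minimum temperature of the coldest month), WI (warmth index),
# PRW (winter precipitation), PRS (summer precipitation).
TMC = rng.normal(-5, 3, n)
WI = rng.normal(60, 15, n)
PRW = rng.normal(300, 80, n)
PRS = rng.normal(900, 200, n)
X = np.column_stack([TMC, WI, PRW, PRS])

# Hypothetical rule: beech presence favours cold winters and a high warmth index.
logit = -0.4 * (TMC + 5) + 0.05 * (WI - 60) + rng.normal(0, 1, n)
presence = (logit > 0).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, presence)
suitability = model.predict_proba(X)[:, 1]  # habitat suitability in [0, 1]
```

A projected distribution map would come from applying `predict_proba` to gridded future climate layers for the same predictors.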
Narukawa, Takafumi; Yamaguchi, Akira; Jang, Sunghyon; Amaya, Masaki
2018-02-01
To estimate the fracture probability of fuel cladding tubes under loss-of-coolant accident conditions in light-water reactors, laboratory-scale integral thermal shock tests were conducted on non-irradiated Zircaloy-4 cladding tube specimens. The resulting binary data on fracture or non-fracture of the cladding tube specimens were then analyzed statistically. A method to obtain the fracture probability curve as a function of equivalent cladding reacted (ECR) was proposed using Bayesian inference for generalized linear models: probit, logit, and log-probit. Model selection was then performed in terms of physical characteristics and information criteria, namely the widely applicable information criterion and the widely applicable Bayesian information criterion. As a result, the log-probit model was found to be the best of the three at estimating the fracture probability, in terms of prediction accuracy both for data yet to be obtained and with respect to the true model. Using the log-probit model, it was shown that 20% ECR corresponded to a 5% fracture probability level, with 95% confidence, for the cladding tube specimens.
Directory of Open Access Journals (Sweden)
Jiazheng Lu
2018-03-01
Full Text Available Composite insulators are widely used in modern power systems to provide electrical insulation and mechanical support for transmission lines and substations. However, the insulation strength decreases greatly under the combined conditions of ice-covering and contamination, and icing flashovers may take place under these severe conditions. In this paper, AC flashover tests of different artificially ice-covered 220 kV composite insulators were carried out in a multi-function artificial climate chamber under energized ice accumulation conditions. The test results indicate that, as ice thickness increases, the flashover voltage decreases and tends to saturate. The icing flashover voltage can be increased by adding booster sheds, but excessive booster sheds can lead to lower flashover voltages under heavy icing conditions. The voltage distributions of the iced insulators were measured experimentally. The results show that the air gaps withstand most of the applied voltage. The zinc oxide (ZnO) resistors contained in the insulators can influence the voltage distributions of the iced insulators, but have little effect on the icing flashover voltages. The work done in this paper can provide a reference for the design and type selection of outdoor composite insulators in cold climate regions.
Studies on the temperature distribution of steel plates with different paints under solar radiation
International Nuclear Information System (INIS)
Liu, Hongbo; Chen, Zhihua; Chen, Binbin; Xiao, Xiao; Wang, Xiaodun
2014-01-01
Thermal effects on steel structures exposed to solar radiation are significant and complicated. Furthermore, the solar radiation absorption coefficient of a steel surface with different paints is the main factor affecting the non-uniform temperature of spatial structures under solar radiation. In this paper, nearly two hundred steel specimens with different paints were designed and measured to obtain their solar radiation absorption coefficients using a spectrophotometer. Based on the test results, the effects of surface color, paint type, and paint thickness on the solar radiation absorption coefficient were analyzed. The actual temperatures under solar radiation for all specimens were also measured in summer, not only to verify the absorption coefficients but also to provide insight into the temperature distribution of steel structures with different paints. A numerical simulation and a simplified formula were also developed and verified by test, in order to study the temperature distribution of steel plates with different paints under solar radiation. The results provide an important reference for future research on the thermal effects of steel structures exposed to solar radiation. - Highlights: • Solar radiation absorptions for steel with different paints were measured. • The temperatures of all specimens under solar radiation were measured. • The effect of color, thickness and paint type on solar absorption was analyzed. • A numerical analysis was conducted and verified by test data. • A simplified formula was deduced and verified by test data
Understanding the Sampling Distribution and the Central Limit Theorem.
Lewis, Charla P.
The sampling distribution is a common source of misuse and misunderstanding in the study of statistics. The sampling distribution, underlying distribution, and the Central Limit Theorem are all interconnected in defining and explaining the proper use of the sampling distribution of various statistics. The sampling distribution of a statistic is…
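The interconnection can be made concrete with a small simulation: draw many samples from a skewed underlying distribution and observe that the sampling distribution of the mean centres on μ with spread σ/√n, as the Central Limit Theorem predicts. The choice of the exponential distribution and the sample size are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)

# Underlying distribution: exponential with mean 1 and variance 1 (heavily skewed).
n, replications = 30, 10_000
samples = rng.exponential(scale=1.0, size=(replications, n))

# Sampling distribution of the mean for samples of size n.
sample_means = samples.mean(axis=1)

# CLT: sample_means is roughly normal, centred on 1, with spread
# 1 / sqrt(30) ~ 0.183, even though the underlying distribution is far
# from normal.
```

Repeating this with larger n shows the spread shrinking as 1/√n, which is the point students most often misattribute to the underlying distribution itself.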
Directory of Open Access Journals (Sweden)
Yanlong Guo
2016-10-01
Full Text Available Climate change will significantly affect plant distribution as well as the quality of medicinal plants. Although numerous studies have analyzed the effect of climate change on future habitats of plants through species distribution models (SDMs), few of them have incorporated changes in the effective content of medicinal plants. Schisandra sphenanthera Rehd. et Wils. is an endangered traditional Chinese medicinal plant which is mainly located in the Qinling Mountains. Combining fuzzy theory and a maximum entropy model, we obtained the current spatial distribution of quality assessment for S. sphenanthera. Moreover, the future quality and distribution of S. sphenanthera were projected for the 2020s, 2050s and 2080s under three climate change scenarios (the SRES-A1B, SRES-A2 and SRES-B1 emission scenarios described in the Special Report on Emissions Scenarios (SRES) of the IPCC (Intergovernmental Panel on Climate Change)). The results showed that the moderately suitable habitat of S. sphenanthera under all climate change scenarios remained relatively stable in the study area. The highly suitable habitat of S. sphenanthera would gradually decrease in the future, with a higher decline rate under scenarios SRES-A1B and SRES-A2. The results suggest that in the study area there would be no more highly suitable habitat for S. sphenanthera where the annual mean temperature exceeds 20 °C or the annual precipitation exceeds 1,200 mm. Our results will be influential in the future ecological conservation and management of S. sphenanthera and can be taken as a reference for habitat suitability assessment research for other medicinal plants.
Study of Stand-Alone Microgrid under Condition of Faults on Distribution Line
Malla, S. G.; Bhende, C. N.
2014-10-01
The behavior of a stand-alone microgrid is analyzed under the condition of faults on distribution feeders. Since the battery is not able to maintain the dc-link voltage within limits during a fault, a resistive dump load control is presented to do so. An inverter control is proposed to maintain balanced voltages at the PCC under unbalanced load conditions and to reduce the voltage unbalance factor (VUF) at load points. The proposed inverter control also protects the inverter from high fault current. The existing maximum power point tracking (MPPT) algorithm is modified to limit the speed of the generator during a fault. Extensive simulation results using MATLAB/SIMULINK establish that the performance of the controllers is quite satisfactory under different fault conditions as well as unbalanced load conditions.
Energy Technology Data Exchange (ETDEWEB)
Tang, Yinjie; Martin, Hector Garcia; Deutschbauer, Adam; Feng, Xueyang; Huang, Rick; Llora, Xavier; Arkin, Adam; Keasling, Jay D.
2009-04-21
An environmentally important bacterium with versatile respiration, Shewanella oneidensis MR-1, displayed significantly different growth rates under three culture conditions: minimal medium (doubling time ≈ 3 hrs), salt-stressed minimal medium (doubling time ≈ 6 hrs), and minimal medium with amino acid supplementation (doubling time ≈ 1.5 hrs). ¹³C-based metabolic flux analysis indicated that fluxes of central metabolic reactions remained relatively constant under the three growth conditions, which is in stark contrast to the reported significant changes in the transcript and metabolite profiles under various growth conditions. Furthermore, ten transposon mutants of S. oneidensis MR-1 were randomly chosen from a transposon library and their flux distributions through central metabolic pathways were revealed to be identical, even though such mutational processes altered the secondary metabolism, for example, glycine and C1 (5,10-Me-THF) metabolism.
Under-Frequency Load Shedding Technique Considering Event-Based for an Islanded Distribution Network
Directory of Open Access Journals (Sweden)
Hasmaini Mohamad
2016-06-01
Full Text Available One of the biggest challenges for an islanding operation is to sustain frequency stability. A large power imbalance following islanding would cause under-frequency, hence an appropriate control is required to shed a certain amount of load. The main objective of this research is to develop an adaptive under-frequency load shedding (UFLS) technique for an islanded system. The technique is designed around events including the moment the system is islanded and the tripping of any DG unit during islanding operation. A disturbance magnitude is calculated to determine the amount of load to be shed. The technique is modeled using the PSCAD simulation tool. Simulation studies on a distribution network with mini hydro generation were carried out to evaluate the UFLS model under different load conditions: peak and base load. Results show that the load shedding technique successfully shed the required amount of load and stabilized the system frequency.
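One common way to compute a disturbance magnitude of the kind described above is from the initial rate of change of frequency via the swing equation, then shed priority-ordered load blocks until the deficit is covered. The inertia constant, measured ROCOF, and block sizes below are illustrative assumptions, not values from the paper.

```python
# Swing equation (per unit): df/dt = f_nominal * dP / (2 * H)
H = 3.0           # assumed aggregate inertia constant of the islanded system, s
f_nominal = 50.0  # Hz
rocof = -1.2      # measured df/dt just after islanding, Hz/s

power_imbalance_pu = 2.0 * H * rocof / f_nominal  # negative: generation deficit

# Shed priority-ordered load blocks until the deficit is covered.
load_blocks_pu = [0.02, 0.03, 0.05, 0.08]  # hypothetical feeder load blocks
deficit = -power_imbalance_pu
shed = []
for block in load_blocks_pu:
    if deficit <= 0:
        break
    shed.append(block)
    deficit -= block
```

An adaptive scheme like the paper's would recompute the disturbance magnitude per event (islanding instant, DG trip) rather than use fixed shedding stages.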
Holland, Bart K.
2006-01-01
A generally-educated individual should have some insight into how decisions are made in the very wide range of fields that employ statistical and probabilistic reasoning. Also, students of introductory probability and statistics are often best motivated by specific applications rather than by theory and mathematical development, because most…
Lunsford, M. Leigh; Rowell, Ginger Holmes; Goodson-Espy, Tracy
2006-01-01
We applied a classroom research model to investigate student understanding of sampling distributions of sample means and the Central Limit Theorem in post-calculus introductory probability and statistics courses. Using a quantitative assessment tool developed by previous researchers and a qualitative assessment tool developed by the authors, we…
DEFF Research Database (Denmark)
Risager, Morten S.; Rudnick, Zeev
We study a variant of a problem considered by Dinaburg and Sinai on the statistics of the minimal solution to a linear Diophantine equation. We show that the signed ratio between the Euclidean norms of the minimal solution and the coefficient vector is uniformly distributed modulo one. We reduce ...
Wang, Wei; Wen, Changyun; Huang, Jiangshuai; Fan, Huijin
2017-11-01
In this paper, a backstepping-based distributed adaptive control scheme is proposed for multiple uncertain Euler-Lagrange systems under a directed graph condition. The common desired trajectory is allowed to be totally unknown to some of the subsystems, and the linearly parameterized trajectory model assumed in currently available results is no longer needed. To compensate for the effects of unknown trajectory information, a smooth function of consensus errors and certain positive integrable functions are introduced in designing the virtual control inputs. Besides, to overcome the difficulty of completely counteracting the coupling terms of distributed consensus errors and parameter estimation errors in the presence of an asymmetric Laplacian matrix, extra transmission of local parameter estimates is introduced among linked subsystems, and an adaptive gain technique is adopted to generate the distributed torque inputs. It is shown that with the proposed distributed adaptive control scheme, global uniform boundedness of all the closed-loop signals and asymptotic output consensus tracking can be achieved. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
Distribution functions of a simple fluid under shear: Low shear rates
International Nuclear Information System (INIS)
Kalyuzhnyi, Y.V.; Cui, S.T.; Cummings, P.T.; Cochran, H.D.
1999-01-01
Anisotropic pair distribution functions for a simple, soft sphere fluid at moderate and high density under shear have been calculated by nonequilibrium molecular dynamics, by equilibrium molecular dynamics with a nonequilibrium potential, and by a nonequilibrium distribution function theory [H. H. Gan and B. C. Eu, Phys. Rev. A 45, 3670 (1992)] and some variants. The nonequilibrium distribution function theory consists of a nonequilibrium Ornstein-Zernike relation, a closure relation, and a nonequilibrium potential and is solved in spherical harmonics. The distortion of the fluid structure due to shear is presented as the difference between the nonequilibrium and equilibrium pair distribution functions. From comparison of the results of theory against results of equilibrium molecular dynamics with the nonequilibrium potential at low shear rates, it is concluded that, for a given nonequilibrium potential, the theory is reasonably accurate, especially with the modified hypernetted chain closure. The equilibrium molecular-dynamics results with the nonequilibrium potential are also compared against the results of nonequilibrium molecular dynamics and suggest that the nonequilibrium potential used is not very accurate. In continuing work, a nonequilibrium potential better suited to high shear rates [H. H. Gan and B. C. Eu, Phys. Rev. A 46, 6344 (1992)] is being tested. copyright 1999 The American Physical Society
International Nuclear Information System (INIS)
Cheong, Jae Hak
2010-01-01
A statistical evaluation methodology was developed to determine a candidate waste stream's compliance with clearance criteria, based upon the distribution of radionuclides in the waste stream, at a certain confidence level. For cases where no information on the radionuclide distribution is available, the relation between the arithmetic mean of the radioactivity concentration and its acceptable maximum standard deviation was demonstrated by applying the widely known Markov inequality and one-sided Chebyshev inequality. The relations between the arithmetic mean and its acceptable maximum standard deviation were newly derived for normally or lognormally distributed radionuclides in a waste stream, using probability density functions, cumulative distribution functions, and other statistical relations. The evaluation methodology was tested for a representative case at a 95% confidence level and a clearance level of 100 Bq/g of radioactivity concentration, and the acceptable range of the standard deviation at a given arithmetic mean was quantitatively shown and compared while varying the type of radionuclide distribution. Furthermore, it was statistically demonstrated that the allowable range of clearance can be expanded, even at the same confidence level, if information on the radionuclide distribution is available.
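The distribution-free case can be illustrated with the one-sided Chebyshev inequality: requiring P(X ≥ clearance) ≤ 1 − confidence yields a largest admissible standard deviation for a given arithmetic mean. The 100 Bq/g level and 95% confidence match the representative case above; the 60 Bq/g example mean is an assumption for illustration.

```python
import math

def max_std_one_sided_chebyshev(mean, clearance=100.0, confidence=0.95):
    """Largest standard deviation for which the one-sided Chebyshev bound
    P(X - mean >= k * sigma) <= 1 / (1 + k^2) still guarantees
    P(X >= clearance) <= 1 - confidence, with no distributional assumption."""
    if mean >= clearance:
        raise ValueError("mean must lie below the clearance level")
    alpha = 1.0 - confidence
    k = math.sqrt(1.0 / alpha - 1.0)  # solve 1 / (1 + k^2) = alpha
    return (clearance - mean) / k

# Example: mean 60 Bq/g, 95% confidence, 100 Bq/g clearance level.
sigma_max = max_std_one_sided_chebyshev(60.0)  # (100 - 60) / sqrt(19), about 9.18 Bq/g
```

Assuming a normal or lognormal distribution, as the abstract notes, permits a larger standard deviation at the same mean and confidence level, since the shape information replaces the worst-case bound.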
2010-04-01
... under the Food, Drug and Cosmetic Act. 1310.10 Section 1310.10 Food and Drugs DRUG ENFORCEMENT... Removal of the exemption of drugs distributed under the Food, Drug and Cosmetic Act. (a) The Administrator... manner of packaging of the drug product; (2) The manner of distribution and advertising of the drug...
2010-04-01
... 21 Food and Drugs 9 2010-04-01 2010-04-01 false Reinstatement of exemption for drug products distributed under the Food, Drug and Cosmetic Act. 1310.11 Section 1310.11 Food and Drugs DRUG ENFORCEMENT... Reinstatement of exemption for drug products distributed under the Food, Drug and Cosmetic Act. (a) The...
How Useful Are Species Distribution Models for Managing Biodiversity under Future Climates?
Directory of Open Access Journals (Sweden)
Steve J. Sinclair
2010-03-01
Full Text Available Climate change presents unprecedented challenges for biological conservation. Agencies are increasingly looking to modeled projections of species' distributions under future climates to inform management strategies. As government scientists with a responsibility to communicate the best available science to our policy colleagues, we question whether current modeling approaches and outputs are practically useful. Here, we synthesize conceptual problems with species distribution models (SDMs) associated with interspecific interactions, dispersal, ecological equilibria and time lags, evolution, and the sampling of niche space. Although projected SDMs have undoubtedly been critical in alerting us to the magnitude of climate change impacts, we conclude that until they offer insights more precise than what we can derive from basic ecological theory, their utility in deciding how to allocate scarce funds to large-scale conservation projects is questionable.
Boens, Noël; Van der Auweraer, Mark
2014-02-01
The deterministic identifiability analysis of photophysical models for the kinetics of excited-state processes, assuming errorless time-resolved fluorescence data, can verify whether the model parameters can be determined unambiguously. In this work, we have investigated the identifiability of several uncommon models for time-resolved fluorescence with underlying distributions of rate constants which lead to non-exponential decays. The mathematical functions used here for the description of non-exponential fluorescence decays are the stretched exponential or Kohlrausch function, the Becquerel function, the Förster type energy transfer function, decay functions associated with exponential, Gaussian and uniform distributions of rate constants, a decay function with extreme sub-exponential behavior, the Mittag-Leffler function and Heaviside's function. It is shown that all the models are uniquely identifiable, which means that for each specific model there exists a single parameter set that describes its associated fluorescence δ-response function.
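Two of the decay laws listed above can be written down directly. The parameter values are arbitrary, and the Becquerel form below is one common parameterization (it reduces to a single exponential as β → 1); the abstract does not specify which parameterization the authors used.

```python
import numpy as np

def kohlrausch(t, tau=2.0, beta=0.5):
    """Stretched-exponential (Kohlrausch) fluorescence delta-response:
    exp(-(t / tau)**beta), sub-exponential for 0 < beta < 1."""
    return np.exp(-(t / tau) ** beta)

def becquerel(t, tau=2.0, beta=0.5):
    """Becquerel (compressed-hyperbola) decay in one common parameterization:
    (1 + (1 - beta) * t / tau)**(-1 / (1 - beta))."""
    return (1.0 + (1.0 - beta) * t / tau) ** (-1.0 / (1.0 - beta))

t = np.linspace(0.0, 10.0, 101)
decay_k = kohlrausch(t)
decay_b = becquerel(t)  # both start at 1 and decay more slowly than exp(-t/tau)
```

Both functions have heavier tails than a single exponential with the same τ, which is the non-exponential behavior whose identifiability the paper analyzes.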
Distributed Consensus of Stochastic Delayed Multi-agent Systems Under Asynchronous Switching.
Wu, Xiaotai; Tang, Yang; Cao, Jinde; Zhang, Wenbing
2016-08-01
In this paper, the distributed exponential consensus of stochastic delayed multi-agent systems with nonlinear dynamics is investigated under asynchronous switching. The asynchronous switching considered here accounts for the time needed to identify the active modes of the multi-agent systems. Only after confirmation of a mode switch is received can the matched controller be applied, which means that the switching time of the matched controller in each node usually lags behind that of the system switching. In order to handle the coexistence of switched signals and stochastic disturbances, a comparison principle for stochastic switched delayed systems is first proved. By means of this extended comparison principle, several easily verified conditions for the existence of an asynchronously switched distributed controller are derived, such that stochastic delayed multi-agent systems with asynchronous switching and nonlinear dynamics can achieve global exponential consensus. Two examples are given to illustrate the effectiveness of the proposed method.
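The consensus objective itself can be illustrated in the simplest static-topology, delay-free, noise-free case, which is far simpler than the stochastic switched setting analyzed above. The graph, initial states, and step size are all invented for the sketch.

```python
import numpy as np

# Laplacian of an undirected 4-node path graph (static topology).
L = np.array([[ 1., -1.,  0.,  0.],
              [-1.,  2., -1.,  0.],
              [ 0., -1.,  2., -1.],
              [ 0.,  0., -1.,  1.]])

x = np.array([4.0, -1.0, 2.5, 0.5])  # initial node states
eps = 0.2  # step size; eps * (largest eigenvalue of L) < 2 for stability

for _ in range(200):
    x = x - eps * (L @ x)  # each node moves toward its neighbours' states

consensus_value = x.mean()  # states converge to the average of initial values
```

The paper's contribution concerns exactly what this sketch omits: time-varying (switched) Laplacians whose controllers lag the switches, state delays, and stochastic disturbances.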
Energy Technology Data Exchange (ETDEWEB)
Ruth, Mark; Pratt, Annabelle; Lunacek, Monte; Mittal, Saurabh; Wu, Hongyu; Jones, Wesley
2015-07-17
The combination of distributed energy resources (DER) and retail tariff structures to provide benefits to both utility consumers and utilities is poorly understood. To improve understanding, an Integrated Energy System Model (IESM) is being developed to simulate the physical and economic aspects of DER technologies, the buildings where they reside, and the feeders servicing them. The IESM was used to simulate 20 houses with home energy management systems (HEMS) on a single feeder under a time-of-use tariff, to estimate the economic and physical impacts on both the households and the distribution utilities. HEMS reduce consumers' electric bills by precooling houses in the hours before peak electricity pricing. Household savings are greater than the reduction in utility net revenue, indicating that HEMS can provide a societal benefit provided tariffs are structured so that utilities remain solvent. Utilization of HEMS reduces peak load during high-price hours but shifts it to hours with off-peak and shoulder prices, resulting in a higher peak load in those hours.
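The precooling arithmetic can be illustrated with a toy time-of-use bill: shift most of the cooling energy from a peak-priced hour to a cheaper shoulder hour, with a small efficiency penalty for precooling. All prices, loads, and the 10% penalty are invented, not values from the simulation.

```python
# Hypothetical three-tier time-of-use prices, $/kWh.
price = {"off_peak": 0.08, "shoulder": 0.15, "peak": 0.40}

# Baseline: 4 kWh of cooling bought in the peak hour.
baseline_cost = 4.0 * price["peak"]

# Precooling: buy 4.4 kWh in a shoulder hour instead (10% thermal-loss
# penalty), leaving 0.5 kWh of residual cooling still bought at peak price.
precool_cost = 4.4 * price["shoulder"] + 0.5 * price["peak"]

savings = baseline_cost - precool_cost  # household saving per day, $
```

The feeder-level side effect the abstract notes follows directly: the 4.4 kWh moved into the shoulder hour raises the load there, even as the peak-hour load falls.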
Effect of soft denture liner on stress distribution in supporting structures under a denture.
Kawano, F; Koran, A; Asaoka, K; Matsumoto, N
1993-01-01
This study examined the effect of a soft denture liner on the distribution of stresses in the denture-supporting structures. Dentures without a liner and with three configurations of a soft liner were simulated using a two-dimensional viscoelastic finite-element stress analysis. The stress intensity at functional force-bearing areas decreased when a soft denture liner was used. However, the stresses in the bone increased remarkably up to 3.0 seconds after loading. Because of the time-dependent effect of stresses applied to soft denture liners, denture patients who clench or brux may not benefit as greatly from soft denture liners. The study indicates that viscoelastic finite-element analysis is helpful for evaluating soft denture liners. Soft denture liners appear to be useful for improving the stress distribution in the supporting structures under dentures.
Misnaza, Sandra Patricia; Roncancio, Claudia Patricia; Peña, Isabel Cristina; Prieto, Franklin Edwin
2016-09-01
During 2012, 13% of deaths worldwide in children under the age of 28 days were due to congenital malformations. In Colombia, congenital malformations are the second leading cause of infant mortality. Objective: To determine the geographical distribution of extended perinatal mortality due to congenital malformations in Colombia between 1999 and 2008. Materials and methods: We conducted a cross-sectional study. We revised all death certificates issued between 1999 and 2008. We defined perinatal mortality as fetal or non-fetal deaths within the first 28 days after delivery in children with body weight ≥500 grams, and congenital malformations according to ICD-10 diagnostic codes Q000-Q999. The annual birth projection was used as the denominator. We defined high-mortality areas due to congenital malformations as those in the 90th percentile. Results: We recorded 22,361 perinatal deaths due to congenital malformations. The following provinces exceeded the 90th perinatal mortality percentile: Antioquia, Caldas, Risaralda, Huila, Quindío, Bogotá, Valle del Cauca and Guainía. Among the municipalities, the highest perinatal mortality rates were found in Giraldo, Ciudad Bolívar, Riosucio, Liborina, Supía, Alejandría, Sopetrán, San Jerónimo, Santa Fe de Antioquia and Marmato (ranging from 205.81 down to 74.18 per 10,000 live births). The perinatal mortality rate due to malformations of the circulatory system was 28.1 per 10,000 live births, whereas the rates for central nervous system defects and chromosomal abnormalities were 13.7 and 7.0, respectively. The Andean region showed high perinatal mortality rates due to congenital malformations. There is an urgent need to identify possible risk factors of perinatal mortality and implement prevention programs in that particular region.
Density and spatial distribution of Parkia biglobosa pattern in Benin under climate change
Directory of Open Access Journals (Sweden)
Fafunkè Titilayo Dotchamou
2016-06-01
Full Text Available Parkia biglobosa is an indigenous species which traditionally contributes to the resilience of the agricultural production system in terms of food security, source of income, poverty reduction and ecosystem stability. Therefore, it is important to improve knowledge of its density and its current and future spatial distribution. The main objective of this study is to evaluate the tree density and the effects of climate change on the future spatial distribution of the species, for better conservation. The modeling of the current and future geographical distribution of the species is based on the principle of Maximum Entropy (MaxEnt), using a total of 286 occurrence points from field work and the Global Biodiversity Information Facility (GBIF Data Portal, www.gbif.org). Two climatic models (HadGEM2_ES and Csiro_mk3_6_0) were used under two scenarios, RCP 2.6 and RCP 8.5, for the projection of the species distribution to the horizon 2050. Correlation analyses and a jackknife test helped to identify seven variables that are weakly correlated (r < 0.80) and have the highest modeling contribution. Soil, annual precipitation (BIO12) and temperature (average diurnal deviation) are the variables that contributed most to the performance of the models. Currently, 53% of the national territory, spread from north to south, is very suitable for the cultivation of P. biglobosa. The scenarios predict, at the horizon 2050, a loss of the habitats that are currently very suitable for the cultivation and conservation of P. biglobosa, to the benefit of moderately and weakly suitable habitats. The highest losses, 51% and 57%, will be registered with the HadGEM2_ES model under the two scenarios. These results reveal that the suitable habitat of the species is threatened by climate change in Benin. In order to limit damage such as decreased productivity and extinction of the species, appropriate solutions must be found.
Reside, April E; VanDerWal, Jeremy; Kutt, Alex S
2012-01-01
Identifying the species most vulnerable to extinction as a result of climate change is a necessary first step in mitigating biodiversity decline. Species distribution modeling (SDM) is a commonly used tool to assess potential climate change impacts on distributions of species. We use SDMs to predict geographic ranges for 243 birds of Australian tropical savannas, and to project changes in species richness and ranges under a future climate scenario between 1990 and 2080. Realistic predictions require recognition of the variability in species' capacity to track climatically suitable environments. Here we assess the effect of dispersal on model results by using three approaches: full dispersal, no dispersal, and a partial-dispersal scenario permitting species to track climate change at a rate of 30 km per decade. As expected, the projected distributions and richness patterns are highly sensitive to the dispersal scenario. Projected future range sizes decreased for 66% of species if full dispersal was assumed, but for 89% of species when no dispersal was assumed. However, realistic future predictions should not assume a single dispersal scenario for all species; as such, we assigned each species to the most appropriate dispersal category based on individual mobility and habitat specificity, permitting the best estimates of where species will be in the future. Under this “realistic” dispersal scenario, projected range sizes decreased for 67% of species, but migratory and tropical-endemic birds are predicted to benefit from climate change with increasing distributional area. Richness hotspots of tropical savanna birds are expected to move, increasing in southern savannas and southward along the east coast of Australia, but decreasing in the arid zone. Understanding the complexity of effects of climate change on species' range sizes by incorporating dispersal capacities is a crucial step toward developing adaptation policies for the conservation of
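The three dispersal treatments described above can be sketched on gridded suitability maps: no dispersal intersects future suitability with the current range, full dispersal takes future suitability as-is, and partial dispersal restricts it to cells reachable at a fixed rate. The random grids, 1 km cell size, and 30-cell reach per decade are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import binary_dilation

rng = np.random.default_rng(3)

# Toy boolean suitability grids (1 km cells): where an SDM says climate is
# suitable now and in a future decade. Values are random placeholders.
current = rng.random((100, 100)) > 0.6
future_suitable = rng.random((100, 100)) > 0.6

# Partial dispersal: the species tracks climate at 30 km per decade, so the
# future range is future-suitable cells within 30 cells of the current range.
reachable = binary_dilation(current, iterations=30)
partial_dispersal_range = future_suitable & reachable

# The two bounding scenarios used in the study:
no_dispersal_range = future_suitable & current
full_dispersal_range = future_suitable
```

By construction the partial-dispersal range always lies between the no-dispersal and full-dispersal extremes, which is why the choice of scenario drives the projected range-loss percentages.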
Digital Repository Service at National Institute of Oceanography (India)
Jayalakshmy, K.V.; Rao, K.K.
…17 foraminiferal species were clustered into 5 groups with row normalisation and varimax rotation for Q-mode factor analysis. The 19 stations were also grouped into 5 groups, with only 2 groups statistically significant, using column…
Limiting behavior of delayed sums under a non-identically distribution setup
Directory of Open Access Journals (Sweden)
Chen Pingyan
2008-12-01
Full Text Available We present an accurate description of the limiting behavior of delayed sums under a non-identically distributed setup, and deduce Chover-type laws of the iterated logarithm for them. These results complement and extend those of Vasudeva and Divanji (Theory of Probability and its Applications, 37 (1992), 534-542).
Large deflection analysis of cantilever beam under end point and distributed load
DEFF Research Database (Denmark)
Kimiaeifar, Amin; Tolou, N; Barari, Amin
2014-01-01
…distributed loads. A direct nonlinear solution by use of the homotopy analysis method was implemented to derive the semi-exact solution of the trajectory position of any point along the beam length. For the purpose of comparison, the deflections were calculated and compared to those of the finite element method, which … requires numerical solution of simultaneous equations, a significant drawback for optimization or reliability analysis. This paper is motivated to overcome these shortcomings by presenting an analytical solution for the large deflection analysis of a cantilever beam under free end point and uniform…
Ge, Xuezhen; Jiang, Chao; Chen, Linghong; Qiu, Shuang; Zhao, Yuxiang; Wang, Tao; Zong, Shixiang
2017-01-01
Euwallacea fornicatus (Eichhoff) is an important forest pest that has caused serious damage in America and Vietnam. In 2014, it attacked forests of Acer trialatum in the Yunnan province of China, creating concern in China's Forestry Bureau. We used the CLIMEX model to predict and compare the potential distribution of E. fornicatus in China under current (1981–2010) and projected climate conditions (2011–2040), using one scenario (RCP8.5) and one global climate model (GCM), CSIRO-Mk3-6-0. Unde...
Directory of Open Access Journals (Sweden)
Xuezhen Ge
Full Text Available As the primary pest of palm trees, Rhynchophorus ferrugineus (Olivier) (Coleoptera: Curculionidae) has caused serious harm to palms since it first invaded China. The present study used CLIMEX 1.1 to predict the potential distribution of R. ferrugineus in China according to both current climate data (1981-2010) and future climate warming estimates based on simulated climate data for the 2020s (2011-2040) provided by the Tyndall Center for Climate Change Research (TYN SC 2.0). Additionally, the Ecoclimatic Index (EI) values calculated for different climatic conditions (current and future, as simulated by the B2 scenario) were compared. Areas with a suitable climate for R. ferrugineus distribution were located primarily in central China according to the current climate data, with the northern boundary of the distribution reaching to 40.1°N and including Tibet, north Sichuan, central Shaanxi, south Shanxi, and east Hebei. There was little difference in the potential distribution predicted by the four emission scenarios according to future climate warming estimates. The primary prediction under future climate warming models was that, compared with the current climate model, the number of highly favorable habitats would increase significantly and expand into northern China, whereas the number of both favorable and marginally favorable habitats would decrease. Contrast analysis of EI values suggested that climate change and the density of site distribution were the main effectors of the changes in EI values. These results will help to improve control measures, prevent the spread of this pest, and revise the targeted quarantine areas.
Ge, Xuezhen; He, Shanyong; Wang, Tao; Yan, Wei; Zong, Shixiang
2015-01-01
As the primary pest of palm trees, Rhynchophorus ferrugineus (Olivier) (Coleoptera: Curculionidae) has caused serious harm to palms since it first invaded China. The present study used CLIMEX 1.1 to predict the potential distribution of R. ferrugineus in China according to both current climate data (1981-2010) and future climate warming estimates based on simulated climate data for the 2020s (2011-2040) provided by the Tyndall Center for Climate Change Research (TYN SC 2.0). Additionally, the Ecoclimatic Index (EI) values calculated for different climatic conditions (current and future, as simulated by the B2 scenario) were compared. Areas with a suitable climate for R. ferrugineus distribution were located primarily in central China according to the current climate data, with the northern boundary of the distribution reaching to 40.1°N and including Tibet, north Sichuan, central Shaanxi, south Shanxi, and east Hebei. There was little difference in the potential distribution predicted by the four emission scenarios according to future climate warming estimates. The primary prediction under future climate warming models was that, compared with the current climate model, the number of highly favorable habitats would increase significantly and expand into northern China, whereas the number of both favorable and marginally favorable habitats would decrease. Contrast analysis of EI values suggested that climate change and the density of site distribution were the main effectors of the changes in EI values. These results will help to improve control measures, prevent the spread of this pest, and revise the targeted quarantine areas.
PERFORMANCE ANALYSIS OF A MODIFIED CFAR BASED RADAR DETECTOR UNDER PEARSON DISTRIBUTED CLUTTER
Directory of Open Access Journals (Sweden)
Amritakar Mandal
2014-12-01
Full Text Available An adaptive target detector in a radar system is used to extract targets from the background in a noisy environment of unknown statistics. The constant false alarm rate (CFAR) detector is a well-known detection algorithm used in almost every modern radar. The cell-averaging CFAR (CA-CFAR) is the optimum detector in a homogeneous clutter environment when the reference cells contain independent, identically distributed exponential signals. The performance of CA-CFAR degrades seriously when the clutter power varies substantially in a non-homogeneous background. To overcome this performance degradation, a non-linear compression technique based CFAR has been introduced for adaptive thresholding, to meet the challenges of target detection in various degrees of Pearson distributed non-homogeneous clutter. Extensive MATLAB simulations using various levels of clutter input show the effectiveness of the proposed design. An improvement in signal-to-noise ratio (SNR) has been achieved using the Swerling I model for a Rayleigh fluctuating target in the backdrop of heavy clutter.
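The cell-averaging baseline mentioned above can be sketched as follows. This is a generic textbook CA-CFAR for exponential clutter, not the paper's modified Pearson-clutter detector; all parameter values are illustrative.

```python
import numpy as np

def ca_cfar(x, num_ref, num_guard, pfa):
    """Boolean detections for each cell of `x` using a cell-averaging
    CFAR with `num_ref` reference and `num_guard` guard cells per side.
    The scaling factor alpha is the standard CA-CFAR choice that fixes
    the false-alarm probability `pfa` in exponential clutter."""
    n = 2 * num_ref                        # total reference cells
    alpha = n * (pfa ** (-1.0 / n) - 1.0)  # threshold multiplier
    det = np.zeros(len(x), dtype=bool)
    for i in range(len(x)):
        lo = x[max(0, i - num_guard - num_ref): max(0, i - num_guard)]
        hi = x[i + num_guard + 1: i + num_guard + 1 + num_ref]
        ref = np.concatenate([lo, hi])
        if len(ref) == n:                  # skip edge cells without a full window
            det[i] = x[i] > alpha * ref.mean()
    return det

rng = np.random.default_rng(0)
clutter = rng.exponential(1.0, 64)
clutter[32] += 50.0                        # strong target injected in cell 32
hits = ca_cfar(clutter, num_ref=8, num_guard=2, pfa=1e-3)
```

Because the threshold is a multiple of the local clutter mean, the false-alarm rate stays constant as the clutter power changes, which is exactly the property that breaks down when the clutter is non-homogeneous.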
Vlad, Marcel Ovidiu; Tsuchiya, Masa; Oefner, Peter; Ross, John
2002-01-01
We investigate the statistical properties of systems with random chemical composition and try to obtain a theoretical derivation of the self-similar Dirichlet distribution, which is used empirically in molecular biology, environmental chemistry, and geochemistry. We consider a system made up of many chemical species and assume that the statistical distribution of the abundance of each chemical species in the system is the result of a succession of a variable number of random dilution events, which can be described by using the renormalization-group theory. A Bayesian approach is used for evaluating the probability density of the chemical composition of the system in terms of the probability densities of the abundances of the different chemical species. We show that for large cascades of dilution events, the probability density of the composition vector of the system is given by a self-similar probability density of the Dirichlet type. We also give an alternative formal derivation for the Dirichlet law based on the maximum entropy approach, by assuming that the average values of the chemical potentials of different species, expressed in terms of molar fractions, are constant. Although the maximum entropy approach leads formally to the Dirichlet distribution, it does not clarify the physical origin of the Dirichlet statistics and has serious limitations. The random theory of dilution provides a physical picture for the emergence of Dirichlet statistics and makes it possible to investigate its validity range. We discuss the implications of our theory in molecular biology, geochemistry, and environmental science.
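The Gamma-normalization construction below is a standard way to generate the Dirichlet-distributed composition vectors discussed above. It numerically illustrates the distribution's defining property (positive molar fractions summing to one); it is not the paper's renormalization-group or maximum-entropy derivation.

```python
import numpy as np

rng = np.random.default_rng(42)

def dirichlet_composition(alphas, rng):
    """Sample one composition vector ~ Dirichlet(alphas) by normalizing
    independent Gamma(alpha_i, 1) variates."""
    g = rng.gamma(shape=np.asarray(alphas, dtype=float), scale=1.0)
    return g / g.sum()

# One composition vector for a 4-species "chemical system"
comp = dirichlet_composition([0.5, 1.0, 2.0, 4.0], rng)
```

The expected abundance of each species is proportional to its alpha parameter, so averaging many samples recovers the ordering of the alphas.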
Energy statistics yearbook 2000
International Nuclear Information System (INIS)
2002-01-01
The Energy Statistics Yearbook 2000 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-third in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities
Energy statistics yearbook 2001
International Nuclear Information System (INIS)
2004-01-01
The Energy Statistics Yearbook 2001 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-fifth in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities
Energy statistics yearbook 2002
International Nuclear Information System (INIS)
2005-01-01
The Energy Statistics Yearbook 2002 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-sixth in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities
Hung, R. J.
1995-01-01
A set of mathematical formulations is adopted to study vapor deposition from source materials driven by a heat transfer process under normal and oblique directions of gravitational acceleration, in an extremely low pressure environment of 10^-2 mm Hg. A series of time animations of the initiation and development of flow and temperature profiles during the course of vapor deposition has been obtained through numerical computation. Computations show that vapor deposition is accomplished by the transfer of vapor through a fairly complicated recirculating flow pattern under normal-direction gravitational acceleration. It is clear that there is no way to produce homogeneous thin crystalline films with fine grains under such a complicated recirculating flow pattern with a non-uniform temperature distribution under normal-direction gravitational acceleration. There is no vapor deposition, due to a stably stratified medium without convection, for reverse normal-direction gravitational acceleration. Vapor deposition under oblique-direction gravitational acceleration introduces a reduced gravitational acceleration in the vertical direction, which is favorable for producing homogeneous thin crystalline films. However, oblique-direction gravitational acceleration also induces an unfavorable gravitational acceleration along the horizontal direction, which is responsible for initiating a complicated recirculating flow pattern. In other words, it is necessary to carry out vapor deposition under reduced gravity in future space shuttle experiments, in an extremely low pressure environment, to produce homogeneous crystalline films with fine grains. Fluid mechanics simulation can be used as a tool to suggest the optimal experimental setup to achieve the goal of processing the best nonlinear optical materials.
Schätzle, W
1976-05-31
The normal distribution of several lysosomal enzymes was studied in 20 guinea pigs. In the outer hair cells, lysosomal enzymes are mainly localized at the apical cell pole, while in inner hair cells the distribution was uniform. Non-lysosomal enzymes like alkaline phosphatase are of predominantly basal localization. The concentration of some lysosomal enzymes like N-acetyl-beta-glucosaminidase was higher in outer than in inner hair cells, while others like acid phosphatase, beta-glucuronidase and sulfatase showed a stronger reaction in the inner hair cells. After 10 days of sound overstimulation at 120 dB for 1 h a day, there was an increase of lysosomal enzyme content, notably in the outer hair cells. There was no change in non-lysosomal enzymes. Under these conditions there might be a partial destruction of cellular organelles, eliminated by lysosomal activity without loss of the whole cell. In addition, the distribution and possible function of lysosomal enzymes in other labyrinthine tissues are discussed.
Irigoitia, Manuel Marcial; Braicovich, Paola Elizabeth; Lanfranchi, Ana Laura; Farber, Marisa Diana; Timi, Juan Tomás
2018-02-21
In order to evaluate the infestation by anisakids present in elasmobranchs and their distribution in the Argentine Sea, this study was carried out at a regional scale with the following aims: 1) to identify the anisakid species present in skates under exploitation; 2) to characterize these infestations quantitatively; and 3) to determine the factors driving the variability in parasite burdens across skate species. A total of 351 skates, belonging to 3 species (218 Sympterygia bonapartii, 86 Zearaja chilensis and 47 Atlantoraja castelnaui) and from different localities of the Argentine Sea, were examined for anisakids. Parasites were found in the stomach wall at high prevalence in some samples. Based on morphology and mtDNA cox2 sequence analyses (from 24 larval worms), specimens were identified as Anisakis berlandi, A. pegreffii and Pseudoterranova cattani, the last two known to be potentially pathogenic for humans. Differential distribution patterns were observed across parasite and host species. In general, fish caught in southern and deeper waters exhibited higher loads of Anisakis spp., whereas infestation levels by P. cattani increased in larger skates. Taking into account that the mere presence of worms or their antigens in fish meat can provoke allergic responses, information on the distribution of parasites and their variability is essential for the implementation of food safety practices. Copyright © 2017 Elsevier B.V. All rights reserved.
Directory of Open Access Journals (Sweden)
Xinwei Wang
2016-11-01
Full Text Available Sandwich structures are widely used in practice and thus various engineering theories adopting simplifying assumptions are available. However, most engineering theories of beams, plates and shells cannot recover all stresses accurately through their constitutive equations. Therefore, the soft core is directly modeled by two-dimensional (2D) elasticity theory without any pre-assumption on the displacement field. The top and bottom faces act like elastic supports on the top and bottom edges of the core. The differential equations of the 2D core are then solved by the harmonic differential quadrature method (HDQM). To circumvent the difficulties in dealing with locally distributed loads by point discrete methods such as the HDQM, a general and rigorous way is proposed to treat the locally distributed load. Detailed formulations are provided. The static behavior of sandwich panels under different locally distributed loads is investigated. For verification, results are compared with data obtained by ABAQUS with very fine meshes. A high degree of accuracy in both displacement and stress has been observed.
Fruit yield and root system distribution of 'Tommy Atkins' mango under different irrigation regimes
Directory of Open Access Journals (Sweden)
Marcelo R. dos Santos
2014-04-01
Full Text Available This study aimed to evaluate the fruit yield and the root system distribution of 'Tommy Atkins' mango under different irrigation regimes in the semiarid region of Bahia. The experimental design was completely randomized with five treatments and three replicates: 1 - irrigation supplying 100% of ETc in phases I, II and III; 2 - regulated deficit irrigation (RDI) supplying 50% of ETc in phase I (beginning of flowering to early fruit growth); 3 - RDI supplying 50% of ETc in phase II (start of expansion until the beginning of physiological maturity); 4 - RDI supplying 50% of ETc in phase III (physiologically mature fruits); 5 - no irrigation during all three phases. Regulated deficit irrigation supplying 50% of ETc during phases I and II provided the greatest root length density of 'Tommy Atkins' mango. Regardless of management strategy, roots developed throughout the evaluated soil volume, with the highest density concentrated at 0.50 to 1.50 m from the trunk and 0.20 to 0.90 m depth, suggesting this region is the best place for fertilizer application as well as for soil water sensor placement. The application of RDI during fruit set does not influence either root distribution or production. Root system and crop production are significantly reduced under no-irrigation conditions.
The potential distribution of bioenergy crops in Europe under present and future climate
International Nuclear Information System (INIS)
Tuck, Gill; Glendining, Margaret J.; Smith, Pete; Wattenbach, Martin; House, Jo I.
2006-01-01
We have derived maps of the potential distribution of 26 promising bioenergy crops in Europe, based on simple rules for suitable climatic conditions and elevation. Crops suitable for temperate and Mediterranean climates were selected from four groups: oilseeds (e.g. oilseed rape, sunflower), starch crops (e.g. potatoes), cereals (e.g. barley) and solid biofuel crops (e.g. sorghum, Miscanthus). The impact of climate change under different scenarios and GCMs on the potential future distribution of these crops was determined, based on predicted future climatic conditions. Climate scenarios based on four IPCC SRES emission scenarios, A1FI, A2, B1 and B2, implemented by four global climate models, HadCM3, CSIRO2, PCM and CGCM2, were used. The potential distribution of temperate oilseeds, cereals, starch crops and solid biofuels is predicted to increase in northern Europe by the 2080s, due to increasing temperatures, and decrease in southern Europe (e.g. Spain, Portugal, southern France, Italy, and Greece) due to increased drought. Mediterranean oil and solid biofuel crops, currently restricted to southern Europe, are predicted to extend further north due to higher summer temperatures. Effects become more pronounced with time and are greatest under the A1FI scenario and for models predicting the greatest climate forcing. Different climate models produce different regional patterns. All models predict that bioenergy crop production in Spain is especially vulnerable to climate change, with many temperate crops predicted to decline dramatically by the 2080s. The choice of bioenergy crops in southern Europe will be severely reduced in future unless measures are taken to adapt to climate change. (author)
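The "simple rules" mapping approach described above can be illustrated with a toy suitability function; the threshold values and grid cells below are invented for the example, not the paper's actual crop rules.

```python
# Hypothetical rule-based suitability for one temperate bioenergy crop:
# a grid cell is potentially suitable only if all climate/elevation
# thresholds are met. Thresholds here are illustrative, not the paper's.
def suitable(t_mean_c, rain_mm, elev_m,
             t_range=(6.0, 20.0), min_rain=450.0, max_elev=800.0):
    """Apply simple suitability rules to one grid cell."""
    return (t_range[0] <= t_mean_c <= t_range[1]
            and rain_mm >= min_rain
            and elev_m <= max_elev)

# Baseline climate vs. a warmer/drier scenario for three example cells
# (mean temperature degC, annual rainfall mm, elevation m)
baseline = [(8.0, 700, 200), (15.0, 500, 300), (17.0, 480, 100)]
future   = [(10.5, 650, 200), (18.0, 420, 300), (21.0, 350, 100)]

n_base = sum(suitable(*cell) for cell in baseline)
n_fut  = sum(suitable(*cell) for cell in future)
```

Re-applying the same fixed rules to projected future climate surfaces is what produces the predicted northward expansion and southern contraction: in this toy case, the two "southern" cells fail the rainfall or temperature rule under the future climate.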
International Nuclear Information System (INIS)
Reed, Donald Timothy; Borkowski, Marian; Lucchini, Jean-Francois; Ams, David; Richmann, M.K.; Khaing, H.; Swanson, J.S.
2010-01-01
The fate and potential mobility of multivalent actinides in the subsurface is receiving increased attention as the DOE looks to cleanup the many legacy nuclear waste sites and associated subsurface contamination. Plutonium, uranium and neptunium are the near-surface multivalent contaminants of concern and are also key contaminants for the deep geologic disposal of nuclear waste. Their mobility is highly dependent on their redox distribution at their contamination source as well as along their potential migration pathways. This redox distribution is often controlled, especially in the near-surface where organic/inorganic contaminants often coexist, by the direct and indirect effects of microbial activity. Under anoxic conditions, indirect and direct bioreduction mechanisms exist that promote the prevalence of lower-valent species for multivalent actinides. Oxidation-state-specific biosorption is also an important consideration for long-term migration and can influence oxidation state distribution. Results of ongoing studies to explore and establish the oxidation-state specific interactions of soil bacteria (metal reducers and sulfate reducers) as well as halo-tolerant bacteria and Archaea for uranium, neptunium and plutonium will be presented. Enzymatic reduction is a key process in the bioreduction of plutonium and uranium, but co-enzymatic processes predominate in neptunium systems. Strong sorptive interactions can occur for most actinide oxidation states but are likely a factor in the stabilization of lower-valent species when more than one oxidation state can persist under anaerobic microbiologically-active conditions. These results for microbiologically active systems are interpreted in the context of their overall importance in defining the potential migration of multivalent actinides in the subsurface.
International Nuclear Information System (INIS)
Zhixiang, Z.
1983-01-01
A least-squares fit has been performed using the chi-squared distribution function for all available evaluated data for s-wave reduced neutron widths of several nuclei. The number of degrees of freedom and the average value have been obtained. Missing weak s-wave levels and extra p-wave levels, where present, have been taken into account. For 75As and 103Rh, the s-wave population was separated by Bayes' theorem before making the fit. The results thus obtained are consistent with the Porter-Thomas distribution, i.e., a chi-squared distribution with ν = 1, as one would expect. It has not been found in this work that the number of degrees of freedom for the distribution of s-wave reduced neutron widths might be greater than one, as reported by H.C. Sharma et al. (1976) at the international conference on interactions of neutrons with nuclei. (Auth.)
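The core fitting idea can be sketched as a maximum-likelihood estimate of the number of degrees of freedom ν for widths following a scaled chi-squared distribution. This is a simplified sketch on synthetic Porter-Thomas (ν = 1) data; the paper's fit additionally corrects for missed weak levels and extra p-wave levels, which is omitted here.

```python
import math
import numpy as np

def chi2_loglik(widths, nu):
    """Log-likelihood of widths ~ chi-squared with nu d.o.f., with the
    scale fixed so the distribution mean equals the sample mean."""
    w = np.asarray(widths, dtype=float)
    mu = w.mean()
    x = w * nu / mu                       # transform to a chi2_nu variable
    return np.sum((nu / 2 - 1) * np.log(x) - x / 2
                  - (nu / 2) * math.log(2.0) - math.lgamma(nu / 2)
                  + math.log(nu / mu))    # Jacobian of the scaling

rng = np.random.default_rng(1)
widths = rng.standard_normal(500) ** 2    # exact Porter-Thomas sample (nu = 1)
grid = np.linspace(0.5, 3.0, 251)
nu_hat = grid[np.argmax([chi2_loglik(widths, nu) for nu in grid])]
```

With 500 synthetic levels the grid-search estimate lands close to the true ν = 1, mirroring the paper's conclusion that the fitted degrees of freedom are consistent with Porter-Thomas.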
Energy Technology Data Exchange (ETDEWEB)
Kim, Kihwan, E-mail: kihwankim@kaeri.re.kr; Euh, Dong-Jin; Chu, In-Cheol; Youn, Young-Jung; Choi, Hae-Seob; Kwon, Tae-Soon, E-mail: tskwon@kaeri.re.kr
2013-12-15
Highlights: • An experimental facility at 1/5 scale was designed to perform various hydraulic tests of an APR+ reactor. • Two kinds of experiments, balanced and unbalanced flows under 4-pump running conditions, were carried out. • The core inlet flow rates and exit pressure distributions were measured and analyzed at 257 discrete points. • The coolant mixing characteristics were investigated with the sectional pressure loss coefficients. - Abstract: The core inlet flow rates and exit pressure distributions of an APR+ (Advanced Power Reactor Plus) reactor were evaluated experimentally with the ACOP (APR+ Core Flow and Pressure) test facility. The ACOP test facility was constructed at a linear reduced scale of 1/5 with reference to the APR+ reactor. The major flow path from the cold leg to the hot leg was preserved according to the principle of similarity. The core region was simulated using 257 core simulators, representative of the real HIPER fuel assemblies that the APR+ reactor adopted. The core inlet flow rates and pressure distributions along the main flow path, which provide significant input data for evaluating the core thermal margin and reactor safety, were obtained from differential pressures measured at the core simulators representing 257 fuel assemblies, and from static or differential pressures at 584 points, respectively. Two kinds of experiments, under 4-pump balanced and unbalanced flow conditions, were conducted to examine the hydraulic characteristics of the reactor coolant flow. The mass balance and overall pressure drop were carefully examined to check the reliability of the obtained values. The inlet flow rates of the two test results showed similar distributions, which met the hydraulic performance requirement. The details of these experiments, the facility, and a data analysis are also described in this paper.
Zhang, Z J; Ong, S H; Lynn, H S; Peng, W X; Zhou, Y B; Zhao, G M; Jiang, Q W
2008-09-01
A new generalization of the negative binomial distribution (GNBD) is introduced and fitted to counts of Oncomelania hupensis, the intermediate host of Schistosoma japonicum, made, in areas of Chinese lakeland and marshland, early in the winter of 2005 and late in the spring of 2006. The GNBD was found to fit the snail data better than the standard negative binomial distribution (NBD) that has previously been widely used to model the distribution of O. hupensis. With two more parameters than the NBD, the GNBD can integrate many discrete distributions and is more flexible than the NBD in modelling O. hupensis. It also provides a better theoretical distribution for the quantitative study of O. hupensis, especially in building an accurate prediction model of snail density. The justification for adopting the GNBD is discussed. The GNBD allows researchers to broaden the field in the quantitative study not only of O. hupensis and schistosomiasis japonica but also of other environment-related helminthiases and family-clustered diseases that have, traditionally, been modelled using the NBD.
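The standard NBD baseline that the GNBD generalizes can be fitted to overdispersed count data by the method of moments, as sketched below. The snail counts are invented for illustration; the paper's GNBD has two extra parameters and is fitted differently.

```python
import numpy as np

def nbd_moments(counts):
    """Method-of-moments estimates (m, k) for a negative binomial with
    mean m and aggregation parameter k (variance = m + m^2 / k)."""
    c = np.asarray(counts, dtype=float)
    m, v = c.mean(), c.var(ddof=1)
    if v <= m:
        raise ValueError("no overdispersion: NBD reduces to Poisson")
    k = m * m / (v - m)
    return m, k

# Hypothetical quadrat counts of O. hupensis (invented, strongly clumped)
counts = [0, 0, 1, 0, 3, 7, 0, 2, 0, 12, 1, 0, 5, 0, 0, 9]
m, k = nbd_moments(counts)
```

A small k (well below 1 here) signals strong aggregation, the clumped spatial pattern that makes the NBD the traditional model for snail counts.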
Energy Technology Data Exchange (ETDEWEB)
Dučić, Tanja, E-mail: tanja.ducic@desy.de; Borchert, Manuela [DESY, Notkestrasse 85, D-22607 Hamburg (Germany); Savić, Aleksandar; Kalauzi, Aleksandar; Mitrović, Aleksandra; Radotić, Ksenija, E-mail: tanja.ducic@desy.de [University of Belgrade, Kneza Višeslava 1, 11000 Belgrade (Serbia)
2013-03-01
Synchrotron-radiation-based X-ray microfluorescence has been used for in situ investigation of the distribution of micronutrient and macronutrient elements in an unstained cross section of a stem of the monocotyledonous liana plant Dioscorea balcanica Košanin. The elemental allocation has been quantified and the grouping/co-localization in straight and twisted stem internodes has been analysed. Synchrotron-based X-ray microfluorescence (µSXRF) is an analytical method suitable for in situ investigation of the distribution of micronutrient and macronutrient elements in several-micrometres-thick unstained biological samples, e.g. single cells and tissues. Elements are mapped and quantified at sub-p.p.m. concentrations. In this study the quantity, distribution and grouping/co-localization of various elements have been identified in straight and twisted internodes of the stems of the monocotyledonous climber D. balcanica Košanin. Three different statistical methods were employed to analyse the macronutrient and micronutrient distributions and co-localization. Macronutrient elements (K, P, Ca, Cl) are distributed homogeneously in both straight and twisted internodes. Micronutrient elements are mostly grouped in the vasculature and in the sclerenchyma cell layer. In addition, co-localization of micronutrient elements is much more prominent in twisted than in straight internodes. These image analyses and statistical methods provided very similar outcomes and could be applied to various types of biological samples imaged by µSXRF.
Privacy-Preserving k-Means Clustering under Multiowner Setting in Distributed Cloud Environments
Directory of Open Access Journals (Sweden)
Hong Rong
2017-01-01
Full Text Available With the advent of big data era, clients who lack computational and storage resources tend to outsource data mining tasks to cloud service providers in order to improve efficiency and reduce costs. It is also increasingly common for clients to perform collaborative mining to maximize profits. However, due to the rise of privacy leakage issues, the data contributed by clients should be encrypted using their own keys. This paper focuses on privacy-preserving k-means clustering over the joint datasets encrypted under multiple keys. Unfortunately, existing outsourcing k-means protocols are impractical because not only are they restricted to a single key setting, but also they are inefficient and nonscalable for distributed cloud computing. To address these issues, we propose a set of privacy-preserving building blocks and outsourced k-means clustering protocol under Spark framework. Theoretical analysis shows that our scheme protects the confidentiality of the joint database and mining results, as well as access patterns under the standard semihonest model with relatively small computational overhead. Experimental evaluations on real datasets also demonstrate its efficiency improvements compared with existing approaches.
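One classic privacy-preserving building block of the kind the abstract refers to is additive secret sharing, which lets servers compute a joint sum (e.g. the per-cluster numerators that k-means centroid updates need) without any server seeing an individual client's value. This is a generic toy sketch, not the paper's Spark protocol or its multi-key encryption scheme.

```python
import secrets

P = 2**61 - 1  # public prime modulus; inputs must lie in [0, P)

def share(value, n_shares):
    """Split `value` into n additive shares modulo P; any subset of
    fewer than n shares reveals nothing about `value`."""
    shares = [secrets.randbelow(P) for _ in range(n_shares - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

clients = [13, 8, 21]                         # private inputs
all_shares = [share(v, 3) for v in clients]   # each client shares its value

# Server j receives the j-th share from every client and sums them;
# no single server ever sees a raw input.
server_sums = [sum(col) % P for col in zip(*all_shares)]
total = sum(server_sums) % P                  # reconstructed joint sum
```

Summation is the only operation needed here, which is why additive sharing suffices; the multi-key outsourcing setting in the paper requires heavier cryptographic machinery.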
Directory of Open Access Journals (Sweden)
Qing Shuang
2016-01-01
Full Text Available The stability of water service is a central concern in industrial production, public safety, and academic research. This paper establishes a service evaluation model for the water distribution network (WDN). Serviceability is measured in three aspects: (1) the functionality of structural components under a disaster environment; (2) the recognition of the cascading failure process; and (3) the calculation of system reliability. Node and edge failures in a WDN are interrelated under seismic excitations. The cascading failure process is modeled through the balance of water supply and demand. The matrix-based system reliability (MSR) method is used to represent the system events and calculate the non-failure probability. An example is used to illustrate the proposed method. Cascading failure processes with different node failures are simulated, the serviceability is analyzed, and the critical node can be identified. The results show that an aged network has a greater influence on system service under a seismic scenario, and that maintenance can improve the anti-disaster ability of the WDN. Priority should be given to controlling the time between the initial failure and the first secondary failure, since taking post-disaster emergency measures within this period can largely cut down the spread of the cascade effect in the whole WDN.
Directory of Open Access Journals (Sweden)
T. Viskari
2012-12-01
Full Text Available Aerosol characteristics can be measured with different instruments providing observations that are not trivially inter-comparable. The Extended Kalman Filter (EKF) is introduced here as a method to estimate aerosol particle number size distributions from multiple simultaneous observations. The focus in Part 1 of this work is on general aspects of the EKF in the context of Differential Mobility Particle Sizer (DMPS) measurements. Additional instruments and their implementations are discussed in Part 2. The University of Helsinki Multi-component Aerosol model (UHMA) is used to propagate the size distribution in time. At each observation time (10 min apart), the time-evolved state is updated with the raw particle mobility distributions, measured with two DMPS systems. The EKF approach was validated by calculating the bias and the standard deviation of the estimated size distributions with respect to the raw measurements. These were compared to corresponding bias and standard deviation values for particle number size distributions obtained from the raw measurements by inversion of the instrument kernel matrix. Despite the assumptions made in the EKF implementation, the EKF was found to be more accurate than the kernel matrix inversion in terms of bias, and comparable in terms of standard deviation. Potential further improvements of the EKF implementation are discussed.
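The state update described above follows the standard Kalman measurement-update equations. A minimal linear sketch for a toy two-bin size distribution observed through a summing instrument (all matrices and values are illustrative assumptions, not the UHMA/DMPS configuration):

```python
import numpy as np

def kalman_update(x, P, z, H, R):
    """Standard Kalman measurement update.
    x: prior state (n,); P: prior covariance (n, n);
    z: observation (m,); H: observation operator (m, n); R: obs covariance (m, m)."""
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_post = x + K @ (z - H @ x)       # updated state estimate
    P_post = (np.eye(len(x)) - K @ H) @ P
    return x_post, P_post

x = np.array([100.0, 50.0])   # prior particle counts in two size bins
P = np.diag([25.0, 25.0])     # prior uncertainty
z = np.array([110.0])         # instrument observes only the total count
H = np.array([[1.0, 1.0]])
R = np.array([[10.0]])
x_post, P_post = kalman_update(x, P, z, H, R)
```

The extended filter of the paper additionally linearizes the UHMA model dynamics around the current state before each such update.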
Uptake and Distribution of Aluminum in Root Apices of Two Rice Varieties under Aluminum Stress
Directory of Open Access Journals (Sweden)
MIFTAHUDIN
2007-09-01
Full Text Available Aluminum (Al) toxicity is the major limiting factor of plant growth and production in acid soils. The target of Al toxicity is the root tip, and its main effect is root growth inhibition. The aim of this research was to study the uptake and distribution of Al in root apices of two rice varieties, IR64 (Al-sensitive) and Krowal (Al-tolerant), which were grown in nutrient solution containing 0, 15, 30, 45, and 60 ppm of Al. Root growth was significantly inhibited in both rice varieties at Al concentrations as low as 15 ppm. The adventive roots of both varieties showed stunted growth in response to Al stress. There was no difference in root growth inhibition between the two rice varieties or among Al concentrations. Al uptake in root apices was analyzed qualitatively and quantitatively. Histochemical staining of roots using hematoxylin showed a dark purple color in the 1 mm region of Al-treated root apices. Rice var. IR64 tended to take up more Al in the root tip than Krowal did. However, there was no statistically significant difference (p = 0.176) in root Al content between the varieties in response to different concentrations and periods of Al treatment. Al distribution in root apices was found in the epidermal and subepidermal regions in both rice varieties. Based on these results, rice var. Krowal, previously grouped as an Al-tolerant variety, has root growth and physiological responses to Al stress similar to those of the Al-sensitive variety IR64.
Directory of Open Access Journals (Sweden)
Antônia Arleudina Barros de Melo
2016-08-01
Full Text Available Spatial distribution of organic carbon and humic substances in irrigated soils under different management systems in a semi-arid zone in Ceará, Brazil. Knowledge of the spatial variability in soil properties can contribute to effective use and management. This study was conducted to evaluate the spatial distribution of the levels of total organic carbon (TOC) and humic substances (humic acid fraction (C-FAH), fulvic acid fraction (C-FAF), and humin fraction (C-HUM)) in an Ultisol under different land uses, located in the irrigated perimeter of Baixo Acaraú-CE, in the transition to semiarid Ceará. The distribution and spatial dependence of the humic fractions were evaluated using descriptive statistics, including semivariogram analysis and data interpolation (kriging). The TOC showed a pure nugget effect, whereas the other fractions showed moderate spatial dependence. Forested and banana cultivation areas showed similar distributions of C-FAH and C-FAF, due to the high input of organic matter (leaves and pseudostems) in the area of banana cultivation and the absence of soil disturbance in the forested area. Data interpolation (kriging) and mapping were useful tools to assess the distribution and spatial dependence of soil attributes.
Understanding Statistics - Cancer Statistics
Annual reports of U.S. cancer statistics including new cases, deaths, trends, survival, prevalence, lifetime risk, and progress toward Healthy People targets, plus statistical summaries for a number of common cancer types.
Statistics for dental researchers: descriptive statistics
Directory of Open Access Journals (Sweden)
Mohammad Reza Baneshi PhD
2012-09-01
Full Text Available Descriptive statistics is the process of summarizing raw data gathered from a study and creating useful statistics that help the better understanding of the data. According to the types of variables, which consist of qualitative and quantitative variables, several descriptive statistics have been introduced. Frequency percentage is used for qualitative data, while mean, median, mode, standard deviation, standard error, variance, and range are some of the statistics used for quantitative data. In the health sciences, the majority of continuous variables follow a normal distribution. Skewness and kurtosis are two statistics that help to compare a given distribution with the normal distribution.
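The statistics named above are straightforward to compute with Python's standard library; a brief sketch on a hypothetical sample (the data values are invented for illustration):

```python
import statistics

data = [2.1, 2.5, 2.5, 3.0, 3.4, 3.8, 9.9]  # hypothetical right-skewed sample

summary = {
    "mean": statistics.mean(data),
    "median": statistics.median(data),
    "mode": statistics.mode(data),            # most frequent value
    "stdev": statistics.stdev(data),          # sample standard deviation
    "range": max(data) - min(data),
}

def skewness(xs):
    """Population skewness: mean of the cubed standardized deviations.
    Positive for a right-skewed sample, zero for a symmetric one."""
    m = statistics.mean(xs)
    s = statistics.pstdev(xs)
    return sum(((x - m) / s) ** 3 for x in xs) / len(xs)
```

The outlier at 9.9 pulls the mean above the median and makes the skewness positive, which is exactly the kind of departure from normality the abstract's closing sentence refers to.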
Spatial distribution of nematodes in soil cultivated with sugarcane under different uses
Cardoso, M. O.; Pedrosa, E. M. R.; Vicente, T. F. S.; Siqueira, G. M.; Montenegro, A. A. A.
2012-04-01
Sugarcane is a crop of major importance within the Brazilian economy, being an activity that generates energy and has a high capacity to develop various economic sectors. Currently the greatest challenge is to maximize productivity and minimize environmental impacts. Plant-parasitic nematodes are of great significance because they directly influence the productive potential of sugarcane crops. Nevertheless, little research has been devoted to the study of the spatial variability of nematodes. Thus, the purpose of this work is to analyze the spatial distribution of nematodes in a soil cultivated with sugarcane in areas with and without irrigation, with distinct sampling spacings, to determine the differences between the sampling scales. The study area is located in the municipality of Goiana (Pernambuco State, Brazil). The experiment was conducted in two areas of 40 hectares each, with 90 samples collected at different spacings: 18 samples at a spacing of 200.00 x 200.00 m, 36 samples at 20.00 x 20.00 m, and 36 samples at 2.00 x 2.00 m. Soil samples were collected at a depth of 0.00-0.20 m, and nematodes were extracted per 300 cm3 of soil by centrifugal flotation in sucrose, then quantified, classified according to trophic habit (plant-parasites, fungivores, bacterivores, omnivores, and predators), and identified to the level of genus or family. In the irrigated area the amount of water applied was determined considering the evapotranspiration of the culture. The data were analyzed using classical statistics and geostatistics. The results demonstrated high values of the coefficient of variation in both study areas. All attributes studied showed a log-normal frequency distribution. Area B (irrigated) has a more stable nematode population than area A (non-irrigated), a fact confirmed by its mean value of the total nematode population (282.45 individuals). The use of geostatistics did not allow assessment of the spatial distribution of
Directory of Open Access Journals (Sweden)
Manna S.K.
2008-01-01
Full Text Available In this paper, we consider the problem of simultaneous determination of retail price and lot-size (RPLS) under the assumption that the supplier offers a fixed credit period to the retailer. It is assumed that the item in stock deteriorates over time at a rate that follows a two-parameter Weibull distribution and that the price-dependent demand is represented by a constant-price-elasticity function of retail price. The RPLS decision model is developed and solved analytically. Results are illustrated with the help of a base example. Computational results show that the supplier earns more profits when the credit period is greater than the replenishment cycle length. Sensitivity analysis of the solution to changes in the value of input parameters of the base example is also discussed.
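The two model ingredients named above, Weibull deterioration and constant-price-elasticity demand, can be sketched directly; the functional forms are the standard ones, and the parameter values below are illustrative, not those of the paper's base example:

```python
def weibull_rate(t, alpha, beta):
    """Instantaneous deterioration rate of a two-parameter Weibull hazard:
    theta(t) = alpha * beta * t**(beta - 1). Increasing in t when beta > 1."""
    return alpha * beta * t ** (beta - 1)

def demand(price, a, elasticity):
    """Constant-price-elasticity demand: D(p) = a * p**(-e).
    A 1% price rise always cuts demand by about e percent."""
    return a * price ** (-elasticity)

r = weibull_rate(2.0, alpha=0.05, beta=2.0)   # rate grows with shelf time
d = demand(10.0, a=1000.0, elasticity=1.5)    # demand at a retail price of 10
```

In the RPLS model these two functions enter the retailer's profit expression, which is then maximized jointly over price and lot size.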
Flexible voltage support control for three-phase distributed generation inverters under grid fault
DEFF Research Database (Denmark)
Camacho, Antonio; Castilla, Miguel; Miret, Jaume
2013-01-01
Operators describe the behavior of the energy source, regulating voltage limits and reactive power injection to remain connected and support the grid under fault. On the basis that different kinds of voltage sags require different voltage support strategies, a flexible control scheme for three-phase grid-connected inverters is proposed. In three-phase balanced voltage sags, the inverter should inject reactive power in order to raise the voltage in all phases. In one- or two-phase faults, the main concern of the distributed generation inverter is to equalize voltages by reducing the negative symmetric sequence and clearing the phase jump. Due to system limitations, a balance between these two extreme policies is mandatory. Thus, over-voltage and under-voltage can be avoided, and the proposed control scheme prevents disconnection while achieving the desired voltage support service. The main contribution
Energy analysis of under-floor air distribution (UFAD) system: An office building case study
International Nuclear Information System (INIS)
Alajmi, Ali F.; Abou-Ziyan, Hosny Z.; El-Amer, Wid
2013-01-01
Highlights: • The key issue for efficient performance of a UFAD system is to ensure the establishment of thermal stratification. • Unnecessarily excess air supplied to the room deteriorates the thermal stratification. • Improper UFAD operation increases the fan power and HVAC electric demand. • A properly set UFAD system is typically more efficient than the existing UFAD system, with energy savings of about 23–37%. • The UFAD system shows savings over the CBAD system of about 37–39% during the peak months and 51% during October. - Abstract: This paper presents the results of an experimental and theoretical investigation to evaluate an under-floor air distribution (UFAD) system installed in an office building operating in a hot climate. Air temperature distribution and supply air velocity were measured at two measuring stations, each consisting of eight temperature sensors installed to measure room air temperatures along the zone height. The obtained data show inefficient operation of the UFAD system, which undermines the energy-saving advantages presumed for a UFAD system. The building energy simulation program EnergyPlus was used to identify the best setting of the UFAD system and compare it with the existing UFAD and a conventional ceiling-based air distribution (CBAD) system. The simulation results show that setting the room thermostat at 26 °C and the supply air temperature at 18 °C provides the most efficient UFAD system. Due to improper operation of the tested UFAD system, its actual consumption is found to be higher than that of the best simulated UFAD system by 23–37% during July to October. Also, the simulation results show that the HVAC demand of UFAD is lower than that of CBAD by 37–39% during July–September and 51% in October
Directory of Open Access Journals (Sweden)
Timothy Andrew Joyner
Full Text Available Anthrax, caused by the bacterium Bacillus anthracis, is a zoonotic disease that persists throughout much of the world in livestock and wildlife, and secondarily infects humans. This is true across much of Central Asia, and particularly the Steppe region, including Kazakhstan. This study employed the Genetic Algorithm for Rule-set Prediction (GARP) to model the current and future geographic distribution of Bacillus anthracis in Kazakhstan based on the A2 and B2 IPCC SRES climate change scenarios, using a 5-variable data set at 55 km² and 8 km² and a 6-variable BioClim data set at 8 km². Future models suggest large areas predicted under current conditions may be reduced by 2050, with the A2 model predicting approximately 14-16% loss across the three spatial resolutions. There was greater variability in the B2 models across scenarios, predicting approximately 15% loss at 55 km², approximately 34% loss at 8 km², and approximately 30% loss with the BioClim variables. Only very small areas of habitat expansion into new areas were predicted by either A2 or B2 in any models. Greater areas of habitat loss are predicted in the southern regions of Kazakhstan by A2 and B2 models, while moderate habitat loss is also predicted in the northern regions by either B2 model at 8 km². Anthrax disease control relies mainly on livestock vaccination and proper carcass disposal, both of which require adequate surveillance. In many situations, including that of Kazakhstan, vaccine resources are limited, and understanding the geographic distribution of the organism, in tandem with current data on livestock population dynamics, can aid in properly allocating doses. While speculative, contemplating future changes in livestock distributions and B. anthracis spore-promoting environments can be useful for establishing future surveillance priorities. This study may also have broader applications to global public health surveillance relating to other diseases in addition to B
Oprisan, Ana; Oprisan, Sorinel A; Hegseth, John J; Garrabos, Yves; Lecoutre-Chabot, Carole; Beysens, Daniel
2014-09-01
Phase separation has important implications for the mechanical, thermal, and electrical properties of materials. Weightless conditions prevent buoyancy and sedimentation from affecting the dynamics of phase separation and the morphology of the domains. In our experiments, sulfur hexafluoride (SF6) was initially heated about 1 K above its critical temperature under microgravity conditions and then repeatedly quenched using temperature steps, the last one of 3.6 mK, until it crossed its critical temperature and phase-separated into gas and liquid domains. Both full-view (macroscopic) and microscopic-view images of the sample cell unit were analyzed to determine the changes in the distribution of liquid droplet diameters during phase separation. Previously, dimple coalescence had been observed only in a density-matched binary liquid mixture near its critical point of miscibility. Here we present experimental evidence in support of dimple coalescence between phase-separated liquid droplets in pure supercritical fluids under microgravity conditions. Although both liquid mixtures and pure fluids belong to the same universality class, both their mass transport mechanisms and their thermophysical properties are significantly different. In supercritical pure fluids the transport of heat and mass is strongly coupled by the enthalpy of condensation, whereas in liquid mixtures mass transport processes are purely diffusive. The viscosity is also much smaller in pure fluids than in liquid mixtures. For these reasons, there are large differences in the fluctuation relaxation times and hydrodynamic flows that prompted this experimental investigation. We found that the number of droplets increases rapidly during the intermediate stage of phase separation. We also found that above a cutoff diameter of about 100 microns the size distribution of droplets follows a power law with an exponent close to -2, as predicted from phenomenological considerations.
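The reported exponent can be checked with a log-log least-squares fit of the droplet size distribution; a minimal sketch on synthetic counts constructed to follow n(d) ∝ d⁻² above the cutoff (the data are illustrative, not the experimental measurements):

```python
import math

def fit_power_law(diameters, counts):
    """Least-squares slope of log(count) vs log(diameter);
    for n(d) ~ d**k the slope estimates the exponent k."""
    xs = [math.log(d) for d in diameters]
    ys = [math.log(c) for c in counts]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Synthetic counts exactly following n(d) = C * d**(-2) above a 100 micron cutoff.
ds = [100.0, 200.0, 400.0, 800.0]
cs = [1e6 / d ** 2 for d in ds]
exponent = fit_power_law(ds, cs)
```

On real binned data one would fit only bins above the cutoff and weight by bin occupancy, but the slope-in-log-space idea is the same.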
Toward enhancing the distributed video coder under a multiview video codec framework
Lee, Shih-Chieh; Chen, Jiann-Jone; Tsai, Yao-Hong; Chen, Chin-Hua
2016-11-01
The advance of video coding technology enables multiview video (MVV) or three-dimensional television (3-D TV) display for users with or without glasses. For mobile devices or wireless applications, a distributed video coder (DVC) can be utilized to shift the encoder complexity to the decoder under the MVV coding framework, denoted as multiview distributed video coding (MDVC). We proposed to exploit both inter- and intraview video correlations to enhance side information (SI) and improve MDVC performance: (1) based on the multiview motion estimation (MVME) framework, a categorized block matching prediction with fidelity weights (COMPETE) was proposed to yield a high-quality SI frame for better DVC reconstructed images. (2) The block transform coefficient properties, i.e., DCs and ACs, were exploited to design priority rate control for the turbo code, such that DVC decoding can be carried out with the fewest parity bits. In comparison, the proposed COMPETE method demonstrated lower time complexity while presenting better reconstructed video quality. Simulations show that the proposed COMPETE can reduce the time complexity of MVME by a factor of 1.29 to 2.56, as compared to previous hybrid MVME methods, while the peak signal-to-noise ratios (PSNRs) of a decoded video can be improved by 0.2 to 3.5 dB, as compared to H.264/AVC intracoding.
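PSNR, the quality metric quoted above, is defined from the mean squared error against a reference image; a minimal sketch (the pixel values are illustrative):

```python
import math

def psnr(ref, test, peak=255.0):
    """Peak signal-to-noise ratio in dB between two equal-length pixel
    sequences: 10*log10(peak^2 / MSE). Infinite for identical inputs."""
    mse = sum((a - b) ** 2 for a, b in zip(ref, test)) / len(ref)
    return float("inf") if mse == 0 else 10.0 * math.log10(peak ** 2 / mse)

# Four 8-bit pixels, each off by one grey level -> MSE = 1.
p = psnr([100, 110, 120, 130], [101, 109, 121, 129])
```

A gain of 0.2 to 3.5 dB, as reported for COMPETE, corresponds to a proportional reduction of the reconstruction MSE on this logarithmic scale.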
International Nuclear Information System (INIS)
Nazim, K.; Khan, M.U.; Ali, Q.M.; Ahmed, M.; Shaukat, S.S.; Sherwani, S.K.
2012-01-01
Fungi and bacteria are heterotrophic decomposers that grow on organic matter and occupy various habitats in mangrove forests. This paper deals with the distribution and diversity of air-borne microbiota (fungi and bacteria) in a mangrove forest at Sandspit, Pakistan. A permanent stand was set up at Sandspit to observe the qualitative and quantitative variations throughout the year, using petri plate techniques. During the study, a total of 16 fungal species, viz., Aspergillus niger, A. fumigatus, A. sulphureus, A. terreus, A. wentii, A. flavus, Alternaria alternata, A. maritima, A. porri, Alternaria sp., Rhizopus varians, Mucor mucedo, Penicillium sp., P. notatum, Dreshellera biseptata, Exosporiella fungorum, and Cladosporium oxysporum, and 14420 ± 267 bacterial colonies were recorded from the selected site. The study revealed that fungi were the major component of the airborne microflora in the mangrove environment. It was observed that both the number of fungal species and the number of bacterial colonies were higher in summer than in winter. It is anticipated that the temperature and salinity of seawater directly affect the diversity of fungi and bacteria in the mangrove environment. The maximum diversity H' (1.906) was recorded in August, whereas the minimum H' (1.053) was recorded in March. It is hoped that this research will add to our knowledge pertaining to the distribution and diversity of the airborne microbiota (bacteria and fungi) in mangrove ecosystems. (author)
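The diversity index H' reported above is the Shannon index, computed from the proportional abundance of each species; a minimal sketch (the species counts are invented for illustration, not the Sandspit data):

```python
import math

def shannon_h(counts):
    """Shannon diversity index H' = -sum(p_i * ln(p_i)) over species
    abundance counts; larger when abundances are more even."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

h_even = shannon_h([10, 10, 10, 10])   # four equally abundant species
h_skew = shannon_h([37, 1, 1, 1])      # same richness, one dominant species
```

For a fixed number of species, H' is maximal (ln of the species count) when all are equally abundant, which is why a seasonal drop in H', as between August and March above, indicates increased dominance by a few taxa.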
This study assessed the pollutant emission offset potential of distributed grid-connected photovoltaic (PV) power systems. Computer-simulated performance results were utilized for 211 PV systems located across the U.S. The PV systems' monthly electrical energy outputs were based ...
Angerman, H.J.; Brinke, G. ten; Slot, J.J.M.
1999-01-01
In this paper we investigate in a systematic way the influence of polydispersity in the block lengths on the phase behavior of AB-multiblock copolymer melts. As a model system we take a polydisperse multiblock copolymer for which both the A-blocks and the B-blocks satisfy a Schulz-Zimm distribution.
Directory of Open Access Journals (Sweden)
Catherine S. Jarnevich
2017-01-01
Full Text Available Invasive species provide a unique opportunity to evaluate factors controlling biogeographic distributions; we can consider introduction success as an experiment testing the suitability of environmental conditions. Predicting potential distributions of spreading species is not easy, and forecasting potential distributions under changing climate is even more difficult. Using the globally invasive coypu (Myocastor coypus [Molina, 1782]), we evaluate and compare the utility of a simplistic ecophysiologically based model and a correlative model to predict current and future distribution. The ecophysiological model was based on winter temperature relationships with nutria survival. We developed correlative statistical models using the Software for Assisted Habitat Modeling and biologically relevant climate data with a global extent. We applied the ecophysiologically based model to several global circulation model (GCM) predictions for mid-century. We used global coypu introduction data to evaluate these models and to explore a hypothesized physiological limitation, finding general agreement with the known coypu distribution locally and globally and support for an upper thermal tolerance threshold. GCM-based model results showed variability in predicted coypu distribution among GCMs, but general agreement of increasing suitable area in the USA. Our methods highlighted the dynamic nature of the edges of the coypu distribution due to climate non-equilibrium, and the uncertainty associated with forecasting future distributions. Areas deemed suitable habitat, especially those on the edge of the current known range, could be used for early detection of the spread of coypu populations for management purposes. Combining approaches can be beneficial for predicting potential distributions of invasive species now and in the future, and for exploring hypotheses about factors controlling distributions.
International Nuclear Information System (INIS)
Krivoruchenko, M.I.
1989-01-01
A detailed statistical analysis of the angular distribution of neutrino events observed in the Kamiokande II and IMB detectors at UT 07:35 on 23 February 1987 is carried out. Distribution functions of the mean scattering angles in the reactions ν̄ₑp → e⁺n and νe → νe are constructed, taking into account multiple Coulomb scattering and the experimental angular errors. The Smirnov and Wald-Wolfowitz run tests are used to test the hypothesis that the angular distributions of events from the two detectors agree with each other. Using the Kolmogorov and Mises statistical criteria, we test the hypothesis that the recorded events all represent ν̄ₑp → e⁺n inelastic scatterings. Then the Neyman-Pearson test is applied to each event to test the hypothesis ν̄ₑp → e⁺n against the alternative νe → νe. The hypotheses that the number of elastic events equals s = 0, 1, 2, ... against the alternatives s ≠ 0, 1, 2, ... are tested on the basis of the generalized likelihood ratio criterion. Confidence intervals for the number of elastic events are also constructed. The current supernova models fail to give a satisfactory account of the angular distribution data. (orig.)
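The two-sample comparison of the detectors' angular distributions described above rests on statistics such as the Kolmogorov-Smirnov distance between empirical CDFs; a minimal sketch of that statistic (the samples are illustrative, not the detector data):

```python
def ks_statistic(sample1, sample2):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum vertical
    distance between the two empirical CDFs."""
    xs = sorted(set(sample1) | set(sample2))
    n1, n2 = len(sample1), len(sample2)
    d = 0.0
    for x in xs:
        cdf1 = sum(1 for v in sample1 if v <= x) / n1
        cdf2 = sum(1 for v in sample2 if v <= x) / n2
        d = max(d, abs(cdf1 - cdf2))
    return d

# Identical samples give statistic 0; fully separated samples give 1.
d_same = ks_statistic([1, 2, 3], [1, 2, 3])
d_far = ks_statistic([1, 2, 3], [10, 20, 30])
```

The statistic is then compared against the Kolmogorov distribution (or a permutation reference) to decide whether the two detectors' angular samples are consistent.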
Dučić, Tanja; Borchert, Manuela; Savić, Aleksandar; Kalauzi, Aleksandar; Mitrović, Aleksandra; Radotić, Ksenija
2013-01-01
Synchrotron-based X-ray microfluorescence (µSXRF) is an analytical method suitable for in situ investigation of the distribution of micronutrient and macronutrient elements in several-micrometres-thick unstained biological samples, e.g. single cells and tissues. Elements are mapped and quantified at sub-p.p.m. concentrations. In this study the quantity, distribution and grouping/co-localization of various elements have been identified in straight and twisted internodes of the stems of the monocotyledonous climber D. balcanica Košanin. Three different statistical methods were employed to analyse the macronutrient and micronutrient distributions and co-localization. Macronutrient elements (K, P, Ca, Cl) are distributed homogeneously in both straight and twisted internodes. Micronutrient elements are mostly grouped in the vasculature and in the sclerenchyma cell layer. In addition, co-localization of micronutrient elements is much more prominent in twisted than in straight internodes. These image analyses and statistical methods provided very similar outcomes and could be applied to various types of biological samples imaged by µSXRF. PMID:23412492
International Nuclear Information System (INIS)
Buffa, Francesca M.
2000-01-01
The aim of this work is to investigate the influence of the statistical fluctuations of Monte Carlo (MC) dose distributions on the dose volume histograms (DVHs) and radiobiological models, in particular the Poisson model for tumour control probability (tcp). The MC matrix is characterized by a mean dose in each scoring voxel, d, and a statistical error on the mean dose, σ_d; whilst the quantities d and σ_d depend on many statistical and physical parameters, here we consider only their dependence on the phantom voxel size and the number of histories from the radiation source. Dose distributions from high-energy photon beams have been analysed. It has been found that the DVH broadens when increasing the statistical noise of the dose distribution, and the tcp calculation systematically underestimates the real tumour control value, defined here as the value of tumour control when the statistical error of the dose distribution tends to zero. When increasing the number of energy deposition events, either by increasing the voxel dimensions or increasing the number of histories from the source, the DVH broadening decreases and the tcp converges to the 'correct' value. It is shown that the underestimation of the tcp due to the noise in the dose distribution depends on the degree of heterogeneity of the radiobiological parameters over the population; in particular this error decreases with increasing biological heterogeneity, whereas it becomes significant in the hypothesis of a radiosensitivity assay for single patients, or for subgroups of patients. It has been found, for example, that when the voxel dimension is changed from a cube with sides of 0.5 cm to a cube with sides of 0.25 cm (with a fixed number of histories of 10^8 from the source), the systematic error in the tcp calculation is about 75% in the homogeneous hypothesis, and it decreases to a minimum value of about 15% in a case of high radiobiological heterogeneity. The possibility of using the error on the
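The Poisson tcp model referred to above can be sketched in its simplest single-hit form, TCP = exp(−N·e^(−αD)); the clonogen number and radiosensitivity below are illustrative assumptions, not the paper's fitted values:

```python
import math

def poisson_tcp(dose, n_clonogens, alpha):
    """Poisson tumour control probability with exponential cell survival:
    TCP = exp(-N * exp(-alpha * D)). Probability that no clonogen survives
    a uniform dose D (single-hit model; parameters illustrative)."""
    return math.exp(-n_clonogens * math.exp(-alpha * dose))

tcp_lo = poisson_tcp(50.0, 1e7, alpha=0.3)   # sub-curative dose
tcp_hi = poisson_tcp(70.0, 1e7, alpha=0.3)   # near-curative dose
```

Because this TCP is a steep, convex-then-concave function of dose, voxel-level statistical noise in d does not average out, which is the mechanism behind the systematic underestimation reported in the abstract.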
Kleisner, Kristin M.; Fogarty, Michael J.; McGee, Sally; Hare, Jonathan A.; Moret, Skye; Perretti, Charles T.; Saba, Vincent S.
2017-04-01
The U.S. Northeast Continental Shelf marine ecosystem has warmed much faster than the global ocean, and it is expected that this enhanced warming will continue through this century. Complex bathymetry and ocean circulation in this region have contributed to biases in global climate model simulations of the Shelf waters. Increasing the resolution of these models reduces the bias of future climate change projections and indicates greater warming than suggested by coarse-resolution climate projections. Here, we used a high-resolution global climate model and historical observations of species distributions from a trawl survey to examine changes in the future distribution of suitable thermal habitat for various demersal and pelagic species on the Shelf. Along the southern portion of the shelf (Mid-Atlantic Bight and Georges Bank), a projected 4.1 °C (surface) to 5.0 °C (bottom) warming of ocean temperature from current conditions results in a northward shift of the thermal habitat for the majority of species. While some southern species like butterfish and black sea bass are projected to have moderate losses in suitable thermal habitat, there are potentially significant increases for many species including summer flounder, striped bass, and Atlantic croaker. In the north, in the Gulf of Maine, a projected 3.7 °C (surface) to 3.9 °C (bottom) warming from current conditions results in substantial reductions in suitable thermal habitat, such that species currently inhabiting this region may not remain in these waters under continued warming. We project a loss in suitable thermal habitat for key northern species including Acadian redfish, American plaice, Atlantic cod, haddock, and thorny skate, but potential gains for some species including spiny dogfish and American lobster. We illustrate how changes in the suitable thermal habitat of important commercially fished species may impact local fishing communities and potentially impact major fishing ports.
2010-07-01
... distribution requirements for Secondary School Vocational Education Program or the Postsecondary and Adult... VOCATIONAL AND APPLIED TECHNOLOGY EDUCATION PROGRAM What Kinds of Activities Does the Secretary Assist Under... distribution under— (1) The Secondary School Vocational Education Program; or (2) The Postsecondary and Adult...
Rohatgi, Vijay K
2003-01-01
Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth
Directory of Open Access Journals (Sweden)
Verónica del Rosario Avalos
2015-07-01
Full Text Available In this study we projected the effect of anthropogenic climate change on endemic and restricted-range Andean bird species distributed from the center of Bolivia to southeastern Peru. We also analyzed the representation of these species in protected areas. The ensemble forecasts from niche-based models indicated that 91–100% of species may undergo range-size reductions under full- and no-dispersal scenarios, including five species that are currently threatened. The large range reduction (average 63%) suggests these mountain species may be threatened by climate change. The strongest effects due to species range losses are predicted in the humid mountain forests of Bolivia. The representation of bird species also decreased in protected areas. Partial gap species (94–86%) are expected to increase over the present (62%). This suggests climate change and other non-climate stressors should be incorporated into conservation plans for the long-term persistence of these species. This study anticipates the magnitude of shifts in the distribution of endemic birds, and represents the first exploration in the study area of the representation of range-restricted Andean birds in protected areas under climate change.
On the Pressure Distribution in a Porous Media under a Spherical Loading Surface
Wang, Qiuyun; Zhu, Zenghao; Nathan, Rungun; Wu, Qianhong
2017-11-01
The phenomenon of pressure generation and relaxation inside a porous medium is widely observed in biological systems. Herein, we report a biomimetic study examining the pressure distribution inside a soft porous layer when a spherical loaded surface suddenly impacts it. A novel experimental setup was developed that includes a fully instrumented spherical piston and a soft fibrous porous layer underneath. An extensive experimental study was performed with different porous materials, different loadings and different sized loading surfaces. The pore pressure generation and the motion of the loading surface were recorded. A novel theoretical model was developed to characterize the pressure field during the process. Excellent agreement was observed between the experimental results and the theoretical predictions. It shows that the pressure generation is governed by the Brinkman parameter, α = h/√Kp, where h is the porous layer thickness and Kp is the undeformed permeability. The study improves our understanding of the dynamic response of soft porous media under rapid compression. It has broad impact on the study of transient load bearing in biological systems and industrial applications. This work was supported by the National Science Foundation (NSF CBET) under Award #1511096.
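The Brinkman parameter above is straightforward to evaluate; a minimal sketch with hypothetical layer properties (not values from the study):

```python
import math

def brinkman_parameter(h, k_p):
    """Dimensionless Brinkman parameter alpha = h / sqrt(K_p),
    where h is the porous-layer thickness [m] and K_p the
    undeformed permeability [m^2]."""
    return h / math.sqrt(k_p)

# Illustrative values only: a 5 mm fibrous layer with
# permeability 1e-9 m^2 (both numbers invented).
alpha = brinkman_parameter(h=5e-3, k_p=1e-9)
print(round(alpha, 1))   # large alpha: pressure-dominated response
```

A large α (thin pores relative to layer thickness) means the impact load is carried mostly by pore pressure rather than by the solid skeleton.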
[Absorption and distribution of K, Na and Mg in Avicennia marina seedlings under cadmium stress].
Lu, Zhi-qiang; Chen, Chang-xu; Ma, Li; Zheng, Wen-jiao
2015-05-01
In this paper, mangrove seedlings of Avicennia marina were treated with various concentrations of cadmium (0, 0.5, 5, 25, 50, 100, 150 mg·L⁻¹). The seedlings were cultivated in sand with artificial seawater at a salinity of 15 for 90 days in a greenhouse. The absorption and distribution of the elements K, Na and Mg under cadmium stress were investigated on the 45th and 90th days, respectively. The results showed that the enrichment of cadmium in the different components of the seedlings increased with increasing cadmium stress level and exposure time. The cadmium contents in roots and cotyledons were relatively higher than in the other components, accounting for 66.9% and 16.3% of cadmium in the seedlings under the 150 mg·L⁻¹ cadmium stress, respectively. The fall of cotyledons could reduce the damage of cadmium stress to the whole seedling. The Na contents increased in roots and stems and decreased in leaves and cotyledons after cadmium stress for 90 days. The K content decreased in roots and cotyledons, while showing no significant change in stems and leaves. The Mg contents in roots, stems, leaves and cotyledons of seedlings treated with cadmium for 90 days were lower than those of the control, and were negatively related to the cadmium content.
Directory of Open Access Journals (Sweden)
Andreev Vladimir Igorevich
2018-01-01
Subject: one of the promising trends in the development of structural mechanics is the creation of methods for solving problems in the theory of elasticity for bodies with continuous inhomogeneity of any deformation characteristics; these methods make it possible to use the strength of the material most fully. In this paper, we consider the two-dimensional problem for the case when a vertical, locally distributed load acts on a hemisphere and the inhomogeneity is caused by the influence of the temperature field. Research objectives: derive the governing system of equations in spherical coordinates for determination of the stress state of a radially inhomogeneous hemispherical shell under a locally distributed vertical load. Materials and methods: as a mechanical model, we chose a thick-walled reinforced concrete shell (hemisphere) with inner and outer radii a and b, respectively, b > a. The shell's parameters are a = 3.3 m, b = 4.5 m, Poisson's ratio ν = 0.16; the load parameters are f = 10 MPa, a vertical localized load distributed over the outer face with θ0 = 30°; the temperature on the internal surface of the shell is Ta = 500 °C and on the external surface Tb = 0 °C. The resulting boundary-value problem (a system of differential equations with variable coefficients) is solved using the Maple software package. Results: the maximal compressive stresses σr with allowance for material inhomogeneity are reduced by 10% compared with the case when the inhomogeneity is ignored. This, however, matters less than the 3-fold decrease in the tensile stress σθ on the inner surface and the 2-fold reduction in the tensile stress σθ on the outer surface of the hemisphere, as concretes generally have a tensile strength substantially smaller than their compressive strength. Conclusions: the method presented in this article makes it possible to take the reduction of the deformation characteristics of the material into account, which leads to a reduction in stresses
Valsan, Aswathy; Cv, Biju; Krishna, Ravi; Huffman, Alex; Poschl, Ulrich; Gunthe, Sachin
2016-04-01
Biological aerosols constitute a wide range of dead and living biological materials and structures that are suspended in the atmosphere. They play an important role in atmospheric physical, chemical and biological processes and in the health of living beings through the spread of diseases among humans, plants, and animals. The atmospheric abundance, sources, and physical properties of PBAPs, as compared to non-biological aerosols, are however poorly characterized. Though omnipresent, their concentration and composition exhibit large spatial and temporal variations depending upon their sources, land use, and local meteorology. The Indian tropical region, which holds approximately 18% of the world's total population, has a vast geographical extent and experiences a distinctive meteorological phenomenon in the Indian Summer Monsoon (ISM). Thus, the sources, properties and characteristics of biological aerosols are also expected to vary significantly over the Indian subcontinent depending upon location and season. Here we present the number concentration and size distribution of Fluorescent Biological Aerosol Particles (FBAP) from two contrasting locations in southern tropical India measured during contrasting seasons using an Ultraviolet Aerodynamic Particle Sizer (UV-APS). Measurements were carried out at a pristine high-altitude continental site, Munnar (10.09 N, 77.06 E; 1605 m asl), during two contrasting seasons, the South-West Monsoon (June-August 2014) and winter (January-February 2015), and in Chennai, a coastal urban area, during July-November 2015. FBAP concentrations at both locations showed large variability, with higher concentrations occurring at Chennai. Apart from regional variations, the FBAP concentrations also exhibited variations over the two seasons under the same environmental conditions. In Munnar the FBAP concentration increased by a factor of four from the South-West Monsoon to the winter season. The average size distribution of FBAP at both
Directory of Open Access Journals (Sweden)
Dong Wang
2015-01-01
Gears are widely used in gearboxes to transmit power from one shaft to another. Gear crack is one of the most frequent gear fault modes found in industry. Identification of different gear crack levels is beneficial in preventing unexpected machine breakdown and reducing economic loss, because gear cracks lead to gear tooth breakage. In this paper, an intelligent fault diagnosis method for identification of different gear crack levels under different working conditions is proposed. First, high-dimensional statistical features are extracted from continuous wavelet transforms at different scales; the proposed method extracts 920 statistical features in total. To reduce the dimensionality of the extracted features and generate new, significant low-dimensional features, a simple and effective method, principal component analysis, is used. To further improve identification accuracy of different gear crack levels under different working conditions, a support vector machine is employed. Three experiments are investigated to show the superiority of the proposed method. Comparisons with other existing gear crack level identification methods are conducted. The results show that the proposed method has the highest identification accuracy among all existing methods.
Khosravi Tanak, A.; Mohtashami Borzadaran, G. R.; Ahmadi, J.
2015-11-01
In economics and the social sciences, inequality measures such as the Gini index, Pietra index, etc., are commonly used to measure statistical dispersion. There is a generalization of the Gini index which includes it as a special case. In this paper, we use the principle of maximum entropy to approximate the model of income distribution with a given mean and generalized Gini index. Many distributions have been used as descriptive models for the distribution of income. The most widely known of these models are the generalized beta of the second kind and its subclass distributions. The obtained maximum entropy distributions are fitted to US family total money income in 2009, 2011 and 2013, and their relative performance with respect to the generalized beta of the second kind family is compared.
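For reference, the ordinary Gini index that the paper generalizes can be computed from a sample with a short routine; the data below are toy values, and the generalized (weighted) Gini of the paper is not reproduced here:

```python
import numpy as np

def gini(x):
    """Sample Gini index via the sorted-values identity
    G = (n + 1 - 2 * sum_i(cum_i / cum_n)) / n,
    equivalent to mean absolute difference / (2 * mean)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    cum = np.cumsum(x)
    return (n + 1 - 2 * (cum / cum[-1]).sum()) / n

print(gini([1, 1, 1, 1]))             # perfect equality -> 0.0
print(round(gini([0, 0, 0, 10]), 2))  # one holder of all income -> 0.75
```

The maximum-entropy step in the paper then seeks the density with largest entropy subject to a fixed mean and a fixed (generalized) Gini value.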
Changes of the potential distribution area of French Mediterranean forests under global warming
Directory of Open Access Journals (Sweden)
C. Gaucherel
2008-11-01
This work aims at understanding future spatial and temporal distributions of tree species in the Mediterranean region of France under various climates. We focused on two different species (Pinus halepensis and Quercus ilex) and compared their growth under the IPCC-B2 climate scenario in order to quantify significant changes between present and future. The influence of environmental factors such as the atmospheric CO2 increase and topography on tree growth has also been quantified.
We modeled species growth with the help of a process-based model (MAIDEN), previously calibrated over measured ecophysiological and dendrochronological series with a Bayesian scheme. The model was fed with the ARPEGE – MeteoFrance climate model, combined with an explicit increase in atmospheric CO2 concentration. The main output of the model gives the carbon allocation in boles and thus tree production.
Our results show that the MAIDEN model is able to correctly simulate pine and oak production in space and time, after detailed calibration and validation stages. Yet these simulations, mainly based on climate, are indicative and not predictive. The comparison of simulated growth at the end of the 20th and 21st centuries shows a shift of the pine production optimum from about 650 to 950 m due to a 2.5 K temperature increase, while no optimum has been found for oak. With the direct effect of the CO2 increase taken into account, both species show a significant increase in productivity (+26% and +43% for pine and oak, respectively) at the end of the 21st century.
While the two species have different growth mechanisms, both have a good chance of extending their spatial distribution and their elevation in the Alps during the 21st century under the IPCC-B2 climate scenario. This extension is mainly due to the CO2 fertilization effect.
Dong, Zhao; Lewis, Christopher G; Burgess, Robert M; Coull, Brent; Shine, James P
2016-05-01
Free metal ion concentrations have been recognized as a better indicator of metal bioavailability in aquatic environments than total dissolved metal concentrations. However, our understanding of the determinants of free ion concentrations, especially in a metal mixture, is limited, due to underexplored techniques for measuring multiple free metal ions simultaneously. In this work, we performed statistical analyses on a large dataset containing repeated measurements of free ion concentrations of Cu, Zn, Pb, Ni, and Cd, the most commonly measured metals in seawater, at five inshore locations in Boston Harbor, previously collected using an in-situ equilibrium-based multi-metal free ion sampler, the 'Gellyfish'. We examined correlations among these five metals by season, and evaluated effects of 10 biogeochemical variables on free ion concentrations over time and location through multivariate regressions. We also explored potential clustering among the five metals through a principal component analysis. We found significant correlations among metals, with patterns varying by season. Our regression results suggest that instead of dissolved metals, pH, salinity, temperature and rainfall were the most significant determinants of free metal ion concentrations. For example, a one-unit decrease in pH was associated with a 2.2-fold (Cd) to 99-fold (Cu) increase in free ion concentrations. This work is among the first to reveal key contributors to spatiotemporal variations in free ion concentrations, and demonstrates the usefulness of the Gellyfish sampler in routine sampling of free ions within metal mixtures and in generating data for statistical analyses. Copyright © 2016. Published by Elsevier Ltd.
Distributions of decadal means of temperature and precipitation change under global warming
Watterson, I. G.; Whetton, P. H.
2011-04-01
There remains uncertainty in the projected climate change over the 21st century, in part because of the range of responses forced by rising greenhouse gas concentrations among global climate models. This paper applies a method of estimating distributions and "probability density functions" (PDFs) for forced change, based on the pattern scaling technique and previously used for Australia, to generate changes in temperature and precipitation at locations over the globe, from simulations of 23 CMIP3 models. Changes for 2030 and 2100, under the A1B scenario for concentrations, for both seasonal and annual cases are presented. The PDFs for temperature have a standard deviation that averages 31% of the mean change, and they tend to be positively skewed. The standard deviation for precipitation averages 15% of the base climate mean, leading to 5th and 95th percentile estimates that are of opposite sign for most of the globe. A further source of uncertainty of change for a particular period of time, such as a decadal average, is the unforced or internal variability of climate. A joint probability distribution approach is used to produce PDFs for decadal means by adding in an estimate of internal variability. In the decade centered on 2030, this broadens the PDFs substantially. The results are related to time series of observations and projections over 1900-2100 for the agricultural regions of Iowa and the Murray-Darling Basin. For most land areas, warming becomes clearly discernible, allowing for both uncertainties, in the next few decades. Data files of the key results are provided.
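The joint-probability step described above, widening the forced-change PDF by internal variability, reduces for normal PDFs to adding variances of independent contributions. A minimal sketch with hypothetical numbers (not the paper's values, which are location-dependent and skewed):

```python
import math
from statistics import NormalDist

# Hypothetical illustration only:
mu_forced = 1.2      # model-mean forced warming by 2030 [K]
sd_forced = 0.37     # inter-model spread (~31% of the mean change)
sd_internal = 0.25   # decadal-mean internal variability [K]

# Sum of independent normals: the decadal-mean PDF keeps the
# same mean but widens to sqrt(sd_forced^2 + sd_internal^2).
decadal = NormalDist(mu_forced, math.hypot(sd_forced, sd_internal))
p05, p95 = decadal.inv_cdf(0.05), decadal.inv_cdf(0.95)
print(f"5th-95th percentile warming: {p05:.2f} to {p95:.2f} K")
```

The widened interval illustrates why, for a single decade, internal variability broadens the projection range noticeably even when the forced signal is well constrained.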
International Nuclear Information System (INIS)
Goodarzi, Mohsen; Amooie, Hossein
2016-01-01
Crosswind significantly decreases the cooling efficiency of a natural draft dry cooling tower. The possibility of improving cooling efficiency through heterogeneous water distribution within the cooling tower radiators under crosswind conditions is analysed. A CFD approach was used to model the flow field and heat transfer phenomena within the cooling tower and the airflow surrounding it. A mathematical model was developed from the various CFD results. Using a genetic algorithm driven by this mathematical model, the best water distribution was identified. Remodeling the best water distribution with the CFD approach showed the highest heat transfer enhancement compared with the usual uniform water distribution.
Lamon, Lara; Von Waldow, Harald; Macleod, Matthew; Scheringer, Martin; Marcomini, Antonio; Hungerbühler, Konrad
2009-08-01
We used the multimedia chemical fate model BETR Global to evaluate changes in the global distribution of two polychlorinated biphenyls, PCB 28 and PCB 153, under the influence of climate change. This was achieved by defining two climate scenarios based on results from a general circulation model, one scenario representing the last twenty years of the 20th century (20CE scenario) and another representing the global climate under the assumption of strong future greenhouse gas emissions (A2 scenario). The two climate scenarios are defined by four groups of environmental parameters: (1) temperature in the planetary boundary layer and the free atmosphere, (2) wind speeds and directions in the atmosphere, (3) current velocities and directions in the surface mixed layer of the oceans, and (4) rate and geographical pattern of precipitation. As a fifth parameter in our scenarios, we consider the effect of temperature on primary volatilization emissions of PCBs. Comparison of dynamic model results using environmental parameters from the 20CE scenario against historical long-term monitoring data of concentrations of PCB 28 and PCB 153 in air from 16 different sites shows satisfactory agreement between modeled and measured PCB concentrations. The 20CE and A2 scenarios were compared using steady-state calculations and assuming the same source characteristics of PCBs. The temperature difference between the two scenarios is the dominant factor determining the difference in PCB concentrations in air. The higher temperatures in the A2 scenario drive increased primary and secondary volatilization emissions of PCBs, and enhance transport from temperate regions to the Arctic. The largest relative increase in concentrations of both PCB congeners in air under the A2 scenario occurs in the high Arctic and the remote Pacific Ocean. Generally, higher wind speeds under the A2 scenario result in more efficient intercontinental transport of PCB 28 and PCB 153 compared to the 20CE
Deschanel, Stephanie; Vigier, Gerard; Godin, Nathalie; Vanel, Loic; Ciliberto, Sergio
2007-03-01
For some heterogeneous materials, fracture can be described as a clustering of microcracks: global rupture is not controlled by a single event. We focus on polyurethane foams whose heterogeneities (pores) constitute the termination points where microcracks can stop. We record both the spatial and time distributions of acoustic emission emitted by a sample during mechanical tests: each microcrack nucleation corresponds to a burst of energy that can be localized on the widest face of the specimen. The probability distribution of the released energy follows a power law, independently of the material density, the loading mode or the mechanical behavior. On the other hand, a power law for the time intervals between two damaging events seems to require a quasi-constant stress during damaging. Moreover, we notice a difference in the behavior of the cumulative number of events and the cumulative energy of the localized events with temperature in the case of tensile tests, but not for creep tests. The occurrence of a unique behavior and of a power law in a restricted time interval for the cumulative number of events and the cumulative energy in creep opens the way to interesting later studies of materials' lifetime prediction.
Jana, Madhusudan
2015-01-01
This book on statistical mechanics is self-sufficient and written in a lucid manner, keeping in mind the examination systems of universities. The need to study this subject and its relation to thermodynamics is discussed in detail. Starting from the Liouville theorem, statistical mechanics is developed thoroughly. All three types of statistical distribution functions are derived separately, with their range of applications and limitations. The non-interacting ideal Bose gas and Fermi gas are discussed thoroughly. Properties of liquid He-II and the corresponding models are depicted. White dwarfs and condensed matter physics, transport phenomena - thermal and electrical conductivity, the Hall effect, magnetoresistance, viscosity, diffusion, etc. - are discussed. A basic understanding of the Ising model is given to explain phase transitions. The book ends with detailed coverage of the method of ensembles (namely microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems ar...
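The three statistical distribution functions the book derives (Maxwell-Boltzmann, Bose-Einstein, Fermi-Dirac) differ only by a ±1 in the denominator and converge in the classical limit, which a few lines make concrete (energies in dimensionless units):

```python
import math

def maxwell_boltzmann(e, mu, kT):
    """Classical occupation number exp(-(e - mu)/kT)."""
    return math.exp(-(e - mu) / kT)

def bose_einstein(e, mu, kT):
    """Bose-Einstein occupation 1/(exp((e - mu)/kT) - 1), e > mu."""
    return 1.0 / (math.exp((e - mu) / kT) - 1.0)

def fermi_dirac(e, mu, kT):
    """Fermi-Dirac occupation 1/(exp((e - mu)/kT) + 1)."""
    return 1.0 / (math.exp((e - mu) / kT) + 1.0)

# Far above the chemical potential ((e - mu)/kT = 20), the three
# occupation numbers become indistinguishable (classical limit):
e, mu, kT = 1.0, 0.0, 0.05
for f in (maxwell_boltzmann, bose_einstein, fermi_dirac):
    print(f.__name__, f(e, mu, kT))
```

At the chemical potential itself the Fermi-Dirac occupation is exactly 1/2, while the Bose-Einstein expression diverges as e approaches mu from above, the seed of Bose-Einstein condensation.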
Uptake and subcellular distribution of triclosan in typical hydrophytes under hydroponic conditions.
He, Yupeng; Nie, Enguang; Li, Chengming; Ye, Qingfu; Wang, Haiyan
2017-01-01
The increasing discharge of pharmaceuticals and personal care products (PPCPs) into the environment has generated serious public concern. The recent awareness of the environmental impact of this emerging class of pollutants and their potential adverse effects on human health have been documented in many reports. However, information regarding uptake and intracellular distribution of PPCPs in hydrophytes under hydroponic conditions, and potential human exposure, is very limited. A laboratory experiment was conducted using 14C-labeled triclosan (TCS) to investigate uptake and distribution of TCS in six aquatic plants (water spinach, purple perilla, cress, penny grass, cane shoot, and rice), and the subcellular distribution of 14C-TCS was determined in these plants. The results showed that the uptake and removal rate of TCS from nutrient solution by hydrophytes followed the order of cress (96%) > water spinach (94%) > penny grass (87%) > cane shoot (84%) > purple perilla (78%) > rice (63%) at the end of the incubation period (192 h). The range of 14C-TCS content in the roots was 94.3%-99.0% of the added 14C-TCS, and the concentrations in roots were 2-3 orders of magnitude greater than those in shoots. Furthermore, the subcellular fraction-concentration factor (3.6 × 10²-2.6 × 10³ mL g⁻¹), concentration (0.58-4.47 μg g⁻¹), and percentage (30%-61%) of 14C-TCS in organelles were found to be predominantly greater than those in cell walls and/or cytoplasm. These results indicate that for these plants, the roots are the primary storage for TCS, and within plant cells organelles are the major domains for TCS accumulation. These findings provide a better understanding of translocation and accumulation of TCS in aquatic plants at the cellular level, which is valuable for environmental and human health assessments of TCS. Copyright © 2016 Elsevier Ltd. All rights reserved.
International Nuclear Information System (INIS)
Liu Hongbo; Chen Zhihua; Zhou Ting
2012-01-01
Thermal effects on steel structures exposed to solar radiation are significant and complicated. Furthermore, the temperature variation within a year may result in damage in steel structures when solar radiation is considered. In this paper, the temperature distribution of H-shaped steel members was investigated through a systematic experimental and theoretical study in the case of solar radiation. First, an H-shaped steel specimen was designed and its temperature distribution under solar radiation was obtained by a test. After that, a numerical method was proposed to obtain the temperature distribution under solar radiation. This method was based on transient thermal analysis, and the analytical result was verified by the above experimental result. Furthermore, a parametric study was conducted to investigate the influence of various solar radiation parameters and the orientation of H-shaped steel members on the temperature distribution under solar radiation. Finally, a simplified approach was developed to predict the temperature distribution under solar radiation. Both experimental and numerical results showed that solar radiation has a significant effect on the temperature distribution of H-shaped steels. Considering solar radiation, the temperature of the specimen is about 20.6 °C higher than the surrounding ambient air temperature. The temperature distribution under solar radiation was observed to be sensitive to the steel's solar radiation absorptance and orientation, but insensitive to the solar radiation reflectance. Highlights: the temperature of H-shaped steel members was measured under solar radiation; a numerical method was proposed to account for shadows in solar radiation; a parametric study was conducted; a simplified approach for the temperature distribution was developed and verified.
International Nuclear Information System (INIS)
Ortiz Yusty, Carlos; Restrepo, Adriana; Paez, Vivian P
2014-01-01
We implemented a species distribution modelling approach to establish the potential distribution of Podocnemis lewyana, to explore the climatic factors that may influence the species' distribution and to evaluate possible changes in distribution under future climate scenarios. The distribution models predicted a continuous distribution from south to north along the Magdalena River, from Rivera and Palermo in the Department of Huila to the departments of Atlantico and Magdalena in the north. Temperature was the variable most influential in the distribution of P. lewyana; this species tends to be present in warm regions with low temperature variability. The distribution model predicted an increase in the geographic range of P. lewyana under climate change scenarios. However, taking into account the habitat preferences of this species and its strong association with water, this result should be treated with caution since the model considered only terrestrial climatic variables. Given the life history characteristics of this species (temperature dependent sex determination, high pivotal temperature and a very narrow transition range) and the negative effect of changes in hydrological regimes on embryo survival, expansion of the potential distribution of P. lewyana in the future does not mean that the species will not be affected by global climate change.
International Nuclear Information System (INIS)
Ihara, Ryohei; Katsuyama, Jinya; Onizawa, Kunio; Hashimoto, Tadafumi; Mikami, Yoshiki; Mochizuki, Masahito
2011-01-01
Research highlights: residual stress distributions due to welding and machining are evaluated by XRD and FEM; residual stress due to machining shows higher tensile stress than welding near the surface; crack growth analysis is performed using the calculated residual stress; the crack growth result is affected by machining rather than welding; machining is an important factor for crack growth. - Abstract: In nuclear power plants, stress corrosion cracking (SCC) has been observed near the weld zone of the core shroud and primary loop recirculation (PLR) pipes made of low-carbon austenitic stainless steel Type 316L. The joining process of pipes usually includes surface machining and welding. Both processes induce residual stresses, and residual stresses are thus important factors in the occurrence and propagation of SCC. In this study, the finite element method (FEM) was used to estimate residual stress distributions generated by butt welding and surface machining. A thermoelastic-plastic analysis was performed for the welding simulation, and a thermo-mechanically coupled analysis based on the Johnson-Cook material model was performed for the surface machining simulation. In addition, a crack growth analysis based on stress intensity factor (SIF) calculations was performed using the calculated residual stress distributions generated by welding and surface machining. The surface machining analysis showed that the tensile residual stress due to surface machining exists only within approximately 0.2 mm of the machined surface, and the surface residual stress increases with cutting speed. The crack growth analysis showed that the crack depth is affected by both surface machining and welding, while the crack length is more affected by surface machining than by welding.
Pace, Roberto; Martinelli, Ernesto Marco; Sardone, Nicola; D E Combarieu, Eric
2015-03-01
Ginseng is any one of eleven species belonging to the genus Panax of the family Araliaceae and is found in North America and eastern Asia. Ginseng is characterized by the presence of ginsenosides. Principally Panax ginseng and Panax quinquefolius are the adaptogenic herbs commonly distributed in health food markets. In the present study, high performance liquid chromatography has been used to identify and quantify ginsenosides in the two subject species and in different parts of the plant (roots, neck, leaves, flowers, fruits). The power of this chromatographic technique to verify the identity of botanical material and to distinguish different parts of the plant has been investigated with metabolomic techniques such as principal component analysis. Metabolomics provides a good opportunity for mining useful chemical information from the chromatographic data set, and is thus an important tool for quality evaluation of medicinal plants with respect to authenticity, consistency and efficacy. Copyright © 2015 Elsevier B.V. All rights reserved.
Xu, Xiaole; Chen, Shengyong
2014-01-01
This paper investigates the finite-time consensus problem of leader-following multiagent systems. The dynamical models for all following agents and the leader are assumed to take the same general linear form, and the interconnection topology among the agents is assumed to be switching and undirected. We mostly consider the continuous-time case. By assuming that the states of neighbouring agents are known to each agent, a sufficient condition is established for finite-time consensus via a neighbor-based state feedback protocol. When the states of neighbouring agents are not available and only their outputs can be accessed, a distributed observer-based consensus protocol is proposed for each following agent. A sufficient condition is provided in terms of linear matrix inequalities to design the observer-based consensus protocol, which makes the multiagent systems achieve finite-time consensus under switching topologies. We then discuss the counterparts for the discrete-time case. Finally, we provide an illustrative example to show the effectiveness of the design approach. PMID:24883367
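As a much-simplified illustration of a neighbor-based state feedback protocol (not the paper's finite-time or observer-based design): asymptotic leader-following consensus for single-integrator agents on one fixed undirected graph, with all gains, states, and the topology invented for this sketch:

```python
import numpy as np

# Followers 0-1-2 on a path graph; only follower 0 senses the leader.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)   # follower adjacency (undirected)
b = np.array([1.0, 0.0, 0.0])            # leader-pinning gains

x = np.array([2.0, -1.0, 4.0])           # initial follower states
x_leader = 0.5                           # static leader state
dt, k = 0.05, 1.0                        # Euler step and feedback gain

for _ in range(2000):
    # u_i = -k * [ sum_j a_ij (x_i - x_j) + b_i (x_i - x_leader) ]
    u = -k * ((A * (x[:, None] - x[None, :])).sum(axis=1)
              + b * (x - x_leader))
    x = x + dt * u

print(np.round(x, 3))   # all followers approach the leader state
```

Because the pinned graph contains a path from the leader to every follower, the error dynamics are governed by a positive-definite matrix and all states converge to the leader's value; the paper's protocols achieve this in finite time and under switching graphs.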
The potential distribution of bioenergy crops in the UK under present and future climate
International Nuclear Information System (INIS)
Bellarby, Jessica; Smith, Pete; Tuck, Gill; Glendining, Margaret J.; Wattenbach, Martin
2010-01-01
We have predicted the potential distribution of 26 bioenergy crops in the UK, based on the simple model described by Tuck et al. The model has been applied at a 5 km resolution using the UKCIP02 model for the Low, Medium-Low, Medium-High and High emissions scenarios. In the analysis of the results, the limitations on crop growth are attributed to elevation, temperature, and high and low rainfall. Most of the crops currently grown are predicted to remain prevalent in the UK. A number of crops are suitable for introduction to the UK under a changing climate, whereas others retreat to northern parts of the UK. The greatest changes are expected in England. The simplicity of the model means that it has a relatively high uncertainty, with minor modifications to the model leading to quite different results. Nevertheless, it is well suited for identifying areas and crops that are most likely to be affected by the greatest changes. It has been noted that Miscanthus and Short Rotation Coppice (SRC) willow and poplar, which are currently regarded as highly suitable for UK conditions, may be less suited to southern areas in the future, where, for example, kenaf could have a greater potential. Further investigations are required to reduce the uncertainty associated with projections based on this simple model and to draw firmer conclusions. (author)
Wu, Wenyong; Yin, Shiyang; Liu, Honglu; Niu, Yong; Bao, Zhe
2014-10-01
The purpose of this study was to determine and evaluate spatial changes in soil salinity by using geostatistical methods. The study focused on a suburban area of Beijing, where urban development led to water shortages and accelerated wastewater reuse for farm irrigation for more than 30 years. The data were then processed in GIS using three different interpolation techniques: ordinary kriging (OK), disjunctive kriging (DK), and universal kriging (UK). A normality test and overall trend analysis were applied for each interpolation technique to select the best-fitted model for each soil parameter. Results showed that OK was suitable for interpolating soil sodium adsorption ratio (SAR) and Na⁺; UK was suitable for soil Cl⁻ and pH; DK was suitable for soil Ca²⁺. The nugget-to-sill ratio was applied to evaluate the effects of structural and stochastic factors. The maps showed that the areas of non-saline soil and slightly saline soil accounted for 6.39% and 93.61%, respectively. The spatial distribution and accumulation of soil salt were significantly affected by irrigation probabilities and the drainage situation under long-term wastewater irrigation.
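All kriging variants named above solve a weighted-average system built from a variogram. A rough numpy-only sketch of the simplest variant, ordinary kriging, with a hypothetical exponential variogram and made-up sample points (real studies first fit the variogram to the data):

```python
import numpy as np

def ordinary_kriging(xy, z, xy0, sill=1.0, rng_a=50.0, nugget=0.0):
    """Ordinary kriging estimate at point xy0, assuming an
    exponential variogram gamma(h) = nugget + sill*(1 - exp(-h/rng_a)).
    xy: (n, 2) sample coordinates, z: (n,) sample values."""
    def gamma(h):
        return nugget + sill * (1.0 - np.exp(-h / rng_a))

    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    # Kriging system with a Lagrange multiplier enforcing sum(w) = 1.
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = gamma(d)
    A[n, n] = 0.0
    rhs = np.append(gamma(np.linalg.norm(xy - xy0, axis=1)), 1.0)
    w = np.linalg.solve(A, rhs)[:n]
    return float(w @ z)

# Toy "SAR" samples at the corners of a 100 m square (hypothetical):
pts = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
sar = np.array([2.0, 4.0, 6.0, 8.0])
est = ordinary_kriging(pts, sar, np.array([50.0, 50.0]))
print(round(est, 2))   # center of the square: equal weights -> 5.0
```

The nugget-to-sill ratio mentioned in the abstract comes from the fitted variogram: a low ratio indicates spatial structure dominates, a high ratio indicates random (stochastic) variation.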
Bacterial distribution in the rhizosphere of wild barley under contrasting microclimates.
Directory of Open Access Journals (Sweden)
Salme Timmusk
Full Text Available BACKGROUND: All plants in nature harbor a diverse community of rhizosphere bacteria which can affect plant growth. Our samples were isolated from the rhizosphere of wild barley Hordeum spontaneum at the Evolution Canyon ('EC'), Israel. The bacteria, which have been living in close relationship with the plant root under stressful conditions over millennia, are likely to have developed strategies to alleviate plant stress. METHODOLOGY/PRINCIPAL FINDINGS: We studied the distribution of culturable bacteria in the rhizosphere of H. spontaneum and characterized bacterial 1-aminocyclopropane-1-carboxylate deaminase (ACCd) production, biofilm production, phosphorus solubilization and halophilic behavior. We have shown that the H. spontaneum rhizosphere at the stressful South Facing Slope (SFS) harbors a significantly higher population of ACCd-producing, biofilm-forming, phosphorus-solubilizing, osmotic stress-tolerant bacteria. CONCLUSIONS/SIGNIFICANCE: The long-lived natural laboratory 'EC' facilitates the generation of theoretical, testable and predictive models of biodiversity and genome evolution in the area of plant-microbe interactions. It is likely that the bacteria isolated at the stressful SFS offer new opportunities for biotechnological applications in agro-ecological systems.
Bacterial distribution in the rhizosphere of wild barley under contrasting microclimates.
Timmusk, Salme; Paalme, Viiu; Pavlicek, Tomas; Bergquist, Jonas; Vangala, Ameraswar; Danilas, Triin; Nevo, Eviatar
2011-03-23
All plants in nature harbor a diverse community of rhizosphere bacteria which can affect plant growth. Our samples were isolated from the rhizosphere of wild barley Hordeum spontaneum at the Evolution Canyon ('EC'), Israel. The bacteria, which have been living in close relationship with the plant root under stressful conditions over millennia, are likely to have developed strategies to alleviate plant stress. We studied the distribution of culturable bacteria in the rhizosphere of H. spontaneum and characterized bacterial 1-aminocyclopropane-1-carboxylate deaminase (ACCd) production, biofilm production, phosphorus solubilization and halophilic behavior. We have shown that the H. spontaneum rhizosphere at the stressful South Facing Slope (SFS) harbors a significantly higher population of ACCd-producing, biofilm-forming, phosphorus-solubilizing, osmotic stress-tolerant bacteria. The long-lived natural laboratory 'EC' facilitates the generation of theoretical, testable and predictive models of biodiversity and genome evolution in the area of plant-microbe interactions. It is likely that the bacteria isolated at the stressful SFS offer new opportunities for biotechnological applications in agro-ecological systems.
A multi-period distribution network design model under demand uncertainty
Tabrizi, Babak H.; Razmi, Jafar
2013-05-01
Supply chain management is an inseparable component of satisfying customers' requirements. This paper deals with the distribution network design (DND) problem, a critical issue in achieving supply chain success, since a capable DND can guarantee the performance of the entire network. On the one hand, many factors can cause fluctuations in the input data that determine market behavior, with respect to short-term planning; on the other hand, network performance may be threatened by changes that take place across operating periods, with respect to long-term planning. Thus, in order to bring both kinds of change under control, we consider a new multi-period, multi-commodity, multi-source DND problem in circumstances where the network encounters uncertain demands. Fuzzy logic is applied here as an efficient tool for controlling the risk in potential customers' demand. The defuzzifying framework allows practitioners and decision-makers to interact with the solution procedure continuously. The fuzzy model is then validated by a sensitivity analysis test, and a typical problem is solved to illustrate the implementation steps. Finally, the formulation is tested on problems of different sizes to show its overall performance.
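The paper's exact defuzzification scheme is not reproduced here, but the basic step of turning an uncertain demand into a crisp planning value can be sketched with triangular fuzzy numbers and centroid defuzzification; the demand figures below are entirely hypothetical.

```python
def centroid_defuzzify(a, m, b):
    """Centroid (center of gravity) of a triangular fuzzy number (a, m, b):
    a = pessimistic, m = most likely, b = optimistic estimate."""
    return (a + m + b) / 3.0

# hypothetical fuzzy demand for one commodity over three planning periods
fuzzy_demands = [(80, 100, 130), (90, 120, 140), (100, 150, 170)]
crisp = [centroid_defuzzify(*d) for d in fuzzy_demands]
```

The crisp values would then feed the multi-period network design model as deterministic demand inputs; other defuzzification rules (e.g. graded mean) differ only in how the three estimates are weighted.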
Pressure distribution under three different types of harnesses used for guide dogs.
Peham, C; Limbeck, S; Galla, K; Bockstahler, B
2013-12-01
The aim of this study was to evaluate the pressure distribution under three different types of harnesses used for guide dogs (designated H1, H2 and H3). The dogs (n = 8) led a trainer through a course including a range of exercises (straight line, curve left, curve right, upstairs and downstairs). All dogs were clinically sound and showed no sign of lameness. The pressures beneath the harnesses were determined by sensor strips and related to the gait. In all harnesses, the highest pressures were found in the right sternal region (H1 2.02 ± 0.6 N/cm(2); H2 1.76 ± 0.4 N/cm(2); H3 1.14 ± 0.5 N/cm(2)). In all other regions, the pressures were in the range of 0-1.32 N/cm(2). The right and left sternal regions were almost constantly loaded. Contrary to previous assumptions, the back regions had minimal loading. This investigation demonstrated that there were significant differences among the harnesses. Copyright © 2013 Elsevier Ltd. All rights reserved.
DEFF Research Database (Denmark)
Zhong, Lan; Zhang, Kunlin; Huang, Xiangang
2003-01-01
that repeats of different copy number have different probabilities of appearance in shotgun data. Based on this principle, we constructed a statistical model and inferred criteria for mathematically defined repeats (MDRs) at different shotgun coverages. According to these criteria, we developed the software MDRmasker to identify and mask MDRs in shotgun data. With repeats masked prior to assembly, the speed of assembly was increased with lower error probability. In addition, since clone-insert size affects the accuracy of repeat assembly and scaffold construction, we also designed length distribution of clone...
Directory of Open Access Journals (Sweden)
Ante Kutle
2012-12-01
Full Text Available The geochemical environment can influence human health, causing chronic medical problems related to long-term, low-level exposure to toxic agents such as trace elements. Humans can be exposed to toxic substances directly, by inhalation of airborne dust, or indirectly, through the food chain or by consumption of local water for drinking, cooking, personal hygiene and recreational purposes. Chronic medical problems related to the geochemical characteristics of the environment can also be caused by a chronic deficit of chemical elements essential for humans. In this paper we present several applications of GIS and statistical methods for relating the geographical distribution of diseases to geochemical characteristics of the environment. In addition, we present methods applied to distinguish the natural distribution of elements from anthropogenic contributions, which is important information for establishing the protective measures necessary to decrease health risk (the paper is published in Croatian).
Stromqvist Vetelino, Frida E.
The performance of lasercom systems operating in the atmosphere is reduced by optical turbulence, which causes irradiance fluctuations in the received signal. The result is a randomly fading signal. Fade statistics for lasercom systems are determined from the probability density function (PDF) of the irradiance fluctuations. The expected number of fades per second and their mean fade time require the joint PDF of the fluctuating irradiance and its time derivative. Theoretical integral expressions, as well as closed-form analytical approximations, were developed for the joint PDF of a gamma-gamma distributed irradiance and its time derivative, and for the corresponding expression for the expected number of fades per second. The new approximation for the conditional PDF of the time derivative of a gamma-gamma irradiance is a zero-mean Gaussian distribution with a complicated, irradiance-dependent variance. Fade statistics obtained from experimental data were compared to theoretical predictions based on the lognormal and gamma-gamma distributions. A Gaussian beam wave was propagated through the atmosphere along a horizontal path near the ground, under moderate-to-strong optical turbulence. To characterize the propagation path, a new method that infers atmospheric propagation parameters was developed. Scintillation theory combined with a numerical scheme was used to infer the structure constant Cn2, the inner scale, l0, and the outer scale, L0, from the optical measurements. The inferred parameters were used in calculations for the theoretical PDFs. It was found that fade predictions made by the gamma-gamma and lognormal distributions provide an upper and lower bound, respectively, for the probability of fade and the number of fades per second for irradiance data collected in the moderate-to-strong fluctuation regime. Aperture averaging effects on the PDF of the irradiance fluctuations were investigated by comparing the irradiance distributions for the three receiver
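The gamma-gamma PDF at the heart of this analysis has a standard closed form for unit-mean irradiance, from which the probability of fade below a threshold follows by integration. A short numerical sketch using NumPy and SciPy is given below; the alpha, beta values and the fade threshold are illustrative, not those inferred from the experiment.

```python
import numpy as np
from scipy.special import gamma as G, kv

def gamma_gamma_pdf(I, alpha, beta):
    """PDF of normalized (unit-mean) irradiance under the gamma-gamma model;
    alpha and beta parameterize large- and small-scale scintillation."""
    I = np.asarray(I, dtype=float)
    coef = 2.0 * (alpha * beta) ** ((alpha + beta) / 2) / (G(alpha) * G(beta))
    return coef * I ** ((alpha + beta) / 2 - 1) * kv(alpha - beta,
                                                     2.0 * np.sqrt(alpha * beta * I))

# probability of fade below a threshold I_T, by simple numerical integration
I = np.linspace(1e-4, 20.0, 200000)
dI = I[1] - I[0]
p = gamma_gamma_pdf(I, alpha=4.0, beta=2.0)
total = np.sum(p) * dI               # should be close to 1 (normalization check)
I_T = 0.1                            # illustrative fade threshold
p_fade = np.sum(p[I <= I_T]) * dI    # P(I < I_T)
```

The lognormal bound discussed in the abstract is obtained the same way with the lognormal PDF substituted for `gamma_gamma_pdf`.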
Directory of Open Access Journals (Sweden)
Rodrigues M. L. K.
2013-04-01
Full Text Available In this study we evaluated the occurrence of heavy metals in a fluvial environment under the influence of tanneries – the Cadeia and Feitoria rivers basin (RS, south Brazil) – highlighting the distribution and potential mobility of the selected elements. Every three months over a one-year period, selected heavy metals and ancillary parameters were analyzed in water and sediment samples taken at ten sites along the rivers. Water analyses followed APHA recommendations, and sediment analyses were based on methods from USEPA (SW846) and the European Community (BCR sequential extraction). The determinations were performed by ICP/OES, except for Hg (CV/ETA). Statistical factor analysis was applied to the water and sediment data sets in order to obtain a synthesis of the environmental diagnosis. The results revealed that water quality decreased along the rivers, mainly in the dry period (January), showing the influence of nearby tannery plants and flow variations. Except for Fe, Al, and occasionally Mn, heavy metal contents in water were in agreement with Brazilian standards. Concerning sediments, Al, Cu, Fe, Ni, Mn, Ti, and Zn concentrations appeared to reflect base levels, while Cr and Hg were enriched in deposits from the lower part of the basin. The partition of heavy metals among the sediment geochemical phases showed higher mobility of Mn along the sampling sites, followed by Cr in the lower reach of the basin, which is most affected by tanneries. Since Cr was predominantly associated with the oxidizable fraction, its potential mobilization from contaminated sediments would be governed by redox conditions. The detection of Hg in the tissue of a bottom-fish species indicated that the environmental conditions are apparently favoring the remobilization of this metal from contaminated sediments.
Directory of Open Access Journals (Sweden)
Junzeng Xu
2016-03-01
Full Text Available The diurnal pattern of nitrous oxide (N2O) emissions is essential for understanding how weather and soil conditions influence daily mean estimates of N2O fluxes. Incubation experiments were conducted to investigate the effects of vertical soil moisture distribution patterns on diurnal variation of N2O emissions. Clear diurnal patterns of N2O emissions under both surface watering (SW) and subsurface watering (SUW) treatments (SUW12, SUW15, and SUW18) were detected in soil sample I (silty clay) and soil sample II (sandy loam), with peak N2O fluxes usually occurring between 12:00 and 18:00 h. Different vertical watering patterns resulted in changes in the daily range of N2O fluxes and in peak time. Mean fluxes from the SUW12, SUW15, and SUW18 treatments were 37.4%, 32.7%, and 43.3% lower than those from SW treatments for soil sample I, and 32.0%, 40.3%, and 41.1% lower for soil sample II. The moisture distribution patterns of SUW soils could therefore be effective in mitigating N2O emissions. The N2O emissions from soil sample I ranged from 178.3 to 2741.0 μg N2O m-2 h-1, much higher than those from soil sample II, which ranged from 7.0 to 83.7 μg N2O m-2 h-1. The different soil texture and N content level might account for the differences in the magnitude of N2O fluxes from the soils. The optimal soil moisture condition for peak N2O fluxes in the SW treatment had a relatively narrower range than in the SUW treatments, with 46% to 60% water-filled pore space (WFPS) for soil sample I and 26% to 34% WFPS for soil sample II, even though the surface soil moisture for peak N2O fluxes was somewhat different from the previously reported optimal soil moisture range of 45% to 75% WFPS.
Suresh, S; Radha, K V
2016-03-01
The present study deals with production of phytase from Rhizopus oligosporus MTCC 556 by solid state fermentation (SSF) using different rice bran varieties (ADT27, IR20, PAIYUR1, KG, and RASI), of which ADT27 rice bran yielded a maximum of 6.2 U gds⁻¹ phytase. Statistical optimization was performed by Central Composite Design (CCD); the results showed that 3.0 g dextrose, 2.5 g ammonium nitrate, a substrate size of 80 mesh, 10 mg calcium chloride and 116 h were optimal for phytase production by SSF, with a maximum of 23.14 U gds⁻¹. Phytase production improved 4-fold (31.3 U gds⁻¹) through chemical mutagenesis (mutant Rhizopus oligosporus MTCC 1116) in the optimized media composition. The partially purified phytase showed a molecular mass of approximately 90 kDa and was optimally active at pH 5.5 and 50°C. The enzyme exhibited substrate specificity for sodium phytate, and phytase activity was stimulated by Zn²⁺ and Ca²⁺.
International Nuclear Information System (INIS)
Alpizar Chavarria, Oscar
2013-01-01
A literature review is conducted to understand distributed generation, the reasons for its introduction into modern power systems, and the distributed generation technologies based on renewable energies that have been installed around the country. The frequency protections of distributed generation equipment under 1 MW are studied according to international standards such as IEEE 1547 and the specifications of equipment manufacturers. The influence of the settings recommended by international standards is investigated for distributed generation systems, including the frequency performance they have exhibited under frequency perturbations, as well as the influence they can have on the national and regional electrical system with different amounts of these technologies included in the national system. The recommended settings are evaluated through simulations in the PSSE program in the context of the frequency behavior of the national electric system [es]
International Nuclear Information System (INIS)
Tanaka, H.; Ohno, N.; Tsuji, Y.; Kajita, S.
2010-01-01
We have analyzed the 2D convective motion of coherent structures, which is associated with plasma blobs, under attached and detached plasma conditions in a linear divertor simulator, NAGDIS-II. Data analysis of probe and fast-imaging camera measurements, by spatio-temporal correlation with triple decomposition and by proper orthogonal decomposition (POD), was carried out to determine the basic properties of coherent structures detached from the bulk plasma column. Under the attached plasma condition, the spatio-temporal correlation with triple decomposition based on the probe measurements showed that two types of coherent structures with different sizes detached from the bulk plasma, and that the azimuthally localized structure propagated radially faster than the larger structure. Under the detached plasma condition, movies taken by the fast-imaging camera clearly showed the dynamics of a 2D spiral structure at the peripheral regions of the bulk plasma; these dynamics caused the broadening of the plasma profile. The POD method was used for the data processing of the movies to obtain low-dimensional mode shapes. It was found that the m=1 and m=2 ring-shaped coherent structures were dominant. Comparison between the POD analysis of the movie and the probe data suggested that the coherent structure could be detached from the bulk plasma mainly in association with the m=2 fluctuation. This phenomenon could play an important role in the reduction of the particle and heat flux as well as in the plasma recombination processes in plasma detachment (copyright 2010 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim) (orig.)
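POD of an image sequence reduces, in practice, to a singular value decomposition of the mean-subtracted snapshot matrix. The sketch below uses synthetic data (two oscillating spatial modes plus noise) in place of the camera movies to show how dominant mode shapes and their energy fractions are extracted; the mode construction is purely illustrative.

```python
import numpy as np

# synthetic "movie": a dominant mode and a weaker second mode, plus noise
nx, nt = 64, 400
x = np.linspace(0, 2 * np.pi, nx)
t = np.linspace(0, 10, nt)
data = (np.outer(np.sin(x), np.cos(2 * np.pi * t))               # dominant mode
        + 0.3 * np.outer(np.sin(2 * x), np.sin(4 * np.pi * t))   # weaker mode
        + 0.01 * np.random.default_rng(0).standard_normal((nx, nt)))

# POD = SVD of the mean-subtracted snapshot matrix (columns = snapshots)
fluct = data - data.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(fluct, full_matrices=False)

# columns of U are spatial mode shapes; s**2 gives the energy per mode
energy = s**2 / np.sum(s**2)
```

In the experiment, each snapshot would be a flattened camera frame; the leading columns of `U` then correspond to the m=1 and m=2 ring-shaped structures reported above.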
Humic substances and its distribution in coffee crop under cover crops and weed control methods
Directory of Open Access Journals (Sweden)
Bruno Henrique Martins
2016-08-01
Full Text Available Humic substances (HS) comprise the passive element in soil organic matter (SOM) and represent one of the soil carbon pools which may be altered by different cover crops and weed control methods. This study aimed to assess HS distribution and characteristics in an experimental coffee crop area subjected to cover crops and cultural, mechanical, and chemical weed control. The study was carried out at Londrina, in the state of Paraná, southern Brazil (23°21'30" S; 51°10'17" W). In 2008, seven weed control/cover crop treatments were established in a randomized block design between two coffee rows as the main-plot factor, with soil sampling depths (0-10 cm, 10-20 cm, 20-30 cm and 30-40 cm) as the split-plot factor. HS were extracted with alkaline and acid solutions and analyzed by chromic acid wet oxidation and UV-Vis spectroscopy. Chemical attributes varied in the topsoil between the field conditions analyzed. Cover crop cuttings and coffee tree pruning residues left on the soil surface may have interfered with nutrient cycling and the humification process. Data showed that humic substances comprised about 50 % of SOM. Although the different cover crops and weed control methods did not alter humic and fulvic acid carbon content, a possible incidence of condensed aromatic structures with depth was observed in the fulvic acids, leading to an average decrease of 53 % in the E4/E6 ratio. Humin carbon content increased 25 % in the topsoil, particularly under crop weed-control methods, probably due to the high incorporation of recalcitrant structures from coffee tree pruning residues and cover crops.
Stress distribution in dental prosthesis under an occlusal combined dynamic loading
International Nuclear Information System (INIS)
Merdji, A.; Bachir Bouiadjra, B.; Ould Chikh, B.; Mootanah, R.; Aminallah, L.; Serier, B.; Muslih, I.M.
2012-01-01
Highlights: ► The mechanical stress is highest in areas of cortical bone. ► The mechanical stress in the cancellous bone is greatest at the bottom of the dental implant. ► An implant with low-volume bone might cause increased stress concentration in the cortical bone. -- Abstract: The biomechanical behavior of osseointegrated dental prosthesis systems plays an important role in their functional longevity inside the bone. Simulation of these systems requires accurate modeling of the prosthesis components, the jaw bone, the implant–bone interface, and the response of the system to different types of applied forces. The purpose of this study was to develop a new three-dimensional model of an osseointegrated molar dental prosthesis and to carry out finite element analysis to evaluate stress distributions in the bone and the dental prosthesis components under a combined dynamic occlusal load applied to the top of the occlusal face of the prosthesis crown. The jaw bone model, containing cortical bone and cancellous bone, was constructed using computer tomography scan pictures and Computer Aided Design tools. The dental prosthesis components were constructed simulating a commercially available cylindrical implant of 4.8 mm diameter and 10 mm length. Both finite element models were created in Abaqus finite element software. All materials used in the models were considered to be isotropic, homogeneous and linearly elastic. The elastic properties, loads and constraints used in the model were taken from published data. The results of our finite element analyses indicated that the maximum stresses were located around the mesial neck of the implant, in the marginal bone. Thus, this area should be preserved clinically in order to maintain the bone–implant interface structurally and functionally.
Cai, Huai-yu; Dong, Xiao-tong; Zhu, Meng; Huang, Zhan-hua
2018-01-01
Wavefront coding as an athermalization technique can effectively ensure stable imaging of an optical system over a large temperature range, with the additional advantages of compact structure and low cost. Using simulation to analyze properties such as the PSF and MTF of a wavefront coding athermal system under several typical temperature gradient distributions helps characterize its behavior in non-ideal temperature environments and supports meeting the system design targets. In this paper, we utilize the interoperability of data between SolidWorks and ZEMAX to simplify the traditional process of structural/thermal/optical integrated analysis. We design and build the optical model and corresponding mechanical model of an infrared imaging wavefront coding athermal system. Axial and radial temperature gradients of different magnitudes are applied to the whole system in SolidWorks, yielding the changes in curvature, refractive index and the distances between the lenses. We then import the deformed model into ZEMAX for ray tracing and obtain the changes in the PSF and MTF of the optical system. Finally, we discuss and evaluate the consistency of the PSF (MTF) of the wavefront coding athermal system and the image restorability, which provides a basis and reference for the optimal design of such systems. The results show that the single-material infrared wavefront coding athermal system can tolerate axial temperature gradients up to a temperature fluctuation of 60°C, much higher than its tolerance to radial temperature gradients.
Tyagi, H.; Gosain, A. K.; Khosa, R.; Anand, J.
2015-12-01
Rivers have no regard for human demarcated boundaries. Besides, an ever-increasing demand-supply gap and vested riparian interests fuel transboundary water conflicts. For resolving such disputes, appropriation doctrines advocating equity and fairness have received endorsement in the Helsinki Rules (1966) and the UN Convention (1997). Thus, the current study proposes the principle of equitable apportionment for sharing Ganges waters, as it balances the interests and deservedness of all stakeholders, namely India and its 11 states, Bangladesh, Nepal, and China. The study endeavors to derive a reasonable share for each co-basin state by operationalizing the vague concepts of fairness and equity through an objective and quantitative framework encompassing proportionality and egalitarianism for distributive and procedural justice. Equal weightage factors reflecting hydrology, geography and water use potential are chosen for fair share computation, wherein each contender ranks these factors to maximize its entitlement. If cumulative claims exceed the water availability, each claimant puts forth its next-ranked factor, and this process continues till the claims match availability. Due to inter-annual variability in some factors, scenarios for the Rabi and Kharif seasons are considered, along with cases for maximum, upper quartile, median, lower quartile and minimum flows. The possibility of spatial homogeneity and heterogeneity in the factors is also recognized. Sometimes a lack of technical information hinders transboundary dispute resolution via legal mechanisms. Hence, the study also attempts to bridge this gap between law and technology through a GIS-based SWAT hydrologic model by estimating the Ganges water yield, and the consequent share of each riparian, for a range of flows incorporating e-flows as well, under present and future climate and landuse scenarios. 82% of India's territory lies within interstate river basins, and therefore this research is very pertinent, as it can facilitate decision makers in effective interstate water conflict resolution.
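The iterative claim-reduction procedure described above can be sketched as follows. The states, the ranked factor claims, and the available volume below are entirely hypothetical; the study's actual factors, weights, and fallback behavior when claims never fit are not specified here.

```python
def settle_claims(ranked_claims, available):
    """ranked_claims: {state: [claim under 1st-ranked factor, 2nd, ...]},
    each list ordered from most to least favorable for that state.
    Every claimant moves to its next-ranked (smaller) claim until the
    cumulative claims fit within the available volume."""
    idx = {s: 0 for s in ranked_claims}
    while True:
        total = sum(c[idx[s]] for s, c in ranked_claims.items())
        if total <= available:
            return {s: c[idx[s]] for s, c in ranked_claims.items()}
        movable = [s for s, c in ranked_claims.items() if idx[s] < len(c) - 1]
        if not movable:
            # ranked factors exhausted: scale proportionally as a fallback
            return {s: c[idx[s]] * available / total
                    for s, c in ranked_claims.items()}
        for s in movable:
            idx[s] += 1

# hypothetical claims (volume units) under three ranked factors per riparian
claims = {"A": [60, 50, 45], "B": [40, 35, 30], "C": [30, 25, 20]}
shares = settle_claims(claims, available=100)
```

With these numbers, all three claimants step down twice before the total (95) fits within the available 100 units.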
DEFF Research Database (Denmark)
Jeong, Cheol-Ho
2009-01-01
Most acoustic measurements are based on an assumption of ideal conditions. One such ideal condition is a diffuse and reverberant field. In practice, a perfectly diffuse sound field cannot be achieved in a reverberation chamber. Uneven incident energy density under measurement conditions can cause discrepancies between the measured value and the theoretical random-incidence absorption coefficient. Therefore the angular distribution of the incident acoustic energy onto an absorber sample should be taken into account. The angular distribution of the incident energy density was simulated using the beam tracing method for various room shapes and source positions. The averaged angular distribution is found to be similar to a Gaussian distribution. As a result, an angle-weighted absorption coefficient was proposed by considering the angular energy distribution to improve the agreement between...
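The angle-weighting idea can be illustrated numerically: the absorber's angle-dependent absorption is averaged with the (here Gaussian-shaped) incident energy distribution as a weight, instead of the uniform weighting of the classical Paris random-incidence formula. The absorber model and Gaussian parameters below are made up for illustration and are not the paper's simulated distributions.

```python
import math

def angle_weighted_alpha(alpha, weight, n=2000):
    """Average alpha(theta) over 0..pi/2 with weight(theta)*cos(theta)*sin(theta),
    normalized; weight(theta) = 1 recovers the Paris random-incidence formula."""
    num = den = 0.0
    for i in range(n):  # midpoint rule
        th = (i + 0.5) * (math.pi / 2) / n
        w = weight(th) * math.cos(th) * math.sin(th)
        num += alpha(th) * w
        den += w
    return num / den

alpha = lambda th: 0.8 * math.cos(th)   # toy angle-dependent absorber
uniform = lambda th: 1.0
gauss = lambda th: math.exp(-((th - math.radians(40)) / math.radians(20)) ** 2)

a_paris = angle_weighted_alpha(alpha, uniform)   # = 0.8 * 2/3 analytically
a_gauss = angle_weighted_alpha(alpha, gauss)
```

The difference between `a_paris` and `a_gauss` is exactly the kind of discrepancy the proposed angle-weighted coefficient is meant to correct.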
Understanding advanced statistical methods
Westfall, Peter
2013-01-01
Introduction: Probability, Statistics, and Science — Reality, Nature, Science, and Models; Statistical Processes: Nature, Design and Measurement, and Data; Models; Deterministic Models; Variability; Parameters; Purely Probabilistic Statistical Models; Statistical Models with Both Deterministic and Probabilistic Components; Statistical Inference; Good and Bad Models; Uses of Probability Models. Random Variables and Their Probability Distributions — Introduction; Types of Random Variables: Nominal, Ordinal, and Continuous; Discrete Probability Distribution Functions; Continuous Probability Distribution Functions; Some Calculus: Derivatives and Least Squares; More Calculus: Integrals and Cumulative Distribution Functions. Probability Calculation and Simulation — Introduction; Analytic Calculations, Discrete and Continuous Cases; Simulation-Based Approximation; Generating Random Numbers. Identifying Distributions — Introduction; Identifying Distributions from Theory Alone; Using Data: Estimating Distributions via the Histogram; Quantiles: Theoretical and Data-Based Estimate...