statistical tests for frequency distribution of mean gravity anomalies
African Journals Online (AJOL)
ES Obe
1980-03-01
Mar 1, 1980 ... STATISTICAL TESTS FOR FREQUENCY DISTRIBUTION OF MEAN GRAVITY ANOMALIES. By ... approach. Kaula [1,2] discussed the method of applying statistical techniques in the ... mathematical foundation of physical ...
Recurrent frequency-size distribution of characteristic events
Directory of Open Access Journals (Sweden)
S. G. Abaimov
2009-04-01
Full Text Available Statistical frequency-size (frequency-magnitude) properties of earthquake occurrence play an important role in seismic hazard assessments. The behavior of earthquakes is represented by two different statistics: interoccurrent behavior in a region and recurrent behavior at a given point on a fault (or at a given fault). The interoccurrent frequency-size behavior has been investigated by many authors and generally obeys the power-law Gutenberg-Richter distribution to a good approximation. It is expected that the recurrent frequency-size behavior should obey different statistics. However, this problem has received little attention because historic earthquake sequences do not contain enough events to reconstruct the necessary statistics. To overcome this lack of data, this paper investigates the recurrent frequency-size behavior for several problems. First, the sequences of creep events on a creeping section of the San Andreas fault are investigated. The applicability of the Brownian passage-time, lognormal, and Weibull distributions to the recurrent frequency-size statistics of slip events is tested and the Weibull distribution is found to be the best-fit distribution. To verify this result the behaviors of numerical slider-block and sand-pile models are investigated and the Weibull distribution is confirmed as the applicable distribution for these models as well. Exponents β of the best-fit Weibull distributions for the observed creep event sequences and for the slider-block model are found to have similar values ranging from 1.6 to 2.2 with the corresponding aperiodicities C_{V} of the applied distribution ranging from 0.47 to 0.64. We also note similarities between recurrent time-interval statistics and recurrent frequency-size statistics.
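The Weibull fit and aperiodicity measure described in this abstract can be sketched as follows. This is a hypothetical illustration on synthetic event sizes (not the paper's creep-event data); scipy is assumed to be available, and the shape value 1.9 is simply chosen inside the reported 1.6-2.2 range.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical stand-in for a recurrent event-size series: sizes drawn
# from a Weibull with shape (exponent) beta = 1.9.
sizes = stats.weibull_min.rvs(1.9, scale=1.0, size=2000, random_state=rng)

# Fit a two-parameter Weibull (location fixed at 0 for positive sizes).
beta_hat, loc, scale_hat = stats.weibull_min.fit(sizes, floc=0)

# Aperiodicity C_V = standard deviation / mean of the fitted distribution.
mean, var = stats.weibull_min.stats(beta_hat, scale=scale_hat, moments="mv")
cv = float(np.sqrt(var) / mean)

print(round(float(beta_hat), 2), round(cv, 2))
```

For shape values in the paper's range, the implied aperiodicity indeed falls near the reported 0.47-0.64 interval.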
Dynamic Response to Pedestrian Loads with Statistical Frequency Distribution
DEFF Research Database (Denmark)
Krenk, Steen
2012-01-01
on the magnitude of the resulting response. A frequency representation of vertical pedestrian load is developed, and a compact explicit formula is derived for the magnitude of the resulting response, in terms of the damping ratio of the structure, the bandwidth of the pedestrian load, and the mean footfall frequency. The accuracy of the formula is verified by a statistical moment analysis using the Lyapunov equations.
International Nuclear Information System (INIS)
Dai, Wu-Sheng; Xie, Mi
2013-01-01
In this paper, we give a general discussion on the calculation of the statistical distribution from a given operator relation of creation, annihilation, and number operators. Our result shows that as long as the relation between the number operator and the creation and annihilation operators can be expressed as a†b = Λ(N) or N = Λ⁻¹(a†b), where N, a†, and b denote the number, creation, and annihilation operators, i.e., N is a function of the quadratic product of the creation and annihilation operators, the corresponding statistical distribution is the Gentile distribution, a statistical distribution in which the maximum occupation number is an arbitrary integer. As examples, we discuss the statistical distributions corresponding to various operator relations. In particular, besides the Bose–Einstein and Fermi–Dirac cases, we discuss the statistical distributions for various schemes of intermediate statistics, especially various q-deformation schemes. Our result shows that the statistical distributions corresponding to various q-deformation schemes are various Gentile distributions with different maximum occupation numbers, which are determined by the deformation parameter q. This result shows that the results given in much of the literature on the q-deformation distribution are inaccurate or incomplete. -- Highlights: ► A general discussion on calculating the statistical distribution from relations of creation, annihilation, and number operators. ► A systematic study of the statistical distributions corresponding to various q-deformation schemes. ► Arguing that many results of q-deformation distributions in the literature are inaccurate or incomplete
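The Gentile distribution mentioned here has a closed-form mean occupation number that interpolates between the Fermi-Dirac (maximum occupation 1) and Bose-Einstein (unbounded occupation) limits. A minimal numerical check of those limits, using the standard form ⟨N⟩ = 1/(e^x − 1) − (n_max+1)/(e^((n_max+1)x) − 1) with x = β(ε − μ) (the function name below is ours, for illustration):

```python
import math

def gentile_occupation(x, n_max):
    """Mean occupation number for Gentile statistics with maximum
    occupation n_max, where x = beta * (epsilon - mu) > 0."""
    return 1.0 / math.expm1(x) - (n_max + 1) / math.expm1((n_max + 1) * x)

x = 0.7
fermi = 1.0 / (math.exp(x) + 1.0)  # Fermi-Dirac limit (n_max = 1)
bose = 1.0 / math.expm1(x)         # Bose-Einstein limit (n_max -> infinity)

# n_max = 1 reproduces Fermi-Dirac exactly; a large n_max approaches Bose-Einstein.
print(abs(gentile_occupation(x, 1) - fermi) < 1e-12)     # True
print(abs(gentile_occupation(x, 1000) - bose) < 1e-12)   # True
```

The n_max = 1 identity follows algebraically from e^{2x} − 1 = (e^x − 1)(e^x + 1); for x > 0 the second term vanishes as n_max grows.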
Predicting Statistical Distributions of Footbridge Vibrations
DEFF Research Database (Denmark)
Pedersen, Lars; Frier, Christian
2009-01-01
The paper considers vibration response of footbridges to pedestrian loading. Employing Newmark and Monte Carlo simulation methods, a statistical distribution of bridge vibration levels is calculated modelling walking parameters such as step frequency and stride length as random variables...
Effect of ultrasound frequency on the Nakagami statistics of human liver tissues.
Directory of Open Access Journals (Sweden)
Po-Hsiang Tsui
Full Text Available The analysis of the backscattered statistics using the Nakagami parameter is an emerging ultrasound technique for assessing hepatic steatosis and fibrosis. Previous studies indicated that the echo amplitude distribution of a normal liver follows the Rayleigh distribution (the Nakagami parameter m is close to 1). However, using different frequencies may change the backscattered statistics of normal livers. This study explored the frequency dependence of the backscattered statistics in human livers and then discussed the sources of ultrasound scattering in the liver. A total of 30 healthy participants were enrolled to undergo a standard care ultrasound examination on the liver, which is a natural model containing diffuse and coherent scatterers. The liver of each volunteer was scanned from the right intercostal view to obtain image raw data at different central frequencies ranging from 2 to 3.5 MHz. Phantoms with diffuse scatterers only were also made to perform ultrasound scanning using the same protocol for comparisons with clinical data. The Nakagami parameter-frequency correlation was evaluated using Pearson correlation analysis. The median and interquartile range of the Nakagami parameter obtained from livers was 1.00 (0.98-1.05) for 2 MHz, 0.93 (0.89-0.98) for 2.3 MHz, 0.87 (0.84-0.92) for 2.5 MHz, 0.82 (0.77-0.88) for 3.3 MHz, and 0.81 (0.76-0.88) for 3.5 MHz. The Nakagami parameter decreased with increasing central frequency (r = -0.67, p < 0.0001). However, the effect of ultrasound frequency on the statistical distribution of the backscattered envelopes was not found in the phantom results (r = -0.147, p = 0.0727). The current results demonstrated that the backscattered statistics of normal livers are frequency-dependent. Moreover, coherent scatterers may be the primary factor dominating the frequency dependence of the backscattered statistics in a liver.
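The Rayleigh benchmark (m ≈ 1) referred to above can be illustrated with a moment-based Nakagami estimator on a synthetic envelope. This is a sketch under the diffuse-scatterer assumption, not the study's estimator or data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Rayleigh envelope: magnitude of complex Gaussian backscatter
# (diffuse scatterers only), so the Nakagami m should be close to 1.
env = np.abs(rng.normal(size=100_000) + 1j * rng.normal(size=100_000))

def nakagami_m(r):
    """Moment estimator of the Nakagami parameter from envelope samples r:
    m = E[R^2]^2 / Var(R^2)."""
    r2 = r ** 2
    return r2.mean() ** 2 / r2.var()

m = nakagami_m(env)
print(round(float(m), 2))  # ≈ 1 for a Rayleigh envelope
```

Pre-Rayleigh (m < 1) and post-Rayleigh (m > 1) envelopes would shift this estimate, which is what the frequency dependence reported in the abstract tracks.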
Influences on flood frequency distributions in Irish river catchments
Directory of Open Access Journals (Sweden)
S. Ahilan
2012-04-01
Full Text Available This study explores influences on flood frequency distributions in Irish rivers. A Generalised Extreme Value (GEV) type I distribution is recommended in Ireland for estimating flood quantiles in a single-site flood frequency analysis. This paper presents the findings of an investigation that identified the GEV statistical distributions that best fit the annual maximum (AM) data series extracted from 172 gauging stations of 126 rivers in Ireland. Analysis of these data was undertaken to explore hydraulic and hydro-geological factors that influence flood frequency distributions. A hierarchical approach of increasing statistical power that used probability plots, moment and L-moment diagrams, the Hosking goodness-of-fit algorithm and a modified Anderson-Darling (A-D) statistical test was followed to determine whether a type I, type II or type III distribution was valid. Results of the Hosking et al. method indicated that of the 143 stations with flow records exceeding 25 yr, data for 95 (67%) were best represented by GEV type I distributions, and a further 9 (6%) and 39 (27%) stations followed type II and type III distributions respectively. Type I, type II and type III distributions were determined for 83 (58%), 16 (11%) and 34 (24%) stations respectively using the modified A-D method (data from 10 stations were not represented by GEV family distributions). The influence of karst terrain on these flood frequency distributions was assessed by incorporating results on an Arc-GIS platform showing karst features and using Monte Carlo simulations to assess the significance of the number and clustering of the observed distributions. Floodplain effects were identified by using two-sample t-tests to identify statistical correlations between the distributions and catchment properties that are indicative of strong floodplain activity. The data reveal that type I distributions are spatially well represented throughout the country. While also well represented throughout ...
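The type I/II/III classification by GEV shape parameter can be sketched with scipy. Note that scipy's genextreme parameterizes the shape as c = −ξ, so c < 0 is Fréchet (type II) and c > 0 is Weibull (type III); the series below is synthetic, drawn from a Gumbel, so the fit should come out near type I:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Hypothetical annual-maximum flow series, drawn from a Gumbel (GEV type I),
# so the fitted shape parameter should be close to zero.
am_series = stats.gumbel_r.rvs(loc=100.0, scale=20.0, size=500, random_state=rng)

c, loc, scale = stats.genextreme.fit(am_series)

# scipy convention: c ~ 0 -> type I (Gumbel), c < 0 -> type II (Frechet),
# c > 0 -> type III (Weibull); the 0.1 tolerance is an arbitrary choice here.
if abs(c) < 0.1:
    gev_type = "type I"
elif c < 0:
    gev_type = "type II"
else:
    gev_type = "type III"

print(gev_type)
```

In practice the paper's hierarchical tests (probability plots, L-moments, Hosking, modified A-D) replace the crude tolerance used in this sketch.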
Frequency-volume Statistics of Rock Falls: Examples From France, Italy and California
Dussauge-Peisser, C.; Guzzetti, F.; Wieczorek, G. F.
There is accumulating evidence that the distribution of rock-fall volume exhibits power law (fractal) statistics in different physiographic and geologic environments. We have studied the frequency-volume statistics of rock falls in three areas: Grenoble, France; Umbria, Italy; and Yosemite Valley, California, USA. We present a comparison of the datasets currently available. For the Grenoble area a catalogue of rock falls between 1248 and 1995 occurred along a 120 km long limestone cliff. The dataset contains information on 105 rock-fall events ranging in size from 3×10⁻² to 5×10⁸ m³. Only the time window 1935-1995 is considered in the study, involving 87 events from 10⁻² to 10⁶ m³. The cumulative frequency-volume statistics follow a power-law (fractal) relationship with exponent b = -0.4 over the range 50 m³ ... For Yosemite Valley the database contains information on historical (1851-2001) rock falls (122), rock slides (251) and prehistoric rock avalanches (5). For Yosemite, the non-cumulative frequency-volume statistics of rock falls and rock slides are very similar and correlate well with a power-law (fractal) relation with exponent beta = -1.4, over the range 30 m³ ...
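The cumulative power-law exponent b can be estimated from a volume inventory by maximum likelihood (the Hill estimator). A sketch on synthetic Pareto-distributed volumes; the sample, cutoff, and exponent here are illustrative, not the catalogued values:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic rock-fall volumes from a Pareto (power-law) distribution with
# cumulative exponent b = 0.4 above a cutoff v_min = 50 m^3.
b_true = 0.4
v_min = 50.0
volumes = v_min * (1.0 - rng.random(5000)) ** (-1.0 / b_true)

# Hill (maximum-likelihood) estimate of the cumulative exponent b:
b_hat = len(volumes) / np.log(volumes / v_min).sum()

print(round(float(b_hat), 2))  # ≈ 0.4
```

On real inventories the cutoff v_min must itself be chosen (e.g. from the roll-off due to censoring of small events) before the exponent is fitted.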
Statistical distributions of optimal global alignment scores of random protein sequences
Directory of Open Access Journals (Sweden)
Tang Jiaowei
2005-10-01
Full Text Available Abstract Background The inference of homology from statistically significant sequence similarity is a central issue in sequence alignments. So far the statistical distribution function underlying the optimal global alignments has not been completely determined. Results In this study, random and real but unrelated sequences prepared in six different ways were selected as reference datasets to obtain their respective statistical distributions of global alignment scores. All alignments were carried out with the Needleman-Wunsch algorithm and optimal scores were fitted to the Gumbel, normal and gamma distributions respectively. The three-parameter gamma distribution performs the best as the theoretical distribution function of global alignment scores, as it agrees very well with the distribution of alignment scores. The normal distribution also agrees well with the score frequencies when the shape parameter of the gamma distribution is sufficiently large, since this is the regime in which the normal distribution can be viewed as an approximation of the gamma distribution. Conclusion We have shown that the optimal global alignment scores of random protein sequences fit the three-parameter gamma distribution function. This would be useful for inferring homology between sequences whose relationship is unknown, through evaluation of the significance of alignment scores under the gamma distribution.
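The comparison of the three-parameter gamma and normal fits can be sketched with scipy. The "scores" here are synthetic draws from a gamma standing in for real alignment scores; Needleman-Wunsch itself is not implemented, and the parameter values are arbitrary:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Stand-in for optimal global alignment scores: drawn from a three-parameter
# gamma (shape a, location loc, scale), the family the paper identifies.
scores = stats.gamma.rvs(a=8.0, loc=-30.0, scale=5.0, size=3000, random_state=rng)

# Fit both candidate distributions and compare goodness of fit via the
# Kolmogorov-Smirnov statistic (smaller = closer fit).
a, loc, scale = stats.gamma.fit(scores)
ks_gamma = stats.kstest(scores, "gamma", args=(a, loc, scale)).statistic
mu, sigma = stats.norm.fit(scores)
ks_norm = stats.kstest(scores, "norm", args=(mu, sigma)).statistic

print(round(float(ks_gamma), 3), round(float(ks_norm), 3))
```

Note that KS p-values are not strictly valid when the parameters are estimated from the same data; the statistic is used here only as a relative measure of fit, as a rough analogue of the paper's comparison.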
Scaling laws and fluctuations in the statistics of word frequencies
Gerlach, Martin; Altmann, Eduardo G.
2014-11-01
In this paper, we combine statistical analysis of written texts and simple stochastic models to explain the appearance of scaling laws in the statistics of word frequencies. The average vocabulary of an ensemble of fixed-length texts is known to scale sublinearly with the total number of words (Heaps’ law). Analyzing the fluctuations around this average in three large databases (Google-ngram, English Wikipedia, and a collection of scientific articles), we find that the standard deviation scales linearly with the average (Taylor's law), in contrast to the prediction of decaying fluctuations obtained using simple sampling arguments. We explain both scaling laws (Heaps’ and Taylor) by modeling the usage of words using a Poisson process with a fat-tailed distribution of word frequencies (Zipf's law) and topic-dependent frequencies of individual words (as in topic models). Considering topical variations leads to quenched averages, turns the vocabulary size into a non-self-averaging quantity, and explains the empirical observations. For the numerous practical applications relying on estimations of vocabulary size, our results show that uncertainties remain large even for long texts. We show how to account for these uncertainties in measurements of lexical richness of texts with different lengths.
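Heaps'-law sublinearity under the Poisson usage model mentioned here can be checked directly: with Zipfian word frequencies pᵢ, the expected vocabulary V(N) = Σᵢ (1 − e^(−N pᵢ)) grows sublinearly in the text length N. An idealized sketch with exponent γ = 1 and an arbitrary lexicon size:

```python
import numpy as np

# Zipf frequencies p_i ~ 1/i over a large lexicon (a common idealization).
ranks = np.arange(1, 10**6 + 1)
p = 1.0 / ranks
p /= p.sum()

def expected_vocabulary(n_words):
    """Expected number of distinct words in a text of n_words tokens,
    under independent Poisson usage of each word type."""
    return float((1.0 - np.exp(-n_words * p)).sum())

v1 = expected_vocabulary(10_000)
v2 = expected_vocabulary(100_000)

# Sublinear growth: a tenfold longer text has less than tenfold vocabulary.
print(v2 / v1 < 10)  # True
```

The strict inequality follows from concavity of 1 − e^(−Nx) in N; the paper's contribution is the behavior of the *fluctuations* around this average, which the quenched topic-dependent frequencies control.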
Similarity of Symbol Frequency Distributions with Heavy Tails
Directory of Open Access Journals (Sweden)
Martin Gerlach
2016-04-01
Full Text Available Quantifying the similarity between symbolic sequences is a traditional problem in information theory which requires comparing the frequencies of symbols in different sequences. In numerous modern applications, ranging from DNA over music to texts, the distribution of symbol frequencies is characterized by heavy-tailed distributions (e.g., Zipf’s law. The large number of low-frequency symbols in these distributions poses major difficulties to the estimation of the similarity between sequences; e.g., they hinder an accurate finite-size estimation of entropies. Here, we show analytically how the systematic (bias and statistical (fluctuations errors in these estimations depend on the sample size N and on the exponent γ of the heavy-tailed distribution. Our results are valid for the Shannon entropy (α=1, its corresponding similarity measures (e.g., the Jensen-Shannon divergence, and also for measures based on the generalized entropy of order α. For small α’s, including α=1, the errors decay slower than the 1/N decay observed in short-tailed distributions. For α larger than a critical value α^{*}=1+1/γ≤2, the 1/N decay is recovered. We show the practical significance of our results by quantifying the evolution of the English language over the last two centuries using a complete α spectrum of measures. We find that frequent words change more slowly than less frequent words and that α=2 provides the most robust measure to quantify language change.
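The slow decay of estimation errors under heavy tails can be illustrated numerically with the naive (plug-in) Shannon entropy estimator on a toy Zipf ensemble. The lexicon size, sample sizes, and seed are arbitrary illustrative choices, not the paper's corpora:

```python
import numpy as np

rng = np.random.default_rng(5)

# Heavy-tailed (Zipf-like) symbol distribution with exponent gamma ~ 1.
K = 50_000
p = 1.0 / np.arange(1, K + 1)
p /= p.sum()
H_true = float(-(p * np.log(p)).sum())

def plugin_entropy(sample_size):
    """Naive (plug-in) Shannon entropy estimate from a finite sample."""
    counts = np.bincount(rng.choice(K, size=sample_size, p=p), minlength=K)
    q = counts[counts > 0] / sample_size
    return float(-(q * np.log(q)).sum())

bias_small = H_true - plugin_entropy(1_000)
bias_large = H_true - plugin_entropy(100_000)

# The plug-in estimator underestimates H, and for heavy-tailed
# distributions the bias shrinks only slowly with the sample size.
print(round(bias_small, 2), round(bias_large, 2))
```

Even at N = 100,000 a substantial fraction of the probability mass sits on unseen symbols, which is exactly the finite-size effect the paper quantifies analytically.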
International Nuclear Information System (INIS)
Chernov, N.I.; Kurbatov, V.S.; Ososkov, G.A.
1988-01-01
Parameter estimation for multivariate probability distributions is studied in experiments where data are presented as one-dimensional histograms. For this model a statistic defined as a quadratic form of the observed frequencies, which has a limiting χ²-distribution, is proposed. The efficiency of the estimator minimizing the value of that statistic is proved within the class of all unbiased estimates obtained via minimization of quadratic forms of observed frequencies. The method was applied to the physical problem of analysing the secondary pion energy distribution in the isobar model of pion-nucleon interactions with the production of an additional pion. The numerical experiments showed that the accuracy of estimation is twice that of conventional methods.
Statistical distribution sampling
Johnson, E. S.
1975-01-01
Determining the distribution of statistics by sampling was investigated. Characteristic functions, the quadratic regression problem, and the differential equations for the characteristic functions are analyzed.
Scaling laws and fluctuations in the statistics of word frequencies
International Nuclear Information System (INIS)
Gerlach, Martin; Altmann, Eduardo G
2014-01-01
Statistical distribution for generalized ideal gas of fractional-statistics particles
International Nuclear Information System (INIS)
Wu, Y.
1994-01-01
We derive the occupation-number distribution in a generalized ideal gas of particles obeying fractional statistics, including mutual statistics, by adopting a state-counting definition. When there is no mutual statistics, the statistical distribution interpolates between bosons and fermions, and respects a fractional exclusion principle (except for bosons). Anyons in a strong magnetic field at low temperatures constitute such a physical system. Applications to the thermodynamic properties of quasiparticle excitations in the Laughlin quantum Hall fluid are discussed
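The interpolation between bosons and fermions described here can be checked numerically in the single-species case (no mutual statistics) using the fractional-exclusion form in which the mean occupation is n = 1/(w + g), with w solving w^g (1+w)^(1−g) = e^{β(ε−μ)}. The function name below is ours; scipy's root finder is assumed available:

```python
import math
from scipy.optimize import brentq

def occupation(g, zeta):
    """Mean occupation n for fractional exclusion statistics with
    statistics parameter g, where zeta = exp(beta * (epsilon - mu)):
    solve w^g (1+w)^(1-g) = zeta, then n = 1 / (w + g)."""
    f = lambda w: g * math.log(w) + (1 - g) * math.log1p(w) - math.log(zeta)
    w = brentq(f, 1e-12, 1e12)
    return 1.0 / (w + g)

zeta = math.exp(0.7)
bose = 1.0 / (zeta - 1.0)   # g = 0 limit
fermi = 1.0 / (zeta + 1.0)  # g = 1 limit

print(abs(occupation(0.0, zeta) - bose) < 1e-9)   # True: g = 0 is Bose-Einstein
print(abs(occupation(1.0, zeta) - fermi) < 1e-9)  # True: g = 1 is Fermi-Dirac
print(fermi < occupation(0.5, zeta) < bose)       # True: semions interpolate
```

For 0 < g < 1 the occupation is bounded by 1/g, which is the "fractional exclusion principle" the abstract refers to.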
DEFF Research Database (Denmark)
Bohlin, J; Skjerve, E; Ussery, David
2008-01-01
with here are mainly used to examine similarities between archaeal and bacterial DNA from different genomes. These methods compare observed genomic frequencies of fixed-sized oligonucleotides with expected values, which can be determined by genomic nucleotide content, smaller oligonucleotide frequencies, or be based on specific statistical distributions. Advantages of these statistical methods include measurements of phylogenetic relationship with relatively small pieces of DNA sampled from almost anywhere within genomes, detection of foreign/conserved DNA, and homology searches. Our aim was to explore the reliability and best-suited applications of some popular methods, which include relative oligonucleotide frequencies (ROF), di- to hexanucleotide zeroth-order Markov methods (ZOM) and 2nd-order Markov chain methods (MCM). Tests were performed on distant homology searches with large DNA sequences, detection ...
Dependence of exponents on text length versus finite-size scaling for word-frequency distributions
Corral, Álvaro; Font-Clos, Francesc
2017-08-01
Some authors have recently argued that a finite-size scaling law for the text-length dependence of word-frequency distributions cannot be conceptually valid. Here we give solid quantitative evidence for the validity of this scaling law, using both careful statistical tests and analytical arguments based on the generalized central-limit theorem applied to the moments of the distribution (and obtaining a novel derivation of Heaps' law as a by-product). We also find that the picture of word-frequency distributions with power-law exponents that decrease with text length [X. Yan and P. Minnhagen, Physica A 444, 828 (2016), 10.1016/j.physa.2015.10.082] does not stand up to rigorous statistical analysis. Instead, we show that the distributions are perfectly described by power-law tails with stable exponents, whose values are close to 2, in agreement with the classical Zipf's law. Some misconceptions about scaling are also clarified.
Statistical distributions applications and parameter estimates
Thomopoulos, Nick T
2017-01-01
This book gives a description of the group of statistical distributions that have ample application to studies in statistics and probability. Understanding statistical distributions is fundamental for researchers in almost all disciplines. The informed researcher will select the statistical distribution that best fits the data in the study at hand. Some of the distributions are well known to the general researcher and are in use in a wide variety of ways. Other useful distributions are less understood and are not in common use. The book describes when and how to apply each of the distributions in research studies, with a goal to identify the distribution that best applies to the study. The distributions are for continuous, discrete, and bivariate random variables. In most studies, the parameter values are not known a priori, and sample data is needed to estimate parameter values. In other scenarios, no sample data is available, and the researcher seeks some insight that allows the estimate of ...
Statistical Distributions of Optical Flares from Gamma-Ray Bursts
International Nuclear Information System (INIS)
Yi, Shuang-Xi; Yu, Hai; Wang, F. Y.; Dai, Zi-Gao
2017-01-01
We statistically study gamma-ray burst (GRB) optical flares from the Swift /UVOT catalog. We compile 119 optical flares, including 77 flares with redshift measurements. Some tight correlations among the timescales of optical flares are found. For example, the rise time is correlated with the decay time, and the duration time is correlated with the peak time of optical flares. These two tight correlations indicate that longer rise times are associated with longer decay times of optical flares and also suggest that broader optical flares peak at later times, which are consistent with the corresponding correlations of X-ray flares. We also study the frequency distributions of optical flare parameters, including the duration time, rise time, decay time, peak time, and waiting time. Similar power-law distributions for optical and X-ray flares are found. Our statistical results imply that GRB optical flares and X-ray flares may share a similar physical origin, and both of them are possibly related to central engine activities.
Statistical Distributions of Optical Flares from Gamma-Ray Bursts
Energy Technology Data Exchange (ETDEWEB)
Yi, Shuang-Xi [College of Physics and Engineering, Qufu Normal University, Qufu 273165 (China); Yu, Hai; Wang, F. Y.; Dai, Zi-Gao, E-mail: fayinwang@nju.edu.cn [School of Astronomy and Space Science, Nanjing University, Nanjing 210093 (China)
2017-07-20
A method for statistically comparing spatial distribution maps
Directory of Open Access Journals (Sweden)
Reynolds Mary G
2009-01-01
Full Text Available Abstract Background Ecological niche modeling is a method for estimation of species distributions based on certain ecological parameters. Thus far, empirical determination of significant differences between independently generated distribution maps for a single species (maps which are created through equivalent processes, but with different ecological input parameters) has been challenging. Results We describe a method for comparing model outcomes, which allows a statistical evaluation of whether the strength of prediction and breadth of predicted areas is measurably different between projected distributions. To create ecological niche models for statistical comparison, we utilized GARP (Genetic Algorithm for Rule-Set Production) software to generate ecological niche models of human monkeypox in Africa. We created several models, keeping constant the case location input records for each model but varying the ecological input data. In order to assess the relative importance of each ecological parameter included in the development of the individual predicted distributions, we performed pixel-to-pixel comparisons between model outcomes and calculated the mean difference in pixel scores. We used a two-sample Student's t-test (assuming as null hypothesis that both maps were identical to each other regardless of which input parameters were used) to examine whether the mean difference in corresponding pixel scores from one map to another was greater than would be expected by chance alone. We also utilized weighted kappa statistics, frequency distributions, and percent difference to look at the disparities in pixel scores. Multiple independent statistical tests indicated precipitation as the single most important independent ecological parameter in the niche model for human monkeypox disease. Conclusion In addition to improving our understanding of the natural factors influencing the distribution of human monkeypox disease, such pixel-to-pixel comparison ...
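The pixel-to-pixel comparison step can be sketched as follows, on hypothetical score maps rather than GARP output; scipy's two-sample t-test stands in for the test described:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

# Two hypothetical model-output maps of pixel suitability scores (0-10),
# flattened to 1-D; map_b differs from map_a by a systematic shift plus noise.
map_a = rng.uniform(0, 10, size=10_000)
map_b = map_a + 0.5 + rng.normal(0.0, 0.5, size=10_000)

mean_diff = float((map_b - map_a).mean())  # mean difference in pixel scores

# Two-sample t-test with the null hypothesis that both maps are identical.
t_stat, p_value = stats.ttest_ind(map_a, map_b)

print(round(mean_diff, 1), bool(p_value < 0.05))
```

Because corresponding pixels of two model runs are strongly dependent, a paired test on the per-pixel differences would typically be more powerful than the unpaired form shown here; the sketch simply mirrors the abstract's description.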
Distributional Properties of Order Statistics and Record Statistics
Directory of Open Access Journals (Sweden)
Abdul Hamid Khan
2012-07-01
Full Text Available Distributional properties of the order statistics, upper and lower records have been utilized to characterize distributions of interest. Further, one-sided random dilation and contraction are utilized to obtain the distribution of non-adjacent order statistics, and their important deductions are also discussed.
Blind Separation of Nonstationary Sources Based on Spatial Time-Frequency Distributions
Directory of Open Access Journals (Sweden)
Zhang Yimin
2006-01-01
Full Text Available Blind source separation (BSS based on spatial time-frequency distributions (STFDs provides improved performance over blind source separation methods based on second-order statistics, when dealing with signals that are localized in the time-frequency (t-f domain. In this paper, we propose the use of STFD matrices for both whitening and recovery of the mixing matrix, which are two stages commonly required in many BSS methods, to provide robust BSS performance to noise. In addition, a simple method is proposed to select the auto- and cross-term regions of time-frequency distribution (TFD. To further improve the BSS performance, t-f grouping techniques are introduced to reduce the number of signals under consideration, and to allow the receiver array to separate more sources than the number of array sensors, provided that the sources have disjoint t-f signatures. With the use of one or more techniques proposed in this paper, improved performance of blind separation of nonstationary signals can be achieved.
Frequency distributions: from the sun to the earth
Directory of Open Access Journals (Sweden)
N. B. Crosby
2011-11-01
Full Text Available The space environment is forever changing on all spatial and temporal scales. Energy releases are observed in numerous dynamic phenomena (e.g. solar flares, coronal mass ejections, solar energetic particle events) where measurements provide signatures of the dynamics. Parameters (e.g. peak count rate, total energy released, etc.) describing these phenomena are found to have frequency-size distributions that follow power-law behavior. Natural phenomena on Earth, such as earthquakes and landslides, display similar power-law behavior. This suggests an underlying universality in nature and poses the question of whether the distribution of energy is the same for all these phenomena. Frequency distributions provide constraints for models that aim to simulate the physics and statistics observed in the individual phenomena. The concept of self-organized criticality (SOC), also known as the "avalanche concept", was introduced by Bak et al. (1987, 1988) to characterize the behavior of dissipative systems that contain a large number of elements interacting over a short range. The systems evolve to a critical state in which a minor event starts a chain reaction that can affect any number of elements in the system. It is found that frequency distributions of the output parameters from the chain reaction taken over a period of time can be represented by power laws. During the last decades SOC has been debated from all angles. New SOC models, as well as non-SOC models, have been proposed to explain the observed power-law behavior. Furthermore, since Bak's pioneering work in 1987, people have searched for signatures of SOC everywhere. This paper reviews how SOC behavior has become one way of interpreting the power-law behavior observed in naturally occurring phenomena from the Sun down to the Earth.
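The avalanche picture behind SOC can be reproduced with a minimal Bak-Tang-Wiesenfeld sandpile; the grid size, grain count, and seed below are arbitrary toy choices:

```python
import numpy as np

rng = np.random.default_rng(7)

# Minimal 2D Bak-Tang-Wiesenfeld sandpile: grains are dropped on random
# sites; every site reaching height >= 4 topples, sending one grain to
# each of its four neighbours (grains pushed off the edge are lost).
L = 20
grid = np.zeros((L, L), dtype=int)
sizes = []  # avalanche size = number of topplings caused by one dropped grain

for _ in range(20_000):
    grid[rng.integers(L), rng.integers(L)] += 1
    size = 0
    while True:
        over = grid >= 4
        if not over.any():
            break
        size += int(over.sum())
        grid[over] -= 4
        grid[1:, :] += over[:-1, :]   # grain to the neighbour below
        grid[:-1, :] += over[1:, :]   # grain to the neighbour above
        grid[:, 1:] += over[:, :-1]   # grain to the right neighbour
        grid[:, :-1] += over[:, 1:]   # grain to the left neighbour
    sizes.append(size)

sizes = np.array(sizes)
# SOC signature: a broad, power-law-like avalanche-size distribution,
# with rare large events far above the typical size.
print(int(sizes.max()), int(np.median(sizes)))
```

In the self-organized steady state the frequency-size histogram of `sizes` follows an approximate power law up to a system-size cutoff, which is the behavior the review connects to solar and terrestrial event statistics.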
Distributions with given marginals and statistical modelling
Fortiana, Josep; Rodriguez-Lallena, José
2002-01-01
This book contains a selection of the papers presented at the meeting `Distributions with given marginals and statistical modelling', held in Barcelona (Spain), July 17-20, 2000. In 24 chapters, this book covers topics such as the theory of copulas and quasi-copulas, the theory and compatibility of distributions, models for survival distributions and other well-known distributions, time series, categorical models, definition and estimation of measures of dependence, monotonicity and stochastic ordering, shape and separability of distributions, hidden truncation models, diagonal families, orthogonal expansions, tests of independence, and goodness of fit assessment. These topics share the use and properties of distributions with given marginals, this being the fourth specialised text on this theme. The innovative aspect of the book is the inclusion of statistical aspects such as modelling, Bayesian statistics, estimation, and tests.
Karian, Zaven A
2000-01-01
Throughout the physical and social sciences, researchers face the challenge of fitting statistical distributions to their data. Although the study of statistical modelling has made great strides in recent years, the number and variety of distributions to choose from-all with their own formulas, tables, diagrams, and general properties-continue to create problems. For a specific application, which of the dozens of distributions should one use? What if none of them fit well? Fitting Statistical Distributions helps answer those questions. Focusing on techniques used successfully across many fields, the authors present all of the relevant results related to the Generalized Lambda Distribution (GLD), the Generalized Bootstrap (GB), and Monte Carlo simulation (MC). They provide the tables, algorithms, and computer programs needed for fitting continuous probability distributions to data in a wide variety of circumstances-covering bivariate as well as univariate distributions, and including situations where moments do...
Oxide vapor distribution from a high-frequency sweep e-beam system
Chow, R.; Tassano, P. L.; Tsujimoto, N.
1995-03-01
Oxide vapor distributions have been determined as a function of the operating parameters of a high-frequency sweep e-beam source combined with a programmable sweep controller. We show which parameters are significant, which parameters yield the broadest oxide deposition distribution, and the procedure used to arrive at these conclusions. A design-of-experiments strategy was used with five operating parameters: evaporation rate, sweep speed, sweep pattern (pre-programmed), phase speed (azimuthal rotation of the pattern), and profile (dwell time as a function of radial position). A design was chosen that would show which of the parameters and parameter pairs have a statistically significant effect on the vapor distribution. Witness flats were placed symmetrically across a 25-inch-diameter platen. The stationary platen was centered 24 inches above the e-gun crucible. An oxide material was evaporated under 27 different conditions, and thickness measurements were made with a stylus profilometer. This information will enable users of high-frequency e-gun systems to optimally locate the source in a vacuum system and to understand which parameters have a major effect on the vapor distribution.
Gerdes, Lars; Busch, Ulrich; Pecoraro, Sven
2014-12-14
According to Regulation (EU) No 619/2011, trace amounts of non-authorised genetically modified organisms (GMO) in feed are tolerated within the EU if certain prerequisites are met. Tolerable traces must not exceed the so-called 'minimum required performance limit' (MRPL), which was defined according to the mentioned regulation to correspond to 0.1% mass fraction per ingredient. Therefore, not-yet-authorised GMO (and some GMO whose approvals have expired) have to be quantified at very low levels following qualitative detection in genomic DNA extracted from feed samples. As the results of quantitative analysis can imply severe legal and financial consequences for producers or distributors of feed, the quantification results need to be utterly reliable. We developed a statistical approach to investigate the experimental measurement variability within one 96-well PCR plate. This approach visualises the frequency distribution of the zygosity-corrected relative content of genetically modified material resulting from different combinations of transgene and reference-gene Cq values. One of its applications is simulating the consequences of varying parameters on measurement results; parameters could be, for example, replicate numbers or baseline and threshold settings, and measurement results could be, for example, the median (class) and relative standard deviation (RSD). All calculations can be done using the built-in functions of Excel without any need for programming. The developed Excel spreadsheets are available (see section 'Availability of supporting data' for details). In most cases, the combination of four PCR replicates for each of the two DNA isolations already resulted in a relative standard deviation of 15% or less. The aims of the study are scientifically based suggestions for minimising the uncertainty of measurement, especially in, but not limited to, the field of GMO quantification at low concentration levels. Four PCR replicates for each of the two DNA isolations
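The frequency-distribution idea described above can be sketched without Excel: pair every transgene Cq replicate with every reference-gene Cq replicate, convert each pairing to a relative GM content, and summarize the resulting distribution. This is a minimal illustration assuming ideal PCR efficiency (doubling per cycle) and hypothetical Cq values; the zygosity factor is a placeholder for the correction discussed in the abstract.

```python
import itertools
import statistics

def relative_content(cq_trans, cq_ref, zygosity_factor=1.0):
    # Delta-Cq quantification assuming 100% PCR efficiency:
    # relative content (%) = 2^(Cq_ref - Cq_trans) * 100, zygosity-corrected.
    return zygosity_factor * 2.0 ** (cq_ref - cq_trans) * 100.0

# Hypothetical replicate Cq values from one 96-well plate (not the paper's data).
cq_transgene = [34.1, 34.4, 33.9, 34.2]
cq_reference = [24.0, 24.1, 23.9, 24.2]

# Frequency distribution over every transgene/reference pairing.
values = [relative_content(t, r) for t, r in itertools.product(cq_transgene, cq_reference)]
median = statistics.median(values)
rsd = 100.0 * statistics.stdev(values) / statistics.mean(values)
```

With these illustrative Cq values the median lands near the 0.1% MRPL region, which is exactly the concentration regime where replicate-to-replicate variability matters most.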
Statistical Analysis of Solar PV Power Frequency Spectrum for Optimal Employment of Building Loads
Energy Technology Data Exchange (ETDEWEB)
Olama, Mohammed M [ORNL; Sharma, Isha [ORNL; Kuruganti, Teja [ORNL; Fugate, David L [ORNL
2017-01-01
In this paper, a statistical analysis of the frequency spectrum of solar photovoltaic (PV) power output is conducted. This analysis quantifies the frequency content, which can be used for purposes such as the optimal employment of building loads and distributed energy resources. One year of solar PV power output data was collected and analyzed at one-second resolution to find bounds and levels for the different frequency components. The annual, seasonal, and monthly statistics of the PV frequency content are computed and illustrated in boxplot format. To examine the compatibility of building loads with PV consumption, a spectral analysis of building loads such as heating, ventilation, and air-conditioning (HVAC) units and water heaters was performed; this defined the bandwidth over which these devices can operate. Results show that nearly all of the PV output (about 98%) is contained within frequencies lower than 1 mHz (equivalent to ~15 min), which can be consumed by local building loads such as HVAC units and water heaters. Medium frequencies in the range of ~15 min to ~1 min are likely to be suitable for consumption by the fan equipment of variable-air-volume HVAC systems, which have time constants in the range of a few seconds to a few minutes. This study indicates that most of the PV generation can be consumed by building loads with the help of proper control strategies, thereby reducing the impact on the grid and the size of storage systems.
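The "fraction of PV power below 1 mHz" computation described above can be sketched with an FFT. The signal here is a synthetic stand-in (a slow diurnal ramp plus small fast noise), not the paper's measured PV data; the analysis step, however, is the generic one: integrate the one-sided power spectrum below the 1 mHz cutoff and divide by the total non-DC power.

```python
import numpy as np

rng = np.random.default_rng(0)
dt = 1.0                      # one-second resolution, as in the study
n = 86_400                    # one day of samples
t = np.arange(n) * dt

# Synthetic stand-in for PV output: slow diurnal half-sine plus small fast noise.
signal = np.sin(np.pi * t / 86_400) + 0.01 * rng.standard_normal(n)

# One-sided power spectrum, excluding the DC bin.
spec = np.abs(np.fft.rfft(signal - signal.mean())) ** 2
freqs = np.fft.rfftfreq(n, d=dt)
frac_below_1mHz = spec[(freqs > 0) & (freqs < 1e-3)].sum() / spec[freqs > 0].sum()
```

Because the diurnal component dominates, virtually all of the power falls below 1 mHz, mirroring the ~98% figure reported for real PV output.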
Statistical Distribution of Fatigue Life for Cast TiAl Alloy
Directory of Open Access Journals (Sweden)
WAN Wenjuan
2016-08-01
Full Text Available The statistical distribution of fatigue life data and its controlling factors for cast Ti-47.5Al-2.5V-1.0Cr-0.2Zr (atom fraction/%) alloy were investigated. Fatigue tests were load-controlled rotating-bending tests (R=-1) performed at a frequency of 100 Hz at 750 ℃ in air. The fracture mechanism was analyzed by observing the fracture surface morphologies through a scanning electron microscope, and the fatigue life data were analyzed by Weibull statistics. The results show that the fatigue life data present a remarkable scatter, ranging from 10^3 to 10^6 cycles, and are distributed mainly in the short- and long-life regimes. The reason for this phenomenon is that the fatigue crack initiators differ between specimens: cracks in short-life specimens initiate at shrinkage porosity, while cracks in long-life specimens initiate at bridged porosity interfaces and soft-oriented lamellar interfaces. Based on the observations of the fracture surfaces, a two-parameter Weibull distribution model for the fatigue life data can be used to predict the fatigue life at a given failure probability. It is also shown that shrinkage porosity has the most detrimental effect on fatigue life.
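Predicting life at a given failure probability from a fitted two-parameter Weibull model, as the abstract describes, amounts to inverting the Weibull CDF. The shape and scale values below are hypothetical placeholders, not the paper's fitted parameters.

```python
import math

def weibull_life_at_probability(p_fail, beta, eta):
    # Invert the two-parameter Weibull CDF F(N) = 1 - exp(-(N/eta)**beta)
    # to obtain the life N at a given failure probability p_fail.
    return eta * (-math.log(1.0 - p_fail)) ** (1.0 / beta)

# Hypothetical shape (beta) and scale (eta, cycles); not the paper's values.
beta, eta = 0.7, 2.0e5
n10 = weibull_life_at_probability(0.10, beta, eta)   # B10 life: 10% failed
n50 = weibull_life_at_probability(0.50, beta, eta)   # median life
```

A shape parameter below 1 produces the strong scatter the abstract reports, with the predicted B10 life far below the median.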
Adaptive Maneuvering Frequency Method of Current Statistical Model
Institute of Scientific and Technical Information of China (English)
Wei Sun; Yongjian Yang
2017-01-01
The current statistical model (CSM) performs well in maneuvering-target tracking. However, a fixed maneuvering frequency deteriorates the tracking results when a Kalman filter (KF) is used, causing serious dynamic delay, slow convergence, and limited precision. In this study, a new current statistical model and a new Kalman filter are proposed to improve the performance of maneuvering-target tracking. The new model, which employs an innovation-dominated subjection function to adaptively adjust the maneuvering frequency, performs better in tracking step maneuvers, although a fluctuation phenomenon appears. To address this problem, a new adaptive fading Kalman filter is also proposed, in which the prediction values are amended in time by judgment and amendment rules, so that the tracking precision and the fluctuation phenomenon of the new current statistical model are improved. Simulation results indicate the effectiveness of the new algorithm and its practical guiding significance.
Statistical Tests for Frequency Distribution of Mean Gravity Anomalies
African Journals Online (AJOL)
The hypothesis that a very large number of 1° x 1° mean gravity anomalies are normally distributed has been rejected at the 5% significance level based on the chi-square and unit normal deviate tests. However, the 5° equal-area mean anomalies derived from the 1° x 1° data have been found to be normally distributed at the same ...
Football goal distributions and extremal statistics
Greenhough, J.; Birch, P. C.; Chapman, S. C.; Rowlands, G.
2002-12-01
We analyse the distributions of the number of goals scored by home teams, away teams, and the total scored in the match, in domestic football games from 169 countries between 1999 and 2001. The probability density functions (PDFs) of goals scored are too heavy-tailed to be fitted over their entire ranges by Poisson or negative binomial distributions which would be expected for uncorrelated processes. Log-normal distributions cannot include zero scores and here we find that the PDFs are consistent with those arising from extremal statistics. In addition, we show that it is sufficient to model English top division and FA Cup matches in the seasons of 1970/71-2000/01 on Poisson or negative binomial distributions, as reported in analyses of earlier seasons, and that these are not consistent with extremal statistics.
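The heavy-tail diagnosis described above can be illustrated by fitting a Poisson distribution to goal counts by its mean and comparing the predicted tail probability with the observed one. The counts below are hypothetical match data for illustration, not the paper's 169-country data set.

```python
import math

def poisson_sf(k, lam):
    # P(X >= k) for a Poisson(lam) variable, by direct summation of the pmf.
    return 1.0 - sum(math.exp(-lam) * lam ** i / math.factorial(i) for i in range(k))

# Hypothetical goals-per-match counts (100 matches; not the paper's data).
goals = [0] * 25 + [1] * 35 + [2] * 20 + [3] * 10 + [4] * 5 + [5] * 3 + [8] * 2
lam = sum(goals) / len(goals)                      # Poisson fit by the mean
observed_tail = sum(g >= 6 for g in goals) / len(goals)
predicted_tail = poisson_sf(6, lam)
```

When high-scoring matches occur more often than the fitted Poisson tail predicts, as here, the distribution is too heavy-tailed for a Poisson model over its entire range, which is the paper's motivation for turning to extremal statistics.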
Zemba, Michael; Nessel, James; Houts, Jacquelynne; Luini, Lorenzo; Riva, Carlo
2016-01-01
The rain-rate data and statistics of a location are often used in conjunction with models to predict rain attenuation. However, the true attenuation is a function not only of rain rate but also of the drop size distribution (DSD). Models generally utilize an average drop size distribution (Laws and Parsons, or Marshall and Palmer); however, individual rain events may deviate from these models significantly if their DSD is not well approximated by the average. Therefore, characterizing the relationship between the DSD and attenuation is valuable for improving modeled predictions of rain attenuation statistics. The DSD may also be used to derive the instantaneous frequency-scaling factor and thus validate frequency-scaling models. Since June 2014, NASA Glenn Research Center (GRC) and the Politecnico di Milano (POLIMI) have jointly conducted a propagation study in Milan, Italy utilizing the 20 and 40 GHz beacon signals of the Alphasat TDP#5 Aldo Paraboni payload. The Ka- and Q-band beacon receivers provide a direct measurement of the signal attenuation, while concurrent weather instrumentation provides measurements of the atmospheric conditions at the receiver. Among these instruments is a Thies Clima Laser Precipitation Monitor (optical disdrometer), which yields drop size distributions; this DSD information can be used to derive a scaling factor that scales the measured 20 GHz data to the expected 40 GHz attenuation. Given the capability to both predict and directly observe 40 GHz attenuation, this site is uniquely situated to assess and characterize such predictions. Previous work using these data examined the relationship between the measured drop size distribution and the measured attenuation of the link. The focus of this paper now turns to a deeper analysis of the scaling factor, including the prediction error as a function of attenuation level, the correlation between the scaling factor and the rain rate, and the temporal variability of the drop size
On positivity of time-frequency distributions.
Janssen, A.J.E.M.; Claasen, T.A.C.M.
1985-01-01
Consideration is given to the fundamental impossibility for time-frequency energy distributions of Cohen's class to be everywhere nonnegative while at the same time having correct marginal distributions. It is shown that the Wigner distribution is the only member of a
International Nuclear Information System (INIS)
Ballini, J.-P.; Cazes, P.; Turpin, P.-Y.
1976-01-01
Analysing the histogram of anode pulse amplitudes allows a discussion of the hypotheses that have been proposed to account for the statistical processes of secondary multiplication in a photomultiplier. In an earlier work, good agreement was obtained between experimental and reconstructed spectra, assuming a first-dynode distribution comprising two Poisson distributions with distinct mean values. This first approximation led to a search for a method that could give the weights of several Poisson distributions with distinct mean values. Three methods are briefly presented: classical linear regression, constrained regression (d'Esopo's method), and regression on variables subject to error. These methods give an approximation of the frequency function that represents the dispersion of the pointwise mean gain around the overall first-dynode mean gain. Comparison between this function and the one employed in the Polya distribution shows that the latter is inadequate to describe the statistical process of secondary multiplication. Numerous spectra obtained with two kinds of photomultiplier working under different physical conditions have been analysed. Two points are then discussed: does the frequency function represent the dynode structure and the interdynode collection process, and is the model (the multiplication process of all dynodes but the first is Poissonian) valid whatever the photomultiplier and the conditions of use. (Auth.)
Statistical frequency in perception affects children's lexical production.
Richtsmeier, Peter T; Gerken, LouAnn; Goffman, Lisa; Hogan, Tiffany
2009-06-01
Children's early word production is influenced by the statistical frequency of speech sounds and combinations. Three experiments asked whether this production effect can be explained by a perceptual learning mechanism that is sensitive to word-token frequency and/or variability. Four-year-olds were exposed to nonwords that were either frequent (presented 10 times) or infrequent (presented once). When the frequent nonwords were spoken by the same talker, children showed no significant effect of perceptual frequency on production. When the frequent nonwords were spoken by different talkers, children produced them with fewer errors and shorter latencies. The results implicate token variability in perceptual learning.
Best Statistical Distribution of flood variables for Johor River in Malaysia
Salarpour Goodarzi, M.; Yusop, Z.; Yusof, F.
2012-12-01
A complex flood event is characterized by a few variables, such as flood peak, flood volume, and flood duration, which might be mutually correlated. This study explored the statistical distributions of peak flow, flood duration, and flood volume at the Rantau Panjang gauging station on the Johor River in Malaysia. Hourly data were recorded for 45 years and analysed by water year (July-June). Five distributions, namely Log-Normal, Generalized Pareto, Log-Pearson, Normal, and Generalized Extreme Value (GEV), were used to model the distributions of all three variables. The Anderson-Darling and Kolmogorov-Smirnov goodness-of-fit tests were used to evaluate the best fit. Goodness-of-fit tests at the 5% level of significance indicate that all the models can be used to model the distributions of peak flow, flood duration, and flood volume. However, the Generalized Pareto distribution is found to be the most suitable model when tested with the Anderson-Darling test, while the Kolmogorov-Smirnov test suggests that GEV is the best for peak flow. The results of this research can be used to improve flood frequency analysis. (Figure: comparison between the Generalized Extreme Value, Generalized Pareto, and Log-Pearson cumulative distribution functions of peak flow.)
Correcting length-frequency distributions for imperfect detection
Breton, André R.; Hawkins, John A.; Winkelman, Dana L.
2013-01-01
Sampling gear selects for specific sizes of fish, which may bias length-frequency distributions that are commonly used to assess population size structure, recruitment patterns, growth, and survival. To properly correct for sampling biases caused by gear and other sources, length-frequency distributions need to be corrected for imperfect detection. We describe a method for adjusting length-frequency distributions when capture and recapture probabilities are a function of fish length, temporal variation, and capture history. The method is applied to a study involving the removal of Smallmouth Bass Micropterus dolomieu by boat electrofishing from a 38.6-km reach on the Yampa River, Colorado. Smallmouth Bass longer than 100 mm were marked and released alive from 2005 to 2010 on one or more electrofishing passes and removed on all other passes from the population. Using the Huggins mark–recapture model, we detected a significant effect of fish total length, previous capture history (behavior), year, pass, year×behavior, and year×pass on capture and recapture probabilities. We demonstrate how to partition the Huggins estimate of abundance into length frequencies to correct for these effects. Uncorrected length frequencies of fish removed from Little Yampa Canyon were negatively biased in every year by as much as 88% relative to mark–recapture estimates for the smallest length-class in our analysis (100–110 mm). Bias declined but remained high even for adult length-classes (≥200 mm). The pattern of bias across length-classes was variable across years. The percentage of unadjusted counts that were below the lower 95% confidence interval from our adjusted length-frequency estimates were 95, 89, 84, 78, 81, and 92% from 2005 to 2010, respectively. Length-frequency distributions are widely used in fisheries science and management. Our simple method for correcting length-frequency estimates for imperfect detection could be widely applied when mark–recapture data
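The correction described above is, at its core, a Horvitz-Thompson-style adjustment: divide each observed count by the estimated capture probability for its length-class. The logistic length effect below uses hypothetical coefficients, not the Huggins-model estimates from the Yampa River study.

```python
import math

def capture_prob(length_mm, b0=-4.0, b1=0.02):
    # Hypothetical logistic length effect on capture probability.
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * length_mm)))

# Observed counts by length-class midpoint (mm); illustrative values only.
observed = {105: 40, 155: 120, 205: 90}
corrected = {L: n / capture_prob(L) for L, n in observed.items()}

# Relative negative bias of the raw count for the smallest length-class.
bias_105 = 1.0 - observed[105] / corrected[105]
```

With these assumed coefficients, the smallest length-class is undercounted by roughly the same order as the up-to-88% bias reported in the abstract, while larger, more catchable fish are biased less.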
Modified Distribution-Free Goodness-of-Fit Test Statistic.
Chun, So Yeon; Browne, Michael W; Shapiro, Alexander
2018-03-01
Covariance structure analysis and its structural equation modeling extensions have become one of the most widely used methodologies in social sciences such as psychology, education, and economics. An important issue in such analysis is to assess the goodness of fit of a model under analysis. One of the most popular test statistics used in covariance structure analysis is the asymptotically distribution-free (ADF) test statistic introduced by Browne (Br J Math Stat Psychol 37:62-83, 1984). The ADF statistic can be used to test models without any specific distribution assumption (e.g., multivariate normal distribution) of the observed data. Despite its advantage, it has been shown in various empirical studies that unless sample sizes are extremely large, this ADF statistic could perform very poorly in practice. In this paper, we provide a theoretical explanation for this phenomenon and further propose a modified test statistic that improves the performance in samples of realistic size. The proposed statistic deals with the possible ill-conditioning of the involved large-scale covariance matrices.
Positivity of time-frequency distribution functions
Janssen, A.J.E.M.
1988-01-01
This paper deals with the question of how various 'natural' conditions imposed on time-frequency distribution functions prevent them from being nonnegative everywhere for all signals. Attention is restricted mainly to distribution functions that involve the signal bilinearly. This paper summarizes and
International Nuclear Information System (INIS)
Gorokhovski, M A; Saveliev, V L
2008-01-01
This paper analyses statistical universalities that arise over time during constant frequency fragmentation under scaling symmetry. The explicit expression of particle-size distribution obtained from the evolution kinetic equation shows that, with increasing time, the initial distribution tends to the ultimate steady-state delta function through at least two intermediate universal asymptotics. The earlier asymptotic is the well-known log-normal distribution of Kolmogorov (1941 Dokl. Akad. Nauk. SSSR 31 99-101). This distribution is the first universality and has two parameters: the first and the second logarithmic moments of the fragmentation intensity spectrum. The later asymptotic is a power function (stronger universality) with a single parameter that is given by the ratio of the first two logarithmic moments. At large times, the first universality implies that the evolution equation can be reduced exactly to the Fokker-Planck equation instead of making the widely used but inconsistent assumption about the smallness of higher than second order moments. At even larger times, the second universality shows evolution towards a fractal state with dimension identified as a measure of the fracture resistance of the medium
Statistical distribution of quantum particles
Indian Academy of Sciences (India)
S B Khasare
2018-02-08
Feb 8, 2018 ... In this work, the statistical distribution functions for boson, fermions and their mixtures have been ... index is greater than unity, then it is easy in the present approach to ... ability W. Section 3 gives the derivation and graphical.
Towards a systematic approach to comparing distributions used in flood frequency analysis
Bobée, B.; Cavadias, G.; Ashkar, F.; Bernier, J.; Rasmussen, P.
1993-02-01
The estimation of flood quantiles from available streamflow records has been a topic of extensive research in this century. However, the large number of distributions and estimation methods proposed in the scientific literature has led to a state of confusion, and a gap prevails between theory and practice. This concerns both at-site and regional flood frequency estimation. To facilitate the work of "hydrologists, designers of hydraulic structures, irrigation engineers and planners of water resources", the World Meteorological Organization recently published a report which surveys and compares current methodologies, and recommends a number of statistical distributions and estimation procedures. This report is an important step towards the clarification of this difficult topic, but we think that it does not effectively satisfy the needs of practitioners as intended, because it contains some statements which are not statistically justified and which require further discussion. In the present paper we review commonly used procedures for flood frequency estimation, point out some of the reasons for the present state of confusion concerning the advantages and disadvantages of the various methods, and propose the broad lines of a possible comparison strategy. We recommend that the results of such comparisons be discussed in an international forum of experts, with the purpose of attaining a more coherent and broadly accepted strategy for estimating floods.
Entropy-based derivation of generalized distributions for hydrometeorological frequency analysis
Chen, Lu; Singh, Vijay P.
2018-02-01
Frequency analysis of hydrometeorological and hydrological extremes is needed for the design of hydraulic and civil infrastructure facilities as well as for water resources management. A multitude of distributions have been employed for frequency analysis of these extremes, but no single distribution has been accepted as a global standard. Employing entropy theory, this study derived five generalized distributions for frequency analysis, using different kinds of information encoded as constraints. These distributions were the generalized gamma (GG), the generalized beta distribution of the second kind (GB2), and the Halphen type A (Hal-A), Halphen type B (Hal-B), and Halphen type inverse B (Hal-IB) distributions; the GG and GB2 distributions were previously derived by Papalexiou and Koutsoyiannis (2012), while the Halphen family is first derived using entropy theory in this paper. Entropy theory allows the parameters of the distributions to be estimated in terms of the constraints used for their derivation. The distributions were tested using extreme daily and hourly rainfall data. The root mean square error (RMSE) values were very small, indicating that the five generalized distributions fit the extreme rainfall data well. Among them, according to the Akaike information criterion (AIC) values, the GB2 and the Halphen family generally gave a better fit. These generalized distributions are therefore among the best choices for frequency analysis, and the entropy-based derivation opens a new way of performing frequency analysis of hydrometeorological extremes.
Soil nuclide distribution coefficients and their statistical distributions
International Nuclear Information System (INIS)
Sheppard, M.I.; Beals, D.I.; Thibault, D.H.; O'Connor, P.
1984-12-01
Environmental assessments of the disposal of nuclear fuel waste in plutonic rock formations require analysis of the migration of nuclides from the disposal vault to the biosphere. Analyses of nuclide migration via groundwater through the disposal vault, the buffer and backfill, the plutonic rock, and the consolidated and unconsolidated overburden use models requiring distribution coefficients (K_d) to describe the interaction of the nuclides with the geological and man-made materials. This report presents element-specific soil distribution coefficients and their statistical distributions, based on a detailed survey of the literature. Radioactive elements considered were actinium, americium, bismuth, calcium, carbon, cerium, cesium, iodine, lead, molybdenum, neptunium, nickel, niobium, palladium, plutonium, polonium, protactinium, radium, samarium, selenium, silver, strontium, technetium, terbium, thorium, tin, uranium and zirconium. Stable elements considered were antimony, boron, cadmium, tellurium and zinc. Where sufficient data were available, distribution coefficients and their distributions are given for sand, silt, clay and organic soils. Our values are recommended for use in assessments for the Canadian Nuclear Fuel Waste Management Program
Optimizing Power–Frequency Droop Characteristics of Distributed Energy Resources
Energy Technology Data Exchange (ETDEWEB)
Guggilam, Swaroop S.; Zhao, Changhong; Dall'Anese, Emiliano; Chen, Yu Christine; Dhople, Sairaj V.
2018-05-01
This paper outlines a procedure for designing power-frequency droop slopes for distributed energy resources (DERs) installed in distribution networks so that they optimally participate in primary frequency response. In particular, the droop slopes are engineered such that DERs respond in proportion to their power ratings and are not unfairly penalized in power provisioning based on their location in the distribution network. The main contribution of our approach is that a specified level of frequency regulation can be guaranteed at the feeder head while ensuring that the outputs of individual DERs conform to a well-defined notion of fairness. The approach leverages an optimization-based perspective and suitable linearizations of the power-flow equations to embed notions of fairness, and information regarding the physics of the power flows within the distribution network, into the droop slopes. Time-domain simulations of a differential-algebraic-equation model of the 39-bus New England test system, augmented with three instances of the IEEE 37-node distribution network with frequency-sensitive DERs, are provided to validate our approach.
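The basic mechanism being optimized above, a droop slope scaled by the DER's rating so that equally rated units share the response equally, can be sketched as follows. This is the textbook proportional droop law, not the paper's optimized, network-aware design; the 5% per-unit droop and 60 Hz nominal frequency are assumed typical values.

```python
def droop_power(f_hz, p_rated_kw, p_set_kw, r_pu=0.05, f_nom=60.0):
    # Proportional droop: output rises as frequency falls, scaled by the
    # DER's rating so that response is shared in proportion to ratings.
    # r_pu is the per-unit droop (5% here), an assumed typical value.
    delta_f_pu = (f_hz - f_nom) / f_nom
    return p_set_kw - (delta_f_pu / r_pu) * p_rated_kw

# Two DERs at the same under-frequency event (59.9 Hz), different ratings.
p_small = droop_power(59.9, p_rated_kw=5.0, p_set_kw=3.0)
p_large = droop_power(59.9, p_rated_kw=10.0, p_set_kw=6.0)
```

The twice-rated unit contributes exactly twice the incremental power, which is the fairness notion the paper's droop-slope design generalizes to account for network location.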
Global Earthquake Hazard Frequency and Distribution
National Aeronautics and Space Administration — Global Earthquake Hazard Frequency and Distribution is a 2.5 minute grid utilizing Advanced National Seismic System (ANSS) Earthquake Catalog data of actual...
Log-concave Probability Distributions: Theory and Statistical Testing
DEFF Research Database (Denmark)
An, Mark Yuing
1996-01-01
This paper studies the broad class of log-concave probability distributions that arise in the economics of uncertainty and information. For univariate, continuous, and log-concave random variables we prove useful properties without imposing the differentiability of density functions. Discrete and multivariate distributions are also discussed. We propose simple non-parametric testing procedures for log-concavity. The test statistics are constructed to test one of two implications of log-concavity: increasing hazard rates and the new-is-better-than-used (NBU) property. The test for increasing hazard rates is based on normalized spacings of the sample order statistics; the tests for the NBU property fall into the category of Hoeffding's U-statistics.
Global Drought Hazard Frequency and Distribution
National Aeronautics and Space Administration — Global Drought Hazard Frequency and Distribution is a 2.5 minute grid based upon the International Research Institute for Climate Prediction's (IRI) Weighted Anomaly...
Directory of Open Access Journals (Sweden)
Bismark R.D.K. Agbelie
2016-08-01
Full Text Available The present study conducted an empirical highway-segment crash frequency analysis on the basis of fixed-parameters and random-parameters negative binomial models. Using 4 years of data from a total of 158 highway segments, with a total of 11,168 crashes, the results from both models are presented, discussed, and compared. About 58% of the selected variables produced normally distributed parameters across highway segments, while the remainder produced fixed parameters. The presence of a noise barrier along a highway segment would increase mean annual crash frequency by 0.492 for 88.21% of the highway segments and decrease crash frequency for the remaining 11.79%. Similarly, the number of vertical curves per mile along a segment would increase mean annual crash frequency by 0.006 for 84.13% of the highway segments and decrease crash frequency for the remaining 15.87%. Thus, constraining the parameters to be fixed across all highway segments would lead to inaccurate conclusions. Although the estimated parameters from both models were consistent in direction, the magnitudes were significantly different. Of the two models, the random-parameters negative binomial model was found to be statistically superior in evaluating highway-segment crashes compared with the fixed-parameters negative binomial model. On average, the marginal effects from the fixed-parameters negative binomial model were significantly overestimated compared with those from the random-parameters model.
The frequency characteristics of medium voltage distribution system impedances
Directory of Open Access Journals (Sweden)
Liviu Emil Petrean
2009-10-01
Full Text Available In this paper we present the frequency characteristics of the impedances involved in the electrical equivalent circuit of a large medium-voltage distribution system. These impedances influence the propagation of harmonic distortion caused by nonsinusoidal loads. We analyse the case of a large 10 kV urban distribution system that supplies industrial, commercial, and residential customers. The influence of various parameters of the distribution network on the frequency characteristics is presented, in order to assess the interaction of harmonic distortion and the distribution system network.
Energy Technology Data Exchange (ETDEWEB)
Kreth, G.; Hase, J.V.; Finsterle, J.; Cremer, C. [Kirchhoff Institute for Physics, INF, Heidelberg (Germany); Greulich, K. [German Cancer Research Center, INF, Heidelberg (Germany); Cremer, M. [Institute of Anthropology and Human Genetics, Muenchen (Germany)
2003-07-01
To explore the influence of chromosome territory morphology and of the positioning of certain chromosomes in the nuclear volume on aberration frequencies, geometric computer models of all chromosome territories (CTs) in a human cell nucleus were used in the present study to investigate these constraints quantitatively. For this purpose, the geometric representation of a CT in a given nuclear volume was approximated by a linear polymer chain of 500 nm sized spherical 1 Mbp domains connected by entropic spring potentials. The morphology aspect was investigated for the active and inactive X chromosomes of female cells. Assuming a statistical distribution of Xa, Xi and the autosomes, quite good agreement was found between the calculated translocation break frequencies and the observed frequencies determined from Hiroshima A-bomb survivors. As a first step towards accounting for the experimentally observed preferential locations of certain chromosomes, a simulated gene-density-correlated distribution of modeled lymphocyte nuclei was realized. The resulting calculated translocation frequencies were compared with FISH experiments on irradiated lymphocyte cells. (author)
Exact null distributions of quadratic distribution-free statistics for two-way classification
Wiel, van de M.A.
2004-01-01
Abstract We present new techniques for computing exact distributions of 'Friedman-type' statistics. Representing the null distribution by a generating function allows for the use of general, not necessarily integer-valued rank scores. Moreover, we use symmetry properties of the multivariate
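The generating-function method computes such null distributions efficiently; for intuition only, here is a brute-force enumeration (a sketch, not the authors' algorithm) of the exact null distribution of the classical integer-score Friedman statistic for a small design:

```python
from itertools import permutations, product

def friedman_stat(blocks, k):
    """Classical Friedman statistic from per-block rank vectors."""
    n = len(blocks)
    col_sums = [sum(b[j] for b in blocks) for j in range(k)]
    return 12.0 / (n * k * (k + 1)) * sum(r * r for r in col_sums) - 3.0 * n * (k + 1)

k, n = 3, 3                                    # 3 treatments, 3 blocks
rankings = list(permutations(range(1, k + 1)))
counts = {}
for blocks in product(rankings, repeat=n):     # all 6^3 equally likely rank tables
    q = round(friedman_stat(blocks, k), 10)
    counts[q] = counts.get(q, 0) + 1

total = len(rankings) ** n
pmf = {q: c / total for q, c in sorted(counts.items())}  # exact null distribution
```

Replacing the integer ranks 1..k with general rank scores is exactly where the enumeration blows up and the generating-function representation pays off.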
Seasonally adjusted birth frequencies follow the Poisson distribution.
Barra, Mathias; Lindstrøm, Jonas C; Adams, Samantha S; Augestad, Liv A
2015-12-15
Variations in birth frequencies have an impact on activity planning in maternity wards. Previous studies of this phenomenon have commonly included elective births. A Danish study of spontaneous births found that birth frequencies were well modelled by a Poisson process. Somewhat unexpectedly, there were also weekly variations in the frequency of spontaneous births. Another study claimed that birth frequencies follow the Benford distribution. Our objective was to test these results. We analysed 50,017 spontaneous births at Akershus University Hospital in the period 1999-2014. To investigate the Poisson distribution of these births, we plotted their variance over a sliding average. We specified various Poisson regression models, with the number of births on a given day as the outcome variable. The explanatory variables included various combinations of years, months, days of the week and the digit sum of the date. The relationship between the variance and the average fits well with an underlying Poisson process. A Benford distribution was rejected by a goodness-of-fit test. The daily number of spontaneous births follows a Poisson process when monthly and day-of-the-week variation is included. The frequency is highest in summer, towards June and July; Friday and Tuesday stand out as particularly busy days; and the activity level is at its lowest during weekends.
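The variance-to-mean diagnostic the authors use can be sketched as follows (simulated counts, not the Akershus data): for a Poisson process the two quantities should coincide.

```python
import numpy as np

rng = np.random.default_rng(1)
births = rng.poisson(lam=9.0, size=5000)   # hypothetical ~9 spontaneous births/day

# Compare variance and mean over non-overlapping windows of days.
window = 200
means, variances = [], []
for start in range(0, len(births) - window, window):
    chunk = births[start:start + window]
    means.append(chunk.mean())
    variances.append(chunk.var())

ratio = np.mean(variances) / np.mean(means)
print(ratio)  # close to 1 for Poisson counts
```

Overdispersion (ratio well above 1) would indicate clustering beyond what the Poisson regression covariates explain.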
Botbol, Joseph Moses; Evenden, Gerald Ian
1989-01-01
Tables, graphs, and maps are used to portray the frequency characteristics and spatial distribution of manganese oxide-rich phase geochemical data, to characterize the northern Pacific in terms of publicly available nodule geochemical data, and to develop data portrayal methods that will facilitate data analysis. Source data are a subset of the Scripps Institution of Oceanography's Sediment Data Bank. The study area is bounded by 0° N., 40° N., 120° E., and 100° W. and is arbitrarily subdivided into fourteen 20° x 20° geographic subregions. Frequency distributions of trace metals characterized in the original raw data are graphed as ogives, and salient parameters are tabulated. All variables are transformed to enrichment values relative to the median concentration within their host subregions. Scatter plots of all pairs of original variables and their enrichment transforms are provided as an aid to the interpretation of correlations between variables. Gridded spatial distributions of all variables are portrayed as gray-scale maps. The use of tables and graphs to portray frequency statistics, and gray-scale maps to portray spatial distributions, is an effective way to prepare for and facilitate multivariate data analysis.
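The median-based enrichment transform described above is straightforward to sketch; the subregion names and trace-metal values below are purely illustrative:

```python
from statistics import median

# Hypothetical trace-metal concentrations grouped by host subregion.
data = {
    "A": [12.0, 15.0, 9.0, 14.0],
    "B": [40.0, 35.0, 50.0],
}

# Enrichment: each value expressed relative to the median of its own subregion,
# so values from subregions with very different background levels become comparable.
enrichment = {
    region: [v / median(values) for v in values]
    for region, values in data.items()
}
print(enrichment)
```

By construction, the median enrichment within every subregion is 1, which is what makes cross-subregion scatter plots of the transformed variables meaningful.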
A goodness of fit statistic for the geometric distribution
J.A. Ferreira
2003-01-01
We propose a goodness of fit statistic for the geometric distribution and compare it in terms of power, via simulation, with the chi-square statistic. The statistic is based on the Lau-Rao theorem and can be seen as a discrete analogue of the total time on test statistic. The results suggest that the test based on the new statistic is generally superior to the chi-square test.
Goodness-of-Fit Tests for Generalized Normal Distribution for Use in Hydrological Frequency Analysis
Das, Samiran
2018-04-01
The use of the three-parameter generalized normal (GNO) as a hydrological frequency distribution is well recognized, but its application is limited due to the unavailability of popular goodness-of-fit (GOF) test statistics. This study develops popular empirical distribution function (EDF)-based test statistics to investigate the goodness of fit of the GNO distribution. The focus is on the case most relevant to the hydrologist, namely, that in which the parameter values are unknown and estimated from a sample using the method of L-moments. The widely used EDF tests such as Kolmogorov-Smirnov, Cramer-von Mises, and Anderson-Darling (AD) are considered in this study. A modified version of AD, namely, the Modified Anderson-Darling (MAD) test, is also considered, and its performance is assessed against the other EDF tests using a power study that incorporates six specific Wakeby distributions (WA-1, WA-2, WA-3, WA-4, WA-5, and WA-6) as the alternative distributions. The critical values of the proposed test statistics are approximated using Monte Carlo techniques and are summarized in chart and regression-equation form to show their dependence on the shape parameter and sample size. The power study suggests that the AD and a variant of the MAD (MAD-L) are the most powerful tests. Finally, the study presents case studies involving annual maximum flow data from selected gauged sites in Irish and US catchments to show the application of the derived critical values, and recommends further assessments on flow data sets of rivers with various hydrological regimes.
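The Monte Carlo approximation of EDF-test critical values can be illustrated in miniature. The sketch below substitutes the normal distribution with moment (not L-moment) estimates for the paper's GNO/L-moment setup, but the resampling logic is the same: simulate, re-estimate, compute the EDF statistic, and take a percentile.

```python
import math
import random

def ks_stat_normal(xs):
    """Kolmogorov-Smirnov statistic against a normal with parameters
    estimated from the same sample (Lilliefors-style)."""
    n = len(xs)
    mu = sum(xs) / n
    sd = (sum((x - mu) ** 2 for x in xs) / (n - 1)) ** 0.5
    xs = sorted(xs)
    d = 0.0
    for i, x in enumerate(xs, 1):
        f = 0.5 * (1 + math.erf((x - mu) / (sd * math.sqrt(2))))
        d = max(d, abs(i / n - f), abs(f - (i - 1) / n))
    return d

random.seed(7)
n, sims = 20, 2000
stats = sorted(ks_stat_normal([random.gauss(0, 1) for _ in range(n)])
               for _ in range(sims))
crit_95 = stats[int(0.95 * sims)]   # approximate 5% critical value for this n
print(crit_95)
```

Because the parameters are re-estimated in every replicate, the resulting critical value is smaller than the classical (fully specified) KS table value, which is precisely why distribution-specific tables like those derived in the paper are needed.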
Directory of Open Access Journals (Sweden)
Tanja Dromnes
2009-03-01
Full Text Available In this article we examine the title terms of Jane Austen's Pride and Prejudice (1813) with particular attention to their distribution and frequency in the text. Our method is to connect the statistical material gathered on frequency and distribution to a narratological analysis of the terms, with special emphasis on whether they occur within the focalization of the external narrator, or that of character-focalizers. In order to approach this task, we have availed ourselves of the narratological theories of Mieke Bal. We conclude that there is a differentiation among types of focalization in the novel that enhances the thematic structure of match-making. Although Jane Austen wrote and published her major works two centuries ago, they continue to fascinate literary scholars and general readers alike.
Frequency distributions from birth, death, and creation processes.
Bartley, David L; Ogden, Trevor; Song, Ruiguang
2002-01-01
The time-dependent frequency distribution of groups of individuals versus group size was investigated within a continuum approximation, assuming a simplified individual growth, death and creation model. The analogy of the system to a physical fluid exhibiting both convection and diffusion was exploited in obtaining various solutions to the distribution equation. A general solution was approximated through the application of a Green's function. More specific exact solutions were also found to be useful. The solutions were continually checked against the continuum approximation through extensive simulation of the discrete system. Over limited ranges of group size, the frequency distributions were shown to closely exhibit a power-law dependence on group size, as found in many realizations of this type of system, ranging from colonies of mutated bacteria to the distribution of surnames in a given population. As an example, the modeled distributions were successfully fit to the distribution of surnames in several countries by adjusting the parameters specifying growth, death and creation rates.
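A discrete sketch of such a system (growth and creation only, a Simon-type preferential-growth process; the rates are illustrative, not the paper's parameters) reproduces the heavy-tailed group-size frequencies described:

```python
import random
from collections import Counter

random.seed(42)
CREATE_P = 0.05        # probability that a step creates a new group of size 1

individuals = [0]      # group id of every individual; choosing one uniformly
sizes = {0: 1}         # picks a group with probability proportional to its size

for _ in range(20_000):
    if random.random() < CREATE_P:
        g = len(sizes)                     # creation: a new group with one member
        sizes[g] = 1
    else:
        g = random.choice(individuals)     # growth: preferential by group size
        sizes[g] += 1
    individuals.append(g)

freq = Counter(sizes.values())             # number of groups at each group size
print(sorted(freq.items())[:5])            # head of the size-frequency table
```

Over a limited size range the resulting frequencies fall off roughly as a power law, the behaviour the paper fits to surname distributions.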
Farthouat, Juliane; Franco, Ana; Mary, Alison; Delpouve, Julie; Wens, Vincent; Op de Beeck, Marc; De Tiège, Xavier; Peigneux, Philippe
2017-03-01
Humans are highly sensitive to statistical regularities in their environment. This phenomenon, usually referred to as statistical learning, is most often assessed using post-learning behavioural measures that are limited by a lack of sensitivity and do not monitor the temporal dynamics of learning. In the present study, we used magnetoencephalographic frequency-tagged responses to investigate the neural sources and temporal development of the ongoing brain activity that supports the detection of regularities embedded in auditory streams. Participants passively listened to statistical streams in which tones were grouped as triplets, and to random streams in which tones were randomly presented. Results show that during exposure to statistical (vs. random) streams, tritone frequency-related responses reflecting the learning of regularities embedded in the stream increased in the left supplementary motor area and left posterior superior temporal sulcus (pSTS), whereas tone frequency-related responses decreased in the right angular gyrus and right pSTS. Tritone frequency-related responses developed rapidly, reaching significance after 3 min of exposure. These results suggest that the incidental extraction of novel regularities is subtended by a gradual shift from rhythmic activity reflecting individual tone succession toward rhythmic activity synchronised with triplet presentation, and that these rhythmic processes are subtended by distinct neural sources.
New distributions of the statistical time delay of electrical breakdown in nitrogen
International Nuclear Information System (INIS)
Markovic, V Lj; Gocic, S R; Stamenkovic, S N
2006-01-01
Two new distributions of the statistical time delay of electrical breakdown in nitrogen are reported in this paper. The Gaussian and Gauss-exponential distributions of statistical time delay have been obtained on the basis of thousands of time delay measurements on a gas tube with a plane-parallel electrode system. Distributions of the statistical time delay are theoretically founded on the binomial distribution for the occurrence of initiating electrons and described by using simple analytical and numerical models. The shapes of the distributions depend on the electron yields in the interelectrode space originating from residual states. It is shown that a distribution of the statistical time delay changes from exponential and Gauss-exponential to Gaussian due to the influence of residual ionization.
Statistical distributions as applied to environmental surveillance data
International Nuclear Information System (INIS)
Speer, D.R.; Waite, D.A.
1975-09-01
Application of normal, log normal, and Weibull distributions to environmental surveillance data was investigated for approximately 300 nuclide-medium-year-location combinations. Corresponding W test calculations were made to determine the probability of a particular data set falling within the distribution of interest. Conclusions are drawn as to the fit of each data group to the various distributions. The significance of fitting statistical distributions to the data is discussed.
Exact distributions of two-sample rank statistics and block rank statistics using computer algebra
Wiel, van de M.A.
1998-01-01
We derive generating functions for various rank statistics and we use computer algebra to compute the exact null distribution of these statistics. We present various techniques for reducing time and memory space used by the computations. We use the results to write Mathematica notebooks for
Time-Frequency Distribution of Music based on Sparse Wavelet Packet Representations
DEFF Research Database (Denmark)
Endelt, Line Ørtoft
We introduce a new method for generating time-frequency distributions which is particularly useful for the analysis of music signals. The method presented here is based on $\ell_1$ sparse representations of music signals in a redundant wavelet packet dictionary. The representations are found using the minimization methods basis pursuit and best orthogonal basis. Visualizations of the time-frequency distribution are constructed based on a simplified energy distribution in the wavelet packet decomposition. The time-frequency distributions emphasize structured musical content, including non-stationary content.
Statistical models based on conditional probability distributions
International Nuclear Information System (INIS)
Narayanan, R.S.
1991-10-01
We present a formulation of statistical mechanics models based on conditional probability distributions rather than a Hamiltonian. We show that it is possible to realize critical phenomena through this procedure. Closely linked with this formulation is a Monte Carlo algorithm, in which a configuration generated is guaranteed to be statistically independent from any other configuration for all values of the parameters, in particular near the critical point. (orig.)
Damage Detection Based on Cross-Term Extraction from Bilinear Time-Frequency Distributions
Directory of Open Access Journals (Sweden)
Ma Yuchao
2014-01-01
Full Text Available Abundant damage information is implicit in the bilinear time-frequency distribution of structural dynamic signals, which can provide effective support for structural damage identification. Signal time-frequency analysis methods are reviewed, and the characteristics of linear time-frequency distributions and of bilinear time-frequency distributions, typically represented by the Wigner-Ville distribution, are compared. The existence of the cross-term and its application in structural damage detection are demonstrated. A method of extracting the dominant term is proposed which combines the short-time Fourier spectrum and the Wigner-Ville distribution; a two-dimensional time-frequency transformation matrix is then constructed and the complete cross-term is finally extracted, whose distribution characteristics can be applied to structural damage identification. Through theoretical analysis, model experiments and numerical simulation of a girder structure, the change rate of the cross-term amplitude is validated as an indicator of damage location and degree. The effectiveness of the cross-term of the bilinear time-frequency distribution for damage detection is confirmed, and the analytical method of damage identification is made available for use in structural engineering.
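The cross-term the authors exploit can be seen in a minimal discrete Wigner-Ville computation (a sketch with a synthetic two-tone signal and periodic lag indexing for simplicity): the two auto-terms appear at the component frequencies and an oscillating cross-term appears at their midpoint.

```python
import numpy as np

N = 128
n = np.arange(N)
f1, f2 = 16 / N, 32 / N                         # two bin-centred tone frequencies
x = np.exp(2j * np.pi * f1 * n) + np.exp(2j * np.pi * f2 * n)

# Discrete pseudo Wigner-Ville: FFT over the lag m of x[t+m] * conj(x[t-m]).
# With this indexing the frequency axis is scaled by a factor of 2.
W = np.zeros((N, N))
for t in range(N):
    kernel = x[(t + n) % N] * np.conj(x[(t - n) % N])   # periodic wrap at the edges
    W[t] = np.abs(np.fft.fft(kernel))

mean_spec = W.mean(axis=0)
# Auto-terms at bins 2*f1*N = 32 and 2*f2*N = 64; cross-term at (f1+f2)*N = 48.
```

In a real measurement the cross-term oscillates in time, which is what makes the change rate of its amplitude usable as a damage indicator.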
A statistical-dynamical downscaling procedure for global climate simulations
International Nuclear Information System (INIS)
Frey-Buness, A.; Heimann, D.; Sausen, R.; Schumann, U.
1994-01-01
A statistical-dynamical downscaling procedure for global climate simulations is described. The procedure is based on the assumption that any regional climate is associated with a specific frequency distribution of classified large-scale weather situations. The frequency distributions are derived from multi-year episodes of low-resolution global climate simulations. Highly resolved regional distributions of wind and temperature are calculated with a regional model for each class of large-scale weather situation. They are then statistically evaluated by weighting them with the corresponding climate-specific frequency. As an example, the procedure is applied to the Alpine region for a global climate simulation of the present climate. (orig.)
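The final weighting step of the procedure is a frequency-weighted average of per-class regional fields; a minimal sketch with made-up class frequencies and fields:

```python
import numpy as np

# Hypothetical: three classes of large-scale weather situations, each with a
# regional-model field (e.g. temperature at two grid points).
class_freq = np.array([0.5, 0.3, 0.2])      # climate-specific class frequencies
fields = np.array([[1.0, 2.0],
                   [3.0, 1.0],
                   [5.0, 0.0]])             # one row per weather class

# Regional climate = frequency-weighted mean over the class fields.
regional_climate = class_freq @ fields
print(regional_climate)  # [2.4 1.3]
```

A changed climate then only requires re-deriving the class frequencies from the global simulation; the expensive regional-model runs per class are reused.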
Illustrating Sampling Distribution of a Statistic: Minitab Revisited
Johnson, H. Dean; Evans, Marc A.
2008-01-01
Understanding the concept of the sampling distribution of a statistic is essential for the understanding of inferential procedures. Unfortunately, this topic proves to be a stumbling block for students in introductory statistics classes. In efforts to aid students in their understanding of this concept, alternatives to a lecture-based mode of…
A study of outliers in statistical distributions of mechanical properties of structural steels
International Nuclear Information System (INIS)
Oefverbeck, P.; Oestberg, G.
1977-01-01
The safety against failure of pressure vessels can be assessed by statistical methods, so-called probabilistic fracture mechanics. The data base for such estimations is admittedly rather meagre, making it necessary to assume certain conventional statistical distributions. Since the failure rates arrived at are low, for nuclear vessels of the order of 10 - to 10 - per year, the extremes of the variables involved, among other things the mechanical properties of the steel used, are of particular interest. A question sometimes raised is whether outliers, or values exceeding the extremes in the assumed distributions, might occur. In order to explore this possibility, a study has been made of strength values of three qualities of structural steels, available in samples of up to about 12,000. Statistical evaluation of these samples with respect to outliers, using standard methods for this purpose, revealed the presence of such outliers in most cases, with a frequency of occurrence of, typically, a few values per thousand, estimated by the methods described. Obviously, statistical analysis alone cannot be expected to shed any light on the causes of outliers. Thus, the interpretation of these results with respect to their implications for the probabilistic estimation of the integrity of pressure vessels must await further studies of a similar nature, in which the test specimens corresponding to outliers can be recovered and examined metallographically. For the moment the results should be regarded only as a factor to be considered in discussions of the safety of pressure vessels. (author)
Statistical Analysis Of Failure Strength Of Material Using Weibull Distribution
International Nuclear Information System (INIS)
Entin Hartini; Mike Susmikanti; Antonius Sitompul
2008-01-01
In the evaluation of ceramic and glass material strength, a statistical approach is necessary. The strength of ceramic and glass depends on the measure and size distribution of flaws in these materials. The distribution of strength for ductile materials is narrow and close to a Gaussian distribution, while the strength of brittle materials such as ceramic and glass follows the Weibull distribution. The Weibull distribution is an indicator of the failure of material strength resulting from a distribution of flaw sizes. In this paper, the cumulative probability of material strength to failure probability, the cumulative probability of failure versus fracture stress, and the cumulative probability of reliability of the material were calculated. Statistical criteria calculations supporting the strength analysis of silicon nitride material were done utilizing MATLAB. (author)
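The quantities computed in the paper follow directly from the two-parameter Weibull model; a sketch (the paper uses MATLAB, Python here, and the characteristic strength and modulus below are illustrative, not the paper's values):

```python
import math

def weibull_failure_probability(stress, sigma0, m):
    """Cumulative probability of failure at a given stress for a
    two-parameter Weibull strength distribution (sigma0: characteristic
    strength, m: Weibull modulus)."""
    return 1.0 - math.exp(-((stress / sigma0) ** m))

def reliability(stress, sigma0, m):
    """Cumulative probability of survival at a given stress."""
    return 1.0 - weibull_failure_probability(stress, sigma0, m)

# Illustrative parameters for a brittle ceramic: sigma0 = 300 MPa, m = 10.
print(weibull_failure_probability(300.0, 300.0, 10))  # 1 - 1/e, about 0.632
print(reliability(250.0, 300.0, 10))
```

At the characteristic strength the failure probability is always 1 − 1/e, and a larger modulus m gives a narrower strength distribution, i.e. a more predictable brittle material.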
Comparative frequency and allelic distribution of ABO and Rh (D ...
African Journals Online (AJOL)
Background: Allelic distribution of major blood groups (ABO and rhesus) has not been defined in Bangladeshi population. Determinants of blood group frequency in this region have not been studied properly. Aim: To determine ABO and rhesus blood group frequency and allelic distribution in a multiethnic area of ...
Higher moments method for generalized Pareto distribution in flood frequency analysis
Zhou, C. R.; Chen, Y. F.; Huang, Q.; Gu, S. H.
2017-08-01
The generalized Pareto distribution (GPD) has proven to be an ideal distribution for fitting peak-over-threshold series in flood frequency analysis. Several moments-based estimators are applied to estimating the parameters of the GPD. Higher linear moments (LH moments) and higher probability weighted moments (HPWM) are linear combinations of probability weighted moments (PWM); in this study, the relationship between them is explored. A series of statistical experiments and a case study are used to compare their performances. The results show that if the same PWM are used in the LH moments and HPWM methods, the parameters estimated by these two methods are unbiased. In particular, when the same PWM are used, the PWM method (or the HPWM method when the order equals 0) gives results identical to those of the linear moments (L-moments) method in parameter estimation. Additionally, this phenomenon holds when the same order PWM (r ≥ 1) are used in the HPWM and LH moments methods.
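The order-zero (plain L-moment) route to GPD parameters can be sketched using Hosking's standard relations; this illustrates the moments-based estimation discussed, not the authors' code. The sample below is exponential, i.e. a GPD with shape k = 0, location 0 and scale 1, so the fit should recover values near (0, 1, 0).

```python
import numpy as np

def sample_l_moments(x):
    """First two sample L-moments and L-skewness via probability weighted moments."""
    x = np.sort(x)
    n = len(x)
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum(x * (i - 1) / (n - 1)) / n
    b2 = np.sum(x * (i - 1) * (i - 2) / ((n - 1) * (n - 2))) / n
    l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
    return l1, l2, l3 / l2                      # l1, l2, t3

def gpd_from_l_moments(l1, l2, t3):
    """GPD parameters (location xi, scale alpha, shape k) from L-moments,
    using Hosking's parametrization: t3 = (1 - k) / (3 + k)."""
    k = (1 - 3 * t3) / (1 + t3)
    alpha = l2 * (1 + k) * (2 + k)
    xi = l1 - alpha / (1 + k)
    return xi, alpha, k

rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=5000)       # GPD with xi = 0, alpha = 1, k = 0
xi, alpha, k = gpd_from_l_moments(*sample_l_moments(x))
print(xi, alpha, k)
```

The LH-moment and HPWM estimators reweight these same PWMs toward the upper tail, which is the sense in which the paper shows them to coincide when built from identical PWMs.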
Statistical analysis of hydrodynamic cavitation events
Gimenez, G.; Sommer, R.
1980-10-01
The frequency (number of events per unit time) of pressure pulses produced by hydrodynamic cavitation bubble collapses is investigated using statistical methods. The results indicate that this frequency is distributed according to a normal law whose parameters do not evolve in time.
Southard, Rodney E.
2013-01-01
The weather and precipitation patterns in Missouri vary considerably from year to year. In 2008, the statewide average rainfall was 57.34 inches and in 2012, the statewide average rainfall was 30.64 inches. This variability in precipitation and resulting streamflow in Missouri underlies the necessity for water managers and users to have reliable streamflow statistics and a means to compute select statistics at ungaged locations for a better understanding of water availability. Knowledge of surface-water availability is dependent on the streamflow data that have been collected and analyzed by the U.S. Geological Survey for more than 100 years at approximately 350 streamgages throughout Missouri. The U.S. Geological Survey, in cooperation with the Missouri Department of Natural Resources, computed streamflow statistics at streamgages through the 2010 water year, defined periods of drought and defined methods to estimate streamflow statistics at ungaged locations, and developed regional regression equations to compute selected streamflow statistics at ungaged locations. Streamflow statistics and flow durations were computed for 532 streamgages in Missouri and in neighboring States. For streamgages with more than 10 years of record, Kendall’s tau was computed to evaluate for trends in streamflow data. If trends were detected, the variable length method was used to define the period of no trend. Water years were removed from the dataset from the beginning of the record for a streamgage until no trend was detected. Low-flow frequency statistics were then computed for the entire period of record and for the period of no trend if 10 or more years of record were available for each analysis. Three methods are presented for computing selected streamflow statistics at ungaged locations. The first method uses power curve equations developed for 28 selected streams in Missouri and neighboring States that have multiple streamgages on the same streams. Statistical
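The Kendall's tau trend screen applied to each streamgage record can be sketched with a simple O(n²) implementation (tau-a, no tie correction; the annual flow values below are hypothetical):

```python
def kendall_tau(t, y):
    """Kendall's tau-a between a time index t and annual values y."""
    n = len(t)
    concordant = discordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (t[j] - t[i]) * (y[j] - y[i])
            if s > 0:
                concordant += 1
            elif s < 0:
                discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)

years = list(range(2000, 2011))
flows = [12.0, 10.5, 11.0, 9.5, 9.0, 9.8, 8.5, 8.0, 8.8, 7.5, 7.0]  # hypothetical
print(kendall_tau(years, flows))  # negative: a downward trend in annual flows
```

In the study, a significant tau triggers the variable length method: early water years are trimmed until the remaining record shows no trend before low-flow frequency statistics are computed.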
Robustness of S1 statistic with Hodges-Lehmann for skewed distributions
Ahad, Nor Aishah; Yahaya, Sharipah Soaad Syed; Yin, Lee Ping
2016-10-01
Analysis of variance (ANOVA) is a commonly used parametric method to test for differences in means among more than two groups when the populations are normally distributed. ANOVA is highly inefficient under non-normal and heteroscedastic settings. When the assumptions are violated, researchers look for alternatives such as the nonparametric Kruskal-Wallis test or robust methods. This study focused on a flexible method, the S1 statistic, for comparing groups using the median as the location estimator. The S1 statistic was modified by substituting the median with the Hodges-Lehmann estimator, and the default scale estimator with the variance of Hodges-Lehmann and with MADn, to produce two different test statistics for comparing groups. The bootstrap method was used for testing the hypotheses, since the sampling distributions of these modified S1 statistics are unknown. The performance of the proposed statistics in terms of Type I error was measured and compared against the original S1 statistic, ANOVA and Kruskal-Wallis. The proposed procedures show improvement over the original statistic, especially under extremely skewed distributions.
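The Hodges-Lehmann location estimator substituted into S1 is simply the median of all pairwise (Walsh) averages; a minimal sketch:

```python
from statistics import median

def hodges_lehmann(xs):
    """Hodges-Lehmann estimator: median of all Walsh averages (x_i + x_j) / 2
    over pairs with i <= j."""
    return median((xs[i] + xs[j]) / 2
                  for i in range(len(xs))
                  for j in range(i, len(xs)))

print(hodges_lehmann([1, 2, 3]))           # 2.0
print(hodges_lehmann([1, 2, 3, 4, 1000]))  # 3.0 (largely unmoved by the outlier)
```

Its insensitivity to extreme observations, while retaining higher efficiency than the plain median under near-normal data, is what motivates using it inside S1 for skewed distributions.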
Statistical distributions as applied to environmental surveillance data
International Nuclear Information System (INIS)
Speer, D.R.; Waite, D.A.
1976-01-01
Application of normal, lognormal, and Weibull distributions to radiological environmental surveillance data was investigated for approximately 300 nuclide-medium-year-location combinations. The fit of data to distributions was compared through probability plotting (special graph paper provides a visual check) and W test calculations. Results show that 25% of the data fit the normal distribution, 50% fit the lognormal, and 90% fit the Weibull. Demonstration of how to plot each distribution shows that the normal and lognormal distributions are comparatively easy to use, while the Weibull distribution is complicated and difficult to use. Although current practice is to use normal distribution statistics, the normal distribution fit the fewest of the data groups considered in this study.
Energy Technology Data Exchange (ETDEWEB)
Goodman, A L [Tulane Univ., New Orleans, LA (United States)
1992-08-01
Statistical orientation fluctuations are calculated with two alternative assumptions: the rotational frequency remains constant as the shape orientation fluctuates; and, the average angular momentum remains constant as the shape orientation fluctuates. (author). 2 refs., 3 figs.
Simple method of generating and distributing frequency-entangled qudits
Jin, Rui-Bo; Shimizu, Ryosuke; Fujiwara, Mikio; Takeoka, Masahiro; Wakabayashi, Ryota; Yamashita, Taro; Miki, Shigehito; Terai, Hirotaka; Gerrits, Thomas; Sasaki, Masahide
2016-11-01
High-dimensional, frequency-entangled photonic quantum bits (qudits for d dimensions) are promising resources for quantum information processing in an optical fiber network and can also be used to improve channel capacity and security for quantum communication. However, up to now it has remained challenging to prepare high-dimensional frequency-entangled qudits in experiments, due to technical limitations. Here we propose and experimentally implement a novel method for the simple generation of frequency-entangled qudits with d > 10 without the use of any spectral filters or cavities. The generated state is distributed over 15 km in total length. This scheme combines the technique of spectral engineering of biphotons generated by spontaneous parametric down-conversion with the technique of spectrally resolved Hong-Ou-Mandel interference. Our frequency-entangled qudits will enable quantum cryptographic experiments with enhanced performance. This distribution of distinct entangled frequency modes may also be useful for improved metrology, quantum remote synchronization, and fundamental tests of stronger violations of local realism.
Encryption of covert information into multiple statistical distributions
International Nuclear Information System (INIS)
Venkatesan, R.C.
2007-01-01
A novel strategy to encrypt covert information (code) via unitary projections into the null spaces of ill-conditioned eigenstructures of multiple host statistical distributions, inferred from incomplete constraints, is presented. The host pdf's are inferred using the maximum entropy principle. The projection of the covert information is dependent upon the pdf's of the host statistical distributions. The security of the encryption/decryption strategy is based on the extreme instability of the encoding process. A self-consistent procedure to derive keys for both symmetric and asymmetric cryptography is presented. The advantages of using a multiple pdf model to achieve encryption of covert information are briefly highlighted. Numerical simulations exemplify the efficacy of the model
Derivation of some new distributions in statistical mechanics using maximum entropy approach
Directory of Open Access Journals (Sweden)
Ray Amritansu
2014-01-01
Full Text Available The maximum entropy principle has been used earlier to derive the Bose-Einstein (B.E.), Fermi-Dirac (F.D.) and Intermediate Statistics (I.S.) distributions of statistical mechanics. The central idea of these distributions is to predict the distribution of the microstates, which are the particles of the system, on the basis of the knowledge of some macroscopic data. The latter information is specified in the form of some simple moment constraints. One distribution differs from the other in the way in which the constraints are specified. In the present paper, we have derived some new distributions, similar to the B.E. and F.D. distributions of statistical mechanics, by using the maximum entropy principle. Some proofs of the B.E. and F.D. distributions are shown, and at the end some new results are discussed.
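As a reminder of the standard result this derivation builds on (sketched here in the usual textbook form, not the paper's notation): maximizing the entropy subject to fixed particle number and energy,

```latex
\max_{\{n_i\}} S
\quad \text{subject to} \quad
\sum_i n_i = N, \qquad \sum_i n_i \varepsilon_i = E,
\qquad \Longrightarrow \qquad
n_i = \frac{g_i}{e^{\alpha + \beta \varepsilon_i} \mp 1},
```

where the minus sign gives the Bose-Einstein and the plus sign the Fermi-Dirac occupation numbers, with $\alpha$ and $\beta$ the Lagrange multipliers of the two constraints. Varying how the constraints (and the entropy functional) are specified is what produces the intermediate and the new distributions of the paper.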
International Nuclear Information System (INIS)
Zhao, J; Tang, J; Wang, K W
2008-01-01
The frequency-shift-based damage detection method offers advantages such as global detection capability and easy implementation, but it also suffers from drawbacks that include low detection accuracy and sensitivity and the difficulty of identifying damage using a small number of measurable frequencies. Moreover, the damage detection/identification performance is inevitably affected by uncertainty/variations in the baseline model. In this research, we investigate an enhanced statistical damage identification method using tunable piezoelectric transducer circuitry. The tunable piezoelectric transducer circuitry can lead to much richer information on frequency shifts (before and after damage occurrence). The circuitry elements, meanwhile, can be directly and accurately measured and thus can be considered uncertainty-free. A statistical damage identification algorithm is formulated which can identify both the mean and variance of the elemental property change. Our analysis indicates that the integration of the tunable piezoelectric transducer circuitry can significantly enhance the robustness of the frequency-shift-based damage identification approach under uncertainty and noise.
Directory of Open Access Journals (Sweden)
V. E. Merzlikin
2015-01-01
Full Text Available The article deals with the search for an optimal estimation of the parameters of the homogenization process for dairy products. A theoretical basis is provided for the relationship between the relaxation time of the fat globules and the attenuation coefficient of ultrasonic oscillations in dairy products. It is suggested that the mass distribution of fat globules can be calculated from the measured acoustic properties of milk, and studies were carried out to test this hypothesis. A morphological analysis procedure was applied to milk samples homogenized at different pressures, as well as to non-homogenized samples. As a result, histograms of the fat globule distribution as a function of homogenization pressure were obtained. Acoustic studies were also performed to obtain the frequency characteristics of the loss modulus as a function of homogenization pressure. For further analysis, the dependences were approximated using the statistical moments of the distributions, and the approximation parameters for the fat globule distribution and for the loss modulus versus homogenization pressure were obtained. The hypothesized relationship between these two sets of parameters was then tested. Correlation analysis showed a clear dependence of the first and second statistical moments of the distributions on the homogenization pressure, consistent with the physical meaning of the first two moments of a statistical distribution. Correlation analysis was also carried out between the statistical moments of the fat globule distribution and the moments of the loss modulus. It is concluded that ultrasonic testing of the degree of homogenization and of the mass distribution of fat globules in milk products is possible.
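The first two statistical moments used for the approximation are the mean and variance of the size histogram; a sketch with hypothetical fat-globule diameters and counts:

```python
def histogram_moments(sizes, counts):
    """First raw moment (mean) and second central moment (variance)
    of a histogram given bin centres and bin counts."""
    total = sum(counts)
    mean = sum(s * c for s, c in zip(sizes, counts)) / total
    var = sum(c * (s - mean) ** 2 for s, c in zip(sizes, counts)) / total
    return mean, var

# Hypothetical fat-globule diameters (micrometres) and counts per bin.
diameters = [0.5, 1.0, 1.5, 2.0]
counts = [10, 40, 30, 20]
print(histogram_moments(diameters, counts))
```

Tracking how these two moments shift with homogenization pressure is what allows the size-distribution parameters to be correlated against the acoustically measured loss modulus.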
Milewski, Emil G
2012-01-01
REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As their name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams or doing homework, and will remain a lasting reference source for students, teachers, and professionals. Topics covered in Statistics I include frequency distributions, numerical methods of describing data, measures of variability, parameters of distributions, probability theory, and distributions.
Alternative derivations of the statistical mechanical distribution laws.
Wall, F T
1971-08-01
A new approach is presented for the derivation of statistical mechanical distribution laws. The derivations are accomplished by minimizing the Helmholtz free energy under constant temperature and volume, instead of maximizing the entropy under constant energy and volume. An alternative method involves stipulating equality of chemical potential, or equality of activity, for particles in different energy levels. This approach leads to a general statement of distribution laws applicable to all systems for which thermodynamic probabilities can be written. The methods also avoid use of the calculus of variations, Lagrangian multipliers, and Stirling's approximation for the factorial. The results are applied specifically to Boltzmann, Fermi-Dirac, and Bose-Einstein statistics. The special significance of chemical potential and activity is discussed for microscopic systems.
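The "equality of activity" statement can be checked numerically: in a Boltzmann distribution, the activity n_i·exp(E_i/kT) is the same for every level. The following is a sketch of that check, not the paper's derivation; the level energies are arbitrary:

```python
import math

def boltzmann_populations(energies, kT, n_total=1.0):
    """Populations n_i proportional to exp(-E_i/kT), summing to n_total."""
    weights = [math.exp(-e / kT) for e in energies]
    z = sum(weights)  # partition function
    return [n_total * w / z for w in weights]

kT = 1.0
energies = [0.0, 1.0, 2.0, 3.0]   # arbitrary level energies, in units of kT
pops = boltzmann_populations(energies, kT)

# 'Activity' n_i * exp(+E_i/kT) should come out identical for every level.
activities = [n * math.exp(e / kT) for n, e in zip(pops, energies)]
```

Because n_i ∝ exp(−E_i/kT), every activity equals n_total/Z, so equality of activity across levels is equivalent to the Boltzmann form.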
A NEW STATISTICAL PERSPECTIVE TO THE COSMIC VOID DISTRIBUTION
International Nuclear Information System (INIS)
Pycke, J-R; Russell, E.
2016-01-01
In this study, we obtain the size distribution of voids as a three-parameter redshift-independent log-normal void probability function (VPF) directly from the Cosmic Void Catalog (CVC). Although many statistical models of void distributions are based on the counts in randomly placed cells, the log-normal VPF that we obtain here is independent of the shape of the voids due to the parameter-free void finder of the CVC. We use three void populations drawn from the CVC generated by the Halo Occupation Distribution (HOD) Mocks, which are tuned to three mock SDSS samples to investigate the void distribution statistically and to investigate the effects of the environments on the size distribution. As a result, it is shown that void size distributions obtained from the HOD Mock samples are satisfied by the three-parameter log-normal distribution. In addition, we find that there may be a relation between the hierarchical formation, skewness, and kurtosis of the log-normal distribution for each catalog. We also show that the shape of the three-parameter distribution from the samples is strikingly similar to the galaxy log-normal mass distribution obtained from numerical studies. This similarity between void size and galaxy mass distributions may possibly indicate evidence of nonlinear mechanisms affecting both voids and galaxies, such as large-scale accretion and tidal effects. Considering the fact that in this study, all voids are generated by galaxy mocks and show hierarchical structures in different levels, it may be possible that the same nonlinear mechanisms of mass distribution affect the void size distribution.
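A two-parameter log-normal fit of the kind underlying such a VPF can be sketched by fitting in log space; skewness then follows analytically from the log-space σ. An illustrative stdlib sketch on synthetic data (not CVC voids; the true parameters below are arbitrary):

```python
import math, random, statistics

def fit_lognormal(sizes):
    """Fit (mu, sigma) of ln(size) -- the two log-normal parameters."""
    logs = [math.log(s) for s in sizes]
    return statistics.fmean(logs), statistics.stdev(logs)

def lognormal_skewness(sigma):
    """Analytic skewness of a log-normal with log-space std sigma."""
    w = math.exp(sigma ** 2)
    return (w + 2.0) * math.sqrt(w - 1.0)

random.seed(0)
true_mu, true_sigma = 0.0, 0.5
sample = [random.lognormvariate(true_mu, true_sigma) for _ in range(20000)]
mu_hat, sigma_hat = fit_lognormal(sample)
```

The analytic skewness (and similarly kurtosis) depending only on σ is what makes relations between hierarchical structure and the shape parameters straightforward to examine.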
The frequency-independent control method for distributed generation systems
DEFF Research Database (Denmark)
Naderi, Siamak; Pouresmaeil, Edris; Gao, Wenzhong David
2012-01-01
In this paper a novel frequency-independent control method suitable for distributed generation (DG) is presented. This strategy is derived based on the abc/αβ and abc/dq transformations of the ac system variables. The active and reactive currents injected by the DG are controlled …
DEFF Research Database (Denmark)
Crosby, N.; Vilmer, N.; Lund, Niels
1998-01-01
Solar flare observations in the deka-keV range are performed by the WATCH experiment on board the GRANAT satellite. The WATCH experiment is presented, including the energy calibration as applied in the present work; flares can be observed at energies as low as 10 keV. The creation of the solar burst catalogue covering two years of observation is described and some examples of solar observations are given. A statistical study is performed on the total WATCH solar database and frequency distributions are built on measured X-ray flare parameters. It is also investigated how the properties of these frequency distributions behave when subgroups of events defined by different ranges … The estimated energy releases in the flares presented here are found to extend below the range of hard X-ray flares which were previously studied by the ISEE-3 and HXRBS/SMM detectors. The X-ray emitting component cannot be exclusively explained …
Kroese, A.H.; van der Meulen, E.A.; Poortema, Klaas; Schaafsma, W.
1995-01-01
The making of statistical inferences in distributional form is conceptually complicated because the epistemic 'probabilities' assigned are mixtures of fact and fiction. In this respect they are essentially different from 'physical' or 'frequency-theoretic' probabilities. The distributional form is …
The statistics of low frequency radio interference at the Murchison Radio-astronomy Observatory
Sokolowski, Marcin; Wayth, Randall B.; Lewis, Morgan
2016-01-01
We characterize the low-frequency radio-frequency interference (RFI) environment at the Murchison Radio-astronomy Observatory (MRO), the location selected for the low-frequency component of the Square Kilometre Array. Data were collected with the BIGHORNS instrument, located at the MRO, which records a contiguous bandwidth between 70 and 300 MHz, from November 2014 to March 2015 inclusive. The data were processed to identify RFI, and we describe a series of statistics in both the time and …
Experimental investigation of statistical models describing distribution of counts
International Nuclear Information System (INIS)
Salma, I.; Zemplen-Papp, E.
1992-01-01
The binomial, Poisson and modified Poisson models which are used for describing the statistical nature of the distribution of counts are compared theoretically, and conclusions for application are considered. The validity of the Poisson and the modified Poisson statistical distribution for observing k events in a short time interval is investigated experimentally for various measuring times. The experiments to measure the influence of significant radioactive decay were performed with 89mY (T1/2 = 16.06 s), using a multichannel analyser (4096 channels) in the multiscaling mode. According to the results, Poisson statistics describe the counting experiment for short measuring times (up to T = 0.5 T1/2) and its application is recommended. However, analysis of the data demonstrated, with confidence, that for long measurements (T ≥ T1/2) the Poisson distribution is not valid and the modified Poisson function is preferable. The practical implications in calculating uncertainties and in optimizing the measuring time are discussed. Differences between the standard deviations evaluated on the basis of the Poisson and binomial models are especially significant for experiments with long measuring times (T/T1/2 ≥ 2) and/or large detection efficiency (ε > 0.30). Optimization of the measuring time for paired observations yields the same solution for either the binomial or the Poisson distribution. (orig.)
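The binomial-versus-Poisson discrepancy quantified here has a compact form: for N₀ atoms observed for time T with detection efficiency ε, the detected counts are binomial with per-atom probability q = ε(1 − 2^(−T/T1/2)), so the two models' standard deviations differ by a factor √(1 − q). A sketch under that standard counting model (the parameter values are illustrative, not the paper's):

```python
import math

def count_std_devs(n0, t_over_thalf, eff):
    """Std dev of detected counts: exact binomial vs Poisson approximation,
    for n0 atoms observed for T = t_over_thalf half-lives with efficiency eff."""
    p_decay = 1.0 - 2.0 ** (-t_over_thalf)   # probability an atom decays within T
    q = eff * p_decay                        # probability it decays AND is detected
    sigma_binomial = math.sqrt(n0 * q * (1.0 - q))
    sigma_poisson = math.sqrt(n0 * q)        # Poisson: variance = mean
    return sigma_binomial, sigma_poisson

# Long measurement (T = 2 T1/2, eff = 0.5): the two models disagree noticeably.
sb_long, sp_long = count_std_devs(10000, 2.0, 0.5)
# Short measurement (T = 0.1 T1/2): nearly identical, Poisson is adequate.
sb_short, sp_short = count_std_devs(10000, 0.1, 0.5)
```

This reproduces the abstract's qualitative conclusion: the difference matters only when both T/T1/2 and ε are large.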
Prototyping a Distributed Information Retrieval System That Uses Statistical Ranking.
Harman, Donna; And Others
1991-01-01
Built using a distributed architecture, this prototype distributed information retrieval system uses statistical ranking techniques to provide better service to the end user. Distributed architecture was shown to be a feasible alternative to centralized or CD-ROM information retrieval, and user testing of the ranking methodology showed both…
Football fever: goal distributions and non-Gaussian statistics
Bittner, E.; Nußbaumer, A.; Janke, W.; Weigel, M.
2009-02-01
Analyzing football score data with statistical techniques, we investigate how the not purely random, but highly co-operative nature of the game is reflected in averaged properties such as the probability distributions of scored goals for the home and away teams. As it turns out, especially the tails of the distributions are not well described by the Poissonian or binomial model resulting from the assumption of uncorrelated random events. Instead, a good effective description of the data is provided by less basic distributions such as the negative binomial one or the probability densities of extreme value statistics. To understand this behavior from a microscopic point of view, however, no waiting time problem or extremal process need be invoked. Instead, modifying the Bernoulli random process underlying the Poissonian model to include a simple component of self-affirmation seems to describe the data surprisingly well and allows us to understand the observed deviation from Gaussian statistics. The phenomenological distributions used before can be understood as special cases within this framework. We analyzed historical football score data from many leagues in Europe as well as from international tournaments, including data from all past tournaments of the “FIFA World Cup” series, and found the proposed models to be applicable rather universally. In particular, here we analyze the results of the German women’s premier football league and consider the two separate German men’s premier leagues in the East and West during the cold war times as well as the unified league after 1990 to see how scoring in football and the component of self-affirmation depend on cultural and political circumstances.
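The heavier-than-Poisson tail reported here can be illustrated by comparing a Poisson and a negative binomial law of equal mean (the mean and shape below are illustrative choices, not fitted values from the paper):

```python
import math

def poisson_pmf(k, lam):
    """Poisson probability of exactly k goals with mean lam."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

def negbin_pmf(k, r, p):
    """Negative binomial probability of exactly k goals, integer shape r."""
    return math.comb(k + r - 1, k) * p ** r * (1.0 - p) ** k

mean_goals = 1.5                 # illustrative mean goals per match
r = 2                            # illustrative negative-binomial shape
p = r / (r + mean_goals)         # chosen so the NB mean r(1-p)/p equals mean_goals

# Probability of observing 6 or more goals under each model:
tail_poisson = 1.0 - sum(poisson_pmf(k, mean_goals) for k in range(6))
tail_negbin = 1.0 - sum(negbin_pmf(k, r, p) for k in range(6))
```

With the same mean, the negative binomial places markedly more probability on high-scoring games, which is the tail behavior the abstract attributes to the co-operative (self-affirming) component.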
Nishino, Ko; Lombardi, Stephen
2011-01-01
We introduce a novel parametric bidirectional reflectance distribution function (BRDF) model that can accurately encode a wide variety of real-world isotropic BRDFs with a small number of parameters. The key observation we make is that a BRDF may be viewed as a statistical distribution on a unit hemisphere. We derive a novel directional statistics distribution, which we refer to as the hemispherical exponential power distribution, and model real-world isotropic BRDFs as mixtures of it. We derive a canonical probabilistic method for estimating the parameters, including the number of components, of this novel directional statistics BRDF model. We show that the model captures the full spectrum of real-world isotropic BRDFs with high accuracy, but a small footprint. We also demonstrate the advantages of the novel BRDF model by showing its use for reflection component separation and for exploring the space of isotropic BRDFs.
Statistical Inference for a Class of Multivariate Negative Binomial Distributions
DEFF Research Database (Denmark)
Rubak, Ege H.; Møller, Jesper; McCullagh, Peter
This paper considers statistical inference procedures for a class of models for positively correlated count variables called -permanental random fields, which can be viewed as a family of multivariate negative binomial distributions. Their appealing probabilistic properties have earlier been studied in the literature, while this is the first statistical paper on -permanental random fields. The focus is on maximum likelihood estimation, maximum quasi-likelihood estimation and maximum composite likelihood estimation based on uni- and bivariate distributions. Furthermore, new results …
Asymptotic Time Averages and Frequency Distributions
Directory of Open Access Journals (Sweden)
Muhammad El-Taha
2016-01-01
Full Text Available Consider an arbitrary nonnegative deterministic process (in a stochastic setting {X(t), t ≥ 0} is a fixed realization, i.e. a sample path of the underlying stochastic process) with state space S = (−∞, ∞). Using a sample-path approach, we give necessary and sufficient conditions for the long-run time average of a measurable function of the process to be equal to the expectation taken with respect to the same measurable function of its long-run frequency distribution. The results are further extended to allow an unrestricted parameter (time) space. Examples are provided to show that our condition is not superfluous and that it is weaker than uniform integrability. The case of discrete-time processes is also considered. The relationship to previously known sufficient conditions, usually given in stochastic settings, is also discussed. Our approach is applied to regenerative processes and an extension of a well-known result is given. For researchers interested in sample-path analysis, our results give them the choice to work with the time average of a process or its frequency distribution function and go back and forth between the two under a mild condition.
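For a finite discrete-time sample path the equality of the two quantities is elementary and easy to verify numerically; the subtlety the paper addresses is the passage to the long-run limit. A stdlib sketch with an arbitrary path:

```python
from collections import Counter

def time_average(path, f):
    """Time average of f along a (finite) sample path."""
    return sum(f(x) for x in path) / len(path)

def freq_expectation(path, f):
    """Expectation of f under the path's empirical frequency distribution."""
    n = len(path)
    return sum(f(state) * count / n for state, count in Counter(path).items())

path = [0, 1, 1, 2, 0, 2, 2, 1, 0, 1]   # an arbitrary fixed sample path

def square(x):
    return x * x
```

Both computations sum the same terms, merely grouped differently by state, so they agree exactly; the paper's conditions govern when this bookkeeping identity survives as the horizon grows without bound.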
Fitting the Statistical Distribution for Daily Rainfall in Ibadan, Based ...
African Journals Online (AJOL)
PROF. O. E. OSUAGWU
2013-06-01
This paper presents several types of statistical distributions to describe rainfall distribution in the Ibadan metropolis over a period of 30 years. The exponential, gamma, normal and Poisson distributions are compared to identify the optimal model for daily rainfall amount based on data recorded at rain …
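The exponential and gamma candidates mentioned here can be fitted by elementary maximum-likelihood and method-of-moments formulas; a sketch with made-up daily rainfall amounts (not the Ibadan data):

```python
import statistics

def fit_exponential(data):
    """Maximum-likelihood exponential rate: 1 / sample mean."""
    return 1.0 / statistics.fmean(data)

def fit_gamma_moments(data):
    """Method-of-moments gamma fit: shape = m^2/v, scale = v/m."""
    m = statistics.fmean(data)
    v = statistics.pvariance(data)
    return m * m / v, v / m

rain = [0.5, 1.2, 0.1, 3.4, 2.2, 0.8, 5.1, 0.3, 1.9, 0.7]  # invented amounts (mm)
rate = fit_exponential(rain)
shape, scale = fit_gamma_moments(rain)
```

By construction the fitted gamma reproduces the sample mean (shape × scale = m), which is a quick sanity check on the moment equations.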
Galaxies distribution in the universe: large-scale statistics and structures
International Nuclear Information System (INIS)
Maurogordato, Sophie
1988-01-01
This research thesis addresses the distribution of galaxies in the Universe, and more particularly large-scale statistics and structures. Based on an assessment of the main statistical techniques in use, the author outlines the need for tools complementary to correlation functions in order to characterise the distribution. She introduces a new indicator: the probability that a volume randomly placed in the distribution is empty. This allows a characterisation of void properties at the working scales (up to 10 h⁻¹ Mpc) in the Harvard-Smithsonian Center for Astrophysics Redshift Survey (CfA catalog). A systematic analysis of the statistical properties of different sub-samples has then been performed with respect to size and location, luminosity class, and morphological type. This analysis is then extended to different scenarios of structure formation. A programme of radial velocity measurements based on observations allows the determination of possible relationships between apparent structures. The author also presents results of the search for southern extensions of the Perseus supercluster. [fr]
Statistical test for the distribution of galaxies on plates
International Nuclear Information System (INIS)
Garcia Lambas, D.
1985-01-01
A statistical test for the distribution of galaxies on plates is presented. We apply the test to synthetic astronomical plates obtained by means of numerical simulation (Garcia Lambas and Sersic 1983), with three different models for the 3-dimensional distribution. Comparison with an observational plate suggests the presence of filamentary structure. (author)
Statistical analysis of partial reduced width distributions
International Nuclear Information System (INIS)
Tran Quoc Thuong.
1973-01-01
The aim of this study was to develop rigorous methods for analysing experimental event distributions according to a χ² law and to check whether the number of degrees of freedom ν is compatible with the value 1 for the reduced neutron width distribution. Two statistical methods were used (the maximum-likelihood method and the method of moments); it was shown, in a few particular cases, that ν is compatible with 1. The difference between ν and 1, if it exists, should not exceed 3%. These results confirm the validity of the compound nucleus model. [fr]
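The method-of-moments side of such an analysis has a compact form: for reduced widths scaled to unit mean, a χ² distribution with ν degrees of freedom has variance 2/ν, so ν can be estimated as 2 divided by the sample variance. A sketch under that assumption, with Porter-Thomas (ν = 1) data simulated as squared standard normals:

```python
import random, statistics

def estimate_nu(widths):
    """Method-of-moments estimate of chi-square degrees of freedom:
    after scaling to unit mean, Var = 2/nu, hence nu = 2/Var."""
    m = statistics.fmean(widths)
    scaled = [w / m for w in widths]
    return 2.0 / statistics.pvariance(scaled)

random.seed(1)
# Porter-Thomas case: squared standard normals correspond to nu = 1.
widths = [random.gauss(0.0, 1.0) ** 2 for _ in range(50000)]
nu_hat = estimate_nu(widths)
```

With a large simulated sample the estimate lands close to 1, the value the paper's experimental analysis was testing for.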
Word frequencies: A comparison of Pareto type distributions
Wiegand, Martin; Nadarajah, Saralees; Si, Yuancheng
2018-03-01
Mehri and Jamaati (2017) [18] used Zipf's law to model word frequencies in Holy Bible translations for one hundred live languages. We compare the fit of Zipf's law to a number of Pareto type distributions. The latter distributions are shown to provide the best fit, as judged by a number of comparative plots and error measures. The fit of Zipf's law appears generally poor.
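A common way to judge the fit of Zipf's law is the least-squares slope of log frequency against log rank, which should be close to −1. A sketch on an exactly Zipfian synthetic frequency list (not the Bible corpus data):

```python
import math

def zipf_slope(frequencies):
    """Least-squares slope of log(frequency) vs log(rank); Zipf predicts -1."""
    freqs = sorted(frequencies, reverse=True)
    xs = [math.log(r) for r in range(1, len(freqs) + 1)]
    ys = [math.log(f) for f in freqs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return sxy / sxx

# An exactly Zipfian list: f_r = C / r, so the log-log slope is exactly -1.
ideal = [1000.0 / r for r in range(1, 101)]
```

Real word-frequency data typically bend away from this straight line in the tails, which is the kind of departure that motivates the richer Pareto-type fits compared in the paper.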
Use of commercial vessels in survey augmentation: the size-frequency distribution
Directory of Open Access Journals (Sweden)
Eric N. Powell
2006-09-01
Full Text Available The trend towards use of commercial vessels to enhance survey data requires assessment of the advantages and limitations of various options for their use. One application is to augment information on size-frequency distributions obtained in multispecies trawl surveys where stratum boundaries and sampling density are not optimal for all species. Analysis focused on ten recreationally and commercially important species: bluefish, butterfish, Loligo squid, weakfish, summer flounder, winter flounder, silver hake (whiting, black sea bass, striped bass, and scup (porgy. The commercial vessel took 59 tows in the sampled domain south of Long Island, New York and the survey vessel 18. Black sea bass, Loligo squid, and summer flounder demonstrated an onshore-offshore gradient such that smaller fish were caught disproportionately inshore and larger fish offshore. Butterfish, silver hake, and weakfish were characterized by a southwest-northeast gradient such that larger fish were caught disproportionately northeast of the southwestern-most sector. All sizes of scup, striped bass, and bluefish were caught predominately inshore. Winter flounder were caught predominately offshore. The commercial vessel was characterized by an increased frequency of large catches for most species. Consequently, patchiness was assayed to be higher by the commercial vessel in nearly all cases. The size-frequency distribution obtained by the survey vessel for six of the ten species, bluefish, butterfish, Loligo squid, summer flounder, weakfish, and silver hake, could not be obtained by chance from the size-frequency distribution obtained by the commercial vessel. The difference in sample density did not significantly influence the size-frequency distribution. Of the six species characterized by significant differences in size-frequency distribution between boats, all but one was patchy at the population level and all had one or more size classes so characterized. Although the
Fast Grid Frequency Support from Distributed Inverter-Based Resources
Energy Technology Data Exchange (ETDEWEB)
Hoke, Anderson F [National Renewable Energy Laboratory (NREL), Golden, CO (United States)
2018-05-04
This presentation summarizes power hardware-in-the-loop testing performed to evaluate the ability of distributed inverter-coupled generation to support grid frequency on the fastest time scales. The research found that distributed PV inverters and other DERs can effectively support the grid on sub-second time scales.
Statistical Multipath Model Based on Experimental GNSS Data in Static Urban Canyon Environment
Directory of Open Access Journals (Sweden)
Yuze Wang
2018-04-01
Full Text Available A deep understanding of multipath characteristics is essential to design signal simulators and receivers in global navigation satellite system applications. As a new constellation is deployed and more applications occur in the urban environment, the statistical multipath models of the navigation signal need further study. In this paper, we present statistical distribution models of multipath time delay, multipath power attenuation, and multipath fading frequency based on experimental data in the urban canyon environment. The raw data on multipath characteristics are obtained by processing real navigation signals to study the statistical distribution. By fitting the statistical data, it is shown that the probability distribution of time delay follows a gamma distribution, which is related to the waiting time of Poisson-distributed events. The fading frequency follows an exponential distribution, and the mean multipath power attenuation decreases linearly with increasing time delay. In addition, the detailed statistical characteristics for satellites at different elevations and in different orbits are studied, and the parameters of each distribution are quite different. The research results give useful guidance for navigation signal simulator and receiver designers.
Basic statistics with Microsoft Excel: a review.
Divisi, Duilio; Di Leonardo, Gabriella; Zaccagna, Gino; Crisci, Roberto
2017-06-01
The scientific world is enriched daily with new knowledge, due to new technologies and continuous discoveries. Mathematical functions explain statistical concepts, particularly those of mean, median and mode, along with those of frequency and frequency distribution associated with histograms and graphical representations, determining elaborative processes on the basis of spreadsheet operations. The aim of the study is to highlight the mathematical basis of the statistical models that regulate the operation of spreadsheets in Microsoft Excel.
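The summary statistics and frequency table that the review attributes to spreadsheet functions can be reproduced directly with Python's statistics module (the data are invented; the Excel formula names in comments are given only for comparison):

```python
import statistics
from collections import Counter

grades = [70, 75, 75, 80, 80, 80, 85, 90, 90, 100]   # invented data

mean = statistics.fmean(grades)      # Excel: =AVERAGE(range)
median = statistics.median(grades)   # Excel: =MEDIAN(range)
mode = statistics.mode(grades)       # Excel: =MODE(range)
freq = Counter(grades)               # roughly Excel: =FREQUENCY(range, bins)
```

The frequency table is the object one would plot as a histogram; the mode is simply its most common entry.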
Statistical γ-ray multiplicity distributions in Dy and Yb nuclei
International Nuclear Information System (INIS)
Tveter, T.S.; Bergholt, L.; Guttormsen, M.; Rekstad, J.
1994-03-01
The statistical γ-ray multiplicity distributions following the reactions ¹⁶³Dy(³He,αxn)¹⁶²⁻ˣDy and ¹⁷³Yb(³He,αxn)¹⁷²⁻ˣYb have been studied. The mean value and standard deviation have been extracted as functions of excitation energy. The method is based on the probability distribution of k-fold events, where an α-particle is observed in coincidence with signals in k γ-ray detectors. Techniques for isolating statistical γ-rays and subtracting random background, cross-talk and neutron contributions are discussed. 22 refs., 10 figs., 3 tabs
Statistical thermodynamics and the size distributions of tropical convective clouds.
Garrett, T. J.; Glenn, I. B.; Krueger, S. K.; Ferlay, N.
2017-12-01
Parameterizations for sub-grid cloud dynamics are commonly developed by using fine-scale modeling or measurements to explicitly resolve the mechanistic details of clouds to the best extent possible, and then formulating these behaviors in terms of a cloud state for use within a coarser grid. A second approach is to invoke physical intuition and some very general theoretical principles from equilibrium statistical thermodynamics. This second approach is quite widely used elsewhere in the atmospheric sciences: for example, to explain the heat capacity of air, blackbody radiation, or even the density profile of air in the atmosphere. Here we describe how entrainment and detrainment across cloud perimeters is limited by the amount of available air and the range of moist static energy in the atmosphere, and how this constrains cloud perimeter distributions to a power law with a −1 exponent along isentropes and to a Boltzmann distribution across isentropes. Further, the total cloud perimeter density in a cloud field is directly tied to the buoyancy frequency of the column. These simple results are shown to be reproduced within a complex dynamic simulation of a tropical convective cloud field and in passive satellite observations of 3D cloud structures. The implication is that equilibrium tropical cloud structures can be inferred from the bulk thermodynamic structure of the atmosphere without having to analyze computationally expensive dynamic simulations.
Kovalevsky, Louis; Langley, Robin S.; Caro, Stephane
2016-05-01
Due to the high cost of experimental EMI measurements, significant attention has been focused on numerical simulation. Classical methods such as the Method of Moments or Finite Difference Time Domain are not well suited for this type of problem, as they require a fine discretisation of space and fail to take uncertainties into account. In this paper, the authors show that Statistical Energy Analysis (SEA) is well suited for this type of application. SEA is a statistical approach employed to solve high-frequency problems of electromagnetically reverberant cavities at a reduced computational cost. The key aspects of this approach are (i) to consider an ensemble of systems that share the same gross parameters, and (ii) to avoid solving Maxwell's equations inside the cavity, using the power balance principle. The output is an estimate of the field magnitude distribution in each cavity. The method is applied to a typical aircraft structure.
A spatial scan statistic for survival data based on Weibull distribution.
Bhatt, Vijaya; Tiwari, Neeraj
2014-05-20
The spatial scan statistic has been developed as a geographical cluster detection analysis tool for different types of data sets such as Bernoulli, Poisson, ordinal, normal and exponential. We propose a scan statistic for survival data based on Weibull distribution. It may also be used for other survival distributions, such as exponential, gamma, and log normal. The proposed method is applied on the survival data of tuberculosis patients for the years 2004-2005 in Nainital district of Uttarakhand, India. Simulation studies reveal that the proposed method performs well for different survival distribution functions. Copyright © 2013 John Wiley & Sons, Ltd.
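Weibull fitting of survival times, as used inside such a scan statistic, reduces to a one-dimensional search for the shape parameter followed by a closed-form scale. A maximum-likelihood sketch in stdlib Python on synthetic data (not the tuberculosis records; the true parameters below are arbitrary):

```python
import math, random

def fit_weibull(data, tol=1e-9):
    """Maximum-likelihood Weibull fit: shape k by bisection, then scale lam."""
    n = len(data)
    mean_log = sum(math.log(x) for x in data) / n

    def g(k):   # score equation for the shape; increasing in k, unique root
        s0 = sum(x ** k for x in data)
        s1 = sum((x ** k) * math.log(x) for x in data)
        return s1 / s0 - 1.0 / k - mean_log

    lo, hi = 1e-3, 50.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(mid) > 0.0:
            hi = mid
        else:
            lo = mid
    k = 0.5 * (lo + hi)
    lam = (sum(x ** k for x in data) / n) ** (1.0 / k)
    return k, lam

random.seed(2)
true_k, true_lam = 1.5, 2.0
# Inverse-CDF sampling of Weibull survival times (synthetic, not patient data).
times = [true_lam * (-math.log(1.0 - random.random())) ** (1.0 / true_k)
         for _ in range(5000)]
k_hat, lam_hat = fit_weibull(times)
```

Setting k = 1 recovers the exponential special case, which is why the same machinery covers the other survival distributions the abstract mentions.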
The Role of the Sampling Distribution in Understanding Statistical Inference
Lipson, Kay
2003-01-01
Many statistics educators believe that few students develop the level of conceptual understanding essential for them to apply correctly the statistical techniques at their disposal and to interpret their outcomes appropriately. It is also commonly believed that the sampling distribution plays an important role in developing this understanding.…
Distribution function of frequency of stellar flares in the Orion association
International Nuclear Information System (INIS)
Parsamyan, Eh.S.
1980-01-01
Using the chronology of discoveries of new flares and the chronology of confirmations, i.e. the time distribution of second flares (Ambartsumian's method), the distribution function of the frequency of flares on stars in the Orion association is obtained. The number of stars having different flare frequencies is also found. It is shown that flare stars with high flare frequency (ν > …) are fainter than 13ᵐ. The numbers of flare stars in aggregates determined by two independent methods show that the number of flare stars in the Orion association is about 1.5 times greater than in the Pleiades cluster. [ru]
Fissure formation in coke. 3: Coke size distribution and statistical analysis
Energy Technology Data Exchange (ETDEWEB)
D.R. Jenkins; D.E. Shaw; M.R. Mahoney [CSIRO, North Ryde, NSW (Australia). Mathematical and Information Sciences
2010-07-15
A model of coke stabilization, based on a fundamental model of fissuring during carbonisation is used to demonstrate the applicability of the fissuring model to actual coke size distributions. The results indicate that the degree of stabilization is important in determining the size distribution. A modified form of the Weibull distribution is shown to provide a better representation of the whole coke size distribution compared to the Rosin-Rammler distribution, which is generally only fitted to the lump coke. A statistical analysis of a large number of experiments in a pilot scale coke oven shows reasonably good prediction of the coke mean size, based on parameters related to blend rank, amount of low rank coal, fluidity and ash. However, the prediction of measures of the spread of the size distribution is more problematic. The fissuring model, the size distribution representation and the statistical analysis together provide a comprehensive capability for understanding and predicting the mean size and distribution of coke lumps produced during carbonisation. 12 refs., 16 figs., 4 tabs.
International Nuclear Information System (INIS)
El-Shanshoury, G.I.
2011-01-01
Several statistical distributions are used to model various reliability and maintainability parameters. The applied distribution depends on the nature of the data being analyzed. The present paper deals with the analysis of some statistical distributions used in reliability in order to reach the best-fit distribution. The calculations rely on circuit quantity parameters obtained by using the Relex 2009 computer program. The statistical analysis of ten different distributions indicated that the Weibull distribution gives the best fit for modeling the reliability of the data set of the Temperature Alarm Circuit (TAC). However, the exponential distribution is found to be the best fit for modeling the failure rate.
A statistical analysis of North East Atlantic (submicron aerosol size distributions
Directory of Open Access Journals (Sweden)
M. Dall'Osto
2011-12-01
Full Text Available The Global Atmosphere Watch research station at Mace Head (Ireland) offers the possibility to sample some of the cleanest air masses being imported into Europe as well as some of the most polluted being exported out of Europe. We present a statistical cluster analysis of the physical characteristics of aerosol size distributions in air ranging from the cleanest to the most polluted for the year 2008. Data coverage achieved was 75% throughout the year. By applying the Hartigan-Wong k-means method, 12 clusters were identified as systematically occurring. These 12 clusters could be further combined into 4 categories of aerosol size distributions with similar characteristics, namely: a coastal nucleation category (occurring 21.3% of the time), an open ocean nucleation category (32.6%), a background clean marine category (26.1%) and an anthropogenic category (20%). The coastal nucleation category is characterised by a clear and dominant nucleation mode at sizes less than 10 nm, while the open ocean nucleation category is characterised by a dominant Aitken mode between 15 nm and 50 nm. The background clean marine aerosol exhibited a clear bimodality in the sub-micron size distribution, although it should be noted that either the Aitken mode or the accumulation mode may dominate the number concentration. However, peculiar background clean marine size distributions with coarser accumulation modes are also observed during winter months. By contrast, the continentally-influenced size distributions are generally more monomodal (accumulation mode), albeit with traces of bimodality. The open ocean category occurs more often during May, June and July, corresponding with the North East (NE) Atlantic high biological period. Combined with the relatively high percentage frequency of occurrence (32.6%), this suggests that the marine biota is an important source of new nano aerosol particles in NE Atlantic air.
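The Hartigan-Wong variant used in the paper differs in its update rule from plain Lloyd's k-means, but the clustering idea can be sketched in stdlib Python (the feature vectors below are made up, not Mace Head size distributions):

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain Lloyd's algorithm for lists of equal-length feature vectors."""
    rng = random.Random(seed)
    centers = [list(p) for p in rng.sample(points, k)]
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:  # assign each point to its nearest center
            nearest = min(range(k),
                          key=lambda j: sum((a - b) ** 2
                                            for a, b in zip(p, centers[j])))
            clusters[nearest].append(p)
        # move each center to the mean of its cluster (keep it if empty)
        centers = [[sum(dim) / len(c) for dim in zip(*c)] if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

# Two synthetic families of 'size-distribution' feature vectors, e.g. a
# nucleation-dominated shape versus an accumulation-dominated shape.
data_rng = random.Random(42)
group_a = [[10.0 + data_rng.random(), 1.0 + data_rng.random()] for _ in range(20)]
group_b = [[1.0 + data_rng.random(), 10.0 + data_rng.random()] for _ in range(20)]
centers, clusters = kmeans(group_a + group_b, k=2, seed=1)
```

In the paper each "point" would be a whole measured size spectrum, and the resulting cluster centers play the role of the 12 systematically occurring distribution shapes.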
Eigenmode frequency distribution of rapidly rotating neutron stars
International Nuclear Information System (INIS)
Boutloukos, Stratos; Nollert, Hans-Peter
2007-01-01
We use perturbation theory and the relativistic Cowling approximation to numerically compute characteristic oscillation modes of rapidly rotating relativistic stars which consist of a perfect fluid obeying a polytropic equation of state. We present a code that allows the computation of modes of arbitrary order. We focus here on the overall distribution of frequencies. As expected, we find an infinite pressure-mode spectrum extending to infinite frequency. In addition, we obtain an infinite number of inertial mode solutions confined to a finite, well-defined frequency range which depends on the compactness and the rotation frequency of the star. For nonaxisymmetric modes we observe how this range is shifted with respect to the axisymmetric ones, moving towards negative frequencies and thus making all m > 2 modes unstable. We discuss whether our results indicate that the star's spectrum must have a continuous part, as opposed to simply containing an infinite number of discrete modes.
New advances in the statistical parton distributions approach
Directory of Open Access Journals (Sweden)
Soffer Jacques
2016-01-01
Full Text Available The quantum statistical parton distributions approach proposed more than a decade ago is revisited by considering a larger set of recent and accurate Deep Inelastic Scattering experimental results. It enables us to improve the description of the data by means of a new determination of the parton distributions. This global next-to-leading order QCD analysis leads to a good description of several structure functions, involving unpolarized parton distributions and helicity distributions, in terms of a rather small number of free parameters. Several serious challenges remain. The predictions of this theoretical approach will be tested for single-jet production and charge asymmetry in W± production in p̄p and pp collisions up to LHC energies, against recent data and forthcoming experimental results.
Directory of Open Access Journals (Sweden)
Masoud Ghodrati
2016-12-01
Full Text Available Humans are fast and accurate in categorizing complex natural images. It is, however, unclear what features of visual information are exploited by the brain to perceive the images with such speed and accuracy. It has been shown that low-level contrast statistics of natural scenes can explain the variance of the amplitude of event-related potentials (ERPs) in response to rapidly presented images. In this study, we investigated the effect of these statistics on the frequency content of ERPs. We recorded ERPs from human subjects while they viewed natural images, each presented for 70 ms. Our results showed that Weibull contrast statistics, as a biologically plausible model, explained the variance of the ERPs best among the image statistics that we assessed. Our time-frequency analysis revealed a significant correlation between these statistics and the ERPs' power within the theta frequency band (~3-7 Hz). This is interesting, as the theta band is believed to be involved in context updating and semantic encoding. This correlation became significant at ~110 ms after stimulus onset and peaked at 138 ms. Our results show that not only the amplitude but also the frequency of neural responses can be modulated by low-level contrast statistics of natural images, and they highlight the potential role of these statistics in scene perception.
Precision Statistical Analysis of Images Based on Brightness Distribution
Directory of Open Access Journals (Sweden)
Muzhir Shaban Al-Ani
2017-07-01
Full Text Available Studying the content of images is an important topic, one that calls for reasonable and accurate image analysis. Image analysis has recently become a vital field because of the huge number of images transferred via transmission media in our daily life, and these crowded media have made image analysis a highlighted research area. In this paper, the implemented system passes through several steps to compute the statistical measures of standard deviation and mean for both colour and grey images, and the final step compares the results obtained in the different cases of the test phase. The statistical parameters are used to characterise the content of an image and its texture: standard deviation, mean and correlation values are used to study the intensity distribution of the tested images. Reasonable results are obtained for both standard deviation and mean via the implemented system. The major issue addressed in this work is brightness distribution, studied via statistical measures under different types of lighting.
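The statistical measures named in the abstract (mean, standard deviation, correlation of grey-level values) can be sketched in a few lines of pure Python. The tiny 2x2 "image" below is invented; a real system would of course operate on full pixel arrays.

```python
import math

def brightness_stats(pixels):
    """Mean and standard deviation of grey-level values
    (population standard deviation, as usual for image statistics)."""
    n = len(pixels)
    mean = sum(pixels) / n
    var = sum((p - mean) ** 2 for p in pixels) / n
    return mean, math.sqrt(var)

def correlation(xs, ys):
    """Pearson correlation between two equally sized images/channels."""
    mx, sx = brightness_stats(xs)
    my, sy = brightness_stats(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
    return cov / (sx * sy)

# Tiny hypothetical 2x2 grey image, flattened:
img = [10, 20, 30, 40]
m, s = brightness_stats(img)            # mean brightness and spread
c = correlation(img, [p * 2 for p in img])  # perfectly linear relation
```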
Measuring streetscape complexity based on the statistics of local contrast and spatial frequency.
Directory of Open Access Journals (Sweden)
André Cavalcante
Full Text Available Streetscapes are basic urban elements which play a major role in the livability of a city. The visual complexity of streetscapes is known to influence how people behave in such built spaces. However, how and which characteristics of a visual scene influence our perception of complexity have yet to be fully understood. This study proposes a method to evaluate the complexity perceived in streetscapes based on the statistics of local contrast and spatial frequency. Here, 74 streetscape images from four cities, including daytime and nighttime scenes, were ranked for complexity by 40 participants. Image processing was then used to locally segment contrast and spatial frequency in the streetscapes. The statistics of these characteristics were extracted and later combined to form a single objective measure. The direct use of statistics revealed structural or morphological patterns in streetscapes related to the perception of complexity. Furthermore, in comparison to conventional measures of visual complexity, the proposed objective measure exhibits a higher correlation with the opinion of the participants. Also, the performance of this method is more robust regarding different time scenarios.
Analysis of thrips distribution: application of spatial statistics and Kriging
John Aleong; Bruce L. Parker; Margaret Skinner; Diantha Howard
1991-01-01
Kriging is a statistical technique that provides predictions for spatially and temporally correlated data. Observations of thrips distribution and density in Vermont soils are made in both space and time. Traditional statistical analysis of such data assumes that the counts taken over space and time are independent, which is not necessarily true. Therefore, to analyze...
Handbook of tables for order statistics from lognormal distributions with applications
Balakrishnan, N
1999-01-01
Lognormal distributions are one of the most commonly studied models in the statistical literature while being most frequently used in the applied literature. The lognormal distributions have been used in problems arising from such diverse fields as hydrology, biology, communication engineering, environmental science, reliability, agriculture, medical science, mechanical engineering, material science, and pharmacology. Though the lognormal distributions have been around from the beginning of this century (see Chapter 1), much of the work concerning inferential methods for the parameters of lognormal distributions has been done in the recent past. Most of these methods of inference, particularly those based on censored samples, involve extensive use of numerical methods to solve some nonlinear equations. Order statistics and their moments have been discussed quite extensively in the literature for many distributions. It is very well known that the moments of order statistics can be derived explicitly only...
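The handbook tabulates exact moments of lognormal order statistics; such moments can also be approximated by simulation when tables are not at hand. The sketch below estimates the mean of the r-th order statistic by Monte Carlo (parameters and sample size are illustrative only).

```python
import math
import random

def simulated_order_statistic_mean(r, n, mu=0.0, sigma=1.0,
                                   reps=20000, seed=1):
    """Monte-Carlo estimate of E[X_(r)], the mean of the r-th order
    statistic (1-indexed) of a sample of size n from Lognormal(mu, sigma).
    The handbook gives such moments exactly; this only approximates them."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(reps):
        sample = sorted(math.exp(rng.gauss(mu, sigma)) for _ in range(n))
        total += sample[r - 1]
    return total / reps

# Mean of the sample median of n = 3 standard lognormal observations;
# skewness pushes it above the population median exp(0) = 1.
m2 = simulated_order_statistic_mean(2, 3)
```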
[Rank distributions in community ecology from the statistical viewpoint].
Maksimov, V N
2004-01-01
Traditional statistical methods for defining empirical distribution functions of species abundance (population, biomass, production, etc.) in a community are applicable to processing the multivariate data contained in these quantitative indices. In particular, evaluating the moments of the distribution suffices to summarise the data contained in a list of species and their abundances. At the same time, the species should be ranked in the list in ascending rather than descending order of population, and the distribution models should be analysed on the basis of data on abundant species only.
International Nuclear Information System (INIS)
Gupta, S.S.; Panchapakesan, S.
1975-01-01
A quantile selection procedure in reliability problems pertaining to a restricted family of probability distributions is discussed. This family is assumed to be star-ordered with respect to the standard normal distribution folded at the origin. Motivation for this formulation of the problem is described. Both exact and asymptotic results dealing with the distribution of the maximum of ratios of order statistics from such a family are obtained, and tables of the appropriate constants (percentiles of this statistic) are given in order to facilitate the use of the selection procedure.
Statistical distribution of components of energy eigenfunctions: from nearly-integrable to chaotic
International Nuclear Information System (INIS)
Wang, Jiaozi; Wang, Wen-ge
2016-01-01
We study the statistical distribution of components in the non-perturbative parts of energy eigenfunctions (EFs), in which the main bodies of the EFs lie. Our numerical simulations in five models show that deviation of the distribution from the prediction of random matrix theory (RMT) is useful in characterizing the process from nearly-integrable to chaotic, in a way somewhat similar to the nearest-level-spacing distribution. But the statistics of EFs reveal some further properties, described below. (i) In the process of approaching quantum chaos, the distribution of components shows a delay feature compared with the nearest-level-spacing distribution in most of the models studied. (ii) In the quantum chaotic regime, the distribution of components always shows a small but notable deviation from the prediction of RMT in models possessing classical counterparts, while the deviation can be almost negligible in models not possessing classical counterparts. (iii) In models whose Hamiltonian matrices possess a clear band structure, the tails of EFs show statistical behaviors obviously different from those in the main bodies, while the difference is smaller for Hamiltonian matrices without a clear band structure.
Statistical model of natural stimuli predicts edge-like pooling of spatial frequency channels in V2
Directory of Open Access Journals (Sweden)
Gutmann Michael
2005-02-01
Full Text Available Abstract Background It has been shown that the classical receptive fields of simple and complex cells in the primary visual cortex emerge from the statistical properties of natural images by forcing the cell responses to be maximally sparse or independent. We investigate how to learn features beyond the primary visual cortex from the statistical properties of modelled complex-cell outputs. In previous work, we showed that a new model, non-negative sparse coding, led to the emergence of features which code for contours of a given spatial frequency band. Results We applied ordinary independent component analysis to modelled outputs of complex cells that span different frequency bands. The analysis led to the emergence of features which pool spatially coherent across-frequency activity in the modelled primary visual cortex. Thus, the statistically optimal way of processing complex-cell outputs abandons separate frequency channels, while preserving and even enhancing orientation tuning and spatial localization. As a technical aside, we found that the non-negativity constraint is not necessary: ordinary independent component analysis produces essentially the same results as our previous work. Conclusion We propose that the pooling that emerges allows the features to code for realistic low-level image features related to step edges. Further, the results prove the viability of statistical modelling of natural images as a framework that produces quantitative predictions of visual processing.
Development of optical fiber frequency and time distribution systems
Lutes, G.
1982-01-01
The development of ultra-stable optical fiber distribution systems for the dissemination of frequency and timing references is reported. The ultimate design goals for these systems are a frequency stability of 10⁻¹⁷ for τ ≥ 100 s, a time stability of ±0.1 ns over 1 year, and operation over distances ≥ 30 km. A prototype system is reviewed and progress is discussed.
Ziegeweid, Jeffrey R.; Lorenz, David L.; Sanocki, Chris A.; Czuba, Christiana R.
2015-12-24
Knowledge of the magnitude and frequency of low flows in streams, which are flows in a stream during prolonged dry weather, is fundamental for water-supply planning and design; waste-load allocation; reservoir storage design; and maintenance of water quality and quantity for irrigation, recreation, and wildlife conservation. This report presents the results of a statewide study for which regional regression equations were developed for estimating 13 flow-duration curve statistics and 10 low-flow frequency statistics at ungaged stream locations in Minnesota. The 13 flow-duration curve statistics estimated by regression equations include the 0.0001, 0.001, 0.02, 0.05, 0.1, 0.25, 0.50, 0.75, 0.9, 0.95, 0.99, 0.999, and 0.9999 exceedance-probability quantiles. The low-flow frequency statistics include annual and seasonal (spring, summer, fall, winter) 7-day mean low flows, seasonal 30-day mean low flows, and summer 122-day mean low flows for a recurrence interval of 10 years. Estimates of the 13 flow-duration curve statistics and the 10 low-flow frequency statistics are provided for 196 U.S. Geological Survey continuous-record streamgages using streamflow data collected through September 30, 2012.
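When a gaged streamflow record is available, a flow-duration quantile such as the 0.90 exceedance-probability flow can be read off the ranked record. The sketch below uses the common Weibull plotting position i/(n+1) on an invented record; note that the report itself estimates these statistics at ungaged sites from regional regression equations, not from a record.

```python
def exceedance_quantile(flows, p):
    """Flow exceeded with probability p, estimated from a record of
    flows using the Weibull plotting position i/(n+1). A sketch only:
    the report derives such statistics from regression on basin
    characteristics for ungaged locations."""
    ranked = sorted(flows, reverse=True)          # largest first
    n = len(ranked)
    # exceedance probability of the i-th ranked flow (1-indexed)
    positions = [(i + 1) / (n + 1) for i in range(n)]
    # first flow whose plotting position reaches p
    for pos, q in zip(positions, ranked):
        if pos >= p:
            return q
    return ranked[-1]

# Hypothetical record of flows (arbitrary units):
flows = [5, 12, 3, 40, 8, 21, 2, 16, 9, 30]
q50 = exceedance_quantile(flows, 0.50)   # median-ish flow
q90 = exceedance_quantile(flows, 0.90)   # low-flow end of the curve
```

By construction the curve is non-increasing in p: the 0.90-exceedance flow is never larger than the 0.50-exceedance flow.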
Statistical distribution of solar soft X-ray bursts
International Nuclear Information System (INIS)
Kaufmann, P.; Piazza, L.R.; Schaal, R.E.
1979-01-01
Nearly 1000 solar events with fluxes measured in the 0.5-3 Å, 1-8 Å and 8-20 Å bands by the Explorer 37 (US NRL Solrad) satellite are statistically analysed. The differential distribution of peak fluxes can be represented by power laws with exponents -1.4, -2.2 and -2.9 respectively, which are compared to 2-12 Å results. In the 0.5-3 Å band there is a suggested peak in the distribution. Autocorrelation analysis of the distributions has shown that in the harder band (0.5-3 Å) there is a concentration of events at preferred values in multiples of about 10x10⁻⁵ erg cm⁻² s⁻¹, of unknown origin [pt
Statistical analysis of the spatial distribution of galaxies and clusters
International Nuclear Information System (INIS)
Cappi, Alberto
1993-01-01
This thesis deals with the analysis of the distribution of galaxies and clusters, describing some observational problems and statistical results. The first chapter gives a theoretical introduction, describing the framework of the formation of structures and tracing the history of the Universe from the Planck time, t_P = 10⁻⁴³ s (corresponding to a temperature of 10¹⁹ GeV), to the present epoch. The most usual statistical tools and models of the galaxy distribution, with their advantages and limitations, are described in chapter two. A study of the main observed properties of galaxy clustering, together with a detailed statistical analysis of the effects of selecting galaxies according to apparent magnitude or diameter, is reported in chapter three. Chapter four outlines some properties of groups of galaxies, explaining the reasons for discrepant results on group distributions. Chapter five is a study of the distribution of galaxy clusters using different statistical tools, such as correlations, percolation, the void probability function and counts in cells; the same scaling-invariant behaviour found for galaxies is recovered. Chapter six describes our finding that rich galaxy clusters also belong to the fundamental plane of elliptical galaxies, and discusses its possible implications. Finally, chapter seven reviews the possibilities offered by multi-slit and multi-fibre spectrographs, and I present some observational work on nearby and distant galaxy clusters. In particular, I show the opportunities offered by ongoing surveys of galaxies coupled with multi-object fibre spectrographs, focusing on the ESO Key Programme "A galaxy redshift survey in the south galactic pole region", in which I collaborate, and on MEFOS, a multi-fibre instrument with automatic positioning. Published papers related to the work described in this thesis are reported in the last appendix. (author) [fr
Nonparametric Bayesian predictive distributions for future order statistics
Richard A. Johnson; James W. Evans; David W. Green
1999-01-01
We derive the predictive distribution for a specified order statistic, determined from a future random sample, under a Dirichlet process prior. Two variants of the approach are treated and some limiting cases studied. A practical application to monitoring the strength of lumber is discussed including choices of prior expectation and comparisons made to a Bayesian...
Larwin, Karen H.; Larwin, David A.
2011-01-01
Bootstrapping methods and random distribution methods are increasingly recommended as better approaches for teaching students about statistical inference in introductory-level statistics courses. The authors examined the effect of teaching undergraduate business statistics students using random distribution and bootstrapping simulations. It is the…
Comparing Simulated and Theoretical Sampling Distributions of the U3 Person-Fit Statistic.
Emons, Wilco H. M.; Meijer, Rob R.; Sijtsma, Klaas
2002-01-01
Studied whether the theoretical sampling distribution of the U3 person-fit statistic is in agreement with the simulated sampling distribution under different item response theory models and varying item and test characteristics. Simulation results suggest that the use of standard normal deviates for the standardized version of the U3 statistic may…
Secondary Frequency and Voltage Control of Islanded Microgrids via Distributed Averaging
DEFF Research Database (Denmark)
W. Simpson-Porco, John; Shafiee, Qobad; Dorfler, Florian
2015-01-01
actions. The frequency controller rapidly regulates the microgrid frequency to its nominal value while maintaining active power sharing among the distributed generators. Tuning of the voltage controller provides a simple and intuitive trade-off between the conflicting goals of voltage regulation...
International Nuclear Information System (INIS)
Lopez Hidalgo, Juana Ines
2000-01-01
The characteristics of ionizing radiation suggest that induced chromosomal damage in the form of translocations should appear randomly distributed. However, the outcomes of tests performed in vitro and in vivo (on irradiated individuals) are contradictory: chromosomes that appear most involved in translocations according to some studies appear less involved according to others. These data, together with what is known of the molecular mechanisms involved in translocation production, suggest that in G₀-irradiated cells the frequency and distribution of this kind of chromosomal rearrangement do not arise at random. They seem to be affected by the distribution of chromosomes within the nucleus, by each chromosome's DNA length and functional features, by the efficiency of DNA repair mechanisms, and by inter-individual differences. The objective of this study was to establish the frequency pattern of each human chromosome involved in radiation-induced translocations, and to analyse the importance of chromosome length, of the activity of DNA-polymerase-dependent repair mechanisms, and of inter-individual differences in such a distribution. To this end, peripheral blood lymphocytes from healthy donors were irradiated in the presence and absence of 2',3'-dideoxythymidine (ddThd), an inhibitor of DNA polymerase β, which takes part in the base excision repair (BER) mechanism. The results showed the following. The presence of ddThd during irradiation increases the basal frequency of radiation-induced translocations by 60%. This suggests that inhibition of repair synthesis by ddThd can itself be a valid methodology for assessing radiation-induced base damage, damage which, if not repaired by BER, may result in double-strand breaks leading to translocations. A statistically significant correlation between translocation frequency and chromosome length, in terms of percentage of the genome, was observed both under (basal) irradiation and under irradiation with the ddThd inhibitor.
Directory of Open Access Journals (Sweden)
Wenzhi Wang
2016-07-01
Full Text Available Modeling the random fiber distribution of a fiber-reinforced composite is of great importance for studying the progressive failure behavior of the material on the micro scale. In this paper, we develop a new algorithm for generating random representative volume elements (RVEs) with a statistically equivalent fiber distribution matching the actual material microstructure. Realistic statistical data are used as inputs to the new method, which is achieved through implementation of the probability equations. Extensive statistical analysis is conducted to examine the capability of the proposed method and to compare it with existing methods. The proposed method is found to match experimental results well in all aspects, including the nearest neighbor distance, nearest neighbor orientation, Ripley's K function, and the radial distribution function. Finite element analysis is presented to predict the effective elastic properties of a carbon/epoxy composite, to validate the generated random representative volume elements, and to provide insight into the effect of fiber distribution on the elastic properties. The present algorithm is shown to be highly accurate and can be used to generate statistically equivalent RVEs not only for fiber-reinforced composites but also for other materials such as foam materials and particle-reinforced composites.
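One of the descriptors the authors match against the real microstructure, the nearest-neighbor distance, is easy to sketch. The example below drops hypothetical fiber centres uniformly in a unit cell with no overlap check (a real RVE generator would enforce fiber radii and the measured target statistics).

```python
import math
import random

def nearest_neighbour_distances(points):
    """Distance from each point to its nearest neighbour, one of the
    statistical descriptors used to compare an RVE against the
    actual microstructure."""
    out = []
    for i, (xi, yi) in enumerate(points):
        d = min(math.hypot(xi - xj, yi - yj)
                for j, (xj, yj) in enumerate(points) if j != i)
        out.append(d)
    return out

# Hypothetical RVE: 50 fibre centres dropped uniformly in a unit cell.
rng = random.Random(42)
centres = [(rng.random(), rng.random()) for _ in range(50)]
nnd = nearest_neighbour_distances(centres)
mean_nnd = sum(nnd) / len(nnd)
```

For a homogeneous Poisson pattern of intensity λ the mean nearest-neighbor distance is about 1/(2√λ), which gives a baseline against which clustering or regularity of the generated fiber pattern can be judged.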
The exact probability distribution of the rank product statistics for replicated experiments.
Eisinga, Rob; Breitling, Rainer; Heskes, Tom
2013-03-18
The rank product method is a widely accepted technique for detecting differentially regulated genes in replicated microarray experiments. To approximate the sampling distribution of the rank product statistic, the original publication proposed a permutation approach, whereas recently an alternative approximation based on the continuous gamma distribution was suggested. However, both approximations are imperfect for estimating small tail probabilities. In this paper we relate the rank product statistic to number theory and provide a derivation of its exact probability distribution and the true tail probabilities.
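The rank product statistic itself is simple to compute: for each gene, multiply its ranks across the replicate experiments. A minimal sketch on invented ranks (the paper's contribution, the exact null distribution of this statistic, is not reproduced here):

```python
import math

def rank_product(rank_lists):
    """Rank product per gene across k replicate rankings.
    rank_lists: list of k lists, each giving the rank
    (1 = most up-regulated) of every gene in one replicate."""
    k = len(rank_lists)
    n = len(rank_lists[0])
    assert all(len(r) == n for r in rank_lists)
    return [math.prod(ranks[g] for ranks in rank_lists)
            for g in range(n)]

# Three hypothetical replicates ranking four genes:
ranks = [[1, 3, 2, 4],
         [1, 2, 4, 3],
         [2, 1, 3, 4]]
rp = rank_product(ranks)   # gene 0 is consistently near the top
```

A gene that is consistently ranked near the top gets a small rank product, and the significance question is then how likely such a small product is under random rankings, which is where the exact distribution enters.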
Fisher information and statistical inference for phase-type distributions
DEFF Research Database (Denmark)
Bladt, Mogens; Esparza, Luz Judith R; Nielsen, Bo Friis
2011-01-01
This paper is concerned with statistical inference for both continuous and discrete phase-type distributions. We consider maximum likelihood estimation, where traditionally the expectation-maximization (EM) algorithm has been employed. Certain numerical aspects of this method are revised and we...
Estimation of modal parameters using bilinear joint time frequency distributions
Roshan-Ghias, A.; Shamsollahi, M. B.; Mobed, M.; Behzad, M.
2007-07-01
In this paper, a new method is proposed for modal parameter estimation using time-frequency representations. The smoothed pseudo Wigner-Ville distribution, a member of Cohen's class of distributions, is used to decouple vibration modes completely so that each mode can be studied separately. This distribution reduces the cross-terms that are troublesome in the Wigner-Ville distribution while retaining its resolution. The method was applied to highly damped systems, and the results were superior to those obtained via other conventional methods.
Statistical distribution of solar soft X-ray bursts
Energy Technology Data Exchange (ETDEWEB)
Kaufmann, P; Piazza, L R; Schaal, R E [Universidade Mackenzie, Sao Paulo (Brazil). Centro de Radio-Astronomia e Astrofisica
1979-03-01
Nearly 1000 solar events with fluxes measured in the 0.5-3 Å, 1-8 Å and 8-20 Å bands by the Explorer 37 (US NRL Solrad) satellite are statistically analyzed. The differential distribution of peak fluxes can be represented by power laws with exponents -1.4, -2.2 and -2.9 respectively, which are compared to 2-12 Å results. For the 0.5-3 Å band there is a suggested peak in the distribution. Autocorrelation analyses of the distribution have shown that in the harder band (0.5-3 Å) there is a concentration of events at preferred values in multiples of about 10x10⁻⁵ erg cm⁻² s⁻¹, of unknown origin.
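A power-law exponent of a differential peak-flux distribution like the one fitted here can be estimated by logarithmic binning and a least-squares fit in log-log space. The sketch below runs on synthetic Pareto-distributed fluxes standing in for the satellite measurements (which are not reproduced here).

```python
import math
import random

def powerlaw_exponent(fluxes, nbins=10):
    """Least-squares slope of log(density) vs log(flux) over
    logarithmic bins: a quick estimate of the exponent of a
    differential peak-flux distribution n(F) ~ F**slope."""
    lo, hi = min(fluxes), max(fluxes)
    edges = [lo * (hi / lo) ** (i / nbins) for i in range(nbins + 1)]
    xs, ys = [], []
    for i in range(nbins):
        count = sum(1 for f in fluxes if edges[i] <= f < edges[i + 1])
        if count:
            centre = math.sqrt(edges[i] * edges[i + 1])   # geometric mean
            density = count / (edges[i + 1] - edges[i])   # differential
            xs.append(math.log(centre))
            ys.append(math.log(density))
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# Synthetic fluxes with n(F) ~ F^-2 (Pareto with index 1):
rng = random.Random(0)
fluxes = [1.0 / (1.0 - rng.random()) for _ in range(5000)]
slope = powerlaw_exponent(fluxes)   # expected near -2
```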
Improving Statistics Education through Simulations: The Case of the Sampling Distribution.
Earley, Mark A.
This paper presents a summary of action research investigating statistics students' understandings of the sampling distribution of the mean. With four sections of an introductory Statistics in Education course (n=98 students), a computer simulation activity (R. delMas, J. Garfield, and B. Chance, 1999) was implemented and evaluated to show…
Neti, Prasad V.S.V.; Howell, Roger W.
2010-01-01
Recently, the distribution of radioactivity among a population of cells labeled with 210Po was shown to be well described by a log-normal (LN) distribution function (J Nucl Med. 2006;47:1049-1058) with the aid of autoradiography. To ascertain the influence of Poisson statistics on the interpretation of the autoradiographic data, the present work reports on a detailed statistical analysis of these earlier data. Methods: The measured distributions of α-particle tracks per cell were subjected to statistical tests with Poisson, LN, and Poisson-lognormal (P-LN) models. Results: The LN distribution function best describes the distribution of radioactivity among cell populations exposed to 0.52 and 3.8 kBq/mL of 210Po-citrate. When cells were exposed to 67 kBq/mL, the P-LN distribution function gave a better fit; however, the underlying activity distribution remained log-normal. Conclusion: The present analysis generally provides further support for the use of LN distributions to describe the cellular uptake of radioactivity. Care should be exercised when analyzing autoradiographic data on activity distributions to ensure that Poisson processes do not distort the underlying LN distribution. PMID:18483086
STATISTICAL DISTRIBUTION PATTERNS IN MECHANICAL AND FATIGUE PROPERTIES OF METALLIC MATERIALS
Tatsuo Sakai; Masaki Nakajima; Keiro Tokaji; Norihiko Hasegawa; Department of Mechanical Engineering, Ritsumeikan University; Department of Mechanical Engineering, Toyota College of Technology; Department of Mechanical Engineering, Gifu University; Department of Mechanical Engineering, Gifu University
1997-01-01
Many papers on the statistical aspects of materials strength have been collected and reviewed by The Research Group for Statistical Aspects of Materials Strength. A book, "Statistical Aspects of Materials Strength", was written by this group and published in 1992. Based on the experimental data compiled in this book, distribution patterns of mechanical properties are systematically surveyed, paying particular attention to metallic materials. Thus one can obtain the fundamental knowledge for a reliabilit...
Statistical Modelling of Synaptic Vesicles Distribution and Analysing their Physical Characteristics
DEFF Research Database (Denmark)
Khanmohammadi, Mahdieh
This Ph.D. thesis deals with mathematical and statistical modeling of synaptic vesicle distribution, shape, orientation and interactions. The first major part of this thesis treats the problem of determining the effect of stress on synaptic vesicle distribution and interactions. Serial section transmission electron microscopy is used to acquire images from two experimental groups of rats: 1) rats subjected to a behavioral model of stress and 2) rats subjected to sham stress as the control group. The synaptic vesicle distribution and interactions are modeled by employing a point process approach ... on differences of statistical measures within a section and the same measures between sections. Three-dimensional (3D) datasets are reconstructed by using image registration techniques and estimated thicknesses. We distinguish the effect of stress by estimating the synaptic vesicle densities and modeling...
Local multiplicity adjustment for the spatial scan statistic using the Gumbel distribution.
Gangnon, Ronald E
2012-03-01
The spatial scan statistic is an important and widely used tool for cluster detection. It is based on the simultaneous evaluation of the statistical significance of the maximum likelihood ratio test statistic over a large collection of potential clusters. In most cluster detection problems, there is variation in the extent of local multiplicity across the study region. For example, using a fixed maximum geographic radius for clusters, urban areas typically have many overlapping potential clusters, whereas rural areas have relatively few. The spatial scan statistic does not account for local multiplicity variation. We describe a previously proposed local multiplicity adjustment based on a nested Bonferroni correction and propose a novel adjustment based on a Gumbel distribution approximation to the distribution of a local scan statistic. We compare the performance of all three statistics in terms of power and a novel unbiased cluster detection criterion. These methods are then applied to the well-known New York leukemia dataset and a Wisconsin breast cancer incidence dataset.
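The idea of a Gumbel approximation is to fit the distribution of simulated maxima once and then read tail probabilities off the fitted curve instead of from raw permutation counts. A minimal method-of-moments sketch (the eight "maxima" below are invented; a real application would use hundreds of Monte-Carlo replications and the paper's local scan statistic):

```python
import math

def gumbel_fit(maxima):
    """Method-of-moments Gumbel fit to a sample of maxima:
    scale = s * sqrt(6) / pi, location = mean - gamma * scale,
    where gamma is the Euler-Mascheroni constant."""
    n = len(maxima)
    mean = sum(maxima) / n
    s = math.sqrt(sum((x - mean) ** 2 for x in maxima) / (n - 1))
    scale = s * math.sqrt(6) / math.pi
    loc = mean - 0.5772156649 * scale
    return loc, scale

def gumbel_sf(x, loc, scale):
    """Upper-tail probability P(M > x) of the fitted Gumbel,
    used in place of a raw permutation p-value."""
    return 1.0 - math.exp(-math.exp(-(x - loc) / scale))

# Hypothetical maxima of a local scan statistic from 8 replications:
maxima = [2.1, 2.8, 2.4, 3.0, 2.2, 2.6, 2.9, 2.5]
loc, scale = gumbel_fit(maxima)
p = gumbel_sf(3.5, loc, scale)   # small tail probability for an
                                 # observed local statistic of 3.5
```

The payoff is in the extreme tail: a permutation estimate of a probability like this would need far more replications than the smooth Gumbel fit does.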
Frequency distribution function of stellar flares in the Orion association
International Nuclear Information System (INIS)
Parsamian, E.S.
1981-01-01
The temporal distributions of flare stars in the Orion association and the numbers of stars with different flare frequencies are determined by means of Ambartsumian's (1978) method, which uses the chronology of discovery of 'first' flares and the chronology of confirmations, i.e., the temporal distributions of 'repeated' flares. It is shown that flare stars with high flare frequency (mean interval between flares not greater than 1000 hours) in the Pleiades are basically stars of low luminosity with M(U) not less than 13m. Two independent methods of determining the number of flare stars in the aggregates confirm that there are about 1.5 times more flare stars in the Orion association than in the Pleiades.
Wang, Xinghu; Hong, Yiguang; Yi, Peng; Ji, Haibo; Kang, Yu
2017-05-24
In this paper, a distributed optimization problem is studied for continuous-time multiagent systems with unknown-frequency disturbances. A distributed gradient-based control is proposed for the agents to achieve the optimal consensus with estimating unknown frequencies and rejecting the bounded disturbance in the semi-global sense. Based on convex optimization analysis and adaptive internal model approach, the exact optimization solution can be obtained for the multiagent system disturbed by exogenous disturbances with uncertain parameters.
Aftershock Energy Distribution by Statistical Mechanics Approach
Daminelli, R.; Marcellini, A.
2015-12-01
The aim of our work is to find the most probable distribution of aftershock energies. We start from one of the fundamental principles of statistical mechanics, which for aftershock sequences can be expressed as: the greater the number of different ways in which the energy of aftershocks can be arranged among the energy cells in phase space, the more probable the distribution. We assume that each cell in phase space has the same probability of being occupied, and that more than one cell in phase space can have the same energy. Since seismic energy is proportional to products of different parameters, a number of different combinations of parameters can produce the same energy (e.g., different combinations of stress drop and fault area can release the same seismic energy). Let us assume that there are g_i cells in the aftershock phase space characterised by the same released energy ε_i. We can therefore assume that Maxwell-Boltzmann statistics apply to aftershock sequences, with the proviso that the judgment on the validity of this hypothesis is the agreement with the data. The aftershock energy distribution can then be written as follows: n(ε) = A g(ε) exp(-βε), where n(ε) is the number of aftershocks with energy ε, and A and β are constants. Under the above hypothesis, we can assume g(ε) is proportional to ε. We selected and analysed different aftershock sequences (data extracted from the earthquake catalogs of SCEC, of INGV-CNT and of other institutions) with a minimum retained magnitude ML = 2 (in some cases ML = 2.6) and a time window of 35 days. The results of our model are in agreement with the data, except in the very low energy band, where our model moderately overestimates the counts.
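With g(ε) proportional to ε, the proposed form n(ε) = A ε exp(-βε) is, after normalization, a Gamma density with shape 2 and rate β, whose mean is 2/β. So β can be estimated directly from the sample mean. The sketch below checks this on synthetic energies (invented, not the catalog data).

```python
import random

def fit_beta(energies):
    """Moment estimate of beta for n(e) = A * e * exp(-beta * e):
    the normalized density is Gamma(shape=2, rate=beta) with
    mean 2/beta, hence beta = 2 / sample mean."""
    return 2.0 / (sum(energies) / len(energies))

# Synthetic aftershock energies from Gamma(2, rate=0.5); a Gamma(2)
# variate is the sum of two exponentials with the same rate.
rng = random.Random(7)
beta_true = 0.5
energies = [rng.expovariate(beta_true) + rng.expovariate(beta_true)
            for _ in range(10000)]
beta_hat = fit_beta(energies)   # should be close to 0.5
```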
Statistical significance of epidemiological data. Seminar: Evaluation of epidemiological studies
International Nuclear Information System (INIS)
Weber, K.H.
1993-01-01
In stochastic damages, the numbers of events, e.g. the persons affected by or having died of cancer, and thus the relative frequencies (incidence or mortality), are binomially distributed random variables. Their statistical fluctuations can be characterized by confidence intervals. For epidemiologic questions, especially for the analysis of stochastic damages in the low dose range, the following issues are of interest: - Is a sample (a group of persons) with a definite observed damage frequency part of the whole population? - Is an observed frequency difference between two groups of persons random or statistically significant? - Is an observed increase or decrease of the frequencies with increasing dose random or statistically significant, and how large is the regression coefficient (= risk coefficient) in this case? These problems can be solved by statistical tests, in particular distribution-free tests, i.e. tests not bound to the assumption of a normal distribution, such as: - the χ²-independence test (test in contingency tables); - the Fisher-Yates test; - the trend test according to Cochran; - the rank correlation test of Spearman. These tests are illustrated with selected epidemiologic data, e.g. leukaemia clusters, the cancer mortality of the Japanese A-bomb survivors (especially in the low dose range), and the cancer mortality in the high-background-radiation area of Yangjiang (China). (orig.)
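The χ²-independence test named above can be sketched directly. The following example (hypothetical counts, not the seminar's data) tests whether case frequencies differ between an exposed and an unexposed group of 500 persons each:

```python
import numpy as np

# Hypothetical 2x2 contingency table: cases / non-cases per group.
table = np.array([[12.0, 488.0],   # exposed group
                  [ 4.0, 496.0]])  # unexposed group

row = table.sum(axis=1, keepdims=True)
col = table.sum(axis=0, keepdims=True)
expected = row @ col / table.sum()            # expected counts under independence
chi2 = ((table - expected) ** 2 / expected).sum()

# Compare against the 5% critical value for 1 degree of freedom (3.841).
significant = chi2 > 3.841
print(round(chi2, 2), significant)
```

For small expected counts the Fisher-Yates exact test would be preferred over the χ² approximation, which is precisely why the seminar lists both.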
International Nuclear Information System (INIS)
Vardavas, I.M.
1992-01-01
A simple procedure is presented for the statistical analysis of measurement data where the primary concern is the determination of the value corresponding to a specified average exceedance probability. The analysis employs the normal and log-normal frequency distributions together with a χ²-test and an error analysis. The error analysis introduces the concept of a counting-error criterion, or ζ-test, to test whether the data are sufficient to make the χ²-test reliable. The procedure is applied to the analysis of annual rainfall data recorded at stations in the tropical Top End of Australia, where the Ranger uranium deposit is situated. 9 refs., 12 tabs., 9 figs.
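The core computation can be sketched with the standard library alone. This is an illustrative log-normal fit on made-up rainfall totals (not the Ranger station data): the value with average exceedance probability 1/T is the (1 − 1/T) quantile of the fitted distribution.

```python
import math
from statistics import NormalDist

# Hypothetical annual rainfall totals (mm); the real analysis used Top End station records.
rainfall_mm = [1550, 1720, 1300, 1980, 1450, 1610, 1390, 1830, 1500, 1670]

# Fit a log-normal by taking moments of the log-transformed data.
logs = [math.log(x) for x in rainfall_mm]
mu = sum(logs) / len(logs)
sigma = math.sqrt(sum((v - mu) ** 2 for v in logs) / (len(logs) - 1))

def exceedance_value(years):
    # Value exceeded on average once in `years` years under the log-normal fit.
    p = 1.0 - 1.0 / years
    return math.exp(NormalDist(mu, sigma).inv_cdf(p))

x100 = exceedance_value(100)
print(round(x100))
```

A χ²-test of the fitted distribution against the binned data, as the paper describes, would then decide between the normal and log-normal candidates.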
Model for neural signaling leap statistics
International Nuclear Information System (INIS)
Chevrollier, Martine; Oria, Marcos
2011-01-01
We present a simple model for neural signaling leaps in the brain, considering only the thermodynamic (Nernst) potential in neuron cells and the brain temperature. We numerically simulated connections between arbitrarily localized neurons and analyzed the frequency distribution of the distances reached. We observed a qualitative change between normal statistics (T = 37.5 °C, awake regime) and Lévy statistics (T = 35.5 °C, sleeping period), the latter characterized by rare events of long-range connections.
Model for neural signaling leap statistics
Chevrollier, Martine; Oriá, Marcos
2011-03-01
We present a simple model for neural signaling leaps in the brain, considering only the thermodynamic (Nernst) potential in neuron cells and the brain temperature. We numerically simulated connections between arbitrarily localized neurons and analyzed the frequency distribution of the distances reached. We observed a qualitative change between normal statistics (T = 37.5 °C, awake regime) and Lévy statistics (T = 35.5 °C, sleeping period), the latter characterized by rare events of long-range connections.
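The qualitative difference between the two regimes is the tail weight of the distance distribution. A minimal sketch (not the authors' simulation) contrasting Gaussian steps with Lévy-type steps, using the Cauchy distribution as a simple heavy-tailed stable example:

```python
import math
import random

# Compare the frequency of rare long-range events under Gaussian vs.
# heavy-tailed (Cauchy, a Levy-stable) step statistics.
random.seed(0)
n = 100_000
gauss = [abs(random.gauss(0, 1)) for _ in range(n)]
# Standard Cauchy via inverse-CDF sampling: tan(pi*(U - 1/2)).
levy = [abs(math.tan(math.pi * (random.random() - 0.5))) for _ in range(n)]

frac_gauss_long = sum(x > 10 for x in gauss) / n
frac_levy_long = sum(x > 10 for x in levy) / n
print(frac_gauss_long, round(frac_levy_long, 3))
```

Steps beyond ten standard units essentially never occur in the Gaussian case but make up several percent of the heavy-tailed draws, mirroring the rare long-range connections reported for the sleeping regime.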
Zhang, Yu; Li, Fei; Zhang, Shengkai; Zhu, Tingting
2017-04-01
Synthetic Aperture Radar (SAR) is important for polar remote sensing since it provides continuous observations day and night and in all weather. SAR can be used to extract surface roughness information characterized by the variance of dielectric properties and different polarization channels, making it possible to observe different ice types and surface structure for deformation analysis. In November 2016, the 33rd Chinese National Antarctic Research Expedition (CHINARE) cruise set sail into the Antarctic sea ice zone. An accurate spatial distribution of leads in the sea ice zone is essential for routine planning of ship navigation. In this study, the semantic relationship between leads and sea ice categories is described by a Conditional Random Fields (CRF) model, and lead characteristics are modeled by statistical distributions in SAR imagery. In the proposed algorithm, a mixture-statistical-distribution-based CRF is developed by considering the contextual information and the statistical characteristics of sea ice to improve lead detection in Sentinel-1A dual-polarization SAR imagery. The unary and pairwise potentials in the CRF model are constructed by integrating the posterior probability estimated from the statistical distributions. For parameter estimation, the Method of Logarithmic Cumulants (MoLC) is exploited for the single statistical distributions, and the iterative Expectation Maximization (EM) algorithm is used to calculate the parameters of the mixture-distribution-based CRF model. In the posterior probability inference, a graph-cut energy minimization method is adopted for the initial lead detection. Post-processing procedures, including an aspect-ratio constraint and spatial smoothing, are applied to improve the visual result. The proposed method is validated on Sentinel-1A SAR C-band Extra Wide Swath (EW) Ground Range Detected (GRD) imagery with a
Comment on the asymptotics of a distribution-free goodness of fit test statistic.
Browne, Michael W; Shapiro, Alexander
2015-03-01
In a recent article Jennrich and Satorra (Psychometrika 78: 545-552, 2013) showed that a proof by Browne (British Journal of Mathematical and Statistical Psychology 37: 62-83, 1984) of the asymptotic distribution of a goodness of fit test statistic is incomplete because it fails to prove that the orthogonal component function employed is continuous. Jennrich and Satorra (Psychometrika 78: 545-552, 2013) showed how Browne's proof can be completed satisfactorily but this required the development of an extensive and mathematically sophisticated framework for continuous orthogonal component functions. This short note provides a simple proof of the asymptotic distribution of Browne's (British Journal of Mathematical and Statistical Psychology 37: 62-83, 1984) test statistic by using an equivalent form of the statistic that does not involve orthogonal component functions and consequently avoids all complicating issues associated with them.
Inverted rank distributions: Macroscopic statistics, universality classes, and critical exponents
Eliazar, Iddo; Cohen, Morrel H.
2014-01-01
An inverted rank distribution is an infinite sequence of positive sizes ordered in a monotone increasing fashion. Interlacing together Lorenzian and oligarchic asymptotic analyses, we establish a macroscopic classification of inverted rank distributions into five “socioeconomic” universality classes: communism, socialism, criticality, feudalism, and absolute monarchy. We further establish that: (i) communism and socialism are analogous to a “disordered phase”, feudalism and absolute monarchy are analogous to an “ordered phase”, and criticality is the “phase transition” between order and disorder; (ii) the universality classes are characterized by two critical exponents, one governing the ordered phase, and the other governing the disordered phase; (iii) communism, criticality, and absolute monarchy are characterized by sharp exponent values, and are inherently deterministic; (iv) socialism is characterized by a continuous exponent range, is inherently stochastic, and is universally governed by continuous power-law statistics; (v) feudalism is characterized by a continuous exponent range, is inherently stochastic, and is universally governed by discrete exponential statistics. The results presented in this paper yield a universal macroscopic socioeconophysical perspective of inverted rank distributions.
Polarimetric Segmentation Using Wishart Test Statistic
DEFF Research Database (Denmark)
Skriver, Henning; Schou, Jesper; Nielsen, Allan Aasbjerg
2002-01-01
A newly developed test statistic for equality of two complex covariance matrices following the complex Wishart distribution, and an associated asymptotic probability for the test statistic, has been used in a segmentation algorithm. The segmentation algorithm is based on the MUM (merge using moments) approach, which is a merging algorithm for single-channel SAR images. The polarimetric version described in this paper uses the above-mentioned test statistic for merging. The segmentation algorithm has been applied to polarimetric SAR data from the Danish dual-frequency, airborne polarimetric SAR, EMISAR.
Estimating Predictive Variance for Statistical Gas Distribution Modelling
International Nuclear Information System (INIS)
Lilienthal, Achim J.; Asadi, Sahar; Reggente, Matteo
2009-01-01
Recent publications in statistical gas distribution modelling have proposed algorithms that model both the mean and the variance of a distribution. This paper argues that estimating the predictive concentration variance is not merely a gradual improvement but a significant step forward for the field. First, because such models fit the particular structure of gas distributions much better: gas distributions exhibit strong fluctuations with considerable spatial variation as a result of the intermittent character of gas dispersal. Second, because estimating the predictive variance allows one to evaluate model quality in terms of the data likelihood. This offers a solution to the problem of ground-truth evaluation, which has always been a critical issue for gas distribution modelling. It also enables solid comparisons of different modelling approaches, and provides the means to learn meta-parameters of the model, to determine when the model should be updated or re-initialised, or to suggest new measurement locations based on the current model. We also point out directions of related ongoing or potential future research work.
Reliability of equipments and theory of frequency statistics and Bayesian decision
International Nuclear Information System (INIS)
Procaccia, H.; Clarotti, C.A.
1992-01-01
The rapid development of the use of Bayesian techniques in the domain of industrial risk is a recent phenomenon linked to the availability of powerful computers. These techniques involve reasoning well adapted to experimental logic, based on dynamically enriching knowledge with experience data. In the framework of reliability studies and statistical decision making, these methods differ somewhat from the methods commonly used to evaluate the reliability of systems and from classical frequentist statistics. This approach is described in this book and illustrated with many examples of application (power plants, pressure vessels, industrial installations, etc.). These examples generally concern risk management in cases where the application of rules and the respect of norms become insufficient. It is now well known that risk cannot be reduced to zero and that its evaluation must be performed statistically, taking into account the possible accident processes and also the investments necessary to avoid them (service life, failure and maintenance costs, and availability of materials). The result is the optimization of a decision process about rare or uncertain events. (J.S.)
Perneger, Thomas V; Combescure, Christophe
2017-07-01
Published P-values provide a window into the global enterprise of medical research. The aim of this study was to use the distribution of published P-values to estimate the relative frequencies of null and alternative hypotheses and to seek irregularities suggestive of publication bias. This cross-sectional study included P-values published in 120 medical research articles in 2016 (30 each from the BMJ, JAMA, Lancet, and New England Journal of Medicine). The observed distribution of P-values was compared with expected distributions under the null hypothesis (i.e., uniform between 0 and 1) and the alternative hypothesis (strictly decreasing from 0 to 1). P-values were categorized according to conventional levels of statistical significance and in one-percent intervals. Among 4,158 recorded P-values, 26.1% were highly significant (P values values equal to 1, and (3) about twice as many P-values less than 0.05 compared with those more than 0.05. The latter finding was seen in both randomized trials and observational studies, and in most types of analyses, excepting heterogeneity tests and interaction tests. Under plausible assumptions, we estimate that about half of the tested hypotheses were null and the other half were alternative. This analysis suggests that statistical tests published in medical journals are not a random sample of null and alternative hypotheses but that selective reporting is prevalent. In particular, significant results are about twice as likely to be reported as nonsignificant results. Copyright © 2017 Elsevier Inc. All rights reserved.
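The study's central idea, that the observed P-value distribution is a mixture of a uniform component (null hypotheses) and a component concentrated near zero (alternative hypotheses), can be sketched with a small simulation. All numbers here are illustrative assumptions, not the article's data:

```python
import random

# Model published P-values as a 50/50 mixture of nulls (uniform on [0,1])
# and alternatives (decreasing density, sampled by inverse transform from
# a Beta(0.3, 1) distribution: density proportional to p**(0.3 - 1)).
random.seed(1)

def draw_p(null_fraction=0.5, n=100_000):
    ps = []
    for _ in range(n):
        if random.random() < null_fraction:
            ps.append(random.random())               # null: uniform
        else:
            ps.append(random.random() ** (1 / 0.3))  # alternative: concentrated near 0
    return ps

ps = draw_p()
frac_sig = sum(p < 0.05 for p in ps) / len(ps)
print(round(frac_sig, 2))
```

Comparing the simulated share below 0.05 (and the one-percent-interval histogram) with the published distribution is exactly the kind of comparison the authors use to estimate the null fraction and to spot irregularities suggestive of selective reporting.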
Assessing the copula selection for bivariate frequency analysis ...
Indian Academy of Sciences (India)
Copulas are applied to overcome the restriction of traditional bivariate frequency ... frequency analysis methods cannot describe the random variable properties that ... In order to overcome the limitation of multivariate distributions, a copula is a ..... The Mann-Kendall (M-K) test is a non-parametric statistical test which is used ...
Directory of Open Access Journals (Sweden)
G. Moretti
2008-08-01
Full Text Available The estimation of the peak river flow for ungauged river sections is a topical issue in applied hydrology. Spatially distributed rainfall-runoff models can be a useful tool to this end, since they are potentially able to simulate the river flow at any location of the watershed drainage network. However, it is not fully clear to what extent these models can provide reliable simulations over a wide range of spatial scales. This issue is investigated here by applying a spatially distributed, continuous-simulation rainfall-runoff model to infer the flood frequency distribution of the Riarbero River, an ungauged mountain creek located in northern Italy whose drainage area is 17 km^{2}. The hydrological model is first calibrated using a 1-year record of hourly meteorological data and river flows observed at the outlet of the 1294 km^{2} Secchia River basin, of which the Riarbero is a tributary. The model is then validated by performing a 100-year simulation of synthetic river flow data, which allowed us to compare the simulated and observed flood frequency distributions at the Secchia River outlet and at the internal cross section of Cavola Bridge, where the basin area is 337 km^{2}. Finally, another simulation of hourly river flows was performed for the outlet of the Riarbero River, thereby allowing us to estimate the related flood frequency distribution. The results were validated against estimates of peak river flow obtained by applying hydrological similarity principles and a regional method. The results show that the flood flow estimated through the application of the distributed model is consistent with the estimate provided by the regional procedure as well as with the behavior of the river banks. Conversely, the method based on hydrological similarity delivers an estimate that seems less reliable. The analysis highlights interesting perspectives for the application of
Device for flattening statistically distributed pulses
International Nuclear Information System (INIS)
Il'kanaev, G.I.; Iskenderov, V.G.; Rudnev, O.V.; Teller, V.S.
1976-01-01
A device is described that converts a series of statistically distributed pulses into a pseudo-uniform one. The input pulses switch over the first counter, and the second counter is switched over by clock pulses each time the equality of the counters' states is violated. This violation is recorded by a logic circuit, which passes to the output a number of clock pulses equal to the number of pulses that reached the device input. For ratios of the input pulse rate to the clock rate up to 0.3, losses do not exceed 0.7 per cent for a pulse-counter memory of 3, and 0.035 per cent for a memory of 7.
Statistics of natural binaural sounds.
Directory of Open Access Journals (Sweden)
Wiktor Młynarski
Full Text Available Binaural sound localization is usually considered a discrimination task, where interaural phase (IPD) and level (ILD) disparities at narrowly tuned frequency channels are utilized to identify the position of a sound source. In natural conditions, however, binaural circuits are exposed to stimulation by sound waves originating from multiple, often moving and overlapping sources. The statistics of binaural cues therefore depend on the acoustic properties and the spatial configuration of the environment. The distribution of naturally encountered cues and their dependence on the physical properties of an auditory scene have not been studied before. In the present work we analyzed the statistics of naturally encountered binaural sounds. We performed binaural recordings of three auditory scenes with varying spatial configuration and analyzed the empirical cue distributions from each scene. We found that certain properties, such as the spread of IPD distributions and the overall shape of ILD distributions, do not vary strongly between different auditory scenes. Moreover, we found that ILD distributions vary much more weakly across frequency channels, and IPDs often attain much higher values, than can be predicted from head filtering properties. In order to understand the complexity of the binaural hearing task in the natural environment, sound waveforms were analyzed by performing Independent Component Analysis (ICA). Properties of the learned basis functions indicate that in natural conditions the sound waves in each ear are predominantly generated by independent sources. This implies that real-world sound localization must rely on mechanisms more complex than mere cue extraction.
Statistics of natural binaural sounds.
Młynarski, Wiktor; Jost, Jürgen
2014-01-01
Binaural sound localization is usually considered a discrimination task, where interaural phase (IPD) and level (ILD) disparities at narrowly tuned frequency channels are utilized to identify the position of a sound source. In natural conditions, however, binaural circuits are exposed to stimulation by sound waves originating from multiple, often moving and overlapping sources. The statistics of binaural cues therefore depend on the acoustic properties and the spatial configuration of the environment. The distribution of naturally encountered cues and their dependence on the physical properties of an auditory scene have not been studied before. In the present work we analyzed the statistics of naturally encountered binaural sounds. We performed binaural recordings of three auditory scenes with varying spatial configuration and analyzed the empirical cue distributions from each scene. We found that certain properties, such as the spread of IPD distributions and the overall shape of ILD distributions, do not vary strongly between different auditory scenes. Moreover, we found that ILD distributions vary much more weakly across frequency channels, and IPDs often attain much higher values, than can be predicted from head filtering properties. In order to understand the complexity of the binaural hearing task in the natural environment, sound waveforms were analyzed by performing Independent Component Analysis (ICA). Properties of the learned basis functions indicate that in natural conditions the sound waves in each ear are predominantly generated by independent sources. This implies that real-world sound localization must rely on mechanisms more complex than mere cue extraction.
Highly stable microwave carrier generation using a dual-frequency distributed feedback laser
Khan, M.R.H.; Bernhardi, Edward; Marpaung, D.A.I.; Burla, M.; de Ridder, R.M.; Worhoff, Kerstin; Pollnau, Markus; Roeloffzen, C.G.H.
2012-01-01
Photonic generation of microwave carriers using a dual-frequency distributed-feedback waveguide laser in ytterbium-doped aluminum oxide is demonstrated. A high-performance optical frequency locked loop is implemented to stabilize the microwave carrier. This approach results in a microwave
Directory of Open Access Journals (Sweden)
Stephen Wee Hun Lim
2016-03-01
Full Text Available The effects of orthographic neighborhood density and word frequency in visual word recognition were investigated using distributional analyses of response latencies in visual lexical decision. Main effects of density and frequency were observed in mean latencies. Distributional analyses additionally revealed a density × frequency interaction: for low-frequency words, density effects were mediated predominantly by distributional shifting, whereas for high-frequency words, density effects were absent except at the slower RTs, implicating distributional skewing. The present findings suggest that density effects in low-frequency words reflect processes involved in early lexical access, while the effects observed in high-frequency words reflect late postlexical checking processes.
Influence of the statistical distribution of bioassay measurement errors on the intake estimation
International Nuclear Information System (INIS)
Lee, T. Y; Kim, J. K
2006-01-01
The purpose of this study is to provide guidance for selecting an error distribution by analyzing the influence of the statistical distribution of bioassay measurement errors on intake estimation. Intakes were estimated by the maximum likelihood method for the cases in which the error distribution is normal or lognormal, and the estimated intakes under the two distributions were compared. According to the results, when the measurement results for lung retention are somewhat greater than the limit of detection, the distribution type has negligible influence on the results. For measurement results of the daily excretion rate, however, the intakes obtained under the assumption of a lognormal distribution were 10% higher than those obtained under the assumption of a normal distribution. In view of these facts, where the uncertainty is governed by counting statistics, the distribution type has no influence on the intake estimate; where other uncertainty components predominate, it is clearly desirable to estimate the intake assuming a lognormal distribution.
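The difference between the two error models can be sketched for the simplest case of a single intake I with measurements m_i ≈ I·r_i, where r_i are known retention or excretion fractions. All numbers below are hypothetical, and the closed forms are the textbook MLEs for each error model, not the paper's specific fitting code:

```python
import math

# Hypothetical retention/excretion fractions and measured activities.
r = [0.40, 0.25, 0.15, 0.10]
m = [4.4, 2.4, 1.8, 0.9]

# Normal errors with constant absolute sigma: the MLE is the least-squares
# solution of m_i = I * r_i.
intake_normal = sum(mi * ri for mi, ri in zip(m, r)) / sum(ri * ri for ri in r)

# Lognormal errors: ln(m_i) = ln(I) + ln(r_i) + eps, so the MLE of I is the
# geometric mean of the per-measurement ratios m_i / r_i.
logs = [math.log(mi / ri) for mi, ri in zip(m, r)]
intake_lognormal = math.exp(sum(logs) / len(logs))

print(round(intake_normal, 2), round(intake_lognormal, 2))
```

Even on this toy data the two assumptions give intakes differing by a few percent; the study's point is that this discrepancy grows when measurement scatter, rather than counting statistics, dominates.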
Directory of Open Access Journals (Sweden)
Rawid Banchuin
2014-01-01
Full Text Available In this research, statistical variations in subthreshold MOSFET high-frequency characteristics, defined in terms of gate capacitance and transition frequency, are analysed, and comprehensive analytical models of these variations, expressed in terms of their variances, are proposed. Major physical-level imperfections, including random dopant fluctuation and variations in the MOSFET manufacturing process, are taken into account in the proposed analysis and modeling. An up-to-date comprehensive analytical model of statistical variation in MOSFET parameters is used as the basis of the analysis and modeling. The resulting models are both analytic and comprehensive, as they are precise mathematical expressions in terms of the physical-level variables of the MOSFET. Furthermore, they have been verified at the nanometer level using 65 nm BSIM4-based benchmarks and found to be very accurate, with average errors smaller than 5%. Hence, the analysis yields models that are a potential mathematical tool for the statistical and variability-aware analysis and design of subthreshold-MOSFET-based VHF circuits, systems and applications.
Phelps, Michael; Latif, Asad; Thomsen, Robert; Slodzinski, Martin; Raghavan, Rahul; Paul, Sharon Leigh; Stonemetz, Jerry
2017-08-01
Use of an anesthesia information management system (AIMS) has been reported to improve accuracy of recorded information. We tested the hypothesis that analyzing the distribution of times charted on paper and computerized records could reveal possible rounding errors, and that this effect could be modulated by differences in the user interface for documenting certain event times with an AIMS. We compared the frequency distribution of start and end times for anesthesia cases completed with paper records and an AIMS. Paper anesthesia records had significantly more times ending with "0" and "5" compared to those from the AIMS (p < 0.001). For case start times, AIMS still exhibited end-digit preference, with times whose last digits had significantly higher frequencies of "0" and "5" than other integers. This effect, however, was attenuated compared to that for paper anesthesia records. For case end times, the distribution of minutes recorded with AIMS was almost evenly distributed, unlike those from paper records that still showed significant end-digit preference. The accuracy of anesthesia case start times and case end times, as inferred by statistical analysis of the distribution of the times, is enhanced with the use of an AIMS. Furthermore, the differences in AIMS user interface for documenting case start and case end times likely affects the degree of end-digit preference, and likely accuracy, of those times.
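The end-digit preference the study measures can be sketched as a simple count of the final digit of each recorded time. The times below are made-up examples of hand-charted paper records, not the study's data:

```python
from collections import Counter

# Hypothetical hand-charted case times; paper records tend to be rounded
# to the nearest 5 minutes, so final digits cluster on "0" and "5".
times_paper = ["07:30", "08:15", "09:45", "10:00", "10:35", "11:05",
               "11:50", "12:20", "13:15", "14:00", "14:30", "15:55"]

last_digits = Counter(t[-1] for t in times_paper)
share_0_or_5 = (last_digits["0"] + last_digits["5"]) / len(times_paper)

# With no rounding, "0" and "5" together should cover about 2 of 10 cases.
print(share_0_or_5)
```

A χ² goodness-of-fit test of the ten last-digit frequencies against a uniform distribution, applied separately to paper and AIMS records, is the kind of statistical comparison the abstract describes.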
Poppe, L.J.; Eliason, A.H.; Hastings, M.E.
2004-01-01
Measures that describe and summarize sediment grain-size distributions are important to geologists because of the large amount of information contained in textural data sets. Statistical methods are usually employed to simplify the necessary comparisons among samples and to quantify the observed differences. The two statistical methods most commonly used by sedimentologists to describe particle distributions are mathematical moments (Krumbein and Pettijohn, 1938) and inclusive graphics (Folk, 1974). The choice between these statistical measures is typically governed by the amount of data available (Royse, 1970). If the entire distribution is known, the method of moments may be used; if the next-to-last accumulated percent is greater than 95, inclusive graphics statistics can be generated. Unfortunately, earlier programs designed to describe sediment grain-size distributions statistically do not run in a Windows environment, do not allow extrapolation of the distribution's tails, or do not generate both moment and graphic statistics (Kane and Hubert, 1963; Collias et al., 1963; Schlee and Webster, 1967; Poppe et al., 2000). Owing to analytical limitations, electro-resistance multichannel particle-size analyzers, such as Coulter Counters, commonly truncate the tails of the fine-fraction part of grain-size distributions. These devices do not detect fine clay in the 0.6–0.1 μm range (part of the 11-phi and all of the 12-phi and 13-phi fractions). Although size analyses performed down to 0.6 μm are adequate for most freshwater and nearshore marine sediments, samples from many deeper-water marine environments (e.g. rise and abyssal plain) may contain significant material in the fine clay fraction, and these analyses benefit from extrapolation. The program (GSSTAT) described herein generates statistics to characterize sediment grain-size distributions and can extrapolate the fine-grained end of the particle distribution. It is written in Microsoft
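The method-of-moments statistics that programs of this kind compute can be sketched from a weight-percent distribution over phi classes. The class midpoints and weights below are invented for illustration; the formulas are the standard moment measures (mean, sorting, skewness) in phi units:

```python
import math

# Hypothetical weight-percent grain-size distribution over phi classes.
phi_midpoints = [-1.0, 0.0, 1.0, 2.0, 3.0, 4.0]
weight_pct    = [ 5.0, 15.0, 30.0, 25.0, 15.0, 10.0]

total = sum(weight_pct)
# First moment: mean grain size in phi units.
mean_phi = sum(p * w for p, w in zip(phi_midpoints, weight_pct)) / total
# Second moment: variance; its square root is the sorting (standard deviation).
var_phi = sum(w * (p - mean_phi) ** 2 for p, w in zip(phi_midpoints, weight_pct)) / total
sorting = math.sqrt(var_phi)
# Third standardized moment: skewness of the distribution.
skewness = sum(w * (p - mean_phi) ** 3
               for p, w in zip(phi_midpoints, weight_pct)) / (total * sorting ** 3)

print(round(mean_phi, 2), round(sorting, 2), round(skewness, 2))
```

Inclusive graphics statistics (Folk, 1974) would instead be computed from selected percentiles (e.g. φ16, φ50, φ84) of the cumulative curve, which is why they tolerate a truncated fine tail better than the moments do.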
Model for neural signaling leap statistics
Energy Technology Data Exchange (ETDEWEB)
Chevrollier, Martine; Oria, Marcos, E-mail: oria@otica.ufpb.br [Laboratorio de Fisica Atomica e Lasers Departamento de Fisica, Universidade Federal da ParaIba Caixa Postal 5086 58051-900 Joao Pessoa, Paraiba (Brazil)
2011-03-01
We present a simple model for neural signaling leaps in the brain, considering only the thermodynamic (Nernst) potential in neuron cells and the brain temperature. We numerically simulated connections between arbitrarily localized neurons and analyzed the frequency distribution of the distances reached. We observed a qualitative change between normal statistics (T = 37.5 °C, awake regime) and Lévy statistics (T = 35.5 °C, sleeping period), the latter characterized by rare events of long-range connections.
Fuzzy statistical decision-making theory and applications
Kabak, Özgür
2016-01-01
This book offers a comprehensive reference guide to fuzzy statistics and fuzzy decision-making techniques. It provides readers with all the necessary tools for making statistical inference in the case of incomplete information or insufficient data, where classical statistics cannot be applied. The respective chapters, written by prominent researchers, explain a wealth of both basic and advanced concepts including: fuzzy probability distributions, fuzzy frequency distributions, fuzzy Bayesian inference, fuzzy mean, mode and median, fuzzy dispersion, fuzzy p-value, and many others. To foster a better understanding, all the chapters include relevant numerical examples or case studies. Taken together, they form an excellent reference guide for researchers, lecturers and postgraduate students pursuing research on fuzzy statistics. Moreover, by extending all the main aspects of classical statistical decision-making to its fuzzy counterpart, the book presents a dynamic snapshot of the field that is expected to stimu...
Wu, Hao
2018-05-01
In structural equation modelling (SEM), a robust adjustment to the test statistic or to its reference distribution is needed when its null distribution deviates from a χ² distribution, which usually arises when data do not follow a multivariate normal distribution. Unfortunately, existing studies on this issue typically focus on only a few methods and neglect the majority of alternative methods in statistics. Existing simulation studies typically consider only non-normal distributions of data that either satisfy asymptotic robustness or lead to an asymptotically scaled χ² distribution. In this work we conduct a comprehensive study that involves both typical methods in SEM and less well-known methods from the statistics literature. We also propose the use of several novel non-normal data distributions that are qualitatively different from the non-normal distributions widely used in existing studies. We found that several under-studied methods give the best performance under specific conditions, but the Satorra-Bentler method remains the most viable method for most situations. © 2017 The British Psychological Society.
Statistical inference for a class of multivariate negative binomial distributions
DEFF Research Database (Denmark)
Rubak, Ege Holger; Møller, Jesper; McCullagh, Peter
This paper considers statistical inference procedures for a class of models for positively correlated count variables called α-permanental random fields, and which can be viewed as a family of multivariate negative binomial distributions. Their appealing probabilistic properties have earlier been...
Application of Maximum Entropy Distribution to the Statistical Properties of Wave Groups
Institute of Scientific and Technical Information of China (English)
(no author listed)
2007-01-01
New distributions of wave-group statistics based on the maximum entropy principle are presented. The maximum entropy distributions appear to be superior to conventional distributions when applied to a limited amount of information. Their application to wave-group properties shows the effectiveness of the maximum entropy distribution. An FFT filtering method is employed to obtain the wave envelope quickly and efficiently. Comparisons of both the maximum entropy distribution and the distribution of Longuet-Higgins (1984) with laboratory wind-wave data show that the former gives a better fit.
Ultra-stable long distance optical frequency distribution using the Internet fiber network.
Lopez, Olivier; Haboucha, Adil; Chanteau, Bruno; Chardonnet, Christian; Amy-Klein, Anne; Santarelli, Giorgio
2012-10-08
We report an optical link of 540 km for ultrastable frequency distribution over the Internet fiber network. The stable-frequency optical signal is processed to enable uninterrupted propagation in both directions. The robustness and performance of the link are enhanced by a cost-effective, fully automated optoelectronic station. This device is able to coherently regenerate the return optical signal with heterodyne optical phase locking of a low-noise laser diode. Moreover, the incoming signal's polarization variations are tracked and processed in order to maintain beat-note amplitudes within the operating range. A stable fibered optical interferometer enables optical detection of the link's round-trip phase signal. The phase-noise-compensated link shows a fractional frequency instability in a 10 Hz bandwidth of 5 × 10⁻¹⁵ at one second measurement time and 2 × 10⁻¹⁹ at 30,000 s. This work is a significant step towards a sustainable wide-area ultrastable optical frequency distribution and comparison network.
Time-frequency representation of a highly nonstationary signal via the modified Wigner distribution
Zoladz, T. F.; Jones, J. H.; Jong, J.
1992-01-01
A new signal analysis technique called the modified Wigner distribution (MWD) is presented. The new signal processing tool has been very successful in determining time-frequency representations of highly non-stationary multicomponent signals in both simulations and trials involving actual Space Shuttle Main Engine (SSME) high-frequency data. The MWD departs from the classic Wigner distribution (WD) in that it effectively eliminates the cross coupling among positive frequency components in a multiple-component signal. This attribute of the MWD, which prevents the generation of 'phantom' spectral peaks, will undoubtedly increase the utility of the WD for real-world signal analysis applications, which more often than not involve multicomponent signals.
A generalized statistical model for the size distribution of wealth
International Nuclear Information System (INIS)
Clementi, F; Gallegati, M; Kaniadakis, G
2012-01-01
In a recent paper in this journal (Clementi et al 2009 J. Stat. Mech. P02037), we proposed a new, physically motivated, distribution function for modeling individual incomes, having its roots in the framework of the κ-generalized statistical mechanics. The performance of the κ-generalized distribution was checked against real data on personal income for the United States in 2003. In this paper we extend our previous model so as to be able to account for the distribution of wealth. Probabilistic functions and inequality measures of this generalized model for wealth distribution are obtained in closed form. In order to check the validity of the proposed model, we analyze the US household wealth distributions from 1984 to 2009 and conclude an excellent agreement with the data that is superior to any other model already known in the literature. (paper)
Frequency and distribution of Notch mutations in tumor cell lines
International Nuclear Information System (INIS)
Mutvei, Anders Peter; Fredlund, Erik; Lendahl, Urban
2015-01-01
Deregulated Notch signaling is linked to a variety of tumors, and it is therefore important to learn more about the frequency and distribution of Notch mutations in a tumor context. In this report, we use data from the recently developed Cancer Cell Line Encyclopedia to assess the frequency and distribution of Notch mutations in a large panel of cancer cell lines in silico. Our results show that the mutation frequency of Notch receptor and ligand genes is on par with that of established oncogenes and higher than that of a set of house-keeping genes. Mutations were found across all four Notch receptor genes, but with notable differences between protein domains; for example, mutations were more prevalent in the regions encoding the LNR and PEST domains in the Notch intracellular domain. Furthermore, an in silico estimation of functional impact showed that deleterious mutations cluster in the ligand-binding and intracellular domains of NOTCH1. For most cell line groups, the mutation frequency of Notch genes is higher than in the associated primary tumors. Our results shed new light on the spectrum of Notch mutations after in vitro culturing of tumor cells. The higher mutation frequency in tumor cell lines indicates that Notch mutations are associated with a growth advantage in vitro, and thus may be considered driver mutations in a tumor cell line context. The online version of this article (doi:10.1186/s12885-015-1278-x) contains supplementary material, which is available to authorized users.
Statistical distributions of extreme dry spell in Peninsular Malaysia
Zin, Wan Zawiah Wan; Jemain, Abdul Aziz
2010-11-01
Statistical distributions of annual extreme (AE) series and partial duration (PD) series for dry-spell events are analyzed for a database of daily rainfall records of 50 rain-gauge stations in Peninsular Malaysia, with recording periods extending from 1975 to 2004. The three-parameter generalized extreme value (GEV) and generalized Pareto (GP) distributions are considered to model both series. In both cases, the parameters of these two distributions are fitted by means of the L-moments method, which provides robust estimates. The goodness of fit (GOF) between the empirical data and the theoretical distributions is then evaluated by means of the L-moment ratio diagram and several goodness-of-fit tests for each of the 50 stations. It is found that for the majority of stations, the AE and PD series are well fitted by the GEV and GP models, respectively. Based on the models that have been identified, we can reasonably predict the risks associated with extreme dry spells for various return periods.
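As a rough illustration of the fitting-and-quantile workflow (parameter estimation, return-period quantiles, goodness of fit), the sketch below fits a GEV to synthetic annual-maximum dry-spell lengths with scipy. Note the paper estimates parameters by L-moments, whereas scipy.stats provides maximum likelihood; the data and parameter values here are made up.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# synthetic annual-maximum dry-spell lengths (days); the GEV
# parameters below are illustrative, not the Malaysian estimates
true_c, true_loc, true_scale = -0.1, 30.0, 8.0
am_series = stats.genextreme.rvs(true_c, loc=true_loc, scale=true_scale,
                                 size=200, random_state=rng)

# fit a GEV by maximum likelihood (the paper uses L-moments instead,
# which are more robust for short records)
c, loc, scale = stats.genextreme.fit(am_series)

# quantile for a given return period T corresponds to F = 1 - 1/T
T = 50
q50 = stats.genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)

# goodness of fit via the Kolmogorov-Smirnov test
ks = stats.kstest(am_series, 'genextreme', args=(c, loc, scale))
```

The same pattern with `stats.genpareto` covers the partial-duration series; scipy's shape convention for `genextreme` is the negative of the usual ξ.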
Daris, a low-frequency distributed aperture array for radio astronomy in space
Boonstra, A.J.; Saks, N.; Bentum, Marinus Jan; van 't Klooster, K.; Falcke, H.
2010-01-01
DARIS (Distributed Aperture Array for Radio Astronomy in Space) is a radio astronomy space mission concept aimed at observing the low-frequency radio sky in the range 1-10 MHz. Because of the Earth's ionospheric disturbances and opaqueness, this frequency range can only be observed from space. The
A model of seismic focus and related statistical distributions of earthquakes
International Nuclear Information System (INIS)
Apostol, Bogdan-Felix
2006-01-01
A growth model for accumulating seismic energy in a localized seismic focus is described, which introduces a fractional parameter r on geometrical grounds. The model is employed to derive a power-type law for the statistical distribution in energy, where the parameter r contributes to the exponent, as well as the corresponding time and magnitude distributions for earthquakes. The accompanying seismic activity of foreshocks and aftershocks is discussed in connection with this approach, based on Omori distributions, and the rate of released energy is derived.
Karimi, Hamid; O'Brian, Sue; Onslow, Mark; Jones, Mark; Menzies, Ross; Packman, Ann
2013-01-01
Purpose: Stuttering varies between and within speaking situations. In this study, the authors used statistical process control charts with 10 case studies to investigate variability of stuttering frequency. Method: Participants were 10 adults who stutter. The authors counted the percentage of syllables stuttered (%SS) for segments of their speech…
Statistics of meteorology for dose evaluation of crews of nuclear ship
International Nuclear Information System (INIS)
Imai, Kazuhiko; Chino, Masamichi
1981-01-01
For the purpose of dose evaluation for the crews of a nuclear ship, the statistics of wind speed and direction relative to the ship are discussed, using wind data reported from ships cruising the seas around the Japanese islands. Analysis of the data shows that the occurrence frequency of wind speed can be fitted with a γ-distribution having parameter p around 3, and that the wind direction frequency can be treated as a uniform distribution. Using these distributions and taking the ship speed u_s and the long-term mean speed of the natural wind ū as constant parameters, the frequency distribution of wind speed and direction relative to the ship was calculated, and the statistical quantities necessary for dose evaluation were obtained in a way similar to the procedure for reactor sites on land. The 97% value of wind speed u₉₇, which should be used in the dose evaluation for accidental releases, gives conservative doses if it is evaluated as follows: u₉₇ = 0.64 u_s in the cases u_s > ū, and u₉₇ = 0.86 ū in the cases u_s < ū, including u_s = 0. (author)
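The construction of the relative-wind distribution can be illustrated by Monte Carlo: draw γ-distributed natural wind speeds and uniform directions, subtract the ship velocity vectorially, and take a low quantile of the relative speed as a conservative u₉₇-type figure. All numerical values below are illustrative, and the sketch does not attempt to reproduce the paper's 0.64/0.86 coefficients.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# natural wind: gamma-distributed speed (shape p ≈ 3, as in the
# abstract) and uniform direction; mean speed is a made-up value
u_bar = 5.0                      # long-term mean natural wind speed (m/s)
p = 3.0
w = rng.gamma(p, u_bar / p, n)   # gamma with mean u_bar
theta = rng.uniform(0, 2 * np.pi, n)

u_s = 8.0                        # ship speed (m/s), illustrative

# wind velocity relative to the moving ship (vector difference;
# the ship is taken to move along +x)
wx = w * np.cos(theta) - u_s
wy = w * np.sin(theta)
w_rel = np.hypot(wx, wy)

# 3rd percentile of the relative speed: the value exceeded 97% of
# the time, a conservative low-wind-speed figure for dose evaluation
u97 = np.quantile(w_rel, 0.03)
```

Because the ship's own motion adds to the relative wind, the mean relative speed exceeds the mean natural wind speed, which is why the paper's u₉₇ rule switches form depending on whether u_s exceeds ū.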
Kassem, M.; Soize, C.; Gagliardini, L.
2009-06-01
In this paper, an energy-density field approach applied to the vibroacoustic analysis of complex industrial structures in the low- and medium-frequency ranges is presented. This approach uses a statistical computational model. The analyzed system consists of an automotive vehicle structure coupled with its internal acoustic cavity. The objective of this paper is to make use of the statistical properties of the frequency response functions of the vibroacoustic system observed from previous experimental and numerical work. The frequency response functions are expressed in terms of a dimensionless matrix which is estimated using the proposed energy approach. Using this dimensionless matrix, a simplified vibroacoustic model is proposed.
International Nuclear Information System (INIS)
Huang Zhifu; Lin Bihong; Chen Jincan
2009-01-01
In order to overcome the limitations of the original expression of the probability distribution appearing in the literature on incomplete statistics, a new expression of the probability distribution is derived, where the Lagrange multiplier β introduced here is proved to be identical with that introduced in the second and third choices for the internal energy constraint in Tsallis' statistics, and to be equal to the physical inverse temperature. It is expounded that the probability distribution described by the new expression is invariant under uniform translation of the energy spectrum. Moreover, several fundamental thermodynamic relations are given, and the relationship between the new and the original expressions of the probability distribution is discussed.
On the Frequency Distribution of Neutral Particles from Low-Energy Strong Interactions
Directory of Open Access Journals (Sweden)
Federico Colecchia
2017-01-01
The rejection of the contamination, or background, from low-energy strong interactions at hadron collider experiments is a topic that has received significant attention in the field of particle physics. This article builds on a particle-level view of collision events, in line with recently proposed subtraction methods. While conventional techniques in the field usually concentrate on probability distributions, our study is, to our knowledge, the first attempt at estimating the frequency distribution of background particles across the kinematic space inside individual collision events. In fact, while the probability distribution can generally be estimated given a model of low-energy strong interactions, the corresponding frequency distribution inside a single event typically deviates from the average and cannot be predicted a priori. We present preliminary results in this direction and establish a connection between our technique and the particle weighting methods that have been the subject of recent investigation at the Large Hadron Collider.
Davids, J. C.; Rutten, M.; Van De Giesen, N.
2016-12-01
Hydrologic data have traditionally been collected with permanent installations of sophisticated and relatively accurate but expensive monitoring equipment at limited numbers of sites. Consequently, the spatial coverage of the data is limited and costs are high. Achieving adequate maintenance of sophisticated monitoring equipment often exceeds local technical and resource capacity, and permanently deployed monitoring equipment is susceptible to vandalism, theft, and other hazards. Rather than using expensive, vulnerable installations at a few points, SmartPhones4Water (S4W), a form of Citizen Hydrology, leverages widely available mobile technology to gather hydrologic data at many sites in a manner that is repeatable and scalable. However, there is currently a limited understanding of the impact of decreased observational frequency on the accuracy of key streamflow statistics like minimum flow, maximum flow, and runoff. As a first step towards evaluating the tradeoffs between traditional continuous monitoring approaches and emerging Citizen Hydrology methods, we randomly selected 50 active U.S. Geological Survey (USGS) streamflow gauges in California. We used historical 15 minute flow data from 01/01/2008 through 12/31/2014 to develop minimum flow, maximum flow, and runoff values (7 year totals) for each gauge. In order to mimic lower-frequency Citizen Hydrology observations, we developed a bootstrap randomized subsampling with replacement procedure. We calculated the same statistics, along with their respective distributions, from 50 subsample iterations with four different subsampling intervals (i.e. daily, three day, weekly, and monthly). Based on our results we conclude that, depending on the types of questions being asked, and the watershed characteristics, Citizen Hydrology streamflow measurements can provide useful and accurate information. Depending on watershed characteristics, minimum flows were reasonably estimated with subsample intervals ranging from
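The subsampling experiment can be mimicked on synthetic data: take a dense flow series, draw sparse subsamples at a fixed interval with random offsets, and compare the resulting minimum/maximum statistics with the full-record values. This sketch uses made-up lognormal flows, not the USGS records, and a simplified random-offset variant of the paper's bootstrap.

```python
import numpy as np

rng = np.random.default_rng(1)

# synthetic 15-minute flow record (7 years, arbitrary units);
# a stand-in for the USGS gauge data used in the study
n_15min = 7 * 365 * 96
flow = np.exp(rng.normal(1.0, 0.8, n_15min))   # lognormal-ish flows

def subsample_stats(series, step, n_iter=50, rng=rng):
    """Min/max flow statistics from randomly offset low-frequency
    subsamples, mimicking sparse Citizen Hydrology observations."""
    mins, maxs = [], []
    for _ in range(n_iter):
        start = int(rng.integers(0, step))   # random offset into the record
        sub = series[start::step]
        mins.append(sub.min())
        maxs.append(sub.max())
    return np.array(mins), np.array(maxs)

daily_step = 96                  # one observation per day
mins, maxs = subsample_stats(flow, daily_step)
```

Sparse subsamples can only miss extremes, never exceed them, so the spread of the subsampled maxima below the true maximum quantifies the information lost at each observation interval.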
International Nuclear Information System (INIS)
Zambra, M.; Favre, M.; Moreno, J.; Wyndham, E.; Chuaqui, H.; Choi, P.
1998-01-01
The charge formation processes in the hollow cathode region (HCR) of a transient hollow cathode discharge have been studied in the final phase. The statistical distributions that describe the different ionization processes have been represented by Gaussian distributions. Nevertheless, a better representation of these distributions was observed when the pressure is near a minimum value, just before breakdown.
Real-time updating of the flood frequency distribution through data assimilation
Aguilar, Cristina; Montanari, Alberto; Polo, María-José
2017-07-01
We explore the memory properties of catchments for predicting the likelihood of floods based on observations of average flows in pre-flood seasons. Our approach assumes that flood formation is driven by the superimposition of short- and long-term perturbations. The former is given by the short-term meteorological forcing leading to infiltration and/or saturation excess, while the latter is originated by higher-than-usual storage in the catchment. To exploit the above sensitivity to long-term perturbations, a meta-Gaussian model and a data assimilation approach are implemented for updating the flood frequency distribution a season in advance. Accordingly, the peak flow in the flood season is predicted in probabilistic terms by exploiting its dependence on the average flow in the antecedent seasons. We focus on the Po River at Pontelagoscuro and the Danube River at Bratislava. We found that the shape of the flood frequency distribution is noticeably impacted by higher-than-usual flows occurring up to several months earlier. The proposed technique may allow one to reduce the uncertainty associated with the estimation of flood frequency.
Inverter design for high frequency power distribution
King, R. J.
1985-01-01
A class of simple resonantly commutated inverters is investigated for use in a high-power (100 kW - 1000 kW), high-frequency (10 kHz - 20 kHz) AC power distribution system. The Mapham inverter is found to provide a unique combination of large thyristor turn-off angle and good utilization factor, much better than an alternate 'current-fed' inverter. The effects of loading the Mapham inverter entirely with rectifier loads are investigated by simulation and with an experimental 3 kW, 20 kHz inverter. This inverter is found to be well suited to a power system with heavy rectifier loading.
Frequency distribution analysis of the long-lived beta-activity of air dust
International Nuclear Information System (INIS)
Bunzl, K.; Hoetzl, H.; Winkler, R.
1977-01-01
In order to compare the average annual beta activities of air dust, a frequency distribution analysis of the data was carried out to select a representative quantity for the average value of each data group. The data were found to be consistent with a log-normal frequency distribution; therefore the median of the beta activity of each year was calculated, as the representative average, as the antilog of the arithmetic mean of the logarithms, log x, of the analytical values x. The 95% confidence limits were also obtained. The quantities thus calculated are summarized in tabular form. (U.K.)
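The median-as-antilog procedure described above amounts to computing the geometric mean of the data and back-transforming an interval for the mean log. A minimal sketch on synthetic lognormal activities (all parameter values are made up):

```python
import numpy as np

rng = np.random.default_rng(7)

# synthetic long-lived beta activity of air dust (arbitrary units),
# generated lognormally as the abstract found for the real data
x = rng.lognormal(mean=0.5, sigma=0.6, size=52)   # e.g. weekly values

logs = np.log10(x)
log_mean = logs.mean()
median = 10 ** log_mean      # antilog of the mean log = geometric mean

# approximate 95% confidence limits for the median: back-transform
# the interval for the mean log (1.96 assumes a normal sampling
# distribution of the mean, reasonable for ~50 values)
se = logs.std(ddof=1) / np.sqrt(len(x))
lo, hi = 10 ** (log_mean - 1.96 * se), 10 ** (log_mean + 1.96 * se)
```

For lognormal data the geometric mean is always below the arithmetic mean, which is why the median is the more representative average here.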
Single-frequency thulium-doped distributed-feedback fibre laser
DEFF Research Database (Denmark)
Agger, Søren; Povlsen, Jørn Hedegaard; Varming, Poul
2004-01-01
We have successfully demonstrated a single-frequency distributed-feedback (DFB) thulium-doped silica fiber laser emitting at a wavelength of 1735 nm. The laser cavity is less than 5 cm long and is formed by intracore UV-written Bragg gratings with a phase shift. The laser is pumped at 790 nm from a Ti:sapphire laser and has a threshold pump power of 59 mW. The laser has a maximum output power of 1 mW in a single-frequency, single-polarization radiation mode and is tunable over a few nanometers. To the best of the authors' knowledge, this is the first report of a single-frequency DFB fiber laser that uses thulium as the amplifying medium. The lasing wavelength is the longest demonstrated with DFB fiber lasers and yet is among the shortest obtained for thulium-doped silica fiber lasers.
Maximum-likelihood methods for array processing based on time-frequency distributions
Zhang, Yimin; Mu, Weifeng; Amin, Moeness G.
1999-11-01
This paper proposes a novel time-frequency maximum likelihood (t-f ML) method for direction-of-arrival (DOA) estimation of non-stationary signals, and compares this method with conventional maximum likelihood DOA estimation techniques. Time-frequency distributions localize the signal power in the time-frequency domain, and as such enhance the effective SNR, leading to improved DOA estimation. The localization of signals with different t-f signatures permits the division of the time-frequency domain into smaller regions, each containing fewer signals than those incident on the array. The reduction of the number of signals within different time-frequency regions not only reduces the required number of sensors, but also decreases the computational load in multidimensional optimizations. Compared to the recently proposed time-frequency MUSIC (t-f MUSIC), the proposed t-f ML method can be applied in coherent environments, without the need to perform any type of preprocessing that is subject to both array geometry and array aperture.
Frequency and distribution of leakages in steam generators of gas-cooled reactors
International Nuclear Information System (INIS)
Bongratz, R.; Breitbach, G.; Wolters, J.
1988-01-01
In gas-cooled reactors with graphitic primary circuit structures - such as the HTR, AGR or Magnox - water ingress is an event of great safety concern. Water or steam entering the primary circuit reacts with the hot graphite, producing carbon oxides and hydrogen. As the most important initiating event, a leak in a steam generator must be taken into account. From the safety point of view, as well as for availability reasons, it is necessary to construct reliable boilers; the occurrence of a boiler leak should thus be a rare event. In the context of a probabilistic safety study for an HTR project, much effort was invested in obtaining information about the frequency and size distribution of tube failures in steam generators of gas-cooled reactors. The main data base was the boiler tube failure statistics of United Kingdom gas-cooled reactors. The data were selected and applied to a modern HTR steam generator design. A review of the data showed that the failure frequency is not connected with the load level (pressures, temperatures) or with the geometric size of the heating surface of the boiler. Design, construction, fabrication, examination and operating conditions have the greatest influence on the failure frequency, but they are practically impossible to quantify. The typical leak develops from the smallest size; by erosion effects of the entering water or steam it is enlarged to perhaps a few mm², at which point it is usually detected by moisture monitors. Sudden tube breaks were not reported in the investigated period. As a rule, boiler leaks in gas-cooled reactors are much rarer than leaks in steam generators of light water reactors and fossil-fired boilers. (author)
Analysis of room transfer function and reverberant signal statistics
DEFF Research Database (Denmark)
Georganti, Eleftheria; Mourjopoulos, John; Jacobsen, Finn
2008-01-01
For some time now, statistical analysis has been a valuable tool in analyzing room transfer functions (RTFs). This work examines existing statistical time-frequency models and techniques for RTF analysis (e.g., Schroeder's stochastic model and the standard deviation over frequency bands for the RTF magnitude and phase). RTF fractional octave smoothing, as with 1/3-octave analysis, may lead to RTF simplifications that can be useful for several audio applications, like room compensation, room modeling and auralization purposes. The aim of this work is to identify the relationship of the optimal response and the corresponding ratio of the direct and reverberant signal. In addition, this work examines the statistical quantities for speech and audio signals prior to their reproduction within rooms and when recorded in rooms. Histograms and other statistical distributions are used to compare RTF minima of typical
Waiting time distribution revealing the internal spin dynamics in a double quantum dot
Ptaszyński, Krzysztof
2017-07-01
Waiting time distribution and the zero-frequency full counting statistics of unidirectional electron transport through a double quantum dot molecule attached to spin-polarized leads are analyzed using the quantum master equation. The waiting time distribution exhibits a nontrivial dependence on the value of the exchange coupling between the dots and the gradient of the applied magnetic field, which reveals the oscillations between the spin states of the molecule. The zero-frequency full counting statistics, on the other hand, is independent of the aforementioned quantities, thus giving no insight into the internal dynamics. The fact that the waiting time distribution and the zero-frequency full counting statistics give nonequivalent information is associated with two factors. First, it can be explained by the sensitivity to different timescales of the dynamics of the system. Second, it is associated with the presence of correlation between subsequent waiting times, which makes the renewal theory, relating the full counting statistics and the waiting time distribution, no longer applicable. The study highlights the particular usefulness of the waiting time distribution for the analysis of the internal dynamics of mesoscopic systems.
On the statistical mechanics of species abundance distributions.
Bowler, Michael G; Kelly, Colleen K
2012-09-01
A central issue in ecology is that of the factors determining the relative abundance of species within a natural community. The proper application of the principles of statistical physics to species abundance distributions (SADs) shows that simple ecological properties could account for the near universal features observed. These properties are (i) a limit on the number of individuals in an ecological guild and (ii) per capita birth and death rates. They underpin the neutral theory of Hubbell (2001), the master equation approach of Volkov et al. (2003, 2005) and the idiosyncratic (extreme niche) theory of Pueyo et al. (2007); they result in an underlying log series SAD, regardless of neutral or niche dynamics. The success of statistical mechanics in this application implies that communities are in dynamic equilibrium and hence that niches must be flexible and that temporal fluctuations on all sorts of scales are likely to be important in community structure. Copyright © 2012 Elsevier Inc. All rights reserved.
Evaluation of statistical distributions to analyze the pollution of Cd and Pb in urban runoff.
Toranjian, Amin; Marofi, Safar
2017-05-01
Heavy metal pollution in urban runoff causes severe environmental damage. Identification of these pollutants and their statistical analysis is necessary to provide management guidelines. In this study, 45 continuous probability distribution functions were selected to fit the Cd and Pb data in the runoff events of an urban area during October 2014-May 2015. The sampling was conducted from the outlet of the city basin during seven precipitation events. For evaluation and ranking of the functions, we used the Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit tests. The results of the Cd analysis showed that the Hyperbolic Secant, Wakeby and Log-Pearson 3 distributions are suitable for frequency analysis of the event mean concentration (EMC), the instantaneous concentration series (ICS) and the instantaneous concentration of each event (ICEE), respectively. In addition, the LP3, Wakeby and Generalized Extreme Value functions were chosen for the EMC, ICS and ICEE related to Pb contamination.
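The evaluate-and-rank step can be sketched with scipy by fitting a few candidate families to a sample and ordering them by the Kolmogorov-Smirnov statistic. The sample below is synthetic, only four of the study's 45 families are tried, and the Anderson-Darling test is omitted since scipy's `anderson` supports only a handful of distributions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# synthetic event-mean-concentration sample (mg/L); the real Cd/Pb
# data from the paper are not reproduced here
emc = stats.lognorm.rvs(0.7, scale=0.05, size=60, random_state=rng)

# a small subset of the candidate families tried in the study
candidates = ['lognorm', 'gamma', 'weibull_min', 'norm']

results = []
for name in candidates:
    dist = getattr(stats, name)
    params = dist.fit(emc)                      # maximum-likelihood fit
    ks = stats.kstest(emc, name, args=params)   # KS goodness of fit
    results.append((name, ks.statistic))

# rank by the KS statistic: smaller means a better fit
results.sort(key=lambda r: r[1])
best = results[0][0]
```

Ranking by the KS statistic of fitted distributions is a common screening device; strictly, p-values from `kstest` are biased when the parameters are estimated from the same sample.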
Zhang, Y.; Li, F.; Zhang, S.; Hao, W.; Zhu, T.; Yuan, L.; Xiao, F.
2017-09-01
In this paper, the Statistical Distribution based Conditional Random Fields (STA-CRF) algorithm is exploited to improve marginal ice-water classification. Pixel-level ice concentration is presented as a comparison of CRF-based methods. Furthermore, in order to explore the most effective statistical distribution model to integrate into STA-CRF, five statistical distribution models are investigated. The STA-CRF methods are tested on two scenes around Prydz Bay and the Adélie Depression, which contain a variety of ice types during the melt season. Experimental results indicate that the proposed method resolves the sea ice edge well in the Marginal Ice Zone (MIZ) and shows a robust distinction between ice and water.
Statistical analysis of wind speed using two-parameter Weibull distribution in Alaçatı region
International Nuclear Information System (INIS)
Ozay, Can; Celiktas, Melih Soner
2016-01-01
Highlights: • Wind speed and direction data from September 2008 to March 2014 have been analyzed. • The mean wind speed for the whole data set has been found to be 8.11 m/s. • The highest wind speed is observed in July, with a monthly mean value of 9.10 m/s. • The wind speed with the most energy has been calculated as 12.77 m/s. • The observed data have been fit to a Weibull distribution and the k and c parameters have been calculated as 2.05 and 9.16. - Abstract: The Weibull statistical distribution is a common method for analyzing wind speed measurements and determining wind energy potential. The Weibull probability density function can be used to forecast wind speed, wind density and wind energy potential. In this study a two-parameter Weibull statistical distribution is used to analyze the wind characteristics of the Alaçatı region, located in Çeşme, İzmir. The data used in the density function were acquired from a wind measurement station in Alaçatı. Measurements were gathered at three different heights (70, 50 and 30 m) at 10 min intervals for five and a half years. As a result of this study, the wind speed frequency distribution, wind direction trends, mean wind speed, and the shape and scale (k and c) Weibull parameters have been calculated for the region. The mean wind speed for the entire data set is found to be 8.11 m/s, and the k and c parameters are found to be 2.05 and 9.16, respectively. A wind direction analysis along with a wind rose graph for the region is also provided. The analysis suggests that higher wind speeds, ranging from 6 to 12 m/s, are prevalent in the sectors between 340 and 360°, while lower wind speeds, from 3 to 6 m/s, occur in the sectors between 10 and 29°. The results of this study contribute to the general knowledge about the region's wind energy potential and can be used as a source for investors and academics.
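The reported Weibull parameters can be cross-checked against the reported speeds with the standard Weibull wind-statistics formulas: the distribution mean is c·Γ(1+1/k) and the speed carrying the most energy is c·((k+2)/k)^(1/k). Plugging in k = 2.05 and c = 9.16 reproduces the abstract's 8.11 m/s and 12.77 m/s; the most-probable-speed formula is standard but not quoted in the abstract.

```python
from math import gamma

# Weibull parameters reported in the abstract for Alaçatı
k, c = 2.05, 9.16   # shape (-), scale (m/s)

# mean wind speed implied by the Weibull fit
v_mean = c * gamma(1 + 1 / k)

# most probable wind speed (mode of the density) and the speed
# carrying maximum energy; standard Weibull wind-energy formulas
v_mp = c * ((k - 1) / k) ** (1 / k)
v_maxE = c * ((k + 2) / k) ** (1 / k)
```

That v_mean and v_maxE land on the abstract's 8.11 m/s and 12.77 m/s confirms the internal consistency of the reported fit.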
On widths of mass distributions in statistical theory of fission
International Nuclear Information System (INIS)
Volkov, N.G.; Emel'yanov, V.M.
1979-01-01
The process of nucleon tunneling from one fragment to another near the point of compound-nucleus fragmentation has been studied in the model of a two-center oscillator. The effect of the number of transferred nucleons on the mass distribution of fragments is estimated. The sensitivity of the model to the form of the single-particle potential, the excitation energies and the deformation of the fragments is examined. The calculations performed show that it is possible to calculate the mass distributions at the point of fragment contact in the statistical fission model, taking account of the nucleon exchange between fragments.
International Nuclear Information System (INIS)
Alpizar Chavarria, Oscar
2013-01-01
A literature review is conducted to understand distributed generation, the reasons for its introduction into modern power systems, and the distributed generation technologies based on renewable energies that have been installed around the country. The frequency protections of distributed generation equipment under 1 MW are studied according to international standards such as IEEE 1547 and the specifications of equipment manufacturers. The influence of the settings recommended by international standards is investigated for distributed generation systems: the frequency performance they have exhibited under frequency perturbations, as well as the influence they can have on the national and regional electrical system, with different amounts of these technologies included in the national system. The recommended settings are evaluated through simulations in the PSSE program in the context of the behavior of the frequency in the national electric system.
Comparing simulated and theoretical sampling distributions of the U3 person-fit statistic
Emons, W.H.M.; Meijer, R.R.; Sijtsma, K.
2002-01-01
The accuracy with which the theoretical sampling distribution of van der Flier's person-fit statistic U3 approaches the empirical U3 sampling distribution is affected by the item discrimination. A simulation study showed that for tests with a moderate or a strong mean item discrimination, the Type I
International Nuclear Information System (INIS)
Maluckov, Cedomir A.; Karamarkovic, Jugoslav P.; Radovic, Miodrag K.; Pejovic, Momcilo M.
2004-01-01
The convolution-based model of the electrical breakdown time delay distribution is applied for statistical analysis of experimental results obtained in a neon-filled diode tube at 6.5 mbar. First, the numerical breakdown time delay density distributions are obtained by stochastic modeling as the sum of two independent random variables: the electrical breakdown statistical time delay with an exponential distribution, and the discharge formative time with a Gaussian distribution. Then, the single characteristic breakdown time delay distribution is obtained as the convolution of these two random variables with previously determined parameters. These distributions show good correspondence with the experimental distributions, obtained on the basis of 1000 successive and independent measurements. The shape of the distributions is investigated, and the corresponding skewness and kurtosis are plotted in order to follow the transition from a Gaussian to an exponential distribution.
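The two-variable convolution model lends itself to a quick numerical sketch: draw the exponential statistical delay and the Gaussian formative time independently, sum them, and track the skewness and kurtosis of the total delay. All parameter values below are illustrative assumptions, not values from the paper.

```python
import math
import random

random.seed(0)

# Hypothetical parameters (not from the paper): mean statistical delay tau,
# formative-time mean mu and spread sigma, in arbitrary time units.
tau, mu, sigma = 50.0, 200.0, 10.0
n = 1000  # the paper used 1000 successive, independent measurements

# Breakdown time delay modeled as the sum of two independent random variables
t_d = [random.expovariate(1.0 / tau)   # statistical time delay (exponential)
       + random.gauss(mu, sigma)       # discharge formative time (Gaussian)
       for _ in range(n)]

def moments(xs):
    """Sample skewness and excess kurtosis, used to follow the transition
    between the Gaussian (0, 0) and exponential (2, 6) limits."""
    m = sum(xs) / len(xs)
    s = (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5
    skew = sum(((x - m) / s) ** 3 for x in xs) / len(xs)
    kurt = sum(((x - m) / s) ** 4 for x in xs) / len(xs) - 3.0
    return skew, kurt

skew, kurt = moments(t_d)
```

Varying the ratio tau/sigma moves the summed distribution between the two limiting shapes, which is the transition the paper tracks via skewness and kurtosis.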
Lozano-Cortes, Diego
2015-10-29
Coral colony size-frequency distributions can be used to assess population responses to local environmental conditions and disturbances. In this study, we surveyed juvenile pocilloporids, herbivorous fish densities, and algal cover in the central and southern Saudi Arabian Red Sea. We sampled nine reefs with different disturbance histories along a north–south natural gradient of physicochemical conditions (higher salinity and wider temperature fluctuations in the north, and higher turbidity and productivity in the south). Since coral populations with negatively skewed size-frequency distributions have been associated with unfavorable environmental conditions, we expected to find more negative distributions in the southern Red Sea, where corals are potentially experiencing suboptimal conditions. Although juvenile coral and parrotfish densities differed significantly between the two regions, mean colony size and size-frequency distributions did not. Results suggest that pocilloporid colony size-frequency distribution may not be an accurate indicator of differences in biological or oceanographic conditions in the Red Sea.
Lozano-Cortés, Diego F; Berumen, Michael L
2016-04-30
Coral colony size-frequency distributions can be used to assess population responses to local environmental conditions and disturbances. In this study, we surveyed juvenile pocilloporids, herbivorous fish densities, and algal cover in the central and southern Saudi Arabian Red Sea. We sampled nine reefs with different disturbance histories along a north-south natural gradient of physicochemical conditions (higher salinity and wider temperature fluctuations in the north, and higher turbidity and productivity in the south). Since coral populations with negatively skewed size-frequency distributions have been associated with unfavorable environmental conditions, we expected to find more negative distributions in the southern Red Sea, where corals are potentially experiencing suboptimal conditions. Although juvenile coral and parrotfish densities differed significantly between the two regions, mean colony size and size-frequency distributions did not. Results suggest that pocilloporid colony size-frequency distribution may not be an accurate indicator of differences in biological or oceanographic conditions in the Red Sea.
International Nuclear Information System (INIS)
Pinotti, E.; Brenna, M.; Puppin, E.
2008-01-01
In magneto-optical Kerr measurements of the Barkhausen noise, a magnetization jump ΔM due to a domain reversal produces a variation ΔI of the intensity of a laser beam reflected by the sample, which is the physical quantity actually measured. Due to the non-uniform beam intensity profile, the magnitude of ΔI depends both on ΔM and on its position within the laser spot. This could distort the statistical distribution p(ΔI) of the measured ΔI with respect to the true distribution p(ΔM) of the magnetization jumps ΔM. In this work the exact relationship between the two distributions is derived in a general form and applied to some possible beam profiles. It is shown that in most cases the usual Gaussian beam produces a negligible statistical distortion. Moreover, for small ΔI the noise of the experimental setup can also distort the statistical distribution p(ΔI) by erroneously rejecting small ΔI as noise. This effect has been calculated for white noise, and it is shown to be relatively small but not totally negligible as the measured ΔI approaches the detection limit.
Enhanced Recovery Utilizing Variable Frequency Drives and a Distributed Power System
Energy Technology Data Exchange (ETDEWEB)
Randy Peden; Sanjiv Shah
2005-07-26
This report describes the complete results of the project "Enhanced Recovery Utilizing Variable Frequency Drives and a Distributed Power System". This demonstration project was initiated in July 2003 and completed in March 2005. The objective of the project was to develop an integrated power production/variable frequency drive system that could easily be deployed in the oil field to increase production and decrease operating costs. This report describes all the activities that occurred and documents the results of the demonstration.
Statistical Study of Low-Frequency Electromagnetic Cyclotron Waves in the Solar Wind at 1 AU
Zhao, G. Q.; Feng, H. Q.; Wu, D. J.; Liu, Q.; Zhao, Y.; Zhao, A.; Huang, J.
2018-03-01
Electromagnetic cyclotron waves (ECWs) near the proton cyclotron frequency are common wave activities in the solar wind and have attracted much attention in recent years. This paper investigates 82,809 ECWs based on magnetic field data from the Solar Terrestrial Relations Observatory-A mission between 2007 and 2013. Results show that ECWs may last for just a few seconds or persist for several tens of minutes. The time fraction of ECW storms in all solar wind data is about 0.9%; storms are identified with a duration threshold of 10 min, an amplitude criterion of 0.032 nT, and a time-separation limit of 3 min for combining intermittent ECWs. Most ECWs have amplitudes less than 1 nT, while some have large amplitudes comparable to the ambient magnetic field. The distributions of the durations and amplitudes of these ECWs are each characterized by power-law spectra with spectral indices around 4. Statistically, there is a tendency for ECWs with longer durations to have larger amplitudes. Observed ECW properties are time dependent, and the median frequency of left-hand ECWs can be lower than that of right-hand ECWs in some months in the spacecraft frame. The percentage of left-hand ECWs varies over a large range from month to month; it is as low as 26% in one month, though it frequently exceeds 50% in others. Characteristics of ECWs with concurrent polarizations are also examined. The present study contributes to a more complete picture of ECWs in the solar wind.
Comparing simulated and theoretical sampling distributions of the U3 person-fit statistic
Emons, Wilco H.M.; Meijer, R.R.; Sijtsma, Klaas
2002-01-01
The accuracy with which the theoretical sampling distribution of van der Flier’s person-fit statistic U3 approaches the empirical U3 sampling distribution is affected by the item discrimination. A simulation study showed that for tests with a moderate or a strong mean item discrimination, the Type I
Spindle frequency activity in the sleep EEG: individual differences and topographic distribution.
Werth, E; Achermann, P; Dijk, D J; Borbély, A A
1997-11-01
The brain topography of EEG power spectra in the frequency range of sleep spindles was investigated in 34 sleep recordings from 20 healthy young men. Referential (F3-A2, C3-A2, P3-A2 and O1-A2) and bipolar derivations (F3-C3, C3-P3 and P3-O1) along the anteroposterior axis were used. Sleep spindles gave rise to a distinct peak in the EEG power spectrum. The distribution of the peak frequencies pooled over subjects and derivations showed a bimodal pattern with modes at 11.5 and 13.0 Hz, and a trough at 12.25 Hz. The large inter-subject variation in peak frequency (range: 1.25 Hz) contrasted with the small intra-subject variation between derivations, non-REM sleep episodes and different nights. In some individuals and/or some derivations, only a single spindle peak was present. The topographic distributions from referential and bipolar recordings showed differences. The power showed a declining trend over consecutive non-REM sleep episodes in the low range of spindle frequency activity and a rising trend in the high range. The functional and topographic heterogeneity of sleep spindles in conjunction with the intra-subject stability of their frequency are important characteristics for the analysis of sleep regulation on the basis of the EEG.
Statistical Analysis of Video Frame Size Distribution Originating from Scalable Video Codec (SVC)
Directory of Open Access Journals (Sweden)
Sima Ahmadpour
2017-01-01
Full Text Available Designing an effective, high-performance network requires accurate characterization and modeling of network traffic. Modeling of video frame sizes is normally applied in simulation studies, in mathematical analysis, and in generating streams for testing and compliance purposes. Moreover, video traffic is expected to be a major source of multimedia traffic in future heterogeneous networks, so the statistical distribution of video data can be used as an input for network performance modeling. The findings of this paper comprise the theoretical definition of the distribution that appears most relevant to the video trace in terms of its statistical properties, identified using both a graphical method and a hypothesis test. The data set used in this article consists of layered video traces generated with the Scalable Video Codec (SVC) video compression technique from three different movies.
Maris, Eric; van Vugt, Marieke; Kahana, Michael
2011-01-01
Spatially distributed coherent oscillations provide temporal windows of excitability that allow for interactions between distinct neuronal groups. It has been hypothesized that this mechanism for neuronal communication is realized by bursts of high-frequency oscillations that are phase-coupled to a
Allele frequency distribution for 21 autosomal STR loci in Bhutan.
Kraaijenbrink, Thirsa; van Driem, George L; Tshering of Gaselô, Karma; de Knijff, Peter
2007-07-20
We studied the allele frequency distribution of 21 autosomal STR loci contained in the AmpFlSTR Identifiler (Applied Biosystems), the Powerplex 16 (Promega) and the FFFL (Promega) multiplex PCR kits among 936 individuals from the Royal Kingdom of Bhutan. As such these are the first published autosomal DNA results from this country.
Frequency distribution of ABO and Rh (D) blood group alleles in ...
African Journals Online (AJOL)
Kassahun Tesfaye
2014-09-22
Sep 22, 2014 ... Rh (D). Abstract Background: Frequency distribution of blood groups is important as it is used in modern medicine ... sion practice. The need for ... The study design was approved by the Research Ethics Committee, College ...
Alimi, Jean-Michel; de Fromont, Paul
2018-04-01
The statistical properties of cosmic structures are well known to be strong probes for cosmology. In particular, several studies have tried to use cosmic void number counts to obtain tight constraints on dark energy. In this paper, we model the statistical properties of these regions using the CoSphere formalism (de Fromont & Alimi) in both the primordial and the non-linearly evolved Universe in the standard Λ cold dark matter model. This formalism applies similarly to minima (voids) and maxima (such as DM haloes), which are here considered symmetrically. We first derive the full joint Gaussian distribution of CoSphere's parameters in the Gaussian random field. We recover the results of Bardeen et al. only in the limit where the compensation radius becomes very large, i.e. when the central extremum decouples from its cosmic environment. We compute the probability distribution of the compensation size in this primordial field. We show that this distribution is redshift independent and can be used to model the cosmic void size distribution. We also derive the statistical distribution of the peak parameters introduced by Bardeen et al. and discuss their correlation with the cosmic environment. We show that small central extrema with low density are associated with narrow compensation regions with deep compensation density, while higher central extrema are preferentially located in larger but smoother over- or under-massive regions.
Directory of Open Access Journals (Sweden)
Howard A. Gaberson
1995-01-01
Full Text Available This article discusses time-frequency analysis of machinery diagnostic vibration signals. The short-time Fourier transform and the Wigner and Choi–Williams distributions are explained and illustrated with test cases. Examples of Choi–Williams analyses of machinery vibration signals are presented. The analyses detect discontinuities in the signals and their timing, amplitude and frequency modulation, and the presence of different components in a vibration signal.
Directory of Open Access Journals (Sweden)
Xiliang Zheng
2015-04-01
Full Text Available We uncovered the universal statistical laws for the biomolecular recognition/binding process. We quantified the statistical energy landscapes for binding, from which we can characterize the distributions of the binding free energy (affinity, the equilibrium constants, the kinetics and the specificity by exploring the different ligands binding with a particular receptor. The results of the analytical studies are confirmed by the microscopic flexible docking simulations. The distribution of binding affinity is Gaussian around the mean and becomes exponential near the tail. The equilibrium constants of the binding follow a log-normal distribution around the mean and a power law distribution in the tail. The intrinsic specificity for biomolecular recognition measures the degree of discrimination of native versus non-native binding and the optimization of which becomes the maximization of the ratio of the free energy gap between the native state and the average of non-native states versus the roughness measured by the variance of the free energy landscape around its mean. The intrinsic specificity obeys a Gaussian distribution near the mean and an exponential distribution near the tail. Furthermore, the kinetics of binding follows a log-normal distribution near the mean and a power law distribution at the tail. Our study provides new insights into the statistical nature of thermodynamics, kinetics and function from different ligands binding with a specific receptor or equivalently specific ligand binding with different receptors. The elucidation of distributions of the kinetics and free energy has guiding roles in studying biomolecular recognition and function through small-molecule evolution and chemical genetics.
The Effect of Distributed Practice in Undergraduate Statistics Homework Sets: A Randomized Trial
Crissinger, Bryan R.
2015-01-01
Most homework sets in statistics courses are constructed so that students concentrate or "mass" their practice on a certain topic in one problem set. Distributed practice homework sets include review problems in each set so that practice on a topic is distributed across problem sets. There is a body of research that points to the…
Single Frequency Network Based Distributed Passive Radar Technology
Directory of Open Access Journals (Sweden)
Wan Xian-rong
2015-01-01
Full Text Available The research and application of passive radar are moving from single transmitter-receiver pairs to multiple transmitter-receiver pairs. As an important class of illuminators of opportunity, most modern digital broadcasting and television systems work on a Single Frequency Network (SFN), which intrinsically determines that passive radar based on such illuminators must be distributed and networked. In consideration of the distinctive working and processing mode of passive radar under an SFN configuration, this paper proposes the concept of SFN-based Distributed Passive Radar (SDPR). The main characteristics and key problems of SDPR are first described. Then several potential solutions are discussed for some of the key technologies. The feasibility of SDPR is demonstrated by preliminary experimental results. Finally, the concept of four-network convergence, which includes the broadcast-based passive radar network, is conceived, and its application prospects are discussed.
Frequency distribution of the reduced unit cells of centred lattices from the Protein Data Bank.
Swaminathan, Kunchithapadam
2012-03-01
In crystallography, a centred conventional lattice unit cell has its corresponding reduced primitive unit cell. This study presents the frequency distribution of the reduced unit cells of all centred lattice entries of the Protein Data Bank (as of 23 August 2011) in four unit-cell-dimension-based groups and seven interaxial-angle-based subgroups. This frequency distribution is an added layer of support during space-group assignment in new crystals. In addition, some interesting patterns of distribution are discussed as well as how some reduced unit cells could be wrongly accepted as primitive lattices in a different crystal system.
On the Estimation and Use of Statistical Modelling in Information Retrieval
DEFF Research Database (Denmark)
Petersen, Casper
Automatic text processing often relies on assumptions about the distribution of some property (such as term frequency) in the data being processed. In information retrieval (IR), such assumptions may be attributed to (i) the absence of principled approaches for determining the correct statistical...... that assumptions regarding the distribution of dataset properties can be replaced with an effective, efficient and principled method for determining the best-fitting distribution, and that using this distribution can lead to improved retrieval performance.
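The "best-fitting distribution" idea can be illustrated with a minimal likelihood-based model selection: compute the log-likelihood of each candidate distribution on the data and keep the winner. The candidates (a geometric distribution versus a fixed-exponent Zipf) and the synthetic data below are assumptions for illustration, not the thesis's actual method or corpus.

```python
import math
import random

random.seed(7)

def sample_geometric(p, n):
    # Inverse-CDF sampling of a geometric distribution on {1, 2, ...};
    # 1 - random() lies in (0, 1], avoiding log(0)
    return [int(math.log(1.0 - random.random()) / math.log(1.0 - p)) + 1
            for _ in range(n)]

# Synthetic "term frequency" data with a known generating distribution
xs = sample_geometric(0.2, 2000)

def loglik_geometric(xs):
    # MLE for the geometric distribution on {1, 2, ...}: p = 1 / mean
    p = len(xs) / sum(xs)
    return sum(math.log(p) + (x - 1) * math.log(1.0 - p) for x in xs)

def loglik_zipf(xs, s=2.0, n_max=100000):
    # Zipf with fixed exponent s over a truncated support (a simplification)
    z = sum(k ** -s for k in range(1, n_max + 1))  # truncated normalizer
    return sum(-s * math.log(x) - math.log(z) for x in xs)

candidates = {"geometric": loglik_geometric(xs), "zipf": loglik_zipf(xs)}
best = max(candidates, key=candidates.get)
```

With data actually drawn from a geometric distribution, the selection procedure recovers the geometric candidate; on real term-frequency data the ranking would of course depend on the corpus.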
Directory of Open Access Journals (Sweden)
Xiao Qi
2018-06-01
Full Text Available With large-scale integration of electric vehicles, this paper investigates the load frequency control problem in an islanded microgrid with plug-in electric vehicles (PEVs), which can be regarded as mobile battery energy storage providing a valuable contribution to frequency regulation. A novel fully distributed control strategy is proposed to achieve fast frequency regulation of islanded microgrids and effective coordination control of distributed energy sources. Firstly, distributed control based on an improved linear active disturbance rejection algorithm is realized through a multi-agent system, greatly enhancing the anti-disturbance capability of the microgrid. Then, in order to guarantee the effectiveness of PEVs in frequency regulation, the PEVs are controlled following the controllable power rate (CPR) calculated from the consensus-based multi-agent system. Furthermore, the system control structure is designed to avoid the negative effects caused by system communication time delay. Finally, numerical simulations under different disturbances are carried out to demonstrate the effectiveness of the proposed control strategy in comparison with previous control strategies.
Electron energy distributions and excitation rates in high-frequency argon discharges
International Nuclear Information System (INIS)
Ferreira, C.M.; Loureiro, J.
1983-06-01
The electron energy distribution functions and rate coefficients for excitation and ionisation in argon under the action of a uniform high-frequency electric field were calculated by numerically solving the homogeneous Boltzmann equation. Analytic calculations in the limiting cases ω >> νsub(c) and ω << νsub(c), where ω is the wave angular frequency and νsub(c) is the electron-neutral collision frequency for momentum transfer, are also presented and shown to be in very good agreement with the numerical computations. The results reported here are relevant for the modelling of high-frequency discharges in argon and, in particular, for improving recent theoretical descriptions of a plasma column sustained by surface microwaves. The properties of surface-wave-produced plasmas make them interesting as possible substitutes for other more conventional plasma sources in such important applications as plasma chemistry, laser excitation, plasma etching, spectroscopic sources, etc.
Frequency distribution of Radium-226, Thorium-228 and Potassium-40 concentration in ploughed soils
International Nuclear Information System (INIS)
Drichko, V.F.; Krisyuk, B.E.; Travnikova, I.G.; Lisachenko, E.P.; Dubenskaya, M.A.
1977-01-01
The distribution laws of Ra-226, Th-228 and K-40 concentrations in podsol, chernozem and saline soils are considered. Radionuclide concentrations were determined by a gamma-spectrometric method in samples taken from the arable soil layer according to the generally accepted agrotechnical procedure. The measuring procedure is described. The results show that the frequency distributions of radionuclide concentrations transform from an asymmetric form in normal coordinates into a symmetric form in logarithmic coordinates, which substantiates the use of the lognormal law to describe the frequency distributions of concentrations. The values of the distribution parameters are given. The analysis of the data obtained establishes that Ra-226 and Th-228 concentrations in soils are distributed lognormally, and K-40 concentrations both normally and lognormally. In order of decreasing mean concentration of Ra-226 and Th-228, the soils rank: chernozems = chernozem salterns > podsols; in order of decreasing standard deviation: podsols > chernozems = salterns. Both the standard deviation and the distribution type must be determined for a full characterization of the studied soil radioactivity.
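The diagnostic used above (a distribution that is asymmetric in normal coordinates but symmetric in logarithmic coordinates points to the lognormal law) can be sketched with synthetic data; the "concentration" values below are invented for illustration, not measurements from the paper.

```python
import math
import random

random.seed(1)

# Synthetic "concentration" values (hypothetical, arbitrary units), drawn
# lognormally, as the paper reports for Ra-226 and Th-228 in soils
data = [math.exp(random.gauss(3.0, 0.4)) for _ in range(500)]

def skewness(xs):
    # Sample skewness: zero for a symmetric distribution, positive for a
    # right-skewed one
    m = sum(xs) / len(xs)
    s = (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5
    return sum(((x - m) / s) ** 3 for x in xs) / len(xs)

# Asymmetric in normal coordinates, symmetric in logarithmic coordinates
skew_raw = skewness(data)
skew_log = skewness([math.log(x) for x in data])
```

For lognormal data the raw skewness is clearly positive while the skewness of the log-transformed values is near zero, which is exactly the transformation-to-symmetry the paper uses to justify the lognormal description.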
Fuel rod design by statistical methods for MOX fuel
International Nuclear Information System (INIS)
Heins, L.; Landskron, H.
2000-01-01
Statistical methods in fuel rod design have received more and more attention in recent years. One possible way to use statistical methods in fuel rod design can be described as follows: Monte Carlo calculations are performed using the fuel rod code CARO. For each run with CARO, the set of input data is modified: parameters describing the design of the fuel rod (geometrical data, density, etc.) and modeling parameters are randomly selected according to their individual distributions. Power histories are varied systematically so that each power history of the relevant core management calculation is represented in the Monte Carlo calculations with equal frequency. The frequency distributions of results such as rod internal pressure and cladding strain generated by the Monte Carlo calculation are evaluated and compared with the design criteria. Up to now, this methodology has been applied to licensing calculations for PWRs and BWRs, with UO2 and MOX fuel, in three countries. Especially for the insertion of MOX fuel, which results in power histories with relatively high linear heat generation rates at higher burnup, the statistical methodology is an appropriate approach to demonstrate compliance with licensing requirements. (author)
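The Monte Carlo workflow described above can be sketched generically. The real analysis runs the fuel rod code CARO; in the sketch a made-up surrogate response stands in for it, and the parameter distributions, power histories and design limit are all hypothetical placeholders.

```python
import random

random.seed(2)

def rod_internal_pressure(gap, density, power):
    # Invented surrogate response (NOT a physical model, NOT CARO):
    # stands in for one fuel rod code run
    return 4.0 + 30.0 * (density - 0.94) + 2.0 * power / (gap * 100.0)

def sample_case():
    # Design and modeling parameters drawn from their (assumed) distributions
    gap = random.gauss(0.17, 0.01)        # pellet-cladding gap, mm
    density = random.gauss(0.95, 0.005)   # fraction of theoretical density
    # Each power history of the core management calculation represented
    # with equal frequency
    power = random.choice([18.0, 21.0, 24.0])  # kW/m, hypothetical
    return rod_internal_pressure(gap, density, power)

# Build the frequency distribution of the result and compare with a criterion
pressures = [sample_case() for _ in range(10000)]
limit = 7.0  # hypothetical design criterion, MPa
fraction_ok = sum(p < limit for p in pressures) / len(pressures)
```

The output histogram of `pressures`, compared against `limit`, plays the role of the rod-internal-pressure frequency distribution compared against the design criterion in the licensing calculation.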
A New Quantum Key Distribution Scheme Based on Frequency and Time Coding
International Nuclear Information System (INIS)
Chang-Hua, Zhu; Chang-Xing, Pei; Dong-Xiao, Quan; Jing-Liang, Gao; Nan, Chen; Yun-Hui, Yi
2010-01-01
A new scheme of quantum key distribution (QKD) using frequency and time coding is proposed, in which the security is based on the frequency-time uncertainty relation. In this scheme, the binary information sequence is encoded randomly on either the central frequency or the time delay of the optical pulse at the sender. When frequency coding is selected, the central frequency of the single-photon pulse is set to ω1 for bit 0 and ω2 for bit 1. When time coding is selected, the single-photon pulse is not delayed for bit 0 and is delayed by τ for bit 1. At the receiver, either the frequency or the time delay of the pulse is measured randomly, and the final key is obtained after basis comparison, data reconciliation and privacy amplification. With the proposed method, the effect of noise in the fiber channel and environment on the QKD system can be reduced effectively.
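The random basis choice and subsequent basis comparison can be sketched with a toy sifting simulation: the sender encodes each bit in either the frequency basis (ω1/ω2) or the time basis (no delay/τ), the receiver measures a randomly chosen basis, and only rounds with matching bases survive. This is a generic sifting illustration under ideal assumptions (no channel noise, no eavesdropper), not the authors' full protocol.

```python
import random

random.seed(5)

n = 10000
kept = []
for _ in range(n):
    bit = random.randint(0, 1)
    send_basis = random.choice(["frequency", "time"])   # sender's random coding choice
    meas_basis = random.choice(["frequency", "time"])   # receiver's random measurement
    if send_basis == meas_basis:
        # Bases announced and compared publicly; on an ideal channel the
        # measured bit equals the sent bit
        kept.append(bit)

sift_rate = len(kept) / n  # about half of the rounds survive basis comparison
```

Data reconciliation and privacy amplification would then operate on `kept` to produce the final key.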
Statistical study of ion pitch-angle distributions
International Nuclear Information System (INIS)
Sibeck, D.G.; Mcentire, R.W.; Lui, A.T.Y.; Krimigis, S.M.
1987-01-01
Preliminary results of a statistical study of energetic (34-50 keV) ion pitch-angle distributions (PADs) within 9 Re of earth provide evidence for an orderly pattern consistent with both drift-shell splitting and magnetopause shadowing. Normal ion PADs dominate the dayside and inner magnetosphere. Butterfly PADs typically occur in a narrow belt stretching from dusk to dawn through midnight, where they approach within 6 Re of earth. While those ion butterfly PADs that typically occur on closed drift paths are mainly caused by drift-shell splitting, there is also evidence for magnetopause shadowing in observations of more frequent butterfly PAD occurrence in the outer magnetosphere near dawn than dusk. Isotropic and gradient boundary PADs terminate the tailward extent of the butterfly ion PAD belt. 9 references
International Nuclear Information System (INIS)
Dobierzewska-Mozrzymas, E.; Szymczak, G.; Bieganski, P.; Pieciul, E.
2003-01-01
The ranges of statistical description of systems may be determined on the basis of Mandelbrot's inverse power law. The slope of the straight line representing the power law in a double-logarithmic plot, written as -1/μ (μ being a critical exponent), characterizes the distribution of elements in the system. In this paper, the inverse power law is used to describe the statistical distribution of discontinuous metal films with higher coverage coefficients (near the percolation threshold). For these films the critical exponent is μ ∼ 1, and both the mean value and the variance are infinite. Objects with such a microstructure are described by Lévy distributions: the Cauchy, inverse Gaussian and inverse gamma distributions, respectively. The experimental histograms are compared with the calculated ones. Inhomogeneous metal films were obtained experimentally, and their microstructures were examined by electron microscopy. On the basis of electron micrographs, the fractal dimensions were determined for metal films with coverage coefficients ranging from 0.35 to 1.00.
Statistical distributions of avalanche size and waiting times in an inter-sandpile cascade model
Batac, Rene; Longjas, Anthony; Monterola, Christopher
2012-02-01
Sandpile-based models have successfully shed light on key features of nonlinear relaxational processes in nature, particularly the occurrence of fat-tailed magnitude distributions and exponential return times, from simple local stress redistributions. In this work, we extend the existing sandpile paradigm into an inter-sandpile cascade, wherein the avalanches emanating from a uniformly driven sandpile (first layer) are used to trigger the next (second layer), and so on, in a successive fashion. Statistical characterizations reveal that avalanche size distributions evolve from a power law p(S) ∝ S^-1.3 for the first layer to gamma distributions p(S) ∝ S^α exp(-S/S0) for layers far away from the uniformly driven sandpile. The resulting avalanche size statistics is found to be associated with the corresponding waiting time distribution, as explained in an accompanying analytic formulation. Interestingly, both the numerical and analytic models show good agreement with actual inventories of non-uniformly driven events in nature.
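The contrast between the two reported forms, a power law for the first layer and a gamma distribution deeper in the cascade, can be illustrated by sampling from each and comparing their tails. The exponents, cutoffs and sample sizes below are assumptions for illustration only, not fits from the paper.

```python
import random

random.seed(3)

def sample_truncated_power_law(alpha, s_min, s_max):
    # Inverse-CDF sampling of p(S) ∝ S^-alpha on [s_min, s_max]
    # (truncation keeps the density normalizable for alpha near 1)
    u = random.random()
    a = 1.0 - alpha
    return (s_min ** a + u * (s_max ** a - s_min ** a)) ** (1.0 / a)

# Layer 1: fat-tailed power law, p(S) ∝ S^-1.3 (truncated here)
pl = [sample_truncated_power_law(1.3, 1.0, 1e6) for _ in range(20000)]

# Deep layers: gamma distribution, p(S) ∝ S^alpha exp(-S / S0)
gm = [random.gammavariate(2.0, 10.0) for _ in range(20000)]

# The power law's extreme events dwarf the gamma's exponentially cut tail
ratio = max(pl) / max(gm)
```

Even with comparable typical sizes, the largest power-law avalanche exceeds the largest gamma-distributed one by orders of magnitude, which is the qualitative change in the size statistics across the cascade layers.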
Current state of the art for statistical modeling of species distributions [Chapter 16
Troy M. Hegel; Samuel A. Cushman; Jeffrey Evans; Falk Huettmann
2010-01-01
Over the past decade the number of statistical modelling tools available to ecologists to model species' distributions has increased at a rapid pace (e.g. Elith et al. 2006; Austin 2007), as have the number of species distribution models (SDM) published in the literature (e.g. Scott et al. 2002). Ten years ago, basic logistic regression (Hosmer and Lemeshow 2000)...
Allele frequency distribution for 21 autosomal STR loci in Nepal.
Kraaijenbrink, T; van Driem, G L; Opgenort, J R M L; Tuladhar, N M; de Knijff, P
2007-05-24
The allele frequency distributions of 21 autosomal loci contained in the AmpFlSTR Identifiler, the Powerplex 16 and the FFFL multiplex PCR kits were studied in 953 unrelated individuals from Nepal. Several new alleles (i.e. not yet reported in the NIST Short Tandem Repeat DNA Internet DataBase [http://www.cstl.nist.gov/biotech/strbase/]) were detected in the process.
Statistical mechanics for a class of quantum statistics
International Nuclear Information System (INIS)
Isakov, S.B.
1994-01-01
Generalized statistical distributions for identical particles are introduced for the case where the filling of a single-particle quantum state depends on the filling of states of different momenta. The system of one-dimensional bosons with a two-body potential that can be solved by means of the thermodynamic Bethe ansatz is shown to be thermodynamically equivalent to a system of free particles obeying statistical distributions of the above class. The quantum statistics arising in this way are completely determined by the two-particle scattering phases of the corresponding interacting systems. An equation determining the statistical distributions for these statistics is derived.
Extreme value statistics and thermodynamics of earthquakes: aftershock sequences
Directory of Open Access Journals (Sweden)
B. H. Lavenda
2000-06-01
Full Text Available The Gutenberg-Richter magnitude-frequency law takes into account the minimum detectable magnitude and treats aftershocks as if they were independent and identically distributed random events. A new magnitude-frequency relation is proposed which takes into account the magnitude of the main shock; the degree to which aftershocks depend on the main shock makes them appear clustered. In certain cases, there can be two branches in the order statistics of aftershock sequences: for energies below threshold, the Pareto law applies and the asymptotic distribution of magnitude is the double-exponential distribution, while energies above threshold follow a one-parameter beta distribution, whose exponent is the cluster dimension, and the asymptotic Gompertz distribution predicts a maximum magnitude. The 1957 Aleutian Islands aftershock sequence exemplifies such dual behavior. A thermodynamics of aftershocks is constructed on the analogy between the non-conservation of the number of aftershocks and that of the particle number in degenerate gases.
Tidal controls on earthquake size-frequency statistics
Ide, S.; Yabe, S.; Tanaka, Y.
2016-12-01
The possibility that tidal stresses can trigger earthquakes is a long-standing issue in seismology. Except in some special cases, a causal relationship between seismicity and the phase of tidal stress has been rejected on the basis of studies using many small events. However, recently discovered deep tectonic tremors are highly sensitive to tidal stress levels, with the relationship being governed by a nonlinear law according to which the tremor rate increases exponentially with increasing stress; thus, slow deformation (and the probability of earthquakes) may be enhanced during periods of large tidal stress. Here, we show the influence of tidal stress on seismicity by calculating histories of tidal shear stress during the 2-week period before earthquakes. Very large earthquakes tend to occur near the time of maximum tidal stress, but this tendency is not obvious for small earthquakes. Rather, we found that tidal stress controls the earthquake size-frequency statistics; i.e., the fraction of large events increases (i.e. the b-value of the Gutenberg-Richter relation decreases) as the tidal shear stress increases. This correlation is apparent in data from the global catalog and in relatively homogeneous regional catalogues of earthquakes in Japan. The relationship is also reasonable, considering the well-known relationship between stress and the b-value. Our findings indicate that the probability of a tiny rock failure expanding to a gigantic rupture increases with increasing tidal stress levels. This finding has clear implications for probabilistic earthquake forecasting.
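A b-value calculation of the kind underlying this correlation is commonly done with Aki's maximum-likelihood estimator, b = log10(e) / (mean(M) - Mc), where Mc is the completeness magnitude. The sketch below checks the estimator on a synthetic Gutenberg-Richter catalog; this is the standard estimator, not necessarily the authors' exact procedure.

```python
import math
import random

random.seed(4)

def b_value(mags, mc):
    """Aki's maximum-likelihood b-value for magnitudes at or above mc."""
    above = [m for m in mags if m >= mc]
    return math.log10(math.e) / (sum(above) / len(above) - mc)

# Synthetic catalog: magnitudes above Mc = 2.0 follow the Gutenberg-Richter
# law with b = 1.0, i.e. M - Mc is exponential with rate b * ln(10)
mc, b_true = 2.0, 1.0
beta = b_true * math.log(10.0)
mags = [mc + random.expovariate(beta) for _ in range(5000)]

b_hat = b_value(mags, mc)
```

In the study's setting, one would bin the catalog by tidal shear stress level and compute `b_value` per bin; the reported correlation corresponds to `b_hat` decreasing in the high-stress bins.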
Energy Technology Data Exchange (ETDEWEB)
Paul, D. [SSBB and Senior Member-ASQ, Kolkata (India); Mandal, S.N. [Kalyani Govt Engg College, Kalyani (India); Mukherjee, D.; Bhadra Chaudhuri, S.R. [Dept of E. and T. C. Engg, B.E.S.U., Shibpur (India)
2010-10-15
System efficiency and payback time have yet to attain a commercially viable level for solar photovoltaic energy projects. Despite major advances in the prediction of solar radiation data, there remains a gap in extracting pertinent information from such data, so the available data cannot be effectively utilized in engineering applications. This acts as a barrier for the emerging technology. To make accurate engineering and financial calculations for any solar energy project, it is crucial to identify and optimize the most significant statistic(s) representing the insolation available to the photovoltaic setup at the installation site. The Quality Function Deployment (QFD) technique has been applied to identify the statistics that are of high significance from a project designer's point of view. A MATLAB program has been used to build the annual frequency distribution of hourly insolation over any module plane at a given location. Descriptive statistical analysis of such distributions is done through MINITAB. For a Building Integrated Photovoltaic (BIPV) installation, a similar statistical analysis has been carried out for the composite frequency distribution, which is formed by weighted summation of the insolation distributions for the different module planes used in the installation. The most influential statistics of the composite distribution have been optimized through Artificial Neural Network computation. This approach is expected to open up a new horizon in BIPV system design. (author)
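The weighted-summation step behind the BIPV composite distribution can be sketched in a few lines. The plane names, hourly insolation samples, and area weights below are invented for illustration; in practice the per-plane histograms would come from the annual simulation described above.

```python
from collections import Counter

def histogram(values, bin_width=100):
    """Frequency distribution of hourly insolation values (W/m^2 bins)."""
    return Counter((v // bin_width) * bin_width for v in values)

# (hourly W/m^2 samples, area weight) per module plane -- illustrative only
planes = {
    "south_facade": ([420, 510, 640, 705, 560], 0.6),
    "east_facade": ([280, 350, 410, 390, 300], 0.4),
}

# Composite distribution = weighted sum of the per-plane histograms
composite = Counter()
for samples, weight in planes.values():
    for b, count in histogram(samples).items():
        composite[b] += weight * count
```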
Coordinated control of distributed energy resources to support load frequency control
International Nuclear Information System (INIS)
Ravikumar Pandi, V.; Al-Hinai, A.; Feliachi, Ali
2015-01-01
Highlights: • We aim to maintain feeder power flow by the coordination of DER units. • The error in feeder flow with respect to the scheduled value is used by the controller. • Particle swarm optimization is employed to minimize the error in feeder flow. • Implemented on a transmission system along with a 37-bus distribution feeder. • The results of the proposed feeder control are compared with a no-feeder-control scheme. - Abstract: The control of generating resources to follow unscheduled load changes is an essential process in the power system for maintaining the frequency of the power supply. This load frequency control (LFC) problem has gained importance in the recent smart grid environment because of the impact of the high penetration of distributed energy resources (DER) installed at the distribution level. Renewable sources are highly intermittent in nature, so the DER units must be coordinated and controlled to maintain the feeder power flow at the substation bus bar, which is seen by the transmission system operator during the LFC process. This paper aims to identify the impact of distributed generation and its control method to reduce the deviation of the feeder power flow from the scheduled value in real-time operation. The error in feeder power flow with respect to the scheduled value is used by a PI controller to estimate the change in the power reference of all DER units. The power outputs of the DER units are maintained at the reference values by individual PI controllers. A particle swarm optimization algorithm is employed to minimize the error in feeder power flow by optimally tuning the gain values of all PI controllers. The proposed method is examined on a small transmission system along with the feeder of the IEEE 37-bus distribution system under balanced loading conditions. The complete system along with the DER units is implemented in the MATLAB-based stability package named Power Analysis Toolbox (PAT) for performing time-domain simulations.
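The gain-tuning step can be illustrated with a minimal particle swarm optimizer. The feeder-error surface below is a toy quadratic stand-in for the time-domain simulation the paper uses, with an assumed optimum placed at (Kp, Ki) = (2.0, 0.5); none of these values come from the paper.

```python
import random

random.seed(1)

def feeder_error(kp, ki):
    """Toy surrogate for feeder-flow error as a function of PI gains."""
    return (kp - 2.0) ** 2 + 4.0 * (ki - 0.5) ** 2

def pso(cost, n=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Minimal 2-D particle swarm: inertia w, cognitive c1, social c2."""
    pos = [[random.uniform(0, 5), random.uniform(0, 2)] for _ in range(n)]
    vel = [[0.0, 0.0] for _ in range(n)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=lambda p: cost(*p))[:]
    for _ in range(iters):
        for i in range(n):
            for d in range(2):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if cost(*pos[i]) < cost(*pbest[i]):
                pbest[i] = pos[i][:]
                if cost(*pos[i]) < cost(*gbest):
                    gbest = pos[i][:]
    return gbest

kp, ki = pso(feeder_error)
```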
International Nuclear Information System (INIS)
Lu Han-Han; Xu Jing-Ping; Liu Lu; Lai Pui-To; Tang Wing-Man
2016-01-01
An equivalent distributed capacitance model is established by considering only the gate oxide-trap capacitance to explain the frequency dispersion in the C–V curve of MOS capacitors measured for a frequency range from 1 kHz to 1 MHz. The proposed model is based on the Fermi–Dirac statistics and the charging/discharging effects of the oxide traps induced by a small ac signal. The validity of the proposed model is confirmed by the good agreement between the simulated results and experimental data. Simulations indicate that the capacitance dispersion of an MOS capacitor under accumulation and near flatband is mainly caused by traps adjacent to the oxide/semiconductor interface, with negligible effects from the traps far from the interface, and the relevant distance from the interface at which the traps can still contribute to the gate capacitance is also discussed. In addition, by excluding the negligible effect of oxide-trap conductance, the model avoids the use of imaginary numbers and complex calculations, and thus is simple and intuitive. (paper)
Conditional maximum-entropy method for selecting prior distributions in Bayesian statistics
Abe, Sumiyoshi
2014-11-01
The conditional maximum-entropy method (abbreviated here as C-MaxEnt) is formulated for selecting prior probability distributions in Bayesian statistics for parameter estimation. This method is inspired by a statistical-mechanical approach to systems governed by dynamics with largely separated time scales and is based on three key concepts: conjugate pairs of variables, dimensionless integration measures with coarse-graining factors and partial maximization of the joint entropy. The method enables one to calculate a prior purely from a likelihood in a simple way. It is shown, in particular, how it not only yields Jeffreys's rules but also reveals new structures hidden behind them.
International Nuclear Information System (INIS)
Mayaud, P.N.
1976-01-01
Because of the various components of positive conservation existing in the series of aa indices, their frequency distribution is necessarily distorted with respect to any random distribution. However, when these various components are taken into account, the observed distribution can be considered a lognormal distribution. This implies that geomagnetic activity satisfies the conditions of the central limit theorem, according to which a phenomenon presenting such a distribution is due to independent causes whose effects are multiplicative. Furthermore, the distortion of the frequency distribution caused by the 11-year and 90-year cycles corresponds to a pure attenuation effect; an interpretation in terms of solar 'coronal holes' is proposed [fr
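The multiplicative-causes reading of the central limit theorem can be checked numerically: a product of many independent positive factors has an approximately normal logarithm, hence a lognormal distribution. The factor model below is purely illustrative, not a model of the aa indices.

```python
import math
import random
import statistics

random.seed(2)

def activity():
    """Product of 30 independent positive factors (illustrative only):
    log(activity) is then a sum, which the CLT drives toward normality."""
    return math.prod(random.uniform(0.5, 1.5) for _ in range(30))

# Near-zero skewness of the logs indicates an approximately lognormal variable
logs = [math.log(activity()) for _ in range(20000)]
mean, sd = statistics.fmean(logs), statistics.stdev(logs)
skew = statistics.fmean(((x - mean) / sd) ** 3 for x in logs)
```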
Yigzaw, Kassaye Yitbarek; Michalas, Antonis; Bellika, Johan Gustav
2017-01-03
Techniques have been developed to compute statistics on distributed datasets without revealing private information except the statistical results. However, duplicate records in a distributed dataset may lead to incorrect statistical results. Therefore, to increase the accuracy of the statistical analysis of a distributed dataset, secure deduplication is an important preprocessing step. We designed a secure protocol for the deduplication of horizontally partitioned datasets with deterministic record linkage algorithms. We provided a formal security analysis of the protocol in the presence of semi-honest adversaries. The protocol was implemented and deployed across three microbiology laboratories located in Norway, and we ran experiments on the datasets in which the number of records for each laboratory varied. Experiments were also performed on simulated microbiology datasets and data custodians connected through a local area network. The security analysis demonstrated that the protocol protects the privacy of individuals and data custodians under a semi-honest adversarial model. More precisely, the protocol remains secure with the collusion of up to N - 2 corrupt data custodians. The total runtime for the protocol scales linearly with the addition of data custodians and records. One million simulated records distributed across 20 data custodians were deduplicated within 45 s. The experimental results showed that the protocol is more efficient and scalable than previous protocols for the same problem. The proposed deduplication protocol is efficient and scalable for practical uses while protecting the privacy of patients and data custodians.
National Research Council Canada - National Science Library
Basu, Bamandas
2008-01-01
... (to the ambient magnetic field) flow velocities associated with the current. In order to illustrate the distinguishing features of the kappa distributions, stability properties of the low frequency...
Environmental radionuclide concentrations: statistical model to determine uniformity of distribution
International Nuclear Information System (INIS)
Cawley, C.N.; Fenyves, E.J.; Spitzberg, D.B.; Wiorkowski, J.; Chehroudi, M.T.
1980-01-01
In the evaluation of data from environmental sampling and measurement, a basic question is whether the radionuclide (or pollutant) is distributed uniformly. Since physical measurements have associated errors, it is inappropriate to consider the measurements alone in this determination. Hence, a statistical model has been developed. It consists of a weighted analysis of variance with subsequent t-tests between weighted and independent means. A computer program to perform the calculations is included.
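A minimal version of the weighted comparison described here, with invented measurement values and errors: each measurement is weighted by its inverse variance, and the two weighted site means are compared with a simple test statistic.

```python
import math

def weighted_mean(values, errors):
    """Inverse-variance weighted mean and its standard error."""
    w = [1.0 / e ** 2 for e in errors]
    mean = sum(wi * v for wi, v in zip(w, values)) / sum(w)
    se = math.sqrt(1.0 / sum(w))
    return mean, se

# Illustrative concentrations and measurement errors at two sites
site_a = weighted_mean([5.2, 5.0, 5.4], [0.2, 0.3, 0.2])
site_b = weighted_mean([5.1, 5.3, 5.2], [0.2, 0.2, 0.3])

# Test statistic for the difference of the two weighted means
t_stat = (site_a[0] - site_b[0]) / math.hypot(site_a[1], site_b[1])
```

A |t_stat| well below 2 here is consistent with uniform distribution across the two sites.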
Statistical issues in the parton distribution analysis of the Tevatron jet data
International Nuclear Information System (INIS)
Alekhin, S.; Bluemlein, J.; Moch, S.O.; Hamburg Univ.
2012-11-01
We analyse a tension between the D0 and CDF inclusive jet data and the perturbative QCD calculations, which are based on the ABKM09 and ABM11 parton distribution functions (PDFs), within the nuisance parameter framework. Particular attention is paid to the uncertainties in the nuisance parameters due to the data fluctuations and the PDF errors. We show that, once these uncertainties are taken into account, the nuisance parameters do not demonstrate a statistically significant excess. A statistical bias of the estimator based on the nuisance parameters is also discussed.
Engineering Inertial and Primary-Frequency Response for Distributed Energy Resources: Preprint
Energy Technology Data Exchange (ETDEWEB)
Dall'Anese, Emiliano [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Zhao, Changhong [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Guggilam, Swaroop [University of Minnesota; Dhople, Sairaj V [University of Minnesota; Chen, Yu C [University of British Columbia
2017-12-19
We propose a framework to engineer synthetic-inertia and droop-control parameters for distributed energy resources (DERs) so that the system frequency in a network composed of DERs and synchronous generators conforms to prescribed transient and steady-state performance specifications. Our approach is grounded in a second-order lumped-parameter model that captures the dynamics of synchronous generators and frequency-responsive DERs endowed with inertial and droop control. A key feature of this reduced-order model is that its parameters can be related to those of the originating higher-order dynamical model. This allows one to systematically design the DER inertial and droop-control coefficients leveraging classical frequency-domain response characteristics of second-order systems. Time-domain simulations validate the accuracy of the model-reduction method and demonstrate how DER controllers can be designed to meet steady-state-regulation and transient-performance specifications.
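The frequency-domain design idea can be illustrated with the textbook second-order relations: given the aggregate inertia M and stiffness K of the reduced-order model, the damping (droop) coefficient D is chosen to hit a target damping ratio. The values of M and K below are arbitrary placeholders, not parameters from the paper.

```python
import math

# Second-order lumped model  M*x'' + D*x' + K*x = u, where x is the
# frequency deviation. Invert the classical relations wn = sqrt(K/M)
# and zeta = D / (2*sqrt(M*K)) to design D for a target damping ratio.
def design_damping(M, K, zeta_target):
    wn = math.sqrt(K / M)
    D = 2.0 * zeta_target * math.sqrt(M * K)
    return wn, D

M, K = 10.0, 4.0  # illustrative aggregate inertia and stiffness
zeta = 0.7
wn, D = design_damping(M, K, zeta)

# Classical step-response overshoot for a second-order system
overshoot = math.exp(-math.pi * zeta / math.sqrt(1.0 - zeta ** 2))
```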
Yang, Jaw-Yen; Yan, Chih-Yuan; Diaz, Manuel; Huang, Juan-Chen; Li, Zhihui; Zhang, Hanxin
2014-01-08
The ideal quantum gas dynamics as manifested by the semiclassical ellipsoidal-statistical (ES) equilibrium distribution derived in Wu et al. (Wu et al . 2012 Proc. R. Soc. A 468 , 1799-1823 (doi:10.1098/rspa.2011.0673)) is numerically studied for particles of three statistics. This anisotropic ES equilibrium distribution was derived using the maximum entropy principle and conserves the mass, momentum and energy, but differs from the standard Fermi-Dirac or Bose-Einstein distribution. The present numerical method combines the discrete velocity (or momentum) ordinate method in momentum space and the high-resolution shock-capturing method in physical space. A decoding procedure to obtain the necessary parameters for determining the ES distribution is also devised. Computations of two-dimensional Riemann problems are presented, and various contours of the quantities unique to this ES model are illustrated. The main flow features, such as shock waves, expansion waves and slip lines and their complex nonlinear interactions, are depicted and found to be consistent with existing calculations for a classical gas.
Spatial distribution of cold-season lightning frequency in the coastal areas of the Sea of Japan
Tsurushima, Daiki; Sakaida, Kiyotaka; Honma, Noriyasu
2017-12-01
The coastal areas of the Sea of Japan are a well-known hotspot of winter lightning activity. This study distinguishes between three common types of winter lightning in that region (types A-C), based on their frequency distributions and the meteorological conditions under which they occur. Type A lightning occurs with high frequency in the Tohoku district. It is mainly caused by cold fronts that accompany cyclones passing north of the Japanese islands. Type B, which occurs most frequently in the coastal areas of the Hokuriku district, is mainly caused by topographically induced wind convergence and convective instability, both of which are associated with cyclones having multiple centers. Type C's lightning frequency distribution pattern is similar to that of type B, but its principal cause is a topographically induced wind convergence generated by cold air advection from the Siberian continent. Type A is most frequently observed from October to November, while types B and C tend to appear from November to January, consistent with seasonal changes in lightning frequency distribution in Japan's Tohoku and Hokuriku districts.
Directory of Open Access Journals (Sweden)
Bin Wang
2016-01-01
Full Text Available This paper studies the application of the frequency distributed model for finite-time control of a fractional-order nonlinear hydroturbine governing system (HGS). Firstly, the mathematical model of the HGS with external random disturbances is introduced. Secondly, a novel terminal sliding surface is proposed and its stability at the origin is proved based on the frequency distributed model and Lyapunov stability theory. Furthermore, based on finite-time stability and sliding mode control theory, a robust control law that ensures the occurrence of the sliding motion in finite time is designed for stabilization of the fractional-order HGS. Finally, simulation results show the effectiveness and robustness of the proposed scheme.
Energy Technology Data Exchange (ETDEWEB)
Hacke, P.; Spataru, S.
2014-08-01
We propose a method for increasing the frequency of data collection and reducing the time and cost of accelerated lifetime testing of photovoltaic modules undergoing potential-induced degradation (PID). This consists of in-situ measurements of dark current-voltage curves of the modules at elevated stress temperature, their use to determine the maximum power at 25 degrees C standard test conditions (STC), and distribution statistics for determining degradation rates as a function of stress level. The semi-continuous data obtained by this method clearly show degradation curves of the maximum power, including an incubation phase, rates and extent of degradation, precise time to failure, and partial recovery. Stress tests were performed on crystalline silicon modules at 85% relative humidity and 60 degrees C, 72 degrees C, and 85 degrees C. Activation energy for the mean time to failure (1% relative) of 0.85 eV was determined and a mean time to failure of 8,000 h at 25 degrees C and 85% relative humidity is predicted. No clear trend in maximum degradation as a function of stress temperature was observed.
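The 0.85 eV activation energy quoted above fixes an Arrhenius scaling between stress temperatures. A sketch of that extrapolation: the 85 degrees C anchor value of 30 h is invented, chosen only so that the 25 degrees C projection lands near the reported 8,000 h.

```python
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K

def arrhenius_scale(ea_ev, t_from_c, t_to_c):
    """Arrhenius acceleration factor between two temperatures (Celsius)."""
    t1, t2 = t_from_c + 273.15, t_to_c + 273.15
    return math.exp(ea_ev / K_B * (1.0 / t2 - 1.0 / t1))

# Scale an MTTF measured at 85 C to 25 C (same relative humidity)
mttf_85c = 30.0  # hours -- illustrative anchor only
mttf_25c = mttf_85c * arrhenius_scale(0.85, 85.0, 25.0)
```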
Crimp, Steven; Jin, Huidong; Kokic, Philip; Bakar, Shuvo; Nicholls, Neville
2018-04-01
Anthropogenic climate change has already been shown to affect the frequency, intensity, spatial extent, duration and seasonality of extreme climate events. Understanding these changes is an important step in determining exposure, vulnerability and focus for adaptation. In an attempt to support adaptation decision-making, we have examined statistical modelling techniques to improve the representation of global climate model (GCM)-derived projections of minimum temperature extremes (frosts) in Australia. We examine the spatial changes in minimum temperature extreme metrics (e.g. monthly and seasonal frost frequency), for a region exhibiting the strongest station trends in Australia, and compare these changes with minimum temperature extreme metrics derived from 10 GCMs from the Coupled Model Inter-comparison Project Phase 5 (CMIP5) datasets and via statistical downscaling. We compare the observed trends with those derived from the "raw" GCM minimum temperature data, and examine whether quantile matching (QM) or spatio-temporal modelling with quantile matching (spTimerQM) can be used to improve the correlation between observed and simulated extreme minimum temperatures. We demonstrate that the spTimerQM modelling approach provides correlations of 0.22 with observed daily minimum temperatures for the period August to November. This represents an almost fourfold improvement over either the "raw" GCM or QM results. The spTimerQM modelling approach also improves correlations with observed monthly frost frequency statistics to 0.84, as opposed to 0.37 and 0.81 for the "raw" GCM and QM results respectively. We apply the spatio-temporal model to examine future extreme minimum temperature projections for the period 2016 to 2048. The spTimerQM modelling results suggest the persistence of current levels of frost risk out to 2030, with evidence of continuing decadal variation.
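A minimal empirical quantile-matching (QM) sketch, under the assumption of two synthetic Gaussian samples standing in for observed and GCM minimum temperatures; the spatio-temporal spTimerQM extension is not reproduced here. Each model value is mapped to the observed value at the same empirical quantile, removing distributional bias.

```python
import bisect
import random
import statistics

random.seed(6)

# Synthetic "observed" and biased "model" daily minima (degrees C)
obs = sorted(random.gauss(2.0, 3.0) for _ in range(5000))
model = sorted(random.gauss(0.0, 1.0) for _ in range(5000))

def quantile_match(v, model_sorted, obs_sorted):
    """Map a model value to the observed value at the same quantile."""
    q = bisect.bisect_left(model_sorted, v) / len(model_sorted)
    idx = min(int(q * len(obs_sorted)), len(obs_sorted) - 1)
    return obs_sorted[idx]

corrected = [quantile_match(v, model, obs) for v in model]
```

After correction, the model sample inherits the observed mean and spread.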
DEFF Research Database (Denmark)
Cha, Seung-Tae
distribution networks makes it possible to operate the distribution networks independently, which is called islanding operation. However, it is a challenge to ensure secure and reliable operation of the islanded system due to a number of reasons, e.g. low inertia in the islanded system, intermittency of some… of the DERs, etc. Particularly during islanding operation, with relatively few DG units, the frequency and voltage control of the islanded system is not straightforward. DG units, especially those based on renewable energy sources (RESs), i.e. wind and solar, have an intermittent nature and intrinsic… system (BESS) and two secondary frequency control scenarios with BESS and DG units. During the islanding transition, the frequency is regulated by the fast-acting primary control of the BESS. The secondary control of the main management system (MMS) detects the status of the BESS and tries to return…
Charged-particle thermonuclear reaction rates: I. Monte Carlo method and statistical distributions
International Nuclear Information System (INIS)
Longland, R.; Iliadis, C.; Champagne, A.E.; Newton, J.R.; Ugalde, C.; Coc, A.; Fitzgerald, R.
2010-01-01
A method based on Monte Carlo techniques is presented for evaluating thermonuclear reaction rates. We begin by reviewing commonly applied procedures and point out that reaction rates that have been reported up to now in the literature have no rigorous statistical meaning. Subsequently, we associate each nuclear physics quantity entering in the calculation of reaction rates with a specific probability density function, including Gaussian, lognormal and chi-squared distributions. Based on these probability density functions the total reaction rate is randomly sampled many times until the required statistical precision is achieved. This procedure results in a median (Monte Carlo) rate which agrees under certain conditions with the commonly reported recommended 'classical' rate. In addition, we present at each temperature a low rate and a high rate, corresponding to the 0.16 and 0.84 quantiles of the cumulative reaction rate distribution. These quantities are in general different from the statistically meaningless 'minimum' (or 'lower limit') and 'maximum' (or 'upper limit') reaction rates which are commonly reported. Furthermore, we approximate the output reaction rate probability density function by a lognormal distribution and present, at each temperature, the lognormal parameters μ and σ. The values of these quantities will be crucial for future Monte Carlo nucleosynthesis studies. Our new reaction rates, appropriate for bare nuclei in the laboratory, are tabulated in the second paper of this issue (Paper II). The nuclear physics input used to derive our reaction rates is presented in the third paper of this issue (Paper III). In the fourth paper of this issue (Paper IV) we compare our new reaction rates to previous results.
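The sampling scheme can be sketched with two lognormally distributed inputs (medians and factor uncertainties invented for illustration); the 0.16 and 0.84 quantiles of the sampled rate play the role of the low and high rates, and the 0.50 quantile is the median (Monte Carlo) rate.

```python
import math
import random

random.seed(3)

def sample_rate():
    """One Monte Carlo realization of a rate built from two lognormal
    inputs; each input is (median) * exp(N(0, ln(factor uncertainty)))."""
    s1 = 3.0 * math.exp(random.gauss(0.0, math.log(1.2)))  # median 3.0, factor 1.2
    s2 = 1.5 * math.exp(random.gauss(0.0, math.log(1.5)))  # median 1.5, factor 1.5
    return s1 + s2

# Sample until the quantiles are stable, then read off low/median/high rates
rates = sorted(sample_rate() for _ in range(50000))
low, median, high = (rates[int(q * len(rates))] for q in (0.16, 0.50, 0.84))
```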
Statistical models for the analysis of water distribution system pipe break data
International Nuclear Information System (INIS)
Yamijala, Shridhar; Guikema, Seth D.; Brumbelow, Kelly
2009-01-01
The deterioration of pipes leading to pipe breaks and leaks in urban water distribution systems is of concern to water utilities throughout the world. Pipe breaks and leaks may result in reduction in the water-carrying capacity of the pipes and contamination of water in the distribution systems. Water utilities incur large expenses in the replacement and rehabilitation of water mains, making it critical to evaluate the current and future condition of the system for maintenance decision-making. This paper compares different statistical regression models proposed in the literature for estimating the reliability of pipes in a water distribution system on the basis of short time histories. The goals of these models are to estimate the likelihood of pipe breaks in the future and determine the parameters that most affect the likelihood of pipe breaks. The data set used for the analysis comes from a major US city, and these data include approximately 85,000 pipe segments with nearly 2500 breaks from 2000 through 2005. The results show that the set of statistical models previously proposed for this problem do not provide good estimates with the test data set. However, logistic generalized linear models do provide good estimates of pipe reliability and can be useful for water utilities in planning pipe inspection and maintenance.
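A minimal logistic GLM of break probability on a single covariate (pipe age), fitted by Newton's method on synthetic data; the models in the paper use many more covariates and the utility's own records, so everything below is an illustrative stand-in.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Synthetic data: break probability rises with age (age scaled to [0, 1])
random.seed(4)
xs = [random.random() for _ in range(2000)]
true_b0, true_b1 = -4.0, 4.0
ys = [1 if random.random() < sigmoid(true_b0 + true_b1 * x) else 0 for x in xs]

# Newton-Raphson for the two coefficients (gradient and 2x2 Hessian)
b0 = b1 = 0.0
for _ in range(25):
    g0 = g1 = h00 = h01 = h11 = 0.0
    for x, y in zip(xs, ys):
        p = sigmoid(b0 + b1 * x)
        e, w = y - p, p * (1.0 - p)
        g0 += e
        g1 += e * x
        h00 += w
        h01 += w * x
        h11 += w * x * x
    det = h00 * h11 - h01 * h01
    b0 += (h11 * g0 - h01 * g1) / det
    b1 += (h00 * g1 - h01 * g0) / det
```

The fitted (b0, b1) should land near the generating values (-4.0, 4.0), up to sampling noise.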
Local time distribution of the SSC-associated HF-Doppler frequency shifts
International Nuclear Information System (INIS)
Kikuchi, T.; Sugiuchi, H.; Ishimine, T.
1985-01-01
The HF-Doppler frequency shift observed at a storm's sudden commencement is composed of a frequency increase (+) and decrease (-), and is classified into four types: SCF(+ -), SCF(- +), SCF(+) and SCF(-). Since the latter two types are special cases of the former two, two different kinds of electric field exist in the F region and cause the ExB drift motion of plasma. Huang (1976) interpreted the frequency increase of SCF(+ -) as due to the westward induction electric field proportional to delta H/delta t, and the succeeding frequency decrease as due to the eastward conduction electric field which produces the ionospheric currents responsible for the magnetic increase on the ground. In spite of his success in interpreting the SCF(+ -), other interpretations are needed to explain the whole set of SCFs, particularly SCF(- +). Local time distributions of the SCFs are derived from 41 SCFs observed on the HF standard signal (JJY) as received in Okinawa (path length = 1600 km) and Kokubunji (60 km). It is shown that SCF(+ -) appears mainly during the day, whereas SCF(- +) is observed during the night. The results indicate that the preliminary frequency shift, (+) of SCF(+ -) and (-) of SCF(- +), is caused by a westward electric field in the dayside hemisphere and by an eastward electric field in the nightside hemisphere. The main frequency shift, (-) of SCF(+ -) and (+) of SCF(- +), is caused by the reversed electric field. Consequently, the preliminary frequency shift is caused by the dusk-to-dawn electric field, while the main frequency shift is caused by the dawn-to-dusk electric field.
Statistical Analysis of Spectral Properties and Prosodic Parameters of Emotional Speech
Přibil, J.; Přibilová, A.
2009-01-01
The paper addresses the reflection of microintonation and spectral properties in male and female acted emotional speech. The microintonation component of speech melody is analyzed with regard to its spectral and statistical parameters. According to psychological research on emotional speech, different emotions are accompanied by different spectral noise. We control its amount by spectral flatness, according to which high-frequency noise is mixed into voiced frames during cepstral speech synthesis. Our experiments are aimed at statistical analysis of cepstral coefficient values and ranges of spectral flatness in three emotions (joy, sadness, anger) and a neutral state for comparison. Calculated histograms of the spectral flatness distribution are visually compared and modelled by a Gamma probability distribution. Histograms of the cepstral coefficient distribution are evaluated and compared using skewness and kurtosis. The statistical results show good agreement between male and female voices for all emotional states portrayed by several Czech and Slovak professional actors.
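The two shape statistics used to compare the cepstral-coefficient histograms are simple moment ratios. A sketch on a synthetic Gaussian sample, for which both skewness and excess kurtosis should be near zero:

```python
import random
import statistics

random.seed(5)

def shape_stats(xs):
    """Sample skewness and excess kurtosis (standardized 3rd and 4th moments)."""
    m = statistics.fmean(xs)
    s = statistics.pstdev(xs)
    n = len(xs)
    skew = sum(((x - m) / s) ** 3 for x in xs) / n
    kurt = sum(((x - m) / s) ** 4 for x in xs) / n - 3.0
    return skew, kurt

normal = [random.gauss(0.0, 1.0) for _ in range(20000)]
skew_n, kurt_n = shape_stats(normal)
```

Comparing these two numbers across per-emotion histograms is the comparison step the abstract describes.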
International Nuclear Information System (INIS)
Xu, L.Q.; Hu, L.Q.; Chen, K.Y.; Li, E.Z.
2013-01-01
Highlights: • The Choi–Williams distribution yields excellent time–frequency resolution for discrete signals. • The CWD method provides clear time–frequency pictures of fast MHD events in EAST and HT-7. • The CWD method has advantages over the wavelet transform scalogram and the short-time Fourier transform spectrogram. • We discuss how to choose the windows and the free parameter of the CWD method. -- Abstract: The Choi–Williams distribution is applied to the time–frequency analysis of signals describing rapid magneto-hydro-dynamic (MHD) modes and events in tokamak plasmas. A comparison is made with soft X-ray (SXR) signals as well as the Mirnov signal, which shows the advantages of the Choi–Williams distribution over both the continuous wavelet transform scalogram and the short-time Fourier transform spectrogram. Examples of MHD activities in the HT-7 and EAST tokamaks are shown, namely the onset of coupled tearing modes, high-frequency precursors of sawteeth, and low-frequency MHD instabilities in an edge-localized-mode (ELM)-free H-mode discharge
Louzoun, Yoram; Alter, Idan; Gragert, Loren; Albrecht, Mark; Maiers, Martin
2018-05-01
Regardless of sampling depth, accurate genotype imputation is limited in regions of high polymorphism which often have a heavy-tailed haplotype frequency distribution. Many rare haplotypes are thus unobserved. Statistical methods to improve imputation by extending reference haplotype distributions using linkage disequilibrium patterns that relate allele and haplotype frequencies have not yet been explored. In the field of unrelated stem cell transplantation, imputation of highly polymorphic human leukocyte antigen (HLA) genes has an important application in identifying the best-matched stem cell donor when searching large registries totaling over 28,000,000 donors worldwide. Despite these large registry sizes, a significant proportion of searched patients present novel HLA haplotypes. Supporting this observation, HLA population genetic models have indicated that many extant HLA haplotypes remain unobserved. The absent haplotypes are a significant cause of error in haplotype matching. We have applied a Bayesian inference methodology for extending haplotype frequency distributions, using a model where new haplotypes are created by recombination of observed alleles. Applications of this joint probability model offer significant improvement in frequency distribution estimates over the best existing alternative methods, as we illustrate using five-locus HLA frequency data from the National Marrow Donor Program registry. Transplant matching algorithms and disease association studies involving phasing and imputation of rare variants may benefit from this statistical inference framework.
Dai, Qi; Yang, Yanchun; Wang, Tianming
2008-10-15
Many proposed statistical measures can efficiently compare biological sequences to further infer their structures, functions and evolutionary information. They are related in spirit because all the ideas for sequence comparison try to use information on the k-word distributions, the Markov model, or both. Motivated by adding k-word distributions to the Markov model directly, we investigated two novel statistical measures for sequence comparison, called wre.k.r and S2.k.r. The proposed measures were tested by similarity search, evaluation on functionally related regulatory sequences, and phylogenetic analysis, offering a systematic and quantitative experimental assessment of our measures. Moreover, we compared our achievements with those based on alignment or alignment-free methods. We grouped our experiments into two sets. The first, performed via ROC (receiver operating characteristic) analysis, aims at assessing the intrinsic ability of our statistical measures to search for similar sequences in a database and to discriminate functionally related regulatory sequences from unrelated sequences. The second aims at assessing how well our statistical measures can be used for phylogenetic analysis. The experimental assessment demonstrates that our similarity measures, which incorporate k-word distributions into the Markov model, are more efficient.
The statistical process control methods - SPC
Directory of Open Access Journals (Sweden)
Floreková Ľubica
1998-03-01
Full Text Available Methods of statistical quality evaluation, SPC (item 20 of the ISO 9000-series documentation system for quality control), of various processes, products and services belong among the basic qualitative methods that enable us to analyse and compare data on various quantitative parameters and, based on these, to propose suitable interventions aimed at improving those processes, products and services. The contribution presents the theoretical basis and applicability of the principles of: - diagnostics of cause and effect, - Pareto analysis and the Lorenz curve, - number distributions and frequency curves of random-variable distributions, - Shewhart control charts.
Uncertainty analysis with statistically correlated failure data
International Nuclear Information System (INIS)
Modarres, M.; Dezfuli, H.; Roush, M.L.
1987-01-01
The likelihood of occurrence of the top event of a fault tree, or of the sequences of an event tree, is estimated from the failure probabilities of the components that constitute the events of the fault/event tree. Component failure probabilities are subject to statistical uncertainties. In addition, there are cases where the failure data are statistically correlated. At present, most fault tree calculations are based on uncorrelated component failure data. This chapter describes a methodology for assessing the probability intervals for the top-event failure probability of fault trees, or the frequency of occurrence of event tree sequences, when event failure data are statistically correlated. To estimate the mean and variance of the top event, a second-order system moment method based on a Taylor series expansion is presented, which provides an alternative to the commonly used Monte Carlo method. For cases where component failure probabilities are statistically correlated, the Taylor expansion terms are treated properly. A moment-matching technique is used to obtain the probability distribution function of the top event by fitting the Johnson S_B distribution. The computer program CORRELATE was developed to perform the calculations necessary for the implementation of the method. (author)
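The moment-propagation step can be sketched for the special case of an AND-gate top event, where the top-event probability is the product of the component probabilities. This is a first-order (delta-method) simplification of the chapter's second-order method, and the function name is hypothetical.

```python
import numpy as np

def top_event_moments(p, cov):
    """First-order Taylor (delta-method) mean and variance of an AND-gate
    top event P = p1 * p2 * ... given component failure probabilities p
    and their covariance matrix cov.  Illustrative sketch only: the
    referenced method also carries second-order terms and fits a
    Johnson S_B distribution to the resulting moments."""
    p = np.asarray(p, dtype=float)
    P = np.prod(p)
    grad = P / p                 # dP/dp_i for a pure AND gate
    var = grad @ np.asarray(cov, dtype=float) @ grad
    return P, var
```

With zero covariance this reduces to the usual uncorrelated error propagation; a positive off-diagonal covariance inflates the top-event variance, which is exactly the effect the chapter addresses.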
The κ parameter and κ-distribution in κ-deformed statistics for the systems in an external field
International Nuclear Information System (INIS)
Guo, Lina; Du, Jiulin
2007-01-01
It is a naturally important question to ask under what physical situation the κ-deformed statistics should be suitable for the statistical description of a system, and what the κ parameter stands for. In this Letter, a formula for the κ parameter is derived on the basis of the κ-H theorem, the κ-velocity distribution and the generalized Boltzmann equation in the framework of κ-deformed statistics. We thus obtain a physical interpretation for the parameter κ ≠ 0 in terms of the temperature gradient and the external force field. We show that, like the q-statistics based on Tsallis entropy, the κ-deformed statistics may also be a candidate suitable for the statistical description of systems in external fields in a nonequilibrium stationary state, but with different physical characteristics. Namely, the κ-distribution is found to describe the nonequilibrium stationary state of a system where the external force is perpendicular to the temperature gradient.
Mapping closure for probability distribution function in low frequency magnetized plasma turbulence
International Nuclear Information System (INIS)
Das, A.; Kaw, P.
1995-01-01
Recent numerical studies on the Hasegawa--Mima equation and its variants describing low frequency magnetized plasma turbulence indicate that the potential fluctuations have a Gaussian character whereas the vorticity exhibits non-Gaussian features. A theoretical interpretation for this observation using the recently developed mapping closure technique [Chen, Chen, and Kraichnan, Phys. Rev. Lett. 63, 2657 (1989)] has been provided here. It has been shown that non-Gaussian statistics for the vorticity arises because of a competition between nonlinear straining and diffusive damping whereas the Gaussianity of the statistics of φ arises because the only significant nonlinearity is associated with divergence free convection, which produces no strain terms. copyright 1995 American Institute of Physics
Performance prediction of a synchronization link for distributed aerospace wireless systems.
Wang, Wen-Qin; Shao, Huaizong
2013-01-01
For reasons of stealth and other operational advantages, distributed aerospace wireless systems have received much attention in recent years. In a distributed aerospace wireless system, since the transmitter and receiver are placed on separate platforms that use independent master oscillators, there is no cancellation of low-frequency phase noise as in the monostatic case. Thus, highly accurate time and frequency synchronization techniques are required for distributed wireless systems. The use of a dedicated synchronization link to quantify and compensate oscillator frequency instability is investigated in this paper. With mathematical statistical models of phase noise, closed-form analytic expressions for the synchronization link performance are derived. The possible error contributions, including oscillator, phase-locked loop, and receiver noise, are quantified. The link synchronization performance is predicted by utilizing knowledge of the statistical models, system error contributions, and sampling considerations. Simulation results show that effective synchronization error compensation can be achieved by using this dedicated synchronization link.
Fitting Statistical Distributions Functions on Ozone Concentration Data at Coastal Areas
International Nuclear Information System (INIS)
Muhammad Yazid Nasir; Nurul Adyani Ghazali; Muhammad Izwan Zariq Mokhtar; Norhazlina Suhaimi
2016-01-01
Ozone is known as one of the pollutants that contribute to the air pollution problem. Therefore, it is important to carry out studies on ozone. The objective of this study is to find the best statistical distribution for ozone concentration. Three distributions, namely the Inverse Gaussian, Weibull and Lognormal, were chosen to fit one year of hourly average ozone concentration data from 2010 at Port Dickson and Port Klang. The maximum likelihood estimation (MLE) method was used to estimate the parameters for developing the probability density function (PDF) and cumulative distribution function (CDF) graphs. Three performance indicators (PI), namely the normalized absolute error (NAE), prediction accuracy (PA), and coefficient of determination (R²), were used to determine the goodness of fit of the distributions. Results show that the Weibull distribution is the best distribution, with the smallest error measure (NAE) at Port Klang and Port Dickson of 0.08 and 0.31, respectively. The best score for the highest adequacy measure (PA: 0.99) comes with R² values of 0.98 (Port Klang) and 0.99 (Port Dickson). These results provide useful information to local authorities for prediction purposes. (author)
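The fitting-and-scoring workflow can be sketched for the Weibull case, assuming scipy's `weibull_min` as the MLE fitter and an illustrative quantile-based NAE; the paper's exact PI formulas are not reproduced here, and the function name is hypothetical.

```python
import numpy as np
from scipy import stats

def fit_and_score(data):
    """Fit a Weibull distribution by MLE (location fixed at 0) and compute
    a normalized absolute error (NAE) between the sorted observations and
    the fitted quantiles -- an illustrative stand-in for the paper's
    performance indicators."""
    shape, loc, scale = stats.weibull_min.fit(data, floc=0)
    n = len(data)
    obs = np.sort(data)
    # plotting-position probabilities i/(n+1), then fitted quantiles
    pred = stats.weibull_min.ppf(np.arange(1, n + 1) / (n + 1),
                                 shape, loc, scale)
    nae = np.sum(np.abs(pred - obs)) / np.sum(obs)
    return shape, scale, nae
```

The same pattern applies to the Inverse Gaussian (`stats.invgauss`) and Lognormal (`stats.lognorm`) candidates, after which the distribution with the smallest NAE would be retained.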
Bellera, Carine A.; Julien, Marilyse; Hanley, James A.
2010-01-01
The Wilcoxon statistics are usually taught as nonparametric alternatives for the 1- and 2-sample Student-"t" statistics in situations where the data appear to arise from non-normal distributions, or where sample sizes are so small that we cannot check whether they do. In the past, critical values, based on exact tail areas, were…
Statistical analysis of the hydrodynamic pressure in the near field of compressible jets
International Nuclear Information System (INIS)
Camussi, R.; Di Marco, A.; Castelain, T.
2017-01-01
Highlights: • Statistical properties of pressure fluctuations retrieved through wavelet analysis. • Time-delay PDFs approximated by a log-normal distribution. • Amplitude PDFs approximated by a Gamma distribution. • Random-variable PDFs weakly dependent upon position and Mach number. • A general stochastic model achieved for the distance dependency. Abstract: This paper is devoted to the statistical characterization of the pressure fluctuations measured in the near field of a compressible jet at two subsonic Mach numbers, 0.6 and 0.9. The analysis is focused on the hydrodynamic pressure measured at different distances from the jet exit and analyzed at the typical frequency associated with the Kelvin–Helmholtz instability. Statistical properties are retrieved by applying the wavelet transform to the experimental data and computing the wavelet scalogram around that frequency. This procedure highlights traces of events that appear intermittently in time and have variable strength. A wavelet-based event-tracking procedure has been applied, providing a statistical characterization of the time delay between successive events and of their energy level. On this basis, two stochastic models are proposed and validated against the experimental data in the different flow conditions.
Hamby, D M
2002-01-01
Reconstructed meteorological data are often used in some form of long-term wind trajectory models for estimating the historical impacts of atmospheric emissions. Meteorological data for the straight-line Gaussian plume model are put into a joint frequency distribution, a three-dimensional array describing atmospheric wind direction, speed, and stability. Methods using the Gaussian model and joint frequency distribution inputs provide reasonable estimates of downwind concentration and have been shown to be accurate to within a factor of four. We have used multiple joint frequency distributions and probabilistic techniques to assess the Gaussian plume model and determine concentration-estimate uncertainty and model sensitivity. We examine the straight-line Gaussian model while calculating both sector-averaged and annual-averaged relative concentrations at various downwind distances. The sector-average concentration model was found to be most sensitive to wind speed, followed by the dispersion parameter σz, the importance of which increases as stability increases. The Gaussian model is not sensitive to stack height uncertainty. Precision of the frequency data appears to be most important to meteorological inputs when calculations are made for near-field receptors, increasing as stack height increases.
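The joint-frequency-distribution idea can be sketched as follows, using the standard 16-sector sector-averaged Gaussian plume expression (the 2.032 constant is sqrt(2/π) divided by the 22.5-degree sector width in radians). The array layout, dispersion coefficients and parameter values are illustrative assumptions, not the study's actual inputs.

```python
import numpy as np

def sector_avg_chi_q(x, h, jfd, speeds, sigma_z):
    """Sector-averaged relative concentration chi/Q (s/m^3) at downwind
    distance x (m) for one 22.5-degree wind sector, summed over a joint
    frequency distribution jfd[speed_class, stability_class] (fractions of
    time).  h is the effective release height (m) and sigma_z[j](x) gives
    the vertical dispersion parameter for stability class j.  Illustrative
    sketch of the standard sector-averaged Gaussian plume formula."""
    chi_q = 0.0
    for i, u in enumerate(speeds):
        for j in range(jfd.shape[1]):
            sz = sigma_z[j](x)
            chi_q += jfd[i, j] * 2.032 / (u * sz * x) * np.exp(-h**2 / (2 * sz**2))
    return chi_q
```

Summing such terms over the full direction-speed-stability array is what turns a year of meteorological observations into an annual-average concentration estimate.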
Understanding advanced statistical methods
Westfall, Peter
2013-01-01
Introduction: Probability, Statistics, and Science: Reality, Nature, Science, and Models; Statistical Processes: Nature, Design and Measurement, and Data; Models; Deterministic Models; Variability; Parameters; Purely Probabilistic Statistical Models; Statistical Models with Both Deterministic and Probabilistic Components; Statistical Inference; Good and Bad Models; Uses of Probability Models. Random Variables and Their Probability Distributions: Introduction; Types of Random Variables: Nominal, Ordinal, and Continuous; Discrete Probability Distribution Functions; Continuous Probability Distribution Functions; Some Calculus: Derivatives and Least Squares; More Calculus: Integrals and Cumulative Distribution Functions. Probability Calculation and Simulation: Introduction; Analytic Calculations, Discrete and Continuous Cases; Simulation-Based Approximation; Generating Random Numbers. Identifying Distributions: Introduction; Identifying Distributions from Theory Alone; Using Data: Estimating Distributions via the Histogram; Quantiles: Theoretical and Data-Based Estimate...
Introduction to probability and statistics for science, engineering, and finance
Rosenkrantz, Walter A
2008-01-01
Data Analysis: Orientation; The Role and Scope of Statistics in Science and Engineering; Types of Data: Examples from Engineering, Public Health, and Finance; The Frequency Distribution of a Variable Defined on a Population; Quantiles of a Distribution; Measures of Location (Central Value) and Variability; Covariance, Correlation, and Regression: Computing a Stock's Beta; Mathematical Details and Derivations; Large Data Sets. Probability Theory: Orientation; Sample Space, Events, Axioms of Probability Theory; Mathematical Models of Random Sampling; Conditional Probability and Baye
Spectral Energy Distribution and Radio Halo of NGC 253 at Low Radio Frequencies
Energy Technology Data Exchange (ETDEWEB)
Kapińska, A. D.; Staveley-Smith, L.; Meurer, G. R.; For, B.-Q. [International Centre for Radio Astronomy Research (ICRAR), University of Western Australia, 35 Stirling Hwy, WA 6009 (Australia); Crocker, R. [Research School of Astronomy and Astrophysics, Australian National University, Canberra, ACT 2611 (Australia); Bhandari, S.; Callingham, J. R.; Gaensler, B. M.; Hancock, P. J.; Lenc, E. [ARC Centre of Excellence for All-Sky Astrophysics (CAASTRO), Sydney NSW (Australia); Hurley-Walker, N.; Seymour, N. [International Centre for Radio Astronomy Research (ICRAR), Curtin University, Bentley, WA 6102 (Australia); Offringa, A. R. [Netherlands Institute for Radio Astronomy (ASTRON), P.O. Box 2, 7990 AA Dwingeloo (Netherlands); Hanish, D. J. [Spitzer Science Center, California Institute of Technology, MC 220-6, 1200 East California Boulevard, Pasadena, CA 91125 (United States); Ekers, R. D.; Bell, M. E. [CSIRO Astronomy and Space Science (CASS), P.O. Box 76, Epping, NSW 1710 (Australia); Dwarakanath, K. S. [Raman Research Institute, Bangalore 560080 (India); Hindson, L. [Centre of Astrophysics Research, University of Hertfordshire, College Lane, Hatfield AL10 9AB (United Kingdom); Johnston-Hollitt, M. [School of Chemical and Physical Sciences, Victoria University of Wellington, P.O. Box 600, Wellington 6140 (New Zealand); McKinley, B., E-mail: anna.kapinska@uwa.edu.au [School of Physics, The University of Melbourne, Parkville, VIC 3010 (Australia); and others
2017-03-20
We present new radio continuum observations of NGC 253 from the Murchison Widefield Array at frequencies between 76 and 227 MHz. We model the broadband radio spectral energy distribution for the total flux density of NGC 253 between 76 MHz and 11 GHz. The spectrum is best described as a sum of a central starburst and extended emission. The central component, corresponding to the inner 500 pc of the starburst region of the galaxy, is best modeled as an internally free–free absorbed synchrotron plasma, with a turnover frequency around 230 MHz. The extended emission component of the spectrum of NGC 253 is best described as a synchrotron emission flattening at low radio frequencies. We find that 34% of the extended emission (outside the central starburst region) at 1 GHz becomes partially absorbed at low radio frequencies. Most of this flattening occurs in the western region of the southeast halo, and may be indicative of synchrotron self-absorption of shock-reaccelerated electrons or an intrinsic low-energy cutoff of the electron distribution. Furthermore, we detect the large-scale synchrotron radio halo of NGC 253 in our radio images. At 154–231 MHz the halo displays the well known X-shaped/horn-like structure, and extends out to ∼8 kpc in the z -direction (from the major axis).
Analysis of frequency effect on variegated RAM styles and other parameters using 40 nm FPGA
DEFF Research Database (Denmark)
Sharma, Rashmi; Pandey, Bishwajeet; Sharma, Vaashu
2018-01-01
This analysis has been performed using the XILINX 12.1 and IBM SPSS Statistics 21 software and the VHDL language. The Pipe_distributed style at comparatively lower frequencies consumes the least power. Therefore, lower frequency values should be maintained while observing the power. This would bloom up...
New statistical function for the angular distribution of evaporation residues produced by heavy ions
International Nuclear Information System (INIS)
Rigol, J.
1994-01-01
A new statistical function has been found for modelling the angular distribution of evaporation residues produced by heavy ions. Experimental results are compared with the calculated ones. 11 refs.; 4 figs. (author)
Statistical distributions of earthquakes and related non-linear features in seismic waves
International Nuclear Information System (INIS)
Apostol, B.-F.
2006-01-01
A few basic facts in the science of earthquakes are briefly reviewed. An accumulation, or growth, model is put forward for the focal mechanisms and the critical focal zone of earthquakes, which relates the earthquake average recurrence time to the released seismic energy. The temporal statistical distribution for the average recurrence time is introduced for earthquakes and, on this basis, the Omori-type distribution in energy is derived, as well as the distribution in magnitude, by making use of the semi-empirical Gutenberg-Richter law relating seismic energy to earthquake magnitude. On geometric grounds, the accumulation model suggests the value r = 1/3 for the Omori parameter in the power-law energy distribution, which leads to β = 1.17 for the coefficient in the Gutenberg-Richter recurrence law, in fair agreement with the statistical analysis of the empirical data. Making use of this value, the empirical Båth's law for the average magnitude of the aftershocks (which is 1.2 less than the magnitude of the main seismic shock) is discussed, by assuming that the aftershocks are relaxation events of the seismic zone. The time distribution of earthquakes with a fixed average recurrence time is also derived, earthquake occurrence prediction by means of the average recurrence time and the seismicity rate is discussed, and an application of this discussion to the seismic region Vrancea, Romania, is outlined. Finally, a special effect of the non-linear behaviour of seismic waves is discussed, by describing an exact solution derived recently for the elastic wave equation with cubic anharmonicities, its relevance, and its connection to the approximate quasi-plane-wave picture. The properties of the seismic activity accompanying a main seismic shock, both foreshocks and aftershocks, are relegated to forthcoming publications. (author)
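As a side note on the Gutenberg-Richter step, the b-value of the recurrence law log10 N(≥M) = a - bM is commonly estimated from a magnitude catalogue with Aki's maximum-likelihood formula. This is a generic estimator for illustration, not the accumulation model of the abstract.

```python
import numpy as np

def aki_b_value(mags, m_min):
    """Aki's maximum-likelihood estimate of the Gutenberg-Richter b-value
    for magnitudes at or above the completeness threshold m_min:
    b = log10(e) / (mean(M) - m_min)."""
    mags = np.asarray(mags, dtype=float)
    return np.log10(np.e) / (mags.mean() - m_min)
```

Because Gutenberg-Richter magnitudes above m_min are exponentially distributed with rate b·ln(10), the estimator recovers b directly from the catalogue mean.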
A FEMTOSECOND-LEVEL FIBER-OPTICS TIMING DISTRIBUTION SYSTEM USING FREQUENCY-OFFSET INTERFEROMETRY
International Nuclear Information System (INIS)
Staples, J.W.; Byrd, J.; Doolittle, L.; Huang, G.; Wilcox, R.
2008-01-01
An optical fiber-based frequency and timing distribution system based on the principle of heterodyne interferometry has been in development at LBNL for several years. The fiber drift corrector has evolved from an RF-based to an optical-based system, from mechanical correctors (piezo and optical trombone) to fully electronic, and the electronics from analog to fully digital, all using inexpensive off-the-shelf commodity fiber components. Short-term optical phase jitter and long-term phase drift are both in the femtosecond range over distribution paths of 2 km or more
Yuan, Ke-Hai
2008-01-01
In the literature of mean and covariance structure analysis, noncentral chi-square distribution is commonly used to describe the behavior of the likelihood ratio (LR) statistic under alternative hypothesis. Due to the inaccessibility of the rather technical literature for the distribution of the LR statistic, it is widely believed that the…
Earth-Space Links and Fade-Duration Statistics
Davarian, Faramaz
1996-01-01
In recent years, fade-duration statistics have been the subject of several experimental investigations. A good knowledge of the fade-duration distribution is important for the assessment of a satellite communication system's channel dynamics: What is a typical link outage duration? How often do link outages exceeding a given duration occur? Unfortunately there is yet no model that can universally answer the above questions. The available field measurements mainly come from temperate climatic zones and only from a few sites. Furthermore, the available statistics are also limited in the choice of frequency and path elevation angle. Yet, much can be learned from the available information. For example, we now know that the fade-duration distribution is approximately lognormal. Under certain conditions, we can even determine the median and other percentiles of the distribution. This paper reviews the available data obtained by several experimenters in different parts of the world. Areas of emphasis are mobile and fixed satellite links. Fades in mobile links are due to roadside-tree shadowing, whereas fades in fixed links are due to rain attenuation.
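Given the approximate lognormality noted above, any percentile of the fade-duration distribution follows directly from its median and the standard deviation of the log-durations. The parameter values in the usage below are illustrative, not measured link statistics.

```python
from math import exp
from statistics import NormalDist

def fade_duration_percentile(median_s, sigma_ln, p):
    """p-th percentile of a lognormal fade-duration distribution given
    its median (seconds) and the standard deviation of ln(duration):
    percentile = median * exp(sigma_ln * z_p), where z_p is the standard
    normal quantile."""
    return median_s * exp(sigma_ln * NormalDist().inv_cdf(p))
```

For example, with a (hypothetical) 10 s median and sigma_ln = 1.2, the 90th percentile answers the question "how long can an outage last once per ten fades" without any further channel modeling.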
Lu, Han-Han; Xu, Jing-Ping; Liu, Lu; Lai, Pui-To; Tang, Wing-Man
2016-11-01
An equivalent distributed capacitance model is established by considering only the gate oxide-trap capacitance to explain the frequency dispersion in the C-V curve of MOS capacitors measured for a frequency range from 1 kHz to 1 MHz. The proposed model is based on the Fermi-Dirac statistics and the charging/discharging effects of the oxide traps induced by a small ac signal. The validity of the proposed model is confirmed by the good agreement between the simulated results and experimental data. Simulations indicate that the capacitance dispersion of an MOS capacitor under accumulation and near flatband is mainly caused by traps adjacent to the oxide/semiconductor interface, with negligible effects from the traps far from the interface, and the relevant distance from the interface at which the traps can still contribute to the gate capacitance is also discussed. In addition, by excluding the negligible effect of oxide-trap conductance, the model avoids the use of imaginary numbers and complex calculations, and thus is simple and intuitive. Project supported by the National Natural Science Foundation of China (Grant Nos. 61176100 and 61274112), the University Development Fund of the University of Hong Kong, China (Grant No. 00600009), and the Hong Kong Polytechnic University, China (Grant No. 1-ZVB1).
A two-component generalized extreme value distribution for precipitation frequency analysis
Czech Academy of Sciences Publication Activity Database
Rulfová, Zuzana; Buishand, A.; Roth, M.; Kyselý, Jan
2016-01-01
Vol. 534, March (2016), pp. 659-668. ISSN 0022-1694. R&D Projects: GA ČR(CZ) GA14-18675S. Institutional support: RVO:68378289. Keywords: precipitation extremes * two-component extreme value distribution * regional frequency analysis * convective precipitation * stratiform precipitation * Central Europe. Subject RIV: DG - Atmosphere Sciences, Meteorology. Impact factor: 3.483, year: 2016. http://www.sciencedirect.com/science/article/pii/S0022169416000500
International Nuclear Information System (INIS)
Varjas, Geza; Jozsef, Gabor; Gyenes, Gyoergy; Petranyi, Julia; Bozoky, Laszlo; Pataki, Gezane
1985-01-01
The establishment of the National Computerized Irradiation Planning Network made possible the statistical evaluation presented in this report. During the first 5 years, 13389 dose-distribution charts were calculated for the treatment of 5320 patients, i.e. on average 2.5 dose-distribution chart variants per patient. This number practically did not change in the last 4 years. The irradiation plans for certain tumour localizations were based on the calculation of, on average, 1.6-3.0 dose-distribution charts. Recently, radiation procedures assuring optimal dose distribution, such as the use of moving fields and two or three irradiation fields, have been gaining ground. (author)
DEFF Research Database (Denmark)
Rohrlack, T.; Christoffersen, K.; Friberg-Jensen, U.
2005-01-01
on the frequency of such compounds in the widely distributed cyanobacterial genus Planktothrix. Of the 89 Planktothrix strains analysed, about 70% produced inhibitors of daphnid trypsin. The strains that tested positive represented three common Planktothrix species and were isolated from diverse localities...
Features of the use of time-frequency distributions for controlling the mixture-producing aggregate
Fedosenkov, D. B.; Simikova, A. A.; Fedosenkov, B. A.
2018-05-01
The paper presents and substantiates information on the filtering properties of the mixing unit as a part of the mixture-producing aggregate. Relevant theoretical data concerning the channel transfer function of the mixing unit and multidimensional material flow signals are adduced. Ordinary one-dimensional material flow signals are described in terms of time-frequency distributions of Cohen's class operating with Gabor wavelet functions. Two time-frequency signal representations, the so-called Rihaczek and Wigner-Ville distributions, are discussed in the paper to show how control problems can be solved as applied to mixture-producing systems. In particular, the latter illustrates the low-pass filtering properties that are available in practically any low-pass element of a physical system.
Development of a distributed polarization-OTDR to measure two vibrations with the same frequency
Pan, Yun; Wang, Feng; Wang, Xiangchuan; Zhang, Mingjiang; Zhou, Ling; Sun, Zhenqing; Zhang, Xuping
2015-08-01
A polarization optical time-domain reflectometer (POTDR) can measure the vibration of a fiber in a distributed manner by detecting the vibration-induced polarization variation with only a polarization analyzer. It has great potential in the monitoring of border intrusion, structural health and pipeline anti-theft, among others, because of its simple configuration, fast response speed and distributed measuring ability. However, it is difficult for a POTDR to distinguish two vibrations with the same frequency, because the signal induced by the first vibration would bury the signal induced by the other. This paper proposes a simple method to resolve this problem in POTDR by analyzing the phase of the vibration-induced signal. The effectiveness of this method in distinguishing two vibrations with the same frequency for POTDR is proved by simulation.
Automatic generation of 3D statistical shape models with optimal landmark distributions.
Heimann, T; Wolf, I; Meinzer, H-P
2007-01-01
To point out the problem of non-uniform landmark placement in statistical shape modeling, to present an improved method for generating landmarks in the 3D case, and to propose an unbiased evaluation metric to determine model quality. Our approach minimizes a cost function based on the minimum description length (MDL) of the shape model to optimize landmark correspondences over the training set. In addition to the standard technique, we employ an extended remeshing method to change the landmark distribution without losing correspondences, thus ensuring a uniform distribution over all training samples. To break the dependency of the established evaluation measures generalization and specificity on the landmark distribution, we change the internal metric from landmark distance to volumetric overlap. Redistributing landmarks to an equally spaced distribution during the model construction phase improves the quality of the resulting models significantly if the shapes feature prominent bulges or other complex geometry. The distribution of landmarks on the training shapes is, beyond the correspondence issue, a crucial point in model construction.
International Nuclear Information System (INIS)
Weise, K.
1998-01-01
When a contribution of a particular nuclear radiation is to be detected, for instance, a spectral line of interest for some purpose of radiation protection, and quantities and their uncertainties must be taken into account which, such as influence quantities, cannot be determined by repeated measurements or by counting nuclear radiation events, then conventional statistics of event frequencies is not sufficient for defining the decision threshold, the detection limit, and the limits of a confidence interval. These characteristic limits are therefore redefined on the basis of Bayesian statistics for a wider applicability and in such a way that the usual practice remains as far as possible unaffected. The principle of maximum entropy is applied to establish probability distributions from available information. Quantiles of these distributions are used for defining the characteristic limits. But such a distribution must not be interpreted as a distribution of event frequencies such as the Poisson distribution. It rather expresses the actual state of incomplete knowledge of a physical quantity. The different definitions and interpretations and their quantitative consequences are presented and discussed with two examples. The new approach provides a theoretical basis for the DIN 25482-10 standard presently in preparation for general applications of the characteristic limits. (orig.) [de
International Nuclear Information System (INIS)
Buzzi, A.; Tosi, E.
1988-01-01
A statistical investigation is presented of the main variables characterizing the tropospheric general circulation in both hemispheres and extreme seasons, winter and summer. This gives us the opportunity of comparing four distinct realizations of the planetary circulation as functions of different orographic and thermal forcing conditions. Our approach is made possible by the availability of 6 years of global daily analyses prepared by the ECMWF (European Centre for Medium-Range Weather Forecasts). The variables taken into account are the zonal geostrophic wind, the zonal thermal wind and various large-scale wave components, averaged over the tropospheric depth between 1000 and 200 hPa. The mean properties of the analysed quantities in each hemisphere and season are compared and their principal characteristics are discussed. The probability density estimates for the same variables, filtered in order to eliminate the seasonal cycle and the high-frequency 'noise', are then presented. The distributions are examined, in particular, with respect to their unimodal or multimodal nature and with reference to the recent discussion in the literature on the bimodality which has been found for some indicators of planetary wave activity in the Northern Hemisphere winter. Our results indicate the presence of non-unimodally distributed wave and zonal flow components in both hemispheres and extreme seasons. The most frequent occurrence of non-unimodal behaviour is found for those wave components which exhibit an almost vanishing zonal phase speed and a larger 'response' to orographic forcing.
On the Statistical Properties of Cospectra
Huppenkothen, D.; Bachetti, M.
2018-05-01
In recent years, the cross-spectrum has received considerable attention as a means of characterizing the variability of astronomical sources as a function of wavelength. The cospectrum has only recently been understood as a means of mitigating instrumental effects dependent on temporal frequency in astronomical detectors, as well as a method of characterizing the coherent variability in two wavelength ranges on different timescales. In this paper, we lay out the statistical foundations of the cospectrum, starting with the simplest case of detecting a periodic signal in the presence of white noise, under the assumption that the same source is observed simultaneously in independent detectors in the same energy range. This case is especially relevant for detecting faint X-ray pulsars in detectors heavily affected by instrumental effects, including NuSTAR, Astrosat, and IXPE, which allow for even sampling and where the cospectrum can act as an effective way to mitigate dead time. We show that the statistical distributions of both single and averaged cospectra differ considerably from those for standard periodograms. While a single cospectrum follows a Laplace distribution exactly, averaged cospectra are approximated by a Gaussian distribution only for more than ∼30 averaged segments, dependent on the number of trials. We provide an instructive example of a quasi-periodic oscillation in NuSTAR and show that applying standard periodogram statistics leads to underestimated tail probabilities for period detection. We also demonstrate the application of these distributions to a NuSTAR observation of the X-ray pulsar Hercules X-1.
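The Laplace property of single white-noise cospectra can be checked numerically: for independent Gaussian noise in two detectors, each cospectral value is a sum of two products of independent normals, which is Laplace-distributed (excess kurtosis 3), unlike the chi-squared values of an ordinary periodogram. This sketch assumes plain NumPy FFTs and is not the paper's code.

```python
import numpy as np

def cospectrum(x, y):
    """Cospectrum: the real part of the cross spectrum of two evenly
    sampled time series (e.g. the same source seen simultaneously by
    two independent detectors)."""
    fx, fy = np.fft.rfft(x), np.fft.rfft(y)
    return (fx * np.conj(fy)).real

# Pool cospectral values from many white-noise segments, dropping the
# DC and Nyquist bins, and estimate the excess kurtosis of the sample.
rng = np.random.default_rng(42)
vals = np.concatenate([
    cospectrum(rng.standard_normal(1024), rng.standard_normal(1024))[1:-1]
    for _ in range(200)
])
excess_kurtosis = ((vals - vals.mean()) ** 4).mean() / vals.var() ** 2 - 3.0
```

An excess kurtosis near 3 (rather than 0) is the signature of the Laplace distribution, which is why standard periodogram tail probabilities underestimate the significance thresholds for cospectra.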
Directory of Open Access Journals (Sweden)
Sarah E Steele
Full Text Available Body size is an important correlate of life history, ecology and distribution of species. Despite this, very little is known about body size evolution in fishes, particularly freshwater fishes of the Neotropics where species and body size diversity are relatively high. Phylogenetic history and body size data were used to explore body size frequency distributions in Neotropical cichlids, a broadly distributed and ecologically diverse group of fishes that is highly representative of body size diversity in Neotropical freshwater fishes. We test for divergence, phylogenetic autocorrelation and among-clade partitioning of body size space. Neotropical cichlids show low phylogenetic autocorrelation and divergence within and among taxonomic levels. Three distinct regions of body size space were identified from body size frequency distributions at various taxonomic levels corresponding to subclades of the most diverse tribe, Geophagini. These regions suggest that lineages may be evolving towards particular size optima that may be tied to specific ecological roles. The diversification of Geophagini appears to constrain the evolution of body size among other Neotropical cichlid lineages; non-Geophagini clades show lower species-richness in body size regions shared with Geophagini. Neotropical cichlid genera show less divergence and extreme body size than expected within and among tribes. Body size divergence among species may instead be present or linked to ecology at the community assembly scale.
Mixture distributions of wind speed in the UAE
Shin, J.; Ouarda, T.; Lee, T. S.
2013-12-01
Wind speed probability distributions are commonly used to estimate potential wind energy. The 2-parameter Weibull distribution has been most widely used to characterize the distribution of wind speed. However, it is unable to properly model wind speed regimes when the wind speed distribution presents bimodal and kurtotic shapes. Several studies have concluded that the Weibull distribution should not be used for frequency analysis of wind speed without investigation of the wind speed distribution. Due to these mixture distributional characteristics of wind speed data, the application of mixture distributions should be further investigated in the frequency analysis of wind speed. A number of studies have investigated the potential wind energy in different parts of the Arabian Peninsula, and mixture distributional characteristics of wind speed were detected in some of them. Nevertheless, mixture distributions have not been employed for wind speed modeling in the Arabian Peninsula. In order to improve our understanding of wind energy potential in the Arabian Peninsula, mixture distributions should be tested for the frequency analysis of wind speed. The aim of the current study is to assess the suitability of mixture distributions for the frequency analysis of wind speed in the UAE. Hourly mean wind speed data at 10-m height from 7 stations were used in the current study. The Weibull and Kappa distributions were employed as representatives of the conventional non-mixture distributions. Ten mixture distributions were constructed by mixing four probability distributions: the Normal, Gamma, Weibull and Extreme Value type-one (EV-1) distributions. Three parameter estimation methods, the Expectation-Maximization (EM) algorithm, the Least Squares method and the Meta-Heuristic Maximum Likelihood (MHML) method, were employed to estimate the parameters of the mixture distributions. In order to compare the goodness-of-fit of tested distributions and parameter estimation methods for
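The EM idea used for fitting such mixtures can be sketched in a few lines. This is a minimal illustration on synthetic bimodal "wind speed" data with a two-component Normal mixture (the study fits Normal, Gamma, Weibull and EV-1 mixtures; the data, component parameters and two-regime setup here are invented for the example):

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic bimodal "wind speed" sample (two regimes), in m/s.
speeds = np.concatenate([rng.normal(4.0, 1.0, 3000),
                         rng.normal(10.0, 1.5, 2000)])

def em_two_normal(x, iters=200):
    """Minimal EM for a two-component univariate Normal mixture."""
    w = np.array([0.5, 0.5])                  # mixing weights
    mu = np.array([x.min(), x.max()], float)  # crude initialization
    sd = np.array([x.std(), x.std()])
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point.
        pdf = (w / (sd * np.sqrt(2 * np.pi)) *
               np.exp(-0.5 * ((x[:, None] - mu) / sd) ** 2))
        resp = pdf / pdf.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means and standard deviations.
        nk = resp.sum(axis=0)
        w = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        sd = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return w, mu, sd

w, mu, sd = em_two_normal(speeds)
print(w.round(2), mu.round(2), sd.round(2))
```

The same E/M structure carries over to the other component families; only the M-step parameter updates change.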
The extraction and integration framework: a two-process account of statistical learning.
Thiessen, Erik D; Kronstein, Alexandra T; Hufnagle, Daniel G
2013-07-01
The term statistical learning in infancy research originally referred to sensitivity to transitional probabilities. Subsequent research has demonstrated that statistical learning contributes to infant development in a wide array of domains. The range of statistical learning phenomena necessitates a broader view of the processes underlying statistical learning. Learners are sensitive to a much wider range of statistical information than the conditional relations indexed by transitional probabilities, including distributional and cue-based statistics. We propose a novel framework that unifies learning about all of these kinds of statistical structure. From our perspective, learning about conditional relations outputs discrete representations (such as words). Integration across these discrete representations yields sensitivity to cues and distributional information. To achieve sensitivity to all of these kinds of statistical structure, our framework combines processes that extract segments of the input with processes that compare across these extracted items. In this framework, the items extracted from the input serve as exemplars in long-term memory. The similarity structure of those exemplars in long-term memory leads to the discovery of cues and categorical structure, which guides subsequent extraction. The extraction and integration framework provides a way to explain sensitivity to both conditional statistical structure (such as transitional probabilities) and distributional statistical structure (such as item frequency and variability), and also a framework for thinking about how these different aspects of statistical learning influence each other. 2013 APA, all rights reserved
Vehicle-to-Grid Systems for Frequency Regulation in an Islanded Danish Distribution Network
DEFF Research Database (Denmark)
Pillai, Jayakrishnan Radhakrishna; Bak-Jensen, Birgitte
2010-01-01
vehicles could provide power system ancillary services in the form of power balancing reserves to support the large-scale integration of variable renewable energy sources like wind power. This paper investigates the dynamic frequency response of an islanded Danish distribution system operation with large...
Statistical analysis of dragline monitoring data
Energy Technology Data Exchange (ETDEWEB)
Mirabediny, H.; Baafi, E.Y. [University of Tehran, Tehran (Iran)
1998-07-01
Dragline monitoring systems are normally the best tool used to collect data on the machine performance and operational parameters of a dragline operation. This paper discusses results of a time study using data from a dragline monitoring system captured over a four month period. Statistical summaries of the time study in terms of average values, standard deviation and frequency distributions showed that the mode of operation and the geological conditions have a significant influence on the dragline performance parameters. 6 refs., 14 figs., 3 tabs.
International Nuclear Information System (INIS)
Lopez, O; Chanteau, B; Bercy, A; Argence, B; Darquié, B; Chardonnet, C; Amy-Klein, A; Nicolodi, D; Zhang, W; Abgrall, M; Haboucha, A; Kanj, A; Rovera, D; Achkar, J; Pottie, P-E; Coq, Y Le; Santarelli, G
2013-01-01
We report an optical link of 540 km for ultrastable frequency distribution over the Internet fiber network. The phase-noise compensated link shows a fractional frequency instability in full bandwidth of 3×10⁻¹⁴ at one second measurement time and 2×10⁻¹⁸ at 30 000 s. This work is a significant step towards a sustainable wide-area ultrastable optical frequency distribution and comparison network. Time transfer was demonstrated simultaneously on the same link and led to an absolute time accuracy (250 ps) and long-term timing stability (20 ps) which outperform the conventional satellite transfer methods by one order of magnitude. Current development addresses the question of distribution to multiple users in the same metropolitan area. We demonstrate on-line extraction, and first results show frequency stability at the same level as with the conventional link. We also report an application to coherent frequency transfer to the mid-infrared. We demonstrate the frequency stabilisation of a mid-infrared laser to a near-infrared frequency reference transferred through the optical link. A fractional stability better than 4×10⁻¹⁴ at 1 s averaging time was obtained, opening the way to ultrahigh-resolution spectroscopy of molecular rovibrational transitions.
Structure Learning and Statistical Estimation in Distribution Networks - Part II
Energy Technology Data Exchange (ETDEWEB)
Deka, Deepjyoti [Univ. of Texas, Austin, TX (United States); Backhaus, Scott N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Chertkov, Michael [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2015-02-13
Limited placement of real-time monitoring devices in the distribution grid, recent trends notwithstanding, has prevented the easy implementation of demand-response and other smart grid applications. Part I of this paper discusses the problem of learning the operational structure of the grid from nodal voltage measurements. In this work (Part II), the learning of the operational radial structure is coupled with the problem of estimating nodal consumption statistics and inferring the line parameters in the grid. Based on a Linear-Coupled (LC) approximation of the AC power flow equations, polynomial-time algorithms are designed to identify the structure and estimate nodal load characteristics and/or line parameters in the grid using the available nodal voltage measurements. The structure learning algorithm is then extended to cases with missing data, where available observations are limited to a fraction of the grid nodes. The efficacy of the presented algorithms is demonstrated through simulations on several distribution test cases.
Distribution of Argon Arc Contaminated with Nitrogen as Function of Frequency in Pulsed TIG Welding
Takahashi, Hiroki; Tanaka, Tatsuro; Yamamoto, Shinji; Iwao, Toru
2016-09-01
TIG arc welding is a high-quality and widely applicable material joining technology. However, the current has to be kept small because melting of the cathode should be prevented. In that case, the heat input to the weld pool becomes low and welding defects sometimes occur. Pulsed TIG arc welding is used to overcome this disadvantage; it can be controlled through current parameters such as the pulse frequency. However, few reports have described the distribution of an argon arc contaminated with nitrogen. It is important to prevent contamination by nitrogen, which increases the melting depth, in order to prevent welding defects. In this paper, the distribution of an argon arc contaminated with nitrogen in pulsed TIG welding is elucidated as a function of the pulse frequency. The nitrogen concentration, the radial flow velocity and the arc temperature were calculated using the EMTF simulation at the moment the current reached the base current. As a result, the nitrogen concentration in the arc became lower with increasing frequency: the diffusion coefficient decreased because of the decrement of the temperature above 4000 K, and in this case the nitrogen concentration became low near the anode. Therefore, the nitrogen concentration becomes low when the frequency is high.
Cheng, P.W.; Kuik, van G.A.M.; Bussel, van G.J.W.; Vrouwenvelder, A.C.W.M.
2002-01-01
Extreme response is an important design variable for wind turbines. The statistical uncertainties concerning the extreme response distribution are simulated here with data concerning physical characteristics obtained from measurements. The extreme responses are the flap moment at the blade root and
Regional frequency analysis of extreme rainfalls using partial L moments method
Zakaria, Zahrahtul Amani; Shabri, Ani
2013-07-01
An approach based on regional frequency analysis using L moments and LH moments is revisited in this study. Subsequently, an alternative regional frequency analysis using the partial L moments (PL moments) method is employed, and a new relationship for homogeneity analysis is developed. The results were then compared with those obtained using the method of L moments and LH moments of order two. The Selangor catchment, consisting of 37 sites and located on the west coast of Peninsular Malaysia, is chosen as a case study. PL moments for the generalized extreme value (GEV), generalized logistic (GLO), and generalized Pareto distributions were derived and used to develop the regional frequency analysis procedure. The PL moment ratio diagram and the Z test were employed in determining the best-fit distribution. Comparison between the three approaches showed that the GLO and GEV distributions were identified as the suitable distributions for representing the statistical properties of extreme rainfall in Selangor. Monte Carlo simulation used for performance evaluation shows that the method of PL moments outperforms the L and LH moments methods for estimation of large return period events.
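The ordinary sample L-moments that the PL variant generalizes can be computed directly from probability-weighted moments (a sketch following the standard Hosking estimators; the Gumbel "annual maxima" data and its parameters are synthetic, not from the Selangor study):

```python
import numpy as np

def sample_lmoments(x):
    """First four sample L-moments via unbiased probability-weighted moments."""
    x = np.sort(np.asarray(x, float))
    n = len(x)
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((i - 1) * x) / (n * (n - 1))
    b2 = np.sum((i - 1) * (i - 2) * x) / (n * (n - 1) * (n - 2))
    b3 = np.sum((i - 1) * (i - 2) * (i - 3) * x) / (n * (n - 1) * (n - 2) * (n - 3))
    # Standard conversion from PWMs to L-moments.
    l1 = b0
    l2 = 2 * b1 - b0
    l3 = 6 * b2 - 6 * b1 + b0
    l4 = 20 * b3 - 30 * b2 + 12 * b1 - b0
    return l1, l2, l3, l4

rng = np.random.default_rng(2)
rain = rng.gumbel(loc=50.0, scale=12.0, size=20000)  # synthetic annual maxima
l1, l2, l3, l4 = sample_lmoments(rain)
t3, t4 = l3 / l2, l4 / l2  # L-skewness and L-kurtosis ratios
print(round(t3, 3), round(t4, 3))
```

The ratios (t3, t4) are the coordinates used in an L-moment ratio diagram; for the Gumbel distribution the theoretical values are about (0.170, 0.150), which the estimates above approach.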
Lee, Seung Hyun; Hyun, Jae Seog; Kwon, Oh-Young
2010-08-01
The purpose of this study was to examine the cerebral changes in high beta frequency oscillations (22-30 Hz) induced by sertraline and by audiovisual erotic stimuli in healthy adult males. Scalp electroencephalographies (EEGs) were conducted twice in 11 healthy, right-handed males, once before sertraline intake and again 4 hours thereafter. The EEGs included four sessions recorded sequentially while the subjects were resting, watching a music video, resting, and watching an erotic video for 3 minutes, 5 minutes, 3 minutes, and 5 minutes, respectively. We performed frequency-domain analysis on the EEGs with a distributed model of current-source analysis. Statistical nonparametric maps were obtained by comparing the sessions of watching erotic and music videos. Erotic stimuli decreased the current-source density of the high beta frequency band in the middle frontal gyrus, the precentral gyrus, the postcentral gyrus, and the supramarginal gyrus of the left cerebral hemisphere in the baseline EEGs taken before sertraline intake, whereas erotic stimuli did not induce any changes in the current-source distribution of the brain 4 hours after sertraline intake. It is speculated that erotic stimuli may decrease the function of the middle frontal gyrus, the precentral gyrus, the postcentral gyrus, and the supramarginal gyrus of the left cerebral hemisphere in healthy adult males. This change may debase the inhibitory control of the brain against erotic stimuli. Sertraline may reduce the decrement in inhibitory control.
Under-Frequency Load Shedding Technique Considering Event-Based for an Islanded Distribution Network
Directory of Open Access Journals (Sweden)
Hasmaini Mohamad
2016-06-01
Full Text Available One of the biggest challenges for an islanding operation is to sustain the frequency stability. A large power imbalance following islanding would cause under-frequency; hence an appropriate control is required to shed a certain amount of load. The main objective of this research is to develop an adaptive under-frequency load shedding (UFLS) technique for an islanding system. The technique is designed around event-based triggers, which include the moment the system is islanded and the tripping of any DG unit during islanding operation. A disturbance magnitude is calculated to determine the amount of load to be shed. The technique is modeled using the PSCAD simulation tool. Simulation studies on a distribution network with mini hydro generation are carried out to evaluate the UFLS model under different load conditions: peak and base load. Results show that the load shedding technique has successfully shed the required amount of load and stabilized the system frequency.
Giommi, P.; Menna, M. T.; Padovani, P.
1999-01-01
We have assembled a multi-frequency database by cross-correlating the NVSS catalog of radio sources with the RASSBSC list of soft X-ray sources, obtaining optical magnitude estimates from the Palomar and UK Schmidt surveys as provided by the APM and COSMOS on-line services. By exploiting the nearly unique broad-band properties of High-Energy Peaked (HBL) BL Lacs we have statistically identified a sample of 218 objects that is expected to include about 85% of BL Lacs and that is therefore seve...
ERROR DISTRIBUTION EVALUATION OF THE THIRD VANISHING POINT BASED ON RANDOM STATISTICAL SIMULATION
Directory of Open Access Journals (Sweden)
C. Li
2012-07-01
Full Text Available POS, integrated by GPS/INS (Inertial Navigation Systems), has allowed rapid and accurate determination of position and attitude of remote sensing equipment for MMS (Mobile Mapping Systems). However, not only does INS have system error, but it is also very expensive. Therefore, in this paper the error distributions of vanishing points are studied and tested in order to substitute for INS in MMS in some special land-based scenes, such as ground façades, where usually only two vanishing points can be detected. Thus, the traditional calibration approach based on three orthogonal vanishing points is being challenged. In this article, firstly, the line clusters, which are parallel to each other in object space and correspond to the vanishing points, are detected based on RANSAC (Random Sample Consensus) and a parallelism geometric constraint. Secondly, condition adjustment with parameters is utilized to estimate the nonlinear error equations of two vanishing points (VX, VY), and a way to set initial weights for the adjustment solution of single-image vanishing points is presented. The vanishing points are solved and their error distributions estimated based on an iteration method with variable weights, the co-factor matrix and error ellipse theory. Thirdly, under the condition of known error ellipses of the two vanishing points (VX, VY) and on the basis of the triangle geometric relationship of three vanishing points, the error distribution of the third vanishing point (VZ) is calculated and evaluated by random statistical simulation, ignoring camera distortion. Moreover, the Monte Carlo methods utilized for random statistical estimation are presented. Finally, experimental results of vanishing point coordinates and their error distributions are shown and analyzed.
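The final Monte Carlo step can be illustrated with a simplified sketch. It does not reproduce the paper's adjustment model; it uses the standard pinhole-camera orthogonality relation (Vi − pp)·(Vj − pp) = −f², which makes VZ the solution of two linear equations given noisy VX and VY. The principal point, focal length, rotation and pixel noise below are illustrative assumptions, and distortion is ignored as in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

# Assumed pinhole camera (illustrative values, not from the paper).
pp = np.array([640.0, 480.0])   # principal point (pixels)
f = 1000.0                      # focal length (pixels)

# For orthogonal vanishing points, (Vi - pp) . (Vj - pp) = -f^2 (i != j),
# so given VX and VY the point VZ solves two linear equations.
def third_vanishing_point(vx, vy):
    A = np.stack([vx - pp, vy - pp])
    b = np.full(2, -f * f) + A @ pp
    return np.linalg.solve(A, b)

# Ground-truth vanishing points of the world axes for a rotated camera:
# V = pp + f * R[:2, k] / R[2, k] for each axis k.
c, s = np.cos(0.4), np.sin(0.4)
R = np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]]) @ \
    np.array([[1, 0, 0], [0, np.cos(0.3), -np.sin(0.3)], [0, np.sin(0.3), np.cos(0.3)]])
V = [pp + f * R[:2, k] / R[2, k] for k in range(3)]
vx_true, vy_true, vz_true = V

# Monte Carlo: perturb VX, VY with assumed measurement noise, propagate
# to VZ, and summarize VZ's error distribution by its covariance
# (whose eigendecomposition gives the error ellipse).
sigma = 2.0  # pixel noise on the detected vanishing points
vz_samples = np.array([third_vanishing_point(vx_true + rng.normal(0, sigma, 2),
                                             vy_true + rng.normal(0, sigma, 2))
                       for _ in range(5000)])
cov = np.cov(vz_samples.T)
print(vz_samples.mean(axis=0).round(1), np.sqrt(np.diag(cov)).round(1))
```

The covariance of the simulated VZ cloud plays the role of the paper's evaluated error distribution for the third vanishing point.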
Frequency distribution of ABO and Rh (D) blood group alleles in Silte Zone, Ethiopia
Kassahun Tesfaye; Yohannes Petros; Mebeaselassie Andargie
2015-01-01
Background: Frequency distribution of blood groups is important as it is used in modern medicine, genetic research, anthropology, and tracing ancestral relations of humans. The ABO and Rh blood groups are the most important blood groups despite the long list of several other blood groups discovered so far. Aim of the study: To study and document the frequency of ABO and Rh (D) blood groups in three ethnic groups of Silte Zone, Ethiopia. Subjects and methods: ABO and Rh (D) typing was ca...
International Nuclear Information System (INIS)
Oscarsson, T.E.; Roennmark, K.G.
1990-01-01
In this paper the authors present an investigation of low-frequency waves observed on auroral field lines below the acceleration region by the Swedish satellite Viking. The measured frequency spectra are peaked at half the local proton gyrofrequency, and the waves are observed in close connection with precipitating electrons. In order to obtain information about the distribution of wave energy in wave vector space, they reconstruct the wave distribution function (WDF) from observed spectral densities. They use a new scheme that allows them to reconstruct simultaneously the WDF over a broad frequency band. The method also makes it possible to take into account available particle observations as well as Doppler shifts caused by the relative motion between the plasma and the satellite. The distribution of energy in wave vector space suggested by the reconstructed WDF is found to be consistent with what is expected from a plasma instability driven by the observed precipitating electrons. Furthermore, by using UV images obtained on Viking, they demonstrate that the wave propagation directions indicated by the reconstructed WDFs are consistent with a simple model of the presumed wave source in the electron precipitation region
Ruth Marian S. Guzman; Ricardo Noel R. Gervasio; Ian Kendrich C. Fontanilla; Ernelea P. Cao
2009-01-01
Frequency distribution of blood groups is important as it is used in modern medicine, genetic research, anthropology, and tracing ancestral relations of humans. Blood groups include the ABO, Rh and the MN red cell antigens. The frequency distribution of these three blood groups were obtained and assessed for differences from three populations: (1) a regional population from the town of Cabagan located in Isabela province; (2) a cosmopolitan population from the University of the Philippines’ r...
Directory of Open Access Journals (Sweden)
Dr. Abdollah Mousavi
2000-05-01
Full Text Available Objective: Determining the frequency distribution of hearing disorders among public elementary school students by otoscopy, pure-tone audiometry, impedance audiometry and questionnaires. Methods and Materials: This study was carried out as a cross-sectional descriptive survey on 1000 students (500 girls and 500 boys) of the primary schools of EslamAbad Gharb, academic year 1376-77. Results: 1- Otoscopy examination: abnormal conditions of the external ear canal were found in 13.65% of cases, mostly impacted cerumen (13.3%). Abnormal conditions of the external ear canal were more prevalent in girls than in boys, and the difference was statistically significant in the right (PV=0.012) and left (PV=0.043) ears. An abnormal tympanic membrane was seen in 6.75% of cases, mostly retraction (2.95%). 2- Impedance audiometry: 11.05% abnormal tympanograms were observed, mostly type C (4.1%). 3- Pure-tone audiometry: an overall 9.7% hearing loss was found in this population, including 3.5% bilateral and 6.2% unilateral hearing loss. 4.15% of the population suffered from SNHL, observed mostly in boys, and from conductive hearing loss, mostly in girls; the difference was statistically significant in the left ear (PV=0.03). 0.6% were in need of rehabilitation services. Family background showed no effect on the hearing disorders. Only 11.4% of parents, 13.4% of teachers and 14.4% of afflicted students were aware of the problem.
Huang, N. E.; Long, S. R.
1980-01-01
Laboratory experiments were performed to measure the surface elevation probability density function and associated statistical properties for a wind-generated wave field. The laboratory data along with some limited field data were compared. The statistical properties of the surface elevation were processed for comparison with the results derived from the Longuet-Higgins (1963) theory. It is found that, even for the highly non-Gaussian cases, the distribution function proposed by Longuet-Higgins still gives good approximations.
Rouse, I.; Willitsch, S.
2018-04-01
An ion held in a radio-frequency trap interacting with a uniform buffer gas of neutral atoms develops a steady-state energy distribution characterized by a power-law tail at high energies instead of the exponential decay characteristic of thermal equilibrium. We have previously shown that the Tsallis statistics frequently used as an empirical model for this distribution is a good approximation when the ion is heated due to a combination of micromotion interruption and exchange of kinetic energy with the buffer gas [Rouse and Willitsch, Phys. Rev. Lett. 118, 143401 (2017), 10.1103/PhysRevLett.118.143401]. Here, we extend our treatment to include the heating due to additional motion of the ion caused by external forces, including the "excess micromotion" induced by uniform electric fields and rf phase offsets. We show that this also leads to a Tsallis distribution with a potentially different power-law exponent from that observed in the absence of this additional forced motion, with the difference increasing as the ratio of the mass of the neutral atoms to that of the ion decreases. Our results indicate that unless the excess micromotion is minimized to a very high degree, then even a system with very light neutrals and a heavy ion does not exhibit a thermal distribution.
Directory of Open Access Journals (Sweden)
Xiangwu Yan
2018-03-01
Full Text Available The increasing penetration rate of grid-connected renewable energy power generation reduces the primary frequency regulation capability of the system and poses a challenge to the security and stability of the power grid. In this paper, a distributed photovoltaic (PV) storage virtual synchronous generator system is constructed, which realizes the external characteristics of a synchronous generator/motor. For this kind of input/output bidirectional device (e.g., renewable power generation/storage combined systems, pumped storage power stations, battery energy storage systems, and vehicle-to-grid electric vehicles), a synthesis analysis method for system power-frequency considering source-load static frequency characteristics (the S-L analysis method) is proposed in order to depict the system's power balance dynamic adjustment process visually. Simultaneously, an inertia matching method is proposed to solve the problem of inertia matching in the power grid. Through simulation experiments in MATLAB, the feasibility of the distributed PV storage virtual synchronous generator system is verified, as well as the effectiveness of the S-L analysis method and the inertia matching method.
Castilla, José Luis; Pellón, Ricardo
2013-11-01
Under intermittent food schedules animals develop temporally organized behaviors throughout interfood intervals, with behaviors early in the intervals (interim) normally occurring in excess. Schedule-induced drinking (a prototype of interim, adjunctive behavior) is related to food deprivation and food frequency. This study investigated the interactions that resulted from combining different food-deprivation levels (70%, 80% or 90% free-feeding weights) with different food-occurrence frequencies (15-, 30- or 60-s interfood intervals) in a within-subjects design. Increases in food deprivation and food frequency generally led to increased licking, with greater differences due to food deprivation as interfood intervals became shorter. Distributions of licking were modestly shifted to later in the interfood interval as interfood intervals lengthened, a result that was most marked under 90% food deprivation, which also resulted in flatter distributions. It would therefore appear that food deprivation modulates the licking rate and the distribution of licking in different ways. Effects of food deprivation and food frequency are adequately explained by a theory of adjunctive behavior based on delayed food reinforcement, in contrast to alternative hypotheses. © Society for the Experimental Analysis of Behavior.
International Nuclear Information System (INIS)
Orlov, Sergei N; Polivanov, Yurii N
2007-01-01
Dispersion phase matching curves and spectral distributions of the efficiency of difference frequency generation in the terahertz range are calculated for collinear propagation of interacting waves in zinc blende semiconductor crystals (ZnTe, CdTe, GaP, GaAs). The effect of the pump wavelength, the nonlinear crystal length and absorption in the terahertz range on the spectral distribution of the efficiency of difference frequency generation is analysed. (nonlinear optical phenomena)
ABO and Rh (D group distribution and gene frequency; the first multicentric study in India
Directory of Open Access Journals (Sweden)
Amit Agrawal
2014-01-01
Full Text Available Background and Objectives: The study was undertaken with the objective to provide data on the ABO and Rh (D) blood group distribution and gene frequency across India. Materials and Methods: A total of 10,000 healthy blood donors donating in blood banks situated in five different geographical regions of the country (North, South, East and Center) were included in the study. ABO and Rh (D) grouping was performed on all these samples. Data on the frequency of ABO and Rh (D) blood groups were reported in simple numbers and percentages. Results: The study showed that O was the most common blood group (37.12%) in the country, closely followed by B at 32.26% and A at 22.88%, while AB was the least prevalent group at 7.74%. 94.61% of the donor population was Rh positive and the rest were Rh negative. Regional variations were observed in the distribution. Using the maximum likelihood method, the frequencies of the I^A, I^B and I^O alleles were calculated and tested according to the Hardy-Weinberg law of equilibrium. The calculated gene frequencies are 0.1653 for I^A (p), 0.2254 for I^B (q) and 0.6093 for I^O (r). In the Indian population, O (r) records the highest value, followed by B (q) and A (p): O > B > A. Conclusion: The study provides information about the relative distribution of various alleles in the Indian population both on a pan-India basis as well as region-wise. This vital information may be helpful in planning for future health challenges, particularly planning with regards to blood transfusion services.
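The allele frequencies can be approximated directly from the reported phenotype frequencies with Bernstein's classical estimators under Hardy-Weinberg equilibrium (a first-order approximation; the paper uses maximum likelihood, which refines these so the frequencies sum exactly to one):

```python
import math

# Phenotype frequencies reported in the study (pan-India, n = 10,000).
freq = {"O": 0.3712, "A": 0.2288, "B": 0.3226, "AB": 0.0774}

# Bernstein's estimators under Hardy-Weinberg equilibrium:
#   r = sqrt(O),  p = 1 - sqrt(B + O),  q = 1 - sqrt(A + O)
r = math.sqrt(freq["O"])                  # I^O allele frequency
p = 1 - math.sqrt(freq["B"] + freq["O"])  # I^A allele frequency
q = 1 - math.sqrt(freq["A"] + freq["O"])  # I^B allele frequency

print(f"p={p:.4f} q={q:.4f} r={r:.4f} (sum={p+q+r:.4f})")
```

The output (p ≈ 0.167, q ≈ 0.2254, r ≈ 0.6093) closely matches the maximum-likelihood values 0.1653, 0.2254 and 0.6093 reported in the abstract.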
Conceptual citation frequency - quantum mechanics and elementary particle physics
International Nuclear Information System (INIS)
Hurt, C.D.
1986-01-01
The differences in conceptual citation frequency are examined between quantum mechanics literature and elementary particle physics literature. Using a sample based on increments of 5 years, 7 contrast tests were generated over a literature period of 35 years. A Dunn planned comparison procedure indicated a statistical difference in years 5 and 10 but no differences were found in the remaining years. The results must be weighed against the time frames in which the literature was produced but clearly point to an initial difference in the two areas. Additional work is required to reevaluate the findings and to investigate the conceptual citation frequency issue further. The frequency distribution generated approximates a cumulative advantage process. (author)
Willits, Jon A; Seidenberg, Mark S; Saffran, Jenny R
2014-09-01
What makes some words easy for infants to recognize, and other words difficult? We addressed this issue in the context of prior results suggesting that infants have difficulty recognizing verbs relative to nouns. In this work, we highlight the role played by the distributional contexts in which nouns and verbs occur. Distributional statistics predict that English nouns should generally be easier to recognize than verbs in fluent speech. However, there are situations in which distributional statistics provide similar support for verbs. The statistics for verbs that occur with the English morpheme -ing, for example, should facilitate verb recognition. In two experiments with 7.5- and 9.5-month-old infants, we tested the importance of distributional statistics for word recognition by varying the frequency of the contextual frames in which verbs occur. The results support the conclusion that distributional statistics are utilized by infant language learners and contribute to noun-verb differences in word recognition. Copyright © 2014. Published by Elsevier B.V.
DEFF Research Database (Denmark)
Conradsen, Knut; Nielsen, Allan Aasbjerg; Schou, Jesper
2003-01-01
. Based on this distribution, a test statistic for equality of two such matrices and an associated asymptotic probability for obtaining a smaller value of the test statistic are derived and applied successfully to change detection in polarimetric SAR data. In a case study, EMISAR L-band data from April 17...... to HH, VV, or HV data alone, the derived test statistic reduces to the well-known gamma likelihood-ratio test statistic. The derived test statistic and the associated significance value can be applied as a line or edge detector in fully polarimetric SAR data also....
CDFTBL: A statistical program for generating cumulative distribution functions from data
International Nuclear Information System (INIS)
Eslinger, P.W.
1991-06-01
This document describes the theory underlying the CDFTBL code and gives details for using the code. The CDFTBL code provides an automated tool for generating a statistical cumulative distribution function that describes a set of field data. The cumulative distribution function is written in the form of a table of probabilities, which can be used in a Monte Carlo computer code. As a specific application, CDFTBL can be used to analyze field data collected for parameters required by the PORMC computer code. Section 2.0 discusses the mathematical basis of the code. Section 3.0 discusses the code structure. Section 4.0 describes the free-format input command language, while Section 5.0 describes in detail the commands to run the program. Section 6.0 provides example program runs, and Section 7.0 provides references. The Appendix provides a program source listing. 11 refs., 2 figs., 19 tabs
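The tabulated-CDF idea described above is straightforward to sketch. The following is a minimal illustration, not the CDFTBL code itself; the choice of plotting position and the synthetic field data are assumptions for demonstration:

```python
import numpy as np

def cdf_table(samples):
    """Tabulate an empirical CDF as (value, cumulative-probability) pairs,
    the form a Monte Carlo code can consume for inverse-transform sampling."""
    values = np.sort(np.asarray(samples, dtype=float))
    # Hazen plotting position: P_i = (i - 0.5) / n  (one common convention)
    probs = (np.arange(1, values.size + 1) - 0.5) / values.size
    return values, probs

def sample_from_table(values, probs, rng, size=1):
    """Draw from the tabulated CDF by interpolating its inverse."""
    u = rng.uniform(probs[0], probs[-1], size=size)
    return np.interp(u, probs, values)

rng = np.random.default_rng(0)
field_data = rng.lognormal(mean=0.0, sigma=1.0, size=500)  # stand-in field data
vals, ps = cdf_table(field_data)
draws = sample_from_table(vals, ps, rng, size=1000)
```

The resulting `(vals, ps)` table plays the same role as CDFTBL's output: any Monte Carlo driver can resample the field distribution from it without a parametric fit.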
Mulholland, Henry
1968-01-01
Fundamentals of Statistics covers topics on the introduction, fundamentals, and science of statistics. The book discusses the collection, organization and representation of numerical data; elementary probability; the binomial and Poisson distributions; and the measures of central tendency. The text describes measures of dispersion for measuring the spread of a distribution; continuous distributions for measuring on a continuous scale; the properties and use of the normal distribution; and tests involving the normal or Student's t distributions. The use of control charts for sample means; the ranges
Mutual Information in Frequency and Its Application to Measure Cross-Frequency Coupling in Epilepsy
Malladi, Rakesh; Johnson, Don H.; Kalamangalam, Giridhar P.; Tandon, Nitin; Aazhang, Behnaam
2018-06-01
We define a metric, mutual information in frequency (MI-in-frequency), to detect and quantify the statistical dependence between different frequency components in the data, referred to as cross-frequency coupling, and apply it to electrophysiological recordings from the brain to infer cross-frequency coupling. The current metrics used to quantify the cross-frequency coupling in neuroscience cannot detect if two frequency components in non-Gaussian brain recordings are statistically independent or not. Our MI-in-frequency metric, based on Shannon's mutual information between the Cramér representations of stochastic processes, overcomes this shortcoming and can detect statistical dependence in frequency between non-Gaussian signals. We then describe two data-driven estimators of MI-in-frequency: one based on kernel density estimation and the other based on the nearest neighbor algorithm, and validate their performance on simulated data. We then use MI-in-frequency to estimate mutual information between two data streams that are dependent across time, without making any parametric model assumptions. Finally, we use the MI-in-frequency metric to investigate the cross-frequency coupling in the seizure onset zone from electrocorticographic recordings during seizures. The inferred cross-frequency coupling characteristics are essential to optimize the spatial and spectral parameters of electrical stimulation based treatments of epilepsy.
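The core idea, that dependence between frequency components can be measured as mutual information between their spectral representations, can be sketched with a naive histogram (plug-in) estimator rather than the paper's kernel or nearest-neighbor estimators. The signal construction and all parameters below are illustrative assumptions:

```python
import numpy as np

def bin_phase(x, win=64, k=8):
    """Phase of the k-th DFT coefficient over non-overlapping windows."""
    n = x.size // win
    frames = x[:n * win].reshape(n, win)
    return np.angle(np.fft.rfft(frames, axis=1)[:, k])

def plugin_mi(a, b, bins=12):
    """Naive histogram (plug-in) mutual-information estimate, in nats."""
    h, _, _ = np.histogram2d(a, b, bins=bins)
    p = h / h.sum()
    px, py = p.sum(axis=1, keepdims=True), p.sum(axis=0, keepdims=True)
    nz = p > 0
    return float(np.sum(p[nz] * np.log(p[nz] / (px @ py)[nz])))

rng = np.random.default_rng(1)
win, k, nwin = 64, 8, 4000
theta = rng.uniform(0, 2 * np.pi, nwin)            # per-window phase shared by x and y
carrier = 2 * np.pi * k / win * np.arange(win)
x = (np.sin(carrier + theta[:, None]) + 0.2 * rng.standard_normal((nwin, win))).ravel()
y = (np.sin(carrier + theta[:, None]) + 0.2 * rng.standard_normal((nwin, win))).ravel()
z = rng.standard_normal(nwin * win)                # independent control signal
mi_coupled = plugin_mi(bin_phase(x, win, k), bin_phase(y, win, k))
mi_null = plugin_mi(bin_phase(x, win, k), bin_phase(z, win, k))
```

Because `x` and `y` share the per-window phase at the chosen frequency bin, `mi_coupled` is large while `mi_null` stays near the estimator's small positive bias; this is the qualitative behavior an MI-in-frequency metric is meant to expose.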
Hansen, John P
2003-01-01
Healthcare quality improvement professionals need to understand and use inferential statistics to interpret sample data from their organizations. In quality improvement and healthcare research studies all the data from a population often are not available, so investigators take samples and make inferences about the population by using inferential statistics. This three-part series will give readers an understanding of the concepts of inferential statistics as well as the specific tools for calculating confidence intervals for samples of data. This article, Part 2, describes probability, populations, and samples. The uses of descriptive and inferential statistics are outlined. The article also discusses the properties and probability of normal distributions, including the standard normal distribution.
Landslide scaling and magnitude-frequency distribution (Invited)
Stark, C. P.; Guzzetti, F.
2009-12-01
Landslide-driven erosion is controlled by the scale and frequency of slope failures and by the consequent fluxes of debris off the hillslopes. Here I focus on the magnitude-frequency part of the process and develop a theory of initial slope failure and debris mobilization that reproduces the heavy-tailed probability density functions (PDFs) observed for landslide source areas and volumes. Landslide rupture propagation is treated as a quasi-static, non-inertial process of simplified elastoplastic deformation with strain weakening; debris runout is not considered. The model tracks the stochastically evolving imbalance of frictional, cohesive, and body forces across a failing slope, and uses safety-factor concepts to convert the evolving imbalance into a series of incremental rupture growth or arrest probabilities. A single rupture is simulated with a sequence of weighted "coin tosses" with weights set by the growth probabilities. Slope failure treated in this stochastic way is a survival process that generates asymptotically power-law-tail PDFs of area and volume for rock and debris slides; predicted scaling exponents are consistent with analyses of landslide inventories. The primary control on the shape of the model PDFs is the relative importance of cohesion over friction in setting slope stability: the scaling of smaller, shallower failures, and the size of the most common landslide volumes, are the result of the low cohesion of soil and regolith, whereas the negative power-law tail scaling for larger failures is tied to the greater cohesion of bedrock. The debris budget may be dominated by small or large landslides depending on the scaling of both the PDF and of the depth-length relation. I will present new model results that confirm the hypothesis that depth-length scaling is linear. [Figure: model PDF of landslide volumes.]
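How a coin-toss survival process produces a power-law tail can be seen in a deliberately simplified toy, not the paper's elastoplastic model: let the force imbalance perform a fair random walk, and let the rupture arrest when the imbalance first returns to zero. First-passage theory for this critical walk gives P(area > a) ~ a^(-1/2):

```python
import numpy as np

def rupture_area(rng, max_steps=100_000):
    """One rupture: the force imbalance does a fair random walk started at 1;
    each step enlarges the failure by one increment, and the rupture arrests
    when the imbalance first returns to 0 (a critical survival process)."""
    imbalance, area = 1, 0
    while imbalance > 0 and area < max_steps:
        imbalance += 1 if rng.random() < 0.5 else -1
        area += 1
    return area

rng = np.random.default_rng(2)
areas = np.array([rupture_area(rng) for _ in range(5000)])
# For a power-law survival function ~ a^(-1/2), this ratio is near (100/10)^(-1/2)
tail_ratio = np.mean(areas > 100) / np.mean(areas > 10)
```

The heavy tail emerges from survival-versus-arrest competition alone; in the actual model the growth probabilities additionally encode cohesion and friction, which shapes the exponents.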
Blakemore, J S
1962-01-01
Semiconductor Statistics presents statistics aimed at complementing existing books on the relationships between carrier densities and transport effects. The book is divided into two parts. Part I provides introductory material on the electron theory of solids, and then discusses carrier statistics for semiconductors in thermal equilibrium. Of course a solid cannot be in true thermodynamic equilibrium if any electrical current is passed; but when currents are reasonably small the distribution function is but little perturbed, and the carrier distribution for such a "quasi-equilibrium" co
DEFF Research Database (Denmark)
Jurado-Navas, Antonio
2015-01-01
in homogeneous, isotropic turbulence. Málaga distribution was demonstrated to have the advantage of unifying most of the proposed statistical models derived until now in the scientific literature in a closed-form and mathematically-tractable expression. Furthermore, it unifies most of the proposed statistical...... models for the irradiance fluctuations derived in the bibliography providing, in addition, an excellent agreement with published plane wave and spherical wave simulation data over a wide range of turbulence conditions (weak to strong). In this communication, reviews of its different features...... scintillation in atmospheric optical communication links under any turbulence conditions...
Noise and the statistical mechanics of distributed transport in a colony of interacting agents
Katifori, Eleni; Graewer, Johannes; Ronellenfitsch, Henrik; Mazza, Marco G.
Inspired by the process of liquid food distribution between individuals in an ant colony, in this work we consider the statistical mechanics of resource dissemination between interacting agents with finite carrying capacity. The agents move inside a confined space (nest), pick up the food at the entrance of the nest and share it with other agents that they encounter. We calculate analytically and via a series of simulations the global food intake rate for the whole colony as well as observables describing how uniformly the food is distributed within the nest. Our model and predictions provide a useful benchmark to assess which strategies can lead to efficient food distribution within the nest and also to what level the observed food uptake rates and efficiency in food distribution are due to stochastic fluctuations or specific food exchange strategies by an actual ant colony.
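The global intake rate and the uniformity observables mentioned above can be probed with a toy simulation. This is an assumption-laden caricature of the model, not the authors' dynamics: agents have unit carrying capacity, an agent visiting the entrance fills up completely, and a randomly meeting pair simply equalises their loads:

```python
import numpy as np

def simulate_colony(n=60, steps=4000, p_entrance=0.01, seed=5):
    """Toy trophallaxis model: entrance visits refill an agent to capacity 1;
    random pairwise encounters average the two agents' loads."""
    rng = np.random.default_rng(seed)
    load = np.zeros(n)
    intake = 0.0
    for _ in range(steps):
        if rng.random() < p_entrance * n:        # some agent reaches the entrance
            i = rng.integers(n)
            intake += 1.0 - load[i]              # food picked up this visit
            load[i] = 1.0
        i, j = rng.integers(n, size=2)           # a random encounter in the nest
        if i != j:
            load[i] = load[j] = 0.5 * (load[i] + load[j])
    return intake, load

intake, load = simulate_colony()
uniformity = load.std()   # low spread means food is evenly distributed
```

Varying the sharing rule here (e.g. transferring only a fraction of the difference) is the kind of strategy comparison the abstract's benchmark is meant to support.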
Golombeck, M.; Rapp, D.
1996-01-01
The size-frequency distributions of rocks at the Viking landing sites, and at a variety of rocky locations on Earth formed by a number of geologic processes, all have the general shape of simple exponential curves. These curves have been combined with remote sensing data and models of rock abundance to predict the frequency of boulders potentially hazardous to future Mars landers and rovers.
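Fitting such an exponential rock-abundance model, F(D) = k·exp(−q·D) for the cumulative fraction of area covered by rocks of diameter ≥ D, reduces to a linear fit in log space. The data values below are purely illustrative, not Viking measurements:

```python
import numpy as np

# Hypothetical cumulative rock-coverage data: fraction of area covered by
# rocks with diameter >= D, modeled as F(D) = k * exp(-q * D).
D = np.array([0.1, 0.2, 0.4, 0.8, 1.6])             # diameter, m (illustrative)
F = np.array([0.055, 0.042, 0.025, 0.009, 0.001])   # coverage (illustrative)
slope, intercept = np.polyfit(D, np.log(F), 1)      # linear fit in log space
q, k = -slope, np.exp(intercept)                    # decay rate and prefactor
predicted = k * np.exp(-q * D)
```

Once `q` and `k` are fitted for a site, evaluating the model at a lander-hazard threshold diameter gives the predicted abundance of dangerous boulders.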
Inference for Local Distributions at High Sampling Frequencies: A Bootstrap Approach
DEFF Research Database (Denmark)
Hounyo, Ulrich; Varneskov, Rasmus T.
of "large" jumps. Our locally dependent wild bootstrap (LDWB) accommodates issues related to the stochastic scale and jumps as well as accounts for a special block-wise dependence structure induced by sampling errors. We show that the LDWB replicates first and second-order limit theory from the usual...... empirical process and the stochastic scale estimate, respectively, as well as an asymptotic bias. Moreover, we design the LDWB sufficiently general to establish asymptotic equivalence between it and a nonparametric local block bootstrap, also introduced here, up to second-order distribution theory....... Finally, we introduce LDWB-aided Kolmogorov-Smirnov tests for local Gaussianity as well as local von-Mises statistics, with and without bootstrap inference, and establish their asymptotic validity using the second-order distribution theory. The finite sample performance of CLT and LDWB-aided local...
Coelho, Carlos A.; Marques, Filipe J.
2013-09-01
In this paper the authors combine the equicorrelation and equivariance test introduced by Wilks [13] with the likelihood ratio test (l.r.t.) for independence of groups of variables to obtain the l.r.t. of block equicorrelation and equivariance. This test or its single block version may find applications in many areas as in psychology, education, medicine, genetics and they are important "in many tests of multivariate analysis, e.g. in MANOVA, Profile Analysis, Growth Curve analysis, etc" [12, 9]. By decomposing the overall hypothesis into the hypotheses of independence of groups of variables and the hypothesis of equicorrelation and equivariance we are able to obtain the expressions for the overall l.r.t. statistic and its moments. From these we obtain a suitable factorization of the characteristic function (c.f.) of the logarithm of the l.r.t. statistic, which enables us to develop highly manageable and precise near-exact distributions for the test statistic.
Watson, Kara M.; McHugh, Amy R.
2014-01-01
Regional regression equations were developed for estimating monthly flow-duration and monthly low-flow frequency statistics for ungaged streams in Coastal Plain and non-coastal regions of New Jersey for baseline and current land- and water-use conditions. The equations were developed to estimate 87 different streamflow statistics, which include the monthly 99-, 90-, 85-, 75-, 50-, and 25-percentile flow-durations of the minimum 1-day daily flow; the August–September 99-, 90-, and 75-percentile minimum 1-day daily flow; and the monthly 7-day, 10-year (M7D10Y) low-flow frequency. These 87 streamflow statistics were computed for 41 continuous-record streamflow-gaging stations (streamgages) with 20 or more years of record and 167 low-flow partial-record stations in New Jersey with 10 or more streamflow measurements. The regression analyses used to develop equations to estimate selected streamflow statistics were performed by testing the relation between flow-duration statistics and low-flow frequency statistics for 32 basin characteristics (physical characteristics, land use, surficial geology, and climate) at the 41 streamgages and 167 low-flow partial-record stations. The regression analyses determined drainage area, soil permeability, average April precipitation, average June precipitation, and percent storage (water bodies and wetlands) were the significant explanatory variables for estimating the selected flow-duration and low-flow frequency statistics. Streamflow estimates were computed for two land- and water-use conditions in New Jersey—land- and water-use during the baseline period of record (defined as the years a streamgage had little to no change in development and water use) and current land- and water-use conditions (1989–2008)—for each selected station using data collected through water year 2008. The baseline period of record is representative of a period when the basin was unaffected by change in development. The current period is
Directory of Open Access Journals (Sweden)
Gökhan Gökdere
2014-05-01
In this paper, closed-form expressions for the moments of the truncated Pareto order statistics are obtained by using conditional distribution. We also derive some results for the moments which will be useful for moment computations based on ordered data.
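The moments in question can be checked numerically without the closed forms: sample the truncated Pareto by inverse-CDF, and compare a Monte Carlo moment of an order statistic against direct quadrature. All parameter values below are arbitrary choices for illustration:

```python
import numpy as np

alpha, L, H, n = 2.0, 1.0, 10.0, 5    # shape, truncation bounds, sample size
c = 1.0 - (L / H) ** alpha            # normalising constant of the truncated CDF

def rvs(rng, size):
    """Inverse-CDF draws from the Pareto(alpha) truncated to [L, H]."""
    u = rng.uniform(size=size)
    return L * (1.0 - u * c) ** (-1.0 / alpha)

# First moment of the minimum: E[X_(1)] = L + integral over [L,H] of (1-F)^n dx
x = np.linspace(L, H, 20001)
s = (((L / x) ** alpha - (L / H) ** alpha) / c) ** n         # survival^n
e_min_quad = L + float(np.sum(0.5 * (s[1:] + s[:-1]) * np.diff(x)))

rng = np.random.default_rng(3)
e_min_mc = rvs(rng, size=(200_000, n)).min(axis=1).mean()    # Monte Carlo check
```

The two estimates agree to a few decimal places, which is exactly the kind of sanity check the paper's closed-form moment expressions make unnecessary.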
Pedretti, Daniele; Beckie, Roger Daniel
2014-05-01
Missing data in hydrological time-series databases are ubiquitous in practical applications, yet it is of fundamental importance to make educated decisions in problems involving exhaustive time-series knowledge. This includes precipitation datasets, since recording or human failures can produce gaps in these time series. For some applications, directly involving the ratio between precipitation and some other quantity, lack of complete information can result in poor understanding of basic physical and chemical dynamics involving precipitated water. For instance, the ratio between precipitation (recharge) and outflow rates at a discharge point of an aquifer (e.g. rivers, pumping wells, lysimeters) can be used to obtain aquifer parameters and thus to constrain model-based predictions. We tested a suite of methodologies to reconstruct missing information in rainfall datasets. The goal was to obtain a suitable and versatile method to reduce the errors given by the lack of data in specific time windows. Our analyses included both a classical chronologically-pairing approach between rainfall stations and a probability-based approach, which accounted for the probability of exceedance of rain depths measured at two or multiple stations. Our analyses showed that it is not clear a priori which method performs best; rather, this selection should be made by considering the specific statistical properties of the rainfall dataset. In this presentation, our emphasis is to discuss the effects of a few typical parametric distributions used to model the behavior of rainfall. Specifically, we analyzed the role of distributional "tails", which have an important control on the occurrence of extreme rainfall events. The latter strongly affect several hydrological applications, including recharge-discharge relationships. The heavy-tailed distributions we considered were the parametric Log-Normal, Generalized Pareto, Generalized Extreme Value and Gamma distributions. The methods were
Saha, Arindam; Amritkar, R. E.
2014-12-01
Kuramoto oscillators have been proposed earlier as a model for interacting systems that exhibit synchronisation. In this article we study the difference between networks with symmetric and asymmetric distribution of natural frequencies. We first indicate that the synchronisation frequency of the oscillators is independent of the natural frequency distribution for a completely connected network. Further we analyse the case of oscillators in a directed ring-network where asymmetry in the natural frequency distribution is seen to shift the synchronisation frequency of the network. We also present an estimate of the shift in the frequencies for slightly asymmetric distributions.
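The completely connected case discussed above is easy to verify numerically: summing the all-to-all Kuramoto equations cancels the antisymmetric coupling terms, so the locked frequency equals the mean natural frequency regardless of the distribution's asymmetry. A minimal sketch with an asymmetric (exponential) frequency distribution, all parameters assumed for illustration:

```python
import numpy as np

def kuramoto_mean_freqs(omega, K=10.0, dt=0.01, steps=10_000, seed=4):
    """Euler-integrate all-to-all Kuramoto oscillators
    (d theta_i/dt = omega_i + (K/n) sum_j sin(theta_j - theta_i))
    and return each oscillator's average frequency over the last half-run."""
    rng = np.random.default_rng(seed)
    n = omega.size
    theta = rng.uniform(0, 2 * np.pi, n)
    acc = np.zeros(n)
    for step in range(steps):
        coupling = (K / n) * np.sin(theta[None, :] - theta[:, None]).sum(axis=1)
        dtheta = omega + coupling
        theta += dt * dtheta
        if step >= steps // 2:
            acc += dtheta
    return acc / (steps - steps // 2)

rng = np.random.default_rng(4)
omega = rng.exponential(1.0, size=20)    # asymmetric natural-frequency distribution
f = kuramoto_mean_freqs(omega)
# With strong all-to-all coupling, every oscillator locks to mean(omega)
```

For the directed ring studied in the article the cancellation argument fails, which is precisely why asymmetry in the natural frequencies can shift the synchronisation frequency there.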
Schneider, Markus P. A.
This dissertation contributes to two areas in economics: the understanding of the distribution of earned income and to Bayesian analysis of distributional data. Recently, physicists claimed that the distribution of earned income is exponential (see Yakovenko, 2009). The first chapter explores the perspective that the economy is a statistical mechanical system and the implication for labor market outcomes is considered critically. The robustness of the empirical results that lead to the physicists' claims, the significance of the exponential distribution in statistical mechanics, and the case for a conservation law in economics are discussed. The conclusion reached is that physicists' conception of the economy is too narrow even within their chosen framework, but that their overall approach is insightful. The dual labor market theory of segmented labor markets is invoked to understand why the observed distribution may be a mixture of distributional components, corresponding to different generating mechanisms described in Reich et al. (1973). The application of informational entropy in chapter II connects this work to Bayesian analysis and maximum entropy econometrics. The analysis follows E. T. Jaynes's treatment of Wolf's dice data, but is applied to the distribution of earned income based on CPS data. The results are calibrated to account for rounded survey responses using a simple simulation, and answer the graphical analyses by physicists. The results indicate that neither the income distribution of all respondents nor of the subpopulation used by physicists appears to be exponential. The empirics do support the claim that a mixture with exponential and log-normal distributional components ts the data. In the final chapter, a log-linear model is used to fit the exponential to the earned income distribution. Separating the CPS data by gender and marital status reveals that the exponential is only an appropriate model for a limited number of subpopulations, namely
Distributive estimation of frequency selective channels for massive MIMO systems
Zaib, Alam
2015-12-28
We consider frequency selective channel estimation in the uplink of massive MIMO-OFDM systems, where our major concern is complexity. A low complexity distributed LMMSE algorithm is proposed that attains near optimal channel impulse response (CIR) estimates from noisy observations at the receive antenna array. In the proposed method, every antenna estimates the CIRs of its neighborhood, followed by recursive sharing of estimates with immediate neighbors. At each step, every antenna calculates the weighted average of the shared estimates, which converges to the near optimal LMMSE solution. The simulation results validate the near optimal performance of the proposed algorithm in terms of mean square error (MSE). © 2015 EURASIP.
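The share-and-average step at the heart of such distributed schemes can be sketched as plain neighborhood consensus. This is the averaging idea only, not the paper's LMMSE weighting; the ring layout and scalar "CIR" values are assumptions:

```python
import numpy as np

def neighborhood_consensus(local, neighbors, iters=200):
    """Each antenna repeatedly replaces its estimate by the average of its own
    and its neighbours' estimates; on a connected graph with symmetric links
    this converges to the global average of the initial local estimates."""
    est = np.asarray(local, dtype=float)
    for _ in range(iters):
        est = np.array([np.mean([est[i]] + [est[j] for j in nbrs])
                        for i, nbrs in enumerate(neighbors)])
    return est

n = 8                                            # antennas on a ring (toy layout)
ring = [[(i - 1) % n, (i + 1) % n] for i in range(n)]
rng = np.random.default_rng(5)
local = 1.0 + 0.3 * rng.standard_normal(n)       # noisy scalar "CIR" estimates
consensus = neighborhood_consensus(local, ring)
```

Averaging the noisy local estimates reduces the per-antenna error without any node ever seeing the whole array's data, which is the complexity advantage the abstract emphasizes.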
Crater size-frequency distributions and a revised Martian relative chronology
International Nuclear Information System (INIS)
Barlow, N.G.
1988-01-01
A relative plotting technique is applied to Viking 1:2M photomosaics of 25,826 Martian craters of diameter greater than 8 km and age younger than that of the Martian surface. The size-frequency distribution curves are calculated and analyzed in detail, and the results are presented in extensive tables and maps. It is found that about 60 percent of the crater-containing lithologic units, including many small volcanoes and the ridged plains, were formed during the heavy-bombardment period (HBP), while 40 percent arose after the HBP. Wide region-to-region variation in the crater density is noted, and localized age estimates are provided. 42 references
Differences in Crossover Frequency and Distribution among Three Sibling Species of Drosophila
True, J. R.; Mercer, J. M.; Laurie, C. C.
1996-01-01
Comparisons of the genetic and cytogenetic maps of three sibling species of Drosophila reveal marked differences in the frequency and cumulative distribution of crossovers during meiosis. The maps for two of these species, Drosophila melanogaster and D. simulans, have previously been described, while this report presents new map data for D. mauritiana, obtained using a set of P element markers. A genetic map covering nearly the entire genome was constructed by estimating the recombination fra...
Dupoyet, B.; Fiebig, H. R.; Musgrove, D. P.
2010-01-01
We report on initial studies of a quantum field theory defined on a lattice with multi-ladder geometry and the dilation group as a local gauge symmetry. The model is relevant in the cross-disciplinary area of econophysics. A corresponding proposal by Ilinski aimed at gauge modeling in non-equilibrium pricing is implemented in a numerical simulation. We arrive at a probability distribution of relative gains which matches the high frequency historical data of the NASDAQ stock exchange index.
Distributed Optical Fiber Sensors Based on Optical Frequency Domain Reflectometry: A review.
Ding, Zhenyang; Wang, Chenhuan; Liu, Kun; Jiang, Junfeng; Yang, Di; Pan, Guanyi; Pu, Zelin; Liu, Tiegen
2018-04-03
Distributed optical fiber sensors (DOFS) offer unprecedented features, the most distinctive of which is the ability to monitor variations of physical and chemical parameters with spatial continuity along the fiber. Among all these distributed sensing techniques, optical frequency domain reflectometry (OFDR) has been given tremendous attention because of its high spatial resolution and large dynamic range. In addition, DOFS based on OFDR have been used to sense many parameters. In this review, we survey the key technologies for improving sensing range, spatial resolution and sensing performance in DOFS based on OFDR. We also introduce the sensing mechanisms and the applications of DOFS based on OFDR, including strain, stress, vibration, temperature, 3D shape, flow, refractive index, magnetic field, radiation, gas and so on.
Theoretical statistics of zero-age cataclysmic variables
International Nuclear Information System (INIS)
Politano, M.J.
1988-01-01
The distribution of the white dwarf masses, the distribution of the mass ratios and the distribution of the orbital periods in cataclysmic variables which are forming at the present time are calculated. These systems are referred to as zero-age cataclysmic variables. The results show that 60% of the systems being formed contain helium white dwarfs and 40% contain carbon-oxygen white dwarfs. The mean white dwarf mass in those systems containing helium white dwarfs is 0.34. The mean white dwarf mass in those systems containing carbon-oxygen white dwarfs is 0.75. The orbital period distribution identifies four main classes of zero-age cataclysmic variables: (1) short-period systems containing helium white dwarfs, (2) systems containing carbon-oxygen white dwarfs whose secondaries are convectively stable against rapid mass transfer to the white dwarf, (3) systems containing carbon-oxygen white dwarfs whose secondaries are radiatively stable against rapid mass transfer to the white dwarf and (4) long period systems with evolved secondaries. The white dwarf mass distribution in zero-age cataclysmic variables has direct application to the calculation of the frequency of outburst in classical novae as a function of the mass of the white dwarf. The method developed in this thesis to calculate the distributions of the orbital parameters in zero-age cataclysmic variables can be used to calculate theoretical statistics of any class of binary systems. This method provides a theoretical framework from which to investigate the statistical properties and the evolution of the orbital parameters of binary systems
Non-Gaussian power grid frequency fluctuations characterized by Lévy-stable laws and superstatistics
Schäfer, Benjamin; Beck, Christian; Aihara, Kazuyuki; Witthaut, Dirk; Timme, Marc
2018-02-01
Multiple types of fluctuations impact the collective dynamics of power grids and thus challenge their robust operation. Fluctuations result from processes as different as dynamically changing demands, energy trading and an increasing share of renewable power feed-in. Here we analyse principles underlying the dynamics and statistics of power grid frequency fluctuations. Considering frequency time series for a range of power grids, including grids in North America, Japan and Europe, we find a strong deviation from Gaussianity best described as Lévy-stable and q-Gaussian distributions. We present a coarse framework to analytically characterize the impact of arbitrary noise distributions, as well as a superstatistical approach that systematically interprets heavy tails and skewed distributions. We identify energy trading as a substantial contribution to today's frequency fluctuations and effective damping of the grid as a controlling factor enabling reduction of fluctuation risks, with enhanced effects for small power grids.
Statistical modeling of urban air temperature distributions under different synoptic conditions
Beck, Christoph; Breitner, Susanne; Cyrys, Josef; Hald, Cornelius; Hartz, Uwe; Jacobeit, Jucundus; Richter, Katja; Schneider, Alexandra; Wolf, Kathrin
2015-04-01
Within urban areas air temperature may vary distinctly between different locations. These intra-urban air temperature variations partly reach magnitudes that are relevant with respect to human thermal comfort. Therefore, and furthermore taking into account potential interrelations with other health related environmental factors (e.g. air quality), it is important to estimate spatial patterns of intra-urban air temperature distributions that may be incorporated into urban planning processes. In this contribution we present an approach to estimate spatial temperature distributions in the urban area of Augsburg (Germany) by means of statistical modeling. At 36 locations in the urban area of Augsburg air temperatures have been measured with high temporal resolution (4 min.) since December 2012. These 36 locations represent different typical urban land use characteristics in terms of varying percentage coverages of different land cover categories (e.g. impervious, built-up, vegetated). Percentage coverages of these land cover categories have been extracted from different sources (Open Street Map, European Urban Atlas, Urban Morphological Zones) for regular grids of varying size (50, 100, 200 meter horizontal resolution) for the urban area of Augsburg. It is well known from numerous studies that land use characteristics have a distinct influence on air temperature, as well as on other climatic variables, at a given location. Therefore air temperatures at the 36 locations are modeled utilizing land use characteristics (percentage coverages of land cover categories) as predictor variables in Stepwise Multiple Regression models and in Random Forest based model approaches. After model evaluation via cross-validation, appropriate statistical models are applied to gridded land use data to derive spatial urban air temperature distributions. Varying models are tested and applied for different seasons and times of the day and also for different synoptic conditions (e.g. clear and calm
Cohn, T.A.; England, J.F.; Berenbrock, C.E.; Mason, R.R.; Stedinger, J.R.; Lamontagne, J.R.
2013-01-01
The Grubbs-Beck test is recommended by the federal guidelines for detection of low outliers in flood flow frequency computation in the United States. This paper presents a generalization of the Grubbs-Beck test for normal data (similar to the Rosner (1983) test; see also Spencer and McCuen (1996)) that can provide a consistent standard for identifying multiple potentially influential low flows. In cases where low outliers have been identified, they can be represented as "less-than" values, and a frequency distribution can be developed using censored-data statistical techniques, such as the Expected Moments Algorithm. This approach can improve the fit of the right-hand tail of a frequency distribution and provide protection from lack-of-fit due to unimportant but potentially influential low flows (PILFs) in a flood series, thus making the flood frequency analysis procedure more robust.
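A single-outlier Grubbs-Beck-style screen is easy to sketch: work with log-transformed flows, compute a critical value mean − K_N·std, and flag flows below it. The K_N expression below is the commonly cited 10-percent-level approximation (treat it as an assumption), and the flood series is synthetic; this is not the paper's multiple-outlier generalization:

```python
import numpy as np

def grubbs_beck_low_outliers(flows):
    """Flag potentially influential low flows (PILFs): log-transform, build the
    critical value mean - K_N * std, and flag flows below it as low outliers."""
    x = np.log10(np.asarray(flows, dtype=float))
    n_obs = x.size
    # Commonly cited approximation to the 10%-level Grubbs-Beck K_N (assumption)
    k_n = -0.9043 + 3.345 * np.sqrt(np.log10(n_obs)) - 0.4046 * np.log10(n_obs)
    x_crit = x.mean() - k_n * x.std(ddof=1)
    return np.asarray(flows) < 10 ** x_crit

rng = np.random.default_rng(6)
flows = 10 ** rng.normal(3.0, 0.25, size=50)   # synthetic lognormal flood series
flows[0] = 5.0                                  # one artificially tiny annual flow
flagged = grubbs_beck_low_outliers(flows)
```

Flagged flows would then be recoded as "less-than" values and the distribution refit with censored-data techniques such as the Expected Moments Algorithm, as the abstract describes.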
International Nuclear Information System (INIS)
Cologne, J.B.; Preston, D.L.
1998-01-01
Biological dosimeters are useful for epidemiologic risk assessment in populations exposed to catastrophic nuclear events and as a means of validating physical dosimetry in radiation workers. Application requires knowledge of the magnitude of uncertainty in the biological dose estimates and an understanding of potential statistical pitfalls arising from their use. This paper describes the statistical aspects of biological dosimetry in general and presents a detailed analysis in the specific case of dosimetry for risk assessment using stable chromosome aberration frequency. Biological dose estimates may be obtained from a dose-response curve, but negative estimates can result and adjustment must be made for regression bias due to imprecise estimation when the estimates are used in regression analyses. Posterior-mean estimates, derived as the mean of the distribution of true doses compatible with a given value of the biological endpoint, have several desirable properties: they are nonnegative, less sensitive to extreme skewness in the true dose distribution, and implicitly adjusted to avoid regression bias. The methods necessitate approximating the true-dose distribution in the population in which biological dosimetry is being applied, which calls for careful consideration of this distribution through other information. An important question addressed here is to what extent the methods are robust to misspecification of this distribution, because in many applications of biological dosimetry it cannot be characterized well. The findings suggest that dosimetry based solely on stable chromosome aberration frequency may be useful for population-based risk assessment
Structure Learning and Statistical Estimation in Distribution Networks - Part I
Energy Technology Data Exchange (ETDEWEB)
Deka, Deepjyoti [Univ. of Texas, Austin, TX (United States); Backhaus, Scott N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Chertkov, Michael [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2015-02-13
Traditionally power distribution networks are either not observable or only partially observable. This complicates development and implementation of new smart grid technologies, such as those related to demand response, outage detection and management, and improved load-monitoring. In this two part paper, inspired by proliferation of the metering technology, we discuss estimation problems in structurally loopy but operationally radial distribution grids from measurements, e.g. voltage data, which are either already available or can be made available with a relatively minor investment. In Part I, the objective is to learn the operational layout of the grid. Part II of this paper presents algorithms that estimate load statistics or line parameters in addition to learning the grid structure. Further, Part II discusses the problem of structure estimation for systems with incomplete measurement sets. Our newly suggested algorithms apply to a wide range of realistic scenarios. The algorithms are also computationally efficient (polynomial in time), which is proven theoretically and illustrated computationally on a number of test cases. The technique developed can be applied to detect line failures in real time as well as to understand the scope of possible adversarial attacks on the grid.
Incoherent Optical Frequency Domain Reflectometry for Distributed Thermal Sensing
DEFF Research Database (Denmark)
Karamehmedovic, Emir
2006-01-01
This thesis reports the main results from an investigation of a fibre-optic distributed temperature sensor based on spontaneous Raman scattering. The technique used for spatial resolving is incoherent optical frequency domain reflectometry, where a pump laser is sine modulated with a stepwise … comprising a pump laser, optical filters, optical fibre and photo-detectors are presented. Limitations, trade-offs and optimisation processes are described for setups having different specifications with respect to range, resolution and accuracy. The analysis is conducted using computer simulation programs developed and implemented in Matlab. The computer model is calibrated and tested, and describes the entire system with high precision. Noise analysis and digital processing of the detected signal are discussed as well. An equation describing the standard deviation of the measured temperature is derived …
Directory of Open Access Journals (Sweden)
Mehmet KURBAN
2007-01-01
Full Text Available In this paper, the wind energy potential of the region is analyzed with the Weibull and Rayleigh statistical distribution functions, using wind speed data measured every 15 seconds in July, August, September, and October of 2005 at a height of 10 m on the 30-m observation pole of the wind observation station constructed within the scope of the scientific research project titled "The Construction of a Hybrid (Wind-Solar) Power Plant Model by Determining the Wind and Solar Potential in the Iki Eylul Campus of A.U.", supported by Anadolu University. The maximum likelihood method is used to find the parameters of these distributions. The analysis for the months considered shows that the Weibull distribution models the wind speeds better than the Rayleigh distribution. Furthermore, the error rate in the monthly values of power density computed using the Weibull distribution is smaller than for the Rayleigh distribution.
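The fitting step described in this abstract (maximum-likelihood estimation of Weibull parameters from wind-speed samples) can be sketched in a few lines. This is an illustrative reimplementation, not the study's code; `weibull_mle` and the synthetic scale-6/shape-2 data are assumptions.

```python
import math
import random

def weibull_mle(x, iters=100):
    """Maximum-likelihood Weibull fit: returns (shape k, scale lam).
    The 1-D profile-likelihood equation for the shape is solved by bisection."""
    mean_log = sum(math.log(v) for v in x) / len(x)

    def g(k):  # increasing in k; its unique root is the MLE of the shape
        xk = [v ** k for v in x]
        return sum(p * math.log(v) for p, v in zip(xk, x)) / sum(xk) - 1.0 / k - mean_log

    lo, hi = 0.01, 50.0
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if g(mid) < 0 else (lo, mid)
    k = 0.5 * (lo + hi)
    lam = (sum(v ** k for v in x) / len(x)) ** (1.0 / k)
    return k, lam

# Synthetic check: draw wind speeds from a known Weibull and recover its parameters.
random.seed(1)
speeds = [random.weibullvariate(6.0, 2.0) for _ in range(4000)]  # scale 6 m/s, shape 2
k_hat, lam_hat = weibull_mle(speeds)
```

The Rayleigh fit is the special case k = 2 with only the scale left free, which is why a two-parameter Weibull fit can match the data at least as well.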
LPI Radar Waveform Recognition Based on Time-Frequency Distribution
Directory of Open Access Journals (Sweden)
Ming Zhang
2016-10-01
Full Text Available In this paper, an automatic radar waveform recognition system for high-noise environments is proposed. Signal waveform recognition techniques are widely applied in cognitive radio, spectrum management, radar applications, etc. We devise a system to classify the modulating signals widely used in low probability of intercept (LPI) radar detection systems. The radar signals are divided into eight classes, including linear frequency modulation (LFM), BPSK (Barker code modulation), Costas codes and polyphase codes (comprising Frank, P1, P2, P3 and P4). The classifier is an Elman neural network (ENN), performing supervised classification based on features extracted by the system. Through techniques such as image filtering, image opening, skeleton extraction, principal component analysis (PCA), image binarization and Pseudo-Zernike moments, the features are extracted from the Choi-Williams time-frequency distribution (CWD) image of the received data. In order to reduce redundant features and simplify calculation, a feature selection algorithm based on the mutual information between classes and feature vectors is applied. The superiority of the proposed classification system is demonstrated by simulations and analysis. Simulation results show that the overall ratio of successful recognition (RSR) is 94.7% at a signal-to-noise ratio (SNR) of −2 dB.
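The mutual-information ranking mentioned above can be sketched for discrete features as follows. This is a generic illustration of the criterion, not the paper's implementation; the function name and toy data are assumptions.

```python
import math
from collections import Counter

def mutual_information(feature, labels):
    """MI in nats between two discrete sequences:
    sum over (x, y) of p(x, y) * log[p(x, y) / (p(x) p(y))]."""
    n = len(feature)
    pxy = Counter(zip(feature, labels))
    px, py = Counter(feature), Counter(labels)
    return sum((c / n) * math.log(c * n / (px[x] * py[y]))
               for (x, y), c in pxy.items())

# Toy check: a feature identical to the class labels carries log(2) nats,
# while a feature statistically independent of the labels carries none.
labels = [i % 2 for i in range(100)]
informative = labels[:]
independent = [(i // 2) % 2 for i in range(100)]
```

In a selection loop, each candidate feature would be scored this way against the class labels and only the top-ranked features kept.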
A Statistical and Spectral Model for Representing Noisy Sounds with Short-Time Sinusoids
Directory of Open Access Journals (Sweden)
Myriam Desainte-Catherine
2005-07-01
Full Text Available We propose an original model for noise analysis, transformation, and synthesis: the CNSS model. Noisy sounds are represented with short-time sinusoids whose frequencies and phases are random variables. This spectral and statistical model represents information about the spectral density of frequencies. This perceptually relevant property is modeled by three mathematical parameters that define the distribution of the frequencies. This model also represents the spectral envelope. The mathematical parameters are defined and the analysis algorithms to extract these parameters from sounds are introduced. Then algorithms for generating sounds from the parameters of the model are presented. Applications of this model include tools for composers, psychoacoustic experiments, and pedagogy.
Experimental Limits on Gravitational Waves in the MHz frequency Range
Energy Technology Data Exchange (ETDEWEB)
Lanza, Robert Jr. [Univ. of Chicago, IL (United States)
2015-03-01
This thesis presents the results of a search for gravitational waves in the 1-11MHz frequency range using dual power-recycled Michelson laser interferometers at Fermi National Accelerator Laboratory. An unprecedented level of sensitivity to gravitational waves in this frequency range has been achieved by cross-correlating the output fluctuations of two identical and colocated 40m long interferometers. This technique produces sensitivities better than two orders of magnitude below the quantum shot-noise limit, within integration times of less than 1 hour. 95% confidence level upper limits are placed on the strain amplitude of MHz frequency gravitational waves at the 10^{-21} Hz^{-1/2} level, constituting the best direct limits to date at these frequencies. For gravitational wave power distributed over this frequency range, a broadband upper limit of 2.4 x 10^{-21}Hz^{-1/2} at 95% confidence level is also obtained. This thesis covers the detector technology, the commissioning and calibration of the instrument, the statistical data analysis, and the gravitational wave limit results. Particular attention is paid to the end-to-end calibration of the instrument’s sensitivity to differential arm length motion, and so to gravitational wave strain. A detailed statistical analysis of the data is presented as well.
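The cross-correlation idea in this abstract (a common signal shared by two colocated detectors survives averaging, while each detector's independent noise does not) can be illustrated with a toy time-series model. All numbers below are illustrative, not properties of the actual instrument.

```python
import random

random.seed(0)
N = 400_000
SIG_VAR = 0.01  # variance of the weak common signal, far below the unit noise power

# Two detectors see the same weak signal buried in independent unit-variance noise.
common = [random.gauss(0.0, SIG_VAR ** 0.5) for _ in range(N)]
det_a = [s + random.gauss(0.0, 1.0) for s in common]
det_b = [s + random.gauss(0.0, 1.0) for s in common]

# The cross-correlation converges to the common-signal power as N grows,
# while each detector's auto-power stays dominated by its own noise.
cross = sum(a * b for a, b in zip(det_a, det_b)) / N
auto = sum(a * a for a in det_a) / N
```

The statistical error of `cross` shrinks as 1/sqrt(N), which is how longer integration times push the sensitivity below the single-detector noise floor.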
Directory of Open Access Journals (Sweden)
Hsueh-Hsien Chang
2017-04-01
Full Text Available This paper proposes statistical feature extraction methods combined with artificial intelligence (AI approaches for fault locations in non-intrusive single-line-to-ground fault (SLGF detection of low voltage distribution systems. The input features of the AI algorithms are extracted using statistical moment transformation for reducing the dimensions of the power signature inputs measured by using non-intrusive fault monitoring (NIFM techniques. The data required to develop the network are generated by simulating SLGF using the Electromagnetic Transient Program (EMTP in a test system. To enhance the identification accuracy, these features after normalization are given to AI algorithms for presenting and evaluating in this paper. Different AI techniques are then utilized to compare which identification algorithms are suitable to diagnose the SLGF for various power signatures in a NIFM system. The simulation results show that the proposed method is effective and can identify the fault locations by using non-intrusive monitoring techniques for low voltage distribution systems.
Yonemaru, Naoyuki; Kumamoto, Hiroki; Takahashi, Keitaro; Kuroyanagi, Sachiko
2018-04-01
A new detection method for ultra-low frequency gravitational waves (GWs), with frequencies much lower than the observational range of pulsar timing arrays (PTAs), was suggested in Yonemaru et al. (2016). In the PTA analysis, ultra-low frequency GWs (≲ 10^{-10} Hz), which evolve only linearly during the observation time span, are absorbed into the pulsar spin-down rates, since both have the same effect on the pulse arrival times. Therefore, such GWs cannot be detected by the conventional PTA method. However, the bias on the observed spin-down rates depends on the relative direction of the pulsar and the GW source, and shows a quadrupole pattern on the sky. Thus, if we divide the pulsars according to their position on the sky and examine the difference in the statistics of the spin-down rates, ultra-low frequency GWs from a single source can be detected. In this paper, we evaluate the potential of this method by Monte Carlo simulations and estimate the sensitivity, considering only the "Earth term", while the "pulsar term" acts like random noise for GW frequencies of 10^{-13}-10^{-10} Hz. We find that with 3,000 millisecond pulsars, which are expected to be discovered by a future survey with the Square Kilometre Array, GWs with an amplitude derivative of about 3 × 10^{-19} s^{-1} can in principle be detected. Implications for possible supermassive binary black holes in Sgr A* and M87 are also given.
Distribution-level electricity reliability: Temporal trends using statistical analysis
International Nuclear Information System (INIS)
Eto, Joseph H.; LaCommare, Kristina H.; Larsen, Peter; Todd, Annika; Fisher, Emily
2012-01-01
This paper helps to address the lack of comprehensive, national-scale information on the reliability of the U.S. electric power system by assessing trends in U.S. electricity reliability based on the information reported by electric utilities on power interruptions experienced by their customers. The research analyzes up to 10 years of electricity reliability information collected from 155 U.S. electric utilities, which together account for roughly 50% of total U.S. electricity sales. We find that reported annual average duration and annual average frequency of power interruptions have been increasing over time at a rate of approximately 2% annually. We find that, independent of this trend, installation or upgrade of an automated outage management system is correlated with an increase in the reported annual average duration of power interruptions. We also find that reliance on IEEE Standard 1366-2003 is correlated with higher reported reliability compared to reported reliability not using the IEEE standard. However, we caution that we cannot attribute reliance on the IEEE standard as having caused or led to higher reported reliability, because we could not separate the effect of reliance on the IEEE standard from other utility-specific factors that may be correlated with it. - Highlights: ► We assess trends in electricity reliability based on the information reported by electric utilities. ► We use rigorous statistical techniques to account for utility-specific differences. ► We find modest declines in reliability when analyzing interruption duration and frequency experienced by utility customers. ► Installation or upgrade of an OMS is correlated with an increase in reported duration of power interruptions. ► We find reliance on IEEE Standard 1366 is correlated with higher reported reliability.
A Statistical Method for Aggregated Wind Power Plants to Provide Secondary Frequency Control
DEFF Research Database (Denmark)
Hu, Junjie; Ziras, Charalampos; Bindner, Henrik W.
2017-01-01
The increasing penetration of wind power brings significant challenges to power system operators due to the wind’s inherent uncertainty and variability. Traditionally, power plants and more recently demand response have been used to balance the power system. However, the use of wind power as a balancing-power source has also been investigated, especially for wind power dominated power systems such as Denmark. The main drawback is that wind power must be curtailed by setting a lower operating point, in order to offer upward regulation. We propose a statistical approach to reduce wind power curtailment for aggregated wind power plants providing secondary frequency control (SFC) to the power system. By using historical SFC signals and wind speed data, we calculate metrics for the reserve provision error as a function of the scheduled wind power. We show that wind curtailment can be significantly …
Rapp, J.B.
1991-01-01
Q-mode factor analysis was used to quantify the distribution of the major aliphatic hydrocarbon (n-alkanes, pristane, phytane) systems in sediments from a variety of marine environments. The compositions of the pure end members of the systems were obtained from factor scores, and the distribution of the systems within each sample was obtained from factor loadings. All the data from the diverse environments sampled (estuarine (San Francisco Bay), fresh-water (San Francisco Peninsula), polar-marine (Antarctica) and geothermal-marine (Gorda Ridge) sediments) were reduced to three major systems: a terrestrial system (mostly high molecular weight aliphatics with odd-numbered-carbon predominance), a mature system (mostly low molecular weight aliphatics without predominance) and a system containing mostly high molecular weight aliphatics with even-numbered-carbon predominance. With this statistical approach, it is possible to assign the percentage contribution from various sources to the observed distribution of aliphatic hydrocarbons in each sediment sample. © 1991.
Crossover distribution and frequency are regulated by him-5 in Caenorhabditis elegans.
Meneely, Philip M; McGovern, Olivia L; Heinis, Frazer I; Yanowitz, Judith L
2012-04-01
Mutations in the him-5 gene in Caenorhabditis elegans strongly reduce the frequency of crossovers on the X chromosome, with lesser effects on the autosomes. him-5 mutants also show a change in crossover distribution on both the X and autosomes. These phenotypes are accompanied by a delayed entry into pachytene and premature desynapsis of the X chromosome. The nondisjunction, progression defects and desynapsis can be rescued by an exogenous source of double strand breaks (DSBs), indicating that the role of HIM-5 is to promote the formation of meiotic DSBs. Molecular cloning of the gene shows that the inferred HIM-5 product is a highly basic protein of 252 amino acids with no clear orthologs in other species, including other Caenorhabditis species. Although him-5 mutants are defective in segregation of the X chromosome, HIM-5 protein localizes preferentially to the autosomes. The mutant phenotypes and localization of him-5 are similar but not identical to the results seen with xnd-1, although unlike xnd-1, him-5 has no apparent effect on the acetylation of histone H2A on lysine 5 (H2AacK5). The localization of HIM-5 to the autosomes depends on the activities of both xnd-1 and him-17 allowing us to begin to establish pathways for the control of crossover distribution and frequency.
Distribution, Statistics, and Resurfacing of Large Impact Basins on Mercury
Fassett, Caleb I.; Head, James W.; Baker, David M. H.; Chapman, Clark R.; Murchie, Scott L.; Neumann, Gregory A.; Oberst, Juergen; Prockter, Louise M.; Smith, David E.; Solomon, Sean C.
2012-01-01
The distribution and geological history of large impact basins (diameter D greater than or equal to 300 km) on Mercury is important to understanding the planet's stratigraphy and surface evolution. It is also informative to compare the density of impact basins on Mercury with that of the Moon to understand similarities and differences in their impact crater and basin populations [1, 2]. A variety of impact basins were proposed on the basis of geological mapping with Mariner 10 data [e.g. 3]. This basin population can now be re-assessed and extended to the full planet, using data from the MErcury Surface, Space ENvironment, GEochemistry, and Ranging (MESSENGER) spacecraft. Note that small-to-medium-sized peak-ring basins on Mercury are being examined separately [4, 5]; only the three largest peak-ring basins on Mercury overlap with the size range we consider here. In this study, we (1) re-examine the large basins suggested on the basis of Mariner 10 data, (2) suggest additional basins from MESSENGER's global coverage of Mercury, (3) assess the size-frequency distribution of mercurian basins on the basis of these global observations and compare it to the Moon, and (4) analyze the implications of these observations for the modification history of basins on Mercury.
Distributed Optical Fiber Sensors Based on Optical Frequency Domain Reflectometry: A review
Directory of Open Access Journals (Sweden)
Zhenyang Ding
2018-04-01
Full Text Available Distributed optical fiber sensors (DOFS offer unprecedented features, the most unique one of which is the ability of monitoring variations of the physical and chemical parameters with spatial continuity along the fiber. Among all these distributed sensing techniques, optical frequency domain reflectometry (OFDR has been given tremendous attention because of its high spatial resolution and large dynamic range. In addition, DOFS based on OFDR have been used to sense many parameters. In this review, we will survey the key technologies for improving sensing range, spatial resolution and sensing performance in DOFS based on OFDR. We also introduce the sensing mechanisms and the applications of DOFS based on OFDR including strain, stress, vibration, temperature, 3D shape, flow, refractive index, magnetic field, radiation, gas and so on.
Directory of Open Access Journals (Sweden)
Dr. Mohammad Kamali
2000-05-01
Full Text Available Objectives: to determine the frequency distribution of hearing disorders among public elementary school students by otoscopy, pure-tone audiometry, impedance audiometry and questionnaires. Methods and Materials: This cross-sectional descriptive survey was carried out on 1200 students (600 girls and 600 boys) of the primary schools of Neishabbor, academic year 1376-77. Results: 1- Otoscopy: abnormal conditions of the external ear canal were found in 14.1% of cases, mostly impacted cerumen (13.6%); abnormal conditions of the tympanic membrane (0.4%) and foreign body (0.16%). Abnormal conditions of the external ear canal were statistically nonsignificant (p=0.8). 2- Impedance audiometry: 5.75% abnormal tympanograms were observed, mostly type C (3.15%). 3- Pure-tone audiometry: an overall 5.5% hearing loss was found in this population, including 3% bilateral and 1.25% unilateral hearing loss. Only conductive hearing loss was found (2.7% in the right ear and 3.5% in the left ear, p=0.9). Hearing loss was observed mostly in girls, but the difference was slight. Family background showed no effect on hearing disorders. 9.8% of cases were in need of medical care and 0.3% in need of rehabilitation services. Only 28.8% of parents, 36.3% of teachers and 40.9% of afflicted students were aware of the problem.
The footprint of atmospheric turbulence in power grid frequency measurements
Haehne, H.; Schottler, J.; Waechter, M.; Peinke, J.; Kamps, O.
2018-02-01
Fluctuating wind energy makes stable grid operation challenging. Due to the direct contact with atmospheric turbulence, intermittent short-term variations in the wind speed are converted to power fluctuations that cause transient imbalances in the grid. We investigate the impact of wind energy feed-in on short-term fluctuations in the frequency of the public power grid, which we have measured in our local distribution grid. By conditioning on wind power production data, provided by the ENTSO-E transparency platform, we demonstrate that wind energy feed-in has a measurable effect on frequency increment statistics for short time scales (… renewable generation.
Perea, Manuel; Urkia, Miriam; Davis, Colin J; Agirre, Ainhoa; Laseka, Edurne; Carreiras, Manuel
2006-11-01
We describe a Windows program that enables users to obtain a broad range of statistics concerning the properties of word and nonword stimuli in an agglutinative language (Basque), including measures of word frequency (at the whole-word and lemma levels), bigram and biphone frequency, orthographic similarity, orthographic and phonological structure, and syllable-based measures. It is designed for use by researchers in psycholinguistics, particularly those concerned with recognition of isolated words and morphology. In addition to providing standard orthographic and phonological neighborhood measures, the program can be used to obtain information about other forms of orthographic similarity, such as transposed-letter similarity and embedded-word similarity. It is available free of charge from www.uv.es/mperea/E-Hitz.zip.
Li, Li; Xiong, De-fu; Liu, Jia-wen; Li, Zi-xin; Zeng, Guang-cheng; Li, Hua-liang
2014-03-01
We aimed to evaluate the interference of 50 Hz extremely low frequency electromagnetic field (ELF-EMF) occupational exposure with the neurobehavioral test performance of workers performing tour-inspection close to transformers and distribution power lines. Occupational short-term "spot" measurements were carried out. 310 inspection workers and 300 logistics staff were selected as the exposure and control groups. The neurobehavioral tests were performed with a computer-based neurobehavioral evaluation system, including mental arithmetic, curve coincidence, simple visual reaction time, visual retention, auditory digit span and pursuit aiming. In 500 kV areas, the electric field intensity at 71.98% of the 590 measured spots was above 5 kV/m (the national occupational standard), while in 220 kV areas the electric field intensity at 15.69% of 701 spots was above 5 kV/m. Magnetic flux density at all spots was below 1,000 μT (the ICNIRP occupational standard). The neurobehavioral score changes showed no statistical significance, and results of neurobehavioral tests among different age and seniority groups showed no significant changes. Neurobehavioral changes caused by daily repeated ELF-EMF exposure were not observed in the current study.
Asymptotic distribution of ∆AUC, NRIs, and IDI based on theory of U-statistics.
Demler, Olga V; Pencina, Michael J; Cook, Nancy R; D'Agostino, Ralph B
2017-09-20
The change in area under the curve (∆AUC), the integrated discrimination improvement (IDI), and net reclassification index (NRI) are commonly used measures of risk prediction model performance. Some authors have reported good validity of associated methods of estimating their standard errors (SE) and construction of confidence intervals, whereas others have questioned their performance. To address these issues, we unite the ∆AUC, IDI, and three versions of the NRI under the umbrella of the U-statistics family. We rigorously show that the asymptotic behavior of ∆AUC, NRIs, and IDI fits the asymptotic distribution theory developed for U-statistics. We prove that the ∆AUC, NRIs, and IDI are asymptotically normal, unless they compare nested models under the null hypothesis. In the latter case, asymptotic normality and existing SE estimates cannot be applied to ∆AUC, NRIs, or IDI. In the former case, SE formulas proposed in the literature are equivalent to SE formulas obtained from U-statistics theory if we ignore adjustment for estimated parameters. We use the Sukhatme-Randles-deWet condition to determine when adjustment for estimated parameters is necessary. We show that adjustment is not necessary for SEs of the ∆AUC and two versions of the NRI when added predictor variables are significant and normally distributed. The SEs of the IDI and three-category NRI should always be adjusted for estimated parameters. These results allow us to define when existing formulas for SE estimates can be used and when resampling methods such as the bootstrap should be used instead when comparing nested models. We also use the U-statistic theory to develop a new SE estimate of ∆AUC. Copyright © 2017 John Wiley & Sons, Ltd.
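For concreteness, the AUC underlying ∆AUC is itself a two-sample U-statistic (the Mann-Whitney form), which is what makes the theory above applicable. A minimal sketch with made-up risk scores (the data and model names are illustrative, not from the paper):

```python
def auc(case_scores, control_scores):
    """AUC as the Mann-Whitney U-statistic: the probability that a randomly
    chosen case outscores a randomly chosen control (ties count 1/2)."""
    wins = sum(1.0 if c > d else 0.5 if c == d else 0.0
               for c in case_scores for d in control_scores)
    return wins / (len(case_scores) * len(control_scores))

# Hypothetical risk scores for the same subjects under an old and a new model.
old_auc = auc([0.8, 0.6, 0.55], [0.5, 0.62])
new_auc = auc([0.9, 0.8, 0.7], [0.3, 0.2])
delta_auc = new_auc - old_auc  # the quantity whose SE the paper studies
```

Because `auc` averages a bounded kernel over all case-control pairs, the U-statistic central limit theory the authors invoke gives its asymptotic normality away from the nested-null case.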
Directory of Open Access Journals (Sweden)
María Gabriela Mago Ramos
2012-05-01
Full Text Available A methodology was developed for analysing faults in distribution transformers using the Statistical Package for the Social Sciences (SPSS). It consisted of organising and creating a database of failed equipment, incorporating these data into the processing programme and converting all the information into numerical variables to be processed, thereby obtaining descriptive statistics and enabling factor and discriminant analysis. The research was based on information provided by companies in areas served by Corpoelec (Valencia, Venezuela) and Codensa (Bogotá, Colombia).
Zipf’s word frequency law in natural language: A critical review and future directions
2014-01-01
The frequency distribution of words has been a key object of study in statistical linguistics for the past 70 years. This distribution approximately follows a simple mathematical form known as Zipf’s law. This article first shows that human language has a highly complex, reliable structure in the frequency distribution over and above this classic law, although prior data visualization methods have obscured this fact. A number of empirical phenomena related to word frequencies are then reviewed. These facts are chosen to be informative about the mechanisms giving rise to Zipf’s law and are then used to evaluate many of the theoretical explanations of Zipf’s law in language. No prior account straightforwardly explains all the basic facts or is supported with independent evaluation of its underlying assumptions. To make progress at understanding why language obeys Zipf’s law, studies must seek evidence beyond the law itself, testing assumptions and evaluating novel predictions with new, independent data. PMID:24664880
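The rank-frequency analysis this review builds on can be sketched as follows; under an exact Zipf law the log-log slope is -1. The corpus below is synthetic (an assumption for the demonstration), not real text.

```python
import math
from collections import Counter

def zipf_slope(tokens):
    """Least-squares slope of log(frequency) against log(rank);
    close to -1 when the token frequencies follow Zipf's law."""
    freqs = sorted(Counter(tokens).values(), reverse=True)
    pts = [(math.log(r), math.log(f)) for r, f in enumerate(freqs, start=1)]
    mx = sum(x for x, _ in pts) / len(pts)
    my = sum(y for _, y in pts) / len(pts)
    num = sum((x - mx) * (y - my) for x, y in pts)
    den = sum((x - mx) ** 2 for x, _ in pts)
    return num / den

# Synthetic corpus in which the rank-r word appears about 1000/r times.
corpus = [f"w{r}" for r in range(1, 51) for _ in range(1000 // r)]
slope = zipf_slope(corpus)
```

As the article stresses, a near -1 slope on a log-log plot is only the starting point; deviations from it (and from pure power-law behavior) carry the real information.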
Scientific applications of frequency-stabilized laser technology in space
Schumaker, Bonny L.
1990-01-01
A synoptic investigation of the uses of frequency-stabilized lasers for scientific applications in space is presented. It begins by summarizing properties of lasers, characterizing their frequency stability, and describing limitations and techniques to achieve certain levels of frequency stability. Limits to precision set by laser frequency stability for various kinds of measurements are investigated and compared with other sources of error. These other sources include photon-counting statistics, scattered laser light, fluctuations in laser power and intensity distribution across the beam, propagation effects, mechanical and thermal noise, and radiation pressure. Methods are explored to improve the sensitivity of laser-based interferometric and range-rate measurements. Several specific types of science experiments that rely on highly precise measurements made with lasers are analyzed, and anticipated errors and overall performance are discussed. Qualitative descriptions are given of a number of other possible science applications involving frequency-stabilized lasers and related laser technology in space. These applications will warrant more careful analysis as the technology develops.
Wavelet Based Characterization of Low Radio Frequency Solar Emissions
Suresh, A.; Sharma, R.; Das, S. B.; Oberoi, D.; Pankratius, V.; Lonsdale, C.
2016-12-01
Low-frequency solar radio observations with the Murchison Widefield Array (MWA) have revealed the presence of numerous short-lived, narrow-band weak radio features, even during quiet solar conditions. In their appearance in the frequency-time plane, they come closest to solar type III bursts, but with much smaller spectral spans and flux densities, so much so that they are not detectable with the usual swept-frequency radio spectrographs. These features occur at rates of many thousands of features per hour within the 30.72 MHz MWA bandwidth, and hence necessarily require an automated approach to determine robust statistical estimates of their properties, e.g., distributions of spectral widths, temporal spans, flux densities, slopes in the time-frequency plane and distribution over frequency. To achieve this, a wavelet decomposition approach has been developed for feature recognition and subsequent parameter extraction from the MWA dynamic spectrum. This work builds on earlier work by members of this team to achieve a reliable flux calibration in a computationally efficient manner. Preliminary results show that the distribution of the spectral span of these features peaks around 3 MHz, most of them last for less than two seconds, and they are characterized by flux densities of about 60% of the background solar emission. In analogy with solar type III bursts, this non-thermal emission is envisaged to arise via coherent emission processes. There is also an exciting possibility that these features might correspond to radio signatures of nanoflares, hypothesized (Gold, 1964; Parker, 1972) to explain coronal heating.
Robust Distributed Model Predictive Load Frequency Control of Interconnected Power System
Directory of Open Access Journals (Sweden)
Xiangjie Liu
2013-01-01
Full Text Available Considering the load frequency control (LFC) of a large-scale power system, a robust distributed model predictive control (RDMPC) is presented. The system uncertainty due to power system parameter variation, along with the generation rate constraints (GRC), is included in the synthesis procedure. The entire power system is composed of several control areas, and the problem is formulated as a convex optimization problem with linear matrix inequalities (LMI) that can be solved efficiently. It minimizes an upper bound on a robust performance objective for each subsystem. Simulation results show good dynamic response and robustness in the presence of power system dynamic uncertainties.
Hayslett, H T
1991-01-01
Statistics covers the basic principles of statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses, and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population, are explained. The text the …
Epping, Ruben; Panne, Ulrich; Falkenhagen, Jana
2017-02-07
Statistical ethylene oxide (EO) and propylene oxide (PO) copolymers of different monomer compositions and different average molar masses, additionally containing two kinds of end groups (FTD), were investigated by ultra-high-pressure liquid chromatography under critical conditions (UP-LCCC) combined with electrospray ionization time-of-flight mass spectrometry (ESI-TOF-MS). Theoretical predictions of the existence of a critical adsorption point (CPA) for statistical copolymers with a given chemical and sequence distribution [1] could be studied and confirmed. A fundamentally new approach to determining these critical conditions in a copolymer, alongside the inevitable chemical composition distribution (CCD), with mass spectrometric detection, is described. The shift of the critical eluent composition with the monomer composition of the polymers was determined. Due to the broad molar mass distribution (MMD) and the presumed existence of different end-group functionalities as well as a monomer sequence distribution (MSD), gradient separation by CCD alone was not possible. Therefore, isocratic separation conditions at the CPA of definite CCD fractions were developed. Although the various distributions present partly superimposed the separation process, the goal of separation by end-group functionality was still achieved on the basis of the additional dimension of ESI-TOF-MS. The existence of HO-H besides the desired allylO-H end-group functionalities was confirmed and their amount estimated. Furthermore, indications of an MSD were found by UPLC/MS/MS measurements. This approach offers for the first time the possibility of obtaining a fingerprint of a broadly distributed statistical copolymer including MMD, FTD, CCD, and MSD.
The Effects of Flare Definitions on the Statistics of Derived Flare Distributions
Ryan, Daniel; Dominique, Marie; Seaton, Daniel B.; Stegen, Koen; White, Arthur
2016-05-01
The statistical examination of solar flares is crucial to revealing their global characteristics and behaviour. However, statistical flare studies are often performed using standard but basic flare detection algorithms that rely on arbitrary thresholds, which may affect the derived flare distributions. We explore the effect of the arbitrary thresholds used in the GOES event list and LYRA Flare Finder algorithms. We find that there is a small but significant relationship between the power-law exponent of the GOES flare peak flux frequency distribution and the algorithms’ flare start thresholds. We also find that the power-law exponents of these distributions are not stable but appear to steepen with increasing peak flux. This implies that the observed flare size distribution may not be a power law at all. We show that, depending on the true value of the exponent of the flare size distribution, this deviation from a power law may be due to flares missed by the flare detection algorithms. However, it is not possible to determine the true exponent from GOES/XRS observations. Additionally, we find that the PROBA2/LYRA flare size distributions are clearly non-power-law. We show that this is consistent with an insufficient degradation correction, which causes LYRA absolute irradiance values to be unreliable. This means that they should not be used for flare statistics or energetics unless degradation is adequately accounted for. However, they can be used to study time variations over shorter timescales and for space weather monitoring.
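The exponent of a flare peak-flux distribution above a detection threshold is typically estimated by maximum likelihood. As a minimal sketch (not the authors' pipeline), the following draws synthetic peak fluxes from a pure power law and recovers the exponent with the standard continuous MLE above a chosen start threshold; all parameter values are illustrative:

```python
import numpy as np

def powerlaw_mle_alpha(x, xmin):
    """Maximum-likelihood estimate of alpha for a power-law tail
    p(x) ~ x**(-alpha) above the threshold xmin (continuous case)."""
    tail = np.asarray(x)
    tail = tail[tail >= xmin]
    return 1.0 + tail.size / np.sum(np.log(tail / xmin))

# Synthetic "flare peak fluxes" from a pure power law with alpha = 2.0,
# generated by inverse-CDF sampling.
rng = np.random.default_rng(0)
u = rng.uniform(size=100_000)
xmin = 1e-6
fluxes = xmin * u ** (-1.0 / (2.0 - 1.0))

alpha_hat = powerlaw_mle_alpha(fluxes, xmin)
```

Raising `xmin` on real (incomplete) data is what exposes the threshold dependence discussed in the abstract: if the distribution is truly a power law, the estimate should be insensitive to the threshold.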
Prediction of slant path rain attenuation statistics at various locations
Goldhirsh, J.
1977-01-01
The paper describes a method for predicting slant path attenuation statistics at arbitrary locations for variable frequencies and path elevation angles. The method involves the use of median reflectivity factor-height profiles measured with radar as well as the use of long-term point rain rate data and assumed or measured drop size distributions. The attenuation coefficient due to cloud liquid water in the presence of rain is also considered. Absolute probability fade distributions are compared for eight cases: Maryland (15 GHz), Texas (30 GHz), Slough, England (19 and 37 GHz), Fayetteville, North Carolina (13 and 18 GHz), and Cambridge, Massachusetts (13 and 18 GHz).
On precipitation monitoring with theoretical statistical distributions
Cindrić, Ksenija; Juras, Josip; Pasarić, Zoran
2018-04-01
A common practice in meteorological drought monitoring is to transform the observed precipitation amounts to the standardised precipitation index (SPI). Though the gamma distribution is usually employed for this purpose, other distributions may be used, particularly in regions where zero precipitation amounts are recorded frequently. In this study, two distributions are considered alongside the gamma distribution: the compound Poisson exponential distribution (CPE) and the square root normal distribution (SRN). They are fitted to monthly precipitation amounts measured at 24 stations in Croatia over the 55-year period 1961-2015. At five stations, long-term series (1901-2015) are available and have been used for a more detailed investigation. The fit of the theoretical distributions to the empirical ones is tested by comparing the corresponding empirical and theoretical ratios of the skewness and the coefficient of variation. Furthermore, following the common approach to precipitation monitoring (CLIMAT reports), the comparison of the empirical and theoretical quintiles in the two periods (1961-1990 and 1991-2015) is examined. The results reveal that it would be more appropriate to implement theoretical distributions in such climate reports, since they provide a better basis for monitoring than the currently used empirical distribution. Nevertheless, deciding on an optimal theoretical distribution for different climate regimes and different time periods is not easy to accomplish. With regard to the Croatian stations (covering different climate regimes), the CPE or SRN distribution could also be the right choice in climatological practice, in addition to the gamma distribution.
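The SPI transformation the study builds on can be sketched as follows: fit a gamma distribution to the monthly amounts, pass each observation through the fitted CDF, then through the inverse standard-normal CDF. This is a generic SciPy sketch, not the authors' implementation; the synthetic data and the zero-month correction are illustrative assumptions:

```python
import numpy as np
from scipy import stats

def spi(precip):
    """Standardised Precipitation Index: gamma CDF of the non-zero
    amounts, mixed with the dry-month probability, mapped through the
    inverse standard-normal CDF."""
    precip = np.asarray(precip, dtype=float)
    nonzero = precip[precip > 0]
    q = np.mean(precip == 0)                     # probability of a dry month
    a, loc, scale = stats.gamma.fit(nonzero, floc=0)
    cdf = q + (1 - q) * stats.gamma.cdf(precip, a, loc=loc, scale=scale)
    cdf = np.clip(cdf, 1e-6, 1 - 1e-6)           # keep ppf finite
    return stats.norm.ppf(cdf)

rng = np.random.default_rng(1)
monthly = rng.gamma(shape=2.0, scale=30.0, size=600)  # synthetic amounts (mm)
index = spi(monthly)
```

When the fitted distribution matches the data, the resulting index is approximately standard normal, which is what makes SPI values comparable across stations and climates.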
STATISTICAL INSIGHT INTO THE BINDING REGIONS IN DISORDERED HUMAN PROTEOME
Directory of Open Access Journals (Sweden)
Uttam Pal
2016-03-01
Full Text Available The human proteome contains a significant number of intrinsically disordered proteins (IDPs). They show unusual structural features that enable them to participate in diverse cellular functions and play significant roles in cell signaling and reorganization processes. In addition, the actions of IDPs, their functional cooperativity, conformational alterations and folding often accompany binding to a target macromolecule. Applying bioinformatics approaches and with the aid of statistical methodologies, we investigated the statistical parameters of binding regions (BRs) found in the disordered human proteome. In this report, we detail the bioinformatics analysis of binding regions found in the IDPs. Statistical models for the occurrence of BRs, their length distribution and their percent occupancy in the parent proteins are shown. The frequency of BRs followed a Poisson distribution with expectancy increasing with the degree of disorderedness. The length of the individual BRs also followed a Poisson distribution with a mean of 6 residues, whereas the percentage of residues in BRs showed a normal distribution pattern. We also explored physicochemical properties such as the grand average of hydropathy (GRAVY) and the theoretical isoelectric points (pIs). The theoretical pIs of the BRs followed a bimodal distribution, as in the parent proteins. However, the mean acidic/basic pIs were significantly lower/higher than those of the proteins, respectively. We further showed that the amino acid composition of BRs was enriched in hydrophobic residues such as Ala, Val, Ile, Leu and Phe compared to the average sequence content of the proteins. Sequences in a BR showed conformational adaptability, mostly towards flexible coil structure followed by helix; however, ordered secondary structural conformation was significantly lower in BRs than in the proteins. Combining and comparing this statistical information on BRs with other methods may be useful for high
Characterizing the Statistics of a Bunch of Optical Pulses Using a Nonlinear Optical Loop Mirror
Directory of Open Access Journals (Sweden)
Olivier Pottiez
2015-01-01
Full Text Available We propose in this work a technique for determining the amplitude distribution of a wave packet containing a large number of short optical pulses with different amplitudes. The technique takes advantage of the fast response of the optical Kerr effect in a fiber nonlinear optical loop mirror (NOLM. Under some assumptions, the statistics of the pulses can be determined from the energy transfer characteristic of the packet through the NOLM, which can be measured with a low-frequency detection setup. The statistical distribution is retrieved numerically by approximating the solution of a system of nonlinear algebraic equations using the least squares method. The technique is demonstrated numerically in the case of a packet of solitons.
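The retrieval step described above (a least-squares solution of equations linking the measured energy transfer to the amplitude distribution) can be sketched in simplified form. Assuming an idealized sin² NOLM switching characteristic and binning the pulse peak powers, the packet's transmitted energy at several input power scalings becomes linear in the unknown bin counts, which a non-negative least-squares solve recovers; the transfer function and all numerical values below are illustrative assumptions, not the paper's model:

```python
import numpy as np
from scipy.optimize import nnls

# Idealized NOLM power transfer (sin^2 switching characteristic);
# k is a hypothetical effective nonlinearity coefficient.
k = 0.8
T = lambda p: np.sin(k * p) ** 2

# Discretize possible pulse peak powers into bins; n_true[i] is the
# (unknown) number of pulses in bin i.
p_bins = np.linspace(0.2, 2.0, 10)
rng = np.random.default_rng(2)
n_true = rng.integers(0, 50, size=p_bins.size).astype(float)

# "Measure" the packet's transmitted energy while scaling the input
# power by factors s_j; each measurement is linear in the counts n_i.
scales = np.linspace(0.5, 3.0, 25)
A = np.array([[s * p * T(s * p) for p in p_bins] for s in scales])
measured = A @ n_true

# Retrieve the amplitude distribution by non-negative least squares.
n_hat, _ = nnls(A, measured)
```

The non-negativity constraint plays the same role as in the paper's retrieval: pulse counts cannot be negative, which stabilizes the inversion.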
International Nuclear Information System (INIS)
Chung, Christine B.; Vande Berg, Bruno C.; Malghem, Jacques; Tavernier, Thierry; Cotten, Anne; Laredo, Jean-Denis; Vallee, Christian
2004-01-01
To investigate the frequency and distribution of end plate marrow signal intensity changes in an asymptomatic population and to correlate these findings with patient age and degenerative findings in the spine. MR imaging studies of the lumbosacral (LS) spine in 59 asymptomatic subjects were retrospectively reviewed by 2 musculoskeletal radiologists to determine the presence and location of fat-like and edema-like marrow signal changes about the end plates of the L1-2 through L5-S1 levels. The presence of degenerative changes in the spine was recorded as was patient age. Descriptive statistics were utilized to determine the frequency and associations of end plate findings and degenerative changes in the spine. Interobserver variability was determined by a kappa score. Binomial probability was used to predict the prevalence of the end plate changes in a similar subject population. The Fisher exact test was performed to determine statistical significance of the relationship of end plate changes with degenerative changes in the spine, superior versus inferior location about the disc and age of the patient population. Focal fat-like signal intensity adjacent to the vertebral end-plate was noted in 15 out of 59 subjects by both readers, and involved 38 and 36 out of 590 end plates by readers 1 and 2, respectively. Focal edema-like signal intensity adjacent to the vertebral end plate was noted in 8 out of 59 subjects by both readers and involved 11 and 10 out of 590 end plates by readers 1 and 2, respectively. Either fat or edema signal intensity occurred most often at the anterior (p<.05) aspects of the mid-lumbar spine and was seen in an older sub-population of the study (p<.05). End plate marrow signal intensity changes are present in the lumbar spine of some asymptomatic subjects with a characteristic location along the spine and in vertebral end plates. (orig.)
Karbasi, Ashraf; Aliannejad, Rasoul; Ghanei, Mostafa; Sanamy, Mehran Noory; Alaeddini, Farshid; Harandi, Ali Amini
2015-07-01
There are no data on the prevalence of gastroesophageal reflux disease (GERD) and its association with toxic fume inhalation. Therefore, we aimed to evaluate the frequency distribution of GERD symptoms among individuals with mild respiratory disorder due to a past history of toxic fume exposure to sulfur mustard (SM). In a historical cohort study, subjects were randomly selected from 7000 patients in a database of all those who had a history of previous exposure to a single high dose of SM gas during the war. The control group was randomly selected from adjacent neighbors of the patients, with two healthy male subjects chosen per patient. In this study, we used the validated Persian translation of the Mayo Gastroesophageal Reflux Questionnaire to assess the frequency distribution of reflux disease. The relative frequency of GERD symptoms was found to be significantly higher in the inhalation-injury patients, with an odds ratio of 8.30 (95% confidence interval [CI]: 4.73-14.55); after adjustment for cigarette smoking, tea consumption, age, body mass index, aspirin and chronic cough, the odds ratio was 4.41 (95% CI: 1.61-12.07). The most important finding of our study was that the frequency of major GERD symptoms (heartburn and/or acid regurgitation once or more per week) among individuals with a past history of exposure to SM toxic gas is substantially higher (4.4-fold) than in the normal population.
Statistical properties of the ice particle distribution in stratiform clouds
Delanoe, J.; Tinel, C.; Testud, J.
2003-04-01
This paper presents an extensive analysis of several microphysical databases (CEPEX, EUCREX, CLARE and CARL) to determine statistical properties of the Particle Size Distribution (PSD). The databases cover different types of stratiform clouds: tropical cirrus (CEPEX), mid-latitude cirrus (EUCREX) and mid-latitude cirrus and stratus (CARL, CLARE). The approach for analysis uses the concept of normalisation of the PSD developed by Testud et al. (2001). The normalisation aims at isolating three independent characteristics of the PSD: its "intrinsic" shape, the "average size" of the spectrum and the ice water content IWC, where "average size" means the mean mass-weighted diameter D_m. It is shown that concentration should be normalised by N_0^* proportional to IWC/D_m^4. The "intrinsic" shape is defined as F(Deq/D_m) = N(Deq)/N_0^*, where Deq is the equivalent melted diameter. The "intrinsic" shape is found to be very stable for Deq/D_m < 1.5; beyond that, more scatter is observed, and future analysis should decide whether it represents real physical variation or statistical "error" due to counting problems. Considering the overall statistics over the full database, a large scatter of the N_0^* against D_m plot is found. But in the case of a particular event or a particular leg of a flight, the N_0^* vs. D_m plot is much less scattered and shows a systematic trend of N_0^* decaying as D_m increases. This trend is interpreted as the manifestation of the predominance of the aggregation process. Finally, an important point for cloud remote sensing is investigated: the normalised relationship of IWC/N_0^* against Z/N_0^* is much less scattered than the classical IWC against Z, the radar reflectivity factor.
Statistical distribution of time to crack initiation and initial crack size using service data
Heller, R. A.; Yang, J. N.
1977-01-01
Crack growth inspection data gathered during the service life of the C-130 Hercules airplane were used in conjunction with a crack propagation rule to estimate the distribution of crack initiation times and of initial crack sizes. A Bayesian statistical approach was used to calculate the fraction of undetected initiation times as a function of the inspection time and the reliability of the inspection procedure used.
Statistical methods in nuclear theory
International Nuclear Information System (INIS)
Shubin, Yu.N.
1974-01-01
The paper outlines statistical methods which are widely used for describing properties of excited states of nuclei and nuclear reactions. It discusses the physical assumptions lying at the basis of the known distributions of spacings between levels (the Wigner and Poisson distributions) and of the widths of highly excited states (the Porter-Thomas distribution), as well as the assumptions used in the statistical theory of nuclear reactions and in fluctuation analysis. The author considers the random matrix method, which consists in replacing the matrix elements of a residual interaction by random variables with a simple statistical distribution. Experimental data are compared with the results of calculations using the statistical model. The superfluid nucleus model is considered with regard to superconducting-type pair correlations.
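The Wigner distribution of level spacings mentioned above is easy to reproduce numerically: for 2×2 matrices from the Gaussian orthogonal ensemble, the normalized eigenvalue spacing follows the Wigner surmise p(s) = (π/2) s exp(−πs²/4), exhibiting the level repulsion absent from the Poisson case p(s) = exp(−s). A brief sketch:

```python
import numpy as np

# Sample 2x2 GOE matrices [[a, b], [b, d]]: diagonal entries with
# variance 1, off-diagonal with variance 1/2.
rng = np.random.default_rng(3)
n = 200_000
a = rng.normal(size=n)
d = rng.normal(size=n)
b = rng.normal(scale=1 / np.sqrt(2), size=n)

# Eigenvalue spacing of a 2x2 symmetric matrix, normalized to mean 1.
s = np.sqrt((a - d) ** 2 + 4 * b ** 2)
s /= s.mean()

# Level repulsion: p(s) -> 0 as s -> 0, unlike the Poisson case
# where p(0) = 1, so very small spacings are strongly suppressed.
frac_small = np.mean(s < 0.1)
```

For the Wigner surmise, P(s < 0.1) ≈ 0.008, an order of magnitude below the Poisson value of ≈ 0.095, which is exactly the spectral signature used to distinguish chaotic from regular level statistics.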
Probability Model of Allele Frequency of Alzheimer’s Disease Genetic Risk Factor
Directory of Open Access Journals (Sweden)
Afshin Fayyaz-Movaghar
2016-06-01
Full Text Available Background and Purpose: The identification of genetic risk factors of human diseases is very important. This study was conducted to model the allele frequencies (AFs) of Alzheimer’s disease. Materials and Methods: In this study, several candidate probability distributions were fitted to a data set of an Alzheimer’s disease genetic risk factor. The unknown parameters of the considered distributions were estimated, and several goodness-of-fit criteria were calculated for the sake of comparison. Results: Based on these statistical criteria, the beta distribution gives the best fit to the AFs. However, the estimated values of the parameters of the beta distribution lead us to the standard uniform distribution. Conclusion: The AFs of Alzheimer’s disease follow the standard uniform distribution.
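The fitting-and-comparison procedure can be illustrated with SciPy: fit a beta distribution to frequencies on (0, 1) and check the fit against the standard uniform. With uniform data, the fitted beta shape parameters come out near 1, mirroring the paper's conclusion that beta(1, 1) reduces to the standard uniform; the synthetic data below stand in for the real AFs:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
afs = rng.uniform(size=500)   # synthetic allele frequencies in (0, 1)

# Fit a beta distribution with support fixed to (0, 1); if the data
# are uniform, both shape parameters should be close to 1.
a, b, loc, scale = stats.beta.fit(afs, floc=0, fscale=1)

# Goodness of fit against the standard uniform distribution.
ks = stats.kstest(afs, "uniform")
```

A small Kolmogorov–Smirnov statistic here confirms that the beta fit adds nothing over the standard uniform, which is the paper's conclusion for the AF data.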
Fast and Statistically Efficient Fundamental Frequency Estimation
DEFF Research Database (Denmark)
Nielsen, Jesper Kjær; Jensen, Tobias Lindstrøm; Jensen, Jesper Rindom
2016-01-01
Fundamental frequency estimation is a very important task in many applications involving periodic signals. For computational reasons, fast autocorrelation-based estimation methods are often used despite parametric estimation methods having superior estimation accuracy. However, these parametric...... a recursive solver. Via benchmarks, we demonstrate that the computation time is reduced by approximately two orders of magnitude. The proposed fast algorithm is available for download online....
Energy Technology Data Exchange (ETDEWEB)
Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung [Hanyang Univ., Seoul (Korea, Republic of); Noh, Jae Man [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)
2013-10-15
The uncertainty evaluation with the statistical method is performed by repetition of the transport calculation with sampling of the directly perturbed nuclear data. Hence, a reliable uncertainty result can be obtained by analyzing the results of the numerous transport calculations. One of the known problems in uncertainty analysis with the statistical approach is that cross-section sampling from the normal (Gaussian) distribution with a relatively large standard deviation leads to sampling errors, such as the sampling of negative cross sections. Some correction methods have been noted; however, these methods can distort the distribution of the sampled cross sections. In this study, a sampling method for the nuclear data using the lognormal distribution is proposed. Criticality calculations with the sampled nuclear data are then performed and the results are compared with those from the normal distribution conventionally used in previous studies. The statistical sampling method with the lognormal distribution was proposed to increase the sampling accuracy without negative sampling errors, and a stochastic cross-section sampling and writing program was developed. For the sensitivity and uncertainty analysis, cross-section sampling was performed with both the normal and lognormal distributions. The uncertainties caused by the covariance of the (n,.) cross sections were evaluated by solving the GODIVA problem. The results show that the sampling method with the lognormal distribution can efficiently solve the negative sampling problem referred to in previous studies. It is expected that this study will contribute to increasing the accuracy of sampling-based uncertainty analysis.
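The negative-sampling problem and its lognormal remedy can be sketched directly: draw a cross section with a large relative standard deviation from a normal distribution (some draws go negative) and from a lognormal distribution with matched mean and variance (all draws stay positive). The numbers are illustrative, not the study's covariance data:

```python
import numpy as np

rng = np.random.default_rng(5)
mean, rel_std = 1.0, 0.4          # cross section with 40% relative std. dev.

# Direct normal sampling can produce unphysical negative cross sections.
normal_draws = rng.normal(mean, rel_std * mean, size=100_000)
frac_negative = np.mean(normal_draws < 0)

# Lognormal sampling with moments matched to the same mean and variance
# is strictly positive by construction.
sigma2 = np.log(1.0 + rel_std ** 2)   # lognormal shape parameter
mu = np.log(mean) - 0.5 * sigma2      # preserves the target mean
lognormal_draws = rng.lognormal(mu, np.sqrt(sigma2), size=100_000)
```

Moment matching is the key point: the lognormal draws reproduce the prescribed mean and standard deviation while eliminating the negative tail entirely.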
Discrete- and finite-bandwidth-frequency distributions in nonlinear stability applications
Kuehl, Joseph J.
2017-02-01
A new "wave packet" formulation of the parabolized stability equations method is presented. This method accounts for the influence of finite-bandwidth-frequency distributions on nonlinear stability calculations. The methodology is motivated by convolution integrals and is found to appropriately represent nonlinear energy transfer between primary modes and harmonics, in particular nonlinear feedback, via a "nonlinear coupling coefficient." It is found that traditional discrete mode formulations overestimate nonlinear feedback by approximately 70%. This results in smaller maximum disturbance amplitudes than those observed experimentally. The new formulation corrects this overestimation, accounts for the generation of side lobes responsible for spectral broadening, and results in disturbance representation more consistent with the experiment than traditional formulations. A Mach 6 flared-cone example is presented.
Directory of Open Access Journals (Sweden)
M.M. Mohie El-Din
2011-10-01
Full Text Available In this paper, two-sample Bayesian prediction intervals for order statistics (OS) are obtained. This prediction is based on a certain class of inverse exponential-type distributions using a right-censored sample. A general class of prior density functions is used and the predictive cumulative distribution function is obtained in the two-sample case. The class of inverse exponential-type distributions includes several important distributions such as the inverse Weibull distribution, the inverse Burr distribution, the log-logistic distribution, the inverse Pareto distribution and the inverse paralogistic distribution. Special cases of the inverse Weibull model such as the inverse exponential model and the inverse Rayleigh model are considered.
Riehle, Fritz
2006-01-01
Of all measurement units, frequency is the one that may be determined with the highest degree of accuracy. It equally allows precise measurements of other physical and technical quantities, whenever they can be measured in terms of frequency. This volume covers the central methods and techniques relevant for frequency standards developed in physics, electronics, quantum electronics, and statistics. After a review of the basic principles, the book looks at the realisation of commonly used components. It then continues with the description and characterisation of important frequency standards
Statistical modeling of total crash frequency at highway intersections
Directory of Open Access Journals (Sweden)
Arash M. Roshandeh
2016-04-01
Full Text Available Intersection-related crashes account for a high proportion of accidents involving drivers, occupants, pedestrians, and cyclists. In general, the purpose of intersection safety analysis is to determine the impact of safety-related variables on pedestrians, cyclists and vehicles, so as to facilitate the design of effective and efficient countermeasure strategies to improve safety at intersections. This study investigates the effects of traffic, environmental, intersection geometric and pavement-related characteristics on total crash frequencies at intersections. A random-parameter Poisson model was used with crash data from 357 signalized intersections in Chicago from 2004 to 2010. The results indicate that, of the identified factors, evening peak-period traffic volume, pavement condition, and unlighted intersections have the greatest effects on crash frequencies. Overall, the results suggest that, in order to implement effective safety countermeasures at intersections, significant attention must be focused on ensuring that pavements are adequately maintained and that intersections are well lighted. It should be mentioned that projects could have been implemented at and around the study intersections during the study period (7 years), which could affect crash frequency over time. This is an important variable that could be part of future studies investigating the impacts of safety-related works at intersections and their marginal effects on crash frequency at signalized intersections.
Phenomenological model to fit complex permittivity data of water from radio to optical frequencies.
Shubitidze, Fridon; Osterberg, Ulf
2007-04-01
A general factorized form of the dielectric function together with a fractional model-based parameter estimation method is used to provide an accurate analytical formula for the complex refractive index in water for the frequency range 10^8-10^16 Hz. The analytical formula is derived using a combination of a microscopic frequency-dependent rational function for adjusting zeros and poles of the dielectric dispersion together with the macroscopic statistical Fermi-Dirac distribution to provide a description of both the real and imaginary parts of the complex permittivity for water. The Fermi-Dirac distribution allows us to model the dramatic reduction in the imaginary part of the permittivity in the visible window of the water spectrum.
Lu, Xian; Chu, Xinzhao; Li, Haoyu; Chen, Cao; Smith, John A.; Vadas, Sharon L.
2017-09-01
We present the first statistical study of gravity waves with periods of 0.3-2.5 h that are persistent and dominant in the vertical winds measured with the University of Colorado STAR Na Doppler lidar in Boulder, CO (40.1°N, 105.2°W). The probability density functions of the wave amplitudes in temperature and vertical wind, ratios of these two amplitudes, phase differences between them, and vertical wavelengths are derived directly from the observations. The intrinsic period and horizontal wavelength of each wave are inferred from its vertical wavelength, amplitude ratio, and a designated eddy viscosity by applying the gravity wave polarization and dispersion relations. The amplitude ratios are positively correlated with the ground-based periods with a coefficient of 0.76. The phase differences between the vertical winds and temperatures (φW - φT) follow a Gaussian distribution with 84.2±26.7°, which has a much larger standard deviation than that predicted for non-dissipative waves (~3.3°). The deviations of the observed phase differences from their predicted values for non-dissipative waves may indicate wave dissipation. The shorter-vertical-wavelength waves tend to have larger phase difference deviations, implying that the dissipative effects are more significant for shorter waves. The majority of these waves have vertical wavelengths ranging from 5 to 40 km with a mean and standard deviation of 18.6 and 7.2 km, respectively. For waves with similar periods, multiple peaks in the vertical wavelengths are identified frequently, and the ones peaking in the vertical wind are statistically longer than those peaking in the temperature. The horizontal wavelengths range mostly from 50 to 500 km with a mean and median of 180 and 125 km, respectively. Therefore, these waves are mesoscale waves with high-to-medium frequencies. Since they have recently become resolvable in high-resolution general circulation models (GCMs), this statistical study provides an important
Loxley, P N
2017-10-01
The two-dimensional Gabor function is adapted to natural image statistics, leading to a tractable probabilistic generative model that can be used to model simple cell receptive field profiles, or generate basis functions for sparse coding applications. Learning is found to be most pronounced in three Gabor function parameters representing the size and spatial frequency of the two-dimensional Gabor function and characterized by a nonuniform probability distribution with heavy tails. All three parameters are found to be strongly correlated, resulting in a basis of multiscale Gabor functions with similar aspect ratios and size-dependent spatial frequencies. A key finding is that the distribution of receptive-field sizes is scale invariant over a wide range of values, so there is no characteristic receptive field size selected by natural image statistics. The Gabor function aspect ratio is found to be approximately conserved by the learning rules and is therefore not well determined by natural image statistics. This allows for three distinct solutions: a basis of Gabor functions with sharp orientation resolution at the expense of spatial-frequency resolution, a basis of Gabor functions with sharp spatial-frequency resolution at the expense of orientation resolution, or a basis with unit aspect ratio. Arbitrary mixtures of all three cases are also possible. Two parameters controlling the shape of the marginal distributions in a probabilistic generative model fully account for all three solutions. The best-performing probabilistic generative model for sparse coding applications is found to be a Gaussian copula with Pareto marginal probability density functions.
Olurotimi, E. O.; Sokoya, O.; Ojo, J. S.; Owolawi, P. A.
2018-03-01
Rain height is one of the significant parameters for prediction of rain attenuation on Earth-space telecommunication links, especially those operating at frequencies above 10 GHz. This study examines the three-parameter Dagum distribution of rain height over Durban, South Africa. Five years of data were used to study the monthly, seasonal, and annual variations using parameters estimated by maximum likelihood. The performance of the distribution was assessed using statistical goodness-of-fit measures. The three-parameter Dagum distribution proves appropriate for modeling rain height over Durban, with a root mean square error of 0.26. The shape and scale parameters of the distribution show a wide variation. The 0.01% exceedance probability indicates a high probability of rain attenuation at higher frequencies.
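A maximum-likelihood fit of the three-parameter Dagum distribution can be sketched by writing out the Dagum pdf and minimizing the negative log-likelihood; the sample below is synthetic and the parameter values are illustrative, not the Durban estimates:

```python
import numpy as np
from scipy.optimize import minimize

def dagum_nll(log_params, x):
    # Negative log-likelihood of the three-parameter Dagum distribution,
    # pdf f(x) = (a*p/x) * (x/b)**(a*p) / ((x/b)**a + 1)**(p + 1),
    # parameterized by logs so that a, b, p stay positive.
    a, b, p = np.exp(log_params)
    z = x / b
    logpdf = (np.log(a * p) - np.log(x)
              + a * p * np.log(z)
              - (p + 1.0) * np.log(z ** a + 1.0))
    return -np.sum(logpdf)

# Synthetic "rain height" sample drawn from a Dagum law by inverse-CDF
# sampling, using F(x) = (1 + (x/b)**(-a))**(-p).
rng = np.random.default_rng(6)
a_true, b_true, p_true = 4.0, 4.5, 1.2
u = rng.uniform(size=5000)
sample = b_true * (u ** (-1.0 / p_true) - 1.0) ** (-1.0 / a_true)

# Maximum-likelihood fit, as in the study.
res = minimize(dagum_nll, x0=np.log([1.0, np.median(sample), 1.0]),
               args=(sample,), method="Nelder-Mead",
               options={"maxiter": 5000, "xatol": 1e-9, "fatol": 1e-9})
a_hat, b_hat, p_hat = np.exp(res.x)
```

Comparing the fitted CDF against the empirical CDF (a KS-type statistic) is a simple stand-in for the goodness-of-fit checks reported in the abstract.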
Gyenge, N.; Ballai, I.; Baranyi, T.
2016-07-01
The aim of the present investigation is to study the spatio-temporal distribution of precursor flares during the 24 h interval preceding M- and X-class major flares and the evolution of follower flares. Information on associated (precursor and follower) flares is provided by the Reuven Ramaty High Energy Solar Spectroscopic Imager (RHESSI) flare list, while the major flares are observed by the Geostationary Operational Environmental Satellite (GOES) system satellites between 2002 and 2014. There are distinct evolutionary differences between the spatio-temporal distributions of associated flares over the one-day period depending on the type of the main flare. The spatial distribution was characterized by the normalized frequency distribution of the quantity δ (the distance between the major flare and its precursor flare, normalized by the sunspot-group diameter) in four 6 h time intervals before the major event. The precursors of X-class flares have a double-peaked spatial distribution for more than half a day prior to the major flare, but it changes to a lognormal-like distribution roughly 6 h prior to the event. The precursors of M-class flares show a lognormal-like distribution in each 6 h subinterval. The most frequent sites of the precursors in the active region are within a distance of about 0.1 sunspot-group diameters from the site of the major flare in each case. Our investigation shows that during the precursor phase the build-up of energy is more effective than the release of energy.
International Nuclear Information System (INIS)
D'Oliveira, A.B.; Amorim, E.S. do; Galvao, O.B.
1981-03-01
Double differential cross sections for thermal neutrons, based on the incoherent approximation and using a continuum distribution as a discrete frequency set, are theoretically estimated for two previously developed models. The FASTT computer program is used to obtain numerical estimates. (L.C.) [pt
Jin, Tian; Yuan, Heliang; Zhao, Na; Qin, Honglei; Sun, Kewen; Ji, Yuanfa
2017-12-04
The frequency-locked detector (FLD) has been widely utilized in the tracking loops of Global Positioning System (GPS) receivers to indicate their locking status, yet the relation between the FLD output and lock status has seldom been discussed, and experience with traditional phase-locked loops (PLLs) does not carry over to frequency-locked loops (FLLs). In this paper, threshold setting criteria for the frequency-locked detector in a GPS receiver are proposed by analyzing the statistical characteristics of the FLD output. The approximate probability distribution of the frequency-locked detector is theoretically derived using a statistical approach, which reveals the relationship between the probabilities of the frequency-locked detector and the carrier-to-noise ratio (C/N₀) of the received GPS signal. The relationship among mean time to lose lock (MTLL), detection threshold, and lock probability related to C/N₀ can be further discovered by utilizing this probability. Therefore, a theoretical basis for threshold setting criteria in frequency-locked loops for GPS receivers is provided based on mean-time-to-lose-lock analysis.
Energy Technology Data Exchange (ETDEWEB)
NONE
2008-07-01
The general direction of energy and raw materials (DGEMP) is in charge of the follow-up of the French market of petroleum products distribution. Each year, a statistical inquiry is carried out with the suppliers and published in the annual report of the petroleum industry. A particular emphasis is laid on fuel sales at highway service stations, which give some additional information. The 2007 sales remained stable with respect to 2006 but show a significant progress of diesel with respect to gasoline. Super-ethanol (E85) sales remain modest but rose markedly over the year. Finally, 70% of the market is in the hands of only 3 suppliers. (J.S.)
Lozano-Cortés, Diego; Berumen, Michael L.
2015-01-01
Coral colony size-frequency distributions can be used to assess population responses to local environmental conditions and disturbances. In this study, we surveyed juvenile pocilloporids, herbivorous fish densities, and algal cover in the central
Bearings fault detection in helicopters using frequency readjustment and cyclostationary analysis
Girondin, Victor; Pekpe, Komi Midzodzi; Morel, Herve; Cassar, Jean-Philippe
2013-07-01
The objective of this paper is to propose a vibration-based automated framework for detecting local faults occurring on bearings in the transmission of a helicopter. Knowledge of the shaft speed and kinematic computation provide theoretical frequencies that reveal deteriorations on the inner and outer races, on the rolling elements, or on the cage. In practice, the theoretical frequencies of bearing faults may be shifted. They may also be masked by parasitic frequencies, because the numerous noisy vibrations and the complexity of the transmission mechanics make the signal spectrum very profuse. Consequently, detection methods based on monitoring the theoretical frequencies may lead to wrong decisions. To deal with this drawback, we propose to readjust the fault frequencies from the theoretical frequencies using the redundancy introduced by the harmonics. The proposed method provides a confidence index for the readjusted frequency. Minor variations in shaft speed may induce random jitters, and changes in the contact surface or in the transmission path also introduce a random component in amplitude and phase. These random components destroy the spectral localization of frequencies and thus hide the fault occurrence in the spectrum. Under the hypothesis that these random signals can be modeled as cyclostationary signals, the envelope spectrum can reveal those hidden patterns. To provide an indicator estimating fault severity, statistics are proposed under the hypothesis that the harmonics at the readjusted frequency are corrupted with additive, normally distributed noise. In this case, the statistics computed from the spectra are chi-square distributed and a signal-to-noise indicator is proposed. The algorithms are then tested with data from two test benches and from flight conditions. The bearing type and the radial load are the main differences between the experiments on the benches. The fault is mainly visible in the
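The envelope-spectrum step invoked above can be sketched in a few lines. The signal below is synthetic (an amplitude-modulated carrier standing in for a bearing fault at a hypothetical 120 Hz fault frequency); it is not the paper's data or full framework.

```python
import numpy as np
from scipy.signal import hilbert

# Hypothetical setup: a 10 kHz-sampled vibration signal in which a
# 120 Hz "fault" modulates the amplitude of a 3 kHz carrier.
fs, fault_hz, carrier_hz = 10_000, 120.0, 3_000.0
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(0)
signal = (1 + 0.5 * np.cos(2 * np.pi * fault_hz * t)) * np.cos(2 * np.pi * carrier_hz * t)
signal += 0.1 * rng.standard_normal(t.size)

# Envelope spectrum: magnitude spectrum of the analytic-signal envelope.
envelope = np.abs(hilbert(signal))
spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(t.size, 1 / fs)

# The strongest envelope line sits at the modulation (fault) frequency.
peak_hz = freqs[np.argmax(spectrum)]
print(peak_hz)  # expect ~120 Hz
```

The point of the envelope step is exactly what the abstract describes: the fault frequency is invisible as a spectral line of the raw signal but reappears as a line in the spectrum of its envelope.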
Statistical analysis of the surface figure of the James Webb Space Telescope
Lightsey, Paul A.; Chaney, David; Gallagher, Benjamin B.; Brown, Bob J.; Smith, Koby; Schwenker, John
2012-09-01
The performance of an optical system is best characterized by either the point spread function (PSF) or the optical transfer function (OTF). However, for system budgeting purposes, it is convenient to use a single scalar metric, or a combination of a few scalar metrics, to track performance. For the James Webb Space Telescope, the Observatory-level requirements were expressed in the metrics of Strehl ratio and encircled energy. These in turn were converted to the metrics of total rms WFE and rms WFE within spatial frequency domains. The 18 individual mirror segments for the primary mirror segment assemblies (PMSA), the secondary mirror (SM), tertiary mirror (TM), and fine steering mirror have all been fabricated. They are polished beryllium mirrors with a protected gold reflective coating. The resulting surface figure error of these mirrors has been statistically analyzed. The average spatial frequency distribution and the mirror-to-mirror consistency of the spatial frequency distribution are reported. The results provide insight into the system budgeting processes for similar optical systems.
Statistics of the acoustic emission signals parameters from Zircaloy-4 fuel cladding
International Nuclear Information System (INIS)
Oliveto, Maria E.; Lopez Pumarega, Maria I.; Ruzzante, Jose E.
2000-01-01
A statistical analysis of acoustic emission signal parameters (amplitude, duration, and risetime) was carried out. CANDU-type Zircaloy-4 fuel claddings, one set of five normal pieces and six with included defects, were pressurized up to rupture while acoustic emission was monitored on-line. The amplitude and duration frequency distributions were fitted with lognormal distribution functions, and the risetime with an exponential one. Using analysis of variance, acoustic emission proved appropriate for distinguishing between the defective and non-defective subsets. Cluster analysis applied to the mean values of the acoustic emission signal parameters was not effective in distinguishing the two sets of fuel claddings studied. (author)
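Fitting a lognormal to amplitude data, as described above, can be sketched with SciPy. The amplitudes here are synthetic draws with known parameters, not the Zircaloy-4 measurements.

```python
import numpy as np
from scipy import stats

# Synthetic stand-in for the amplitude data: draws from a lognormal
# with known mu = 3.5, sigma = 0.4 (arbitrary values, not the paper's).
rng = np.random.default_rng(1)
amplitudes = rng.lognormal(mean=3.5, sigma=0.4, size=2000)

# Fix loc = 0 so the fit estimates only shape (sigma) and scale (e^mu).
shape, loc, scale = stats.lognorm.fit(amplitudes, floc=0)
mu_hat, sigma_hat = np.log(scale), shape

# Goodness of fit via Kolmogorov-Smirnov against the fitted distribution.
ks = stats.kstest(amplitudes, 'lognorm', args=(shape, loc, scale))
print(round(mu_hat, 2), round(sigma_hat, 2))  # near 3.5 and 0.4
```

The same pattern (fit, then test the fitted distribution) applies to the exponential risetime model with `stats.expon`.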
Probability distribution functions for ELM bursts in a series of JET tokamak discharges
International Nuclear Information System (INIS)
Greenhough, J; Chapman, S C; Dendy, R O; Ward, D J
2003-01-01
A novel statistical treatment of the full raw edge localized mode (ELM) signal from a series of previously studied JET plasmas is tested. The approach involves constructing probability distribution functions (PDFs) for ELM amplitudes and time separations, and quantifying the fit between the measured PDFs and model distributions (Gaussian, inverse exponential) and Poisson processes. Uncertainties inherent in the discreteness of the raw signal require the application of statistically rigorous techniques to distinguish ELM data points from background, and to extrapolate peak amplitudes. The accuracy of PDF construction is further constrained by the relatively small number of ELM bursts (several hundred) in each sample. In consequence, the statistical technique is found to be difficult to apply to low-frequency (typically Type I) ELMs, so the focus is narrowed to four JET plasmas with high-frequency (typically Type III) ELMs. The results suggest that there may be several fundamentally different kinds of Type III ELMing process at work. It is concluded that this novel statistical treatment can be made to work, may have wider applications to ELM data, and has immediate practical value as an additional quantitative discriminant between classes of ELMing behaviour.
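The time-separation part of the analysis can be illustrated with a synthetic stand-in: inter-event waiting times of a Poisson process are exponentially distributed, so drawing event times, fitting an exponential to the separations, and testing the fit mimics the PDF-comparison step (none of this is JET data, and the scale is arbitrary).

```python
import numpy as np
from scipy import stats

# Synthetic "burst" times from a Poisson process with mean separation 5
# (arbitrary units); the waiting times are then exponential by construction.
rng = np.random.default_rng(4)
event_times = np.cumsum(rng.exponential(scale=5.0, size=500))
waits = np.diff(event_times)

# Fit an exponential PDF to the separations and test the fit.
loc, scale = stats.expon.fit(waits, floc=0)
ks = stats.kstest(waits, 'expon', args=(loc, scale))
print(round(scale, 2))  # close to the true mean separation of 5
```

With only a few hundred bursts per sample, as the abstract notes, the KS statistic's discriminating power is limited, which is precisely why the rigor of the PDF construction matters.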
Hefferman, Gerald; Chen, Zhen; Wei, Tao
2017-07-01
This article details the generation of an extended-bandwidth frequency sweep using a single, communication grade distributed feedback (DFB) laser. The frequency sweep is generated using a two-step technique. In the first step, injection current modulation is employed as a means of varying the output frequency of a DFB laser over a bandwidth of 99.26 GHz. A digital optical phase lock loop is used to lock the frequency sweep speed during current modulation, resulting in a linear frequency chirp. In the second step, the temperature of the DFB laser is modulated, resulting in a shifted starting laser output frequency. A laser frequency chirp is again generated beginning at this shifted starting frequency, resulting in a frequency-shifted spectrum relative to the first recorded data. This process is then repeated across a range of starting temperatures, resulting in a series of partially overlapping, frequency-shifted spectra. These spectra are then aligned using cross-correlation and combined using averaging to form a single, broadband spectrum with a total bandwidth of 510.9 GHz. In order to investigate the utility of this technique, experimental testing was performed in which the approach was used as the swept-frequency source of a coherent optical frequency domain reflectometry system. This system was used to interrogate an optical fiber containing a 20 point, 1-mm pitch length fiber Bragg grating, corresponding to a period of 100 GHz. Using this technique, both the periodicity of the grating in the frequency domain and the individual reflector elements of the structure in the time domain were resolved, demonstrating the technique's potential as a method of extending the sweeping bandwidth of semiconductor lasers for frequency-based sensing applications.
Changkit, N.; Boonkrongcheep, R.; Youngchauy, U.; Polthum, S.; Kessaratikoon, P.
2017-09-01
The specific activities of natural radionuclides (40K, 226Ra, and 232Th) in 50 surface beach sand samples, collected from Bangsaen beach in Chonburi province in the eastern region of Thailand, were measured and evaluated. Experimental results were obtained using a high-purity germanium (HPGe) detector and a gamma spectrometry analysis system in the special laboratory at the Thailand Institute of Nuclear Technology (Public Organization). The IAEA-SOIL-375 reference material was used to analyze the concentrations of 40K, 226Ra, and 232Th in all samples. The specific activities of 40K, 226Ra, and 232Th ranged from 510.85 - 771.35, 8.17 - 17.06, and 4.25 - 15.68 Bq/kg, respectively. Furthermore, the frequency distributions of the specific activities were studied with a statistical computer program and found to be asymmetrical. Moreover, four radiological hazard indices for the investigated area were calculated using the median values of the specific activities of 40K, 226Ra, and 232Th. The results were compared with the annual report data of the Office of Atoms for Peace (OAP), Thailand, and with global radioactivity measurements and evaluations.
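One commonly used hazard index of the kind computed here is the radium-equivalent activity, Raeq = C_Ra + 1.43 C_Th + 0.077 C_K, with 370 Bq/kg as the usual limit. The sketch below uses hypothetical median values lying within the reported ranges; the paper's four indices and actual medians may differ.

```python
# Radium-equivalent activity, a standard index in such surveys:
# Raeq = C_Ra + 1.43*C_Th + 0.077*C_K, with 370 Bq/kg as the limit.
def radium_equivalent(c_ra, c_th, c_k):
    """Raeq in Bq/kg from 226Ra, 232Th and 40K specific activities (Bq/kg)."""
    return c_ra + 1.43 * c_th + 0.077 * c_k

# Hypothetical medians chosen inside the reported ranges, not the paper's.
ra_eq = radium_equivalent(c_ra=12.0, c_th=9.0, c_k=640.0)
print(round(ra_eq, 2))  # 74.15, well below the 370 Bq/kg limit
```

The weighting factors reflect that 1 Bq/kg of 232Th and 40K produce about 1.43 and 0.077 times the gamma dose of 1 Bq/kg of 226Ra.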
Directory of Open Access Journals (Sweden)
Chen Cao
2016-09-01
This study focused on producing flash flood hazard susceptibility maps (FFHSM) using frequency ratio (FR) and statistical index (SI) models in the Xiqu Gully (XQG) of Beijing, China. First, a total of 85 flash flood hazard locations (n = 85) were surveyed in the field and plotted using geographic information system (GIS) software. Based on the flash flood hazard locations, a flood hazard inventory map was built. Seventy percent (n = 60) of the flood hazard locations were randomly selected for building the models, and the remaining 30% (n = 25) were used for validation. Because the XQG used to be a coal-mining area, coal-mine caves, subsidence caused by coal mining, and many ground fissures exist in this catchment; this study therefore took the subsidence risk level into consideration for the FFHSM. The ten conditioning parameters were elevation, slope, curvature, land use, geology, soil texture, subsidence risk area, stream power index (SPI), topographic wetness index (TWI), and short-term heavy rain. This study also tested different classification schemes for the values of each conditioning parameter and checked their impacts on the results. The accuracy of the FFHSM was validated using area under the curve (AUC) analysis. Classification accuracies were 86.61%, 83.35%, and 78.52% using the frequency ratio (FR)-natural breaks, statistical index (SI)-natural breaks, and FR-manual classification schemes, respectively. Associated prediction accuracies were 83.69%, 81.22%, and 74.23%, respectively. It was found that FR modeling with a natural breaks classification method was more appropriate for generating the FFHSM for the Xiqu Gully.
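The frequency-ratio computation at the heart of the FR model can be sketched on a toy raster. The FR of a class is the share of flood cells falling in that class divided by the class's share of total area; all values below are invented.

```python
import numpy as np

# Toy raster: the class of each cell (e.g. a slope class) and a flag
# marking observed flash-flood cells. Both arrays are invented.
classes = np.array([0, 0, 1, 1, 1, 2, 2, 2, 2, 2])
flooded = np.array([0, 0, 1, 1, 0, 1, 0, 0, 0, 0])

def frequency_ratio(classes, flooded):
    """FR per class: % of flood cells in class / % of area in class."""
    fr = {}
    n_cells, n_floods = classes.size, flooded.sum()
    for c in np.unique(classes):
        in_class = classes == c
        pct_floods = flooded[in_class].sum() / n_floods
        pct_area = in_class.sum() / n_cells
        fr[int(c)] = pct_floods / pct_area
    return fr

print(frequency_ratio(classes, flooded))  # class 1 is flood-prone (FR > 1)
```

Summing the FR values of a cell's classes over all conditioning parameters gives that cell's susceptibility score, which is then binned (e.g. by natural breaks) into the hazard map.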
A modular multiple use system for precise time and frequency measurement and distribution
Reinhardt, V. S.; Adams, W. S.; Lee, G. M.; Bush, R. L.
1978-01-01
A modular CAMAC-based system is described which was developed to meet a variety of precise time and frequency measurement and distribution needs. The system is based on a generalization of the dual-mixer concept. By using a 16-channel 100 ns event clock, the system can intercompare the phase of 16 frequency standards with subpicosecond resolution. The system has a noise floor of 26 fs and a long-term stability on the order of 1 ps or better. The system also uses a digitally controlled crystal oscillator in a control loop to provide an offsettable 5 MHz output with subpicosecond phase-tracking capability. A detailed description of the system is given, including theory of operation and performance. A method is discussed for improving the performance of the dual-mixer technique when phase balancing of the two input ports cannot be accomplished.
Smith, S. Jerrod; Esralew, Rachel A.
2010-01-01
drainage-basin outlet for the period 1961-1990, 10-85 channel slope (slope between points located at 10 percent and 85 percent of the longest flow-path length upstream from the outlet), and percent impervious area. The Oklahoma StreamStats application interacts with the National Streamflow Statistics database, which contains the peak-flow regression equations in a previously published report. Fourteen peak-flow (flood) frequency statistics are available for computation in the Oklahoma StreamStats application. These statistics include the peak flow at 2-, 5-, 10-, 25-, 50-, 100-, and 500-year recurrence intervals for rural, unregulated streams; and the peak flow at 2-, 5-, 10-, 25-, 50-, 100-, and 500-year recurrence intervals for rural streams that are regulated by Natural Resources Conservation Service floodwater retarding structures. Basin characteristics and streamflow statistics cannot be computed for locations in playa basins (mostly in the Oklahoma Panhandle) and along main stems of the largest river systems in the state, namely the Arkansas, Canadian, Cimarron, Neosho, Red, and Verdigris Rivers, because parts of the drainage areas extend outside of the processed hydrologic units.
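As background to the flood-frequency statistics (the report itself uses published regression equations, which are not reproduced here), empirical recurrence intervals of annual peak flows are often assigned with the Weibull plotting position T = (n + 1) / m, where m is the rank of the peak:

```python
import numpy as np

# Invented annual peak flows for a 10-year record (arbitrary units).
peaks = np.array([120., 95., 210., 80., 150., 60., 300., 110., 90., 130.])

# Rank the peaks (largest = rank 1) and apply T = (n + 1) / m.
order = np.argsort(peaks)[::-1]
ranks = np.empty_like(order)
ranks[order] = np.arange(1, peaks.size + 1)
recurrence = (peaks.size + 1) / ranks  # empirical recurrence interval, years

# The largest observed peak has an empirical recurrence of 11 years.
print(recurrence[np.argmax(peaks)])  # 11.0
```

Regional regression equations like those behind StreamStats exist precisely because such short at-site records cannot support 100- or 500-year estimates directly.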
Efficient Partitioning of Large Databases without Query Statistics
Directory of Open Access Journals (Sweden)
Shahidul Islam KHAN
2016-11-01
An efficient way of improving the performance of a database management system is distributed processing. Distribution of data involves fragmentation (partitioning), replication, and allocation. Previous research works provided partitioning based on empirical data about the type and frequency of the queries. These solutions are not suitable at the initial stage of a distributed database, as query statistics are not available then. In this paper, I have presented a fragmentation technique, Matrix based Fragmentation (MMF), which can be applied at the initial stage as well as at later stages of distributed databases. Instead of using empirical data, I have developed a matrix, Modified Create, Read, Update and Delete (MCRUD), to partition a large database properly. Allocation of fragments is done simultaneously in my proposed technique, so using MMF, no additional complexity is added for allocating the fragments to the sites of a distributed database, as fragmentation is synchronized with allocation. The performance of a DDBMS can be improved significantly by avoiding frequent remote access and high data transfer among the sites. Results show that the proposed technique can solve the initial partitioning problem of large distributed databases.
Le Moigne, N.; van den Oever, M.J.A.; Budtova, T.
2011-01-01
Using high resolution optical microscopy coupled with image analysis software and statistical methods, fibre length and aspect ratio distributions in polypropylene composites were characterized. Three types of fibres, flax, sisal and wheat straw, were studied. Number and surface weighted
International Nuclear Information System (INIS)
Kitagawa, Takuya; Oka, Takashi; Demler, Eugene
2012-01-01
In this paper, we study the full conductance statistics of a disordered 1D wire under the application of light. We develop the transfer matrix method for periodically driven systems to analyze the conductance of a large system at small light frequency, where coherent photon absorptions play an important role in determining not only the average but also the shape of the conductance distributions. The average conductance under the application of light results from the competition between dynamic localization and an effective increase of dimension, and shows non-monotonic behavior as a function of driving amplitude. On the other hand, the shape of the conductance distribution displays a crossover phenomenon at intermediate disorder strength: the application of light dramatically changes the distribution from log-normal to normal. Furthermore, we propose that the conductance of disordered systems can be controlled by engineering the shape, frequency, and amplitude of the light. Changing the shape of the driving field controls the time-reversal symmetry, and the disordered system shows behavior analogous to the negative magneto-resistance known from static weak localization. A small change of the frequency and amplitude of the light leads to a large change of conductance, displaying a giant opto-response. Our work advances the perspective of controlling the mean as well as the full conductance statistics by coherently driving disordered systems. - Highlights: ► We study the conductance of disordered systems under the application of light. ► Full conductance distributions are obtained. ► A transfer matrix method is developed for driven systems. ► Conductances are dramatically modified upon the application of light. ► Time-reversal symmetry can also be controlled by light application.
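The driven transfer-matrix machinery itself is beyond a short sketch, but its undriven building block is compact: for a static disordered 1D tight-binding wire, the growth rate of a product of 2x2 transfer matrices gives the Lyapunov exponent (inverse localization length), which is positive for any disorder. All parameter values below are illustrative.

```python
import numpy as np

# Static (undriven) 1D Anderson model: psi_{n+1} = (E - eps_n) psi_n - psi_{n-1}.
# The growth rate of the transfer-matrix product is the Lyapunov exponent,
# i.e. the inverse localization length. W, E and N are illustrative values.
rng = np.random.default_rng(6)
W, E, N = 2.0, 0.0, 20_000          # disorder width, energy, chain length
psi = np.array([1.0, 0.0])
log_growth = 0.0
for _ in range(N):
    eps = rng.uniform(-W / 2, W / 2)
    T = np.array([[E - eps, -1.0], [1.0, 0.0]])
    psi = T @ psi
    nrm = np.linalg.norm(psi)
    log_growth += np.log(nrm)       # accumulate growth, then renormalize
    psi /= nrm

lyapunov = log_growth / N
print(lyapunov > 0)  # states localize for any disorder strength in 1D
```

The paper's extension replaces each 2x2 matrix with a larger block acting on Floquet sidebands, which is what lets light reshape the conductance distribution.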
International Nuclear Information System (INIS)
Peixoto, Claudia M.; Jacomino, Vanusa Maria F.; Pego, Valdivio Damasceno
1999-01-01
The basic goal in all environmental data analysis is to characterize the value of some parameter in a portion of the environment over some period of time, to a stated degree of accuracy, from a limited number of data points. More reliable results can be obtained if a proper statistical treatment of the data, including estimates of precision, frequency distribution analysis, and group comparisons, is performed as part of an environmental surveillance program. The main objective of this paper is to describe the procedures adopted for the analysis and statistical treatment of the data obtained in the Environmental Monitoring Program of CDTN during the period 1993 to 1995. The results of total alpha and beta concentrations in airborne particulates and surface water samples are considered. In this case, the statistical treatment involved variability estimation and frequency distribution analysis. Time series analysis of the results was carried out through sequential graphics, which give information about the long-term behavior of the variables. (author)
Power-law distributions for a trapped ion interacting with a classical buffer gas.
DeVoe, Ralph G
2009-02-13
Classical collisions with an ideal gas generate non-Maxwellian distribution functions for a single ion in a radio frequency ion trap. The distributions have power-law tails whose exponent depends on the ratio of buffer gas to ion mass. This provides a statistical explanation for the previously observed transition from cooling to heating. Monte Carlo results approximate a Tsallis distribution over a wide range of parameters and have ab initio agreement with experiment.
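As a generic illustration of diagnosing power-law tails of the kind reported here (not a reproduction of the paper's collision Monte Carlo), the Hill estimator recovers a known tail exponent from synthetic Pareto samples:

```python
import numpy as np

# Synthetic heavy-tailed data with a known tail exponent alpha = 2:
# shifting numpy's Lomax samples by 1 gives P(X > x) ~ x^-alpha.
rng = np.random.default_rng(2)
alpha = 2.0
samples = rng.pareto(alpha, size=50_000) + 1.0

def hill_estimator(x, k):
    """Hill estimate of the tail exponent from the k largest samples."""
    tail = np.sort(x)[-k:]
    return k / np.sum(np.log(tail / tail[0]))

alpha_hat = hill_estimator(samples, k=5_000)
print(round(alpha_hat, 2))  # close to 2
```

In the paper's setting the exponent depends on the buffer-gas-to-ion mass ratio, so an estimator like this is the kind of tool one would apply to the simulated energy distributions.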
International Nuclear Information System (INIS)
Hanot, C.; Riaud, P.; Absil, O.; Mennesson, B.; Martin, S.; Liewer, K.; Loya, F.; Mawet, D.; Serabyn, E.
2011-01-01
A new 'self-calibrated' statistical analysis method has been developed for the reduction of nulling interferometry data. The idea is to use the statistical distributions of the fluctuating null depth and beam intensities to retrieve the astrophysical null depth (or, equivalently, the object's visibility) in the presence of fast atmospheric fluctuations. The approach yields an accuracy much better (by about an order of magnitude) than is presently possible with standard data reduction methods, because the astrophysical null depth accuracy is no longer limited by the magnitude of the instrumental phase and intensity errors but by the uncertainties on their probability distributions. This approach was tested on the sky with the two-aperture fiber nulling instrument mounted on the Palomar Hale telescope. Using our new data analysis approach alone, with no observations of calibrators, we find that error bars on the astrophysical null depth as low as a few 10^-4 can be obtained in the near-infrared, which means that null depths lower than 10^-3 can be reliably measured. This statistical analysis is not specific to our instrument and may be applicable to other interferometers.
AFD: an application for bi-molecular interaction using axial frequency distribution.
Raza, Saad; Azam, Syed Sikander
2018-03-06
Conformational flexibility and generalized structural features are responsible for specific phenomena in biological pathways. With advancements in computational chemistry, novel approaches and new methods are required to compare the dynamic nature of biomolecules; these are crucial not only for addressing dynamic functional relationships but also for gaining detailed insight into the disturbances and positional fluctuations responsible for functional shifts. With this in mind, axial frequency distribution (AFD) has been designed, developed, and implemented. AFD represents the distribution and density of ligand atoms around a particular atom or set of atoms. It enables an explanation of local movements and rotations that are not significantly highlighted by other structural and dynamical parameters. AFD can be applied to biological models representing ligand-protein interactions. It gives a comprehensive view of the binding pattern of a ligand by exploring the distribution of atoms relative to the x-y plane of the system. By taking a relative centroid on the protein or ligand, molecular interactions such as hydrogen bonds, van der Waals, and polar or ionic interactions can be analyzed to capture ligand movement, stabilization, or flexibility with respect to the protein. The AFD graph provides a residue-level depiction of the bi-molecular interaction in gradient form, which can yield specific information depending on the system of interest.
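A toy version of an AFD-style profile can be built by histogramming a ligand atom's axial position relative to a reference centroid across trajectory frames. The coordinates below are randomly generated stand-ins, not an actual MD trajectory, and the axis and binning choices are arbitrary.

```python
import numpy as np

# Invented "trajectory": 1000 frames of one ligand atom fluctuating
# around x = 4 (arbitrary units) relative to a fixed protein centroid.
rng = np.random.default_rng(5)
frames = 1000
centroid = np.zeros((frames, 3))                        # protein centroid per frame
ligand_atom = rng.normal([4.0, 1.0, 0.0], 0.5, size=(frames, 3))

# Axial displacement along x, histogrammed into a frequency profile.
rel_x = ligand_atom[:, 0] - centroid[:, 0]
counts, edges = np.histogram(rel_x, bins=20)
mode_x = 0.5 * (edges[np.argmax(counts)] + edges[np.argmax(counts) + 1])
print(round(mode_x, 1))  # near 4.0, the preferred axial position
```

A narrow, stationary peak in such a profile would indicate a stably bound atom; a broad or drifting one, the positional fluctuation the abstract discusses.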
Computational statistics handbook with Matlab
Martinez, Wendy L
2007-01-01
Prefaces Introduction What Is Computational Statistics? An Overview of the Book Probability Concepts Introduction Probability Conditional Probability and Independence Expectation Common Distributions Sampling Concepts Introduction Sampling Terminology and Concepts Sampling Distributions Parameter Estimation Empirical Distribution Function Generating Random Variables Introduction General Techniques for Generating Random Variables Generating Continuous Random Variables Generating Discrete Random Variables Exploratory Data Analysis Introduction Exploring Univariate Data Exploring Bivariate and Trivariate Data Exploring Multidimensional Data Finding Structure Introduction Projecting Data Principal Component Analysis Projection Pursuit EDA Independent Component Analysis Grand Tour Nonlinear Dimensionality Reduction Monte Carlo Methods for Inferential Statistics Introduction Classical Inferential Statistics Monte Carlo Methods for Inferential Statist...
Counting statistics in radioactivity measurements
International Nuclear Information System (INIS)
Martin, J.
1975-01-01
The application of statistical methods to radioactivity measurement problems is analyzed in several chapters devoted successively to: the statistical nature of radioactivity counts; the application to radioactive counting of two theoretical probability distributions, Poisson's distribution law and the Laplace-Gauss law; true counting laws; corrections related to the nature of the apparatus; statistical techniques in gamma spectrometry [fr
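The Poisson and Laplace-Gauss laws mentioned above connect through the familiar counting rule sigma = sqrt(N): for large mean counts, Poisson-distributed counts are well approximated by a Gaussian with that spread. A quick simulation with arbitrary values illustrates it:

```python
import numpy as np

# Simulate many repeated counting intervals with a true mean of 400
# counts (an arbitrary choice); Poisson statistics predicts a spread
# of sqrt(400) = 20 counts around that mean.
rng = np.random.default_rng(3)
mean_counts = 400.0
counts = rng.poisson(mean_counts, size=100_000)

print(round(counts.mean()), round(counts.std()))  # ~400 and ~20
```

This is why a single measurement of N counts is routinely quoted as N ± sqrt(N) in radioactivity work.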
New Statistics for Texture Classification Based on Gabor Filters
Directory of Open Access Journals (Sweden)
J. Pavlovicova
2007-09-01
The paper introduces a new method for evaluating texture segmentation efficiency. One well-known texture segmentation approach is based on Gabor filters because of their orientation and spatial-frequency character. Several statistics are used to extract more information from the results obtained by Gabor filtering. The large number of input parameters produces a wide set of results that needs to be evaluated. The evaluation method is based on assessing the intersection of the Gaussian curves of the normal distributions, and provides a new point of view for selecting the segmentation method.
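As a sketch of why Gabor filters suit texture work (a generic illustration, not the paper's segmentation or evaluation method), a hand-rolled Gabor kernel responds strongly only when its orientation matches the texture:

```python
import numpy as np
from scipy.signal import convolve2d

# Gabor kernel: a Gaussian-windowed cosine tuned to a spatial
# wavelength and orientation theta (all parameter values are arbitrary).
def gabor_kernel(size, wavelength, theta, sigma):
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)   # rotated coordinate
    gauss = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    return gauss * np.cos(2 * np.pi * xr / wavelength)

# Vertical stripes with an 8-pixel period...
stripes = np.tile(np.cos(2 * np.pi * np.arange(64) / 8), (64, 1))
k0 = gabor_kernel(15, wavelength=8, theta=0.0, sigma=4.0)
k90 = gabor_kernel(15, wavelength=8, theta=np.pi / 2, sigma=4.0)

r0 = convolve2d(stripes, k0, mode='valid')    # matched orientation
r90 = convolve2d(stripes, k90, mode='valid')  # orthogonal orientation
# ...the matched filter's response energy dwarfs the orthogonal one's.
print(np.mean(r0**2) > 100 * np.mean(r90**2))
```

Per-pixel statistics of such response maps across a filter bank are the features whose class-conditional distributions the paper's Gaussian-intersection measure compares.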
Baijal, Shruti; Nakatani, Chie; van Leeuwen, Cees; Srinivasan, Narayanan
2013-06-07
Human observers show remarkable efficiency in statistical estimation; they are able, for instance, to estimate the mean size of visual objects, even if their number exceeds the capacity limits of focused attention. This ability has been understood as the result of a distinct mode of attention, i.e. distributed attention. Compared to the focused attention mode, working memory representations under distributed attention are proposed to be more compressed, leading to reduced working memory loads. An alternate proposal is that distributed attention uses less structured, feature-level representations. These would fill up working memory (WM) more, even when target set size is low. Using event-related potentials, we compared WM loading in a typical distributed attention task (mean size estimation) to that in a corresponding focused attention task (object recognition), using a measure called contralateral delay activity (CDA). Participants performed both tasks on 2, 4, or 8 different-sized target disks. In the recognition task, CDA amplitude increased with set size; notably, however, in the mean estimation task the CDA amplitude was high regardless of set size. In particular for set-size 2, the amplitude was higher in the mean estimation task than in the recognition task. The result showed that the task involves full WM loading even with a low target set size. This suggests that in the distributed attention mode, representations are not compressed, but rather less structured than under focused attention conditions. Copyright © 2012 Elsevier Ltd. All rights reserved.
DEFF Research Database (Denmark)
Cordua, Knud Skou; Hansen, Thomas Mejer; Lange, Katrine
In order to move beyond simplified covariance-based a priori models, which are typically used for inverse problems, more complex multiple-point-based a priori models have to be considered. By means of marginal probability distributions 'learned' from a training image, sequential simulation has proven to be an efficient way of obtaining multiple realizations that honor the same multiple-point statistics as the training image. The frequency matching method provides an alternative way of formulating multiple-point-based a priori models. In this strategy the pattern frequency distributions (i.e. marginals) of the training image and a subsurface model are matched in order to obtain a solution with the same multiple-point statistics as the training image. Sequential Gibbs sampling is a simulation strategy that provides an efficient way of applying sequential simulation based algorithms as a priori...
International Nuclear Information System (INIS)
Grendel, M.
1981-01-01
Boundary conditions for distribution functions of quasiparticles scattered by an interface between two crystalline grains are presented. Contrary to former formulations, where Maxwell-Boltzmann statistics was considered, the present boundary conditions take into account the quantum statistics (Fermi-Dirac or Bose-Einstein) of the quasiparticles. Provided that only small deviations from thermodynamic equilibrium are present, the boundary conditions are linearized, and their ''renormalization'' is then investigated in the case of elastic scattering. The final results of the renormalization, obtained for a simplified model of an interface, suggest that the portion of the Fermi (Bose) quasiparticles reflected or transmitted specularly is decreased (increased) in comparison with the case of quasiparticles obeying Maxwell-Boltzmann statistics. (author)
International Nuclear Information System (INIS)
Gao, Li-Na; Liu, Fu-Hu; Lacey, Roy A.
2016-01-01
Experimental results of the transverse-momentum distributions of φ mesons and Ω hyperons produced in gold-gold (Au-Au) collisions with different centrality intervals, measured by the STAR Collaboration at different energies (7.7, 11.5, 19.6, 27, and 39 GeV) in the beam energy scan (BES) program at the relativistic heavy-ion collider (RHIC), are approximately described by the single Erlang distribution and the two-component Schwinger mechanism. Moreover, the STAR experimental transverse-momentum distributions of negatively charged particles, produced in Au-Au collisions at RHIC BES energies, are approximately described by the two-component Erlang distribution and the single Tsallis statistics. The excitation functions of free parameters are obtained from the fit to the experimental data. A weak softest point in the string tension in Ω hyperon spectra is observed at 7.7 GeV. (orig.)
pH and its frequency distribution patterns of Acid Precipitation in Japan
International Nuclear Information System (INIS)
Kitamura, Moritsugu; Katou, Takunori; Sekiguchi, Kyoichi
1991-01-01
pH data collected at the 29 stations of the Phase-I Acid Precipitation Survey over Japan, run by the Japan Environment Agency, were analyzed in terms of frequency distribution patterns. The study was undertaken from April 1984 to March 1988 and was the first survey of acid precipitation over Japan with identical sampling procedures and subsequent chemical analyses. While the annual mean pH at each station ranged from 4.4 to 5.5, the monthly mean varied more widely, from 4.0 to 7.1. A frequency distribution pattern was obtained for each station and grouped into four classes: class I, a mode at the rank of pH 4.5-4.9; class II, bimodes above and below this pH region; class III, a mode at a higher pH region; class IV, a mode at a lower pH region. The bimodal pattern is suggestive of precipitation with and without incorporation of significant amounts of basic aerosol of anthropogenic origin during the descent of rain droplets. The patterns of the stations were also classified, on the basis of the summer-winter difference, into another four classes. Winter pH values were appreciably lower than summer pHs in western parts of Japan and on the Japan Sea coast; we attribute this to the probable contribution of acidic pollutants transported by the strong winter monsoon from the Eurasian Continent. At most stations in northern and eastern Japan, the pH was higher in winter months, reflecting greater incorporation of basic materials, e.g., NH4+ and Ca2+. (author)
Van Marrewijk, N.; Mirzaei, B.; Hayton, D.; Gao, J.R.; Kao, T.Y.; Hu, Q.; Reno, J.L.
2015-01-01
We have performed frequency locking of a dual, forward-reverse emitting third-order distributed feedback quantum cascade laser (QCL) at 3.5 THz. By using both directions of THz emission in combination with two gas cells and two power detectors, we can for the first time perform frequency locking.
Khaemba, W.M.; Stein, A.
2001-01-01
This study illustrates the use of modern statistical procedures for better wildlife management by addressing three key issues: determination of abundance, modeling of animal distributions and variability of diversity in space and time. Prior information in Markov Chain Monte Carlo (MCMC) methods is
Statistical analysis of quality control of automatic processor
International Nuclear Information System (INIS)
Niu Yantao; Zhao Lei; Zhang Wei; Yan Shulin
2002-01-01
Objective: To strengthen the scientific management of automatic film processors and promote quality control (QC) by analyzing the QC management chart with statistical methods and by evaluating and interpreting the data and trends in the chart. Method: Speed, contrast, and minimum density of the step-wedge film strip were measured every day and recorded on the QC chart. The mean (x-bar), standard deviation (s), and range (R) were calculated, and the data and working trend were evaluated and interpreted to support management decisions. Results: From the relative frequency distribution curve constructed from the measured data, one can judge whether it is a symmetric bell-shaped curve. If it is not, a few extreme values overstepping the control limits are probably pulling the curve to the left or right. If the distribution is normal, the standard deviation (s) is examined: when x-bar ± 2s lies within the upper and lower control limits of the relevant performance indexes, the processor worked in a stable state during that period. Conclusion: Guided by statistical methods, QC work becomes more scientific and quantitative. Understanding and application of the trend chart are deepened, raising quality management to a new level.
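The chart statistics described in the Method and Results sections can be sketched as follows (a minimal stdlib-only illustration; the readings and target value are hypothetical, and real control limits would come from the processor's own performance indexes):

```python
import statistics

def qc_chart_stats(readings, target, n_sigma=2.0):
    """Summarize daily QC readings (e.g. film-strip speed index) and flag
    whether every reading stayed inside target +/- n_sigma * s."""
    xbar = statistics.mean(readings)
    s = statistics.stdev(readings)
    r = max(readings) - min(readings)
    lower, upper = target - n_sigma * s, target + n_sigma * s
    out_of_control = [x for x in readings if not (lower <= x <= upper)]
    return {"mean": xbar, "stdev": s, "range": r,
            "in_control": len(out_of_control) == 0}

# Hypothetical daily speed-index readings from a processor QC strip
speed = [1.20, 1.22, 1.19, 1.21, 1.18, 1.23, 1.20]
print(qc_chart_stats(speed, target=1.20))
```

In practice one would also plot the relative frequency distribution of the readings, as the abstract describes, to check for the symmetric bell shape before trusting the ±2s limits.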
Maximum Entropy, Word-Frequency, Chinese Characters, and Multiple Meanings
Yan, Xiaoyong; Minnhagen, Petter
2015-01-01
The word-frequency distribution of a text written by an author is well accounted for by a maximum entropy distribution, the RGF (random group formation)-prediction. The RGF-distribution is completely determined by the a priori values of the total number of words in the text (M), the number of distinct words (N) and the number of repetitions of the most common word (kmax). It is here shown that this maximum entropy prediction also describes a text written in Chinese characters. In particular it is shown that although the same Chinese text written in words and Chinese characters have quite differently shaped distributions, they are nevertheless both well predicted by their respective three a priori characteristic values. It is pointed out that this is analogous to the change in the shape of the distribution when translating a given text to another language. Another consequence of the RGF-prediction is that taking a part of a long text will change the input parameters (M, N, kmax) and consequently also the shape of the frequency distribution. This is explicitly confirmed for texts written in Chinese characters. Since the RGF-prediction has no system-specific information beyond the three a priori values (M, N, kmax), any specific language characteristic has to be sought in systematic deviations from the RGF-prediction and the measured frequencies. One such systematic deviation is identified and, through a statistical information theoretical argument and an extended RGF-model, it is proposed that this deviation is caused by multiple meanings of Chinese characters. The effect is stronger for Chinese characters than for Chinese words. The relation between Zipf’s law, the Simon-model for texts and the present results are discussed. PMID:25955175
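The three a priori values (M, N, kmax) that completely determine the RGF prediction are straightforward to extract from any text; a small sketch (the sample sentence is invented, and real analyses would use a proper tokenizer):

```python
from collections import Counter

def rgf_inputs(text):
    """Extract the three a priori values the RGF prediction needs:
    M (total words), N (distinct words), kmax (occurrences of the
    most common word)."""
    counts = Counter(text.lower().split())
    m = sum(counts.values())
    n = len(counts)
    kmax = counts.most_common(1)[0][1]
    return m, n, kmax

m, n, kmax = rgf_inputs("the cat sat on the mat and the dog sat down")
print(m, n, kmax)  # → 11 8 3
```

Taking a part of a longer text changes all three values, which is exactly why the abstract notes that the shape of the frequency distribution changes for text fragments.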
International Nuclear Information System (INIS)
Parvan, A.S.
2016-01-01
The Tsallis statistics was applied to describe the experimental data on the transverse momentum distributions of hadrons. We considered the energy dependence of the parameters of the Tsallis-factorized statistics, which is now widely used for the description of the experimental transverse momentum distributions of hadrons, and the Tsallis statistics for the charged pions produced in pp collisions at high energies. We found that the results of the Tsallis-factorized statistics deviate from the results of the Tsallis statistics only at low NA61/SHINE energies when the value of the entropic parameter is close to unity. At higher energies, when the value of the entropic parameter deviates essentially from unity, the Tsallis-factorized statistics satisfactorily recovers the results of the Tsallis statistics. (orig.)
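Several of the spectra abstracts above fit Tsallis-type functions to transverse-momentum data. A minimal sketch of one common parameterization follows; the exact functional form and normalization vary between analyses, and the default pion mass and the parameter values used here are purely illustrative:

```python
import math

def tsallis_pt(pt, q, T, m=0.139, norm=1.0):
    """One common Tsallis-like transverse-momentum spectrum:
    dN/dpt ~ pt * mT * (1 + (q - 1) * mT / T) ** (-q / (q - 1)),
    with mT = sqrt(pt^2 + m^2) the transverse mass (GeV units)."""
    mt = math.sqrt(pt * pt + m * m)
    return norm * pt * mt * (1.0 + (q - 1.0) * mt / T) ** (-q / (q - 1.0))

# As q -> 1 the power-law factor approaches a Boltzmann exponential in mT
print(tsallis_pt(0.5, q=1.1, T=0.12))
```

The entropic parameter q controls the heaviness of the high-pt tail, which is why the abstract stresses the regime where q is close to versus far from unity.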
Energy Technology Data Exchange (ETDEWEB)
Mork, B; Nelson, R; Kirkendall, B; Stenvig, N
2009-11-30
Application of BPL technologies to existing overhead high-voltage power lines would benefit greatly from improved simulation tools capable of predicting performance - such as the electromagnetic fields radiated from such lines. Existing EMTP-based frequency-dependent line models are attractive since their parameters are derived from physical design dimensions which are easily obtained. However, to calculate the radiated electromagnetic fields, detailed current distributions need to be determined. This paper presents a method of using EMTP line models to determine the current distribution on the lines, as well as a technique for using these current distributions to determine the radiated electromagnetic fields.
The distribution of the pathogenic nematode Nematodirus battus in lambs is zero-inflated
DEFF Research Database (Denmark)
Denwood, Matthew; Stear, M J; Matthews, L
2008-01-01
Understanding the frequency distribution of parasites and parasite stages among hosts is essential for efficient experimental design and statistical analysis, and is also required for the development of sustainable methods of controlling infection. Nematodirus battus is one of the most important … organisms that infect sheep but the distribution of parasites among hosts is unknown. An initial analysis indicated a high frequency of animals without N. battus and with zero egg counts, suggesting the possibility of a zero-inflated distribution. We developed a Bayesian analysis using Markov chain Monte Carlo methods to estimate the parameters of the zero-inflated negative binomial distribution. The analysis of 3000 simulated data sets indicated that this method out-performed the maximum likelihood procedure. Application of this technique to faecal egg counts from lambs in a commercial upland flock …
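A quick diagnostic for the kind of zero inflation discussed above compares the observed fraction of zeros with the zero probability of a Poisson model with the same mean. This is only a crude screen, not the paper's Bayesian MCMC fit of the zero-inflated negative binomial, and the egg counts below are invented:

```python
import math

def excess_zero_fraction(counts):
    """Observed zero fraction minus the zero probability exp(-mean) of a
    Poisson model with the same mean; a large positive gap suggests
    zero inflation (e.g. many lambs with zero egg counts)."""
    mean = sum(counts) / len(counts)
    p0_poisson = math.exp(-mean)
    p0_observed = counts.count(0) / len(counts)
    return p0_observed - p0_poisson

# Hypothetical faecal egg counts: many structural zeros plus an
# overdispersed infected subgroup
eggs = [0] * 12 + [3, 7, 2, 15, 4, 9, 1, 22]
print(round(excess_zero_fraction(eggs), 3))
```

A formal analysis would also account for overdispersion (the negative binomial part), since a plain Poisson under-predicts zeros whenever the variance exceeds the mean.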
DEFF Research Database (Denmark)
Hansen, Kurt Schaldemose
2007-01-01
The statistical distribution of extreme wind excursions above a mean level, for a specified recurrence period, is of crucial importance in relation to the design of wind-sensitive structures. This is particularly true for wind turbine structures. Based on an assumption of a Gaussian "mother" distribution, Cartwright and Longuet-Higgins [1] derived an asymptotic expression for the distribution of the largest excursion from the mean level during an arbitrary recurrence period. From its inception, this celebrated expression has been widely used in wind engineering (as well as in off-shore engineering) … associated with large excursions from the mean [2]. Thus, the more extreme turbulence excursions (i.e., the upper tail of the turbulence PDF) seem to follow an Exponential-like distribution rather than a Gaussian distribution, and a Gaussian estimate may under-predict the probability of large turbulence …
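The abstract's point that a Gaussian estimate under-predicts large excursions can be made concrete by comparing tail (exceedance) probabilities directly; a small sketch in which the unit exponential rate is an arbitrary illustrative choice, not a fitted turbulence parameter:

```python
import math

def gaussian_tail(x):
    """P(X > x) for a standard normal 'mother' distribution."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def exponential_tail(x, rate=1.0):
    """P(X > x) for an exponential-like tail model of turbulence
    excursions (rate chosen purely for illustration)."""
    return math.exp(-rate * x)

# At a few standard deviations out, the Gaussian tail is already far
# below the exponential tail, i.e. it under-predicts large excursions
for x in (2.0, 3.0, 4.0):
    print(x, gaussian_tail(x), exponential_tail(x))
```

At x = 4 the Gaussian survival probability is of order 3e-5 while the unit exponential gives about 1.8e-2, a gap of nearly three orders of magnitude.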
Directory of Open Access Journals (Sweden)
Krzysztof Józwikowska
2015-06-01
Full Text Available The main goal of this work is to determine a statistical non-equilibrium distribution function for the electron and holes in semiconductor heterostructures in steady-state conditions. Based on the postulates of local equilibrium, as well as on the integral form of the weighted Gyarmati’s variational principle in the force representation, using an alternative method, we have derived general expressions, which have the form of the Fermi–Dirac distribution function with four additional components. The physical interpretation of these components has been carried out in this paper. Some numerical results of a non-equilibrium distribution function for an electron in HgCdTe structures are also presented.
A statistical model for deriving probability distributions of contamination for accidental releases
International Nuclear Information System (INIS)
ApSimon, H.M.; Davison, A.C.
1986-01-01
Results generated from a detailed long-range transport model, MESOS, simulating dispersal of a large number of hypothetical releases of radionuclides in a variety of meteorological situations over Western Europe have been used to derive a simpler statistical model, MESOSTAT. This model may be used to generate probability distributions of different levels of contamination at a receptor point 100-1000 km or so from the source (for example, across a frontier in another country) without considering individual release and dispersal scenarios. The model is embodied in a series of equations involving parameters which are determined from such factors as distance between source and receptor, nuclide decay and deposition characteristics, release duration, and geostrophic windrose at the source. Suitable geostrophic windrose data have been derived for source locations covering Western Europe. Special attention has been paid to the relatively improbable extreme values of contamination at the top end of the distribution. The MESOSTAT model and its development are described, with illustrations of its use and comparison with the original more detailed modelling techniques. (author)
International Nuclear Information System (INIS)
Mishra, Anurag; Seo, Jin Seok; Kim, Tae Hyung; Yeom, Geun Young
2015-01-01
Controlling the time-averaged ion energy distribution (IED) is becoming increasingly important in many plasma material processing applications for plasma etching and deposition. The present study reports the evolution of ion energy distributions with radio-frequency (RF) power in a pulsed dual-frequency inductive discharge and also investigates the effect of the duty ratio. The discharge was sustained using two radio frequencies, low (2 MHz) and high (13.56 MHz), at a pressure of 10 mTorr in an argon (90%) and CF4 (10%) environment. The low-frequency RF power P_2MHz was varied from 100 to 600 W, and the high-frequency power P_13.56MHz from 200 to 1200 W. Typically, the IEDs show a bimodal structure, and the energy width (the energy separation between the high- and low-energy peaks) increases with increasing P_13.56MHz; it shows the opposite trend with P_2MHz. It was observed that, at constant pulse power, the bimodal structure of the IEDs tends towards a mono-modal structure and the energy peaks shift towards the low-energy side as the duty ratio increases, owing to the mode transition (capacitive to inductive).
Haberlandt, U.; Radtke, I.
2014-01-01
Derived flood frequency analysis allows the estimation of design floods with hydrological modeling for poorly observed basins considering change and taking into account flood protection measures. There are several possible choices regarding precipitation input, discharge output and consequently the calibration of the model. The objective of this study is to compare different calibration strategies for a hydrological model considering various types of rainfall input and runoff output data sets and to propose the most suitable approach. Event based and continuous, observed hourly rainfall data as well as disaggregated daily rainfall and stochastically generated hourly rainfall data are used as input for the model. As output, short hourly and longer daily continuous flow time series as well as probability distributions of annual maximum peak flow series are employed. The performance of the strategies is evaluated using the obtained different model parameter sets for continuous simulation of discharge in an independent validation period and by comparing the model derived flood frequency distributions with the observed one. The investigations are carried out for three mesoscale catchments in northern Germany with the hydrological model HEC-HMS (Hydrologic Engineering Center's Hydrologic Modeling System). The results show that (I) the same type of precipitation input data should be used for calibration and application of the hydrological model, (II) a model calibrated using a small sample of extreme values works quite well for the simulation of continuous time series with moderate length but not vice versa, and (III) the best performance with small uncertainty is obtained when stochastic precipitation data and the observed probability distribution of peak flows are used for model calibration. This outcome suggests to calibrate a hydrological model directly on probability distributions of observed peak flows using stochastic rainfall as input if its purpose is the
Energy Technology Data Exchange (ETDEWEB)
O'Neill, C.; Waskoenig, J. [Centre for Plasma Physics, School of Maths and Physics, Queen's University Belfast, Belfast BT7 1NN (United Kingdom); Gans, T. [Centre for Plasma Physics, School of Maths and Physics, Queen's University Belfast, Belfast BT7 1NN (United Kingdom); York Plasma Institute, Department of Physics, University of York, York YO10 5DD (United Kingdom)
2012-10-08
A multi-scale numerical model based on hydrodynamic equations with semi-kinetic treatment of electrons is used to investigate the influence of dual frequency excitation on the effective electron energy distribution function (EEDF) in a radio-frequency driven atmospheric pressure plasma. It is found that variations of power density, voltage ratio, and phase relationship provide separate control over the electron density and the mean electron energy. This is exploited to directly influence both the phase dependent and time averaged effective EEDF. This enables tailoring the EEDF for enhanced control of non-equilibrium plasma chemical kinetics at ambient pressure and temperature.
Generalized Statistical Mechanics at the Onset of Chaos
Directory of Open Access Journals (Sweden)
Alberto Robledo
2013-11-01
Full Text Available Transitions to chaos in archetypal low-dimensional nonlinear maps offer real and precise model systems in which to assess proposed generalizations of statistical mechanics. The known association of chaotic dynamics with the structure of Boltzmann–Gibbs (BG) statistical mechanics has suggested the potential verification of these generalizations at the onset of chaos, when the only Lyapunov exponent vanishes and ergodic and mixing properties cease to hold. There are three well-known routes to chaos in these deterministic dissipative systems, period-doubling, quasi-periodicity and intermittency, which provide the setting in which to explore the limit of validity of the standard BG structure. It has been shown that there is a rich and intricate behavior for both the dynamics within and towards the attractors at the onset of chaos and that these two kinds of properties are linked via generalized statistical-mechanical expressions. Amongst the topics presented are: (i) permanently growing sensitivity fluctuations and their infinite family of generalized Pesin identities; (ii) the emergence of statistical-mechanical structures in the dynamics along the routes to chaos; (iii) dynamical hierarchies with modular organization; and (iv) limit distributions of sums of deterministic variables. The occurrence of generalized entropy properties in condensed-matter physical systems is illustrated by considering critical fluctuations, localization transition and glass formation. We complete our presentation with the description of the manifestations of the dynamics at the transitions to chaos in various kinds of complex systems, such as frequency and size rank distributions and complex network images of time series. We discuss the results.
International Nuclear Information System (INIS)
Smith, Michael H.; Tsyusko-Omeltchenko, Olga; Oleksyk, Taras K.
2003-01-01
There is a significant linear relationship between the standard deviation and the mean of radiocesium concentrations for samples of soils, sediments, plants, and animals from Chornobyl and nuclear sites in the United States. The universal occurrence of this relationship in all types of samples suggests that a non-normal frequency distribution should be expected. The slopes of these relationships are similar for fish and mammals from the two regions of the world, but those for plants are not. The slopes for plants are similar for aquatic and terrestrial ecosystems within each region. We hypothesize that there are relationships between the four moments of the frequency distribution of radiocesium (mean, variance, skewness, and kurtosis), and that these relationships are caused by the functional properties of the organisms and other characteristics of the ecosystem. The way in which radiocesium was distributed across the landscape does not seem to be a factor in determining the form of the frequency distribution. (author)
Garrido, Marta Isabel; Teng, Chee Leong James; Taylor, Jeremy Alexander; Rowe, Elise Genevieve; Mattingley, Jason Brett
2016-06-01
The ability to learn about regularities in the environment and to make predictions about future events is fundamental for adaptive behaviour. We have previously shown that people can implicitly encode statistical regularities and detect violations therein, as reflected in neuronal responses to unpredictable events that carry a unique prediction error signature. In the real world, however, learning about regularities will often occur in the context of competing cognitive demands. Here we asked whether learning of statistical regularities is modulated by concurrent cognitive load. We compared electroencephalographic metrics associated with responses to pure-tone sounds with frequencies sampled from narrow or wide Gaussian distributions. We showed that outliers evoked a larger response than those in the centre of the stimulus distribution (i.e., an effect of surprise) and that this difference was greater for physically identical outliers in the narrow than in the broad distribution. These results demonstrate an early neurophysiological marker of the brain's ability to implicitly encode complex statistical structure in the environment. Moreover, we manipulated concurrent cognitive load by having participants perform a visual working memory task while listening to these streams of sounds. We again observed greater prediction error responses in the narrower distribution under both low and high cognitive load. Furthermore, there was no reliable reduction in prediction error magnitude under high-relative to low-cognitive load. Our findings suggest that statistical learning is not a capacity limited process, and that it proceeds automatically even when cognitive resources are taxed by concurrent demands.
The Statistical Properties of Host Load
Directory of Open Access Journals (Sweden)
Peter A. Dinda
1999-01-01
Full Text Available Understanding how host load changes over time is instrumental in predicting the execution time of tasks or jobs, such as in dynamic load balancing and distributed soft real-time systems. To improve this understanding, we collected week-long, 1 Hz resolution traces of the Digital Unix 5 second exponential load average on over 35 different machines including production and research cluster machines, compute servers, and desktop workstations. Separate sets of traces were collected at two different times of the year. The traces capture all of the dynamic load information available to user-level programs on these machines. We present a detailed statistical analysis of these traces here, including summary statistics, distributions, and time series analysis results. Two significant new results are that load is self-similar and that it displays epochal behavior. All of the traces exhibit a high degree of self-similarity with Hurst parameters ranging from 0.73 to 0.99, strongly biased toward the top of that range. The traces also display epochal behavior in that the local frequency content of the load signal remains quite stable for long periods of time (150–450 s mean epoch length) and changes abruptly at epoch boundaries. Despite these complex behaviors, we have found that relatively simple linear models are sufficient for short-range host load prediction.
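Self-similarity of the kind quoted above (Hurst parameters 0.73 to 0.99) is often probed with the aggregated-variance method: for a self-similar process, the variance of block means scales like m**(2H - 2) with block size m. The stdlib-only sketch below is a generic estimator, not the authors' exact procedure, shown recovering H near 0.5 for uncorrelated noise:

```python
import math
import random

def hurst_aggvar(series, levels=(1, 2, 4, 8, 16, 32)):
    """Aggregated-variance Hurst estimate: fit log Var(block means)
    against log m by least squares; the slope is 2H - 2."""
    xs, ys = [], []
    for m in levels:
        blocks = [series[i:i + m] for i in range(0, len(series) - m + 1, m)]
        means = [sum(b) / m for b in blocks]
        mu = sum(means) / len(means)
        var = sum((v - mu) ** 2 for v in means) / (len(means) - 1)
        xs.append(math.log(m))
        ys.append(math.log(var))
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return 1.0 + slope / 2.0

random.seed(7)
noise = [random.gauss(0, 1) for _ in range(4096)]
print(round(hurst_aggvar(noise), 2))  # iid noise: close to 0.5
```

A genuinely self-similar load trace would instead yield an estimate well above 0.5, consistent with the 0.73–0.99 range reported in the abstract.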
Distributed Monitoring of the R² Statistic for Linear Regression
Bhaduri, Kanishka; Das, Kamalika; Giannella, Chris R.
2011-01-01
The problem of monitoring a multivariate linear regression model is relevant in studying the evolving relationship between a set of input variables (features) and one or more dependent target variables. This problem becomes challenging for large-scale data in a distributed computing environment when only a subset of instances is available at individual nodes and the local data changes frequently. Data centralization and periodic model recomputation can add high overhead to tasks like anomaly detection in such dynamic settings. Therefore, the goal is to develop techniques for monitoring and updating the model over the union of all nodes' data in a communication-efficient fashion. Correctness guarantees on such techniques are also often highly desirable, especially in safety-critical application scenarios. In this paper we develop DReMo, a distributed algorithm with very low resource overhead for monitoring the quality of a regression model in terms of its coefficient of determination (R² statistic). When the nodes collectively determine that R² has dropped below a fixed threshold, the linear regression model is recomputed via a network-wide convergecast and the updated model is broadcast back to all nodes. We show empirically, using both synthetic and real data, that our proposed method is highly communication-efficient and scalable, and also provide theoretical guarantees on correctness.
Directory of Open Access Journals (Sweden)
M. Koeppel
2018-02-01
Full Text Available Optical temperature sensors offer unique features which make them indispensable for key industries such as the energy sector. However, commercially available systems are usually designed to perform either distributed or distinct hot-spot temperature measurements, since they are restricted to one measurement principle. We have combined two concepts, fiber Bragg grating (FBG) temperature sensors and Raman-based distributed temperature sensing (DTS), to overcome these limitations. Using a technique called incoherent optical frequency domain reflectometry (IOFDR), it is possible to cascade several FBGs with the same Bragg wavelength in one fiber and simultaneously perform truly distributed Raman temperature measurements. In our lab we have achieved a standard deviation of 2.5 K or better at a spatial resolution in the order of 1 m with the Raman DTS. We have also carried out a field test in a high-voltage environment with strong magnetic fields, where we performed simultaneous Raman and FBG temperature measurements using only a single sensor fiber.
The surface distribution of chemical anomalies of Ap components in detached close binaries
International Nuclear Information System (INIS)
Kitamura, M.
1980-01-01
By estimating the orbital inclinations of non-eclipsing detached close binaries with Ap spectra, a marked statistical preference is obtained on the frequency distribution of the inclination which suggests that the abundance anomalies of Ap components tend to concentrate towards the stellar polar region. (Auth.)
International Nuclear Information System (INIS)
Zhukhlistov, A.A.; Avilov, A.S.; Ferraris, D.; Zvyagin, B.B.; Plotnikov, V.P.
1997-01-01
The method of improved automatic electron diffractometry for measuring and recording the intensities of two-dimensionally distributed reflections in texture-type electron diffraction patterns has been used for the analysis of the brucite Mg(OH)2 structure. The experimental accuracy of the measured intensities proved sufficient for studying fine structural details of the statistical distribution of hydrogen atoms over three structure positions located around the threefold axis of the brucite structure.
Digital Repository Service at National Institute of Oceanography (India)
Chandrasekaran, R.; Angusamy, N.; Manickaraj, D.S.; Loveson, V.J.; Gujar, A.R.; Chandrasekar, N.; Rajamanickam, G.V.
, G.M. (1967) Dynamic Processes and Statistical parameters compared for size frequency distribution of beach river sands. Jour. Sed. Petrol,V.37, pp.327-354. Rajamanickam, G.V. and Gujar, A.R. (1984) Sediment depositional environment in some...
Directory of Open Access Journals (Sweden)
Kukharenko Y. A.
2006-12-01
Full Text Available A diagram technique for calculating the dynamic properties of an anisotropic medium with randomly distributed inclusions (pores, cracks) is developed. The statistical description of the inclusions is given by a distribution function depending on five groups of parameters: coordinates; angles of orientation of shapes; angles of orientation of crystallographic axes; aspect ratio (in the case of ellipsoidal inclusions); and type of phase of the inclusions. Such a statistical approach allows any type and order of correlation interaction between inclusions to be taken into consideration. The diagram series for the average Green function (GF) is constructed. Accurate summation of this series leads to a nonlinear dynamic equation for the average GF (the Dyson equation). The kernel of this equation is a mass operator which depends on frequency and can be presented as a diagram series in the accurate GF. In the local approximation, the mass operator coincides with the effective complex tensor of elasticity (or conductivity). An expansion of the effective dynamic elastic (transport) tensor in distribution functions of any order is obtained. It is shown that correlations between inhomogeneities can produce anisotropy of the effective elastic and transport parameters. In the correlation approximation, the dispersion dependences of the effective elastic constants are studied, and the frequency dependence of the anisotropy coefficient of the elastic properties is obtained as a function of the statistical distribution of inclusions over coordinates (isotropic matrix and isotropic, spherical inclusions).
Grid Frequency Extreme Event Analysis and Modeling: Preprint
Energy Technology Data Exchange (ETDEWEB)
Florita, Anthony R [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Clark, Kara [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Gevorgian, Vahan [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Folgueras, Maria [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Wenger, Erin [National Renewable Energy Laboratory (NREL), Golden, CO (United States)
2017-11-01
Sudden losses of generation or load can lead to instantaneous changes in electric grid frequency and voltage. Extreme frequency events pose a major threat to grid stability. As renewable energy sources supply power to grids in increasing proportions, it becomes increasingly important to examine when and why extreme events occur to prevent destabilization of the grid. To better understand frequency events, including extrema, historic data were analyzed to fit probability distribution functions to various frequency metrics. Results showed that a standard Cauchy distribution fit the difference between the frequency nadir and prefault frequency (f_(C-A)) metric well, a standard Cauchy distribution fit the settling frequency (f_B) metric well, and a standard normal distribution fit the difference between the settling frequency and frequency nadir (f_(B-C)) metric very well. Results were inconclusive for the frequency nadir (f_C) metric, meaning it likely has a more complex distribution than those tested. This probabilistic modeling should facilitate more realistic modeling of grid faults.
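Because the Cauchy distribution mentioned above has no finite mean or variance, moment-based fitting fails for such heavy-tailed frequency metrics; a moment-free alternative uses the median for location and half the interquartile range for scale. A sketch, with invented frequency deviations rather than real grid data:

```python
import statistics

def fit_cauchy(samples):
    """Robust Cauchy fit: median estimates the location parameter and
    IQR/2 estimates the scale (for a Cauchy, Q3 - Q1 = 2 * scale)."""
    q = statistics.quantiles(samples, n=4)
    location = statistics.median(samples)
    scale = (q[2] - q[0]) / 2.0
    return location, scale

# Hypothetical settling-frequency deviations (Hz) around nominal,
# including one heavy-tail outlier
devs = [-0.031, -0.012, -0.004, 0.000, 0.002, 0.005, 0.011, 0.028, 0.090]
loc, scale = fit_cauchy(devs)
print(loc, scale)
```

The outlier at 0.090 barely moves these estimates, which is exactly the robustness one wants when the underlying distribution is heavy-tailed.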
International Nuclear Information System (INIS)
Suniaev, R.A.; Titarchuk, L.G.
1984-01-01
Analytical consideration is given to the comptonization of photons and its effects on the radiation emitted from accretion disks of compact X-ray sources, such as black holes and neutron stars. Attention is given to the photon distribution during escape from the disk, the angular distribution of hard radiation from the disk, the polarization of hard radiation and the electron temperature distribution over the optical depth. It is shown that the hard radiation spectrum is independent of the low-frequency photon source distribution. The angular distribution and polarization of the outgoing X-rays are a function of the optical depth. A Thomson approximation is used to estimate the angular distribution of the hard radiation and the polarization over the disk. The polarization results are compared with OSO-8 satellite data for Cyg X-1 and show good agreement at several energy levels. 17 references
Xu, Henglong; Yong, Jiang; Xu, Guangjian
2015-12-30
Sampling frequency is important for obtaining sufficient information in temporal research on microfauna. To determine an optimal strategy for exploring the seasonal variation in ciliated protozoa, a dataset from the Yellow Sea, northern China, was studied. Samples were collected in 24 (biweekly), 12 (monthly), 8 (bimonthly per season) and 4 (seasonal) sampling events. Compared to the 24 samplings (100%), the 12-, 8- and 4-sampling regimes recovered 94%, 94%, and 78% of the total species, respectively. For revealing the seasonal distribution, the 8-sampling regime may recover >75% of the seasonal variance, whereas the traditional 4-sampling regime explains considerably less. With increasing sampling frequency, the biotic data showed stronger correlations with seasonal variables (e.g., temperature, salinity) in combination with nutrients. It is suggested that 8 sampling events per year may be an optimal sampling strategy for seasonal research on ciliated protozoa in marine ecosystems. Copyright © 2015 Elsevier Ltd. All rights reserved.
Czech Academy of Sciences Publication Activity Database
Netopilík, Miloš; Kratochvíl, Pavel
2006-01-01
Roč. 55, č. 2 (2006), s. 196-203 ISSN 0959-8103 R&D Projects: GA AV ČR IAA100500501; GA AV ČR IAA4050403; GA AV ČR IAA4050409; GA ČR GA203/03/0617 Institutional research plan: CEZ:AV0Z40500505 Keywords : statistical branching * tetrafunctional branch points * molecular-weight distribution Subject RIV: CD - Macromolecular Chemistry Impact factor: 1.475, year: 2006
International Nuclear Information System (INIS)
Tiana-Alsina, J.; Torrent, M. C.; Masoller, C.; Garcia-Ojalvo, J.; Rosso, O. A.
2010-01-01
Low-frequency fluctuations (LFFs) represent a dynamical instability that occurs in semiconductor lasers when they are operated near the lasing threshold and subject to moderate optical feedback. LFFs consist of sudden power dropouts followed by gradual, stepwise recoveries. We analyze experimental time series of intensity dropouts and quantify the complexity of the underlying dynamics employing two tools from information theory, namely, Shannon's entropy and the Martin, Plastino, and Rosso statistical complexity measure. These measures are computed using a method based on ordinal patterns, by which the relative length and ordering of consecutive interdropout intervals (i.e., the time intervals between consecutive intensity dropouts) are analyzed, disregarding the precise timing of the dropouts and the absolute durations of the interdropout intervals. We show that this methodology is suitable for quantifying subtle characteristics of the LFFs, and in particular the transition to fully developed chaos that takes place when the laser's pump current is increased. Our method shows that the statistical complexity of the laser does not increase continuously with the pump current, but levels off before reaching the coherence collapse regime. This behavior coincides with that of the first- and second-order correlations of the interdropout intervals, suggesting that these correlations, and not the chaotic behavior, are what determine the level of complexity of the laser's dynamics. These results hold for two different dynamical regimes, namely, sustained LFFs and coexistence between LFFs and steady-state emission.
Design of nuclear fuel cells by means of a statistical analysis and a sensibility study
International Nuclear Information System (INIS)
Jauregui C, V.; Castillo M, J. A.; Ortiz S, J. J.; Montes T, J. L.; Perusquia del C, R.
2013-10-01
This work presents the results of a statistical analysis of nuclear fuel cell performance, considering the frequencies with which fuel rods are selected during cell design. The rods used in the cell designs were selected in three ways: in the first, the plotted frequency resembles a normal distribution; in the second, the frequency graph resembles an inverted chi-square (χ²) distribution; and in the third, the rods are chosen at random. The heuristic techniques used for cell design were neural networks, ant colonies, and a hybrid of scatter search and path relinking. The statistical analysis of the cell designs considered the local power peaking factor and the neutron infinite multiplication factor (k∞). In addition, the performance of the designed cells was analyzed by verifying the positions of the rods containing gadolinium. The results show that it is possible to design nuclear fuel cells with good performance when the selection frequency of the rods used in their design is taken into account. (Author)
Primary Frequency Response with Aggregated DERs: Preprint
Energy Technology Data Exchange (ETDEWEB)
Guggilam, Swaroop S.; Dhople, Sairaj V.; Zhao, Changhong; Dall' Anese, Emiliano; Chen, Yu Christine
2017-03-03
Power networks must withstand a variety of disturbances that affect system frequency, a problem compounded by the increasing integration of intermittent renewable generation. Following a large-signal generation or load disturbance, the system frequency excursion is arrested by primary frequency control, provided by governor action in synchronous generators. In this work, we propose a framework for distributed energy resources (DERs) deployed in distribution networks to provide (supplemental) primary frequency response. In particular, we demonstrate how power-frequency droop slopes for individual DERs can be designed so that the distribution feeder presents a guaranteed frequency-regulation characteristic at the feeder head. Furthermore, the droop slopes are engineered such that the injections of individual DERs conform to a well-defined fairness objective that does not penalize them for their location on the distribution feeder. Time-domain simulations of an illustrative combined transmission and distribution network with frequency-responsive DERs validate the approach.
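The aggregation idea behind the abstract can be illustrated with elementary droop algebra. The allocation rule below (slopes inversely proportional to a hypothetical DER capacity) is an illustrative assumption, not the paper's actual design procedure, but it shows how individual droops combine into one feeder-head characteristic:

```python
def droop_response(delta_f, slopes):
    """Power adjustments (kW) of each DER for a frequency deviation
    delta_f (Hz) under linear power-frequency droop with slope m_i
    in Hz per kW: delta_p_i = -delta_f / m_i."""
    return [-delta_f / m for m in slopes]

def equivalent_droop(slopes):
    """Droop slope seen at the feeder head: parallel droops combine
    like conductances, 1/m_eq = sum(1/m_i)."""
    return 1.0 / sum(1.0 / m for m in slopes)

# Illustrative allocation (not the paper's optimization): slopes inversely
# proportional to DER capacity make the feeder head present a chosen
# target characteristic m_target, independent of DER placement.
caps = [5.0, 10.0, 25.0]         # hypothetical DER ratings, kW
m_target = 0.05                  # desired feeder-head slope, Hz per kW
slopes = [m_target * sum(caps) / c for c in caps]
print(equivalent_droop(slopes))  # 0.05
```

A 0.1 Hz under-frequency event then draws a total of 0.1 / m_target = 2 kW from the feeder, split across DERs in proportion to their assumed capacities.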
All-Optical Frequency Modulated High Pressure MEMS Sensor for Remote and Distributed Sensing
DEFF Research Database (Denmark)
Reck, Kasper; Thomsen, Erik Vilain; Hansen, Ole
2011-01-01
We present the design, fabrication and characterization of a new all-optical frequency modulated pressure sensor. Using the tangential strain in a circular membrane, a waveguide with an integrated nanoscale Bragg grating is strained longitudinally proportional to the applied pressure causing...... a shift in the Bragg wavelength. The simple and robust design combined with the small chip area of 1 × 1.8 mm2 makes the sensor ideally suited for remote and distributed sensing in harsh environments and where miniaturized sensors are required. The sensor is designed for high pressure applications up...
DEFF Research Database (Denmark)
Wu, Dan; Tang, Fen; Dragicevic, Tomislav
2013-01-01
In this paper, a distributed coordinated control scheme based on the frequency-bus-signaling (FBS) method for a low-voltage three-phase AC microgrid is proposed. The control scheme comprises two levels. First, a primary local control, which differs between the DGs and the ESS, is proposed. The ESS...... control is implemented to restore the frequency deviation produced by the primary ESS controller while preserving the coordinated control performance. Real-time simulation results show the feasibility of the proposed approach by demonstrating the operation of the microgrid in different scenarios....
Investigation of the statistical distance to reach stationary distributions
International Nuclear Information System (INIS)
Nicholson, S.B.; Kim, Eun-jin
2015-01-01
The thermodynamic length gives a Riemannian metric to a system's phase space. Here we extend the traditional thermodynamic length to an information length (L) out of equilibrium and examine its properties. We utilise L as a methodology for analysing non-equilibrium systems without invoking conventional assumptions such as Gaussian statistics, detailed balance, a priori known constraints, or ergodicity, and numerically examine how L evolves in time for the logistic map in the chaotic regime, depending on initial conditions. To this end, we propose a discrete version of L which is mathematically well defined by taking a set-theoretic approach. We identify the areas of phase space where the loss of information of the system takes place most rapidly. In particular, we present an interesting result that the unstable fixed points turn out to drive the logistic map most efficiently towards a stationary distribution through L. - Highlights: • Define a set-theoretic version of the discrete thermodynamic length. • These sets allow one to analyse systems having zero probabilities in their evolution. • Numerically analyse the logistic map using the thermodynamic length. • Show how the unstable fixed points most efficiently lead the system to equilibrium.
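A crude numerical flavour of the idea can be obtained by evolving an ensemble under the logistic map and accumulating the distance between successive histogrammed distributions. The Hellinger-type increment used below is a simplifying assumption, a proxy for the paper's set-theoretic definition of L, not the authors' exact construction:

```python
import numpy as np

def information_length(r=4.0, n_ens=100000, n_bins=50, n_steps=30, seed=0):
    """Proxy for an information length: evolve an ensemble under the
    logistic map and sum a Hellinger-like distance between successive
    binned distributions (an assumption, not the paper's exact L)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(0.4, 0.6, n_ens)            # narrow initial distribution
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    p_old = np.histogram(x, bins=edges)[0] / n_ens
    total = 0.0
    for _ in range(n_steps):
        x = r * x * (1.0 - x)                   # logistic map step
        p_new = np.histogram(x, bins=edges)[0] / n_ens
        total += np.sqrt(np.sum((np.sqrt(p_new) - np.sqrt(p_old)) ** 2))
        p_old = p_new
    return float(total)

print(information_length())
```

Changing the initial interval (here 0.4 to 0.6) changes the accumulated distance, which is the dependence on initial conditions the abstract examines.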
Statistics for experimentalists
Cooper, B E
2014-01-01
Statistics for Experimentalists aims to provide experimental scientists with a working knowledge of statistical methods and search approaches to the analysis of data. The book first elaborates on probability and continuous probability distributions. Discussions focus on properties of continuous random variables and normal variables, independence of two random variables, central moments of a continuous distribution, prediction from a normal distribution, binomial probabilities, and multiplication of probabilities and independence. The text then examines estimation and tests of significance. Topics include estimators and estimates, expected values, minimum variance linear unbiased estimators, sufficient estimators, methods of maximum likelihood and least squares, and the test of significance method. The manuscript ponders on distribution-free tests, Poisson process and counting problems, correlation and function fitting, balanced incomplete randomized block designs and the analysis of covariance, and experiment...
Real-world visual statistics and infants' first-learned object names.
Clerkin, Elizabeth M; Hart, Elizabeth; Rehg, James M; Yu, Chen; Smith, Linda B
2017-01-05
We offer a new solution to the unsolved problem of how infants break into word learning based on the visual statistics of everyday infant-perspective scenes. Images from head camera video captured by 8 1/2 to 10 1/2 month-old infants at 147 at-home mealtime events were analysed for the objects in view. The images were found to be highly cluttered with many different objects in view. However, the frequency distribution of object categories was extremely right skewed such that a very small set of objects was pervasively present, a fact that may substantially reduce the problem of referential ambiguity. The statistical structure of objects in these infant egocentric scenes differs markedly from that in the training sets used in computational models and in experiments on statistical word-referent learning. Therefore, the results also indicate a need to re-examine current explanations of how infants break into word learning. This article is part of the themed issue 'New frontiers for statistical learning in the cognitive sciences'. © 2016 The Author(s).
DEFF Research Database (Denmark)
Graversen, Carina; Malver, Lasse P; Kurita, Geana P
2015-01-01
Opioids alter resting state brain oscillations by multiple and complex factors, which are still to be elucidated. To increase our knowledge, multi-channel electroencephalography (EEG) was subjected to multivariate pattern analysis (MVPA), to identify the most descriptive frequency bands and scalp...... distributions were extracted by a continuous wavelet transform and normalized into delta, theta, alpha, beta and gamma bands. Alterations relative to pre-treatment responses were calculated for all channels and used as input to the MVPA. Compared to placebo, remifentanil increased the delta band and decreased...... the theta and alpha band oscillations as a mean over all channels (all p ≤ 0.007). The most discriminative channels in these frequency bands were F1 in delta (83.33%, p = 0.0023) and theta bands (95.24%, p band (80.95%, p = 0.0054). These alterations were correlated...
International Nuclear Information System (INIS)
Brooke, J.P.
1977-11-01
Selected sites in the United States have been analyzed geomathematically as part of the technical support program to develop site suitability criteria for High Level Nuclear Waste (HLW) repositories. Using published geological maps and other information, statistical evaluations of the fault patterns and other significant geological features have been completed for 16 selected localities. The observed frequency patterns were compared to theoretical patterns in order to obtain a predictive model for faults at each location. In general, the patterns approximate an exponential distribution function, with the exception of Edinburgh, Scotland, the control area. The fault pattern of rocks at Edinburgh closely approximates a negative binomial frequency distribution. The range of fault occurrences encountered during the investigation varied from a low of 0.15 to a high of 10 faults per square mile. Faulting is only one factor in the overall geological evaluation of HLW sites. A general exploration program plan to aid in investigating HLW repository sites has been completed using standard mineral exploration techniques. For the preliminary examination of the suitability of potential sites, present economic conditions indicate the scanning and reconnaissance exploration stages will cost approximately $1,000,000. These would proceed in a logical sequence so that the selected site optimizes the geological factors. The reconnaissance stage of mineral exploration normally utilizes 'saturation geophysics' to obtain complete geological information. This approach is recommended in the preliminary HLW site investigation process as the most economical and rewarding. Exploration games have been designed for potential sites in the eastern and western U.S. The game matrix approach is recommended as a suitable technique for the allocation of resources in a search problem during this preliminary phase.
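Fitting a negative binomial to per-cell fault counts, as done for the Edinburgh control area, can be sketched with a method-of-moments fit. The counts below are synthetic, not the report's data, and the moment fit is one simple choice among several (maximum likelihood is another):

```python
import numpy as np
from scipy import stats

def nbinom_moment_fit(counts):
    """Method-of-moments negative binomial fit to per-cell fault counts;
    requires overdispersion (sample variance > sample mean)."""
    m, v = counts.mean(), counts.var()
    p = m / v                  # success probability
    n = m * p / (1.0 - p)      # number-of-failures parameter
    return n, p

# Synthetic fault counts per map cell (illustrative only)
rng = np.random.default_rng(1)
counts = rng.negative_binomial(3, 0.4, size=2000)
n, p = nbinom_moment_fit(counts)

# expected vs observed cell frequencies, for a chi-square style comparison
k = np.arange(0, 15)
expected = stats.nbinom.pmf(k, n, p) * counts.size
observed = np.bincount(counts, minlength=15)[:15]
```

By construction the fitted distribution reproduces the sample mean and variance exactly, so a goodness-of-fit comparison (as in the report's exponential-vs-negative-binomial distinction) then rests on the shape of the frequency pattern.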
Kato, Takeyoshi; Sugimoto, Hiroyuki; Suzuoki, Yasuo
We established a procedure for estimating regional electricity demand and the regional potential capacity of distributed generators (DGs) by using a grid-square statistics data set. A photovoltaic power system (PV system) for residential use and a co-generation system (CGS) for both residential and commercial use were taken into account. As an example, results for Aichi prefecture are presented in this paper. Statistical data on the number of households by family type and the number of employees by business category for about 4000 grid squares of 1 km × 1 km were used to estimate the floor space and the electricity demand distribution. The rooftop area available for installing PV systems was also estimated with the grid-square statistics data set. Considering the relation between the capacity of an existing CGS and a scale index of the building where the CGS is installed, the potential capacity of CGS was estimated for three business categories, i.e., hotels, hospitals, and stores. In some regions, the potential capacity of PV systems was estimated to be about 10,000 kW/km2, which corresponds to the density of existing areas with intensive installation of PV systems. Finally, we discuss the ratio of the regional potential capacity of DGs to the regional maximum electricity demand in order to deduce the appropriate capacity of DGs in a model of a future electricity distribution system.
International Nuclear Information System (INIS)
Ng, Felix S.L.
2016-01-01
We develop a statistical-mechanical model of one-dimensional normal grain growth that does not require any drift-velocity parameterization for grain size, such as used in the continuity equation of traditional mean-field theories. The model tracks the population by considering grain sizes in neighbour pairs; the probability of a pair having neighbours of certain sizes is determined by the size-frequency distribution of all pairs. Accordingly, the evolution obeys a partial integro-differential equation (PIDE) over ‘grain size versus neighbour grain size’ space, so that the grain-size distribution is a projection of the PIDE's solution. This model, which is applicable before as well as after statistically self-similar grain growth has been reached, shows that the traditional continuity equation is invalid outside this state. During statistically self-similar growth, the PIDE correctly predicts the coarsening rate, invariant grain-size distribution and spatial grain size correlations observed in direct simulations. The PIDE is then reducible to the standard continuity equation, and we derive an explicit expression for the drift velocity. It should be possible to formulate similar parameterization-free models of normal grain growth in two and three dimensions.
Extreme events in the Mediterranean area: A mixed deterministic-statistical approach
International Nuclear Information System (INIS)
Speranza, A.; Tartaglione, N.
2006-01-01
Statistical inference suffers from severe limitations when applied to extreme meteo-climatic events. A fundamental theorem provides a constructive theory for a universal distribution law of extremes (the Generalized Extreme Value distribution). Use of this theorem and of its derivations is nowadays quite common. However, when applying it, the selected events should be real extremes. In practical applications, a major source of error is the fact that there is no strict criterion for selecting extremes and, in order to enlarge the statistical sample, very mild selection criteria are often used. The theorem in question applies to stationary processes; when a trend is introduced, inference becomes even more problematic. Experience shows that any available a priori knowledge concerning the system can play a fundamental role in the analysis, in particular if it lowers the dimensionality of the parameter space to be explored. The inference procedures then serve the purpose of testing the reliability of inductive hypotheses, rather than proving them. Within this general context, an analysis of the hypothesis that the frequency and/or intensity of extreme weather events in the Mediterranean area may be changing is proposed. The analysis is based on a combined deterministic-statistical approach: dynamical analysis of intense perturbations is combined with statistical techniques, in order to formulate the problem in such a way that meaningful conclusions may be reached.
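The block-maxima route to the Generalized Extreme Value law mentioned above can be sketched directly with scipy. The daily series here is synthetic (Gumbel-distributed, so the fitted GEV shape should come out near zero); this is an illustration of the fitting machinery, not the paper's Mediterranean analysis:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# 60 synthetic "years" of 365 daily values; annual block maxima are
# the "real extremes" to which the GEV theorem applies
daily = rng.gumbel(loc=10.0, scale=2.0, size=(60, 365))
annual_max = daily.max(axis=1)

# maximum-likelihood GEV fit (scipy's genextreme parameterization)
shape, loc, scale = stats.genextreme.fit(annual_max)

# 100-year return level: the 0.99 quantile of the fitted distribution
q100 = stats.genextreme.ppf(0.99, shape, loc=loc, scale=scale)
```

Selecting non-extreme events to "enlarge the sample", as the abstract warns against, would amount to fitting the GEV to values that are not block maxima, for which the theorem gives no guarantee.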
Do regional methods really help reduce uncertainties in flood frequency analyses?
Cong Nguyen, Chi; Payrastre, Olivier; Gaume, Eric
2013-04-01
Flood frequency analyses are often based on continuous measured series at gauge sites. However, the available data sets are usually too short to provide reliable estimates of extreme design floods. To reduce the estimation uncertainties, the analyzed data sets have to be extended either in time, making use of historical and paleoflood data, or in space, merging data sets considered statistically homogeneous to build large regional data samples. Nevertheless, the advantage of regional analyses, the large increase in the size of the studied data sets, may be counterbalanced by possible heterogeneities of the merged sets. The application and comparison of four flood frequency analysis methods in two regions affected by flash floods in the south of France (Ardèche and Var) illustrates how this balance between the number of records and possible heterogeneities plays out in real-world applications. The four tested methods are: (1) a local statistical analysis based on the existing series of measured discharges, (2) a local analysis also leveraging the existing information on historical floods, (3) a standard regional flood frequency analysis based on existing measured series at gauged sites, and (4) a modified regional analysis including estimated extreme peak discharges at ungauged sites. Monte Carlo simulations are conducted to simulate a large number of discharge series with characteristics similar to the observed ones (type of statistical distribution, number of sites and records) to evaluate to which extent the results obtained in these case studies can be generalized. The two case studies indicate that even small statistical heterogeneities, which are not detected by the standard homogeneity tests implemented in regional flood frequency studies, may drastically limit the usefulness of such approaches. On the other hand, the results show that the valuation of information on extreme events, either historical flood events at gauged
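A miniature Monte Carlo experiment in the spirit described above shows why pooling homogeneous records is attractive in the first place. This toy pools raw discharges from identical sites, which is a simplifying assumption; real regional analyses rescale records by an index flood before merging, and the heterogeneity problem the abstract describes is not modelled here:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
true = stats.gumbel_r(loc=100.0, scale=30.0)   # "true" flood distribution
q100_true = true.ppf(0.99)                     # true 100-yr quantile

def quantile_rmse(n_sites, n_years=30, n_rep=200):
    """RMSE of the fitted 100-yr quantile when pooling n_sites
    homogeneous records of n_years each."""
    errs = []
    for _ in range(n_rep):
        sample = true.rvs(n_sites * n_years, random_state=rng)
        loc, scale = stats.gumbel_r.fit(sample)
        errs.append(stats.gumbel_r.ppf(0.99, loc, scale) - q100_true)
    return float(np.sqrt(np.mean(np.square(errs))))

r1, r4 = quantile_rmse(1), quantile_rmse(4)
print(r1, r4)   # pooling four homogeneous sites shrinks the RMSE
```

The abstract's point is the flip side of this gain: if one of the pooled sites secretly follows a different distribution, the pooled estimate converges quickly, but toward the wrong quantile.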
Thermal activation in statistical clusters of magnetic nanoparticles
International Nuclear Information System (INIS)
Hovorka, O
2017-01-01
This article presents a kinetic Monte-Carlo study of thermally activated magnetisation dynamics in clusters of statistically distributed magnetic nanoparticles. The structure of clusters is assumed to be of fractal nature, consistently with recent observations of magnetic particle aggregation in cellular environments. The computed magnetisation relaxation decay and frequency-dependent hysteresis loops are seen to significantly depend on the fractal dimension of aggregates, leading to accelerated magnetisation relaxation and reduction in the size of hysteresis loops as the fractal dimension increases from one-dimensional-like to three-dimensional-like clusters. Discussed are implications for applications in nanomedicine, such as magnetic hyperthermia or magnetic particle imaging. (paper)
Tomasovych, Adam; Kidwell, Susan M.; Foygel Barber, Rina
2015-04-01
Age-frequency distributions of dead skeletal material, which capture information on the time elapsed since the death of individuals on the landscape or seabed, provide decadal- to millennial-scale windows into the history of production and into the processes that lead to skeletal disintegration and burial. However, models quantifying the dynamics of skeletal loss have assumed that skeletal production is constant during the accumulation of death assemblages. Here, we assess the joint effects of temporally variable production and skeletal loss on the shape of postmortem age-frequency distributions. We show that the modes of such distributions tend to be shifted to younger age cohorts relative to the true timing of past production pulses. This shift in the apparent timing of past production will be larger where loss rates are high and/or the decline in production is slow. We apply the models combining the dynamics of loss and production to death assemblages of the deposit-feeding bivalve Nuculana taphria from the Southern California continental shelf, finding that (1) an onshore-offshore gradient in time averaging is dominated by a gradient in the timing of production, corresponding to the tracking of shallow-water habitats under sea-level rise, and (2) model estimates of the timing of past production are in good agreement with an independent sea-level curve.
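The mode-shift effect described above falls out of a one-line model: cohorts produced a time-steps ago survive in proportion to exp(-loss_rate * a). The production history and loss rates below are illustrative assumptions, not the paper's fitted values:

```python
import numpy as np

def age_frequency(production, loss_rate):
    """Postmortem age-frequency distribution under time-varying production:
    production[a] = shells produced a time-steps ago; each cohort then
    decays exponentially at `loss_rate` per step (sketch of the idea)."""
    ages = np.arange(len(production))
    surviving = np.asarray(production) * np.exp(-loss_rate * ages)
    return surviving / surviving.sum()

# A production pulse centred 10 steps ago, declining slowly on each side:
production = np.exp(-0.02 * np.abs(np.arange(50) - 10))
for loss in (0.01, 0.30):
    mode = int(np.argmax(age_frequency(production, loss)))
    print(loss, mode)   # low loss keeps the mode at the true pulse age (10);
                        # high loss drags the apparent mode toward age 0
```

This is exactly the bias the abstract quantifies: the higher the loss rate relative to the production decline, the younger the apparent timing of the pulse.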
International Nuclear Information System (INIS)
Fukue, Hisayoshi; Mochizuki, Yoji; Nakamura, Harushige; Kobo, Hiroshi; Nitta, Tetsuo; Kawakami, Kiyoshi
1986-01-01
A pipe bending apparatus has recently been developed that applies high-frequency induction heating. However, the smaller the pipe bending radius, the greater the reduction in wall thickness and the ovality of the pipe cross-section, which makes it impossible to manufacture pipe bends that meet the nuclear pipe design code. To solve this problem it is crucial to obtain the temperature distribution in the moving pipe, which is calculated from the following boundary conditions: the distribution of the heat generation rate, and that of the heat transfer coefficient of the cooling water. In the course of analyzing these distributions, the following results were obtained. (1) The distribution of the heat generation rate is determined by the sink of the energy flux of the Poynting vector. The coil efficiency thus calculated was sixty percent, a figure that accords with the test data. (2) The distribution of the heat transfer coefficient of the cooling water is mainly determined by the rate of liquid film heat transfer, but departure from nucleate boiling and dryout have to be taken into consideration. (3) The TRUMP code was modified so that the temperature distribution in moving pipes can be calculated by taking these boundary conditions into account. The calculated results were in accordance with the test data. (author)
Statistical Approaches for Spatiotemporal Prediction of Low Flows
Fangmann, A.; Haberlandt, U.
2017-12-01
An adequate assessment of regional climate change impacts on streamflow requires the integration of various sources of information and modeling approaches. This study proposes simple statistical tools for inclusion into model ensembles, which are fast and straightforward in their application, yet able to yield accurate streamflow predictions in time and space. Target variables for all approaches are annual low flow indices derived from a data set of 51 records of average daily discharge for northwestern Germany. The models require input of climatic data in the form of meteorological drought indices, derived from observed daily climatic variables, averaged over the streamflow gauges' catchment areas. Four different modeling approaches are analyzed. The basis for all of them are multiple linear regression models that estimate low flows as a function of a set of meteorological indices and/or physiographic and climatic catchment descriptors. In the first method, individual regression models are fitted at each station, predicting annual low flow values from a set of annual meteorological indices, which are subsequently regionalized using a set of catchment characteristics. The second method combines temporal and spatial prediction within a single panel data regression model, allowing estimation of annual low flow values from input of both annual meteorological indices and catchment descriptors. The third and fourth methods represent non-stationary low flow frequency analyses and require fitting of regional distribution functions. Method three performs a spatiotemporal prediction of an index value, method four an estimation of L-moments that adapt the regional frequency distribution to the at-site conditions. The results show that method two outperforms successive prediction in time and space. Method three also shows a high performance in the near future period, but since it relies on a stationary distribution, its application for prediction of far future changes may be
Griffiths, Dawn
2009-01-01
Wouldn't it be great if there were a statistics book that made histograms, probability distributions, and chi square analysis more enjoyable than going to the dentist? Head First Statistics brings this typically dry subject to life, teaching you everything you want and need to know about statistics through engaging, interactive, and thought-provoking material, full of puzzles, stories, quizzes, visual aids, and real-world examples. Whether you're a student, a professional, or just curious about statistical analysis, Head First's brain-friendly formula helps you get a firm grasp of statistics
Ipsen, Andreas
2015-02-03
Despite the widespread use of mass spectrometry (MS) in a broad range of disciplines, the nature of MS data remains very poorly understood, and this places important constraints on the quality of MS data analysis as well as on the effectiveness of MS instrument design. In the following, a procedure for calculating the statistical distribution of the mass peak intensity for MS instruments that use analog-to-digital converters (ADCs) and electron multipliers is presented. It is demonstrated that the physical processes underlying the data-generation process, from the generation of the ions to the signal induced at the detector, and on to the digitization of the resulting voltage pulse, result in data that can be well-approximated by a Gaussian distribution whose mean and variance are determined by physically meaningful instrumental parameters. This allows for a very precise understanding of the signal-to-noise ratio of mass peak intensities and suggests novel ways of improving it. Moreover, it is a prerequisite for being able to address virtually all data analytical problems in downstream analyses in a statistically rigorous manner. The model is validated with experimental data.
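The data-generation chain described above (Poisson ion arrivals, stochastic multiplier gain, digitization) can be simulated end to end in a few lines. The exponential gain model and all parameter values below are illustrative assumptions, not instrument-derived quantities from the paper:

```python
import numpy as np

rng = np.random.default_rng(5)
n_spectra = 20000
ions = rng.poisson(lam=200, size=n_spectra)          # ions per mass peak
# per-ion electron-multiplier gain; an exponential gain distribution is
# a common simplification and is an assumption here
intensity = np.array([rng.exponential(1.0, n).sum() for n in ions])
digitized = np.round(intensity * 4.0) / 4.0          # coarse ADC step
mean, var = digitized.mean(), digitized.var()
# with ~200 ions per peak the compound distribution is near-Gaussian:
# mean ~ lam * E[gain] = 200, variance ~ lam * E[gain^2] = 400
print(mean, var)
```

The near-Gaussian outcome, with mean and variance tied to physically meaningful parameters (ion rate and gain moments), mirrors the paper's central claim about the signal-to-noise structure of mass peak intensities.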
Jansma, J Martijn; de Zwart, Jacco A; van Gelderen, Peter; Duyn, Jeff H; Drevets, Wayne C; Furey, Maura L
2013-05-15
Technical developments in MRI have improved signal-to-noise ratios, allowing the use of analysis methods such as finite impulse response (FIR) analysis of rapid event-related functional MRI (er-fMRI). FIR is one of the most informative analysis methods, as it determines the onset and full shape of the hemodynamic response function (HRF) without any a priori assumptions. FIR is, however, vulnerable to multicollinearity, which is directly related to the distribution of stimuli over time. Efficiency can be optimized by simplifying a design and restricting the stimulus distribution to specific sequences, while greater design flexibility necessarily reduces efficiency. However, the actual effect of efficiency on fMRI results has never been tested in vivo. Thus, it is currently difficult to make an informed choice between protocol flexibility and statistical efficiency. The main goal of this study was to assign concrete fMRI signal-to-noise values to the abstract scale of FIR statistical efficiency. Ten subjects repeated a perception task with five random and m-sequence based protocols, with varying but, according to the literature, acceptable levels of multicollinearity. Results indicated substantial differences in signal standard deviation, whose level was a function of multicollinearity. Experimental protocols varied by up to 55.4% in standard deviation. The results confirm that the quality of fMRI in an FIR analysis can significantly and substantially vary with statistical efficiency. Our in vivo measurements can be used to aid in making an informed decision between freedom in protocol design and statistical efficiency. Published by Elsevier B.V.
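The link between stimulus timing, multicollinearity, and FIR efficiency can be made concrete with a small design-matrix experiment. The efficiency summary 1/trace((X'X)^-1) is one standard choice; the scan counts, window length, and onset schemes below are illustrative, not the study's actual protocols:

```python
import numpy as np

def fir_design_matrix(onsets, n_scans, hrf_len=12):
    """FIR design matrix: one indicator column per post-stimulus bin."""
    X = np.zeros((n_scans, hrf_len))
    for t in onsets:
        for k in range(hrf_len):
            if t + k < n_scans:
                X[t + k, k] = 1.0
    return X

def fir_efficiency(X):
    """A common efficiency summary for FIR estimation: the inverse of
    the summed estimator variances, 1 / trace((X'X)^-1). Multicollinear
    designs inflate trace((X'X)^-1) and hence score low."""
    return 1.0 / np.trace(np.linalg.inv(X.T @ X))

rng = np.random.default_rng(0)
n_scans = 400
# 30 events at a rigid 4-scan spacing (shorter than the 12-bin FIR
# window, hence strongly overlapping columns) vs. 30 jittered events
fixed = np.arange(0, 30 * 4, 4)
jittered = np.sort(rng.choice(np.arange(0, n_scans - 12), 30, replace=False))
e_fixed = fir_efficiency(fir_design_matrix(fixed, n_scans))
e_jitter = fir_efficiency(fir_design_matrix(jittered, n_scans))
print(e_fixed, e_jitter)   # the collinear fixed-ISI design scores lower
```

This is the abstract's trade-off in miniature: the stimulus schedule alone, before any data are collected, fixes how well the HRF bins can be separated.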
Directory of Open Access Journals (Sweden)
Li Karen
2008-12-01
Full Text Available Abstract Background Widely used substitution models for proteins, such as the Jones-Taylor-Thornton (JTT) or Whelan and Goldman (WAG) models, are based on empirical amino acid interchange matrices estimated from databases of protein alignments that incorporate the average amino acid frequencies of the data set under examination (e.g., JTT + F). Variation in the evolutionary process between sites is typically modelled by a rates-across-sites distribution such as the gamma (Γ) distribution. However, sites in proteins also vary in the kinds of amino acid interchanges that are favoured, a feature that is ignored by standard empirical substitution matrices. Here we examine the degree to which the pattern of evolution at sites differs from that expected based on empirical amino acid substitution models and evaluate the impact of these deviations on phylogenetic estimation. Results We analyzed 21 large protein alignments with two statistical tests designed to detect deviation of site-specific amino acid distributions from data simulated under the standard empirical substitution model: JTT + F + Γ. We found that the number of states at a given site is, on average, smaller and the frequencies of these states are less uniform than expected based on a JTT + F + Γ substitution model. With a four-taxon example, we show that phylogenetic estimation under the JTT + F + Γ model is seriously biased by a long-branch attraction artefact if the data are simulated under a model utilizing the observed site-specific amino acid frequencies from an alignment. Principal components analyses indicate the existence of at least four major site-specific frequency classes in these 21 protein alignments. Using a mixture model with these four separate classes of site-specific state frequencies plus a fifth class of global frequencies (the JTT + cF + Γ model), significant improvements in model fit for real data sets can be achieved. This simple mixture model also reduces the long
Bustamante, Javier; Seoane, Javier
2004-01-01
Aim To test the effectiveness of statistical models based on explanatory environmental variables vs. existing distribution information (maps and breeding atlas), for predicting the distribution of four species of raptors (family Accipitridae): common buzzard Buteo buteo (Linnaeus, 1758), short-toed eagle Circaetus gallicus (Gmelin, 1788), booted eagle Hieraaetus pennatus (Gmelin, 1788) and black kite Milvus migrans (Boddaert, 1783). Location Andalusia, southe...
Vlad, Marcel Ovidiu; Tsuchiya, Masa; Oefner, Peter; Ross, John
2002-01-01
We investigate the statistical properties of systems with random chemical composition and try to obtain a theoretical derivation of the self-similar Dirichlet distribution, which is used empirically in molecular biology, environmental chemistry, and geochemistry. We consider a system made up of many chemical species and assume that the statistical distribution of the abundance of each chemical species in the system is the result of a succession of a variable number of random dilution events, which can be described by using the renormalization-group theory. A Bayesian approach is used for evaluating the probability density of the chemical composition of the system in terms of the probability densities of the abundances of the different chemical species. We show that for large cascades of dilution events, the probability density of the composition vector of the system is given by a self-similar probability density of the Dirichlet type. We also give an alternative formal derivation for the Dirichlet law based on the maximum entropy approach, by assuming that the average values of the chemical potentials of different species, expressed in terms of molar fractions, are constant. Although the maximum entropy approach leads formally to the Dirichlet distribution, it does not clarify the physical origin of the Dirichlet statistics and has serious limitations. The random theory of dilution provides a physical picture for the emergence of Dirichlet statistics and makes it possible to investigate its validity range. We discuss the implications of our theory in molecular biology, geochemistry, and environmental science.
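The self-similarity of the Dirichlet law that the abstract derives can be checked numerically via its aggregation property: lumping components of a Dirichlet composition yields another Dirichlet with the corresponding concentration parameters summed. The parameter values below are arbitrary illustrations:

```python
import numpy as np

rng = np.random.default_rng(7)
alpha = np.array([0.5, 1.0, 1.5, 2.0])      # concentration parameters
samples = rng.dirichlet(alpha, size=200000)  # compositions of 4 "species"

# Self-similarity under aggregation: lumping species 0 and 1 gives a
# composition distributed as Dirichlet with their alphas summed.
lumped = np.column_stack([samples[:, 0] + samples[:, 1],
                          samples[:, 2], samples[:, 3]])
direct = rng.dirichlet([alpha[0] + alpha[1], alpha[2], alpha[3]],
                       size=200000)
print(lumped.mean(axis=0), direct.mean(axis=0))   # moments agree
```

This closure under grouping of species is what makes the Dirichlet form stable across the dilution cascade: coarse-graining the chemical composition leaves the family of distributions unchanged.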
International Nuclear Information System (INIS)
Liehr, Sascha; Wendt, Mario; Krebber, Katerina
2010-01-01
We present the latest advances in distributed strain measurement in perfluorinated polymer optical fibres (POFs) using backscatter techniques. Compared to previously introduced poly(methyl methacrylate) POFs, the measurement length can be extended to more than 500 m at improved spatial resolution of a few centimetres. It is shown that strain in a perfluorinated POF can be measured up to 100%. In parallel to these investigations, the incoherent optical frequency domain reflectometry (OFDR) technique is introduced to detect strained fibre sections and to measure distributed length change along the fibre with sub-millimetre resolution by applying a cross-correlation algorithm to the backscatter signal. The overall superior performance of the OFDR technique compared to the optical time domain reflectometry in terms of accuracy, dynamic range, spatial resolution and measurement speed is presented. The proposed sensor system is a promising technique for use in structural health monitoring applications where the precise detection of high strain is required
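The cross-correlation step used to track distributed length change can be sketched as follows. This toy finds integer-sample shifts only on a synthetic trace; the sub-millimetre resolution reported above requires a sub-sample interpolation or phase-based refinement that is omitted here:

```python
import numpy as np

def shift_between(sig_a, sig_b, max_lag=20):
    """Integer-sample shift of sig_b relative to sig_a, via the lag that
    maximizes their cross-correlation (sub-sample refinement omitted)."""
    def corr(lag):
        return sum(sig_a[n] * sig_b[n + lag]
                   for n in range(len(sig_a))
                   if 0 <= n + lag < len(sig_b))
    return max(range(-max_lag, max_lag + 1), key=corr)

rng = np.random.default_rng(3)
trace = rng.normal(size=1000)         # synthetic backscatter trace
shifted = np.roll(trace, 5)           # elongation moves features along z
print(shift_between(trace, shifted))  # prints 5
```

Applied to successive backscatter traces of a strained fibre section, the recovered lag maps directly to local length change, which is the principle the abstract describes.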
Flood statistics of simple and multiple scaling; Invarianza di scala del regime di piena [Scale invariance of the flood regime]
Energy Technology Data Exchange (ETDEWEB)
Rosso, Renzo; Mancini, Marco; Burlando, Paolo; De Michele, Carlo [Milan, Politecnico Univ. (Italy). DIIAR]; Brath, Armando [Bologna, Univ. (Italy). DISTART]
1996-09-01
The variability of flood probabilities throughout the river network is investigated by introducing the concepts of simple and multiple scaling. Flood statistics and quantiles as parametrized by drainage area are considered, and a distributed geomorphoclimatic model is used to analyze in detail their scaling properties for two river basins in Tyrrhenian Liguria (north-western Italy). Although temporal storm precipitation and spatial runoff production are not scaling, the resulting flood flows do not display substantial deviations from statistical self-similarity or simple scaling. This result has wide potential for assessing the concept of hydrological homogeneity, and it indicates a new route towards establishing physically based procedures for flood frequency regionalization.
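Simple scaling predicts that a flood quantile varies with drainage area as a power law, Q_T(A) ∝ A^θ, with one exponent θ for all return periods. A minimal synthetic illustration (the exponent 0.8 and the areas are assumed values, not results from the paper) fits θ by log-log regression:

```python
import numpy as np

rng = np.random.default_rng(2)
area = np.array([10.0, 50.0, 120.0, 300.0, 800.0])   # km^2, hypothetical basins
theta = 0.8                                          # assumed scaling exponent
# Synthetic 100-year flood quantiles with 5% multiplicative scatter.
q100 = 5.0 * area**theta * np.exp(0.05 * rng.normal(size=area.size))

# log Q = theta * log A + const  ->  slope of the log-log fit recovers theta.
slope, intercept = np.polyfit(np.log(area), np.log(q100), 1)
print(round(slope, 2))                               # ≈ 0.8
```

Multiple scaling would instead let the fitted exponent drift with the return period, which is the deviation the study tests for.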
International Nuclear Information System (INIS)
Ayodele, T.R.; Ogunjuyigbe, A.S.O.
2015-01-01
In this paper, a probability distribution of the clearness index is proposed for the prediction of global solar radiation. First, the clearness index is obtained from past data of global solar radiation; then the parameters of the distribution that best fits the clearness index are determined. The global solar radiation is thereafter predicted from the clearness index using the inverse transformation of the cumulative distribution function. To validate the proposed method, eight years of global solar radiation data (2000–2007) for Ibadan, Nigeria are used to determine the parameters of the appropriate probability distribution for the clearness index. The calculated parameters are then used to predict the monthly average global solar radiation for the following year (2008). The predicted values are compared with the measured values using four statistical tests: the Root Mean Square Error (RMSE), Mean Absolute Error (MAE), Mean Absolute Percentage Error (MAPE) and the coefficient of determination (R²). The proposed method is also compared with existing regression models. The results show that the logistic distribution provides the best fit for the clearness index of Ibadan and that the proposed method is effective in predicting the monthly average global solar radiation, with an overall RMSE of 0.383 MJ/m²/day, MAE of 0.295 MJ/m²/day, MAPE of 2% and R² of 0.967. - Highlights: • Distribution of clearness index is proposed for prediction of global solar radiation. • The clearness index is obtained from the past data of global solar radiation. • The parameters of distribution that best fit the clearness index are determined. • Solar radiation is predicted from the clearness index using inverse transformation. • The method is effective in predicting the monthly average global solar radiation.
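The prediction scheme can be sketched in a few lines: fit a logistic distribution to historical clearness-index values, map uniform probabilities through the inverse CDF (inverse-transform sampling), and scale an extraterrestrial radiation value to obtain global radiation. The clearness-index sample and the H0 value below are synthetic stand-ins, not the Ibadan series.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Synthetic "historical" monthly clearness indices k_t (assumed, not measured).
kt_hist = np.clip(rng.normal(0.55, 0.08, size=96), 0.05, 0.95)

loc, scale = stats.logistic.fit(kt_hist)       # fit the logistic distribution

u = rng.uniform(size=12)                       # one probability per month
kt_pred = stats.logistic.ppf(u, loc, scale)    # inverse-transform sampling
kt_pred = np.clip(kt_pred, 0.0, 1.0)           # clearness index is bounded

H0 = 30.0                                      # MJ/m^2/day, assumed constant here
H_pred = kt_pred * H0                          # predicted global radiation
print(H_pred.round(1))
```

In the paper the fitted parameters come from eight years of data and the predictions are compared against 2008 measurements via RMSE, MAE, MAPE and R².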
Alahmadi, F.; Rahman, N. A.; Abdulrazzak, M.
2014-09-01
Rainfall frequency analysis is an essential tool for the design of water-related infrastructure. It can be used to predict future flood magnitudes for a given magnitude and frequency of extreme rainfall events. This study analyses the application of rainfall partial duration series (PDS) in the fast-growing urban city of Madinah, located in the western part of Saudi Arabia. Different statistical distributions were applied (i.e. Normal, Log Normal, Extreme Value type I, Generalized Extreme Value, Pearson Type III, Log Pearson Type III) and their distribution parameters were estimated using the method of L-moments. Different selection criteria were also applied, e.g. the Akaike Information Criterion (AIC), the Corrected Akaike Information Criterion (AICc), the Bayesian Information Criterion (BIC) and the Anderson-Darling Criterion (ADC). The analysis indicated the Generalized Extreme Value as the best-fit statistical distribution for the Madinah partial duration daily rainfall series. The outcome of such an evaluation can contribute toward better design criteria for flood management, especially flood protection measures.
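Candidate-distribution selection in the spirit of this study can be sketched by fitting several distributions to a sample and ranking them by AIC. For brevity the sketch uses maximum-likelihood fits rather than L-moments, and the "rainfall" sample is synthetic GEV data, so the GEV should rank well; none of the numbers are from the Madinah series.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Synthetic partial-duration "rainfall" drawn from a GEV (assumed parameters).
data = stats.genextreme.rvs(c=-0.2, loc=30, scale=5, size=400, random_state=rng)

candidates = {
    "normal": stats.norm,
    "lognormal": stats.lognorm,
    "gumbel (EV1)": stats.gumbel_r,
    "GEV": stats.genextreme,
    "pearson3": stats.pearson3,
}
aic = {}
for name, dist in candidates.items():
    params = dist.fit(data)                    # MLE stand-in for L-moments
    ll = dist.logpdf(data, *params).sum()
    aic[name] = 2 * len(params) - 2 * ll       # AIC = 2k - 2 ln L

best = min(aic, key=aic.get)
print(best, {k: round(v, 1) for k, v in aic.items()})
```

AICc, BIC and the Anderson-Darling criterion used in the paper follow the same pattern with different penalty or goodness-of-fit terms.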
Langevin modelling of high-frequency Hang-Seng index data
Tang, Lei-Han
2003-06-01
Accurate statistical characterization of financial time series, such as compound stock indices, foreign currency exchange rates, etc., is fundamental to investment risk management, the pricing of derivative products and financial decision making. Traditionally, such data were analyzed and modeled from a purely statistical point of view, with little concern for the specifics of financial markets. Increasingly, however, attention has been paid to the underlying economic forces and the collective behavior of investors. Here we summarize a novel approach to the statistical modeling of a major stock index (the Hang Seng index). Based on mathematical results previously derived in the fluid turbulence literature, we show that a Langevin equation with a variable noise amplitude correctly reproduces the ubiquitous fat tails in the probability distribution of intra-day price moves. The form of the Langevin equation suggests that, despite the extremely complex nature of financial concerns and investment strategies at the individual level, there exist simple universal rules governing high-frequency price moves in a stock market.
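A minimal sketch shows how a Langevin equation with a variable (state-dependent) noise amplitude already produces fat tails. The restoring force and the form of sigma(x) below are illustrative choices, not the coefficients estimated from Hang Seng data; the point is only that multiplicative noise yields a stationary distribution with positive excess kurtosis.

```python
import numpy as np

rng = np.random.default_rng(5)
dt, n = 0.01, 200_000
x = np.zeros(n)
for i in range(1, n):
    # Euler-Maruyama step: dx = -gamma*x*dt + sigma(x)*dW, with noise
    # amplitude growing with |x| (the "variable noise amplitude").
    sigma = np.sqrt(1.0 + x[i - 1] ** 2)
    x[i] = x[i - 1] - 2.5 * x[i - 1] * dt + sigma * np.sqrt(dt) * rng.normal()

# Excess kurtosis: 0 for a Gaussian, positive for fat tails.
kurt = ((x - x.mean()) ** 4).mean() / x.var() ** 2 - 3.0
print(round(kurt, 2))   # positive: fatter tails than a Gaussian
```

For this choice the stationary density is proportional to (1 + x²)^-(1+γ), a Student-t-like law, which is the mechanism behind the fat-tailed intra-day price-move distributions described above.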
A weighted U-statistic for genetic association analyses of sequencing data.
Wei, Changshuai; Li, Ming; He, Zihuai; Vsevolozhskaya, Olga; Schaid, Daniel J; Lu, Qing
2014-12-01
With advancements in next-generation sequencing technology, a massive amount of sequencing data is generated, which offers a great opportunity to comprehensively investigate the role of rare variants in the genetic etiology of complex diseases. Nevertheless, the high-dimensional sequencing data poses a great challenge for statistical analysis. Association analyses based on traditional statistical methods suffer substantial power loss because of the low frequency of genetic variants and the extremely high dimensionality of the data. We developed a Weighted U Sequencing test, referred to as WU-SEQ, for the high-dimensional association analysis of sequencing data. Based on a nonparametric U-statistic, WU-SEQ makes no assumption about the underlying disease model or phenotype distribution, and can be applied to a variety of phenotypes. Through simulation studies and an empirical study, we showed that WU-SEQ outperformed a commonly used sequence kernel association test (SKAT) when the underlying assumptions were violated (e.g., when the phenotype followed a heavy-tailed distribution). Even when the assumptions were satisfied, WU-SEQ still attained performance comparable to SKAT. Finally, we applied WU-SEQ to sequencing data from the Dallas Heart Study (DHS), and detected an association between ANGPTL4 and very low density lipoprotein cholesterol. © 2014 WILEY PERIODICALS, INC.
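The general shape of a weighted U-statistic can be sketched as an average of a phenotype-similarity kernel over all subject pairs, weighted by genotype similarity. The two kernels below (negative absolute phenotype difference; count of shared rare alleles) are illustrative stand-ins, not the WU-SEQ weighting scheme, and the rare-variant matrix is simulated.

```python
import numpy as np

def weighted_u(genotypes, phenotypes):
    """Generic weighted U-statistic over all subject pairs (i, j):
    sum_ij w(G_i, G_j) * h(Y_i, Y_j) / sum_ij w(G_i, G_j)."""
    n = len(phenotypes)
    total = weight_sum = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            w = np.sum(genotypes[i] * genotypes[j])          # shared variants
            total += w * (-abs(phenotypes[i] - phenotypes[j]))
            weight_sum += w
    return total / weight_sum if weight_sum else 0.0

rng = np.random.default_rng(6)
G = (rng.random((50, 200)) < 0.02).astype(float)   # 50 subjects, 200 rare variants
Y = rng.normal(size=50)                            # quantitative phenotype
u_val = weighted_u(G, Y)
print(u_val)
```

Because the statistic depends only on pairwise similarities, no parametric model of the phenotype distribution is needed, which is what gives the nonparametric U-statistic its robustness to heavy-tailed phenotypes.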
Statistical measurement of the gamma-ray source-count distribution as a function of energy
Zechlin, H.-S.; Cuoco, A.; Donato, F.; Fornengo, N.; Regis, M.
2017-01-01
Photon-count statistics have recently been shown to provide a sensitive observable for characterizing gamma-ray source populations and for measuring the composition of the gamma-ray sky. In this work, we generalize the use of the standard 1-point probability distribution function (1pPDF) to decompose the high-latitude gamma-ray emission observed with Fermi-LAT into: (i) point-source contributions, (ii) the Galactic foreground contribution, and (iii) a diffuse isotropic background contribution. We analyze gamma-ray data in five adjacent energy bands between 1 and 171 GeV. We measure the source-count distribution dN/dS as a function of energy, and demonstrate that our results extend current measurements from source catalogs into the regime of sources that have so far remained undetected. Our method improves the sensitivity for resolving point-source populations by about one order of magnitude in flux. The dN/dS distribution as a function of flux is found to be compatible with a broken power law. We derive upper limits on further possible breaks as well as on the angular power of unresolved sources. We discuss the composition of the gamma-ray sky and the capabilities of the 1pPDF method.
Overdispersion in nuclear statistics
International Nuclear Information System (INIS)
Semkow, Thomas M.
1999-01-01
Modern statistical distribution theory is applied for the first time to the development of an overdispersion theory in ionizing-radiation statistics. The physical nuclear system is treated as a sequence of binomial processes, each depending on a characteristic probability, such as the probability of decay, detection, etc. The probabilities fluctuate in the course of a measurement, and the physical reasons for this are discussed. If the average values of the probabilities change from measurement to measurement, as in the random Lexis binomial sampling scheme, the resulting distribution is overdispersed. The generating functions and probability distribution functions are derived, followed by a moment analysis. The Poisson and Gaussian limits are also given. The distribution functions belong to the family of generalized hypergeometric factorial moment distributions of Kemp and Kemp, and can serve as likelihood functions for statistical estimation. An application to radioactive decay with detection is described and working formulae are given, including a procedure for testing counting data for overdispersion. More complex experiments in nuclear physics (such as solar-neutrino experiments) can be handled by this model, as can distinguishing between source and background.
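A standard screening test for overdispersion in counting data, given here as a simpler stand-in for the paper's working formulae: under a pure Poisson model the index of dispersion D = (n-1)·s²/mean follows a chi-square distribution with n-1 degrees of freedom, so a small upper p-value signals overdispersion. The count samples below are simulated.

```python
import numpy as np
from scipy import stats

def dispersion_test(counts):
    """Chi-square index-of-dispersion test: returns (D, upper p-value)."""
    counts = np.asarray(counts, dtype=float)
    n = counts.size
    d = (n - 1) * counts.var(ddof=1) / counts.mean()
    p = stats.chi2.sf(d, df=n - 1)
    return d, p

rng = np.random.default_rng(7)
poisson_counts = rng.poisson(100, size=50)               # pure Poisson data
overdispersed = rng.poisson(rng.gamma(25, 4, size=50))   # mixed (Lexis-like) Poisson

print(dispersion_test(poisson_counts))   # p typically large: no overdispersion
print(dispersion_test(overdispersed))    # p typically tiny: overdispersed
```

The mixed-Poisson sample mimics the Lexis scheme in the abstract, where the underlying rate itself varies between measurements and inflates the variance above the mean.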
Frequency of congenital heart disease in newborns in Tuzla Canton (Bosnia and Herzegovina)
Directory of Open Access Journals (Sweden)
Terzić Rifet
2013-01-01
Full Text Available The aim of this paper is to present the preliminary results of the monitoring study of the frequency of congenital heart disease in newborns in Tuzla Canton (Bosnia and Herzegovina), and their distribution by sex of the newborn and maternal age. The study used the data from the book of protocols and case records of the Clinic for Gynecology and Obstetrics, the University Clinical Center in Tuzla. The analysis of 8,521 newborns between 1 January 2007 and 31 December 2008 yielded a frequency of 1.76%, i.e. 1.31% for mature newborns and 0.45% for premature newborns respectively. Of the total number of registered anomalies, 10% were associated with congenital anomalies of other systems. No statistically significant differences were found in the subsamples of either mature or premature newborns in the distribution of congenital heart disease by sex of the newborn and maternal age. The frequency registered in the analyzed period suggests the necessity of screening and monitoring congenital heart disease in the observed population.
Statistical physics, seismogenesis, and seismic hazard
Main, Ian
1996-11-01
The scaling properties of earthquake populations show remarkable similarities to those observed at or near the critical point of other composite systems in statistical physics. This has led to the development of a variety of different physical models of seismogenesis as a critical phenomenon, involving locally nonlinear dynamics, with simplified rheologies exhibiting instability or avalanche-type behavior, in a material composed of a large number of discrete elements. In particular, it has been suggested that earthquakes are an example of a "self-organized critical phenomenon" analogous to a sandpile that spontaneously evolves to a critical angle of repose in response to the steady supply of new grains at the summit. In this stationary state of marginal stability the distribution of avalanche energies is a power law, equivalent to the Gutenberg-Richter frequency-magnitude law, and the behavior is relatively insensitive to the details of the dynamics. Here we review the results of some of the composite physical models that have been developed to simulate seismogenesis on different scales during (1) dynamic slip on a preexisting fault, (2) fault growth, and (3) fault nucleation. The individual physical models share some generic features, such as a dynamic energy flux applied by tectonic loading at a constant strain rate, strong local interactions, and fluctuations generated either dynamically or by fixed material heterogeneity, but they differ significantly in the details of the assumed dynamics and in the methods of numerical solution. However, all exhibit critical or near-critical behavior, with behavior quantitatively consistent with many of the observed fractal or multifractal scaling laws of brittle faulting and earthquakes, including the Gutenberg-Richter law. Some of the results are sensitive to the details of the dynamics and hence are not strict examples of self-organized criticality. Nevertheless, the results of these different physical models share some
International Nuclear Information System (INIS)
Kawano, Takao
2014-01-01
It is known that radiation is detected at random and that radiation counts fluctuate statistically. In the present study, a radiation measurement experiment was performed to understand the randomness and statistical fluctuation of radiation counts. Three natural radiation sources were used, fabricated from potassium chloride chemicals, chemical fertilizers and kelp; these materials contain naturally occurring potassium-40, a radionuclide. Nine teachers from high schools, junior high schools and elementary schools participated in the radiation measurement experiment. Each participant measured the 1-min integration counts of radiation five times using GM survey meters, and 45 sets of data were obtained for each of the natural radiation sources. It was found that the frequency of occurrence of radiation counts was distributed according to a Gaussian distribution curve, although the 45 data sets of radiation counts superficially appeared to fluctuate without pattern. (author)
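The classroom observation above, Poisson-distributed counts whose histogram looks Gaussian, holds because a Poisson distribution with a large mean is well approximated by a normal with standard deviation sqrt(mean). A quick simulated check (the count rate of 120 per minute is an assumed value, not from the experiment):

```python
import numpy as np

rng = np.random.default_rng(8)
counts = rng.poisson(lam=120, size=10_000)   # many simulated 1-min measurements

# Poisson hallmark: variance is close to the mean.
print(counts.mean(), counts.var())

# Gaussian check: fraction of counts within one standard deviation of the
# mean should approach the normal value of ~68%.
sd = counts.std()
frac = np.mean(np.abs(counts - counts.mean()) < sd)
print(round(frac, 2))                        # ≈ 0.68
```

With only 45 measurements per source, as in the experiment, the histogram is of course noisier, but the same Gaussian envelope emerges.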
Statistical analysis of Nomao customer votes for spots of France
Pálovics, Róbert; Daróczy, Bálint; Benczúr, András; Pap, Julia; Ermann, Leonardo; Phan, Samuel; Chepelianskii, Alexei D.; Shepelyansky, Dima L.
2015-08-01
We investigate the statistical properties of customer votes for spots of France collected by the startup company Nomao. The frequencies of votes per spot and per customer are characterized by a power-law distribution which remains stable on a time scale of a decade, even as the number of votes varies by almost two orders of magnitude. Using computer-science methods we explore the spectrum and the eigenvalues of a matrix containing user ratings of geolocalized items. Eigenvalues map nicely to large towns and regions but show a certain level of instability as we modify the interpretation of the underlying matrix. We evaluate imputation strategies that provide improved prediction performance by reaching geographically smooth eigenvectors. We point out possible links between the distribution of votes and the phenomenon of self-organized criticality.
Analysing the Severity and Frequency of Traffic Crashes in Riyadh City Using Statistical Models
Directory of Open Access Journals (Sweden)
Saleh Altwaijri
2012-12-01
Full Text Available Traffic crashes in Riyadh city cause losses in the form of deaths, injuries and property damage, in addition to the pain and social tragedy affecting the victims' families. In 2005, a total of 47,341 injury traffic crashes occurred in Riyadh city (19% of the total KSA crashes), and 9% of those crashes were severe. Road safety in Riyadh city may have been adversely affected by: high car ownership, migration of people to Riyadh city, high daily trips reaching about 6 million, a high rate of income, low-cost petrol, drivers of different nationalities, young drivers, and tremendous growth in population, which creates a high level of mobility and transport activity in the city. The primary objective of this paper is therefore to explore factors affecting the severity and frequency of road crashes in Riyadh city using appropriate statistical models, aiming to establish effective safety policies ready to be implemented to reduce the severity and frequency of road crashes in Riyadh city. Crash data for Riyadh city were collected from the Higher Commission for the Development of Riyadh (HCDR) for a period of five years from 1425H to 1429H (roughly corresponding to 2004-2008). Crash data were classified into three categories: fatal, serious-injury and slight-injury. Two nominal response models have been developed: a standard multinomial logit model (MNL) and a mixed logit model applied to injury-related crash data. Owing to severe underreporting of slight-injury crashes, binary and mixed binary logistic regression models were also estimated for two categories of severity: fatal and serious crashes. For frequency, two count models such as Negative Binomial (NB) models were employed, and the unit of analysis was the 168 HAIs (wards) in Riyadh city. Ward-level crash data are disaggregated by severity of the crash (such as fatal and serious-injury crashes). The results from both multinomial and binary response models are found to be fairly consistent but
Analysis and applications of a frequency selective surface via a random distribution method
International Nuclear Information System (INIS)
Xie Shao-Yi; Huang Jing-Jian; Yuan Nai-Chang; Liu Li-Guo
2014-01-01
A novel frequency selective surface (FSS) for reducing radar cross section (RCS) is proposed in this paper. This FSS is based on the random distribution method, so it can be called a random surface. In this paper, stacked patches serving as periodic elements are employed for RCS reduction. Previous work has demonstrated the efficiency of utilizing microstrip patches, especially for reflectarrays. First, the relevant theory of the method is described. Then a sample of a three-layer variable-sized stacked-patch random surface with a dimension of 260 mm×260 mm is simulated, fabricated, and measured in order to demonstrate the validity of the proposed design. For normal incidence, an 8-dB RCS reduction is achieved both in simulation and in measurement over 8 GHz–13 GHz. Oblique incidence at 30° is also investigated, for which a 7-dB RCS reduction is obtained in the frequency range 8 GHz–14 GHz. (condensed matter: electronic structure, electrical, magnetic, and optical properties)
HF heating of a plasma column at frequencies below the electron cyclotron frequency
International Nuclear Information System (INIS)
Datlov, J.; Kopecky, V.; Musil, J.; Zacek, F.; Novik, K.
1978-02-01
The dispersion of waves, excited by the helical structure in a plasma column and the heating of a tail of the electron distribution function is studied at frequencies below the electron plasma frequency and the electron cyclotron frequency. (author)
Gajendran, Ravi S; Joshi, Aparna
2012-11-01
For globally distributed teams charged with innovation, member contributions to the team are crucial for effective performance. Prior research, however, suggests that members of globally distributed teams often feel isolated and excluded from their team's activities and decisions. How can leaders of such teams foster member inclusion in team decisions? Drawing on leader-member exchange (LMX) theory, we propose that for distributed teams, LMX and communication frequency jointly shape member influence on team decisions. Findings from a test of our hypotheses using data from 40 globally distributed teams suggest that LMX can enhance member influence on team decisions when it is sustained through frequent leader-member communication. This joint effect is strengthened as team dispersion increases. At the team level, member influence on team decisions has a positive effect on team innovation. (c) 2012 APA, all rights reserved.
Zhu, Jian-Rong; Li, Jian; Zhang, Chun-Mei; Wang, Qin
2017-10-01
The decoy-state method has been widely used in commercial quantum key distribution (QKD) systems. In view of the practical decoy-state QKD with both source errors and statistical fluctuations, we propose a universal model of full parameter optimization in biased decoy-state QKD with phase-randomized sources. Besides, we adopt this model to carry out simulations of two widely used sources: weak coherent source (WCS) and heralded single-photon source (HSPS). Results show that full parameter optimization can significantly improve not only the secure transmission distance but also the final key generation rate. And when taking source errors and statistical fluctuations into account, the performance of decoy-state QKD using HSPS suffered less than that of decoy-state QKD using WCS.
International Nuclear Information System (INIS)
Rezaei, Navid; Kalantar, Mohsen
2015-01-01
Highlights: • Detailed formulation of microgrid static and dynamic security based on droop control and virtual inertia concepts. • Construction of a novel objective function using frequency excursion and rate-of-change-of-frequency profiles. • Ensuring microgrid security subject to the microgrid's economic and environmental policies. • Coordinated management of demand response and droop-controlled distributed generation resources. • Precise scheduling of day-ahead hierarchical frequency control ancillary services using scenario-based stochastic programming. - Abstract: Low system inertia, high penetration levels of renewable energy sources and a large ratio of power deviations in a small power delivery system put microgrid frequency at risk of instability. Given the close coupling between microgrid frequency and system security requirements, procuring adequate ancillary services from cost-effective and environmentally friendly resources is a great challenge that calls for an efficient energy management system. Motivated by this need, this paper presents a novel energy management system aimed at coordinately managing demand response and distributed generation resources. The proposed approach is carried out by constructing a hierarchical frequency control structure in which the frequency-dependent control functions of the microgrid components are modeled comprehensively. On the basis of the derived modeling, both the static and dynamic frequency security of an islanded microgrid are provided at the primary and secondary control levels. Besides, to cope with the low inertia of islanded microgrids, a novel virtual inertia concept is devised based on precise modeling of droop-controlled distributed generation resources. The proposed approach is applied to a typical test microgrid. Energy and hierarchical reserve resources are scheduled precisely using a scenario-based stochastic programming methodology. Moreover, analyzing the
Evidence-based orthodontics. Current statistical trends in published articles in one journal.
Law, Scott V; Chudasama, Dipak N; Rinchuse, Donald J
2010-09-01
To ascertain the number, type, and overall usage of statistics in American Journal of Orthodontics and Dentofacial Orthopedics (AJODO) articles for 2008. These data were then compared to data from three previous years: 1975, 1985, and 2003. The AJODO original articles for 2008 were dichotomized into those using statistics and those not using statistics. Statistical procedures were then broadly divided into descriptive statistics (mean, standard deviation, range, percentage) and inferential statistics (t-test, analysis of variance). Descriptive statistics were used to make comparisons. In 1975, 1985, 2003, and 2008, AJODO published 72, 87, 134, and 141 original articles, respectively. The percentage of original articles using statistics was 43.1% in 1975, 75.9% in 1985, 94.0% in 2003, and 92.9% in 2008; the proportion of original articles using statistics thus stayed nearly constant from 2003 to 2008, with only a small 1.1% decrease. The percentage of articles using inferential statistical analyses was 23.7% in 1975, 74.2% in 1985, 92.9% in 2003, and 84.4% in 2008. Comparing AJODO publications in 2003 and 2008, there was an 8.5% increase in articles using only descriptive statistics (from 7.1% to 15.6%) and a corresponding 8.5% decrease in articles using inferential statistics (from 92.9% to 84.4%).
International Nuclear Information System (INIS)
Heinrich, S.
2006-01-01
The nuclear fission process is a very complex phenomenon and, even nowadays, no realistic models describing the overall process are available. The work presented here deals with a theoretical description of fission-fragment distributions in mass, charge, energy and deformation. We have reconsidered and updated the scission point model of B.D. Wilkins. Our purpose was to test whether this statistical model, applied at the scission point and supplied with results of modern microscopic calculations, allows a quantitative description of the fission-fragment distributions. We calculate the surface energy available at the scission point as a function of the fragment deformations. This surface is obtained from a Hartree-Fock-Bogoliubov microscopic calculation, which guarantees a realistic description of the potential's dependence on the deformation of each fragment. The statistical balance is described by the level densities of the fragments. We have tried to avoid as much as possible the input of empirical parameters into the model. Our only parameter, the distance between the fragments at the scission point, is discussed by comparison with scission configurations obtained from fully dynamical microscopic calculations. The comparison between our results and experimental data is very satisfying and allows us to discuss the successes and limitations of our approach. We finally propose ideas to improve the model, in particular by applying dynamical corrections. (author)
[Frequency distribution of dibucaine numbers in 24,830 patients].
Pestel, G; Sprenger, H; Rothhammer, A
2003-06-01
Atypical cholinesterase prolongs the duration of neuromuscular blocking drugs such as succinylcholine and mivacurium. Measuring the dibucaine number identifies patients who are at risk. This study shows the frequency distribution of routinely measured dibucaine numbers and discusses avoidable clinical problems and economic implications. Dibucaine numbers were measured on a Hitachi 917 analyzer and all dibucaine numbers recorded over a period of 4 years were taken into consideration. Repeat observations were excluded. A total of 24,830 dibucaine numbers were analysed; numbers below 30 were found in 0.07% (n=18), giving an incidence of 1:1,400. Dibucaine numbers from 30 to 70 were found in 1.23% (n=306). On the basis of the identified dibucaine numbers we could avoid the administration of succinylcholine or mivacurium, resulting in a cost reduction of 12,280 Euro offset against total laboratory costs amounting to 10,470 Euro. An incidence of 1:1,400 for dibucaine numbers below 30 is higher than documented in the literature. Therefore, routine measurement of the dibucaine number is a cost-effective method of identifying patients at increased risk of prolonged neuromuscular blockade due to atypical cholinesterase.
Gavare, Zanda; Revalde, Gita; Skudra, Atis
2010-01-01
The goal of the present work was to investigate the possibility of using the intensity distribution of the Q-branch lines of the hydrogen Fulcher-α diagonal band (d³Πu⁻ → a³Σg⁺ electronic transition; Q-branch with v = v′ = 2) to determine the temperature of hydrogen-containing high-frequency electrodeless lamps (HFEDLs). The values of the rotational temperatures have been obtained from the relative intensity distributions for hydrogen-helium and hydrogen-argon HFEDLs depending on the applied current…
A generic statistical methodology to predict the maximum pit depth of a localized corrosion process
International Nuclear Information System (INIS)
Jarrah, A.; Bigerelle, M.; Guillemot, G.; Najjar, D.; Iost, A.; Nianga, J.-M.
2011-01-01
Highlights: → We propose a methodology to predict the maximum pit depth in a corrosion process. → Generalized Lambda Distribution and the Computer Based Bootstrap Method are combined. → GLD fits a large variety of distributions both in their central and tail regions. → Minimum thickness preventing perforation can be estimated with a safety margin. → Considering its applications, this new approach can help to size industrial pieces. - Abstract: This paper outlines a new methodology to predict accurately the maximum pit depth related to a localized corrosion process. It combines two statistical methods: the Generalized Lambda Distribution (GLD), to determine a model of distribution fitting the experimental frequency distribution of depths, and the Computer Based Bootstrap Method (CBBM), to generate simulated distributions equivalent to the experimental one. In comparison with conventionally established statistical methods that are restricted to the use of inferred distributions constrained by specific mathematical assumptions, the major advantage of the methodology presented in this paper is that both the GLD and the CBBM enable a statistical treatment of the experimental data without making any preconceived choice either of the unknown theoretical parent distribution of pit depths, which characterizes the global corrosion phenomenon, or of the unknown associated theoretical extreme-value distribution, which characterizes the deepest pits. Considering an experimental distribution of depths of pits produced on an aluminium sample, estimations of maximum pit depth using a GLD model are compared to similar estimations based on the usual Gumbel and Generalized Extreme Value (GEV) methods proposed in the corrosion engineering literature. The GLD approach is shown to have smaller bias and dispersion in the estimation of the maximum pit depth than the Gumbel approach, both for its realization and its mean. This leads to comparing the GLD approach to the GEV one.
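The extreme-pit-depth problem can be sketched as fitting an extreme-value model to sampled zone maxima and bootstrapping the resulting quantile estimate. A Gumbel fit stands in here for the paper's Generalized Lambda Distribution, and the pit-depth sample is synthetic (Weibull-distributed depths in micrometres are an assumption, not the aluminium data).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
# Synthetic pit depths (µm): 200 inspected zones, 25 pits measured per zone.
pit_depths = rng.weibull(1.5, size=(200, 25)) * 40
zone_maxima = pit_depths.max(axis=1)        # deepest pit per zone

def max_depth_quantile(maxima, q=0.99):
    """Fit a Gumbel law to zone maxima and return its q-quantile,
    an estimate of the maximum pit depth with a safety margin."""
    loc, scale = stats.gumbel_r.fit(maxima)
    return stats.gumbel_r.ppf(q, loc, scale)

# CBBM-style resampling: bootstrap the quantile to quantify its dispersion.
boot = [max_depth_quantile(rng.choice(zone_maxima, zone_maxima.size))
        for _ in range(200)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(round(max_depth_quantile(zone_maxima), 1), (round(lo, 1), round(hi, 1)))
```

The paper's point is that a flexible family like the GLD avoids committing to Gumbel or GEV a priori; the bootstrap step plays the same role in either case, turning one sample into an uncertainty band on the predicted maximum depth.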
Csillik, O.; Evans, I. S.; Drăguţ, L.
2015-03-01
Automated procedures are developed to alleviate long tails in frequency distributions of morphometric variables. They minimize the skewness of slope gradient frequency distributions, and modify the kurtosis of profile and plan curvature distributions toward that of the Gaussian (normal) model. Box-Cox (for slope) and arctangent (for curvature) transformations are tested on nine digital elevation models (DEMs) of varying origin and resolution, and different landscapes, and shown to be effective. Resulting histograms are illustrated and show considerable improvements over those for previously recommended slope transformations (sine, square root of sine, and logarithm of tangent). Unlike previous approaches, the proposed method evaluates the frequency distribution of slope gradient values in a given area and applies the most appropriate transform if required. Sensitivity of the arctangent transformation is tested, showing that Gaussian-kurtosis transformations are acceptable also in terms of histogram shape. Cube root transformations of curvatures produced bimodal histograms. The transforms are applicable to morphometric variables and many others with skewed or long-tailed distributions. By avoiding long tails and outliers, they permit parametric statistics such as correlation, regression and principal component analyses to be applied, with greater confidence that requirements for linearity, additivity and even scatter of residuals (constancy of error variance) are likely to be met. It is suggested that such transformations should be routinely applied in all parametric analyses of long-tailed variables. Our Box-Cox and curvature automated transformations are based on a Python script, implemented as an easy-to-use script tool in ArcGIS.
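The two transformations described above can be sketched directly: for slope, choose the Box-Cox exponent that minimizes the absolute skewness (the paper's criterion, optimized explicitly here since SciPy's default Box-Cox criterion is maximum likelihood); for curvature, tune the gain of an arctangent transform so the excess kurtosis approaches the Gaussian value of zero. The slope and curvature samples are synthetic long-tailed data, not DEM-derived values.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(10)
slope = rng.lognormal(mean=1.0, sigma=0.6, size=5000)       # long-tailed gradients
curvature = stats.t.rvs(df=3, size=5000, random_state=rng)  # heavy-tailed curvature

# Box-Cox for slope: minimize |skewness| over the transform exponent.
def abs_skew(lmbda):
    return abs(stats.skew(stats.boxcox(slope, lmbda)))

lam = optimize.minimize_scalar(abs_skew, bounds=(-2, 2), method="bounded").x
print(stats.skew(slope), stats.skew(stats.boxcox(slope, lam)))  # near 0 after

# Arctangent for curvature: tune gain k toward zero excess kurtosis.
def excess_kurt(k):
    return abs(stats.kurtosis(np.arctan(k * curvature)))

k = optimize.minimize_scalar(excess_kurt, bounds=(1e-3, 10), method="bounded").x
print(stats.kurtosis(curvature), stats.kurtosis(np.arctan(k * curvature)))
```

Evaluating the optimum per area, as the automated procedure does, lets the script decide whether a transform is needed at all before parametric analyses are applied.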