Parallel auto-correlative statistics with VTK.
Energy Technology Data Exchange (ETDEWEB)
Pebay, Philippe Pierre; Bennett, Janine Camille
2013-08-01
This report summarizes the existing statistical engines in VTK and presents both the serial and parallel auto-correlative statistics engines. It is a sequel to [PT08, BPRT09b, PT09, BPT09, PT10], which studied the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, and order statistics engines. The ease of use of the new parallel auto-correlative statistics engine is illustrated by means of C++ code snippets, and algorithm verification is provided. This report justifies the design of the statistics engines with parallel scalability in mind, and provides scalability and speed-up analysis results for the auto-correlative statistics engine.
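The serial auto-correlative statistic these engines compute reduces to the sample autocorrelation function. A minimal numpy sketch (Python rather than the report's C++/VTK API; the AR(1) verification series is invented for illustration):

```python
import numpy as np

def sample_acf(x, max_lag):
    """Biased sample autocorrelation r(k) = c(k)/c(0), the usual convention."""
    x = np.asarray(x, dtype=float)
    n = x.size
    xc = x - x.mean()
    c0 = np.dot(xc, xc) / n
    return np.array([np.dot(xc[:n - k], xc[k:]) / (n * c0)
                     for k in range(max_lag + 1)])

# Verification on a synthetic AR(1) series with phi = 0.8,
# whose theoretical autocorrelation is r(k) = 0.8**k.
rng = np.random.default_rng(0)
e = rng.standard_normal(20000)
x = np.zeros(20000)
for t in range(1, x.size):
    x[t] = 0.8 * x[t - 1] + e[t]

r = sample_acf(x, 3)  # r[0] is 1 by construction; r[1] should be near 0.8
```

The parallel engines distribute exactly this computation, accumulating the centered cross-products per block and merging them.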
An improved Fuzzy Kappa statistic that accounts for spatial autocorrelation
Hagen - Zanker, A.H.
2009-01-01
The Fuzzy Kappa statistic expresses the agreement between two categorical raster maps. The statistic goes beyond cell-by-cell comparison and gives partial credit to cells based on the categories found in the neighborhood. When matching categories are found at shorter distances, the agreement is
International Nuclear Information System (INIS)
Barbeito, Inés; Zaragoza, Sonia; Tarrío-Saavedra, Javier; Naya, Salvador
2017-01-01
Highlights: • Intelligent web platform development for energy efficiency management in buildings. • Controlling and supervising thermal comfort and energy consumption in buildings. • Statistical quality control procedure to deal with autocorrelated data. • Open-source alternative using R software. - Abstract: This paper presents a case study in which a reliable statistical procedure is applied to evaluate the quality of HVAC systems in buildings, using data retrieved from an ad hoc big-data web energy platform. The proposed methodology, based on statistical quality control (SQC), is used to analyze the real state of thermal comfort and energy efficiency of the offices of the company FRIDAMA (Spain) in a reliable way. Non-conformities or alarms, and the actual assignable causes of these out-of-control states, are detected. The capability to meet specification requirements is also analyzed. Tools and packages implemented in the open-source R software are employed to apply the different procedures. First, the study proposes fitting ARIMA time-series models to the CTQ variables. Then, Shewhart and EWMA control charts are applied to the time-series residuals to control and monitor thermal comfort and energy consumption in buildings. Once thermal comfort and consumption variability are estimated, capability indices for autocorrelated variables are computed to assess the degree to which standards specifications are met. According to the case-study results, the proposed methodology detected real anomalies in the HVAC installation, helping to identify assignable causes and to make appropriate decisions. One of the goals is to describe this statistical procedure step by step so that practitioners can readily replicate it.
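The model-plus-residual-chart pipeline can be sketched compactly. The snippet below is a simplified stand-in (a least-squares AR(1) fit instead of full ARIMA, a synthetic temperature series, and an injected level shift playing the role of an HVAC fault), not the FRIDAMA analysis itself:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical CTQ series: AR(1) dynamics, with a level shift after t = 400
n = 500
temp = np.zeros(n)
for t in range(1, n):
    temp[t] = 0.7 * temp[t - 1] + rng.standard_normal()
temp[400:] += 4.0  # simulated out-of-control state

# Phase I: fit the time-series model on in-control data only
phase1 = temp[:400]
phi = np.dot(phase1[1:], phase1[:-1]) / np.dot(phase1[:-1], phase1[:-1])

# Phase II: chart the one-step-ahead residuals, which are ~iid in control
resid = temp[1:] - phi * temp[:-1]
sigma = resid[:399].std(ddof=1)
alarms = np.flatnonzero(np.abs(resid) > 3 * sigma)  # Shewhart 3-sigma rule
```

Charting the residuals rather than the raw series is what keeps the Shewhart limits valid under autocorrelation; an EWMA chart on the same residuals would detect smaller, slower drifts.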
Impact of Autocorrelation on Principal Components and Their Use in Statistical Process Control
DEFF Research Database (Denmark)
Vanhatalo, Erik; Kulahci, Murat
2015-01-01
A basic assumption when using principal component analysis (PCA) for inferential purposes, such as in statistical process control (SPC), is that the data are independent in time. In many industrial processes, frequent sampling and process dynamics make this assumption unrealistic, rendering sampled...
Functional Maximum Autocorrelation Factors
DEFF Research Database (Denmark)
Larsen, Rasmus; Nielsen, Allan Aasbjerg
2005-01-01
Purpose. We aim at data where samples of an underlying function are observed in a spatial or temporal layout. Examples of underlying functions are reflectance spectra and biological shapes. We apply functional models based on smoothing splines and generalize the functional PCA of [ramsay97] to functional maximum autocorrelation factors (MAF) [switzer85, larsen2001d]. We apply the method to biological shapes as well as reflectance spectra. Methods. MAF seeks linear combinations of the original variables that maximize autocorrelation between ... Conclusions. Functional MAF analysis is a useful method for extracting low-dimensional models of temporally or spatially ... MAF outperforms the functional PCA in concentrating the 'interesting' spectra/shape variation in one end of the eigenvalue spectrum and allows for easier interpretation of effects.
Balance Maintenance in the Upright Body Position: Analysis of Autocorrelation
Directory of Open Access Journals (Sweden)
Stodolka, Jacek
2016-04-01
The present research aimed to analyze values of the autocorrelation function measured for different time values of ground reaction forces during stable upright standing. It was hypothesized that, if the recording of force over time depends on the quality and manner of force regulation by the central nervous system (acting as a regulator), then applying autocorrelation to the time series of force changes would allow the regulator's properties and functioning to be determined. The study was performed on 82 subjects (students, athletes, senior and junior soccer players, and subjects who had suffered lower limb injuries). The research was conducted with two Kistler force plates and was based on measurements of ground reaction forces taken during a 15 s period of relaxed upright standing. The results of the autocorrelation function were statistically analyzed. The research revealed a significant correlation between a derivative extreme and the velocity at which the autocorrelation function reaches that extreme, described as gradient strength. Low (though statistically significant) correlation values were observed between the time at which the autocorrelation curve crosses the zero axis and the time at which the function reaches its first peak. Parameters computed on the basis of the autocorrelation function are a reliable means to evaluate the flow of stimuli in the nervous system. The significant correlations observed between the parameters of the autocorrelation function indicate that the individual parameters reflect similar properties of the central nervous system.
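The two timing parameters mentioned (the zero-axis crossing and the first peak of the autocorrelation curve) are easy to extract once the autocorrelation function is computed. A sketch on a synthetic stand-in for a force recording (the oscillation and sampling rate are invented, not the Kistler data):

```python
import numpy as np

def acf(x, max_lag):
    """Sample autocorrelation function up to max_lag."""
    xc = x - x.mean()
    c0 = np.dot(xc, xc)
    return np.array([np.dot(xc[:xc.size - k], xc[k:]) / c0
                     for k in range(max_lag + 1)])

# Hypothetical 15 s recording at 1 kHz: slow postural sway plus sensor noise
t = np.arange(0, 15, 0.001)
rng = np.random.default_rng(2)
force = np.sin(2 * np.pi * 0.5 * t) + 0.1 * rng.standard_normal(t.size)

r = acf(force, 2000)
zero_cross_lag = int(np.argmax(r < 0))    # first lag where the curve crosses 0
zero_cross_time = zero_cross_lag * 0.001  # in seconds
```

For a 0.5 Hz sway component, the zero crossing lands near a quarter period (0.5 s), which is the kind of timing parameter the study correlates across subjects.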
Spatial Autocorrelation and Uncertainty Associated with Remotely-Sensed Data
Directory of Open Access Journals (Sweden)
Daniel A. Griffith
2016-06-01
Virtually all remotely sensed data contain spatial autocorrelation, which impacts their statistical features of uncertainty through variance inflation and the compounding of duplicate information. Estimating the nature and degree of this spatial autocorrelation, which is usually positive and very strong, has long been hindered by the computational intensity associated with the massive number of pixels in realistically-sized remotely-sensed images, a situation that has changed more recently. Recent advances in spatial statistical estimation theory support the extraction of information and the distilling of knowledge from remotely-sensed images in a way that accounts for latent spatial autocorrelation. This paper summarizes an effective methodological approach to achieve this end, illustrating results with a 2002 remotely-sensed image of the Florida Everglades and with simulation experiments. Specifically, uncertainty in the spatial autocorrelation parameter of a spatial autoregressive model is modeled with a beta-beta mixture approach and is further investigated with three different sampling strategies: coterminous sampling, random sub-region sampling, and increasing-domain sub-regions. The results suggest that uncertainty associated with remotely-sensed data should be assessed with spatial autocorrelation in mind. One remaining challenge is to better quantify the spatial variability of spatial autocorrelation estimates across geographic landscapes.
Velocity and stress autocorrelation decay in isothermal dissipative particle dynamics
Chaudhri, Anuj; Lukes, Jennifer R.
2010-02-01
The velocity and stress autocorrelation decay in a dissipative particle dynamics ideal fluid model is analyzed in this paper. The autocorrelation functions are calculated at three different friction parameters and three different time steps using the well-known Groot/Warren algorithm and newer algorithms including self-consistent leap-frog, self-consistent velocity Verlet, and Shardlow first- and second-order integrators. At low friction values, the velocity autocorrelation function decays exponentially at short times, shows slower-than-exponential decay at intermediate times, and approaches zero at long times for all five integrators. As the friction value increases, the deviation from exponential behavior occurs earlier and is more pronounced. At small time steps, all the integrators give identical decay profiles. As the time step increases, there are qualitative and quantitative differences between the integrators. The stress correlation behavior is markedly different for the algorithms. The self-consistent velocity Verlet and the Shardlow algorithms show very similar stress autocorrelation decay with change in friction parameter, whereas the Groot/Warren and leap-frog schemes show variations at higher friction factors. Diffusion coefficients and shear viscosities are calculated using Green-Kubo integration of the velocity and stress autocorrelation functions. The diffusion coefficients match well-known theoretical results in the low-friction limit. Although the stress autocorrelation function is different for each integrator, fluctuates rapidly, and gives poor statistics in most cases, the calculated shear viscosities still fall within the range of theoretical predictions and nonequilibrium studies.
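The Green-Kubo step can be illustrated independently of DPD. The sketch below uses a plain Langevin (Euler-Maruyama) velocity update as a stand-in for the DPD integrators; for this model the diffusion coefficient should recover kT/γ:

```python
import numpy as np

rng = np.random.default_rng(4)
dt, gamma, kT = 0.01, 1.0, 1.0
n = 200000

# Euler-Maruyama update for a Langevin (Ornstein-Uhlenbeck) velocity
noise = np.sqrt(2.0 * gamma * kT * dt) * rng.standard_normal(n)
v = np.zeros(n)
for t in range(1, n):
    v[t] = (1.0 - gamma * dt) * v[t - 1] + noise[t]

# Velocity autocorrelation function out to ten correlation times
max_lag = 1000
vacf = np.array([np.dot(v[:n - k], v[k:]) / (n - k) for k in range(max_lag)])

# Green-Kubo: D = integral of <v(0) v(t)> dt (trapezoidal rule)
D = dt * (vacf.sum() - 0.5 * vacf[0] - 0.5 * vacf[-1])
# Theory for this model: D = kT / gamma = 1.0
```

The poor statistics of Green-Kubo integrals noted for the stress autocorrelation show up even here: the estimate of D carries a sampling error of several percent despite 2×10^5 steps.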
Multivariate Process Control with Autocorrelated Data
DEFF Research Database (Denmark)
Kulahci, Murat
2011-01-01
As sensor and computer technology continues to improve, it has become commonplace to confront high-dimensional data sets. As in many areas of industrial statistics, this brings forth various challenges in statistical process control and monitoring. These high-dimensional data often exhibit not only cross-correlation among the quality characteristics of interest but also serial dependence, as a consequence of high sampling frequency and system dynamics. In practice, the most common method of monitoring multivariate data is the Hotelling T2 statistic. In this paper, we discuss the effect of autocorrelation (when it is ignored) on multivariate control charts based on these methods and provide some practical suggestions and remedies to overcome this problem.
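The Hotelling T2 statistic itself is straightforward; the abstract's point is that its control limits are miscalibrated when observations are serially dependent. A minimal sketch of the statistic on hypothetical independent data:

```python
import numpy as np

def hotelling_t2(X, mean, cov_inv):
    """Per-observation T2 = (x - mean)' S^-1 (x - mean)."""
    d = X - mean
    return np.einsum('ij,jk,ik->i', d, cov_inv, d)

rng = np.random.default_rng(5)
# Phase I reference sample: p = 3 quality characteristics, in control
X1 = rng.standard_normal((500, 3))
mu = X1.mean(axis=0)
Sinv = np.linalg.inv(np.cov(X1, rowvar=False))

# Phase II: 99 in-control points plus one observation shifted off-target
X2 = np.vstack([rng.standard_normal((99, 3)), [[4.0, 4.0, 4.0]]])
t2 = hotelling_t2(X2, mu, Sinv)  # the last value stands far above the rest
```

Under independence, the in-control T2 values follow (approximately) a chi-squared distribution with p degrees of freedom; serial dependence distorts exactly this reference distribution, which is the failure mode the paper analyzes.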
Assessment of smoothed spectra using autocorrelation function
International Nuclear Information System (INIS)
Urbanski, P.; Kowalska, E.
2006-01-01
Recently, data and signal smoothing have become almost standard procedures in spectrometric and chromatographic methods. In radiometry, the main purpose of smoothing is to minimise statistical fluctuations while avoiding distortion. The aim of this work was to find a quantitative parameter that could be used as a figure of merit for detecting distortion of smoothed spectra, based on a linear model. It is assumed that as long as the part of the raw spectrum removed by the smoothing procedure (v_s) is of a random nature, the smoothed spectrum can be considered undistorted. Thanks to this feature of the autocorrelation function, drifts of the mean value in the removed noise v_s, as well as its periodicity, can be detected more easily from the autocorrelogram than from the original data.
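The criterion can be demonstrated with a moving-average smoother: when a spectral line is oversmoothed, the removed component v_s inherits the line's structure and its autocorrelogram shows strong positive correlation. A sketch on a hypothetical spectrum (one Gaussian line over white counting noise; the widths are invented):

```python
import numpy as np

def lag1_autocorr(x):
    """Lag-1 sample autocorrelation of x."""
    xc = x - x.mean()
    return np.dot(xc[:-1], xc[1:]) / np.dot(xc, xc)

def moving_average(x, w):
    return np.convolve(x, np.ones(w) / w, mode='same')

rng = np.random.default_rng(6)
chan = np.arange(1024)
peak = 50.0 * np.exp(-0.5 * ((chan - 512) / 10.0) ** 2)  # hypothetical line
spec = peak + rng.standard_normal(chan.size)

v_s_mild = spec - moving_average(spec, 5)    # removed part ~ random noise
v_s_over = spec - moving_average(spec, 101)  # removed part keeps the line
```

The oversmoothed v_s carries the line shape and hence a large positive lag-1 autocorrelation, while the mild smoother's v_s stays noise-like, exactly the distinction the autocorrelogram criterion is meant to flag.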
A simple method to estimate interwell autocorrelation
Energy Technology Data Exchange (ETDEWEB)
Pizarro, J.O.S.; Lake, L.W. [Univ. of Texas, Austin, TX (United States)
1997-08-01
The estimation of autocorrelation in the lateral or interwell direction is important when performing reservoir characterization studies using stochastic modeling. This paper presents a new method to estimate the interwell autocorrelation based on parameters, such as the vertical range and the variance, that can be estimated with commonly available data. We used synthetic fields that were generated from stochastic simulations to provide data to construct the estimation charts. These charts relate the ratio of areal to vertical variance and the autocorrelation range (expressed variously) in two directions. Three different semivariogram models were considered: spherical, exponential and truncated fractal. The overall procedure is demonstrated using field data. We find that the approach gives the most self-consistent results when it is applied to previously identified facies. Moreover, the autocorrelation trends follow the depositional pattern of the reservoir, which gives confidence in the validity of the approach.
General simulation algorithm for autocorrelated binary processes.
Serinaldi, Francesco; Lombardo, Federico
2017-02-01
The apparent ubiquity of binary random processes in physics and many other fields has attracted considerable attention from the modeling community. However, generation of binary sequences with prescribed autocorrelation is a challenging task owing to the discrete nature of the marginal distributions, which makes the application of classical spectral techniques problematic. We show that such methods can effectively be used if we focus on the parent continuous process of beta distributed transition probabilities rather than on the target binary process. This change of paradigm results in a simulation procedure effectively embedding a spectrum-based iterative amplitude-adjusted Fourier transform method devised for continuous processes. The proposed algorithm is fully general, requires minimal assumptions, and can easily simulate binary signals with power-law and exponentially decaying autocorrelation functions corresponding, for instance, to Hurst-Kolmogorov and Markov processes. An application to rainfall intermittency shows that the proposed algorithm can also simulate surrogate data preserving the empirical autocorrelation.
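A much simpler (and less general) relative of the idea: clipping a Gaussian AR(1) process at zero, for which the binary lag-1 autocorrelation follows the arcsine law. This is not the authors' beta-transition-probability algorithm, just a sketch of how a correlated continuous parent process induces a predictable binary autocorrelation:

```python
import numpy as np

rng = np.random.default_rng(7)
phi, n = 0.9, 100000

# Parent continuous process: Gaussian AR(1) with lag-1 correlation phi
e = rng.standard_normal(n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + e[t]

y = (x > 0).astype(float)  # clipped binary sequence

def lag1_autocorr(z):
    zc = z - z.mean()
    return np.dot(zc[:-1], zc[1:]) / np.dot(zc, zc)

rho_binary = lag1_autocorr(y)
rho_theory = (2.0 / np.pi) * np.arcsin(phi)  # arcsine law for clipped Gaussians
```

The distortion visible here (the binary correlation 2/π·arcsin(φ) is smaller than φ) is the kind of marginal-versus-correlation coupling that makes prescribing an arbitrary binary autocorrelation nontrivial, motivating the iterative amplitude-adjustment step of the paper's algorithm.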
O'Shaughnessy, Patrick; Cavanaugh, Joseph E
2015-01-01
Industrial hygienists now commonly use direct-reading instruments to evaluate hazards in the workplace. The stored values over time from these instruments constitute a time series of measurements that are often autocorrelated. Given the need to statistically compare two occupational scenarios using values from a direct-reading instrument, a t-test must account for measurement autocorrelation, or the resulting test will have a greatly inflated type-I error probability (false rejection of the null hypothesis). A method is described for both the one-sample and two-sample cases that properly adjusts for autocorrelation. The method computes an "equivalent sample size" that effectively decreases the actual sample size when determining the standard error of the mean for the time series. An example is provided for the one-sample case, and an example is given in which a two-sample t-test is conducted for two autocorrelated time series comprised of lognormally distributed measurements.
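For an AR(1) series the equivalent sample size has the familiar closed form n(1 - r1)/(1 + r1). A sketch (synthetic series; the lognormal detail of the paper's example is omitted):

```python
import numpy as np

def equivalent_sample_size(x):
    """n_eff = n (1 - r1) / (1 + r1), using the lag-1 sample autocorrelation."""
    xc = x - x.mean()
    r1 = np.dot(xc[:-1], xc[1:]) / np.dot(xc, xc)
    return x.size * (1.0 - r1) / (1.0 + r1)

rng = np.random.default_rng(8)
n = 2000
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.6 * x[t - 1] + rng.standard_normal()

n_eff = equivalent_sample_size(x)             # ~ n * 0.4 / 1.6 = 500 here
se_naive = x.std(ddof=1) / np.sqrt(n)         # understates the uncertainty
se_adjusted = x.std(ddof=1) / np.sqrt(n_eff)  # widens the t-interval accordingly
```

Substituting n_eff for n in the standard error is precisely what restores the nominal type-I error rate of the t-test under positive autocorrelation.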
Linear Prediction Using Refined Autocorrelation Function
Directory of Open Access Journals (Sweden)
M. Shahidur Rahman
2007-07-01
This paper proposes a new technique for improving the performance of linear prediction analysis by utilizing a refined version of the autocorrelation function. Problems in analyzing voiced speech using linear prediction often occur because the harmonic structure of the excitation source causes the autocorrelation function to be an aliased version of that of the vocal tract impulse response. To estimate the vocal tract characteristics accurately, the effect of aliasing must be eliminated. In this paper, we employ a homomorphic deconvolution technique in the autocorrelation domain to eliminate the aliasing effect caused by periodicity. The resulting autocorrelation function of the vocal tract impulse response is found to yield significant improvement in estimating formant frequencies. The accuracy of formant estimation is verified on synthetic vowels for a wide range of pitch frequencies typical of male and female speakers. The validity of the proposed method is also illustrated by inspecting the spectral envelopes of natural speech spoken by a high-pitched female speaker. The synthesis filter obtained by the current method is guaranteed to be stable, which makes the method superior to many of its alternatives.
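The base machinery here, solving the linear-prediction normal equations from the autocorrelation sequence, is the Levinson-Durbin recursion (the paper's refinement, homomorphic deconvolution of the autocorrelation, is not reproduced). A sketch verified on a synthetic AR(2) signal:

```python
import numpy as np

def levinson_durbin(r, order):
    """Prediction-error filter a (a[0] = 1) from autocorrelation r[0..order]."""
    a = np.zeros(order + 1)
    a[0] = 1.0
    err = r[0]
    for i in range(1, order + 1):
        acc = r[i] + np.dot(a[1:i], r[i - 1:0:-1])
        k = -acc / err          # reflection coefficient
        a[1:i] = a[1:i] + k * a[i - 1:0:-1]
        a[i] = k
        err *= 1.0 - k * k      # prediction error shrinks at each order
    return a, err

# Synthetic AR(2) signal: x[t] = 0.5 x[t-1] - 0.3 x[t-2] + e[t],
# so the order-2 predictor should recover a = [1, -0.5, 0.3].
rng = np.random.default_rng(9)
n = 50000
e = rng.standard_normal(n)
x = np.zeros(n)
for t in range(2, n):
    x[t] = 0.5 * x[t - 1] - 0.3 * x[t - 2] + e[t]

r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(3)])
a, err = levinson_durbin(r, 2)
```

The recursion yields reflection coefficients of magnitude below one, which is what guarantees the minimum-phase, hence stable, synthesis filter claimed in the abstract.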
Autocorrelation in queuing network-type production systems - revisited
DEFF Research Database (Denmark)
Nielsen, Erland Hejn
2007-01-01
Either production managers are missing important aspects in production planning, or the 'realistic' autocorrelation patterns inherent in actual production setups are not like those considered in the literature. In this paper, relevant and 'realistic' types of autocorrelation schemes are characterised...
Logistic regression for southern pine beetle outbreaks with spatial and temporal autocorrelation
M. L. Gumpertz; C.-T. Wu; John M. Pye
2000-01-01
Regional outbreaks of southern pine beetle (Dendroctonus frontalis Zimm.) show marked spatial and temporal patterns. While these patterns are of interest in themselves, we focus on statistical methods for estimating the effects of underlying environmental factors in the presence of spatial and temporal autocorrelation. The most comprehensive available information on...
Power properties of invariant tests for spatial autocorrelation in linear regression
Martellosio, F.
2006-01-01
Many popular tests for residual spatial autocorrelation in the context of the linear regression model belong to the class of invariant tests. This paper derives a number of exact properties of the power function of such tests. In particular, we extend the work of Krämer (2005, Journal of Statistical
Rigorous home range estimation with movement data: a new autocorrelated kernel density estimator.
Fleming, C H; Fagan, W F; Mueller, T; Olson, K A; Leimgruber, P; Calabrese, J M
2015-05-01
Quantifying animals' home ranges is a key problem in ecology and has important conservation and wildlife management applications. Kernel density estimation (KDE) is a workhorse technique for range delineation problems that is both statistically efficient and nonparametric. KDE assumes that the data are independent and identically distributed (IID). However, animal tracking data, which are routinely used as inputs to KDEs, are inherently autocorrelated and violate this key assumption. As we demonstrate, using realistically autocorrelated data in conventional KDEs results in grossly underestimated home ranges. We further show that the performance of conventional KDEs actually degrades as data quality improves, because autocorrelation strength increases as movement paths become more finely resolved. To remedy these flaws with the traditional KDE method, we derive an autocorrelated KDE (AKDE) from first principles to use autocorrelated data, making it perfectly suited for movement data sets. We illustrate the vastly improved performance of AKDE using analytical arguments, relocation data from Mongolian gazelles, and simulations based upon the gazelle's observed movement process. By yielding better minimum area estimates for threatened wildlife populations, we believe that future widespread use of AKDE will have significant impact on ecology and conservation biology.
Spectrum sensing algorithm based on autocorrelation energy in cognitive radio networks
Ren, Shengwei; Zhang, Li; Zhang, Shibing
2016-10-01
Cognitive radio networks have wide applications in smart homes, personal communications, and other wireless communications. Spectrum sensing is the main challenge in cognitive radio. This paper proposes a new spectrum sensing algorithm based on the autocorrelation energy of the received signal. By taking the autocorrelation energy of the received signal as the test statistic for spectrum sensing, the effect of channel noise on detection performance is reduced. Simulation results show that the algorithm is effective and performs well at low signal-to-noise ratios. Compared with the maximum generalized eigenvalue detection (MGED) algorithm, the function of covariance matrix based detection (FMD) algorithm, and the autocorrelation-based detection (AD) algorithm, the proposed algorithm has a 2-11 dB advantage.
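The core idea, that a correlated primary signal raises the autocorrelation of the received samples above the white-noise level, can be sketched with a toy detector (the signal model and threshold are invented; this is not the paper's exact statistic):

```python
import numpy as np

def autocorr_statistic(x):
    """Normalized lag-1 autocorrelation magnitude of the received samples."""
    xc = x - x.mean()
    return abs(np.dot(xc[:-1], xc[1:])) / np.dot(xc, xc)

rng = np.random.default_rng(10)
n = 4096
noise = rng.standard_normal(n)                     # H0: noise only
carrier = np.sin(2 * np.pi * 0.01 * np.arange(n))  # oversampled primary user
received = carrier + noise                         # H1: signal + noise

stat_h0 = autocorr_statistic(noise)     # small: white noise is uncorrelated
stat_h1 = autocorr_statistic(received)  # large: the carrier is correlated
decision = stat_h1 > 0.15               # hypothetical fixed threshold
```

Because white noise contributes to the statistic only through its O(1/√n) estimation jitter, such detectors tolerate low SNR better than plain energy detection, which is the advantage the simulations quantify.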
A Comparison of Various Forecasting Methods for Autocorrelated Time Series
Directory of Open Access Journals (Sweden)
Karin Kandananond
2012-07-01
The accuracy of forecasts significantly affects the overall performance of a whole supply chain system. Sometimes the nature of consumer products causes difficulties in forecasting future demand because of their complicated structure. In this study, two machine learning methods, artificial neural networks (ANN) and support vector machines (SVM), and a traditional approach, the autoregressive integrated moving average (ARIMA) model, were utilized to predict the demand for consumer products. The training data used were the actual demand for six different products from a consumer product company in Thailand. Initially, each set of data was analysed using the Ljung-Box Q statistic to test for autocorrelation. Afterwards, each method was applied to the different sets of data. The results indicated that the SVM method had better forecast quality (in terms of MAPE) than ANN and ARIMA in every product category.
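The autocorrelation screening step uses the Ljung-Box Q statistic, which under the white-noise null is approximately chi-squared with max_lag degrees of freedom. A sketch on synthetic data (not the Thai demand series):

```python
import numpy as np

def ljung_box_q(x, max_lag):
    """Ljung-Box Q = n(n+2) * sum_k r_k^2 / (n - k), for k = 1..max_lag."""
    n = x.size
    xc = x - x.mean()
    c0 = np.dot(xc, xc)
    lags = np.arange(1, max_lag + 1)
    r = np.array([np.dot(xc[:n - k], xc[k:]) / c0 for k in lags])
    return n * (n + 2) * np.sum(r ** 2 / (n - lags))

rng = np.random.default_rng(12)
white = rng.standard_normal(1000)
ar = np.zeros(1000)
for t in range(1, 1000):
    ar[t] = 0.5 * ar[t - 1] + rng.standard_normal()

q_white = ljung_box_q(white, 10)  # ~ chi2(10) under the null
q_ar = ljung_box_q(ar, 10)        # far into the rejection region
```

Comparing Q against the chi-squared critical value (about 18.3 at the 5% level for 10 lags) is what tells the study which demand series need an autocorrelation-aware model.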
Response predictions using the observed autocorrelation function
DEFF Research Database (Denmark)
Nielsen, Ulrik Dam; H. Brodtkorb, Astrid; Jensen, Jørgen Juncher
2018-01-01
This article studies a procedure that facilitates short-time, deterministic predictions of the wave-induced motion of a marine vessel, where it is understood that the future motion of the vessel is calculated ahead of time. Such predictions are valuable to assist in the execution of many marine ... the wave-induced response in study. Thus, predicted (future) values ahead of time for a given time history recording are computed through a mathematical combination of the sample autocorrelation function and previous measurements recorded just prior to the moment of action. Importantly, the procedure does not need input ... Results show that predictions can be successfully made in a time horizon corresponding to about 8-9 wave periods ahead of the current time (the moment of action).
Hayslett, H T
1991-01-01
Statistics covers the basic principles of statistics. The book starts with the importance of statistics and its two kinds; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses, including tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population, is explained. ...
Consequences of spatial autocorrelation for niche-based models
DEFF Research Database (Denmark)
Segurado, P.; Araújo, Miguel B.; Kunin, W. E.
2006-01-01
1. Spatial autocorrelation is an important source of bias in most spatial analyses. We explored the bias introduced by spatial autocorrelation into the explanatory and predictive power of species' distribution models, and make recommendations for dealing with the problem. 2. Analyses were based on...
Inference for local autocorrelations in locally stationary models.
Zhao, Zhibiao
2015-04-01
For non-stationary processes, the time-varying correlation structure provides useful insights into the underlying model dynamics. We study estimation and inferences for local autocorrelation process in locally stationary time series. Our constructed simultaneous confidence band can be used to address important hypothesis testing problems, such as whether the local autocorrelation process is indeed time-varying and whether the local autocorrelation is zero. In particular, our result provides an important generalization of the R function acf() to locally stationary Gaussian processes. Simulation studies and two empirical applications are developed. For the global temperature series, we find that the local autocorrelations are time-varying and have a "V" shape during 1910-1960. For the S&P 500 index, we conclude that the returns satisfy the efficient-market hypothesis whereas the magnitudes of returns show significant local autocorrelations.
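A crude rolling-window version of the idea (without the paper's simultaneous confidence band) can be written as a local analogue of acf() at lag 1. The change-point series below is synthetic:

```python
import numpy as np

def local_lag1(x, width):
    """Rolling-window lag-1 autocorrelation; NaN where the window is incomplete."""
    out = np.full(x.size, np.nan)
    h = width // 2
    for t in range(h, x.size - h):
        w = x[t - h:t + h + 1]
        wc = w - w.mean()
        out[t] = np.dot(wc[:-1], wc[1:]) / np.dot(wc, wc)
    return out

rng = np.random.default_rng(11)
n = 4000
x = rng.standard_normal(n)       # first half: white noise
for t in range(2000, n):         # second half: persistent AR(1)
    x[t] = 0.9 * x[t - 1] + 0.2 * rng.standard_normal()

rho = local_lag1(x, 400)  # near 0 early on, near 0.9 after the change
```

Testing whether such a curve is flat (versus time-varying, as found for the temperature series) is exactly the hypothesis the paper's simultaneous confidence band addresses.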
Crude oil market efficiency and modeling. Insights from the multiscaling autocorrelation pattern
International Nuclear Information System (INIS)
Alvarez-Ramirez, Jose; Alvarez, Jesus; Solis, Ricardo
2010-01-01
Empirical research on market inefficiencies focuses on the detection of autocorrelations in price time series. In the case of crude oil markets, statistical support is claimed for weak efficiency over a wide range of time scales. However, the results are still controversial, since theoretical arguments point to deviations from efficiency as prices tend to revert towards an equilibrium path. This paper studies the efficiency of crude oil markets by using lagged detrended fluctuation analysis (DFA) to detect delay effects in price autocorrelations quantified in terms of a multiscaling Hurst exponent (i.e., autocorrelations are dependent on the time scale). Results based on spot price data for the period 1986-2009 indicate important deviations from efficiency associated with lagged autocorrelations, so imposing a random walk on crude oil prices has pronounced costs for forecasting. Evidence in favor of price reversion to a continuously evolving mean underscores the importance of adequately incorporating delay effects and multiscaling behavior in the modeling of crude oil price dynamics. (author)
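A minimal DFA-1 implementation (without the lagged extension used in the paper) shows how the scaling exponent is read off as the slope of log F(n) against log n; white noise should give an exponent near 0.5, the efficient-market benchmark for returns:

```python
import numpy as np

def dfa_exponent(x, scales):
    """DFA-1: RMS of linearly detrended profile fluctuations vs. box size."""
    y = np.cumsum(x - x.mean())  # the profile
    F = []
    for n_box in scales:
        m = y.size // n_box
        segments = y[:m * n_box].reshape(m, n_box)
        t = np.arange(n_box)
        msq = []
        for s in segments:
            c = np.polyfit(t, s, 1)  # least-squares linear trend per box
            msq.append(np.mean((s - np.polyval(c, t)) ** 2))
        F.append(np.sqrt(np.mean(msq)))
    return np.polyfit(np.log(scales), np.log(F), 1)[0]

rng = np.random.default_rng(13)
scales = [16, 32, 64, 128, 256]
alpha_white = dfa_exponent(rng.standard_normal(20000), scales)  # ~ 0.5
```

Exponents significantly away from 0.5, or exponents that change with the lag at which the analysis is applied, are the deviations from efficiency the paper reports for crude oil prices.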
Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.
Autocorrelation and cross-correlation in time series of homicide and attempted homicide
Machado Filho, A.; da Silva, M. F.; Zebende, G. F.
2014-04-01
We propose in this paper to establish the relationship between homicides and attempted homicides by a non-stationary time-series analysis. This analysis is carried out by detrended fluctuation analysis (DFA), detrended cross-correlation analysis (DCCA), and the DCCA cross-correlation coefficient, ρ(n). Through this analysis we can identify a positive cross-correlation between homicides and attempted homicides. At the same time, viewed from the point of view of autocorrelation (DFA), the analysis can be more informative depending on the time scale: for short scales (days) we cannot identify autocorrelations, on the scale of weeks DFA presents anti-persistent behavior, and for long time scales (n > 90 days) DFA presents persistent behavior. Finally, this new type of statistical analysis proved to be efficient and, in this sense, the paper can contribute to more accurate descriptive statistics of crime.
International Nuclear Information System (INIS)
2005-01-01
For the years 2004 and 2005, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics published in the Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004). The applied energy units and conversion coefficients are shown on the back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes, precautionary stock fees and oil pollution fees
Autocorrelation in queuing network type production systems - revisited
DEFF Research Database (Denmark)
Nielsen, Erland Hejn
…-production systems (Takahashi and Nakamura, 1998) establishes that autocorrelation definitely plays a non-negligible role in relation to the dimensioning as well as the functioning of Kanban-controlled production flow lines. This must logically imply either that production managers are missing an important aspect… in their production planning reasoning, or that the 'realistic' autocorrelation patterns, inherent in actual production setups, are not like those so far considered in the literature. In this paper, an attempt to characterise relevant and 'realistic' types of autocorrelation schemes as well as their levels…
International Nuclear Information System (INIS)
2001-01-01
For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes and precautionary stock fees on oil products
International Nuclear Information System (INIS)
2000-01-01
For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes and precautionary stock fees on oil products
International Nuclear Information System (INIS)
1999-01-01
For the years 1998 and 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes and precautionary stock fees on oil products
International Nuclear Information System (INIS)
2003-01-01
For the year 2002, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supply and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Excise taxes, precautionary stock fees and oil pollution fees on energy products
International Nuclear Information System (INIS)
2004-01-01
For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-March 2004, Energy exports by recipient country in January-March 2004, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Excise taxes, precautionary stock fees and oil pollution fees
International Nuclear Information System (INIS)
2000-01-01
For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 2000, Energy exports by recipient country in January-June 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes and precautionary stock fees on oil products
Estimating the variation, autocorrelation, and environmental sensitivity of phenotypic selection
Chevin, Luis-Miguel; Visser, Marcel E.; Tufto, Jarle
2015-01-01
Despite considerable interest in temporal and spatial variation of phenotypic selection, very few methods allow quantifying this variation while correctly accounting for the error variance of each individual estimate. Furthermore, the available methods do not estimate the autocorrelation of
Thirty-two phase sequences design with good autocorrelation ...
Indian Academy of Sciences (India)
Sequences with minimum peak aperiodic autocorrelation sidelobe level one are called Barker sequences. … the generation and processing of polyphase signals have now become easy … Cook C E, Bernfield M 1967 An introduction to theory and application.
Spatial autocorrelation analysis of tourist arrivals using municipal data: A Serbian example
Directory of Open Access Journals (Sweden)
Stankov Uglješa
2017-01-01
Spatial autocorrelation methodologies can be used to reveal patterns and temporal changes in different spatial variables, including tourist arrivals. The research adopts a GIS-based approach to spatially analyse tourist arrivals in Serbia, using Global Moran's I and Anselin's Local Moran's I statistics applied at the level of municipalities. To assess the feasibility of this approach, the article discusses spatial changes in tourist arrivals in order to identify potentially significant trends of interest for tourism development policy in Serbia. There is significant spatial inequality in the distribution of tourist arrivals in Serbia that is not adequately addressed in tourism development plans. The results of global autocorrelation suggest the existence of low and decreasing spatial clustering for domestic tourist arrivals and high, relatively stable spatial clustering for international tourists. Local autocorrelation statistics revealed different patterns for domestic and international tourist arrivals. These results are discussed with regard to their significance for tourism development policy in Serbia.
Modified Exponential Weighted Moving Average (EWMA) Control Chart on Autocorrelation Data
Herdiani, Erna Tri; Fandrilla, Geysa; Sunusi, Nurtiti
2018-03-01
In general, observations in statistical process control are assumed to be mutually independent. However, this assumption is often violated in practice. Consequently, statistical process controls were developed for interrelated processes, including Shewhart, cumulative sum (CUSUM), and exponentially weighted moving average (EWMA) control charts for autocorrelated data. One researcher stated that these charts are not suitable if the same control limits are used as in the case of independent variables. For this reason, it is necessary to apply a time series model in building the control chart. A classical control chart for independent variables is usually applied to process residuals. This procedure is permitted provided that the residuals are independent. In 1978, a Shewhart modification for the autoregressive process was introduced, using the distance between the sample mean and the target value compared to the standard deviation of the autocorrelated process. In this paper we examine the mean of the EWMA for an autocorrelated process derived from Montgomery and Patel. Performance was investigated by examining the Average Run Length (ARL) based on the Markov chain method.
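The residual-charting idea described above can be sketched in a few lines of Python (NumPy assumed). The AR(1) coefficient, the smoothing constant λ = 0.2, and the width L = 3 are illustrative choices, not values from the paper; the chart is applied to the fitted model's residuals, which are approximately independent.

```python
import numpy as np

rng = np.random.default_rng(1)
# simulate an autocorrelated process: x_t = 0.7 x_{t-1} + e_t
n = 500
e = rng.normal(size=n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.7 * x[t - 1] + e[t]

# fit the AR(1) coefficient by least squares and take residuals
phi = np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])
resid = x[1:] - phi * x[:-1]

# EWMA chart on the (approximately independent) residuals
lam, L = 0.2, 3.0
sigma = resid.std(ddof=1)
z = np.zeros_like(resid)
for t in range(len(resid)):
    prev = z[t - 1] if t > 0 else 0.0
    z[t] = lam * resid[t] + (1 - lam) * prev
limit = L * sigma * np.sqrt(lam / (2 - lam))   # asymptotic control limit
alarms = int(np.sum(np.abs(z) > limit))
print(phi.round(2), alarms)   # few or no alarms expected while in control
```

Charting the raw autocorrelated series with these same limits would, as the abstract notes, produce a very different (and misleading) alarm rate.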
Energy Technology Data Exchange (ETDEWEB)
Rouyer, A. [CEA Bruyeres-le-Chatel, 91 (France)
2005-10-15
Coded aperture imagery is particularly suited for imaging objects emitting penetrating radiation (hard X-rays, gamma rays, neutrons), or particles with rectilinear trajectories (electrons, protons, alpha particles, etc.). It is used when methods based on classical optical principles (reflection, refraction, diffraction) are invalid, or when the source emission is too weak for the well-known pinhole method to give a usable image. The optical system consists of an aperture through an absorbing screen, named the coding aperture, whose transmission is calculated in such a way that the spatial resolution is similar to that of a simple pinhole device, but with a far superior radiation collecting efficiency. We present a new decoding method, called filtered autocorrelation, and illustrate its performance on images obtained with various coding apertures. (author)
Influence of the nuclear autocorrelation function on the positron production in heavy-ion collisions
International Nuclear Information System (INIS)
Tomoda, T.; Weidenmueller, H.A.
1983-01-01
The influence of a nuclear reaction on atomic positron production in heavy-ion collisions is investigated. Using statistical concepts, we describe the nuclear S matrix for a heavy-ion induced reaction as a statistically fluctuating function of energy. The positron production rate then depends on the autocorrelation function of this S matrix, and on the ratio of the 'direct' versus the 'fluctuating' part of the nuclear cross section. Numerical calculations show that in this way, current experimental results on positron production in heavy-ion collisions can be reproduced in a semiquantitative fashion.
Autocorrelations in hybrid Monte Carlo simulations
International Nuclear Information System (INIS)
Schaefer, Stefan; Virotta, Francesco
2010-11-01
Simulations of QCD suffer from severe critical slowing down towards the continuum limit. This problem is known to be prominent in the topological charge; however, all observables are affected to varying degrees by these slow modes in the Monte Carlo evolution. We investigate the slowing down in high-statistics simulations and propose a new error analysis method, which gives a realistic estimate of the contribution of the slow modes to the errors. (orig.)
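The error-inflation effect of slow modes can be illustrated with the standard integrated autocorrelation time τ_int, which multiplies the naive variance of a Monte Carlo mean by 2τ_int. This is a generic sketch (NumPy assumed; the AR(1) surrogate chain and the 6τ windowing rule are illustrative stand-ins for an actual Monte Carlo history, not the method of the paper):

```python
import numpy as np

def tau_int(x):
    """Integrated autocorrelation time with a simple self-consistent window."""
    x = np.asarray(x, float)
    n = len(x)
    xc = x - x.mean()
    # normalized autocorrelation function via zero-padded FFT
    f = np.fft.rfft(xc, 2 * n)
    acf = np.fft.irfft(f * np.conj(f))[:n] / (xc.var() * n)
    tau = 0.5
    for t in range(1, n):
        tau += acf[t]
        if t >= 6 * tau:          # truncate the sum once the window exceeds 6*tau
            break
    return tau

rng = np.random.default_rng(2)
# AR(1) surrogate chain with phi = 0.9; exact tau_int = 0.5 + phi/(1-phi) = 9.5
n = 100000
x = np.zeros(n)
eps = rng.normal(size=n)
for t in range(1, n):
    x[t] = 0.9 * x[t - 1] + eps[t]
tau = tau_int(x)
err_naive = x.std(ddof=1) / np.sqrt(n)       # assumes independent samples
err_corrected = err_naive * np.sqrt(2 * tau)  # accounts for the slow modes
print(round(tau, 1))
```

Ignoring τ_int here would understate the error of the chain mean by a factor of roughly √(2τ) ≈ 4.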
Binary codes with impulse autocorrelation functions for dynamic experiments
International Nuclear Information System (INIS)
Corran, E.R.; Cummins, J.D.
1962-09-01
A series of binary codes exists whose autocorrelation functions approximate an impulse function. Signals whose behaviour in time can be expressed by such codes have spectra which are 'whiter' over a limited bandwidth and for a finite time than signals from a white noise generator. These codes are used to determine system dynamic responses using the correlation technique. Programmes have been written to compute codes of arbitrary length and to compute 'cyclic' autocorrelation and cross-correlation functions. Complete listings of these programmes are given, and a code of 1019 bits is presented. (author)
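One classical family with this impulse-like cyclic autocorrelation is the maximal-length (m-)sequence from a linear feedback shift register; the sketch below (NumPy assumed, not the specific 1019-bit code of the report) generates a length-31 m-sequence whose ±1 cyclic autocorrelation is 31 at lag zero and −1 at every other lag.

```python
import numpy as np

def lfsr_msequence(taps, nbits):
    """Maximal-length sequence from a Fibonacci LFSR; taps index the state bits XORed for feedback."""
    state = [1] * nbits
    seq = []
    for _ in range(2 ** nbits - 1):
        seq.append(state[-1])                # output the oldest bit
        fb = 0
        for t in taps:
            fb ^= state[t]
        state = [fb] + state[:-1]            # shift in the feedback bit
    return np.array(seq)

# 5-bit LFSR with primitive feedback polynomial x^5 + x^3 + 1
bits = lfsr_msequence(taps=[4, 1], nbits=5)
s = 1 - 2 * bits                             # map {0,1} -> {+1,-1}
N = len(s)
acf = np.array([np.dot(s, np.roll(s, k)) for k in range(N)])
print(acf[0], set(acf[1:].tolist()))         # 31 {-1}
```

The flat −1 sidelobe floor is what makes such codes behave like band-limited white noise in correlation-based system identification.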
Estimating the Autocorrelated Error Model with Trended Data: Further Results,
1979-11-01
Perhaps the most serious deficiency of OLS in the presence of autocorrelation is not inefficiency but bias in its estimated standard errors -- a bias… [a regressor with] x_t = k for all t has variance var(b) = σ²/(Tk²). This refutes Maeshiro's (1976) conjecture that "an estimator utilizing relevant extraneous information
Waveguide superconducting single-photon autocorrelators for quantum photonic applications
Sahin, D.; Gaggero, A.; Frucci, G.; Jahanmirinejad, S.; Sprengers, J.P.; Mattioli, F.; Leoni, R.; Beetz, J.; Lermer, M.; Kamp, M.; Höfling, S.; Fiore, A.; Hasan, Z.U.; Hemmer, P.R.; Lee, H.; Santori, C.M.
2013-01-01
We report a novel component for integrated quantum photonic applications, a waveguide single-photon autocorrelator. It is based on two superconducting nanowire detectors patterned onto the same GaAs ridge waveguide. Combining the electrical output of the two detectors in a correlation card enables
Performances Of Estimators Of Linear Models With Autocorrelated ...
African Journals Online (AJOL)
The performances of five estimators of linear models with autocorrelated error terms are compared when the independent variable is autoregressive. The results reveal that the properties of the estimators when the sample size is finite are quite similar to their properties when the sample size is infinite, although …
New approaches for calculating Moran's index of spatial autocorrelation.
Chen, Yanguang
2013-01-01
Spatial autocorrelation plays an important role in geographical analysis; however, there is still room for improvement of this method. The formula for Moran's index is complicated, and several basic problems remain to be solved. Therefore, I will reconstruct its mathematical framework using derivations based on linear algebra and present four simple approaches to calculating Moran's index. Moran's scatterplot will be ameliorated, and new test methods will be proposed. The relationship between the global Moran's index and Geary's coefficient will be discussed from two different vantage points: spatial population and spatial sample. The sphere of applications for both Moran's index and Geary's coefficient will be clarified and defined. One of the theoretical findings is that Moran's index is a characteristic parameter of spatial weight matrices, so the selection of weight functions is very significant for autocorrelation analysis of geographical systems. A case study of 29 Chinese cities in 2000 will be employed to validate the innovative models and methods. This work is a methodological study, which will simplify the process of autocorrelation analysis. The results of this study will lay the foundation for the scaling analysis of spatial autocorrelation.
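For reference, the standard global Moran's I that the abstract reworks can be computed directly from its matrix form I = (n/S₀)·(zᵀWz)/(zᵀz). The four-region toy map and rook-contiguity weight matrix below are hypothetical illustrations (NumPy assumed), not the paper's Chinese-cities data:

```python
import numpy as np

def morans_i(x, W):
    """Global Moran's I for values x under spatial weight matrix W."""
    x = np.asarray(x, float)
    z = x - x.mean()                # deviations from the mean
    n = len(x)
    s0 = W.sum()                    # sum of all weights
    return (n / s0) * (z @ W @ z) / (z @ z)

# a hypothetical row of 4 regions with left/right (rook) contiguity
x = np.array([10.0, 12.0, 3.0, 2.0])     # high values cluster next to high
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], float)
I = morans_i(x, W)
print(round(I, 2))   # 0.27: positive, neighbouring values tend to be similar
```

As the abstract's "characteristic parameter of spatial weight matrices" finding suggests, changing W (e.g. to distance-decay weights) changes I for the same data.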
Performances of estimators of linear auto-correlated error model ...
African Journals Online (AJOL)
The performances of five estimators of linear models with autocorrelated disturbance terms are compared when the independent variable is exponential. The results reveal that for both small and large samples, the Ordinary Least Squares (OLS) estimator compares favourably with the Generalized Least Squares (GLS) estimators in …
Momentum autocorrelation function of a classic diatomic chain
Energy Technology Data Exchange (ETDEWEB)
Yu, Ming B., E-mail: mingbyu@gmail.com
2016-10-23
A classical harmonic diatomic chain is studied using the recurrence relations method. The momentum autocorrelation function results from contributions of the acoustic and optical branches. By use of the convolution theorem, analytical expressions for the acoustic and optical contributions are derived as even-order Bessel function expansions, with coefficients given in terms of integrals of elliptic functions along the real axis and along a contour parallel to the imaginary axis, respectively. - Highlights: • The momentum autocorrelation function of a classical diatomic chain is studied. • It is derived as an even-order Bessel function expansion using the convolution theorem. • The expansion coefficients are integrals of elliptic functions. • The addition theorem is used to reduce a complex elliptic function to a complex sum of real ones.
Biometric feature extraction using local fractal auto-correlation
International Nuclear Information System (INIS)
Chen Xi; Zhang Jia-Shu
2014-01-01
Image texture feature extraction is a classical means for biometric recognition. To extract effective texture feature for matching, we utilize local fractal auto-correlation to construct an effective image texture descriptor. Three main steps are involved in the proposed scheme: (i) using two-dimensional Gabor filter to extract the texture features of biometric images; (ii) calculating the local fractal dimension of Gabor feature under different orientations and scales using fractal auto-correlation algorithm; and (iii) linking the local fractal dimension of Gabor feature under different orientations and scales into a big vector for matching. Experiments and analyses show our proposed scheme is an efficient biometric feature extraction approach. (condensed matter: structural, mechanical, and thermal properties)
Zhou, Yunyi; Tao, Chenyang; Lu, Wenlian; Feng, Jianfeng
2018-04-20
Functional connectivity is among the most important tools for studying the brain. The correlation coefficient between time series of different brain areas is the most popular method for quantifying functional connectivity. In practical use, the correlation coefficient assumes the data to be temporally independent. However, brain time series data can manifest significant temporal auto-correlation. We propose a widely applicable method for correcting temporal auto-correlation. We considered two types of time series models: (1) the auto-regressive-moving-average model, and (2) the nonlinear dynamical system model with noisy fluctuations, and derived their respective asymptotic distributions of the correlation coefficient. These two types of models are the most commonly used in neuroscience studies. We show that the respective asymptotic distributions share a unified expression. We verified the validity of our method and showed that it exhibits sufficient statistical power for detecting true correlations in numerical experiments. Employing our method on a real dataset yields a more robust functional network and higher classification accuracy than conventional methods. Our method robustly controls the type I error while maintaining sufficient statistical power for detecting true correlations in numerical experiments, where existing methods measuring association (linear and nonlinear) fail. In this work, we proposed a widely applicable approach for correcting the effect of temporal auto-correlation on functional connectivity. Empirical results favor the use of our method in functional network analysis. Copyright © 2018. Published by Elsevier B.V.
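Why auto-correlation breaks the naive correlation test can be shown with the classical Bartlett-style variance correction, where the effective sample size shrinks by a factor of 1 + 2·Σ ρx(k)ρy(k). This sketch is a generic illustration of that correction (NumPy assumed; AR(1) surrogates and the 50-lag cutoff are arbitrary choices, not the asymptotic distributions derived in the paper):

```python
import numpy as np

def effective_n(x, y, max_lag=50):
    """Bartlett-style effective sample size for the correlation of two autocorrelated series."""
    def acf(v, k):
        v = v - v.mean()
        return np.dot(v[:-k], v[k:]) / np.dot(v, v) if k else 1.0
    s = 1.0 + 2.0 * sum(acf(x, k) * acf(y, k) for k in range(1, max_lag))
    return len(x) / max(s, 1.0)

rng = np.random.default_rng(3)
n = 2000
# two INDEPENDENT AR(1) series: the true correlation is zero
x = np.zeros(n); y = np.zeros(n)
ex, ey = rng.normal(size=n), rng.normal(size=n)
for t in range(1, n):
    x[t] = 0.8 * x[t - 1] + ex[t]
    y[t] = 0.8 * y[t - 1] + ey[t]
n_eff = effective_n(x, y)
print(round(n_eff))   # far fewer effectively independent samples than n = 2000
```

Testing the sample correlation against n = 2000 degrees of freedom here would grossly inflate the type I error that the paper's method is designed to control.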
Distributions of Autocorrelated First-Order Kinetic Outcomes: Illness Severity.
Directory of Open Access Journals (Sweden)
James D Englehardt
Many complex systems produce outcomes having recurring, power law-like distributions over wide ranges. However, the form necessarily breaks down at extremes, whereas the Weibull distribution has been demonstrated over the full observed range. Here the Weibull distribution is derived as the asymptotic distribution of generalized first-order kinetic processes, with convergence driven by autocorrelation, and entropy maximization subject to finite positive mean, of the incremental compounding rates. Process increments represent multiplicative causes. In particular, illness severities are modeled as such, occurring in proportion to products of, e.g., chronic toxicant fractions passed by organs along a pathway, or rates of interacting oncogenic mutations. The Weibull form is also argued, theoretically and by simulation, to be robust to the onset of saturation kinetics. The Weibull exponential parameter is shown to indicate the number and widths of the first-order compounding increments, the extent of rate autocorrelation, and the degree to which process increments are exponentially distributed. In contrast with the Gaussian result in linear independent systems, the form is driven not by independence and multiplicity of process increments, but by increment autocorrelation and entropy. In some physical systems the form may be attracting, due to multiplicative evolution of outcome magnitudes towards extreme values potentially much larger and smaller than control mechanisms can contain. The Weibull distribution is demonstrated in preference to the lognormal and Pareto I for illness severities versus (a) toxicokinetic models, (b) biologically-based network models, (c) scholastic and psychological test score data for children with prenatal mercury exposure, and (d) time-to-tumor data of the ED01 study.
Size determinations of plutonium colloids using autocorrelation photon spectroscopy
International Nuclear Information System (INIS)
Triay, I.R.; Rundberg, R.S.; Mitchell, A.J.; Ott, M.A.; Hobart, D.E.; Palmer, P.D.; Newton, T.W.; Thompson, J.L.
1989-01-01
Autocorrelation Photon Spectroscopy (APS) is a light-scattering technique used to determine the size distribution of colloidal suspensions. The capabilities of the APS methodology have been assessed by analyzing colloids of known sizes. Plutonium(IV) colloid samples were prepared by a variety of methods, including dilution, peptization, and alpha-induced auto-oxidation of Pu(III). The size of these Pu colloids was analyzed using APS. The sizes determined for the Pu colloids studied varied from 1 to 370 nanometers. 7 refs., 5 figs., 3 tabs
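The size extraction in such photon-autocorrelation (dynamic light scattering) measurements rests on two textbook relations: the autocorrelation decay rate gives Γ = Dq², and the Stokes–Einstein equation converts the diffusion coefficient D to a hydrodynamic diameter. The numbers below (633 nm laser, 90° scattering in water, an assumed decay rate of 10⁴ s⁻¹) are hypothetical, chosen only to land inside the 1-370 nm range reported above:

```python
import numpy as np

kB, T = 1.380649e-23, 298.15       # Boltzmann constant (J/K), temperature (K)
eta = 0.89e-3                      # viscosity of water at 25 C, Pa*s
lam, theta, n_med = 633e-9, np.pi / 2, 1.33   # wavelength, angle, refractive index

# scattering vector magnitude
q = 4 * np.pi * n_med / lam * np.sin(theta / 2)

Gamma = 1.0e4                      # fitted ACF decay rate, 1/s (assumed value)
D = Gamma / q**2                   # translational diffusion coefficient
d = kB * T / (3 * np.pi * eta * D) # Stokes-Einstein hydrodynamic diameter
print(round(d * 1e9))              # 17 (nm)
```

Faster autocorrelation decay (larger Γ) means faster diffusion and hence smaller particles, which is how APS resolves the 1-370 nm spread quoted in the abstract.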
Spectral velocity estimation using autocorrelation functions for sparse data sets
DEFF Research Database (Denmark)
2006-01-01
The distribution of velocities of blood or tissue is displayed in ultrasound scanners by finding the power spectrum of the received signal. This is currently done by taking a Fourier transform of the received signal and then showing spectra in an M-mode display. It is desired to show a B-mode image for orientation, and data for this have to be acquired interleaved with the flow data. The power spectrum can be calculated from the Fourier transform of the autocorrelation function Ry(k), where its span of lags k is given by the number of emissions N in the data segment for velocity estimation…
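The autocorrelation-to-spectrum route mentioned in the abstract is the Wiener-Khinchin relation: the power spectrum is the Fourier transform of Ry(k). A minimal sketch (NumPy assumed; a synthetic noisy tone stands in for the ultrasound data, and cyclic autocorrelation is used for simplicity):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 1024
t = np.arange(n)
# noisy tone at normalized frequency 0.125 cycles/sample
x = np.sin(2 * np.pi * 0.125 * t) + 0.5 * rng.normal(size=n)

# cyclic autocorrelation function R_y(k), computed via the FFT
X = np.fft.fft(x)
R = np.real(np.fft.ifft(X * np.conj(X))) / n

# Wiener-Khinchin: power spectrum = Fourier transform of R_y(k)
P = np.real(np.fft.fft(R))
peak = np.argmax(P[:n // 2]) / n      # location of the spectral peak
print(peak)                           # 0.125, the tone frequency
```

With a sparse, interleaved data segment, only the short-lag part of Ry(k) is available, which is exactly the restriction the abstract's method works within.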
Stable Blind Deconvolution over the Reals from Additional Autocorrelations
Walk, Philipp
2017-10-22
Recently the one-dimensional time-discrete blind deconvolution problem was shown to be solvable uniquely, up to a global phase, by a semi-definite program for almost any signal, provided its autocorrelation is known. We show in this work that under a sufficient zero separation of the corresponding signal in the $z$-domain, a stable reconstruction against additive noise is possible. Moreover, the stability constant depends on the signal dimension and on the magnitudes of the signal's first and last coefficients. We give an analytical expression for this constant by using spectral bounds of Vandermonde matrices.
Autocorrelation based reconstruction of two-dimensional binary objects
International Nuclear Information System (INIS)
Mejia-Barbosa, Y.; Castaneda, R.
2005-10-01
A method for reconstructing a two-dimensional binary object from its autocorrelation function is discussed. The objects consist of a finite set of identical elements. The reconstruction algorithm is based on the concept of a class of element pairs, defined as the set of element pairs with the same separation vector. This concept allows one to resolve the redundancy introduced by the element pairs of each class. It is also shown that different objects, consisting of an equal number of elements and the same classes of pairs, produce Fraunhofer diffraction patterns with identical intensity distributions. However, the method predicts all the possible objects that produce the same Fraunhofer pattern. (author)
LETTER TO THE EDITOR: Exhaustive search for low-autocorrelation binary sequences
Mertens, S.
1996-09-01
Binary sequences with low autocorrelations are important in communication engineering and in statistical mechanics as ground states of the Bernasconi model. Computer searches are the main tool in the construction of such sequences. Owing to the exponential size of the configuration space, exhaustive searches are limited to short sequences. We discuss an exhaustive search algorithm with favourable run-time characteristics and apply it to compile a table of exact ground states of the Bernasconi model up to N = 48. The data suggest F > 9 for the optimal merit factor in the limit of large N.
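The quantities involved are easy to state in code: the Bernasconi energy is the sum of squared aperiodic autocorrelations, E = Σ C_k², and the merit factor is F = N²/(2E). The brute-force search below over all ±1 sequences of a small length (a naive sketch, not the paper's pruned branch-and-bound algorithm, which is what makes N = 48 reachable) illustrates the setting:

```python
import numpy as np
from itertools import product

def merit_factor(s):
    """Golay merit factor F = N^2 / (2E), with E the Bernasconi energy."""
    N = len(s)
    # aperiodic autocorrelations C_k = sum_i s_i * s_{i+k}, k = 1..N-1
    E = sum(sum(s[i] * s[i + k] for i in range(N - k)) ** 2
            for k in range(1, N))
    return N * N / (2.0 * E)

# exhaustive search over all 2^N sequences (only feasible for small N)
N = 10
best = max(merit_factor(s) for s in product((-1, 1), repeat=N))
print(round(best, 3))
```

Doubling N squares the search space (2^N configurations), which is exactly why the abstract's exhaustive table stops at moderate lengths.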
Wilson, Lorna R M; Hopcraft, Keith I
2017-12-01
The problem of zero crossings is of great historical prevalence and promises extensive application. The challenge is to establish precisely how the autocorrelation function or power spectrum of a one-dimensional continuous random process determines the density function of the intervals between the zero crossings of that process. This paper investigates the case where periodicities are incorporated into the autocorrelation function of a smooth process. Numerical simulations, and statistics about the number of crossings in a fixed interval, reveal that in this case the zero crossings segue between a random and deterministic point process depending on the relative time scales of the periodic and nonperiodic components of the autocorrelation function. By considering the Laplace transform of the density function, we show that incorporating correlation between successive intervals is essential to obtaining accurate results for the interval variance. The same method enables prediction of the density function tail in some regions, and we suggest approaches for extending this to cover all regions. In an ever-more complex world, the potential applications for this scale of regularity in a random process are far reaching and powerful.
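The link between the autocorrelation function and crossing statistics invoked above can be checked numerically against Rice's classical rate formula, ν = (1/π)√(−R''(0)/R(0)). The sketch below (NumPy assumed; a Gaussian-smoothed noise process is an illustrative choice, not the periodic-ACF processes the paper studies) compares the simulated mean zero-crossing interval with the Rice prediction:

```python
import numpy as np

rng = np.random.default_rng(5)
# smooth stationary Gaussian process: white noise blurred by a Gaussian kernel
n = 400000
sig = 8.0
t = np.arange(-40, 41)
kernel = np.exp(-0.5 * (t / sig) ** 2)
x = np.convolve(rng.normal(size=n), kernel, mode="same")

# its autocorrelation is Gaussian, R(tau) ~ exp(-tau^2 / (4 sig^2)), so
# Rice's formula predicts a mean zero-crossing interval of pi * sig * sqrt(2)
crossings = np.where(np.diff(np.sign(x)) != 0)[0]
mean_gap = np.diff(crossings).mean()
print(round(mean_gap, 1), round(np.pi * sig * np.sqrt(2), 1))
```

Rice's formula fixes only the mean interval; the interval density and variance studied in the paper require the correlations between successive intervals that the abstract emphasizes.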
Covariance Estimation and Autocorrelation of NORAD Two-Line Element Sets
National Research Council Canada - National Science Library
Osweiler, Victor P
2006-01-01
This thesis investigates NORAD two-line element sets (TLE) containing satellite mean orbital elements for the purpose of estimating a covariance matrix and formulating an autocorrelation relationship...
Intensity autocorrelation measurements of frequency combs in the terahertz range
Benea-Chelmus, Ileana-Cristina; Rösch, Markus; Scalari, Giacomo; Beck, Mattias; Faist, Jérôme
2017-09-01
We report on direct measurements of the emission character of quantum cascade laser based frequency combs, using intensity autocorrelation. Our implementation is based on fast electro-optic sampling, with a detection spectral bandwidth matching the emission bandwidth of the comb laser, around 2.5 THz. We find the output of these frequency combs to be continuous even in the locked regime, but accompanied by a strong intensity modulation. Moreover, with our record temporal resolution of only a few hundred femtoseconds, we can resolve correlated intensity modulation occurring on time scales as short as the gain recovery time, about 4 ps. By direct comparison with pulsed terahertz light originating from a photoconductive emitter, we demonstrate the peculiar emission pattern of these lasers. The measurement technique is self-referenced and ultrafast, and requires no reconstruction. It will be of significant importance in future measurements of ultrashort pulses from quantum cascade lasers.
An autocorrelation method to detect low frequency earthquakes within tremor
Brown, J.R.; Beroza, G.C.; Shelly, D.R.
2008-01-01
Recent studies have shown that deep tremor in the Nankai Trough under western Shikoku consists of a swarm of low frequency earthquakes (LFEs) that occur as slow shear slip on the down-dip extension of the primary seismogenic zone of the plate interface. The similarity of tremor in other locations suggests a similar mechanism, but the absence of cataloged low frequency earthquakes prevents a similar analysis. In this study, we develop a method for identifying LFEs within tremor. The method employs a matched-filter algorithm, similar to the technique used to infer that tremor in parts of Shikoku is comprised of LFEs; however, in this case we do not assume the origin times or locations of any LFEs a priori. We search for LFEs using the running autocorrelation of tremor waveforms for 6 Hi-Net stations in the vicinity of the tremor source. Time lags showing strong similarity in the autocorrelation represent either repeats, or near repeats, of LFEs within the tremor. We test the method on an hour of Hi-Net recordings of tremor and demonstrate that it extracts both known and previously unidentified LFEs. Once identified, we cross correlate waveforms to measure relative arrival times and locate the LFEs. The results are able to explain most of the tremor as a swarm of LFEs, and the locations of newly identified events appear to fill a gap in the spatial distribution of known LFEs. This method should allow us to extend the analysis of Shelly et al. (2007a) to parts of the Nankai Trough in Shikoku that have sparse LFE coverage, and may also allow us to extend our analysis to other regions that experience deep tremor, but where LFEs have not yet been identified. Copyright 2008 by the American Geophysical Union.
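The running-autocorrelation idea described in this abstract can be sketched in a few lines: correlate a window of the record against every later window and treat strong peaks as repeats of a buried event. The sketch below is an illustrative toy on synthetic data (single template, arbitrary parameter choices), not the authors' Hi-Net processing.

```python
import numpy as np

def running_autocorrelation(x, template_len, min_lag):
    """Normalized correlation of the record's opening window against every
    later window; a strong peak marks a repeat (or near repeat) of the event."""
    t0 = x[:template_len]
    t0 = (t0 - t0.mean()) / t0.std()
    best_lag, best_cc = 0, -1.0
    for lag in range(min_lag, len(x) - template_len):
        w = x[lag:lag + template_len]
        w = (w - w.mean()) / w.std()
        cc = float(np.dot(t0, w)) / template_len
        if cc > best_cc:
            best_lag, best_cc = lag, cc
    return best_lag, best_cc

rng = np.random.default_rng(0)
pulse = np.sin(np.linspace(0, 6 * np.pi, 100)) * np.hanning(100)
rec = rng.normal(0.0, 0.2, 1000)
rec[:100] += pulse        # first occurrence of the buried event
rec[600:700] += pulse     # near repeat, 600 samples later
lag, cc = running_autocorrelation(rec, template_len=100, min_lag=200)
print(lag, round(cc, 2))
```

In the method the abstract describes, every window effectively serves as a template, lags with high similarity are grouped, and the grouped waveforms are then cross correlated to measure relative arrival times.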
Energy Technology Data Exchange (ETDEWEB)
Froelicher, B; Dalfes, A [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires
1968-07-01
A study is made of the passage from the auto-correlation function to the frequency spectrum by a numerical Fourier transformation. Two principal characteristics of auto-correlation functions, the time between two points and the total time, are related to two oscillations which appear in the frequency spectrum and which deform it. Various methods are proposed for reducing the effect of these two parasitic oscillations and for recovering the real spectrum. (authors)
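The effect described here, parasitic oscillations introduced when a truncated auto-correlation function is numerically Fourier transformed, is easy to reproduce. The sketch below is a generic illustration (an AR(1) autocorrelation, with a Bartlett lag window standing in for one possible corrective method), not the authors' 1968 procedure.

```python
import numpy as np

# ACF of an AR(1) process, R(k) = rho^k, truncated at 16 lags; the true
# spectrum (1 - rho^2) / (1 - 2*rho*cos(2*pi*f) + rho^2) is strictly positive.
rho, nlags = 0.9, 15
k = np.arange(nlags + 1)
acf = rho ** k

def spectrum(acf, w, nfreq=201):
    """Numerical Fourier (cosine) transform of a one-sided, windowed ACF."""
    a = acf * w
    f = np.linspace(0.0, 0.5, nfreq)
    return np.array([a[0] + 2.0 * np.sum(a[1:] * np.cos(2 * np.pi * fi * k[1:]))
                     for fi in f])

rect = spectrum(acf, np.ones(nlags + 1))        # plain truncation
bart = spectrum(acf, 1.0 - k / (nlags + 1))     # Bartlett (triangular) lag window

# Truncation alone produces oscillations that push the estimate negative;
# the lag window suppresses them at the cost of some smoothing.
print((rect < 0).any(), (bart < -1e-12).any())
```

The Bartlett-windowed estimate is guaranteed nonnegative because it equals the expected periodogram of a finite sample, whereas the plainly truncated transform rings around the true spectrum.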
The specification of weight structures in network autocorrelation models of social influence
Leenders, Roger Th.A.J.
2002-01-01
Many physical and social phenomena are embedded within networks of interdependencies, the so-called 'context' of these phenomena. In network analysis, this type of process is typically modeled as a network autocorrelation model. Parameter estimates and inferences based on autocorrelation models,
A Quantized Analog Delay for an ir-UWB Quadrature Downconversion Autocorrelation Receiver
Bagga, S.; Zhang, L.; Serdijn, W.A.; Long, J.R.; Busking, E.B.
2005-01-01
A quantized analog delay is designed as a requirement for the autocorrelation function in the quadrature downconversion autocorrelation receiver (QDAR). The quantized analog delay is comprised of a quantizer, multiple binary delay lines and an adder circuit. Being the foremost element, the quantizer
Generalised partial autocorrelations and the mutual information between past and future
DEFF Research Database (Denmark)
Proietti, Tommaso; Luati, Alessandra
the generalized partial autocorrelations as the partial autocorrelation coefficients of an auxiliary process, we derive their properties and relate them to essential features of the original process. Based on a parameterisation suggested by Barndorff-Nielsen and Schou (1973) and on Whittle likelihood, we develop...
MATLAB-Based Program for Teaching Autocorrelation Function and Noise Concepts
Jovanovic Dolecek, G.
2012-01-01
An attractive MATLAB-based tool for teaching the basics of autocorrelation function and noise concepts is presented in this paper. This tool enhances traditional in-classroom lecturing. The demonstrations of the tool described here highlight the description of the autocorrelation function (ACF) in a general case for wide-sense stationary (WSS)…
An investigation on thermal patterns in Iran based on spatial autocorrelation
Fallah Ghalhari, Gholamabbas; Dadashi Roudbari, Abbasali
2018-02-01
The present study investigated temporal-spatial and monthly patterns of temperature in Iran using spatial statistical methods such as cluster and outlier analysis and hotspot analysis. To do so, the monthly average temperature of 122 synoptic stations was assessed. Statistical analysis showed that January, at 120.75%, had the most fluctuation among the studied months. The Global Moran's Index revealed that yearly changes of temperature in Iran follow a strongly spatially clustered pattern. Findings showed that the biggest thermal cluster pattern in Iran, 0.975388, occurred in May. Cluster and outlier analyses showed that thermal homogeneity in Iran decreases in cold months and increases in warm months, owing to the radiation angle and the synoptic systems that strongly influence thermal order in Iran. Elevation, however, plays the most notable part, as shown by a geographically weighted regression model. Hotspot analysis showed that hot thermal patterns (very hot, hot, and semi-hot) were dominant in the south, covering 33.5% of the area (about 552,145.3 km2). Regions such as mountain foothills and lowlands lack any significant spatial autocorrelation (25.2%, about 415,345.1 km2). The remainder is the cold thermal area (very cold, cold, and semi-cold), covering about 25.2% (552,145.3 km2) of the whole area of Iran.
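For readers unfamiliar with the statistic, the Global Moran's I used in studies like this one can be computed directly from a value vector and a spatial weight matrix. The sketch below is a generic toy (a six-station transect with adjacency weights), not the study's 122-station data.

```python
import numpy as np

def morans_i(values, weights):
    """Global Moran's I for a value vector and an n x n spatial weight
    matrix with zero diagonal; positive I indicates spatial clustering."""
    x = values - values.mean()
    n = len(x)
    return (n / weights.sum()) * (x @ weights @ x) / (x @ x)

# Toy transect of six station temperatures; neighbors are adjacent stations.
temps = np.array([10.0, 11.0, 12.0, 20.0, 21.0, 22.0])
W = np.zeros((6, 6))
for i in range(5):
    W[i, i + 1] = W[i + 1, i] = 1.0

I = morans_i(temps, W)
print(round(I, 3))  # positive: similar temperatures cluster along the transect
```

Values near +1 indicate clustering of similar values, values near 0 spatial randomness, and negative values a dispersed (checkerboard-like) pattern.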
Baek, Eun Kyeng; Ferron, John M
2013-03-01
Multilevel models (MLM) have been used as a method for analyzing multiple-baseline single-case data. However, some concerns can be raised because the models that have been used assume that the Level-1 error covariance matrix is the same for all participants. The purpose of this study was to extend the application of MLM of single-case data in order to accommodate across-participant variation in the Level-1 residual variance and autocorrelation. This more general model was then used in the analysis of single-case data sets to illustrate the method, to estimate the degree to which the autocorrelation and residual variances differed across participants, and to examine whether inferences about treatment effects were sensitive to whether or not the Level-1 error covariance matrix was allowed to vary across participants. The results from the analyses of five published studies showed that when the Level-1 error covariance matrix was allowed to vary across participants, some relatively large differences in autocorrelation estimates and error variance estimates emerged. The changes in modeling the variance structure did not change the conclusions about which fixed effects were statistically significant in most of the studies, but there was one exception. The fit indices did not consistently support selecting either the more complex covariance structure, which allowed the covariance parameters to vary across participants, or the simpler covariance structure. Given the uncertainty in model specification that may arise when modeling single-case data, researchers should consider conducting sensitivity analyses to examine the degree to which their conclusions are sensitive to modeling choices.
Wittemyer, George; Polansky, Leo; Douglas-Hamilton, Iain; Getz, Wayne M
2008-12-09
The internal state of an individual-as it relates to thirst, hunger, fear, or reproductive drive-can be inferred by referencing points on its movement path to external environmental and sociological variables. Using time-series approaches to characterize autocorrelative properties of step-length movements collated every 3 h for seven free-ranging African elephants, we examined the influence of social rank, predation risk, and seasonal variation in resource abundance on periodic properties of movement. The frequency domain methods of Fourier and wavelet analyses provide compact summaries of temporal autocorrelation and show both strong diurnal and seasonal based periodicities in the step-length time series. This autocorrelation is weaker during the wet season, indicating random movements are more common when ecological conditions are good. Periodograms of socially dominant individuals are consistent across seasons, whereas subordinate individuals show distinct differences diverging from that of dominants during the dry season. We link temporally localized statistical properties of movement to landscape features and find that diurnal movement correlation is more common within protected wildlife areas, and multiday movement correlations found among lower ranked individuals are typically outside of protected areas where predation risks are greatest. A frequency-related spatial analysis of movement-step lengths reveal that rest cycles related to the spatial distribution of critical resources (i.e., forage and water) are responsible for creating the observed patterns. Our approach generates unique information regarding the spatial-temporal interplay between environmental and individual characteristics, providing an original approach for understanding the movement ecology of individual animals and the spatial organization of animal populations.
Adaptive endpoint detection of seismic signal based on auto-correlated function
International Nuclear Information System (INIS)
Fan Wanchun; Shi Ren
2001-01-01
Based on the analysis of the auto-correlation function, the notion of a distance between auto-correlation functions was introduced, and the characterization of noise and of signal with noise was discussed by using this distance. Then, a method of auto-adaptive endpoint detection of seismic signal based on auto-correlated similarity is summarized. The steps of implementation and the determination of the thresholds are presented in detail. Experimental results, compared with methods based on artificial detection, show that this method has higher sensitivity even in a low signal-to-noise-ratio environment
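The distance-between-ACFs idea can be sketched as follows: estimate the ACF in sliding windows, measure each window's Euclidean distance from a reference noise ACF, and flag the onset where that distance jumps. The window length, lag count, and threshold factor below are illustrative guesses, not the authors' settings.

```python
import numpy as np

def acf(x, nlags):
    x = x - x.mean()
    denom = np.dot(x, x)
    return np.array([np.dot(x[:len(x) - k], x[k:]) / denom
                     for k in range(nlags + 1)])

def detect_onset(x, win=100, nlags=20, factor=3.0):
    """Flag the first window whose ACF is 'far' from a reference noise ACF.
    The distance is a plain Euclidean norm between the two ACF vectors."""
    ref = acf(x[:win], nlags)               # assume the record opens with noise
    dists = [(s, np.linalg.norm(acf(x[s:s + win], nlags) - ref))
             for s in range(0, len(x) - win, win // 2)]
    noise_level = np.median([d for _, d in dists[:3]])
    for s, d in dists:
        if d > factor * noise_level:
            return s
    return None

rng = np.random.default_rng(1)
sig = rng.normal(size=2000)
sig[1000:1800] += 3.0 * np.sin(2 * np.pi * 0.05 * np.arange(800))  # arrival
onset = detect_onset(sig)
print(onset)
```

White noise has a near-delta ACF, while the arriving correlated signal has an oscillatory one, so the distance jumps at the endpoint even when the amplitude change alone would be ambiguous.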
A broadly tunable autocorrelator for ultra-short, ultra-high power infrared optical pulses
Energy Technology Data Exchange (ETDEWEB)
Szarmes, E.B.; Madey, J.M.J. [Duke Univ., Durham, NC (United States)
1995-12-31
We describe the design of a crossed-beam, optical autocorrelator that uses an uncoated, birefringent beamsplitter to split a linearly polarized incident pulse into two orthogonally polarized pulses, and a Type II, SHG crystal to generate the intensity autocorrelation function. The uncoated beamsplitter accommodates extremely broad tunability while precluding any temporal distortion of ultrashort optical pulses at the dielectric interface, and the specific design provides efficient operation between 1 μm and 4 μm. Furthermore, the use of Type II SHG completely eliminates any single-beam doubling, so the autocorrelator can be operated at very shallow crossed-beam angles without generating a background pedestal. The autocorrelator has been constructed and installed in the Mark III laboratory at Duke University as a broadband diagnostic for ongoing compression experiments on the chirped-pulse FEL.
Autocorrelated process control: Geometric Brownian Motion approach versus Box-Jenkins approach
Salleh, R. M.; Zawawi, N. I.; Gan, Z. F.; Nor, M. E.
2018-04-01
The existence of autocorrelation has a significant effect on the performance and accuracy of process control if the problem is not handled carefully. When dealing with an autocorrelated process, the Box-Jenkins method is often preferred because of its popularity. However, the computation of the Box-Jenkins method is complicated and challenging, which makes it time-consuming. Therefore, an alternative method known as Geometric Brownian Motion (GBM) is introduced to monitor the autocorrelated process. One real case of furnace temperature data is used to compare the performance of the Box-Jenkins and GBM methods in monitoring an autocorrelated process. Both methods give the same results in terms of model accuracy and monitoring of process control. Yet GBM is superior to the Box-Jenkins method owing to its simplicity and practicality, with shorter computational time.
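A minimal sketch of the Box-Jenkins side of such a comparison: fit a simple time-series model to in-control data (here AR(1), for brevity, rather than a full Box-Jenkins identification) and apply a Shewhart chart to the one-step-ahead residuals. The furnace-temperature-like data and all parameter values are synthetic assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic autocorrelated "furnace temperature": AR(1) around 600 units.
n, phi, mu = 500, 0.8, 600.0
temp = np.empty(n)
temp[0] = mu
for t in range(1, n):
    temp[t] = mu + phi * (temp[t - 1] - mu) + rng.normal(0.0, 1.0)
temp[400:] += 8.0                     # sustained shift: an out-of-control state

# Fit the AR(1) model on the in-control portion (t < 400) ...
mu_hat = temp[:400].mean()
xc = temp[:400] - mu_hat
phi_hat = np.dot(xc[:-1], xc[1:]) / np.dot(xc[:-1], xc[:-1])

# ... and apply a Shewhart 3-sigma chart to the one-step-ahead residuals,
# which are approximately independent if the model fits.
resid = temp[1:] - (mu_hat + phi_hat * (temp[:-1] - mu_hat))
sigma = resid[:399].std()
times = np.arange(1, n)
alarm_times = times[np.abs(resid) > 3.0 * sigma]
print(alarm_times)
```

Charting the residuals rather than the raw autocorrelated series is what keeps the false-alarm behavior of the Shewhart limits meaningful.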
Energy Technology Data Exchange (ETDEWEB)
Constant, E.; Mevel, E.; Zair, A.; Bagnoud, V.; Salin, F. [Bordeaux-1 Univ., Talence (FR). Centre Lasers Intenses et Applications (CELIA)
2001-07-01
We designed a dispersionless autocorrelator with a sub-femtosecond resolution suitable for the characterization of ultrashort X-UV pulses. We present a proof of feasibility experiment with 11 fs infrared pulses. (orig.)
Detecting land cover change using a sliding window temporal autocorrelation approach
CSIR Research Space (South Africa)
Kleynhans, W
2012-07-01
There have been recent developments in the use of hypertemporal satellite time series data for land cover change detection and classification. Recently, an autocorrelation function (ACF) change detection method was proposed to detect the development...
Adaptive endpoint detection of seismic signal based on auto-correlated function
International Nuclear Information System (INIS)
Fan Wanchun; Shi Ren
2000-01-01
There are certain shortcomings in endpoint detection by time-waveform envelope and/or by checking the travel table (both labelled as artificial detection methods). Based on the analysis of the auto-correlation function, the notion of a distance between auto-correlation functions was introduced, and the characterizations of noise and of signal with noise were discussed by using this distance. Then, a method of auto-adaptive endpoint detection of seismic signal based on auto-correlated similarity is summarized. The steps of implementation and the determination of the thresholds are presented in detail. Experimental results, compared with the artificial detection methods, show that this method has higher sensitivity even in a low-SNR environment
Spectral Velocity Estimation using the Autocorrelation Function and Sparse data Sequences
DEFF Research Database (Denmark)
Jensen, Jørgen Arendt
2005-01-01
Ultrasound scanners can be used for displaying the distribution of velocities in blood vessels by finding the power spectrum of the received signal. It is desired to show a B-mode image for orientation, and data for this has to be acquired interleaved with the flow data. Techniques for maintaining both the B-mode frame rate and, at the same time, the highest possible $f_{prf}$, limited only by the depth of investigation, are thus of great interest. The power spectrum can be calculated from the Fourier transform of the autocorrelation function $R_r(k)$. The lag $k$ corresponds … of the sequence. The audio signal has also been synthesized from the autocorrelation data by passing white, Gaussian noise through a filter designed from the power spectrum of the autocorrelation function. The results show that both the full velocity range can be maintained at the same time as a B-mode image …
Directory of Open Access Journals (Sweden)
Jianxin Feng
2014-01-01
The recursive estimation problem is studied for a class of uncertain dynamical systems with a sensor network whose sensors have different delay rates, and with autocorrelated process noises. The process noises are assumed to be autocorrelated across time, and the autocorrelation property is described by the covariances between different time instants. The system model under consideration is subject to multiplicative noises or stochastic uncertainties. The sensor delay phenomenon occurs in a random way, and each sensor in the sensor network has an individual delay rate which is characterized by a binary switching sequence obeying a conditional probability distribution. By using the orthogonal projection theorem and an innovation analysis approach, the desired recursive robust estimators, including the recursive robust filter, predictor, and smoother, are obtained. Simulation results are provided to demonstrate the effectiveness of the proposed approaches.
Nanoscale and femtosecond optical autocorrelator based on a single plasmonic nanostructure
International Nuclear Information System (INIS)
Melentiev, P N; Afanasiev, A E; Balykin, V I; Tausenev, A V; Konyaschenko, A V; Klimov, V V
2014-01-01
We demonstrated a nanoscale, ultrafast and multi-order optical autocorrelator with a single plasmonic nanostructure for measuring the spatio-temporal dynamics of femtosecond laser light. As a nanostructure, we use a split hole resonator (SHR), which was made in an aluminium nanofilm. The Al material yields the fastest response time (100 as). The SHR nanostructure ensures a high nonlinear optical efficiency of the interaction with laser radiation, which leads to (1) second-harmonic generation, (2) third-harmonic generation and (3) multiphoton luminescence, which, in turn, are used to perform multi-order autocorrelation measurements. The nano-sized SHR makes it possible to conduct autocorrelation measurements (i) with a subwavelength spatial resolution and (ii) with no significant influence on the duration of the laser pulse. The time response realized by the SHR nanostructure is about 10 fs. (letter)
Statistics of polarization speckle: theory versus experiment
DEFF Research Database (Denmark)
Wang, Wei; Hanson, Steen Grüner; Takeda, Mitsuo
2010-01-01
In this paper, we reviewed our recent work on the statistical properties of polarization speckle, described by stochastic Stokes parameters fluctuating in space. Based on the Gaussian assumption for the random electric field components and a polar interferometer, we investigated theoretically … and experimentally the statistics of Stokes parameters of polarization speckle, including the probability density function of Stokes parameters with the spatial degree of polarization, the autocorrelation of the Stokes vector, and the statistics of spatial derivatives of the Stokes parameters …
International Nuclear Information System (INIS)
Nagano, Seido; Ichimaru, Setsuo
1980-01-01
The memory function for the velocity autocorrelation function in a strongly coupled, one-component plasma is analyzed in the short time and long time domains, respectively, with the aid of the frequency-moment sum rules and the hydrodynamic consideration evoking the idea of the generalized Stokes friction. A series of interpolation schemes with successively improved accuracies are then introduced. Numerical investigations of those interpolation schemes clarify the physical origin of the three different types of the velocity autocorrelation function observed in the molecular dynamics simulation at different regimes of the coupling constant. (author)
Robustness of variance and autocorrelation as indicators of critical slowing down
Dakos, V.; Nes, van E.H.; Odorico, D' P.; Scheffer, M.
2012-01-01
Ecosystems close to a critical threshold lose resilience, in the sense that perturbations can more easily push them into an alternative state. Recently, it has been proposed that such loss of resilience may be detected from elevated autocorrelation and variance in the fluctuations of the state of an
Directory of Open Access Journals (Sweden)
Su-Wei Fan
2010-06-01
Many studies have described relationships between plant species and intrinsic or exogenous factors, but few have quantified the spatial scales of species patterns. In this study, quantitative methods were used to explore the spatial scale of understory species (including resident and transient species) in order to identify the influential factors of species distribution. Resident species (including herbaceous species, climbers and tree ferns < 1 m high) were investigated on seven transects, each 5 m wide and 300 m long, at the Lanjenchi plot in Nanjenshan Reserve, southern Taiwan. Transient species (seedlings of canopy, subcanopy and shrub species < 1 cm diameter at breast height) were censused in three of the seven transects. The herb coverage and seedling abundance were calculated for each 5 × 5 m quadrat along the transects, and Moran's I and Galiano's new local variance (NLV) indices were then used to identify the spatial scale of autocorrelation for each species. Patterns of species abundance of the understory layer varied among species at fine scales within 50 meters. Resident species showed a higher proportion of significant autocorrelation than the transient species. Species with large size or prolonged fronds or stems tended to show larger scales of autocorrelation. However, dispersal syndromes and fruit types were not related to any species' spatial patterns. Several species showed a significant autocorrelation at a 180-meter class, which happened to correspond to the local replicates of topographical features in hilltops. The spatial patterns of understory species at the Lanjenchi plot are mainly influenced by species' intrinsic traits and topographical characteristics.
Inoue, Akihiko; Kasahara, Yukio
2004-01-01
Let {Xn : ∈Z} be a fractional ARIMA(p,d,q) process with partial autocorrelation function α(·). In this paper, we prove that if d∈(−1/2,0) then |α(n)|~|d|/n as n→∞. This extends the previous result for the case 0
Limit theory for the sample autocorrelations and extremes of a GARCH (1,1) process
Mikosch, T; Starica, C
2000-01-01
The asymptotic theory for the sample autocorrelations and extremes of a GARCH(1,1) process is provided. Special attention is given to the case when the sum of the ARCH and GARCH parameters is close to 1, that is, when one is close to an infinite-variance marginal distribution. This situation has
Panel data models extended to spatial error autocorrelation or a spatially lagged dependent variable
Elhorst, J. Paul
2001-01-01
This paper surveys panel data models extended to spatial error autocorrelation or a spatially lagged dependent variable. In particular, it focuses on the specification and estimation of four panel data models commonly used in applied research: the fixed effects model, the random effects model, the
If not properly accounted for, auto-correlated errors in observations can lead to inaccurate results in soil moisture data analysis and reanalysis. Here, we propose a more generalized form of the triple collocation algorithm (GTC) capable of decomposing the total error variance of remotely-sensed surf...
A model-free approach to eliminate autocorrelation when testing for process capability
DEFF Research Database (Denmark)
Vanmann, Kerstin; Kulahci, Murat
2008-01-01
There is an increasing use of on-line data acquisition systems in industry. This usually leads to autocorrelated data and implies that the assumption of independent observations has to be re-examined. Most decision procedures for capability analysis assume independent data. In this article we pre...
Operator theory of angular momentum and orientational auto-correlation functions
International Nuclear Information System (INIS)
Evans, M.W.
1982-01-01
The rigorous relation between the orientational auto-correlation function and the angular momentum auto-correlation function is described in two cases of interest: first, when a description of the complete zero-THz spectrum is required from the Mori continued-fraction expansion for the angular momentum auto-correlation function, and second, when rotation/translation effects are important. The Mori-Evans theory of 1976, relying on the simple Shimizu relation, is found to be essentially unaffected by the higher-order corrections recently worked out by Ford and co-workers in the Markov limit. The mutual interaction of rotation and translation is important in determining the details of both the orientational and angular momentum auto-correlation functions (a.c.f.s) in the presence of sample anisotropy or a symmetry-breaking field. In this case it is essential to regard the angular momentum a.c.f. as non-Markovian, and methods are developed to relate it to the orientational a.c.f. in the presence of rotation/translation coupling. (author)
Akçay, A.E.; Biller, B.; Tayur, S.
2012-01-01
We consider a repeated newsvendor setting where the parameters of the demand distribution are unknown, and we study the problem of setting inventory targets using only a limited amount of historical demand data. We assume that the demand process is autocorrelated and represented by an
A spatio-temporal autocorrelation change detection approach using hyper-temporal satellite data
CSIR Research Space (South Africa)
Kleynhans, W
2013-07-01
Presented at the IEEE International Geoscience and Remote Sensing Symposium, Melbourne, Australia, 21-26 July 2013. Authors: W. Kleynhans, B.P. Salmon, K.J. Wessels...
Velocity-Autocorrelation Function in Liquids, Deduced from Neutron Incoherent Scattering Results
DEFF Research Database (Denmark)
Carneiro, Kim
1976-01-01
The Fourier transform p(ω) of the velocity-autocorrelation function is derived from neutron incoherent scattering results obtained from the two liquids Ar and H2. The quality and significance of the results are discussed with special emphasis on the long-time t^(-3/2) tail found in computer simula...
Simultaneous maximization of spatial and temporal autocorrelation in spatio-temporal data
DEFF Research Database (Denmark)
Nielsen, Allan Aasbjerg
2002-01-01
…This is done by solving the generalized eigenproblem represented by the Rayleigh coefficient R = (a^T S a) / (a^T S_D a), where S is the dispersion of the data x and S_D is the dispersion of the difference between x and a spatially shifted version of x. Hence, the new variates are obtained from the conjugate eigenvectors, and the autocorrelations obtained are …, i.e., high…
Auto-correlation analysis of wave heights in the Bay of Bengal
Indian Academy of Sciences (India)
Time series observations of significant wave heights in the Bay of Bengal were subjected to auto-correlation analysis to determine the temporal variability scale. The analysis indicates an exponential fall of the auto-correlation in the first few hours, with a decorrelation time scale of about six hours. A similar figure was found earlier ...
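The decorrelation-time estimate described here, an exponential fall of the auto-correlation with an e-folding scale of about six hours, can be illustrated on synthetic hourly data. Everything below (the AR(1) model and its parameter values) is an assumption for illustration, not the Bay of Bengal data.

```python
import numpy as np

# Hypothetical hourly wave-height anomalies with a ~6 h decorrelation scale,
# modeled as AR(1) with phi = exp(-1/6) so the ACF is exp(-k/6).
rng = np.random.default_rng(3)
phi = np.exp(-1.0 / 6.0)
n = 20000
h = np.zeros(n)
for t in range(1, n):
    h[t] = phi * h[t - 1] + rng.normal()

hc = h - h.mean()
acf = np.array([np.dot(hc[:n - k], hc[k:]) / np.dot(hc, hc) for k in range(25)])

# Decorrelation time scale: first lag at which the ACF falls below 1/e.
tau = int(np.argmax(acf < 1.0 / np.e))
print(tau)
```

With an e-folding scale of 6 hours the estimated crossing lag lands near 6, with sampling noise moving it by an hour or so either way.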
Crase, Beth; Liedloff, Adam; Vesk, Peter A; Fukuda, Yusuke; Wintle, Brendan A
2014-08-01
Species distribution models (SDMs) are widely used to forecast changes in the spatial distributions of species and communities in response to climate change. However, spatial autocorrelation (SA) is rarely accounted for in these models, despite its ubiquity in broad-scale ecological data. While spatial autocorrelation in model residuals is known to result in biased parameter estimates and the inflation of type I errors, the influence of unmodeled SA on species' range forecasts is poorly understood. Here we quantify how accounting for SA in SDMs influences the magnitude of range shift forecasts produced by SDMs for multiple climate change scenarios. SDMs were fitted to simulated data with a known autocorrelation structure, and to field observations of three mangrove communities from northern Australia displaying strong spatial autocorrelation. Three modeling approaches were implemented: environment-only models (most frequently applied in species' range forecasts), and two approaches that incorporate SA; autologistic models and residuals autocovariate (RAC) models. Differences in forecasts among modeling approaches and climate scenarios were quantified. While all model predictions at the current time closely matched that of the actual current distribution of the mangrove communities, under the climate change scenarios environment-only models forecast substantially greater range shifts than models incorporating SA. Furthermore, the magnitude of these differences intensified with increasing increments of climate change across the scenarios. When models do not account for SA, forecasts of species' range shifts indicate more extreme impacts of climate change, compared to models that explicitly account for SA. Therefore, where biological or population processes induce substantial autocorrelation in the distribution of organisms, and this is not modeled, model predictions will be inaccurate. These results have global importance for conservation efforts as inaccurate
Autocorrelation as a source of truncated Lévy flights in foreign exchange rates
Figueiredo, Annibal; Gleria, Iram; Matsushita, Raul; Da Silva, Sergio
2003-05-01
We suggest that the ultraslow speed of convergence associated with truncated Lévy flights (Phys. Rev. Lett. 73 (1994) 2946) may well be explained by autocorrelations in data. We show how a particular type of autocorrelation generates power laws consistent with a truncated Lévy flight. Stock exchanges have been suggested to be modeled by a truncated Lévy flight (Nature 376 (1995) 46; Physica A 297 (2001) 509; Econom. Bull. 7 (2002) 1). Here foreign exchange rate data are taken instead. Scaling power laws in the “probability of return to the origin” are shown to emerge for most currencies. A novel approach to measure how distant a process is from a Gaussian regime is presented.
International Nuclear Information System (INIS)
Masalov, Anatolii V; Chudnovsky, Aleksandr V
2004-01-01
It is shown that the finite thickness of the second-harmonic crystal distorts the results of measurements in nonlinear autocorrelators intended for measuring the durations and fields of femtosecond light pulses mainly due to dispersive broadening (or compression) of the pulses being measured, as well as due to the group velocity mismatch between the fundamental and sum-frequency pulses. The refractive index dispersion of the crystal, scaled by half its thickness, distorts the pulse duration to a certain extent depending on its initial chirp and thus determines the width of the energy distribution recorded in the autocorrelator. As the crystal thickness increases, the group velocity mismatch leads to a transformation of the recorded distribution from the correlation function of intensity to the squared modulus of the field correlation function. In the case of Gaussian pulses, such a transformation does not affect significantly the recorded distribution. Errors of pulse duration measurements are estimated. (nonlinear optical phenomena)
The Effect of Autocorrelation on the Hotelling T-2 Control Chart
DEFF Research Database (Denmark)
Vanhatalo, Erik; Kulahci, Murat
2015-01-01
One of the basic assumptions for traditional univariate and multivariate control charts is that the data are independent in time. For the latter, in many cases, the data are serially dependent (autocorrelated) and cross-correlated because of, for example, frequent sampling and process dynamics. … cross-correlation structures for different magnitudes of shifts in the process mean is not fully explored in the literature. In this article, the performance of the Hotelling T-2 control chart for different shift sizes and various autocorrelation and cross-correlation structures are compared based on the average … and using the raw data with adjusted control limits calculated through Monte Carlo simulations; and (iii) constructing the control chart for the residuals from a multivariate time series model fitted to the raw data. To limit the complexity, we use a first-order vector autoregressive process and focus …
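A minimal sketch of approach (iii), charting residuals from a fitted first-order vector autoregressive model, is shown below. This is an illustration, not the authors' implementation; the coefficient matrix, sample size, and noise model are all invented for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a bivariate VAR(1) process: x_t = Phi x_{t-1} + e_t
# (Phi is an arbitrary stable coefficient matrix chosen for the demo)
Phi = np.array([[0.6, 0.2],
                [0.1, 0.5]])
n = 2000
x = np.zeros((n, 2))
for t in range(1, n):
    x[t] = Phi @ x[t - 1] + rng.normal(size=2)

# Fit the VAR(1) coefficients by least squares on the lagged data
X, Y = x[:-1], x[1:]
Phi_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T

# Residuals are approximately serially independent, so a standard
# Hotelling T^2 statistic can be charted for them
resid = Y - X @ Phi_hat.T
mean = resid.mean(axis=0)
S_inv = np.linalg.inv(np.cov(resid.T))
t2 = np.einsum('ij,jk,ik->i', resid - mean, S_inv, resid - mean)
# an observation signals when its t2 value exceeds the upper control limit
```

Approaches (i) and (ii) would instead chart the raw observations, with control limits widened (e.g. via Monte Carlo) to account for the serial dependence.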
Lumpkin, Alex H; Rule, D W
2001-01-01
We report the initial measurements of subpicosecond electron beam structure using a nonintercepting technique based on the autocorrelation of coherent diffraction radiation (CDR). A far infrared (FIR) Michelson interferometer with a Golay detector was used to obtain the autocorrelation. The radiation was generated by a thermionic rf gun beam at 40 MeV as it passed through a 5-mm-tall slit/aperture in a metal screen whose surface was at 45 deg. to the beam direction. For the observed bunch lengths of about 450 fs (FWHM) with a shorter time spike on the leading edge, peak currents of about 100 A are indicated. Also a model was developed and used to calculate the CDR from the back of two metal strips separated by a 5-mm vertical gap. The demonstrated nonintercepting aspect of this method could allow on-line bunch length characterizations to be done during free-electron laser experiments.
OFDM Signal Detector Based on Cyclic Autocorrelation Function and its Properties
Directory of Open Access Journals (Sweden)
Z. Fedra
2011-12-01
This paper is devoted to research of the general and particular properties of the OFDM signal detector based on the cyclic autocorrelation function. The cyclic autocorrelation function is estimated using DFT. The parameters of the testing signal have been chosen according to 802.11g WLAN. Some properties are described analytically; all events are examined via computer simulations. It is shown that the detector is able to detect an OFDM signal in the case of multipath propagation, inexact frequency synchronization and without time synchronization. The sensitivity of the detector could be decreased in the above cases. An important condition for proper value of the detector sampling interval was derived. Three types of the channels were studied and compared. Detection threshold SNR=-9 dB was found for the signal under consideration and for two-way propagation.
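The mechanism the detector exploits can be sketched in a few lines of NumPy: the cyclic prefix of each OFDM symbol is a copy of its tail, so the time-averaged correlation at lag N (the useful symbol length) peaks even without synchronization. This is a simplified stand-in for the paper's DFT-based cyclic autocorrelation estimate, with illustrative parameters (64 subcarriers, 16-sample prefix).

```python
import numpy as np

rng = np.random.default_rng(1)

N, Ncp, num_sym = 64, 16, 50      # subcarriers, cyclic prefix, symbols

# Build an OFDM-like baseband signal: IFFT of random QPSK data with a
# cyclic prefix prepended to every symbol
symbols = []
for _ in range(num_sym):
    data = (rng.choice([-1.0, 1.0], N) + 1j * rng.choice([-1.0, 1.0], N)) / np.sqrt(2)
    s = np.fft.ifft(data) * np.sqrt(N)          # unit average power
    symbols.append(np.concatenate([s[-Ncp:], s]))
rx = np.concatenate(symbols)
noise = (rng.normal(size=rx.size) + 1j * rng.normal(size=rx.size)) / np.sqrt(2)
rx = rx + 0.5 * noise                            # roughly 6 dB SNR

def lag_corr(x, lag):
    # time-averaged lag correlation (simplified cyclic-autocorrelation proxy)
    return abs(np.mean(x[:-lag] * np.conj(x[lag:])))

# The cyclic prefix produces a strong correlation peak at lag N, even
# without time synchronization
peaks = {lag: lag_corr(rx, lag) for lag in (N // 2, N, 2 * N + 5)}
```

A detector thresholds the lag-N value against the background level seen at other lags.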
Autocorrelation analysis of plasma plume light emissions in deep penetration laser welding of steel
Czech Academy of Sciences Publication Activity Database
Mrňa, Libor; Šarbort, Martin; Řeřucha, Šimon; Jedlička, Petr
2017-01-01
Roč. 29, č. 1 (2017), s. 1-10, č. článku 012009. ISSN 1042-346X R&D Projects: GA MŠk(CZ) LO1212; GA MŠk ED0017/01/01 Institutional support: RVO:68081731 Keywords : laser welding * plasma plume * light emissions * autocorrelation analysis * weld depth Subject RIV: BH - Optics, Masers, Lasers OBOR OECD: Optics (including laser optics and quantum optics) Impact factor: 1.492, year: 2016
Spatial autocorrelation method using AR model; Kukan jiko sokanho eno AR model no tekiyo
Energy Technology Data Exchange (ETDEWEB)
Yamamoto, H; Obuchi, T; Saito, T [Iwate University, Iwate (Japan). Faculty of Engineering
1996-05-01
Examination was made of the applicability of the AR model to the spatial autocorrelation (SAC) method, which analyzes the surface wave phase velocity in a microtremor, for the estimation of the underground structure. In this examination, microtremor data recorded in Morioka City, Iwate Prefecture, were used. In the SAC method, a spatial autocorrelation function with the frequency as a variable is determined from microtremor data observed by circular arrays. Then, the Bessel function is fitted to the spatial autocorrelation coefficient, with the distance between seismographs as a variable, for the determination of the phase velocity. The result of the AR model application in this study and the results of the conventional BPF and FFT methods were compared. It was found that the phase velocities obtained by the BPF and FFT methods were more dispersed than those obtained by the AR model. The dispersion in the BPF method is attributed to the bandwidth used in the band-pass filter and, in the FFT method, to the impact of the bandwidth on the smoothing of the cross spectrum. 2 refs., 7 figs.
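The core inversion step of the SAC method, fitting a zeroth-order Bessel function J0(2·pi·f·r/c) to the autocorrelation coefficients as a function of station separation r to recover the phase velocity c, can be sketched as below. The coefficients, frequency, separations, and "true" velocity are synthetic assumptions, not the Morioka data.

```python
import numpy as np

def j0(x):
    # Bessel J0 via its integral representation, to stay NumPy-only
    theta = np.linspace(0.0, np.pi, 2001)
    x = np.atleast_1d(np.asarray(x, dtype=float))
    return np.trapz(np.cos(np.outer(x, np.sin(theta))), theta, axis=-1) / np.pi

# Synthetic SAC coefficients at one frequency for a circular array
f = 5.0                                  # Hz (assumed)
radii = np.array([10.0, 20.0, 30.0])     # station separations in metres (assumed)
c_true = 400.0                           # "true" phase velocity for the demo, m/s
rho_obs = j0(2 * np.pi * f * radii / c_true)

# Grid-search the phase velocity c that best fits rho(r) = J0(2*pi*f*r/c)
c_grid = np.linspace(100.0, 1000.0, 901)
misfit = [np.sum((j0(2 * np.pi * f * radii / c) - rho_obs) ** 2) for c in c_grid]
c_est = c_grid[int(np.argmin(misfit))]
```

Repeating the fit per frequency yields the dispersion curve used to estimate the underground structure.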
Badenhausser, I; Gouat, M; Goarant, A; Cornulier, T; Bretagnolle, V
2012-10-01
Agricultural intensification in western Europe has caused a dramatic loss of grassland surfaces in farmlands, which have resulted in strong declines in grassland invertebrates, leading to cascade effects at higher trophic levels among consumers of invertebrates. Grasshoppers are important components of grassland invertebrate assemblages in European agricultural ecosystems, particularly as prey for bird species. Understanding how grasshopper populations are distributed in fragmented landscapes with low grassland availability is critical for both studies in biodiversity conservation and insect management. We assessed the range and strength of spatial autocorrelation for two grasshopper taxa (Gomphocerinae subfamily and Calliptamus italicus L.) across an intensive farmland in western France. Data from surveys carried out over 8 yr in 1,715 grassland fields were analyzed using geostatistics. Weak spatial patterns were observed at small spatial scales, suggesting important local effects of management practices on grasshopper densities. Spatial autocorrelation patterns for both grasshopper taxa were only detected at intermediate scales. For Gomphocerinae, the range of spatial autocorrelation varied from 802 to 2,613 m according to the year, depending both on grasshopper density and on grassland surfaces in the study site, whereas spatial patterns for the Italian locust were more variable and not related to grasshopper density or grassland surfaces. Spatial patterns in the distribution of Gomphocerinae supported our hypothesis that habitat availability was a major driver of grasshopper distribution in the landscape, and suggested it was related to density-dependent processes such as dispersal.
What autocorrelation tells us about motor variability: insights from dart throwing.
Directory of Open Access Journals (Sweden)
Robert J van Beers
In sports such as golf and darts it is important that one can produce ballistic movements of an object towards a goal location with as little variability as possible. A factor that influences this variability is the extent to which motor planning is updated from movement to movement based on observed errors. Previous work has shown that for reaching movements, our motor system uses the learning rate (the proportion of an error that is corrected for in the planning of the next movement) that is optimal for minimizing the endpoint variability. Here we examined whether the learning rate is hard-wired and therefore automatically optimal, or whether it is optimized through experience. We compared the performance of experienced dart players and beginners in a dart task. A hallmark of the optimal learning rate is that the lag-1 autocorrelation of movement endpoints is zero. We found that the lag-1 autocorrelation of experienced dart players was near zero, implying a near-optimal learning rate, whereas it was negative for beginners, suggesting a larger than optimal learning rate. We conclude that learning rates for trial-by-trial motor learning are optimized through experience. This study also highlights the usefulness of the lag-1 autocorrelation as an index of performance in studying motor-skill learning.
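The sign pattern of the lag-1 autocorrelation can be reproduced with a toy simulation of the standard planning-noise/execution-noise model (an assumed model for illustration, not the paper's data). With equal noise magnitudes, the zero-ACF learning rate solves alpha**2 = 1 - alpha, i.e. alpha of about 0.618.

```python
import numpy as np

def lag1_autocorr(x):
    x = x - x.mean()
    return float(np.dot(x[:-1], x[1:]) / np.dot(x, x))

def simulate_darts(alpha, n=100_000, sig_plan=1.0, sig_exec=1.0, seed=0):
    # Toy trial-by-trial model: endpoint = aim + execution noise; the aim
    # is corrected by a fraction alpha of the observed error and perturbed
    # by persistent planning noise on every trial.
    rng = np.random.default_rng(seed)
    aim = 0.0
    xs = np.empty(n)
    for t in range(n):
        xs[t] = aim + sig_exec * rng.normal()
        aim = aim - alpha * xs[t] + sig_plan * rng.normal()
    return xs

# Below the optimal rate the lag-1 ACF is positive (under-correction),
# above it negative (over-correction), and near zero at the optimum
rhos = {a: lag1_autocorr(simulate_darts(a)) for a in (0.2, 0.618, 0.95)}
```

The negative ACF for beginners in the study corresponds to the over-correcting regime of this model.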
Directory of Open Access Journals (Sweden)
Benjamin H. Letcher
2016-02-01
Water temperature is a primary driver of stream ecosystems and commonly forms the basis of stream classifications. Robust models of stream temperature are critical as the climate changes, but estimating daily stream temperature poses several important challenges. We developed a statistical model that accounts for many challenges that can make stream temperature estimation difficult. Our model identifies the yearly period when air and water temperature are synchronized, accommodates hysteresis, incorporates time lags, deals with missing data and autocorrelation, and can include external drivers. In a small stream network, the model performed well (RMSE = 0.59 °C), identified a clear warming trend (0.63 °C decade−1), and found a widening of the synchronized period (29 d decade−1). We also carefully evaluated how missing data influenced predictions. Missing data within a year had a small effect on performance (∼0.05% average drop in RMSE with 10% fewer days with data). Missing all data for a year decreased performance (∼0.6 °C jump in RMSE), but this decrease was moderated when data were available from other streams in the network.
Adaptive non-collinear autocorrelation of few-cycle pulses with an angular tunable bi-mirror
Energy Technology Data Exchange (ETDEWEB)
Treffer, A., E-mail: treffer@mbi-berlin.de; Bock, M.; König, S.; Grunwald, R. [Max Born Institute for Nonlinear Optics and Short-Pulse Spectroscopy, Max Born Strasse 2A, D-12489 Berlin (Germany); Brunne, J.; Wallrabe, U. [Laboratory for Microactuators, Department of Microsystems Engineering, IMTEK, University of Freiburg, Georges-Koehler-Allee 102, Freiburg 79110 (Germany)
2016-02-01
Adaptive autocorrelation with an angular tunable micro-electro-mechanical system is reported. A piezo-actuated Fresnel bi-mirror structure was applied to measure the second order autocorrelation of near-infrared few-cycle laser pulses in a non-collinear setup at tunable superposition angles. Because of enabling measurements with variable scaling and minimizing the influence of distortions by adaptive self-reconstruction, the approach extends the capability of autocorrelators. Flexible scaling and robustness against localized amplitude obscurations are demonstrated. The adaptive reconstruction of temporal frequency information by the Fourier analysis of autocorrelation data is shown. Experimental results and numerical simulations of the beam propagation and interference are compared for variable angles.
Understanding Statistics - Cancer Statistics
Annual reports of U.S. cancer statistics including new cases, deaths, trends, survival, prevalence, lifetime risk, and progress toward Healthy People targets, plus statistical summaries for a number of common cancer types.
Damage detection and isolation via autocorrelation: a step toward passive sensing
Chang, Y. S.; Yuan, F. G.
2018-03-01
Passive sensing techniques may eliminate the need to expend power on actuators and thus provide a means of developing a compact and simple structural health monitoring system. More importantly, they may provide a solution for monitoring aircraft subjected to environmental loading from air flow during operation. In this paper, a non-contact auto-correlation based technique is exploited as a feasibility study for passive sensing applications to detect damage and isolate the damage location. Its theoretical basis bears some resemblance to reconstructing Green's function from a diffusive wavefield through cross-correlation. Localized high-pressure air from an air compressor is randomly and continuously applied to one side of the aluminum panels through an air blow gun. A laser Doppler vibrometer (LDV) was used to scan a 90 mm × 90 mm area to create a 6 × 6 2D array of signals from the opposite side of the panels. The scanned signals were auto-correlated to reconstruct a "self-impulse response" (or Green's function). Stably reconstructing an accurate Green's function requires long sensing times. For a 609.6 mm × 609.6 mm flat aluminum panel, a sensing time of roughly four seconds is sufficient to establish a converged Green's function through correlation. For the integrally stiffened aluminum panel, the geometrical features of the panel expedite the formation of the diffusive wavefield and thus shorten the sensing times. The damage is simulated by gluing a magnet onto the panels. Reconstructed Green's functions (RGFs) are used for damage detection and damage isolation based on an imaging condition with the mean square deviation of the RGFs from the pristine and the damaged structure, and the results are shown in color maps. The auto-correlation based technique is shown to consistently detect the simulated damage and to image and isolate the damage in the structure subjected to high-pressure air excitation. This technique may be transformed into
International Nuclear Information System (INIS)
Roy, M.D.; Nag, B.R.
1981-01-01
A method has been developed for determining the auto-correlation functions of the fluctuations in the transverse and the parallel components of hot-carrier velocity in a semiconductor by Monte Carlo simulation. The functions for electrons in InSb are determined by this method for applied electric fields of 50 V/cm, 75 V/cm, and 100 V/cm. With increasing time interval the transverse auto-correlation function falls nearly exponentially to zero, but the parallel function falls sharply to a negative peak, then rises to positive values and finally becomes zero. The interval beyond which the auto-correlation function is zero and the correlation time are also evaluated. The correlation time is found to be approximately 1.6 times the relaxation time calculated from the chord mobility. The effect of the flight sampling time on the value of the variance of the displacement is investigated in terms of the low-frequency diffusion constants determined from the variation of the correlation functions. It is found that the diffusion constants become independent of the sampling time if it is of the order of one hundred times the relaxation time. The frequency-dependent diffusion constants are calculated from the correlation functions. The transverse diffusion constant falls monotonically with frequency for all the field strengths studied. The parallel diffusion constant shows similar behaviour for the lower fields (50 V/cm and 75 V/cm) but has a peak at about 44 GHz for the field of 100 V/cm. (orig.)
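The link between the velocity autocorrelation function and the diffusion constant used here is the Green-Kubo relation D equal to the time integral of the velocity ACF. A sketch with a Langevin (Ornstein-Uhlenbeck) velocity process standing in for the Monte Carlo hot-carrier dynamics (all parameters are assumptions for the demo):

```python
import numpy as np

rng = np.random.default_rng(2)

# Ornstein-Uhlenbeck velocity: ACF decays as exp(-t/tau), <v^2> = 1
tau, dt, n = 1.0, 0.01, 200_000
v = np.empty(n)
v[0] = 0.0
for t in range(1, n):
    v[t] = v[t - 1] * (1 - dt / tau) + np.sqrt(2 * dt / tau) * rng.normal()

# Velocity autocorrelation function out to five correlation times
max_lag = int(5 * tau / dt)
v0 = v - v.mean()
acf = np.array([np.mean(v0[:n - k] * v0[k:]) for k in range(max_lag)])

# Green-Kubo: the diffusion constant is the time integral of the ACF
D = np.trapz(acf, dx=dt)     # theory: <v^2> * tau = 1.0
```

For the exponential ACF of this process the integral reduces to the variance times the correlation time, which is the analogue of the chord-mobility relaxation-time estimate in the abstract.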
Directory of Open Access Journals (Sweden)
Ximing Zhang
2017-11-01
In coal seam gas (CSG) wells, water is periodically removed from the wellbore in order to keep the bottom-hole flowing pressure at low levels, facilitating the desorption of methane gas from the coal bed. In order to calculate gas flow rate and further optimize well performance, it is necessary to accurately monitor the liquid level in real time. This paper presents a novel method based on autocorrelation function (ACF) analysis for determining the liquid level in CSG wells under intense noise conditions. The method involves the calculation of the acoustic travel time in the annulus and processing the autocorrelation signal in order to extract the weak echo under high background noise. In contrast to previous works, the non-linear dependence of the acoustic velocity on temperature and pressure is taken into account. To locate the liquid level of a coal seam gas well, the travel time is computed iteratively with the non-linear velocity model. Afterwards, the proposed method is validated by laboratory experiments developed for liquid level detection under two scenarios, representing the combination of low pressure, weak signal, and intense noise generated by gas flow and leakage. Using an evaluation indicator called the Crest Factor, the results show the superiority of the ACF-based method compared to Fourier filtering (FFT). In the two scenarios, the maximal measurement error of the proposed method was 0.34% and 0.50%, respectively. The latent periodic characteristic of the reflected signal can be extracted by the ACF-based method even when the noise is larger than 1.42 Pa, which is impossible for FFT-based de-noising. A case study focused on a specific CSG well is presented to illustrate the feasibility of the proposed approach, and also to demonstrate that signal processing with autocorrelation analysis can improve the sensitivity of the detection system.
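A toy version of the ACF echo-extraction step is sketched below, with a constant acoustic velocity instead of the paper's temperature- and pressure-dependent model, and with invented sampling parameters. The periodic echo is buried in broadband noise, yet its round-trip lag appears as a clear ACF peak; dividing the lag by the sampling rate gives the travel time.

```python
import numpy as np

rng = np.random.default_rng(3)

fs = 1000.0        # sampling rate in Hz (assumed)
period = 250       # round-trip echo lag in samples -> 0.25 s travel time
n = 100_000

# A weak periodic echo train buried in broadband noise
pulse = np.exp(-0.5 * ((np.arange(40) - 20) / 5.0) ** 2)
train = np.zeros(n)
train[::period] = 1.0
echo = np.convolve(train, pulse, mode='same')
sig = 0.8 * echo + rng.normal(size=n)      # echo at the noise floor

# Autocorrelation via FFT with zero padding (linear, not circular)
spec = np.fft.rfft(sig - sig.mean(), 2 * n)
acf = np.fft.irfft(spec * np.conj(spec))[:n] / n

# The hidden periodicity shows up as an ACF peak at the round-trip lag
search = np.arange(50, 375)                # skip the zero-lag peak
lag_est = int(search[np.argmax(acf[search])])
travel_time = lag_est / fs                 # seconds
```

In the paper the velocity varies along the annulus, so this lag-to-depth conversion is done iteratively rather than with a single division.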
New autocorrelation technique for the IR FEL optical pulse width measurements
Energy Technology Data Exchange (ETDEWEB)
Amirmadhi, F.; Brau, K.A.; Becker, C. [Vanderbilt Univ., Nashville, TN (United States)] [and others
1995-12-31
We have developed a new technique for the autocorrelation measurement of optical pulse width at the Vanderbilt University FEL Center. This method is based on the nonlinear absorption and transmission characteristics of semiconductors such as Ge, Te and InAs, suitable for the wavelength range from 2 to over 6 microns. This approach, aside from being simple and low-cost, removes the phase-matching condition that is generally required for the standard frequency-doubling technique and covers a greater wavelength range per nonlinear material. In this paper we describe the apparatus, explain the principal mechanism involved, and compare data acquired with both frequency doubling and two-photon absorption.
Autcha Araveeporn
2013-01-01
This paper compares a Least-Squared Random Coefficient Autoregressive (RCA) model with a Least-Squared RCA model based on Autocorrelated Errors (RCA-AR). We looked at only the first order models, denoted RCA(1) and RCA(1)-AR(1). The efficiency of the Least-Squared method was checked by applying the models to Brownian motion and Wiener process, and the efficiency followed closely the asymptotic properties of a normal distribution. In a simulation study, we compared the performance of RCA(1) an...
Sire, Clément
2004-09-24
We study the autocorrelation function of a conserved spin system following a quench at the critical temperature. Defining the correlation length L(t) ∼ t^(1/z), we find that for times t′ and t satisfying L(t′) ≪ L(t) …; in the L(t)/L(t′) → ∞ limit, we show that λ′_c = d + 2 and φ = z/2. We give a heuristic argument suggesting that this result is, in fact, valid for any dimension d and spin vector dimension n. We present numerical simulations for the conserved Ising model in d=1 and d=2, which are fully consistent with the present theory.
Energy Technology Data Exchange (ETDEWEB)
De Pretto, Lucas R., E-mail: lucas.de.pretto@usp.br; Nogueira, Gesse E. C.; Freitas, Anderson Z. [Instituto de Pesquisas Energéticas e Nucleares, IPEN–CNEN/SP, Avenida Lineu Prestes, 2242, 05508-000 São Paulo (Brazil)
2016-04-28
Functional modalities of Optical Coherence Tomography (OCT) based on speckle analysis are emerging in the literature. We propose a simple approach to the autocorrelation of OCT signal to enable volumetric flow rate differentiation, based on decorrelation time. Our results show that this technique could distinguish flows separated by 3 μl/min, limited by the acquisition speed of the system. We further perform a B-scan of gradient flow inside a microchannel, enabling the visualization of the drag effect on the walls.
Transforming the autocorrelation function of a time series to detect land cover change
CSIR Research Space (South Africa)
Salmon
2015-07-01
International Geoscience and Remote Sensing Symposium (IEEE IGARSS), 10-15 July 2016, Beijing. Salmon, B.P., Kleynhans, W., Olivier, J.C. and Schwegmann, C.P. ABSTRACT: Regional…
Sign reversals of the output autocorrelation function for the stochastic Bernoulli-Verhulst equation
Energy Technology Data Exchange (ETDEWEB)
Lumi, N., E-mail: Neeme.Lumi@tlu.ee; Mankin, R., E-mail: Romi.Mankin@tlu.ee [Institute of Mathematics and Natural Sciences, Tallinn University, 29 Narva Road, 10120 Tallinn (Estonia)
2015-10-28
We consider a stochastic Bernoulli-Verhulst equation as a model for population growth processes. The effect of a fluctuating environment on the carrying capacity of a population is modeled as colored dichotomous noise. Relying on the composite master equation, an explicit expression for the stationary autocorrelation function (ACF) of population sizes is found. On the basis of this expression, a nonmonotonic decay of the ACF with increasing lag-time is shown. Moreover, in a certain regime of the noise parameters the ACF demonstrates anticorrelation as well as related sign reversals at some values of the lag-time. The conditions for the appearance of this highly unexpected effect are also discussed.
Statistical methods for segmentation and classification of images
DEFF Research Database (Denmark)
Rosholm, Anders
1997-01-01
The central matter of the present thesis is Bayesian statistical inference applied to classification of images. An initial review of Markov Random Fields relates to the modeling aspect of the indicated main subject. In that connection, emphasis is put on the relatively unknown sub-class of Pickard … with a Pickard Random Field modeling of a considered (categorical) image phenomenon. An extension of the fast PRF based classification technique is presented. The modification introduces auto-correlation into the model of an involved noise process, which previously has been assumed independent. The suitability … of the extended model is documented by tests on controlled image data containing auto-correlated noise.
BetaBit: A fast generator of autocorrelated binary processes for geophysical research
Serinaldi, Francesco; Lombardo, Federico
2017-05-01
We introduce a fast and efficient non-iterative algorithm, called BetaBit, to simulate autocorrelated binary processes describing the occurrence of natural hazards, system failures, and other physical and geophysical phenomena characterized by persistence, temporal clustering, and low rate of occurrence. BetaBit overcomes the simulation constraints posed by the discrete nature of the marginal distributions of binary processes by using the link existing between the correlation coefficients of this process and those of the standard Gaussian processes. The performance of BetaBit is tested on binary signals with power-law and exponentially decaying autocorrelation functions (ACFs) corresponding to Hurst-Kolmogorov and Markov processes, respectively. An application to real-world sequences describing rainfall intermittency and the occurrence of strong positive phases of the North Atlantic Oscillation (NAO) index shows that BetaBit can also simulate surrogate data preserving the empirical ACF as well as signals with autoregressive moving average (ARMA) dependence structures. Extensions to cyclo-stationary processes accounting for seasonal fluctuations are also discussed.
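BetaBit itself is non-iterative; a simpler iterative way to see the underlying link it exploits, between a latent Gaussian process and an autocorrelated binary process, is to threshold a Gaussian AR(1) series at the (1 − p) quantile. The parameters below are illustrative assumptions, and the resulting binary ACF differs from the latent Gaussian one, which is exactly the mapping BetaBit handles analytically.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(4)

# Thresholded Gaussian AR(1): an iterative stand-in for BetaBit-style
# generation of a persistent, low-rate binary occurrence process
p, phi, n = 0.1, 0.8, 200_000

z = np.empty(n)
z[0] = rng.normal()
eps = rng.normal(size=n) * np.sqrt(1 - phi ** 2)   # keeps unit variance
for t in range(1, n):
    z[t] = phi * z[t - 1] + eps[t]

def norm_ppf(q):
    # standard normal quantile via bisection (stdlib-only)
    lo, hi = -10.0, 10.0
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if 0.5 * (1.0 + erf(mid / sqrt(2.0))) < q:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Occurrences where the latent Gaussian exceeds the (1 - p) quantile
x = (z > norm_ppf(1.0 - p)).astype(float)

rate = x.mean()                     # ~ p by construction
x0 = x - rate
rho1 = float(np.mean(x0[:-1] * x0[1:]) / np.var(x))   # clustered occurrences
```

The binary series preserves the marginal rate and inherits strong positive lag-1 dependence, i.e. temporal clustering of occurrences.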
Broadband short pulse measurement by autocorrelation with a sum-frequency generation set-up
International Nuclear Information System (INIS)
Glotin, F.; Jaroszynski, D.; Marcouille, O.
1995-01-01
Previous spectral and laser pulse length measurements carried out on the CLIO FEL at wavelength λ=8.5 μm suggested that very short light pulses could be generated, about 500 fs wide (FWHM). For these measurements a Michelson interferometer with a Te crystal as a non-linear detector was used as a second-order autocorrelation device. More recent measurements in similar conditions have confirmed that the laser pulses observed are indeed single: they are not followed by other pulses distant by the slippage length Nλ. As the single micropulse length is likely to depend on the slippage, more measurements at different wavelengths would be useful. This is not directly possible with our actual interferometer set-up, based on a phase-matched non-linear crystal. However, we can use the broadband non-linear medium provided by one of our users' experiments: sum-frequency generation over surfaces. With such an autocorrelation set-up, interference fringes are no longer visible, but this is largely compensated by the frequency range provided. First tests at 8 μm have already been performed to validate the technique, leading to results similar to those obtained with our previous Michelson set-up.
On the Decay Ratio Determination in BWR Stability Analysis by Auto-Correlation Function Techniques
International Nuclear Information System (INIS)
Behringer, K.; Hennig, D.
2002-11-01
A novel auto-correlation function (ACF) method has been investigated for determining the oscillation frequency and the decay ratio in BWR stability analyses. The neutron signals are band-pass filtered to separate the oscillation peak in the power spectral density (PSD) from background. Two linear second-order oscillation models are considered. These models, corrected for signal filtering and including a background term under the peak in the PSD, are then least-squares fitted to the ACF of the previously filtered neutron signal, in order to determine the oscillation frequency and the decay ratio. Our method uses fast Fourier transform techniques with signal segmentation for filtering and ACF estimation. Gliding 'short-term' ACF estimates on a record allow the evaluation of uncertainties. Numerical results are given which have been obtained from neutron data of the recent Forsmark I and Forsmark II NEA benchmark project. Our results are compared with those obtained by other participants in the benchmark project. The present PSI report is an extended version of the publication K. Behringer, D. Hennig 'A novel auto-correlation function method for the determination of the decay ratio in BWR stability studies' (Behringer, Hennig, 2002)
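The quantity being fitted can be illustrated on a surrogate signal: for a lightly damped second-order (AR(2)) process, the ACF is a damped cosine, and the decay ratio is the ratio of consecutive ACF maxima. This sketch is not the report's least-squares fitting procedure or reactor data; the pole radius and oscillation period are invented, and theory for this AR(2) gives DR close to r**(2*pi/theta), about 0.44.

```python
import numpy as np

rng = np.random.default_rng(5)

# AR(2) surrogate for a band-passed neutron signal: damped oscillation
# with pole radius r and angle theta (period ~ 20 samples)
r, theta = 0.96, 2 * np.pi / 20
a1, a2 = 2 * r * np.cos(theta), -r ** 2
n = 200_000
x = np.zeros(n)
e = rng.normal(size=n)
for t in range(2, n):
    x[t] = a1 * x[t - 1] + a2 * x[t - 2] + e[t]

# Normalized ACF of the signal
x0 = x - x.mean()
acf = np.array([np.mean(x0[:n - k] * x0[k:]) for k in range(50)])
acf /= acf[0]

# Decay ratio = ratio of consecutive ACF maxima (first and second peaks)
k1 = 10 + int(np.argmax(acf[10:30]))     # first maximum ~ one period
k2 = 30 + int(np.argmax(acf[30:50]))     # second maximum ~ two periods
freq = 1.0 / k1                          # oscillation frequency, cycles/sample
dr = float(acf[k2] / acf[k1])
```

The report instead fits parametric second-order oscillation models (with filtering and background corrections) to the ACF, which is more robust than reading off peak ratios directly.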
A Comparison of Weights Matrices on Computation of Dengue Spatial Autocorrelation
Suryowati, K.; Bekti, R. D.; Faradila, A.
2018-04-01
Spatial autocorrelation is one of the spatial analysis methods used to identify patterns of relationship or correlation between locations. This method is very important for obtaining information on the dispersal patterns characteristic of a region and the linkages between locations. In this study, it is applied to the incidence of Dengue Hemorrhagic Fever (DHF) in 17 sub-districts in Sleman, Daerah Istimewa Yogyakarta Province. The links among locations are indicated by a spatial weight matrix, which describes the neighbourhood structure and reflects the spatial influence. According to the type of spatial data, weighting matrices can be divided into two types: point type (distance) and neighbourhood area (contiguity). The choice of weighting function is one determinant of the results of the spatial analysis. This study uses queen contiguity weights based on first-order neighbours, queen contiguity weights based on second-order neighbours, and inverse distance weights. Queen contiguity first order and inverse distance weights show that there is significant spatial autocorrelation in DHF, but queen contiguity second order does not. Queen contiguity first and second order produce 68 and 86 entries in the neighbour list, respectively.
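For reference, Moran's I with a first-order queen-contiguity weight matrix can be computed as below. The 3×3 raster of incidence rates is a small synthetic example, not the Sleman DHF data.

```python
import numpy as np

# Moran's I with first-order queen-contiguity weights on a synthetic raster
rates = np.array([[1.0, 2.0, 2.5],
                  [1.5, 2.0, 3.0],
                  [0.5, 1.0, 1.5]])

nrow, ncol = rates.shape
cells = [(i, j) for i in range(nrow) for j in range(ncol)]
n = len(cells)

# Queen contiguity: first-order neighbours share an edge or a corner
W = np.zeros((n, n))
for a, (i, j) in enumerate(cells):
    for b, (k, l) in enumerate(cells):
        if a != b and abs(i - k) <= 1 and abs(j - l) <= 1:
            W[a, b] = 1.0

z = rates.flatten() - rates.mean()
I = float((n / W.sum()) * (z @ W @ z) / (z @ z))   # > 0: positive clustering
```

Significance is then judged against the null distribution of I (analytically or by permutation), which is where the choice of weight matrix changes the conclusion.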
Hotspot detection using image pattern recognition based on higher-order local auto-correlation
Maeda, Shimon; Matsunawa, Tetsuaki; Ogawa, Ryuji; Ichikawa, Hirotaka; Takahata, Kazuhiro; Miyairi, Masahiro; Kotani, Toshiya; Nojima, Shigeki; Tanaka, Satoshi; Nakagawa, Kei; Saito, Tamaki; Mimotogi, Shoji; Inoue, Soichi; Nosato, Hirokazu; Sakanashi, Hidenori; Kobayashi, Takumi; Murakawa, Masahiro; Higuchi, Tetsuya; Takahashi, Eiichi; Otsu, Nobuyuki
2011-04-01
Below the 40 nm design node, systematic variation due to lithography must be taken into consideration during the early stage of design. So far, litho-aware design using lithography simulation models has been widely applied to assure that designs are printed on silicon without any error. However, the lithography simulation approach is very time consuming, and under time-to-market pressure, repetitive redesign by this approach may result in missing the market window. This paper proposes a fast hotspot detection support method using flexible and intelligent image pattern recognition based on Higher-Order Local Autocorrelation (HLAC). Our method learns the geometrical properties of defect-free design data as normal patterns, and automatically detects design patterns with hotspots in the test data as abnormal patterns. The HLAC method can extract features from the graphic image of a design pattern, and the computational cost of the extraction is constant regardless of the number of design pattern polygons. This approach can reduce turnaround time (TAT) dramatically on a single CPU compared with the conventional simulation-based approach, and with distributed processing it has proven to deliver linear scalability with each additional CPU.
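A reduced sketch of HLAC feature extraction is shown below: the zeroth-order mask plus the four distinct first-order 3×3 displacements. The full HLAC set for binary images uses 25 masks up to second order, and this is not the authors' implementation; the line patterns are invented to show that the features separate layouts with different local geometry.

```python
import numpy as np

def shifted_product_sum(img, di, dj):
    # sum over r of f(r) * f(r + (di, dj)), without wrap-around
    h, w = img.shape
    i0, i1 = max(0, -di), min(h, h - di)
    j0, j1 = max(0, -dj), min(w, w - dj)
    return float((img[i0:i1, j0:j1] * img[i0 + di:i1 + di, j0 + dj:j1 + dj]).sum())

def hlac_features(img):
    # zeroth order plus the four distinct first-order 3x3 displacements
    img = img.astype(float)
    return np.array([img.sum()] +
                    [shifted_product_sum(img, di, dj)
                     for di, dj in ((0, 1), (1, 0), (1, 1), (1, -1))])

# Feature vectors separate layout patterns with different local geometry
horiz = np.zeros((8, 8)); horiz[::2, :] = 1.0   # horizontal lines
vert = horiz.T                                  # vertical lines
fh, fv = hlac_features(horiz), hlac_features(vert)
```

The cost of each feature is one pass over the rasterized image, which is why extraction time does not depend on the polygon count.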
Directory of Open Access Journals (Sweden)
Nina L. Timofeeva
2014-01-01
The article presents the methodological and technical bases for the creation of regression models that adequately reflect reality. The focus is on methods of removing residual autocorrelation in models. Algorithms eliminating heteroscedasticity and autocorrelation of the regression model residuals are given: the reweighted least squares method and the Cochrane-Orcutt method. A model of "pure" regression is built, as well as a standardized form of the regression equation, used to compare the effects on the dependent variable of the different explanatory variables when the latter are expressed in different units. A scheme of techniques for the abatement of heteroskedasticity and autocorrelation in the creation of regression models specific to the social and cultural sphere is developed.
Fujii, Takahide; Nakano, Masanao; Yamashita, Ken; Konishi, Toshihiro; Izumi, Shintaro; Kawaguchi, Hiroshi; Yoshimoto, Masahiko
2013-01-01
This paper describes a robust method of Instantaneous Heart Rate (IHR) and R-peak detection from noisy electrocardiogram (ECG) signals. Generally, the IHR is calculated from the R-wave interval. Then, the R-waves are extracted from the ECG using a threshold. However, in wearable bio-signal monitoring systems, noise increases the incidence of misdetection and false detection of R-peaks. To prevent incorrect detection, we introduce a short-term autocorrelation (STAC) technique and a small-window autocorrelation (SWAC) technique, which leverages the similarity of QRS complex waveforms. Simulation results show that the proposed method improves the noise tolerance of R-peak detection.
ISAR Imaging of Ship Targets Based on an Integrated Cubic Phase Bilinear Autocorrelation Function
Directory of Open Access Journals (Sweden)
Jibin Zheng
2017-03-01
Full Text Available For inverse synthetic aperture radar (ISAR) imaging of a ship target moving with ocean waves, the image constructed with the standard range-Doppler (RD) technique is blurred, and the range-instantaneous-Doppler (RID) technique must be used to improve image quality. In this paper, azimuth echoes in a range cell of the ship target are modeled as noisy multicomponent cubic phase signals (CPSs) after motion compensation, and an RID ISAR imaging algorithm is proposed based on the integrated cubic phase bilinear autocorrelation function (ICPBAF). The ICPBAF is bilinear and based on two-dimensionally coherent energy accumulation. Compared to five other estimation algorithms, the ICPBAF achieves higher cross-term suppression and anti-noise performance with a reasonable computational cost. Through simulations and analyses with a synthetic model and real radar data, we verify the effectiveness of the ICPBAF and the corresponding RID ISAR imaging algorithm.
Cut contribution to momentum autocorrelation function of an impurity in a classical diatomic chain
Yu, Ming B.
2018-02-01
A classical diatomic chain with a mass impurity is studied using the recurrence relations method. The momentum autocorrelation function of the impurity is a sum of contributions from two pairs of resonant poles and three branch cuts. The former yield cosine functions; the latter yield the acoustic and optical branches. By use of the convolution theorem, analytical expressions for the acoustic and optical branches are derived as even-order Bessel function expansions. The expansion coefficients are integrals of elliptic functions along the real axis for the acoustic branch and along a contour parallel to the imaginary axis for the optical branch, respectively. The calculation of the optical branch requires the integral $\int_0^{\phi} d\theta / \sqrt{(1 - r_1^2 \sin^2\theta)(1 - r_2^2 \sin^2\theta)} = i\,g\,\mathrm{sn}^{-1}(\sin\phi)$, with $r_2^2 > r_1^2 > 1$ and $g$ a constant.
Cheng, Yezeng; Larin, Kirill V.
2006-12-01
Fingerprint recognition is one of the most widely used biometric methods. It relies on the surface topography of a finger and is therefore potentially vulnerable to spoofing by artificial dummies with embedded fingerprints. In this study, we applied optical coherence tomography (OCT) to distinguish artificial materials commonly used to spoof fingerprint scanning systems from real skin. Several artificial fingerprint dummies made from household cement and liquid silicone rubber were prepared and tested using a commercial fingerprint reader and an OCT system. While the artificial fingerprints easily spoofed the commercial fingerprint reader, OCT images revealed their presence at all times. We also demonstrated that an autocorrelation analysis of the OCT images could potentially be used in automatic recognition systems.
Search for neutrino point sources with an all-sky autocorrelation analysis in IceCube
Energy Technology Data Exchange (ETDEWEB)
Turcati, Andrea; Bernhard, Anna; Coenders, Stefan [TU, Munich (Germany); Collaboration: IceCube-Collaboration
2016-07-01
The IceCube Neutrino Observatory is a cubic-kilometre-scale neutrino telescope located in the Antarctic ice. Its full-sky field of view gives unique opportunities to study neutrino emission from the Galactic and extragalactic sky. Recently, IceCube found the first signal of astrophysical neutrinos with energies up to the PeV scale, but the origin of these particles remains unresolved. Given the observed flux, the absence of bright point sources can be explained by the presence of numerous weak sources. This scenario can be tested using autocorrelation methods. We present the sensitivities and discovery potentials of a two-point angular correlation analysis performed on seven years of IceCube data, taken between 2008 and 2015. The test is applied to the northern and southern skies separately, using neutrino energy information to improve the effectiveness of the method.
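A two-point angular autocorrelation test of this kind can be sketched as follows. This is a toy version, not the IceCube analysis: observed pair-separation counts are compared with the expectation from right-ascension-scrambled skies, a common trick that preserves a detector's declination-dependent acceptance. Function names are illustrative.

```python
import numpy as np

def pair_angles(ra, dec):
    """Angular separations (radians) between all unordered event pairs."""
    v = np.stack([np.cos(dec) * np.cos(ra),
                  np.cos(dec) * np.sin(ra),
                  np.sin(dec)], axis=1)            # unit vectors on the sphere
    cosang = np.clip(v @ v.T, -1.0, 1.0)
    iu = np.triu_indices(len(ra), k=1)             # each pair counted once
    return np.arccos(cosang[iu])

def pair_count_excess(ra, dec, bins, rng, n_scramble=100):
    """Observed pair counts per angular bin vs. the isotropic expectation
    estimated from RA-scrambled pseudo-experiments."""
    obs, _ = np.histogram(pair_angles(ra, dec), bins=bins)
    bg = np.zeros((n_scramble, len(bins) - 1))
    for i in range(n_scramble):
        sra = rng.uniform(0.0, 2.0 * np.pi, size=len(ra))
        bg[i], _ = np.histogram(pair_angles(sra, dec), bins=bins)
    return obs, bg.mean(axis=0), bg.std(axis=0)
```

An excess of observed pairs over the scrambled mean at small separations, in units of the scrambled standard deviation, then signals clustering from unresolved weak sources.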
Blumenthal, George R.; Johnston, Kathryn V.
1994-01-01
The Sachs-Wolfe effect is known to produce large-angular-scale fluctuations in the cosmic microwave background radiation (CMBR) due to gravitational potential fluctuations. We show how the angular correlation function of the CMBR can be expressed explicitly in terms of the mass autocorrelation function xi(r) in the universe. We derive analytic expressions for the angular correlation function and its multipole moments in terms of integrals over xi(r) or its second moment, J(sub 3)(r), which does not need to satisfy the sort of integral constraint that xi(r) must. We derive similar expressions for the bulk flow velocity in terms of xi and J(sub 3). One interesting result that emerges directly from this analysis is that, for all angles theta, there is a substantial contribution to the correlation function from a wide range of distances r, and that the radial shape of this contribution does not vary greatly with angle.
Nodule detection methods using autocorrelation features on 3D chest CT scans
International Nuclear Information System (INIS)
Hara, T.; Zhou, X.; Okura, S.; Fujita, H.; Kiryu, T.; Hoshi, H.
2007-01-01
Lung cancer screening using low-dose X-ray CT has become an accepted examination for detecting cancers at an early stage. We have been developing an automated detection scheme for lung nodules on CT scans using second-order autocorrelation features; initial performance for small nodules (< 10 mm) shows a high true-positive rate with fewer than four false-positive marks per case. In this study, an open database of lung images, LIDC (Lung Image Database Consortium), was employed to evaluate our detection scheme as a consistency test. The detection performance for solid, solitary nodules in the first LIDC data set released by the consortium was an 83% (10/12) true-positive rate with 3.3 false-positive marks per case. (orig.)
Positron-electron autocorrelation function study of E-center in phosphorus-doped silicon
International Nuclear Information System (INIS)
Ho, K.F.; Beling, C.D.; Fung, S.; Biasini, M.; Ferro, G.; Gong, M.
2004-01-01
Two-dimensional Fourier-transformed angular correlation of annihilation radiation (2D-FT-ACAR) spectra have been taken for 10^19 cm^-3 phosphorus-doped Si, both as grown and after irradiation with 1.8 MeV e^- fluences of 2 x 10^18 cm^-2. In the spectra of the irradiated samples, the zero-crossing points are observed to be displaced outwards from the Bravais lattice positions. It is suggested that this results from positrons annihilating with electrons in localized orbitals at the defect site. An attempt is made to extract only the component of the defect's positron-electron autocorrelation function that relates to the localized defect orbitals. It is argued that such an extracted real-space function may provide a suitable means of mapping localized defect orbitals. (orig.)
Autocorrelation spectra of an air-fluidized granular system measured by NMR
Lasic, S.; Stepisnik, J.; Mohoric, A.; Sersa, I.; Planinsic, G.
2006-09-01
A novel insight into the dynamics of a fluidized granular system is given by a nuclear magnetic resonance method that yields a spin-echo attenuation proportional to the spectrum of the grain positional fluctuation. Measurements of air-fluidized oil-filled spheres and mustard seeds at different degrees of fluidization and grain volume fractions provide a velocity autocorrelation that differs from the commonly anticipated exponential Enskog decay. An empirical formula, which corresponds to a model of grain caging at collisions with adjacent beads, fits the experimental data well. Its parameters are the characteristic collision time, the free path between collisions, and the cage-breaking rate, or diffusion-like constant, which decreases with increasing grain volume fraction. Mean-squared displacements calculated from the correlation spectrum clearly show transitions from ballistic, through sub-diffusive, and into diffusive regimes of grain motion.
Ibey, Bennett; Subramanian, Hariharan; Ericson, Nance; Xu, Weijian; Wilson, Mark; Cote, Gerard L.
2005-03-01
A blood perfusion and oxygenation sensor has been developed for in situ monitoring of transplanted organs. In processing in situ data, motion artifacts due to increased perfusion can create invalid oxygen saturation values. To remove the unwanted artifacts from the pulsatile signal, adaptive filtering was employed using a third wavelength source centered at 810 nm as a reference signal. The 810 nm source resides approximately at the isosbestic point of the hemoglobin absorption curve, where the absorbance of light is nearly equal for oxygenated and deoxygenated hemoglobin. Using an autocorrelation-based algorithm, oxygen saturation values can be obtained without the need for large sampling data sets, allowing near real-time processing. This technique has been shown to be more reliable than traditional techniques and to adequately improve the measurement of oxygenation values in varying perfusion states.
Autocorrelation Study of Solar Wind Plasma and IMF Properties as Measured by the MAVEN Spacecraft
Marquette, Melissa L.; Lillis, Robert J.; Halekas, J. S.; Luhmann, J. G.; Gruesbeck, J. R.; Espley, J. R.
2018-04-01
It has long been a goal of the heliophysics community to understand solar wind variability at heliocentric distances other than 1 AU, especially at ~1.5 AU, both because solar wind stream interactions steepen outside 1 AU and because of the number of missions available there to measure them. In this study, we use 35 months of solar wind and interplanetary magnetic field (IMF) data taken at Mars by the Mars Atmosphere and Volatile EvolutioN (MAVEN) spacecraft to conduct an autocorrelation analysis of the solar wind speed, density, and dynamic pressure (derived from the speed and density), as well as the IMF strength and orientation. We found that the solar wind speed is coherent, that is, has an autocorrelation coefficient above 1/e, over roughly 56 hr, while the density and pressure are coherent over shorter intervals of roughly 25 and 20 hr, respectively; the IMF strength is coherent over time intervals of approximately 20 hr, while the cone and clock angles are considerably less steady but still somewhat coherent up to time lags of roughly 16 hr. We also found that when the speed, density, pressure, or IMF strength is higher than average, the solar wind or IMF becomes uncorrelated more quickly, while when they are below average, they tend to be steadier. This analysis allows us to estimate solar wind plasma and IMF parameters when they are not directly measured, and to approximate the error associated with those estimates.
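The "coherent over N hours" statements above correspond to the first lag at which the sample autocorrelation drops below 1/e. A minimal sketch of that computation (illustrative, not MAVEN pipeline code; function names are assumptions):

```python
import numpy as np

def acf(x):
    """Biased sample autocorrelation function, normalized so acf[0] == 1."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = len(x)
    return np.correlate(x, x, mode="full")[n - 1:] / (n * x.var())

def decorrelation_time(x, dt_hours=1.0):
    """First lag (in hours) at which the ACF falls below 1/e."""
    c = acf(x)
    below = np.where(c < 1.0 / np.e)[0]
    return below[0] * dt_hours if below.size else len(c) * dt_hours
```

Applied to hourly solar wind speed, this returns the coherence interval directly; longer intervals mean the last measured value remains a useful estimate for longer.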
Directory of Open Access Journals (Sweden)
Xuesong FENG, Ph.D Candidate
2009-01-01
Full Text Available It is expected that improvement of transport networks changes the spatial distributions of population-related factors and car ownership, which in turn influence travel demand. To properly reflect this interdependence, an aggregate multinomial logit (A-MNL) model was first applied to represent the spatial distributions of these exogenous variables of the travel demand model, reflecting the influence of transport networks. Next, spatial autocorrelation analysis was introduced into the log-transformed A-MNL model (the SPA-MNL model). The SPA-MNL model was then integrated into the four-step travel demand model with feedback (the 4-STEP model), yielding a newly developed integrated travel demand model named the SPA-STEP model. Using person-trip data collected in Beijing, the performance of the SPA-STEP model was empirically compared with that of the 4-STEP model. The SPA-STEP model proved superior to the 4-STEP model in accuracy, and most of the estimated parameters showed statistically significant differences in values. Moreover, although simulations of the same set of assumed scenarios by the two models consistently suggested the same sustainable path for the future development of Beijing, the environmental sustainability and traffic congestion for these scenarios were generally overestimated by the 4-STEP model compared with the corresponding SPA-STEP analyses. These differences were clearly generated by the introduction of the new modeling step with spatial autocorrelation.
A general statistical test for correlations in a finite-length time series.
Hanson, Jeffery A; Yang, Haw
2008-06-07
The statistical properties of the autocorrelation function of a time series composed of independently and identically distributed stochastic variables have been studied. Analytical expressions for the variance of the autocorrelation function have been derived. It is found that two common ways of calculating the autocorrelation, moving-average and Fourier transform, exhibit different uncertainty characteristics. For periodic time series, the Fourier transform method is preferred because it gives smaller uncertainties that are uniform across all time lags. Based on these analytical results, a statistically robust method is proposed to test for the existence of correlations in a time series. The statistical test is verified by computer simulations, and an application to single-molecule fluorescence spectroscopy is discussed.
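The two estimators compared in the abstract can be sketched as follows (illustrative implementations; the moving-average version averages over fewer pairs at large lags, while the Fourier version wraps the series circularly so every lag averages the same number of pairs):

```python
import numpy as np

def acf_moving_average(x):
    """Direct ('moving-average') estimator: lag-t products averaged over the
    n - t available pairs, so uncertainty grows at large lags."""
    x = x - x.mean()
    n = len(x)
    return np.array([(x[:n - t] @ x[t:]) / (n - t) for t in range(n)]) / x.var()

def acf_fourier(x):
    """Fourier-transform (circular) estimator via the Wiener-Khinchin route:
    every lag averages n wrapped pairs, giving uniform uncertainty, which is
    the preferred behavior for periodic series."""
    x = x - x.mean()
    n = len(x)
    power = np.abs(np.fft.fft(x)) ** 2
    return np.real(np.fft.ifft(power))[:n] / (n * x.var())
```

For a series containing an exact integer number of periods, both estimators return 1 at zero lag and at one full period.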
Evaluating the coefficients of autocorrelation in a series of annual run-off of the Far East rivers
Energy Technology Data Exchange (ETDEWEB)
Sakharyuk, A V
1981-01-01
The autocorrelation coefficients of annual river run-off series are evaluated using group analysis, based on the distribution law of sample correlation coefficients for time series that follow a Pearson type III distribution.
Barnard, P.L.; Rubin, D.M.; Harney, J.; Mustain, N.
2007-01-01
This extensive field test of an autocorrelation technique for determining grain size from digital images was conducted using a digital bed-sediment camera, or 'beachball' camera. Using 205 sediment samples and >1200 images from a variety of beaches on the west coast of the US, grain size ranging from sand to granules was measured from field samples using both the autocorrelation technique developed by Rubin [Rubin, D.M., 2004. A simple autocorrelation algorithm for determining grain size from digital images of sediment. Journal of Sedimentary Research, 74(1): 160-165.] and traditional methods (i.e. settling tube analysis, sieving, and point counts). To test the accuracy of the digital-image grain size algorithm, we compared results with manual point counts of an extensive image data set in the Santa Barbara littoral cell. Grain sizes calculated using the autocorrelation algorithm were highly correlated with the point counts of the same images (r² = 0.93; n = 79) and had an error of only 1%. Comparisons of calculated grain sizes and grain sizes measured from grab samples demonstrated that the autocorrelation technique works well on high-energy dissipative beaches with well-sorted sediment such as in the Pacific Northwest (r² ≈ 0.92; n = 115). On less dissipative, more poorly sorted beaches such as Ocean Beach in San Francisco, results were not as good (r² ≈ 0.70; n = 67; within 3% accuracy). Because the algorithm works well compared with point counts of the same image, the poorer correlation with grab samples must be a result of actual spatial and vertical variability of sediment in the field; closer agreement between grain size in the images and grain size of grab samples can be achieved by increasing the sampling volume of the images (taking more images, distributed over a volume comparable to that of a grab sample). In all field tests the autocorrelation method was able to predict the mean and median grain size with ≈96% accuracy, which is more than
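The principle behind such image-autocorrelation grain sizing can be sketched as follows. This is a toy illustration, not Rubin's calibrated algorithm: coarser grains keep neighboring pixels correlated over longer offsets, so the offset at which correlation decays is a monotone proxy for grain size, to be calibrated against sieved samples.

```python
import numpy as np

def row_autocorrelation(img, max_offset):
    """Mean autocorrelation of image rows as a function of pixel offset."""
    f = img.astype(float)
    f = (f - f.mean()) / f.std()                 # standardize intensities
    n_cols = f.shape[1]
    return np.array([np.mean(f[:, :n_cols - k] * f[:, k:])
                     for k in range(max_offset + 1)])

def correlation_length(r, thresh=0.5):
    """First offset where the correlation falls below thresh."""
    below = np.where(r < thresh)[0]
    return int(below[0]) if below.size else len(r)
```

On synthetic textures, an image whose "grains" span 8 pixels decorrelates over more offsets than one with pixel-scale noise.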
On the 2nd order autocorrelation of an XUV attosecond pulse train
International Nuclear Information System (INIS)
Tzallas, P.; Benis, E.; Nikolopoulos, L.A.A.; Tsakiris, G.D.; Witte, K.; Charalambidis, P.
2005-01-01
Full text: We present the first direct measurement of sub-fs light bunching, extending well-established fs optical metrology to XUV attosecond (as) pulses. A mean train-pulse duration of 780 as has been extracted through a 2nd-order autocorrelation approach, utilizing a nonlinear effect induced solely by the XUV radiation to be characterized. The approach is based on (i) a bisected spherical mirror XUV wavefront divider used as an autocorrelator and (ii) two-photon ionization of atomic He by a superposition of the 7th to the 15th harmonic of a Ti:sapphire laser. The measured mean temporal width is more than twice its Fourier-transform-limited (FTL) value, in contrast to the as train-pulse durations measured through other approaches, which were found much closer to the FTL values. We have investigated, and discuss here, the origin of this discrepancy. The validity of the 2nd-order AC approach for the broadband XUV radiation of as pulses is assessed through ab initio calculations (solution of the 3D TDSE of He in the presence of the harmonic superposition) modeling the spectral and temporal response of the two-XUV-photon He ionization detector employed. It is found that neither the spectral nor the temporal response affects the measured duration. The mean width of the as train bursts is estimated from the spectral phases of the individual harmonics as they result from the rescattering model, taking into account the spatially modulated temporal width of the radiation due to the spatiotemporal intensity distribution of the driving field during the harmonic generation process. The measured value is found to be in reasonable agreement with the estimated duration. The 2nd-order AC method in itself initiates further XUV-pump-XUV-probe studies of sub-fs-scale dynamics and is highly pertinent to nonlinear experiments using XUV free-electron laser sources.
Risk-based transfer responses to climate change, simulated through autocorrelated stochastic methods
Kirsch, B.; Characklis, G. W.
2009-12-01
Maintaining municipal water supply reliability despite growing demands can be achieved through a variety of mechanisms, including supply strategies such as temporary transfers. However, much of the attention on transfers has focused on market-based transfers in the western United States, largely ignoring the potential for transfers in the eastern U.S. The different legal frameworks of the eastern and western U.S. lead to characteristic differences between their respective transfers. Western transfers tend to be agricultural-to-urban and involve raw, untreated water, often requiring only a change in the location and/or timing of withdrawals. Eastern transfers tend to be contractually established urban-to-urban transfers of treated water, and therefore require infrastructure to move water between utilities. Utilities need tools to evaluate transfer decision rules and the resulting expected future transfer behavior. Given the long planning horizons of utilities, potential changes in hydrologic patterns due to climate change must be considered. In response, this research develops a method for generating a stochastic time series that reproduces the historic autocorrelation and can be adapted to accommodate future climate scenarios. While analogous in operation to an autoregressive model, the method reproduces the seasonal autocorrelation structure rather than assuming the strict stationarity of a standard autoregressive model. Such urban-to-urban transfers are designed to be rare, transient events used primarily during severe drought, and incorporating Monte Carlo techniques allows the development of probability distributions of likely outcomes. This research evaluates a risk-based, urban-to-urban transfer agreement between three utilities in the Triangle region of North Carolina. Two utilities maintain their own surface water supplies in adjoining watersheds and look to obtain transfers via
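A seasonal lag-1 generator of the kind described, with one autocorrelation coefficient per calendar month rather than a single stationary AR coefficient, can be sketched as below. The parameterization is a Thomas-Fiering-style illustration, not the authors' model.

```python
import numpy as np

def periodic_ar1(monthly_mean, monthly_std, monthly_rho, n_years, rng):
    """Standardized flow z follows a lag-1 recursion whose coefficient varies
    by month, preserving seasonal persistence (e.g., droughts carrying over
    between months) while matching each month's mean and spread."""
    z, flows = 0.0, []
    for _ in range(n_years):
        for m in range(12):
            rho = monthly_rho[m]
            z = rho * z + np.sqrt(1.0 - rho ** 2) * rng.normal()
            flows.append(monthly_mean[m] + monthly_std[m] * z)
    return np.array(flows)
```

Wrapping such a generator in Monte Carlo loops then yields probability distributions of transfer frequency and cost under alternative (e.g., climate-shifted) parameter sets.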
International Nuclear Information System (INIS)
Lim, Gyeong Hui
2008-03-01
This book consists of 15 chapters: basic concepts and meaning of statistical thermodynamics; Maxwell-Boltzmann statistics; ensembles; thermodynamic functions and fluctuations; statistical dynamics of independent-particle systems; ideal molecular systems; chemical equilibrium and chemical reaction rates in ideal gas mixtures; classical statistical thermodynamics; the ideal lattice model; lattice statistics and non-ideal lattice models; imperfect gas theory and liquids; the theory of solutions; statistical thermodynamics of interfaces; statistical thermodynamics of macromolecular systems; and quantum statistics.
Auto-correlation based intelligent technique for complex waveform presentation and measurement
International Nuclear Information System (INIS)
Rana, K P S; Singh, R; Sayann, K S
2009-01-01
Waveform acquisition and presentation form the heart of many measurement systems. In particular, acquisition and presentation of repeating complex signals such as sine sweeps and frequency-modulated signals introduce the challenge of waveform time-period estimation and live waveform presentation. This paper presents an intelligent technique for waveform period estimation of both complex and simple waveforms, based on the normalized autocorrelation method. The proposed technique is demonstrated through extensive LabVIEW-based simulations on several simple and complex waveforms, and its implementation is successfully demonstrated using LabVIEW-based virtual instrumentation. Sine sweep vibration waveforms generated by an electrodynamic shaker system are successfully presented and measured. The proposed method is also suitable for digital storage oscilloscope (DSO) triggering for complex signal acquisition and presentation. This intelligence can be embedded in a DSO, making it an intelligent measurement system catering to a wide variety of waveforms. The technique, simulation results, robustness study, and implementation results are presented in this paper.
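The core of a normalized-autocorrelation period estimator can be sketched in a few lines (an illustrative NumPy version, not the LabVIEW implementation): normalize the autocorrelation and take the dominant peak after the first zero crossing.

```python
import numpy as np

def estimate_period(x, fs):
    """Waveform period (seconds) from the normalized autocorrelation:
    the dominant peak after the first zero crossing. Works for repeating
    complex waveforms, not only pure sinusoids."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = len(x)
    r = np.correlate(x, x, mode="full")[n - 1:]
    r = r / r[0]                                   # normalized: r[0] == 1
    negative = np.where(r < 0)[0]
    if negative.size == 0:
        return None                                # no periodicity detected
    start = negative[0]
    lag = start + int(np.argmax(r[start:n // 2]))  # dominant peak past the crossing
    return lag / fs
```

Skipping to the first zero crossing avoids latching onto the zero-lag peak, which is what makes the same logic usable as a trigger-point detector for complex repeating signals.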
Simultaneous measurement of particle velocity and size based on gray difference and autocorrelation
Institute of Scientific and Technical Information of China (English)
无
2010-01-01
The gray levels of two images of the same particle taken by a digital camera with different exposure times differ as well. Based on the gray difference of particle images in a double-exposed photo and on autocorrelation processing of the digital images, this paper proposes a method for measuring particle velocities and sizes simultaneously. It also introduces the theoretical foundation of the method, the process of particle imaging and image processing, and the simultaneous measurement of velocity and size in a low-speed flow field with 35 μm and 75 μm standard particles. The graphical measurement results faithfully reflect the flow characteristics of the flow field. In addition, although the measured velocity and size histograms of these two kinds of standard particles are slightly wider than the theoretical ones, they remain close to normal distributions, and the peak velocities and diameters of the histograms are consistent with the nominal values. Therefore, this measurement method provides moderate measurement accuracy and can be further developed for high-speed flow field measurements.
Shiue, Ren-Jye; Gao, Yuanda; Wang, Yifei; Peng, Cheng; Robertson, Alexander D; Efetov, Dmitri K; Assefa, Solomon; Koppens, Frank H L; Hone, James; Englund, Dirk
2015-11-11
Graphene and other two-dimensional (2D) materials have emerged as promising materials for broadband and ultrafast photodetection and optical modulation. These optoelectronic capabilities can augment complementary metal-oxide-semiconductor (CMOS) devices for high-speed and low-power optical interconnects. Here, we demonstrate an on-chip ultrafast photodetector based on a two-dimensional heterostructure consisting of high-quality graphene encapsulated in hexagonal boron nitride. Coupled to the optical mode of a silicon waveguide, this 2D heterostructure-based photodetector exhibits a maximum responsivity of 0.36 A/W and high-speed operation with a 3 dB cutoff at 42 GHz. From photocurrent measurements as a function of the top-gate and source-drain voltages, we conclude that the photoresponse is consistent with hot electron mediated effects. At moderate peak powers above 50 mW, we observe a saturating photocurrent consistent with the mechanisms of electron-phonon supercollision cooling. This nonlinear photoresponse enables optical on-chip autocorrelation measurements with picosecond-scale timing resolution and exceptionally low peak powers.
Lardy, Matthew A; Lebrun, Laurie; Bullard, Drew; Kissinger, Charles; Gobbi, Alberto
2012-05-25
In modern drug discovery campaigns, computational chemists must be concerned not only with improving the potency of molecules but also with reducing any off-target ADMET activity. There is a plethora of antitargets that computational chemists may have to consider. Fortunately, many antitargets have crystal structures deposited in the PDB. These structures are immediately useful to our Autocorrelator: an automated model generator that optimizes variables for building computational models. This paper describes the use of the Autocorrelator to construct high-quality docking models for cytochrome P450 2C9 (CYP2C9) from two publicly available crystal structures. Both models yield strong correlation coefficients (R² > 0.66) between predicted and experimentally determined log(IC₅₀) values. Results from the two models overlap well with each other, converging on the same scoring function and deprotonated charge state, and predicting the same binding orientations for our collection of molecules.
Energy Technology Data Exchange (ETDEWEB)
Kirby, R. Jason [Sanford Burnham Prebys Medical Discovery Institute, Conrad Prebys Center for Chemical Genomics, 6400 Sanger Rd, Orlando, FL 32827 (United States); Qi, Feng [Sanford Burnham Prebys Medical Discovery Institute, Applied Bioinformatics Facility, 6400 Sanger Rd, Orlando, FL 32827 (United States); Phatak, Sharangdhar; Smith, Layton H. [Sanford Burnham Prebys Medical Discovery Institute, Conrad Prebys Center for Chemical Genomics, 6400 Sanger Rd, Orlando, FL 32827 (United States); Malany, Siobhan, E-mail: smalany@sbpdiscovery.org [Sanford Burnham Prebys Medical Discovery Institute, Conrad Prebys Center for Chemical Genomics, 6400 Sanger Rd, Orlando, FL 32827 (United States)
2016-08-15
Cardiac safety assays incorporating label-free detection of human stem-cell derived cardiomyocyte contractility provide human relevance and medium throughput screening to assess compound-induced cardiotoxicity. In an effort to provide quantitative analysis of the large kinetic datasets resulting from these real-time studies, we applied bioinformatic approaches based on nonlinear dynamical system analysis, including limit cycle analysis and autocorrelation function, to systematically assess beat irregularity. The algorithms were integrated into a software program to seamlessly generate results for 96-well impedance-based data. Our approach was validated by analyzing dose- and time-dependent changes in beat patterns induced by known proarrhythmic compounds and screening a cardiotoxicity library to rank order compounds based on their proarrhythmic potential. We demonstrate a strong correlation for dose-dependent beat irregularity monitored by electrical impedance and quantified by autocorrelation analysis to traditional manual patch clamp potency values for hERG blockers. In addition, our platform identifies non-hERG blockers known to cause clinical arrhythmia. Our method provides a novel suite of medium-throughput quantitative tools for assessing compound effects on cardiac contractility and predicting compounds with potential proarrhythmia and may be applied to in vitro paradigms for pre-clinical cardiac safety evaluation. - Highlights: • Impedance-based monitoring of human iPSC-derived cardiomyocyte contractility • Limit cycle analysis of impedance data identifies aberrant oscillation patterns. • Nonlinear autocorrelation function quantifies beat irregularity. • Identification of hERG and non-hERG inhibitors with known risk of arrhythmia • Automated software processes limit cycle and autocorrelation analyses of 96w data.
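One simple autocorrelation-based irregularity metric in this spirit (a sketch of the general idea, not the publication's exact algorithm) scores the strength of the dominant autocorrelation peak: strictly periodic beating scores near 1, while jittery beat timing lowers the score.

```python
import numpy as np

def beat_regularity(signal):
    """Height of the largest normalized-autocorrelation peak after the first
    zero crossing: near 1 for strictly periodic beating, lower when beat
    timing or amplitude is irregular."""
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()
    n = len(x)
    r = np.correlate(x, x, mode="full")[n - 1:]
    r = r / r[0]
    negative = np.where(r < 0)[0]
    if negative.size == 0:
        return 0.0
    return float(r[negative[0]:n // 2].max())
```

Applied well-by-well to impedance traces, dose-dependent drops in such a score would flag compounds that destabilize the beat pattern.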
Autocorrelation descriptor improvements for QSAR: 2DA_Sign and 3DA_Sign
Sliwoski, Gregory; Mendenhall, Jeffrey; Meiler, Jens
2016-03-01
Quantitative structure-activity relationship (QSAR) is a branch of computer aided drug discovery that relates chemical structures to biological activity. Two well established and related QSAR descriptors are two- and three-dimensional autocorrelation (2DA and 3DA). These descriptors encode the relative position of atoms or atom properties by calculating the separation between atom pairs in terms of number of bonds (2DA) or Euclidean distance (3DA). The sums of all values computed for a given small molecule are collected in a histogram. Atom properties can be added with a coefficient that is the product of atom properties for each pair. This procedure can lead to information loss when signed atom properties are considered such as partial charge. For example, the product of two positive charges is indistinguishable from the product of two equivalent negative charges. In this paper, we present variations of 2DA and 3DA called 2DA_Sign and 3DA_Sign that avoid information loss by splitting unique sign pairs into individual histograms. We evaluate these variations with models trained on nine datasets spanning a range of drug target classes. Both 2DA_Sign and 3DA_Sign significantly increase model performance across all datasets when compared with traditional 2DA and 3DA. Lastly, we find that limiting 3DA_Sign to maximum atom pair distances of 6 Å instead of 12 Å further increases model performance, suggesting that conformational flexibility may hinder performance with longer 3DA descriptors. Consistent with this finding, limiting the number of bonds in 2DA_Sign from 11 to 5 fails to improve performance.
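The sign-splitting idea can be sketched as follows. This is an illustrative data layout, not the authors' implementation: each atom pair contributes the product of its signed properties (here, partial charges) to a separate histogram per sign combination, so (+,+) is no longer indistinguishable from (-,-).

```python
import numpy as np

def two_da_sign(bond_dists, charges, max_dist=5):
    """Sign-split 2D autocorrelation over bond-path distances.
    bond_dists[i][j] is the integer number of bonds between atoms i and j;
    the returned vector concatenates the (+,+), (-,-), and (+,-) histograms."""
    hists = {s: np.zeros(max_dist + 1) for s in ("++", "--", "+-")}
    n = len(charges)
    for i in range(n):
        for j in range(i, n):
            d = bond_dists[i][j]
            if d > max_dist:
                continue
            if charges[i] >= 0 and charges[j] >= 0:
                key = "++"
            elif charges[i] < 0 and charges[j] < 0:
                key = "--"
            else:
                key = "+-"
            hists[key][d] += abs(charges[i] * charges[j])
    return np.concatenate([hists["++"], hists["--"], hists["+-"]])
```

A plain 2DA would sum all three histograms into one, losing exactly the sign information this variant preserves.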
Songhurst, Anna; Coulson, Tim
2014-03-01
Few universal trends in spatial patterns of wildlife crop-raiding have been found. Variations in wildlife ecology and movements, and in human spatial use, have been identified as causes of this apparent unpredictability. However, varying patterns of spatial autocorrelation (SA) in human-wildlife conflict (HWC) data could also contribute. We explicitly explore the effects of SA on wildlife crop-raiding data in order to facilitate the design of future HWC studies. We conducted a comparative survey of raided and nonraided fields to determine key drivers of crop-raiding. Data were subsampled at different spatial scales to select independent raiding data points. The model derived from all data was fitted to the subsampled data sets. Model parameters from these models were compared to determine the effect of SA. Most methods used to account for SA in data attempt to correct for the change in P-values; yet, by subsampling data at broader spatial scales, we identified changes in regression estimates. We consequently advocate reporting both P-values and model parameters across a range of spatial scales to help biological interpretation. Patterns of SA vary spatially in our crop-raiding data. The spatial distribution of fields should therefore be considered when choosing the spatial scale for analyses of HWC studies. Robust key drivers of elephant crop-raiding included the raiding history of a field and the distance of a field to a main elephant pathway. Understanding spatial patterns and determining reliable socio-ecological drivers of wildlife crop-raiding is paramount for designing mitigation and land-use planning strategies to reduce HWC. Spatial patterns of HWC are complex, determined by multiple factors acting at more than one scale; therefore, studies need to be designed with an understanding of the effects of SA. Our methods are accessible to a variety of practitioners to assess the effects of SA, thereby improving the reliability of conservation management actions.
Directory of Open Access Journals (Sweden)
Brady J Mattsson
Full Text Available Understanding interactions between mobile species distributions and landcover characteristics remains an outstanding challenge in ecology. Multiple factors could explain species distributions, including endogenous evolutionary traits leading to conspecific clustering and exogenous habitat features that support life history requirements. Birds are a useful taxon for examining hypotheses about the relative importance of these factors among species in a community. We developed a hierarchical Bayes approach to model the relationships between bird species occupancy and local landcover variables, accounting for spatial autocorrelation, species similarities, and partial observability. We fit alternative occupancy models to detections of 90 bird species observed during repeat visits to 316 point-counts forming a 400-m grid throughout the Patuxent Wildlife Research Refuge in Maryland, USA. Models with landcover variables performed significantly better than our autologistic and null models, supporting the hypothesis that local landcover heterogeneity is important as an exogenous driver for species distributions. Conspecific clustering alone was a comparatively poor descriptor of local community composition, but there was evidence for spatial autocorrelation in all species. Considerable uncertainty remains whether landcover combined with spatial autocorrelation is most parsimonious for describing bird species distributions at a local scale. Spatial structuring may be weaker at intermediate scales within which dispersal is less frequent, information flows are localized, and landcover types become spatially diversified and therefore exhibit little aggregation. Examining such hypotheses across species assemblages contributes to our understanding of community-level associations with conspecifics and landscape composition.
Mohebbi, Mohammadreza; Wolfe, Rory; Forbes, Andrew
2014-01-01
This paper applies the generalised linear model for modelling geographical variation to esophageal cancer incidence data in the Caspian region of Iran. The data have a complex and hierarchical structure that makes them suitable for hierarchical analysis using Bayesian techniques, but with care required to deal with problems arising from counts of events observed in small geographical areas when overdispersion and residual spatial autocorrelation are present. These considerations lead to nine regression models derived from using three probability distributions for count data: Poisson, generalised Poisson and negative binomial, and three different autocorrelation structures. We employ the framework of Bayesian variable selection and a Gibbs sampling based technique to identify significant cancer risk factors. The framework deals with situations where the number of possible models based on different combinations of candidate explanatory variables is large enough such that calculation of posterior probabilities for all models is difficult or infeasible. The evidence from applying the modelling methodology suggests that modelling strategies based on the use of generalised Poisson and negative binomial with spatial autocorrelation work well and provide a robust basis for inference. PMID:24413702
Holcomb, David Andrew; Messier, Kyle P; Serre, Marc L; Rowny, Jakob G; Stewart, Jill R
2018-06-11
Predictive modeling is promising as an inexpensive tool to assess water quality. We developed geostatistical predictive models of microbial water quality that empirically modelled spatiotemporal autocorrelation in measured fecal coliform (FC) bacteria concentrations to improve prediction. We compared five geostatistical models featuring different autocorrelation structures, fit to 676 observations from 19 locations in North Carolina's Jordan Lake watershed using meteorological and land cover predictor variables. Though stream distance metrics (with and without flow-weighting) failed to improve prediction over the Euclidean distance metric, incorporating temporal autocorrelation substantially improved prediction over the space-only models. We predicted FC throughout the stream network daily for one year, designating locations "impaired", "unimpaired", or "unassessed" if the probability of exceeding the state standard was >90%, <10%, or between 10% and 90%, respectively. We could assign impairment status to more of the stream network on days any FC were measured, suggesting frequent sample-based monitoring remains necessary, though implementing spatiotemporal predictive models may reduce the number of concurrent sampling locations required to adequately assess water quality. Together, these results suggest that prioritizing sampling at different times and conditions using geographically sparse monitoring networks is adequate to build robust and informative geostatistical models of water quality impairment.
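The impairment designation rule can be sketched directly from the probability thresholds given in the abstract (function and variable names are illustrative):

```python
def impairment_status(p_exceed, hi=0.90, lo=0.10):
    """Designate a location from the predicted probability that fecal
    coliform (FC) exceeds the state standard, using the abstract's
    >90% / <10% thresholds."""
    if p_exceed > hi:
        return "impaired"
    if p_exceed < lo:
        return "unimpaired"
    return "unassessed"

statuses = [impairment_status(p) for p in (0.95, 0.05, 0.50)]
```

In the study's workflow this rule would be applied to the daily geostatistical exceedance predictions at each stream-network location.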
Statistical features of pre-compound processes in nuclear reactions
International Nuclear Information System (INIS)
Hussein, M.S.; Rego, R.A.
1983-04-01
Several statistical aspects of multistep compound processes are discussed. The connection between the cross-section auto-correlation function and the average number of maxima is emphasized. The restrictions imposed by the non-zero value of the energy step used in measuring the excitation function and by the experimental error are discussed. Applications are made to the system 25Mg(3He,p)27Al. (Author) [pt
STATISTICAL ANALYSIS OF RAW SUGAR MATERIAL FOR SUGAR PRODUCER COMPLEX
A. A. Gromkovskii; O. I. Sherstyuk
2015-01-01
Summary. The article examines statistical data on the development of the average weight and average sugar content of sugar beet roots. Successfully forecasting these raw-material indices is essential for solving control problems in the sugar-production complex. By calculating the autocorrelation function, the paper demonstrates that a trend component predominates in the growth of the raw-material characteristics. To construct the prediction model, the use of an autoregressive fir...
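The diagnostic used here, reading a dominant trend component off the autocorrelation function, can be sketched as follows: for a trending series the sample ACF starts at 1 and decays only slowly with lag (the data below are illustrative, not the paper's):

```python
def sample_acf(x, max_lag):
    """Sample autocorrelation function r_k of a series x."""
    n = len(x)
    m = sum(x) / n
    d = [v - m for v in x]
    c0 = sum(v * v for v in d) / n
    return [sum(d[t] * d[t + k] for t in range(n - k)) / (n * c0)
            for k in range(max_lag + 1)]

# A purely trending series (steady growth, as with root weight over a season)
growth = [0.5 * t for t in range(100)]
r = sample_acf(growth, 10)
```

The slow, near-linear decay of r over many lags is the signature of a trend; white noise would instead drop to roughly zero after lag 0.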
Shanahan, Daniel R
2016-01-01
The impact factor of the journal in which a reporting statement was published was shown to influence the number of citations that statement will gather over time. Similarly, the number of article accesses also influenced the number of citations, although to a lesser extent than the impact factor. This demonstrates that citation counts are not purely a reflection of scientific merit and the impact factor is, in fact, auto-correlated.
Directory of Open Access Journals (Sweden)
Daniel R. Shanahan
2016-03-01
The impact factor of the journal in which a reporting statement was published was shown to influence the number of citations that statement will gather over time. Similarly, the number of article accesses also influenced the number of citations, although to a lesser extent than the impact factor. This demonstrates that citation counts are not purely a reflection of scientific merit and the impact factor is, in fact, auto-correlated.
Accounting for and predicting the influence of spatial autocorrelation in water quality modeling
Miralha, L.; Kim, D.
2017-12-01
Although many studies have attempted to investigate the spatial trends of water quality, more attention is yet to be paid to the consequences of considering and ignoring the spatial autocorrelation (SAC) that exists in water quality parameters. Several studies have mentioned the importance of accounting for SAC in water quality modeling, as well as the differences in outcomes between models that account for and ignore SAC. However, the capacity to predict the magnitude of such differences is still ambiguous. In this study, we hypothesized that SAC inherently possessed by a response variable (i.e., water quality parameter) influences the outcomes of spatial modeling. We evaluated whether the level of inherent SAC is associated with changes in R-squared (R²), Akaike Information Criterion (AIC), and residual SAC (rSAC), after accounting for SAC during the modeling procedure. The main objective was to analyze if water quality parameters with higher Moran's I values (the inherent SAC measure) undergo a greater increase in R² and a greater reduction in both AIC and rSAC. We compared a non-spatial model (OLS) to two spatial regression approaches (spatial lag and error models). Predictor variables were the principal components of topographic (elevation and slope), land cover, and hydrological soil group variables. We acquired these data from federal online sources (e.g. USGS). Ten watersheds were selected, each in a different state of the USA. Results revealed that water quality parameters with higher inherent SAC showed a substantial increase in R² and decrease in rSAC after performing spatial regressions. However, AIC values did not show significant changes. Overall, the higher the level of inherent SAC in water quality variables, the greater the improvement of model performance. This indicates a linear and direct relationship between the spatial model outcomes (R² and rSAC) and the degree of SAC in each water quality variable. Therefore, our study suggests that the inherent level of
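Moran's I, the inherent-SAC measure used above, can be computed directly. A minimal sketch with binary adjacency weights on a toy four-unit transect (illustrative, not the study's data):

```python
def morans_i(values, weights):
    """Global Moran's I for attribute values on n spatial units.
    weights[i][j] is the spatial weight (0 for non-neighbours)."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    s0 = sum(sum(row) for row in weights)
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    return (n / s0) * (num / den)

# Toy 4-unit transect: a smooth gradient in the attribute gives
# positive spatial autocorrelation under binary adjacency weights
vals = [1.0, 2.0, 3.0, 4.0]
w = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
I = morans_i(vals, w)
```

Values near +1 indicate clustering of similar values, near 0 spatial randomness, and negative values indicate dispersion.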
A generalization of random matrix theory and its application to statistical physics.
Wang, Duan; Zhang, Xin; Horvatic, Davor; Podobnik, Boris; Eugene Stanley, H
2017-02-01
To study the statistical structure of cross-correlations in empirical data, we generalize random matrix theory and propose a new method of cross-correlation analysis, known as autoregressive random matrix theory (ARRMT). ARRMT takes into account the influence of auto-correlations in the study of cross-correlations in multiple time series. We first analytically and numerically determine how auto-correlations affect the eigenvalue distribution of the correlation matrix. Then we introduce ARRMT with a detailed procedure of how to implement the method. Finally, we illustrate the method using two examples: inflation rates and air pressure data for 95 US cities.
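The first step of ARRMT, examining how temporal autocorrelation reshapes the eigenvalue spectrum of the sample correlation matrix, can be illustrated numerically. The panel sizes and AR(1) coefficient below are arbitrary choices; with strong autocorrelation the spectrum is typically much wider than the iid Marchenko-Pastur band, even though the series are cross-sectionally independent:

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, phi = 20, 500, 0.9

def ar1_panel():
    """N independent AR(1) series of length T: autocorrelated in time,
    uncorrelated across series."""
    eps = rng.standard_normal((N, T))
    x = np.zeros((N, T))
    for t in range(1, T):
        x[:, t] = phi * x[:, t - 1] + eps[:, t]
    return x

# Eigenvalue spectra of the N x N sample correlation matrices
lam_iid = np.linalg.eigvalsh(np.corrcoef(rng.standard_normal((N, T))))
lam_ar = np.linalg.eigvalsh(np.corrcoef(ar1_panel()))
```

In both cases the eigenvalues sum to N (the trace of a correlation matrix); the autocorrelated panel's spread is what ARRMT corrects for before interpreting large eigenvalues as genuine cross-correlation.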
International Nuclear Information System (INIS)
Por, G.
1999-08-01
A program package was developed to estimate the time dependent auto-correlation function (ACF) from the time signals of soft X-ray records taken along the various lines-of-sights in JET-SHOTS, and also to estimate the time dependent Decay Ratio (DR) from that. On the basis of ACF the time dependent auto-power spectral density (APSD) was also calculated. The steps and objectives of this work were: eliminating the white detection noise, trends and slow variation from the time signals, since ordinary methods can give good estimate of the time dependent ACF and DR only for 'nearly' stationary signals, developing an automatic algorithm for finding the maxima and minima of ACF, since they are the basis for DR estimation, evaluating and testing different DR estimators for JET-SHOT, with the aim of finding parts of the signals, where the oscillating character is strong, estimating time dependent ACF and APSD that can follow the relatively fast variation in the time signal. The methods that we have developed for data processing of transient signals are: White detection noise removal and preparation for trend removal - weak components, white detection noise and high frequency components are filtered from the signal using the so-called soft-threshold wavelet filter. Removal of trends and slow variation - Three-point differentiation of the pre-filtered signal is used to remove trends and slow variation. Here we made use of the DERIV function of IDL program language. This leads to a filtered signal that has zero mean value in each time step. Calculation of the time dependent ACF - The signal treated by the two previous steps is used as the input. Calculated ACF value is added in each new time step, but the previously accumulated ACF value is multiplied by a weighting factor. Thus the new sample has 100% contribution, while the contributions from the previous samples are forgotten quickly. DR calculation - DR is a measure of the decay of oscillating ACF. This parameter was shown
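The exponentially weighted ACF update described above (previously accumulated lag products multiplied by a forgetting factor, newest sample contributing at full weight) can be sketched as a recursion; with the factor set to 1 it reduces to ordinary lag sums:

```python
def ew_acf_update(acf, x, t, max_lag, lam=0.99):
    """One step of an exponentially weighted ACF estimate: previously
    accumulated lag products are multiplied by a forgetting factor lam,
    while the newest sample x[t] contributes at full weight. The value
    of lam here is an illustrative choice, not the report's."""
    for k in range(min(max_lag, t) + 1):
        acf[k] = lam * acf[k] + x[t] * x[t - k]
    return acf

# With lam = 1.0 the recursion reduces to plain (unweighted) lag sums
acf = [0.0, 0.0]
x = [1.0, 2.0, 3.0]
for t in range(len(x)):
    ew_acf_update(acf, x, t, max_lag=1, lam=1.0)
```

With lam < 1 the estimate forgets old samples quickly, which is what lets the ACF, and hence the decay ratio derived from it, track relatively fast variation in a transient signal.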
MedlinePlus Statistics (https://medlineplus.gov/usestatistics.html): quarterly usage statistics reporting page views and unique visitors.
Pestman, Wiebe R
2009-01-01
This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.
Whole Frog Project and Virtual Frog Dissection Statistics: wwwstats output of web-server accesses, filtered to remove duplicate or extraneous accesses; byte counts are under-represented.
International Nuclear Information System (INIS)
Rompotis, Dimitrios
2016-02-01
In this work, a single-shot temporal metrology scheme operating in the vacuum-extreme ultraviolet spectral range has been designed and experimentally implemented. Utilizing an anti-collinear geometry, a second-order intensity autocorrelation measurement of a vacuum ultraviolet pulse can be performed by encoding temporal delay information on the beam propagation coordinate. An ion-imaging time-of-flight spectrometer offering micrometer resolution has been set up for this purpose. This instrument enables the detection of a magnified image of the spatial distribution of ions exclusively generated by direct two-photon absorption in the combined counter-propagating pulse focus, and thus yields the second-order intensity autocorrelation measurement on a single-shot basis. Additionally, an intense VUV light source based on high-harmonic generation has been experimentally realized. It delivers intense sub-20 fs Ti:Sa fifth-harmonic pulses utilizing a loose-focusing geometry in a long Ar gas cell. The VUV pulses centered at 161.8 nm reach pulse energies of 1.1 μJ per pulse, while the corresponding pulse duration is measured with a second-order, fringe-resolved autocorrelation scheme to be 18 ± 1 fs on average. Non-resonant, two-photon ionization of Kr and Xe and three-photon ionization of Ne verify the fifth-harmonic pulse intensity and indicate the feasibility of multi-photon VUV pump/VUV probe studies of ultrafast atomic and molecular dynamics. Finally, the extended functionality of the counter-propagating pulse metrology approach is demonstrated by a single-shot VUV pump/VUV probe experiment aiming at the investigation of ultrafast dissociation dynamics of O2 excited in the Schumann-Runge continuum at 162 nm.
Czech Academy of Sciences Publication Activity Database
Fekete, Ladislav; Kůsová, Kateřina; Petrák, Václav; Kratochvílová, Irena
2012-01-01
Roč. 14, č. 8 (2012), s. 1-10 ISSN 1388-0764 R&D Projects: GA AV ČR KAN200100801; GA TA ČR TA01011165; GA ČR(CZ) GAP304/10/1951; GA ČR GPP204/12/P235 Institutional research plan: CEZ:AV0Z10100520 Keywords : lateral grain size distribution * AFM * autocorrelation function * nanodiamond Subject RIV: BM - Solid Matter Physics ; Magnetism Impact factor: 2.175, year: 2012
A Statistical Toolbox For Mining And Modeling Spatial Data
Directory of Open Access Journals (Sweden)
D’Aubigny Gérard
2016-12-01
Full Text Available Most data mining projects in spatial economics start with an evaluation of a set of attribute variables on a sample of spatial entities, looking for the existence and strength of spatial autocorrelation, based on Moran’s and Geary’s coefficients, the adequacy of which is rarely challenged, despite the fact that when reporting on their properties, many users seem likely to make mistakes and to foster confusion. My paper begins with a critical appraisal of the classical definition and rationale of these indices. I argue that while intuitively founded, they are plagued by an inconsistency in their conception. Then, I propose a principled small change leading to corrected spatial autocorrelation coefficients, which strongly simplifies their relationship, and opens the way to an augmented toolbox of statistical methods of dimension reduction and data visualization, also useful for modeling purposes. A second section presents a formal framework, adapted from recent work in statistical learning, which gives theoretical support to our definition of corrected spatial autocorrelation coefficients. More specifically, the multivariate data mining methods presented here are easily implementable on existing (free) software, yield methods useful for exploiting the proposed corrections in spatial data analysis practice, and, from a mathematical point of view, have an asymptotic behavior, already studied in a series of papers by Belkin & Niyogi, which suggests that they possess qualities of robustness and a limited sensitivity to the Modifiable Areal Unit Problem (MAUP), valuable in exploratory spatial data analysis.
Wang, X.; Tu, C. Y.; He, J.; Wang, L.
2017-12-01
The nature of the Elsässer variables z- observed in the Alfvénic solar wind has been a longstanding debate. It is widely believed that z- represents inward propagating Alfvén waves and undergoes non-linear interaction with z+ to produce energy cascade. However, z- variations sometimes show the nature of convective structures. Here we present a new data analysis of z- autocorrelation functions to get some definite information on its nature. We find that there is usually a break point in the z- autocorrelation function when the fluctuations show nearly pure Alfvénicity. The break point observed by the Helios-2 spacecraft near 0.3 AU is at the first time lag (~81 s), where the autocorrelation coefficient is smaller than the zero-lag value by more than 0.4. The autocorrelation function breaks also appear in the WIND observations near 1 AU. The z- autocorrelation function is separated by the break into two parts: a fast decreasing part and a slowly decreasing part, which cannot be described as a whole by an exponential formula. The breaks in the z- autocorrelation function may indicate that the z- time series are composed of high-frequency white noise and low-frequency apparent structures, which correspond to the flat and steep parts of the function, respectively. This explanation is supported by a simple test with a superposition of an artificial random data series and a smoothed random data series. Since in many cases z- autocorrelation functions do not decrease very quickly at large time lags and cannot be considered to be of the Lanczos type, no reliable value for the correlation time can be derived. Our results show that in these cases with high Alfvénicity, z- should not be considered as an inward-propagating wave. The power-law spectrum of z+ should instead be produced by the fluid turbulence cascade process described by Kolmogorov.
Sadovskii, Michael V
2012-01-01
This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity and the modern theory of critical phenomena. Beyond that attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.
Laracuente, Nicholas; Grossman, Carl
2013-03-01
We developed an algorithm and software to calculate autocorrelation functions from real-time photon-counting data using the fast, parallel capabilities of graphical processor units (GPUs). Recent developments in hardware and software have allowed for general purpose computing with inexpensive GPU hardware. These devices are more suited for emulating hardware autocorrelators than traditional CPU-based software applications by emphasizing parallel throughput over sequential speed. Incoming data are binned in a standard multi-tau scheme with configurable points-per-bin size and are mapped into a GPU memory pattern to reduce time-expensive memory access. Applications include dynamic light scattering (DLS) and fluorescence correlation spectroscopy (FCS) experiments. We ran the software on a 64-core graphics PCI card in a 3.2 GHz Intel i5 CPU-based computer running Linux. FCS measurements were made on Alexa-546 and Texas Red dyes in a standard buffer (PBS). Software correlations were compared to hardware correlator measurements on the same signals. Supported by HHMI and Swarthmore College
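The standard multi-tau scheme can be sketched by its lag structure: a linearly spaced first level, after which each level doubles the bin width, giving quasi-logarithmic lag coverage at fixed cost. This is an illustrative simplification of the binning logic, not the GPU implementation:

```python
def multi_tau_lags(levels, ppl=8):
    """Lag times (in units of the base sampling interval) generated by a
    multi-tau scheme with `ppl` points per level: level 0 is linearly
    spaced; each subsequent level doubles the bin width and reuses only
    the upper half of its lag indices."""
    lags = []
    for lev in range(levels):
        width = 2 ** lev
        start = 0 if lev == 0 else ppl // 2
        lags.extend(i * width for i in range(start, ppl))
    return lags

lags = multi_tau_lags(3)
```

With a handful of levels this covers lags spanning several decades using only a few dozen correlator channels, which is why the scheme suits DLS and FCS decays.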
Directory of Open Access Journals (Sweden)
Mei-Yu LEE
2014-11-01
Full Text Available This paper investigates the effect of nonzero autocorrelation coefficients on the sampling distributions of the Durbin-Watson test estimator in three time-series models that make different variance-covariance matrix assumptions. We show that the expected values and variances of the Durbin-Watson test estimator are only slightly different, but that the skewness and kurtosis coefficients differ considerably among the three models. The shapes of the four coefficients are similar between the Durbin-Watson model and our benchmark model, but not the same as those of the autoregressive model cut by one lagged period. Second, the large-sample case shows that the three models have the same expected values; however, the autoregressive model cut by one lagged period exhibits different shapes of the variance, skewness, and kurtosis coefficients from the other two models. This implies that large samples lead to the same expected value, 2(1 – ρ0), whatever variance-covariance matrix of the errors is assumed. Finally, comparing the two sample cases, the shape of each coefficient is almost the same; moreover, the autocorrelation coefficients are negatively related to the expected values, have an inverted-U relationship with the variances, a cubic relationship with the skewness coefficients, and a U-shaped relationship with the kurtosis coefficients.
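The Durbin-Watson statistic underlying these sampling distributions is straightforward to compute from a residual series; for large samples it approaches 2(1 − ρ1), where ρ1 is the lag-1 autocorrelation:

```python
def durbin_watson(e):
    """Durbin-Watson statistic d = sum_t (e_t - e_{t-1})^2 / sum_t e_t^2.
    Values near 2 indicate no lag-1 autocorrelation in the residuals e;
    values near 0 indicate strong positive, near 4 strong negative."""
    num = sum((e[t] - e[t - 1]) ** 2 for t in range(1, len(e)))
    den = sum(v * v for v in e)
    return num / den

d_pos = durbin_watson([1.0, 1.0, 1.0, 1.0])     # perfectly persistent
d_neg = durbin_watson([1.0, -1.0, 1.0, -1.0])   # perfectly alternating
```

The two extreme residual patterns bracket the statistic's range, consistent with the limit 2(1 − ρ0) quoted in the abstract.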
Velocity auto-correlation and hot-electron diffusion constant in GaAs and InP
International Nuclear Information System (INIS)
Deb Roy, M.
1982-01-01
Auto-correlation functions of the fluctuations in the electron velocities transverse and parallel to the applied electric field are calculated by the Monte Carlo method for GaAs and InP at three different values of field strength, which are around three times the threshold field for negative differential mobility in each case. From these, the frequency-dependent diffusion coefficients transverse and parallel to the applied field and the figure of merit for noise performance when used in a microwave amplifying device are determined. The results indicate that the transverse auto-correlation function C_t(s) falls nearly exponentially to zero with increasing interval s, while the parallel function C_p(s) falls sharply, attains a minimum and then rises towards zero. In each case a higher field gives a higher rate of fall and makes the correlation functions zero within a shorter interval. The transverse diffusion coefficient falls monotonically with frequency, but the parallel diffusion coefficient generally starts with a low value at low frequencies, rises to a maximum and then falls. InP, with a larger separation between the central and the satellite valleys, has a higher value of the low-frequency transverse diffusion coefficient and a lower value of its parallel counterpart. The noise performance of microwave semiconductor amplifying devices depends mainly on the low-frequency parallel diffusion constant, and consequently devices made out of materials like InP with a large separation between valleys are likely to have better noise characteristics. (orig.)
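The link from a velocity auto-correlation function to a diffusion coefficient can be sketched with a Green-Kubo style integral. Here an idealised exponentially decaying VACF, resembling the nearly exponential transverse function described above, recovers the analytic value D = C(0)·τ (the numbers are illustrative, not the paper's):

```python
import math

def diffusion_from_vacf(c, dt):
    """Green-Kubo style estimate: D = integral of the velocity
    auto-correlation function, here via the trapezoidal rule."""
    return dt * (0.5 * c[0] + sum(c[1:-1]) + 0.5 * c[-1])

# Idealised exponentially decaying VACF with C(0) = c0 and decay time tau
c0, tau, dt = 1.0, 0.1, 1e-4
vacf = [c0 * math.exp(-k * dt / tau) for k in range(20000)]
D = diffusion_from_vacf(vacf, dt)   # analytic value: c0 * tau
```

The non-monotone parallel VACF, which dips negative before relaxing to zero, integrates to a smaller D, which is one way to read the paper's transverse/parallel asymmetry.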
Goodman, Joseph W
2015-01-01
This book discusses statistical methods that are useful for treating problems in modern optics, and the application of these methods to solving a variety of such problems. This book covers a variety of statistical problems in optics, including both theory and applications. The text covers the necessary background in statistics, statistical properties of light waves of various types, the theory of partial coherence and its applications, imaging with partially coherent light, atmospheric degradations of images, and noise limitations in the detection of light. New topics have been introduced i
Energy Technology Data Exchange (ETDEWEB)
Eliazar, Iddo, E-mail: eliazar@post.tau.ac.il
2017-05-15
The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.
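A harmonic Poisson process is easy to simulate on an interval [a, b]: draw a Poisson count with mean c·ln(b/a), then place points with density proportional to 1/x. The sketch below (function names are illustrative) also exhibits the scale invariance of the mean count:

```python
import math, random

def expected_count(a, b, c):
    """Mean number of points on [a, b] under the harmonic intensity c/x:
    the integral of c/x from a to b, i.e. c * ln(b/a)."""
    return c * math.log(b / a)

def point_from_uniform(u, a, b):
    """Inverse-CDF map for density proportional to 1/x on [a, b]."""
    return a * (b / a) ** u

def sample(a, b, c, rng):
    """One realisation: Poisson-distributed count (Knuth's sampler),
    then iid points with 1/x density."""
    mean = expected_count(a, b, c)
    k, p, target = 0, 1.0, math.exp(-mean)
    while p > target:
        k += 1
        p *= rng.random()
    k -= 1
    return sorted(point_from_uniform(rng.random(), a, b) for _ in range(k))
```

The mean count depends on [a, b] only through the ratio b/a, which is the scale invariance the abstract connects to Pareto statistics and Benford's law.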
International Nuclear Information System (INIS)
Eliazar, Iddo
2017-01-01
The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.
Szulc, Stefan
1965-01-01
Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field.Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then
Petocz, Peter; Sowey, Eric
2012-01-01
The term "data snooping" refers to the practice of choosing which statistical analyses to apply to a set of data after having first looked at those data. Data snooping contradicts a fundamental precept of applied statistics, that the scheme of analysis is to be planned in advance. In this column, the authors shall elucidate the…
Petocz, Peter; Sowey, Eric
2008-01-01
In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…
Glaz, Joseph
2009-01-01
Suitable for graduate students and researchers in applied probability and statistics, as well as for scientists in biology, computer science, pharmaceutical science and medicine, this title brings together a collection of chapters illustrating the depth and diversity of theory, methods and applications in the area of scan statistics.
Lyons, L.
2016-01-01
Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical analysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.
Nick, Todd G
2007-01-01
Statistics is defined by the Medical Subject Headings (MeSH) thesaurus as the science and art of collecting, summarizing, and analyzing data that are subject to random variation. The two broad categories of summarizing and analyzing data are referred to as descriptive and inferential statistics. This chapter considers the science and art of summarizing data where descriptive statistics and graphics are used to display data. In this chapter, we discuss the fundamentals of descriptive statistics, including describing qualitative and quantitative variables. For describing quantitative variables, measures of location and spread, for example the standard deviation, are presented along with graphical presentations. We also discuss distributions of statistics, for example the variance, as well as the use of transformations. The concepts in this chapter are useful for uncovering patterns within the data and for effectively presenting the results of a project.
Blakemore, J S
1962-01-01
Semiconductor Statistics presents statistics aimed at complementing existing books on the relationships between carrier densities and transport effects. The book is divided into two parts. Part I provides introductory material on the electron theory of solids, and then discusses carrier statistics for semiconductors in thermal equilibrium. Of course a solid cannot be in true thermodynamic equilibrium if any electrical current is passed; but when currents are reasonably small the distribution function is but little perturbed, and the carrier distribution for such a "quasi-equilibrium" co
Wannier, Gregory Hugh
1966-01-01
Until recently, the field of statistical physics was traditionally taught as three separate subjects: thermodynamics, statistical mechanics, and kinetic theory. This text, a forerunner in its field and now a classic, was the first to recognize the outdated reasons for their separation and to combine the essentials of the three subjects into one unified presentation of thermal physics. It has been widely adopted in graduate and advanced undergraduate courses, and is recommended throughout the field as an indispensable aid to the independent study and research of statistical physics.Designed for
Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James
2014-01-01
Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.
International Nuclear Information System (INIS)
Vogelsang, R.; Hoheisel, C.
1987-01-01
For a large region of dense fluid states of a Lennard-Jones system, the authors calculated the friction coefficient from the force autocorrelation function of a Brownian-type particle using molecular dynamics (MD). The time integral over the force autocorrelation function showed an interesting behavior and reached the expected plateau value when the mass of the Brownian particle was chosen to be about a factor of 100 larger than the mass of the fluid particles. Sufficient agreement was found between the friction coefficient calculated in this way and that obtained from the self-diffusion coefficient using the common relation between these coefficients. Furthermore, a modified friction coefficient was determined by integrating the force autocorrelation function up to its first maximum. This coefficient can successfully be used to derive a reasonable soft part of the friction coefficient needed in the Rice-Allnatt approximation for the shear viscosity of simple liquids.
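The Green-Kubo construction used in this abstract, integrating the force autocorrelation function until it plateaus, can be sketched numerically. The snippet below is an illustration on a synthetic, exponentially correlated "force" (an Ornstein-Uhlenbeck stand-in, not the authors' Lennard-Jones MD data); with unit variance and unit correlation time, the running integral should plateau near 1.

```python
import numpy as np

def running_friction(force, dt, kT=1.0):
    """Running Green-Kubo integral gamma(t) = (1/kT) * int_0^t <F(0)F(s)> ds.

    The friction coefficient is read off as the plateau of this curve.
    """
    n = len(force)
    f = force - force.mean()
    # FFT-based, zero-padded (linear) autocorrelation, unbiased per lag
    F = np.fft.rfft(f, 2 * n)
    acf = np.fft.irfft(F * np.conj(F))[:n] / np.arange(n, 0, -1)
    return np.cumsum(acf) * dt / kT

# Synthetic force: Ornstein-Uhlenbeck process, unit variance, tau = 1.0,
# so the exact plateau is var * tau / kT = 1.0.
rng = np.random.default_rng(0)
dt, tau, n = 0.01, 1.0, 200_000
a, b = np.exp(-dt / tau), np.sqrt(1.0 - np.exp(-2.0 * dt / tau))
force = np.empty(n)
force[0] = rng.standard_normal()
for i in range(1, n):
    force[i] = a * force[i - 1] + b * rng.standard_normal()

gamma = running_friction(force, dt)
plateau = gamma[int(5 * tau / dt)]   # read off a few correlation times in
```

The long-time tail of the estimated ACF is noisy, which is why the abstract's "integrate only up to the first maximum" trick is a practical alternative to waiting for a clean plateau.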
Energy Technology Data Exchange (ETDEWEB)
Wendelberger, Laura Jean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-08-08
In large datasets, it is time consuming or even impossible to pick out interesting images. Our proposed solution is to find statistics to quantify the information in each image and use those to identify and pick out images of interest.
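The report does not specify which statistics it uses; as one plausible, hedged example, a common per-image score for triaging large image sets is the Shannon entropy of the intensity histogram (a flat image scores 0 and is likely uninteresting):

```python
import numpy as np

def image_entropy(img, bins=256):
    """Shannon entropy (bits) of an image's intensity histogram.

    A constant image scores 0; a varied one approaches log2(bins).
    """
    hist, _ = np.histogram(img, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]                      # drop empty bins before taking logs
    return float(-(p * np.log2(p)).sum())

flat = np.full((64, 64), 0.5)         # uniform gray: one occupied bin
rng = np.random.default_rng(1)
noisy = rng.random((64, 64))          # intensities spread over nearly all bins
```

Ranking images by such scalar scores lets a pipeline surface the most information-rich frames first instead of inspecting every image by hand.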
Department of Homeland Security — Accident statistics available on the Coast Guard’s website by state, year, and one variable to obtain tables and/or graphs. Data from reports has been loaded for...
U.S. Department of Health & Human Services — The CMS Center for Strategic Planning produces an annual CMS Statistics reference booklet that provides a quick reference for summary information about health...
Allegheny County / City of Pittsburgh / Western PA Regional Data Center — Data about the usage of the WPRDC site and its various datasets, obtained by combining Google Analytics statistics with information from the WPRDC's data portal.
Serdobolskii, Vadim Ivanovich
2007-01-01
This monograph presents the mathematical theory of statistical models described by an essentially large number of unknown parameters, comparable with the sample size or even much larger. In this sense, the proposed theory can be called "essentially multiparametric". It is developed on the basis of the Kolmogorov asymptotic approach, in which the sample size increases along with the number of unknown parameters. This theory opens a way to the solution of central problems of multivariate statistics, which up until now have not been solved. Traditional statistical methods based on the idea of infinite sampling often break down in the solution of real problems, and, depending on the data, can be inefficient, unstable and even inapplicable. In this situation, practical statisticians are forced to use various heuristic methods in the hope that they will find a satisfactory solution. The mathematical theory developed in this book presents a regular technique for implementing new, more efficient versions of statistical procedures. ...
Centers for Disease Control and Prevention (CDC) — Gonorrhea Statistics, Sexually Transmitted Diseases (STDs).
DEFF Research Database (Denmark)
Tryggestad, Kjell
2004-01-01
The study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political arithmetic and more recent work...... within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit...... in different ways. The presence and absence of diverse materials, both natural and political, is what distinguishes them from each other. Arguments are presented for a more symmetric relation between the scientific statistical text and the reader. I will argue that a more symmetric relation can be achieved...
MacKenzie, Dana
2004-01-01
The drawbacks of using 19th-century mathematics in physics and astronomy are illustrated. To continue expanding our knowledge of the cosmos, scientists will have to come to terms with modern statistics. Some researchers have deliberately started importing techniques that are used in medical research. However, physicists need to identify the brand of statistics that is suitable for them, and choose between the Bayesian and the frequentist approaches. (Edited abstract).
International Nuclear Information System (INIS)
Madanipour, Khosro; Tavassoly, Mohammad T.
2009-01-01
We show theoretically and verify experimentally that the modulation transfer function (MTF) of a printing system can be determined by measuring the autocorrelation of a printed Ronchi grating. In practice, two similar Ronchi gratings are printed on two transparencies and the transparencies are superimposed with parallel grating lines. Then, the gratings are uniformly illuminated and the transmitted light from a large section is measured versus the displacement of one grating with respect to the other in a grating pitch interval. This measurement provides the required autocorrelation function for determination of the MTF
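A numerical sketch of the idea, with an assumed Gaussian printing spread function standing in for the real printer: by the Wiener-Khinchin theorem, the Fourier coefficients of the measured autocorrelation are the squared spectrum of the printed grating, so the MTF at the grating frequency follows from the ratio to the ideal grating.

```python
import numpy as np

n, pitch = 4096, 256
x = np.arange(n)
ideal = ((x % pitch) < pitch // 2).astype(float)   # ideal Ronchi grating

# "Printed" grating: ideal profile blurred by an assumed Gaussian printing PSF.
sigma = 20.0
k = np.fft.fftfreq(n)
psf_ft = np.exp(-2.0 * (np.pi * k * sigma) ** 2)   # FT of the Gaussian PSF
printed = np.fft.ifft(np.fft.fft(ideal) * psf_ft).real

def circ_acf(g):
    """Circular autocorrelation: transmitted light vs relative displacement."""
    G = np.fft.fft(g)
    return np.fft.ifft(G * np.conj(G)).real / len(g)

# Wiener-Khinchin: FT(acf) = |spectrum|^2, so the printed/ideal ratio at the
# grating's fundamental frequency gives MTF(f0)^2; the sqrt recovers MTF(f0).
f0 = n // pitch
mtf_f0 = np.sqrt(np.fft.fft(circ_acf(printed)).real[f0]
                 / np.fft.fft(circ_acf(ideal)).real[f0])
```

The square root reflects the paper's setup: both superimposed gratings are printed, so the measured autocorrelation carries the printing transfer function twice.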
Energy Technology Data Exchange (ETDEWEB)
Schulte, Stephan
2011-07-11
The history of cosmic rays started in the beginning of the 20th century. Since then one of the main questions is their origin. Due to the very low flux at the highest energies huge areas have to be instrumented to answer this question. For this purpose the distribution of the arrival directions of cosmic rays is studied. The largest experiment so far is the Pierre Auger Observatory, located in the Pampa in western Argentina with an area of about 3000 km². In recent years it provided many major contributions to the field of cosmic ray physics and its data is the basis of this work. Among other things a correlation analysis of Ultra High Energy Cosmic Rays (UHECRs) with Active Galactic Nuclei (AGN) was performed leading to the first evidence that UHECRs are not isotropically distributed. Here the distribution of arrival directions of cosmic rays at the highest energies (>50 EeV) is examined by using autocorrelation methods to check whether it is compatible with the isotropic expectation or not. This thesis is organised as follows: in the first two chapters a short introduction to the topic is given, followed by a more general discussion on cosmic rays including models of acceleration, possible sources and the propagation of UHECRs in the third chapter. The fourth chapter focuses on the detector design of the Pierre Auger Observatory and event reconstruction at highest energies. Special attention is paid to the monitoring of the High Elevation Auger Telescopes (HEAT). It is a low energy enhancement of the observatory consisting of three tiltable fluorescence telescopes. The calibration of the new sensor setups is described as well as the installation in each HEAT shelter. The next chapter starts with a detailed description of the underlying ideas and motivations of autocorrelation methods: a representation of the 2pt-Correlation Function and its extension, a Minimum Spanning Tree and a Cluster Algorithm with different weighting procedures. The principle of each
International Nuclear Information System (INIS)
Fekete, L.; Kůsová, K.; Petrák, V.; Kratochvílová, I.
2012-01-01
The distribution of sizes is one of the basic characteristics of nanoparticles. Here, we propose a novel way to determine the lateral distribution of sizes from AFM topographies. Our algorithm is based on the autocorrelation function and can be applied both on topographies containing spatially separated and densely packed nanoparticles as well as on topographies of polycrystalline films. As no manual treatment is required, this algorithm can be easily automatable for batch processing. The algorithm works in principle with any kind of spatially mapped information (AFM current maps, optical microscope images, etc.), and as such has no size limitations. However, in the case of AFM topographies, the tip/sample convolution effects will be the factor limiting the smallest size to which the algorithm is applicable. Here, we demonstrate the usefulness of this algorithm on objects with sizes ranging between 20 nm and 1.5 μm.
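A hedged sketch of the core step only (the published algorithm adds machinery for separated particles and tip/sample convolution): compute the 2D autocorrelation of the height map via FFT and take the lag where it decays to 1/e as the characteristic lateral size. The synthetic "topography" below is Gaussian-filtered noise with a known answer of about twice the filter width.

```python
import numpy as np

def lateral_size(topo):
    """Lag (pixels) where the normalized 2D autocorrelation of a square
    height map first drops below 1/e, averaged over the two axes."""
    z = topo - topo.mean()
    acf = np.fft.ifft2(np.abs(np.fft.fft2(z)) ** 2).real   # Wiener-Khinchin
    acf /= acf[0, 0]
    m = topo.shape[0] // 2
    prof = 0.5 * (acf[0, :m] + acf[:m, 0])                 # crude axis average
    below = np.nonzero(prof < np.exp(-1.0))[0]
    return int(below[0]) if below.size else None

# Synthetic topography: white noise smoothed by a Gaussian of width sigma;
# its autocorrelation is exp(-r^2 / (4 sigma^2)), reaching 1/e at r = 2 sigma.
n, sigma = 256, 8.0
rng = np.random.default_rng(2)
kx = np.fft.fftfreq(n)
KX, KY = np.meshgrid(kx, kx, indexing="ij")
gauss_ft = np.exp(-2.0 * np.pi ** 2 * sigma ** 2 * (KX ** 2 + KY ** 2))
topo = np.fft.ifft2(np.fft.fft2(rng.standard_normal((n, n))) * gauss_ft).real

est = lateral_size(topo)   # expected near 2 * sigma = 16 pixels
```

Because no thresholding or manual particle picking is involved, this kind of estimator is straightforward to run in batch over many maps, which is the automation point the abstract makes.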
Hadwin, Paul J.; Sipkens, T. A.; Thomson, K. A.; Liu, F.; Daun, K. J.
2016-01-01
Auto-correlated laser-induced incandescence (AC-LII) infers the soot volume fraction (SVF) of soot particles by comparing the spectral incandescence from laser-energized particles to the pyrometrically inferred peak soot temperature. This calculation requires detailed knowledge of model parameters such as the absorption function of soot, which may vary with combustion chemistry, soot age, and the internal structure of the soot. This work presents a Bayesian methodology to quantify such uncertainties. This technique treats the additional "nuisance" model parameters, including the soot absorption function, as stochastic variables and incorporates the current state of knowledge of these parameters into the inference process through maximum entropy priors. While standard AC-LII analysis provides a point estimate of the SVF, Bayesian techniques infer the posterior probability density, which will allow scientists and engineers to better assess the reliability of AC-LII inferred SVFs in the context of environmental regulations and competing diagnostics.
International Nuclear Information System (INIS)
Hassan, W.; Blodgett, M.
2006-01-01
Shot peening is the primary surface treatment used to create a uniform, consistent, and reliable sub-surface compressive residual stress layer in aero engine components. A by-product of the shot peening process is random surface roughness that can affect the measurements of the resulting residual stresses and therefore impede their NDE assessment. High frequency eddy current conductivity measurements have the potential to assess these residual stresses in Ni-base super alloys. However, the effect of random surface roughness is expected to become significant in the desired measurement frequency range of 10 to 100 MHz. In this paper, a new Multi-Gaussian (MG) auto-correlation function is proposed for modeling the resulting pseudo-random rough profiles. Its use in the calculation of the Apparent Eddy Current Conductivity (AECC) loss due to surface roughness is demonstrated. The numerical results presented need to be validated with experimental measurements
Energy Technology Data Exchange (ETDEWEB)
Chen Yong; Chen Xi; Qian Minping [School of Mathematical Sciences, Peking University, Beijing 100871 (China)
2006-03-17
A general form of the Green-Kubo formula, which describes the fluctuations pertaining to all the steady states whether equilibrium or non-equilibrium, for a system driven by a finite Markov chain with continuous time (briefly, MC) {ξ_t}, is shown. The equivalence of different forms of the Green-Kubo formula is exploited. We also look at the differences in terms of the autocorrelation function and the fluctuation spectrum between the equilibrium state and the non-equilibrium steady state. Also, if the MC is in the non-equilibrium steady state, we can always find a complex function ψ, such that the fluctuation spectrum of {φ(ξ_t)} is non-monotonous in [0, +∞)
Directory of Open Access Journals (Sweden)
A. A. Kovylin
2013-01-01
The article describes the problem of searching for binary pseudo-random sequences with a quasi-ideal autocorrelation function, to be used in contemporary communication systems, including mobile and wireless data-transfer interfaces. In the synthesis of sets of binary sequences, the target is to form them according to a minimax criterion, by which a sequence is considered optimal for the intended application. In the course of the research, optimal sequences of order up to 52 were obtained and their run-length structure was analyzed. The analysis showed regularities in the distribution of runs of different lengths in the codes that are optimal under the chosen criteria, which should make it possible to optimize the search for such codes in the future.
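For small orders the minimax search mentioned in this abstract can be done exhaustively. The sketch below is an illustration, not the authors' algorithm: it scores each ±1 sequence by its worst aperiodic autocorrelation sidelobe and keeps the minimizers. For order 7 the optimum is the Barker level, a peak sidelobe of 1.

```python
import numpy as np
from itertools import product

def peak_sidelobe(seq):
    """Peak absolute aperiodic autocorrelation sidelobe of a +/-1 sequence."""
    s = np.asarray(seq)
    n = len(s)
    return max(abs(int(np.dot(s[:n - k], s[k:]))) for k in range(1, n))

def best_sequences(n):
    """Exhaustive minimax search: keep sequences whose worst sidelobe is
    minimal over all 2**n candidates (feasible only for small n)."""
    best, argbest = n, []
    for bits in product((-1, 1), repeat=n):
        p = peak_sidelobe(bits)
        if p < best:
            best, argbest = p, [bits]
        elif p == best:
            argbest.append(bits)
    return best, argbest

level, seqs = best_sequences(7)
```

The exponential cost of this brute-force search is exactly why the structural regularities reported in the abstract matter: they prune the search space for larger orders.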
Du, Hai-Wen; Wang, Yong; Zhuang, Da-Fang; Jiang, Xiao-San
2017-08-07
The nest flea index of Meriones unguiculatus is a critical indicator for the prevention and control of plague, which can be used not only to detect the spatial and temporal distributions of Meriones unguiculatus, but also to reveal its cluster rule. This research detected the temporal and spatial distribution characteristics of the plague natural foci of Mongolian gerbils by body flea index from 2005 to 2014, in order to predict plague outbreaks. Global spatial autocorrelation was used to describe the entire spatial distribution pattern of the body flea index in the natural plague foci of typical Chinese Mongolian gerbils. Cluster and outlier analysis and hot spot analysis were also used to detect the intensity of clusters based on geographic information system methods. The quantity of M. unguiculatus nest fleas in the sentinel surveillance sites from 2005 to 2014 and host density data of the study area from 2005 to 2010 used in this study were provided by the Chinese Center for Disease Control and Prevention. The epidemic focus regions of the Mongolian gerbils remain the same as the hot spot regions relating to the body flea index. High clustering areas possess a similar pattern as the distribution pattern of the body flea index indicating that the transmission risk of plague is relatively high. In terms of time series, the area of the epidemic focus gradually increased from 2005 to 2007, declined rapidly in 2008 and 2009, and then decreased slowly and began trending towards stability from 2009 to 2014. For the spatial change, the epidemic focus regions began moving northward from the southwest epidemic focus of the Mongolian gerbils from 2005 to 2007, and then moved from north to south in 2007 and 2008. The body flea index of Chinese gerbil foci reveals significant spatial and temporal aggregation characteristics through the use of spatial autocorrelation. The diversity of temporal and spatial distribution is mainly affected by seasonal variation, the human
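Global spatial autocorrelation of this kind is typically measured with Moran's I. A hedged sketch on a small hypothetical grid (not the surveillance data) shows the statistic separating a clustered pattern from a dispersed one:

```python
import numpy as np

def morans_i(values, W):
    """Global Moran's I: (n / S0) * (z' W z) / (z' z) with z = values - mean."""
    z = values - values.mean()
    n = len(z)
    s0 = W.sum()
    return (n / s0) * (z @ W @ z) / (z @ z)

# 8x8 rook-contiguity weight matrix for a hypothetical study grid
side = 8
n = side * side
W = np.zeros((n, n))
for i in range(side):
    for j in range(side):
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ii, jj = i + di, j + dj
            if 0 <= ii < side and 0 <= jj < side:
                W[i * side + j, ii * side + jj] = 1.0

# Smooth gradient: strong positive spatial autocorrelation (clustering)
gradient = np.array([i + j for i in range(side) for j in range(side)], float)
# Checkerboard: maximal negative spatial autocorrelation (dispersion)
checker = np.array([(i + j) % 2 for i in range(side) for j in range(side)], float)

i_grad = morans_i(gradient, W)    # near +1
i_check = morans_i(checker, W)    # exactly -1 for this pattern
```

Under the null of no spatial autocorrelation the expected value is -1/(n-1), slightly below zero, which is why hot-spot analyses test against that benchmark rather than against 0.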
Goodman, J. W.
This book is based on the thesis that some training in the area of statistical optics should be included as a standard part of any advanced optics curriculum. Random variables are discussed, taking into account definitions of probability and random variables, distribution functions and density functions, an extension to two or more random variables, statistical averages, transformations of random variables, sums of real random variables, Gaussian random variables, complex-valued random variables, and random phasor sums. Other subjects examined are related to random processes, some first-order properties of light waves, the coherence of optical waves, some problems involving high-order coherence, effects of partial coherence on imaging systems, imaging in the presence of randomly inhomogeneous media, and fundamental limits in photoelectric detection of light. Attention is given to deterministic versus statistical phenomena and models, the Fourier transform, and the fourth-order moment of the spectrum of a detected speckle image.
Schwabl, Franz
2006-01-01
The completely revised new edition of the classical book on Statistical Mechanics covers the basic concepts of equilibrium and non-equilibrium statistical physics. In addition to a deductive approach to equilibrium statistics and thermodynamics based on a single hypothesis - the form of the microcanonical density matrix - this book treats the most important elements of non-equilibrium phenomena. Intermediate calculations are presented in complete detail. Problems at the end of each chapter help students to consolidate their understanding of the material. Beyond the fundamentals, this text demonstrates the breadth of the field and its great variety of applications. Modern areas such as renormalization group theory, percolation, stochastic equations of motion and their applications to critical dynamics, kinetic theories, as well as fundamental considerations of irreversibility, are discussed. The text will be useful for advanced students of physics and other natural sciences; a basic knowledge of quantum mechan...
Jana, Madhusudan
2015-01-01
Statistical mechanics is self sufficient, written in a lucid manner, keeping in mind the exam system of the universities. Need of study this subject and its relation to Thermodynamics is discussed in detail. Starting from Liouville theorem gradually, the Statistical Mechanics is developed thoroughly. All three types of Statistical distribution functions are derived separately with their periphery of applications and limitations. Non-interacting ideal Bose gas and Fermi gas are discussed thoroughly. Properties of Liquid He-II and the corresponding models have been depicted. White dwarfs and condensed matter physics, transport phenomenon - thermal and electrical conductivity, Hall effect, Magneto resistance, viscosity, diffusion, etc. are discussed. Basic understanding of Ising model is given to explain the phase transition. The book ends with a detailed coverage to the method of ensembles (namely Microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems ar...
Guénault, Tony
2007-01-01
In this revised and enlarged second edition of an established text Tony Guénault provides a clear and refreshingly readable introduction to statistical physics, an essential component of any first degree in physics. The treatment itself is self-contained and concentrates on an understanding of the physical ideas, without requiring a high level of mathematical sophistication. A straightforward quantum approach to statistical averaging is adopted from the outset (easier, the author believes, than the classical approach). The initial part of the book is geared towards explaining the equilibrium properties of a simple isolated assembly of particles. Thus, several important topics, for example an ideal spin-½ solid, can be discussed at an early stage. The treatment of gases gives full coverage to Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein statistics. Towards the end of the book the student is introduced to a wider viewpoint and new chapters are included on chemical thermodynamics, interactions in, for exam...
Long-range autocorrelations of CpG islands in the human genome.
Directory of Open Access Journals (Sweden)
Benjamin Koester
In this paper, we use a statistical estimator developed in astrophysics to study the distribution and organization of features of the human genome. Using the human reference sequence we quantify the global distribution of CpG islands (CGI) in each chromosome and demonstrate that the organization of the CGI across a chromosome is non-random, exhibits surprisingly long-range correlations (10 Mb) and varies significantly among chromosomes. These correlations of CGI summarize functional properties of the genome that are not captured when considering variation in any particular separate (and local) feature. The demonstration of the proposed methods to quantify the organization of CGI in the human genome forms the basis of future studies. The most illuminating of these will assess the potential impact on phenotypic variation of inter-individual variation in the organization of the functional features of the genome within and among chromosomes, and among individuals for particular chromosomes.
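Astrophysical clustering estimators of this family compare observed pair separations against a random (uniform) placement. A minimal 1D sketch with hypothetical island positions (not the reference genome) illustrates the idea: an excess of close pairs over random signals clustering at that scale.

```python
import numpy as np

def pair_separation_excess(pos, length, edges, rng):
    """DD/RR - 1 per separation bin: the excess of observed pair separations
    over uniform random placement (1D analogue of the astrophysical
    two-point correlation estimator). Positive values indicate clustering."""
    def pair_hist(p):
        d = np.abs(p[:, None] - p[None, :])[np.triu_indices(len(p), k=1)]
        h, _ = np.histogram(d, bins=edges)
        return h / h.sum()
    dd = pair_hist(np.asarray(pos))
    rr = pair_hist(rng.uniform(0.0, length, len(pos)))   # random catalogue
    return dd / rr - 1.0

rng = np.random.default_rng(4)
L = 1_000_000.0
# Hypothetical "island" positions: 50 clusters of 20 sites each (sd 500 bp)
centers = rng.uniform(0.0, L, 50)
pos = (centers[:, None] + rng.normal(0.0, 500.0, (50, 20))).ravel()
edges = np.array([0.0, 2_000.0, 50_000.0, 200_000.0])
xi = pair_separation_excess(pos, L, edges, rng)   # strong excess in bin 0
```

In practice the genomic analysis must also control for chromosome boundaries and sequence composition, so this sketch shows only the bare estimator.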
Mandl, Franz
1988-01-01
The Manchester Physics Series General Editors: D. J. Sandiford; F. Mandl; A. C. Phillips Department of Physics and Astronomy, University of Manchester Properties of Matter B. H. Flowers and E. Mendoza Optics Second Edition F. G. Smith and J. H. Thomson Statistical Physics Second Edition E. Mandl Electromagnetism Second Edition I. S. Grant and W. R. Phillips Statistics R. J. Barlow Solid State Physics Second Edition J. R. Hook and H. E. Hall Quantum Mechanics F. Mandl Particle Physics Second Edition B. R. Martin and G. Shaw The Physics of Stars Second Edition A. C. Phillips Computing for Scient
Rohatgi, Vijay K
2003-01-01
Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth
Levine-Wissing, Robin
2012-01-01
All Access for the AP® Statistics Exam Book + Web + Mobile Everything you need to prepare for the Advanced Placement® exam, in a study system built around you! There are many different ways to prepare for an Advanced Placement® exam. What's best for you depends on how much time you have to study and how comfortable you are with the subject matter. To score your highest, you need a system that can be customized to fit you: your schedule, your learning style, and your current level of knowledge. This book, and the online tools that come with it, will help you personalize your AP® Statistics prep
Davidson, Norman
2003-01-01
Clear and readable, this fine text assists students in achieving a grasp of the techniques and limitations of statistical mechanics. The treatment follows a logical progression from elementary to advanced theories, with careful attention to detail and mathematical development, and is sufficiently rigorous for introductory or intermediate graduate courses.Beginning with a study of the statistical mechanics of ideal gases and other systems of non-interacting particles, the text develops the theory in detail and applies it to the study of chemical equilibrium and the calculation of the thermody
Levitsky, David A; Raea Limb, Ji Eun; Wilkinson, Lua; Sewall, Anna; Zhong, Yingyi; Olabi, Ammar; Hunter, Jean
2017-09-01
According to most theories, the amount of food consumed on one day should be negatively related to intake on subsequent days. Several studies have observed such a negative correlation between the amount consumed on one day and the amount consumed two to four days later. The present study attempted to replicate this observation by re-examining data from a previous study in which all food ingested over a 30-day observation period was measured. Nine male and seven female participants received a vegan diet prepared, dispensed, and measured in a metabolic unit. Autocorrelations were performed on total food intake consumed on one day and that consumed one to five days later. A significant positive correlation was detected between the weight of food eaten on one day and the amount consumed on the following day (r = 0.29, 95% CI [0.37, 0.20]). No correlation was found between weights of food consumed on one day and up to twelve days later (r = 0.09, 95% CI [0.24, -0.06]), (r = 0.11, 95% CI [0.26, -0.0.26]) (r = 0.02, 95% CI [0.15, -0.7]) (r = -0.08, 95% CI [0.11, -0.09]). The same positive correlation with the previous day's intake was observed at the succeeding breakfast but not at either lunch or dinner. However, the participants underestimated their daily energy need, resulting in a small but statistically significant weight loss. Daily food intake increased slightly (13 g/day), but significantly, across the 30-day period. An analysis of the previous studies revealed that the negative correlations observed by others were caused by a statistical artifact resulting from normalizing data before testing for the correlations. These results, when combined with the published literature, indicate that there is little evidence that humans precisely compensate for the previous day's intake by altering the amount consumed on subsequent days. Moreover, the small but persistent increase in food intake suggests that physiological mechanisms that affect food intake
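The normalization artifact described here is easy to reproduce in simulation. With daily intakes that are independent by construction, subtracting each subject's own mean before pooling induces a spurious negative autocorrelation of about -1/(days - 1). A hedged sketch on simulated data, not the study's measurements:

```python
import numpy as np

rng = np.random.default_rng(3)
subjects, days = 2000, 30
# Intakes independent across days by construction (grams, arbitrary scale)
intake = 2000.0 + 300.0 * rng.standard_normal((subjects, days))

def pooled_lag_corr(x, lag):
    """Pearson correlation of day t vs day t+lag, pooled over subjects."""
    a, b = x[:, :-lag].ravel(), x[:, lag:].ravel()
    a, b = a - a.mean(), b - b.mean()
    return float((a @ b) / np.sqrt((a @ a) * (b @ b)))

raw = pooled_lag_corr(intake, 2)        # ~ 0: no true day-to-day dependence
normed = intake - intake.mean(axis=1, keepdims=True)  # per-subject centering
artifact = pooled_lag_corr(normed, 2)   # ~ -1/(days - 1) = -0.034, spurious
```

The negative value in `artifact` arises purely from the normalization step, which is the statistical point the authors make about the earlier literature.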
Indian Academy of Sciences (India)
inference and finite population sampling. Sudhakar Kunte. Elements of statistical computing are discussed in this series. ... which captain gets an option to decide whether to field first or bat first ... may of course not be fair, in the sense that the team which wins ... describe two methods of drawing a random number between 0.
Schrödinger, Erwin
1952-01-01
Nobel Laureate's brilliant attempt to develop a simple, unified standard method of dealing with all cases of statistical thermodynamics - classical, quantum, Bose-Einstein, Fermi-Dirac, and more.The work also includes discussions of Nernst theorem, Planck's oscillator, fluctuations, the n-particle problem, problem of radiation, much more.
Eigenfunction statistics on quantum graphs
International Nuclear Information System (INIS)
Gnutzmann, S.; Keating, J.P.; Piotet, F.
2010-01-01
We investigate the spatial statistics of the energy eigenfunctions on large quantum graphs. It has previously been conjectured that these should be described by a Gaussian Random Wave Model, by analogy with quantum chaotic systems, for which such a model was proposed by Berry in 1977. The autocorrelation functions we calculate for an individual quantum graph exhibit a universal component, which completely determines a Gaussian Random Wave Model, and a system-dependent deviation. This deviation depends on the graph only through its underlying classical dynamics. Classical criteria for quantum universality to be met asymptotically in the large graph limit (i.e. for the non-universal deviation to vanish) are then extracted. We use an exact field theoretic expression in terms of a variant of a supersymmetric σ model. A saddle-point analysis of this expression leads to the estimates. In particular, intensity correlations are used to discuss the possible equidistribution of the energy eigenfunctions in the large graph limit. When equidistribution is asymptotically realized, our theory predicts a rate of convergence that is a significant refinement of previous estimates. The universal and system-dependent components of intensity correlation functions are recovered by means of an exact trace formula which we analyse in the diagonal approximation, drawing in this way a parallel between the field theory and semiclassics. Our results provide the first instance where an asymptotic Gaussian Random Wave Model has been established microscopically for eigenfunctions in a system with no disorder.
International Nuclear Information System (INIS)
Nuamah, N.N.N.N.
1990-12-01
The paradoxical nature of results of the mean approach in pooling cross-section and time series data has been identified to be caused by the presence in the normal equations of phenomena such as autocovariances, multicollinear covariances, drift covariances and drift multicollinear covariances. This paper considers the problem of autocorrelation and suggests ways of solving it. (author). 4 refs
Hoef, M.A. van der; Frenkel, D.
1990-01-01
We report simulations of the velocity autocorrelation function (VACF) of a tagged particle in two- and three-dimensional lattice-gas cellular automata, using a new technique that is about a million times more efficient than the conventional techniques. The simulations clearly show the algebraic
de la Mata, Tamara; Llano, Carlos
2013-07-01
Recent literature on border effect has fostered research on informal barriers to trade and the role played by network dependencies. In relation to social networks, it has been shown that intensity of trade in goods is positively correlated with migration flows between pairs of countries/regions. In this article, we investigate whether such a relation also holds for interregional trade of services. We also consider whether interregional trade flows in services linked with tourism exhibit spatial and/or social network dependence. Conventional empirical gravity models assume the magnitude of bilateral flows between regions is independent of flows to/from regions located nearby in space, or flows to/from regions related through social/cultural/ethnic network connections. With this aim, we provide estimates from a set of gravity models showing evidence of statistically significant spatial and network (demographic) dependence in the bilateral flows of the trade of services considered. The analysis has been applied to the Spanish intra- and interregional monetary flows of services from the accommodation, restaurants and travel agencies for the period 2000-2009, using alternative datasets for the migration stocks and definitions of network effects.
International Nuclear Information System (INIS)
Anon.
1994-01-01
For the years 1992 and 1993, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period. The tables and figures shown in this publication are: Changes in the volume of GNP and energy consumption; Coal consumption; Natural gas consumption; Peat consumption; Domestic oil deliveries; Import prices of oil; Price development of principal oil products; Fuel prices for power production; Total energy consumption by source; Electricity supply; Energy imports by country of origin in 1993; Energy exports by recipient country in 1993; Consumer prices of liquid fuels; Consumer prices of hard coal and natural gas, prices of indigenous fuels; Average electricity price by type of consumer; Price of district heating by type of consumer and Excise taxes and turnover taxes included in consumer prices of some energy sources
Goodman, Joseph W.
2000-07-01
The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I Richard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research
Pivato, Marcus
2013-01-01
We show that, in a sufficiently large population satisfying certain statistical regularities, it is often possible to accurately estimate the utilitarian social welfare function, even if we only have very noisy data about individual utility functions and interpersonal utility comparisons. In particular, we show that it is often possible to identify an optimal or close-to-optimal utilitarian social choice using voting rules such as the Borda rule, approval voting, relative utilitarianism, or a...
Natrella, Mary Gibbons
1963-01-01
Formulated to assist scientists and engineers engaged in army ordnance research and development programs, this well-known and highly regarded handbook is a ready reference for advanced undergraduate and graduate students as well as for professionals seeking engineering information and quantitative data for designing, developing, constructing, and testing equipment. Topics include characterizing and comparing the measured performance of a material, product, or process; general considerations in planning experiments; statistical techniques for analyzing extreme-value data; use of transformations
A Divergence Statistics Extension to VTK for Performance Analysis
Energy Technology Data Exchange (ETDEWEB)
Pebay, Philippe Pierre [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bennett, Janine Camille [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2015-02-01
This report follows the series of previous documents ([PT08, BPRT09b, PT09, BPT09, PT10, PB13]), where we presented the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, order and auto-correlative statistics engines which we developed within the Visualization Tool Kit (VTK) as a scalable, parallel and versatile statistics package. We now report on a new engine which we developed for the calculation of divergence statistics, a concept which we hereafter explain and whose main goal is to quantify the discrepancy, in a statistical manner akin to measuring a distance, between an observed empirical distribution and a theoretical, "ideal" one. The ease of use of the new divergence statistics engine is illustrated by means of C++ code snippets. Although this new engine does not yet have a parallel implementation, it has already been applied to HPC performance analysis, of which we provide an example.
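As a rough illustration of the kind of quantity such an engine computes — a distance-like discrepancy between an observed empirical distribution and a theoretical "ideal" one — here is a generic Kullback-Leibler divergence sketch (not the VTK engine or its API; all names and numbers here are ours):

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(p||q) between two discrete
    distributions given as (unnormalized) count or probability vectors."""
    p = np.asarray(p, dtype=float); p = p / p.sum()
    q = np.asarray(q, dtype=float); q = q / q.sum()
    # eps guards the logarithm against empty bins
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

observed = np.array([18, 22, 30, 30])   # empirical histogram (hypothetical)
ideal = np.array([25, 25, 25, 25])      # theoretical "ideal" distribution
d = kl_divergence(observed, ideal)
```

D(p||q) is zero exactly when the two distributions agree and grows with the discrepancy, which is the distance-like behaviour the report ascribes to its divergence statistics.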
On the structure of dynamic principal component analysis used in statistical process monitoring
DEFF Research Database (Denmark)
Vanhatalo, Erik; Kulahci, Murat; Bergquist, Bjarne
2017-01-01
When principal component analysis (PCA) is used for statistical process monitoring it relies on the assumption that data are time independent. However, industrial data will often exhibit serial correlation. Dynamic PCA (DPCA) has been suggested as a remedy for high-dimensional and time... driven method to determine the maximum number of lags in DPCA with a foundation in multivariate time series analysis. The method is based on the behavior of the eigenvalues of the lagged autocorrelation and partial autocorrelation matrices. Given a specific lag structure we also propose a method... for determining the number of principal components to retain. The number of retained principal components is determined by visual inspection of the serial correlation in the squared prediction error statistic, Q (SPE), together with the cumulative explained variance of the model. The methods are illustrated using...
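The core mechanical step of DPCA — augmenting the data matrix with lagged copies of itself before running ordinary PCA — can be sketched in a few lines (a minimal illustration with an invented helper name and toy data, not the authors' lag-selection method):

```python
import numpy as np

def lagged_matrix(X, lags):
    """Stack X(t), X(t-1), ..., X(t-lags) column-wise, as DPCA does."""
    n = X.shape[0] - lags
    return np.hstack([X[lags - l: lags - l + n] for l in range(lags + 1)])

rng = np.random.default_rng(4)
# a serially correlated two-variable process (AR(1) driver)
x = np.empty((501, 1))
e = rng.standard_normal((501, 1))
x[0] = e[0]
for t in range(1, 501):
    x[t] = 0.9 * x[t - 1] + e[t]
X = np.hstack([x, x + 0.1 * rng.standard_normal((501, 1))])
Z = lagged_matrix(X, lags=2)   # PCA on Z, not X, captures the dynamics
```

PCA applied to `Z` then exposes the dynamic (lagged) relations that static PCA on `X` would treat as violations of its time-independence assumption.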
2017-12-08
STATISTICAL LINEAR TIME-VARYING SYSTEM MODEL OF HIGH GRAZING ANGLE SEA CLUTTER FOR COMPUTING INTERFERENCE POWER. 1. INTRODUCTION ... beam. We can approximate one of the sinc factors using the Dirichlet kernel to facilitate computation of the integral in (6) ... plotted in Figure 4. The resultant autocorrelation can then be found by substituting (18) into (28). The Python code used to generate Figures 1-4 is found ...
International Nuclear Information System (INIS)
Anon.
1989-01-01
World data from the United Nations' latest Energy Statistics Yearbook, first published in our last issue, are completed here. The 1984-86 data were revised and 1987 data added for world commercial energy production and consumption, world natural gas plant liquids production, world LP-gas production, imports, exports, and consumption, world residual fuel oil production, imports, exports, and consumption, world lignite production, imports, exports, and consumption, world peat production and consumption, world electricity production, imports, exports, and consumption (Table 80), and world nuclear electric power production
Auto-correlation in the motor/imaginary human EEG signals: A vision about the FDFA fluctuations.
Directory of Open Access Journals (Sweden)
Gilney Figueira Zebende
In this paper we analyzed, by means of the FDFA root mean square fluctuation (rms) function, the motor/imaginary human activity recorded by 64-channel electroencephalography (EEG). We utilized the PhysioNet on-line databank, a publicly available database of human EEG signals, as a standardized reference database for this study. Herein, we report the use of the detrended fluctuation analysis (DFA) method for EEG analysis. We show that the complex time series of the EEG exhibits characteristic fluctuations depending on the analyzed channel in the scalp-recorded EEG. In order to demonstrate the effectiveness of the proposed technique, we analyzed four distinct channels, represented here by F332, F637 (frontal region of the head) and P349, P654 (parietal region of the head). We verified that the amplitude of the FDFA rms function is greater for the frontal channels than for the parietal ones. To tabulate this information in a better way, we define and calculate the difference between the FDFA rms functions (in log scale) for the channels, thus defining a new path for the analysis of EEG signals. Finally, for the studied EEG signals we obtain the auto-correlation exponent αDFA by the DFA method, which reveals self-affinity at specific time scales. Our results show that this strategy can be applied to the study of human brain activity in EEG processing.
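The DFA procedure behind the αDFA exponent can be sketched compactly — integrate the mean-removed signal, detrend it in windows of increasing size n, and read the exponent off the log-log slope of the rms fluctuation F(n). This is a simplified first-order implementation of the standard method, not the authors' FDFA code:

```python
import numpy as np

def dfa(signal, scales):
    """First-order detrended fluctuation analysis: returns the scaling
    exponent alpha (slope of log F(n) vs log n) and the fluctuations F."""
    x = np.cumsum(signal - np.mean(signal))          # integrated profile
    F = []
    for n in scales:
        n_seg = len(x) // n
        segs = x[:n_seg * n].reshape(n_seg, n)
        t = np.arange(n)
        res = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)             # local linear detrend
            res.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(res)))
    alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
    return alpha, np.array(F)

rng = np.random.default_rng(0)
white = rng.standard_normal(4096)
alpha, _ = dfa(white, [16, 32, 64, 128, 256])        # expect alpha near 0.5
```

For uncorrelated noise α ≈ 0.5; persistent long-range correlations, as reported for EEG channels, push α above 0.5.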
Tufto, Jarle
2015-08-01
Adaptive responses to autocorrelated environmental fluctuations through evolution in mean reaction norm elevation and slope and an independent component of the phenotypic variance are analyzed using a quantitative genetic model. Analytic approximations expressing the mutual dependencies between all three response modes are derived and solved for the joint evolutionary outcome. Both genetic evolution in reaction norm elevation and plasticity are favored by slow temporal fluctuations, with plasticity, in the absence of microenvironmental variability, being the dominant evolutionary outcome for reasonable parameter values. For fast fluctuations, tracking of the optimal phenotype through genetic evolution and plasticity is limited. If residual fluctuations in the optimal phenotype are large and stabilizing selection is strong, selection then acts to increase the phenotypic variance (adaptive bet-hedging). Otherwise, canalizing selection occurs. If the phenotypic variance increases with plasticity through the effect of microenvironmental variability, this shifts the joint evolutionary balance away from plasticity in favor of genetic evolution. If microenvironmental deviations experienced by each individual at the time of development and selection are correlated, however, more plasticity evolves. The adaptive significance of evolutionary fluctuations in plasticity and the phenotypic variance, transient evolution, and the validity of the analytic approximations are investigated using simulations. © 2015 The Author(s). Evolution © 2015 The Society for the Study of Evolution.
Hao, Hongliang; Xiao, Wen; Chen, Zonghui; Ma, Lan; Pan, Feng
2018-01-01
Heterodyne interferometric vibration metrology is a useful technique for dynamic displacement and velocity measurement as it can provide a synchronous full-field output signal. With the advent of cost effective, high-speed real-time signal processing systems and software, processing of the complex signals encountered in interferometry has become more feasible. However, due to the coherent nature of the laser sources, the sequence of heterodyne interferograms is corrupted by a mixture of coherent speckle and incoherent additive noise, which can severely degrade the accuracy of the demodulated signal and the optical display. In this paper, a new heterodyne interferometric demodulation method combining auto-correlation analysis and spectral filtering is described, leading to an expression for the dynamic displacement and velocity of the object under test that is significantly more accurate in both the amplitude and frequency of the vibrating waveform. We present a mathematical model of the signals obtained from interferograms that contain both vibration information of the measured objects and the noise. A simulation of the signal demodulation process is presented and used to investigate the noise from the system and external factors. The experimental results show excellent agreement with measurements from a commercial laser Doppler velocimeter (LDV).
International Nuclear Information System (INIS)
Allen, Bruce; Creighton, Jolien D.E.; Flanagan, Eanna E.; Romano, Joseph D.
2003-01-01
In a previous paper (paper I), we derived a set of near-optimal signal detection techniques for gravitational wave detectors whose noise probability distributions contain non-Gaussian tails. The methods modify standard methods by truncating or clipping sample values which lie in those non-Gaussian tails. The methods were derived, in the frequentist framework, by minimizing false alarm probabilities at fixed false detection probability in the limit of weak signals. For stochastic signals, the resulting statistic consisted of a sum of an autocorrelation term and a cross-correlation term; it was necessary to discard 'by hand' the autocorrelation term in order to arrive at the correct, generalized cross-correlation statistic. In the present paper, we present an alternative derivation of the same signal detection techniques from within the Bayesian framework. We compute, for both deterministic and stochastic signals, the probability that a signal is present in the data, in the limit where the signal-to-noise ratio squared per frequency bin is small, where the signal is nevertheless strong enough to be detected (integrated signal-to-noise ratio large compared to 1), and where the total probability in the non-Gaussian tail part of the noise distribution is small. We show that, for each model considered, the resulting probability is to a good approximation a monotonic function of the detection statistic derived in paper I. Moreover, for stochastic signals, the new Bayesian derivation automatically eliminates the problematic autocorrelation term
National Statistical Commission and Indian Official Statistics*
Indian Academy of Sciences (India)
IAS Admin
a good collection of official statistics of that time. With more ... statistical agencies and institutions to provide details of statistical activities ... ...ing several training programmes ... successful completion of Indian Statistical Service examinations, the ...
DEFF Research Database (Denmark)
Stein, A.; Ewert, Stephan; Wiegrebe, L.
2005-01-01
Recent temporal models of pitch and amplitude modulation perception converge on a relatively realistic implementation of cochlear processing followed by a temporal analysis of periodicity. However, for modulation perception, a modulation filterbank is applied whereas for pitch perception, autocorrelation is applied. Considering the large overlap in pitch and modulation perception, this is not parsimonious. Two experiments are presented to investigate the interaction between carrier periodicity, which produces strong pitch sensations, and envelope periodicity using broadband stimuli. Results show...
Tellinghuisen, Joel
2008-01-01
The method of least squares is probably the most powerful data analysis tool available to scientists. Toward a fuller appreciation of that power, this work begins with an elementary review of statistics fundamentals, and then progressively increases in sophistication as the coverage is extended to the theory and practice of linear and nonlinear least squares. The results are illustrated in application to data analysis problems important in the life sciences. The review of fundamentals includes the role of sampling and its connection to probability distributions, the Central Limit Theorem, and the importance of finite variance. Linear least squares are presented using matrix notation, and the significance of the key probability distributions-Gaussian, chi-square, and t-is illustrated with Monte Carlo calculations. The meaning of correlation is discussed, including its role in the propagation of error. When the data themselves are correlated, special methods are needed for the fitting, as they are also when fitting with constraints. Nonlinear fitting gives rise to nonnormal parameter distributions, but the 10% Rule of Thumb suggests that such problems will be insignificant when the parameter is sufficiently well determined. Illustrations include calibration with linear and nonlinear response functions, the dangers inherent in fitting inverted data (e.g., Lineweaver-Burk equation), an analysis of the reliability of the van't Hoff analysis, the problem of correlated data in the Guggenheim method, and the optimization of isothermal titration calorimetry procedures using the variance-covariance matrix for experiment design. The work concludes with illustrations on assessing and presenting results.
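The variance-covariance matrix invoked above for experiment design drops straight out of the weighted linear least-squares normal equations; a small self-contained sketch with illustrative data (not the article's own examples):

```python
import numpy as np

def linfit(x, y, sigma):
    """Weighted linear least squares y = a + b*x; returns the parameter
    vector [a, b] and its variance-covariance matrix."""
    X = np.column_stack([np.ones_like(x), x])   # design matrix
    W = np.diag(1.0 / sigma ** 2)               # weights = 1/variance
    cov = np.linalg.inv(X.T @ W @ X)            # parameter covariance
    beta = cov @ X.T @ W @ y
    return beta, cov

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = 2.0 + 0.5 * x                               # exact line, unit "errors"
beta, cov = linfit(x, y, np.ones_like(x))       # recovers a = 2, b = 0.5
```

The off-diagonal element of `cov` is precisely the intercept-slope covariance that the review stresses must be carried along when propagating errors.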
Goussev, Arseni; Dorfman, J R
2006-07-01
We consider the time evolution of a wave packet representing a quantum particle moving in a geometrically open billiard that consists of a number of fixed hard-disk or hard-sphere scatterers. Using the technique of multiple collision expansions we provide a first-principle analytical calculation of the time-dependent autocorrelation function for the wave packet in the high-energy diffraction regime, in which the particle's de Broglie wavelength, while being small compared to the size of the scatterers, is large enough to prevent the formation of geometric shadow over distances of the order of the particle's free flight path. The hard-disk or hard-sphere scattering system must be sufficiently dilute in order for this high-energy diffraction regime to be achievable. Apart from the overall exponential decay, the autocorrelation function exhibits a generally complicated sequence of relatively strong peaks corresponding to partial revivals of the wave packet. Both the exponential decay (or escape) rate and the revival peak structure are predominantly determined by the underlying classical dynamics. A relation between the escape rate, and the Lyapunov exponents and Kolmogorov-Sinai entropy of the counterpart classical system, previously known for hard-disk billiards, is strengthened by generalization to three spatial dimensions. The results of the quantum mechanical calculation of the time-dependent autocorrelation function agree with predictions of the semiclassical periodic orbit theory.
Energy Technology Data Exchange (ETDEWEB)
Yamamoto, H; Iwamoto, K; Saito, T; Tachibana, M [Iwate University, Iwate (Japan). Faculty of Engineering
1997-05-27
Methods for determining underground structures from the dispersion of surface waves contained in microtremors include the frequency-wave number analysis method (the F-K method) and the spatial autocorrelation method (the SAC method). Although the SAC method can explore structures at greater depths, it is little used because of a stringent restriction on the arrangement of seismometers during observation: they must be placed evenly on the same circumference. To remove this restriction in the SAC method, a research group at Hokuriku University has proposed an expanded spatial autocorrelation (ESAC) method. Building on the ESAC concept, a method was realized to improve phase velocity estimation by simulating an array shifted in the radial direction. As a result of the discussion, it was found that the proposed improvement can be applied to places where waves arrive from many directions, such as urban areas. Where the improvement method applies, the spatial autocorrelation function need not be uniform in the circumferential direction; in other words, the SAC method can be applied to arbitrary arrays. 1 ref., 7 figs.
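The classical SPAC relation underlying both the SAC and ESAC methods states that the azimuthally averaged correlation coefficient at station separation r equals J0(2πfr/c(f)), so the phase velocity c(f) can be fitted from observed coefficients. A toy grid-search sketch (all frequencies, separations and velocities invented for the demo; J0 is computed from its integral form to stay dependency-free):

```python
import numpy as np

def bessel_j0(z, n=4000):
    """J0 via its integral representation (midpoint rule, no scipy needed)."""
    theta = (np.arange(n) + 0.5) * np.pi / n
    return np.cos(np.outer(np.atleast_1d(z), np.sin(theta))).mean(axis=-1)

def spac_coefficient(freq, radius, velocity):
    """Theoretical SPAC coefficient rho(f, r) = J0(2*pi*f*r / c)."""
    return bessel_j0(2.0 * np.pi * freq * radius / velocity)

freqs = np.array([2.0, 4.0, 6.0])                 # Hz (invented)
r = 50.0                                          # station separation, m
obs = spac_coefficient(freqs, r, 400.0)           # "observed", c = 400 m/s
trial = np.linspace(100.0, 800.0, 141)
misfit = [np.sum((spac_coefficient(freqs, r, c) - obs) ** 2) for c in trial]
c_est = float(trial[int(np.argmin(misfit))])      # grid-search estimate
```

In a real survey `obs` would come from averaged cross-spectra of the seismometer array rather than from the forward model.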
Statistical spatial properties of speckle patterns generated by multiple laser beams
International Nuclear Information System (INIS)
Le Cain, A.; Sajer, J. M.; Riazuelo, G.
2011-01-01
This paper investigates hot spot characteristics generated by the superposition of multiple laser beams. First, properties of speckle statistics are studied in the context of only one laser beam by computing the autocorrelation function. The case of multiple laser beams is then considered. In certain conditions, it is shown that speckles have an ellipsoidal shape. Analytical expressions of hot spot radii generated by multiple laser beams are derived and compared to numerical estimates made from the autocorrelation function. They are also compared to numerical simulations performed within the paraxial approximation. Excellent agreement is found for the speckle width as well as for the speckle length. Application to the speckle patterns generated in the Laser MegaJoule configuration in the zone where all the beams overlap is presented. Influence of polarization on the size of the speckles as well as on their abundance is studied.
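Estimating a speckle radius from the autocorrelation function, as done in the paper, can be illustrated with a generic single-beam simulation (a random phase screen behind a small circular pupil; grid and pupil sizes are arbitrary and unrelated to the Laser MegaJoule configuration):

```python
import numpy as np

rng = np.random.default_rng(5)
N, R = 256, 16                                # grid and pupil radius, pixels
yy, xx = np.mgrid[:N, :N]
pupil = ((xx - N // 2) ** 2 + (yy - N // 2) ** 2 <= R ** 2).astype(float)
# fully developed speckle: random phase across the pupil, propagated by FFT
field = np.fft.fft2(pupil * np.exp(2j * np.pi * rng.random((N, N))))
I = np.abs(field) ** 2
dI = I - I.mean()
# intensity autocorrelation via the Wiener-Khinchin theorem
acf = np.fft.ifft2(np.abs(np.fft.fft2(dI)) ** 2).real
acf = np.fft.fftshift(acf)
acf /= acf[N // 2, N // 2]                    # normalize zero-lag peak to 1
row = acf[N // 2]                             # central cut through the peak
above = np.where(row >= 0.5)[0]
fwhm_px = int(above.max() - above.min() + 1)  # speckle width in pixels
```

The width of the central autocorrelation lobe scales inversely with the pupil size, which is why overlapping beams with an elongated effective aperture produce the ellipsoidal speckles described above.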
How to statistically analyze nano exposure measurement results: using an ARIMA time series approach
International Nuclear Information System (INIS)
Klein Entink, Rinke H.; Fransman, Wouter; Brouwer, Derk H.
2011-01-01
Measurement strategies for exposure to nano-sized particles differ from traditional integrated sampling methods for exposure assessment by the use of real-time instruments. The resulting measurement series is a time series, where typically the sequential measurements are not independent from each other but show a pattern of autocorrelation. This article addresses the statistical difficulties when analyzing real-time measurements for exposure assessment to manufactured nano objects. To account for autocorrelation patterns, Autoregressive Integrated Moving Average (ARIMA) models are proposed. A simulation study shows the pitfalls of using a standard t-test and the application of ARIMA models is illustrated with three real-data examples. Some practical suggestions for the data analysis of real-time exposure measurements conclude this article.
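The pitfall that the simulation study exposes — treating autocorrelated real-time measurements as independent — can be seen by computing the effective sample size of an AR(1) series (a hedged numpy sketch; the coefficient φ and series length are arbitrary choices, not values from the article):

```python
import numpy as np

rng = np.random.default_rng(1)

def ar1(n, phi, sigma=1.0):
    """Simulate a stationary AR(1) series x[t] = phi*x[t-1] + noise."""
    x = np.empty(n)
    x[0] = rng.normal(scale=sigma / np.sqrt(1.0 - phi ** 2))
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal(scale=sigma)
    return x

n, phi = 500, 0.8
series = ar1(n, phi)
r1 = np.corrcoef(series[:-1], series[1:])[0, 1]   # lag-1 autocorrelation
n_eff = n * (1.0 - r1) / (1.0 + r1)               # effective sample size
```

With φ = 0.8 the 500 autocorrelated observations carry roughly the information of only a few dozen independent ones, so a naive t-test drastically understates the standard error of the mean — the motivation for fitting ARIMA models instead.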
Chuang, Shin-Shin; Wu, Kung-Tai; Lin, Chen-Yang; Lee, Steven; Chen, Gau-Yang; Kuo, Cheng-Deng
2014-08-01
The Poincaré plot of RR intervals (RRI) is obtained by plotting RRIn+1 against RRIn. The Pearson correlation coefficient (ρRRI), slope (SRRI), Y-intercept (YRRI), standard deviation of instantaneous beat-to-beat RRI variability (SD1RR), and standard deviation of continuous long-term RRI variability (SD2RR) can be defined to characterize the plot. Similarly, the Poincaré plot of autocorrelation function (ACF) of RRI can be obtained by plotting ACFk+1 against ACFk. The corresponding Pearson correlation coefficient (ρACF), slope (SACF), Y-intercept (YACF), SD1ACF, and SD2ACF can be defined similarly to characterize the plot. By comparing the indices of Poincaré plots of RRI and ACF between patients with acute myocardial infarction (AMI) and patients with patent coronary artery (PCA), we found that the ρACF and SACF were significantly larger, whereas the RMSSDACF/SDACF and SD1ACF/SD2ACF were significantly smaller in AMI patients. The ρACF and SACF correlated significantly and negatively with normalized high-frequency power (nHFP), and significantly and positively with normalized very low-frequency power (nVLFP) of heart rate variability in both groups of patients. On the contrary, the RMSSDACF/SDACF and SD1ACF/SD2ACF correlated significantly and positively with nHFP, and significantly and negatively with nVLFP and low-/high-frequency power ratio (LHR) in both groups of patients. We concluded that the ρACF, SACF, RMSSDACF/SDACF, and SD1ACF/SD2ACF, among many other indices of ACF Poincaré plot, can be used to differentiate between patients with AMI and patients with PCA, and that the increase in ρACF and SACF and the decrease in RMSSDACF/SDACF and SD1ACF/SD2ACF suggest an increased sympathetic and decreased vagal modulations in both groups of patients.
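The Poincaré-plot indices defined above (ρ, slope, Y-intercept, SD1, SD2) can be computed directly from an RRI series; a sketch on a synthetic series (toy data, not the patients' recordings):

```python
import numpy as np

def poincare_indices(rri):
    """Pearson correlation, slope, Y-intercept, SD1 and SD2 of the
    Poincare plot RRI[n+1] versus RRI[n]."""
    x, y = rri[:-1], rri[1:]
    rho = np.corrcoef(x, y)[0, 1]
    slope, intercept = np.polyfit(x, y, 1)
    sd1 = np.sqrt(np.var(y - x, ddof=1) / 2.0)   # beat-to-beat spread
    sd2 = np.sqrt(np.var(y + x, ddof=1) / 2.0)   # long-term spread
    return rho, slope, intercept, sd1, sd2

rng = np.random.default_rng(2)
rri = 800.0 + np.cumsum(rng.normal(0.0, 5.0, 300))   # toy RRI series, ms
rho, slope, yint, sd1, sd2 = poincare_indices(rri)
```

The same function applied to the lagged autocorrelation sequence ACF(k+1) versus ACF(k) instead of the raw RRI yields the ρACF, SACF, SD1ACF and SD2ACF indices used in the study.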
Sanden, van der J.J.; Hoekman, D.H.
2005-01-01
In the present paper we review relationships between commonly used statistical approaches to analysis of image texture. The approaches considered characterize image texture by means of the statistics of grey- tone co- occurrence contrast, grey- tone co- occurrence correlation, semivariance, and
Energy Technology Data Exchange (ETDEWEB)
Singh, Bhupinder [Department of Chemistry and Biochemistry, C-100 BNSN, Brigham Young University, Provo, UT 84602 (United States); Velázquez, Daniel; Terry, Jeff [Department of Physics, Illinois Institute of Technology, Chicago, IL 60616 (United States); Linford, Matthew R., E-mail: mrlinford@chem.byu.edu [Department of Chemistry and Biochemistry, C-100 BNSN, Brigham Young University, Provo, UT 84602 (United States)
2014-12-15
Highlights: • We apply the equivalent and autocorrelation widths and variance to XPS narrow scans. • This approach is complementary to traditional peak fitting methods. • It is bias free and responsive to subtle chemical changes in spectra. • It has the potential for machine interpretation of spectra and quality control. • It has the potential for analysis of complex spectra and tracking charging/artifacts. - Abstract: X-ray photoelectron spectroscopy (XPS) is widely used in surface and materials laboratories around the world. It is a near surface technique, providing detailed chemical information about samples in the form of survey and narrow scans. To extract the maximum amount of information about materials it is often necessary to peak fit XPS narrow scans. And while indispensable to XPS data analysis, even experienced practitioners can struggle with their peak fitting. In our previous publication, we introduced the equivalent width (EW_XPS) as both a possible machine automated method, one that requires less expert judgment for characterizing XPS narrow scans, and as an approach that may be well suited for the analysis of complex spectra. The EW_XPS figure of merit was applied to four different data sets. However, as previously noted, other width functions are also regularly employed for analyzing functions. Here we evaluate two other width functions for XPS narrow scan analysis: the autocorrelation width (AW_XPS) and the variance (σ²_XPS). These widths were applied to the same four sets of spectra studied before: (i) four C 1s narrow scans of ozone-treated carbon nanotubes (CNTs) (EW_XPS: ∼2.11–2.16 eV, AW_XPS: ∼3.9–4.1 eV, σ²_XPS: ∼5.0–5.2 eV, and a modified form of σ²_XPS, denoted σ²*_XPS: ∼6.3–6.8 eV), (ii) silicon wafers with different oxide thicknesses (EW_XPS: ∼1.5–2.9 eV, AW_XPS: ∼2.28–4.9 eV, and σ²_XPS: ∼0.7–4.9 eV), (iii
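The three width measures can be written down in a few lines. Here is a generic sketch on a synthetic Gaussian "narrow scan" (our notation, not the authors' code); for a unit-sigma Gaussian the analytic values are EW = σ√(2π) ≈ 2.507, AW = 2σ√π ≈ 3.545 and variance σ² = 1:

```python
import numpy as np

def spectral_widths(E, I):
    """Equivalent width, autocorrelation width and variance of a peak
    sampled on a uniform energy grid (rectangle-rule integrals)."""
    dE = E[1] - E[0]
    area = I.sum() * dE
    ew = area / I.max()                    # equivalent width: area / peak
    ac = np.correlate(I, I, mode='full')
    aw = ac.sum() * dE / ac.max()          # autocorrelation width
    mu = (E * I).sum() * dE / area
    var = ((E - mu) ** 2 * I).sum() * dE / area
    return ew, aw, var

E = np.arange(-10.0, 10.0, 0.01)
I = np.exp(-E ** 2 / 2.0)                  # unit-sigma Gaussian peak
ew, aw, var = spectral_widths(E, I)
```

All three are single numbers computed without any peak model, which is what makes them attractive for machine interpretation and quality control of narrow scans.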
Statistical study of density fluctuations in the tore supra tokamak
International Nuclear Information System (INIS)
Devynck, P.; Fenzi, C.; Garbet, X.; Laviron, C.
1998-03-01
It is believed that the radial anomalous transport in tokamaks is caused by plasma turbulence. Using an infra-red laser scattering technique on the Tore Supra tokamak, the statistical properties of the density fluctuations are studied as a function of scale in ohmic as well as additional heating regimes using the lower hybrid or ion cyclotron frequencies. The probability distributions are compared to a Gaussian in order to estimate the role of intermittency, which is found to be negligible. The temporal behaviour of the three-dimensional spectrum is thoroughly discussed; its multifractal character is reflected in the singularity spectrum. The autocorrelation coefficients of the fluctuations are examined, as well as their long-time incoherence and statistical independence. We also put forward the existence of fluctuation transfer between two distinct but close wavenumbers. A clearer picture is thus obtained of the way energy is transferred through the turbulent scales. (author)
Statistical processing of technological and radiochemical data
International Nuclear Information System (INIS)
Lahodova, Zdena; Vonkova, Kateřina
2011-01-01
The project described in this article had two goals. The main goal was to compare technological and radiochemical data from two units of nuclear power plant. The other goal was to check the collection, organization and interpretation of routinely measured data. Monitoring of analytical and radiochemical data is a very valuable source of knowledge for some processes in the primary circuit. Exploratory analysis of one-dimensional data was performed to estimate location and variability and to find extreme values, data trends, distribution, autocorrelation etc. This process allowed for the cleaning and completion of raw data. Then multiple analyses such as multiple comparisons, multiple correlation, variance analysis, and so on were performed. Measured data was organized into a data matrix. The results and graphs such as Box plots, Mahalanobis distance, Biplot, Correlation, and Trend graphs are presented in this article as statistical analysis tools. Tables of data were replaced with graphs because graphs condense large amounts of information into easy-to-understand formats. The significant conclusion of this work is that the collection and comprehension of data is a very substantial part of statistical processing. With well-prepared and well-understood data, its accurate evaluation is possible. Cooperation between the technicians who collect data and the statistician who processes it is also very important. (author)
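One of the multivariate screening tools named above, the Mahalanobis distance, is easy to sketch: it flags measurements that are extreme relative to the joint covariance structure of the variables, not just to any single variable (illustrative numpy only, with fabricated toy data):

```python
import numpy as np

def mahalanobis_sq(X):
    """Squared Mahalanobis distance of each row of X from the sample mean."""
    mu = X.mean(axis=0)
    inv_cov = np.linalg.inv(np.cov(X, rowvar=False))
    diff = X - mu
    return np.einsum('ij,jk,ik->i', diff, inv_cov, diff)

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 3))      # 200 routine measurements, 3 variables
X[0] = [8.0, 8.0, 8.0]             # one gross outlier
d2 = mahalanobis_sq(X)             # the outlier dominates the distances
```

In a monitoring context such distances would be compared against a chi-square threshold to decide which measurements deserve a closer look during data cleaning.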
Directory of Open Access Journals (Sweden)
Tura Andrea
2012-02-01
Background: Symmetry and regularity of gait are essential outcomes of gait retraining programs, especially in lower-limb amputees. This study aims at presenting an algorithm to automatically compute symmetry and regularity indices, and at assessing the minimum number of strides for appropriate evaluation of gait symmetry and regularity through autocorrelation of acceleration signals. Methods: Ten transfemoral amputees (AMP) and ten control subjects (CTRL) were studied. Subjects wore an accelerometer and were asked to walk for 70 m at their natural speed (twice). Reference values of the step and stride regularity indices (Ad1 and Ad2) were obtained by autocorrelation analysis of the vertical and antero-posterior acceleration signals, excluding the initial and final strides. The Ad1 and Ad2 coefficients were then computed at different stages by analyzing increasing portions of the signals (considering both the signals cleaned of initial and final strides, and the whole signals). At each stage, the difference between the Ad1 and Ad2 values and the corresponding reference values was compared with the minimum detectable difference (MDD) of the index. If that difference was less than the MDD, it was assumed that the portion of signal used in the analysis was of sufficient length to allow reliable estimation of the autocorrelation coefficient. Results: All Ad1 and Ad2 indices were lower in AMP than in CTRL (P ... Conclusions: Without the need to identify and eliminate the phases of gait initiation and termination, twenty strides can provide a reasonable amount of information to reliably estimate gait regularity in transfemoral amputees.
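The step and stride regularity indices are the unbiased autocorrelation coefficients of the acceleration signal at the step lag (Ad1) and the stride lag (Ad2). A sketch on a synthetic acceleration signal (signal shape and parameters invented; not the subjects' data):

```python
import numpy as np

def autocorr_unbiased(a):
    """Unbiased, normalized autocorrelation coefficients of a signal."""
    a = a - a.mean()
    n = len(a)
    ac = np.correlate(a, a, mode='full')[n - 1:]
    ac = ac / (n - np.arange(n))         # unbiased normalization
    return ac / ac[0]

fs, step_t = 100, 0.5                    # sampling rate (Hz), step period (s)
t = np.arange(0.0, 20.0, 1.0 / fs)
rng = np.random.default_rng(6)
# step-frequency component plus a weaker stride-frequency component + noise
acc = (np.sin(2 * np.pi * t / step_t)
       + 0.3 * np.sin(2 * np.pi * t / (2 * step_t))
       + 0.1 * rng.standard_normal(t.size))
ac = autocorr_unbiased(acc)
step_lag = int(step_t * fs)
Ad1 = ac[step_lag]                       # step regularity
Ad2 = ac[2 * step_lag]                   # stride regularity
```

Because the left and right steps differ (the stride-frequency component), the signal repeats more faithfully over a full stride than over a single step, so Ad2 exceeds Ad1 — the asymmetry the indices are designed to expose in amputee gait.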
DEFF Research Database (Denmark)
Nielsen, Erland Hejn
2004-01-01
significant nature or (2) aggregate system behaviour is in general very different from just the summing-up (even for finite sets of micro-behavioural patterns) and/or (3) it is simply a wrong assumption that in many cases is chosen by mere convention or plain convenience. It is evident that before choosing...... method or some autocorrelation extended descriptive sampling method, can then easily be applied. The results from the Livny, Melamed and Tsiolis (1993) study as well as the results from this work both indicate that system performance measures such as, for instance, average waiting time or average time......
Mei, Zhixiong; Wu, Hao; Li, Shiyun
2018-06-01
The Conversion of Land Use and its Effects at Small regional extent (CLUE-S) model, which is widely used for land-use simulation, utilizes logistic regression to estimate the relationships between land use and its drivers and thus predict land-use change probabilities. However, logistic regression disregards possible spatial autocorrelation and self-organization in land-use data. Autologistic regression can depict spatial autocorrelation but cannot address self-organization, while logistic regression that considers only self-organization (NE-logistic regression) fails to capture spatial autocorrelation. Therefore, this study developed a regression method (NE-autologistic regression) that incorporates both spatial autocorrelation and self-organization to improve CLUE-S. The Zengcheng District of Guangzhou, China was selected as the study area. Land-use data for 2001, 2005, and 2009, as well as 10 typical driving factors, were used to validate the proposed regression method and the improved CLUE-S model. Three future land-use scenarios for 2020 (a natural growth scenario, an ecological protection scenario, and an economic development scenario) were then simulated using the improved model. Validation results showed that NE-autologistic regression performed better than logistic regression, autologistic regression, and NE-logistic regression in predicting land-use change probabilities. The spatial allocation accuracy and kappa values of NE-autologistic-CLUE-S were higher than those of logistic-CLUE-S, autologistic-CLUE-S, and NE-logistic-CLUE-S for the simulations of two periods, 2001-2009 and 2005-2009, which proved that the improved CLUE-S model achieved the best simulation and was thereby effective to a certain extent. The scenario simulation results indicated that under all three scenarios, traffic land and residential/industrial land would increase, whereas arable land and unused land would decrease during 2009-2020. Apparent differences also existed in the
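The spatial-autocorrelation term that autologistic regression adds to plain logistic regression is typically an "autocovariate": for each cell, the fraction of neighboring cells already in the target land-use class. A generic sketch of that regressor is below; the function name and neighborhood choice are illustrative assumptions, not the authors' NE-autologistic formulation or the CLUE-S API.

```python
import numpy as np

def autocovariate(grid, window=1):
    """Fraction of presence (1) cells in the Moore neighborhood of each
    cell of a binary land-use grid -- the extra regressor autologistic
    regression feeds to the logistic model (illustrative sketch)."""
    g = np.asarray(grid, dtype=float)
    padded = np.pad(g, window, mode="constant")   # zeros outside the map
    out = np.zeros_like(g)
    count = (2 * window + 1) ** 2 - 1             # neighbors per cell
    for di in range(-window, window + 1):
        for dj in range(-window, window + 1):
            if di == 0 and dj == 0:
                continue
            out += padded[window + di: window + di + g.shape[0],
                          window + dj: window + dj + g.shape[1]]
    return out / count

grid = np.array([[1, 1, 0],
                 [1, 0, 0],
                 [0, 0, 0]])
cov = autocovariate(grid)
```

The resulting surface is appended to the driver matrix before the logistic fit, which is how the model absorbs spatial dependence that the ordinary drivers cannot explain.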
DEFF Research Database (Denmark)
Nørgaard, Pernille; Wright, Dave; Ball, Susan
2013-01-01
Theoretically, repeated sampling of free β-human chorionic gonadotropin (hCGβ) and pregnancy associated plasma protein-A (PAPP-A) in the first trimester of pregnancy might improve performance of risk assessment of trisomy 21 (T21). To assess the performance of a screening test involving repeated...... measures of biochemical markers, correlations between markers must be estimated. The aims of this study were to calculate the autocorrelation and cross-correlation between hCGβ and PAPP-A in the first trimester of pregnancy and to investigate the possible impact of gestational age at the first sample...
Asten, M. W.; Hayashi, K.
2018-05-01
Ambient seismic noise or microtremor observations used in spatial auto-correlation (SPAC) array methods consist of a wide frequency range of surface waves, from about 0.1 Hz to several tens of Hz. The wavelengths (and hence the depth sensitivity) of such surface waves allow determination of the site S-wave velocity model from a depth of 1 or 2 m down to a maximum of several kilometres; it is a passive seismic method using only ambient noise as the energy source. The method typically uses a 2D seismic array with a small number of seismometers (generally between 2 and 15) to estimate the phase-velocity dispersion curve and hence the S-wave velocity depth profile for the site. A large number of methods have been proposed to estimate the dispersion curve; SPAC is one of the oldest and most commonly used, owing to its versatility and minimal instrumentation requirements. We show that direct fitting of observed and model SPAC spectra generally gives a superior bandwidth of usable data than the more common approach of inversion after the intermediate step of constructing an observed dispersion curve. Current case histories demonstrate the method with a range of array types, including two-station arrays, L-shaped multi-station arrays, and triangular and circular arrays. Arrays from a few metres to several kilometres in diameter have been successfully deployed in sites ranging from downtown urban settings to rural and remote desert sites. A fundamental requirement of the method is the ability to average wave propagation over a range of azimuths; this can be achieved with either or both of the wave sources being widely distributed in azimuth, and the use of a 2D array sampling the wave field over a range of azimuths. Several variants of the method extend its applicability to under-sampled data from sparse arrays, the complexity of multiple-mode propagation of energy, and the problem of precise estimation where array geometry departs from an
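At a single frequency, the classic SPAC relation for a vertical-component circular array ties the azimuthally averaged coherency to a Bessel function, ρ(f) = J0(2πfr/c(f)), where r is station separation and c the phase velocity. A toy single-mode inversion of that relation is sketched below; all numbers (velocity, radius, frequency, bracket) are illustrative assumptions, and the bracket is restricted to the first lobe of J0 so the relation is monotone.

```python
import numpy as np

def bessel_j0(x):
    """J0(x) via its integral representation (midpoint quadrature)."""
    t = (np.arange(4000) + 0.5) * (np.pi / 4000)
    return np.cos(x * np.sin(t)).mean()

def spac_phase_velocity(freq, rho_obs, r, c_lo, c_hi, iters=60):
    """Invert rho = J0(2*pi*freq*r/c) for phase velocity c by bisection.
    Valid only while the J0 argument stays inside its first lobe,
    where the relation is monotone (single-mode sketch)."""
    g = lambda c: bessel_j0(2 * np.pi * freq * r / c) - rho_obs
    lo, hi = c_lo, c_hi
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if g(lo) * g(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# Round trip: forward-model a SPAC coefficient, then invert it.
c_true = 400.0            # m/s (assumed)
r, freq = 30.0, 4.0       # station separation (m), frequency (Hz)
rho = bessel_j0(2 * np.pi * freq * r / c_true)
c_est = spac_phase_velocity(freq, rho, r, c_lo=320.0, c_hi=5000.0)
```

Direct spectrum fitting, which the abstract advocates, repeats this comparison across the whole frequency band rather than first reducing the data to a dispersion curve.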
Liu, Lu; Wei, Jianrong; Zhang, Huishu; Xin, Jianhong; Huang, Jiping
2013-01-01
Because classical music has greatly affected our life and culture in its long history, it has attracted extensive attention from researchers seeking to understand the laws behind it. Based on statistical physics, here we use a different method to investigate classical music, namely, by analyzing cumulative distribution functions (CDFs) and autocorrelation functions of pitch fluctuations in compositions. We analyze 1,876 compositions of five representative classical music composers across 164 years, from Bach, to Mozart, to Beethoven, to Mendelssohn, and to Chopin. We report that the biggest pitch fluctuations of a composer gradually increase as time evolves from Bach's time to the time of Mendelssohn and Chopin. In particular, for the compositions of a composer, the positive and negative tails of a CDF of pitch fluctuations are distributed not only in power laws (with the scale-free property), but also in symmetry (namely, the probability of a treble following a bass and that of a bass following a treble are basically the same for each composer). The power-law exponent decreases as time elapses. Further, we also calculate the autocorrelation function of the pitch fluctuation. The autocorrelation function shows a power-law distribution for each composer. Especially, the power-law exponents vary with the composers, indicating their different levels of long-range correlation of notes. This work not only suggests a way to understand and develop music from a viewpoint of statistical physics, but also enriches the realm of traditional statistical physics by analyzing music.
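A standard way to estimate the power-law exponent of heavy tails like those described above is the Hill estimator over the largest order statistics. The sketch below applies it to synthetic "pitch fluctuations" drawn from a Pareto tail; the data, cutoff k, and exponent are all illustrative assumptions, not the paper's corpus or method.

```python
import numpy as np

def tail_exponent(samples, k):
    """Hill estimator of the power-law tail exponent of |fluctuations|,
    computed over the k largest magnitudes (illustrative sketch)."""
    x = np.sort(np.abs(np.asarray(samples, dtype=float)))[::-1]
    top = x[:k]
    # alpha_hat = 1 / mean(log(X_(i) / X_(k))), i = 1..k-1
    return 1.0 / np.mean(np.log(top[:-1] / top[-1]))

rng = np.random.default_rng(0)
# Synthetic fluctuations with an exact Pareto tail, exponent alpha = 2.5.
alpha = 2.5
flucs = rng.pareto(alpha, size=50_000) + 1.0
alpha_hat = tail_exponent(flucs, k=1000)
```

For real pitch series one would take successive pitch differences first and estimate the positive and negative tails separately, which is how the symmetry claim in the abstract can be checked.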
State Transportation Statistics 2014
2014-12-15
The Bureau of Transportation Statistics (BTS) presents State Transportation Statistics 2014, a statistical profile of transportation in the 50 states and the District of Columbia. This is the 12th annual edition of State Transportation Statistics, a ...
Renyi statistics in equilibrium statistical mechanics
International Nuclear Information System (INIS)
Parvan, A.S.; Biro, T.S.
2010-01-01
The Renyi statistics in the canonical and microcanonical ensembles is examined both in general and in particular for the ideal gas. In the microcanonical ensemble the Renyi statistics is equivalent to the Boltzmann-Gibbs statistics. By the exact analytical results for the ideal gas, it is shown that in the canonical ensemble, taking the thermodynamic limit, the Renyi statistics is also equivalent to the Boltzmann-Gibbs statistics. Furthermore, it satisfies the requirements of equilibrium thermodynamics, i.e. the thermodynamical potential of the statistical ensemble is a homogeneous function of first degree of its extensive variables of state. We conclude that the Renyi statistics arrives at the same thermodynamical relations as those stemming from the Boltzmann-Gibbs statistics in this limit.
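For reference, the definition underlying the comparison, in its standard textbook form (not quoted from the report): the Renyi entropy of a distribution {p_i} and its Boltzmann-Gibbs limit are

```latex
S^{\mathrm{R}}_q \;=\; \frac{1}{1-q}\,\ln\sum_i p_i^{\,q},
\qquad q>0,\ q\neq 1,
\qquad\text{and by l'H\^opital's rule (using } \textstyle\sum_i p_i = 1\text{):}
\qquad
\lim_{q\to 1} S^{\mathrm{R}}_q
\;=\; \frac{\sum_i p_i \ln p_i}{-1}
\;=\; -\sum_i p_i \ln p_i
\;=\; S_{\mathrm{BG}} .
```

This limit is the sense in which the two statistics coincide in the results summarized above.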
Sampling, Probability Models and Statistical Reasoning Statistical
Indian Academy of Sciences (India)
Home; Journals; Resonance – Journal of Science Education; Volume 1; Issue 5. Sampling, Probability Models and Statistical Reasoning Statistical Inference. Mohan Delampady V R Padmawar. General Article Volume 1 Issue 5 May 1996 pp 49-58 ...
Statistical Estimation of Heterogeneities: A New Frontier in Well Testing
Neuman, S. P.; Guadagnini, A.; Illman, W. A.; Riva, M.; Vesselinov, V. V.
2001-12-01
Well-testing methods have traditionally relied on analytical solutions of groundwater flow equations in relatively simple domains, consisting of one or at most a few units having uniform hydraulic properties. Recently, attention has been shifting toward methods and solutions that would allow one to characterize subsurface heterogeneities in greater detail. On one hand, geostatistical inverse methods are being used to assess the spatial variability of parameters, such as permeability and porosity, on the basis of multiple cross-hole pressure interference tests. On the other hand, analytical solutions are being developed to describe the mean and variance (first and second statistical moments) of flow to a well in a randomly heterogeneous medium. Geostatistical inverse interpretation of cross-hole tests yields a smoothed but detailed "tomographic" image of how parameters actually vary in three-dimensional space, together with corresponding measures of estimation uncertainty. Moment solutions may soon allow one to interpret well tests in terms of statistical parameters such as the mean and variance of log permeability, its spatial autocorrelation and statistical anisotropy. The idea of geostatistical cross-hole tomography is illustrated through pneumatic injection tests conducted in unsaturated fractured tuff at the Apache Leap Research Site near Superior, Arizona. The idea of using moment equations to interpret well-tests statistically is illustrated through a recently developed three-dimensional solution for steady state flow to a well in a bounded, randomly heterogeneous, statistically anisotropic aquifer.
Statistical distribution of solar soft X-ray bursts
Energy Technology Data Exchange (ETDEWEB)
Kaufmann, P; Piazza, L R; Schaal, R E [Universidade Mackenzie, Sao Paulo (Brazil). Centro de Radio-Astronomia e Astrofisica
1979-03-01
Nearly 1000 solar events with fluxes measured in the 0.5-3 Å, 1-8 Å and 8-20 Å bands by the Explorer 37 (US NRL Solrad) satellite are statistically analyzed. The differential distribution of peak fluxes can be represented by power laws with exponents -1.4, -2.2 and -2.9 respectively, which are compared to 2-12 Å results. For the 0.5-3 Å band there is a suggested peak in the distribution. Autocorrelation analyses of the distribution have shown that in the harder band (0.5-3 Å) there is a concentration of events at preferred values, multiples of about 10×10⁻⁵ erg cm⁻² s⁻¹, of unknown origin.
Statistical properties of Charney-Hasegawa-Mima zonal flows
International Nuclear Information System (INIS)
Anderson, Johan; Botha, G. J. J.
2015-01-01
A theoretical interpretation of numerically generated probability density functions (PDFs) of intermittent plasma transport events in unforced zonal flows is provided within the Charney-Hasegawa-Mima (CHM) model. The governing equation is solved numerically with various prescribed density gradients that are designed to produce different configurations of parallel and anti-parallel streams. Long-lasting vortices form whose flow is governed by the zonal streams. It is found that the numerically generated PDFs can be matched with analytical predictions of PDFs based on the instanton method by removing the autocorrelations from the time series. In many instances, the statistics generated by the CHM dynamics relax to Gaussian distributions for both the electrostatic and vorticity perturbations, whereas in areas with strong nonlinear interactions the PDFs are found to be exponentially distributed.
Statistical distribution of solar soft X-ray bursts
International Nuclear Information System (INIS)
Kaufmann, P.; Piazza, L.R.; Schaal, R.E.
1979-01-01
Nearly 1000 solar events with fluxes measured in the 0.5-3 Å, 1-8 Å and 8-20 Å bands by the Explorer 37 (US NRL Solrad) satellite are statistically analysed. The differential distribution of peak fluxes can be represented by power laws with exponents -1.4, -2.2 and -2.9 respectively, which are compared to 2-12 Å results. In the 0.5-3 Å band there is a suggested peak in the distribution. Autocorrelation analysis of the distribution has shown that in the harder band (0.5-3 Å) there is a concentration of events at preferred values, multiples of about 10×10⁻⁵ erg cm⁻² s⁻¹, of unknown origin [pt]
Savage, Leonard J
1972-01-01
Classic analysis of the foundations of statistics and development of personal probability, one of the greatest controversies in modern statistical thought. Revised edition. Calculus, probability, statistics, and Boolean algebra are recommended.
State Transportation Statistics 2010
2011-09-14
The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2010, a statistical profile of transportation in the 50 states and the District of Col...
State Transportation Statistics 2012
2013-08-15
The Bureau of Transportation Statistics (BTS), a part of the U.S. Department of Transportation's (USDOT) Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2012, a statistical profile of transportation ...
Adrenal Gland Tumors: Statistics
A primary adrenal gland tumor is very uncommon. Exact statistics are not available for this type of tumor ...
State transportation statistics 2009
2009-01-01
The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2009, a statistical profile of transportation in the 50 states and the District ...
State Transportation Statistics 2011
2012-08-08
The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2011, a statistical profile of transportation in the 50 states and the District of Col...
Neuroendocrine Tumor: Statistics
State Transportation Statistics 2013
2014-09-19
The Bureau of Transportation Statistics (BTS), a part of the U.S. Department of Transportation's (USDOT) Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2013, a statistical profile of transportatio...
BTS statistical standards manual
2005-10-01
The Bureau of Transportation Statistics (BTS), like other federal statistical agencies, establishes professional standards to guide the methods and procedures for the collection, processing, storage, and presentation of statistical data. Standards an...
Chakraborty, Saikat; Das, Subir K.
2017-09-01
Via Monte Carlo simulations we study pattern and aging during coarsening in a nonconserved nearest-neighbor Ising model, following quenches from infinite to zero temperature, in space dimension d = 3. The decay of the order-parameter autocorrelation function appears to obey a power-law behavior, as a function of the ratio between the observation and waiting times, in the large ratio limit. However, the exponent of the power law, estimated accurately via a state-of-the-art method, violates a well-known lower bound. This surprising fact has been discussed in connection with a quantitative picture of the structural anomaly that the 3D Ising model exhibits during coarsening at zero temperature. These results are compared with those for quenches to a temperature above that of the roughening transition.
Vogelsang, R.; Hoheisel, C.
1987-02-01
Molecular-dynamics (MD) calculations are reported for three thermodynamic states of a Lennard-Jones fluid. Systems of 2048 particles and 10⁵ integration steps were used. The transverse current autocorrelation function, Ct(k,t), has been determined for wave vectors in the range 0.5 … viscosities which showed a systematic behavior as a function of k. Extrapolation to the hydrodynamic region at k=0 gave shear viscosity coefficients in good agreement with direct Green-Kubo results obtained in previous work. The two-exponential model fit for the memory function proposed by other authors does not provide a reasonable description of the MD results, as the fit parameters show no systematic wave-vector dependence, although the Ct(k,t) functions are somewhat better fitted. Similarly, the semiempirical interpolation formula for the decay time based on the viscoelastic concept proposed by Akcasu and Daniels fails to reproduce the correct k dependence for the wavelength range investigated herein.
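The "direct Green-Kubo" route mentioned above integrates the stress autocorrelation function, η = V/(kT) ∫ ⟨P_xy(0)P_xy(t)⟩ dt. The sketch below applies that recipe to a synthetic stress time series with a known exponential autocorrelation (so the exact answer is σ²τ); the series, units, and V/kT = 1 are assumptions for illustration, not the paper's MD data.

```python
import numpy as np

def green_kubo_viscosity(pxy, dt, volume, kT, n_lags):
    """Green-Kubo estimate eta = V/(kT) * integral of <Pxy(0)Pxy(t)> dt,
    with the autocorrelation estimated from the time series and the
    integral done by the trapezoidal rule (illustrative sketch)."""
    x = np.asarray(pxy, dtype=float)
    n = len(x)
    acf = np.array([np.mean(x[: n - k] * x[k:]) for k in range(n_lags)])
    integral = dt * (acf.sum() - 0.5 * (acf[0] + acf[-1]))
    return volume / kT * integral

# Synthetic stress with ACF sigma^2 * exp(-t/tau): exact integral = sigma^2*tau = 5.
rng = np.random.default_rng(1)
tau, dt, sigma2 = 5.0, 0.1, 1.0
a = np.exp(-dt / tau)
noise = rng.standard_normal(200_000) * np.sqrt(sigma2 * (1 - a * a))
pxy = np.empty_like(noise)
pxy[0] = noise[0]
for i in range(1, len(pxy)):
    pxy[i] = a * pxy[i - 1] + noise[i]

eta = green_kubo_viscosity(pxy, dt, volume=1.0, kT=1.0, n_lags=500)
```

In a real MD analysis the integration window must be long enough for the autocorrelation to decay but short enough to keep the noisy tail from dominating the integral.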
Interaction of a quantum well with squeezed light: Quantum-statistical properties
International Nuclear Information System (INIS)
Sete, Eyob A.; Eleuch, H.
2010-01-01
We investigate the quantum statistical properties of the light emitted by a quantum well interacting with squeezed light from a degenerate subthreshold optical parametric oscillator. We obtain analytical solutions for the pertinent quantum Langevin equations in the strong-coupling and low-excitation regimes. Using these solutions we calculate the intensity spectrum, autocorrelation function, and quadrature squeezing for the fluorescent light. We show that the fluorescent light exhibits bunching and quadrature squeezing. We also show that the squeezed light leads to narrowing of the width of the spectrum of the fluorescent light.
Statistics in Schools. Educate your students about the value and everyday use of statistics. The Statistics in Schools program provides resources for teaching and learning with real-life data. Explore the site for standards-aligned, classroom-ready activities.
Transport Statistics - Transport - UNECE
Generalized quantum statistics
International Nuclear Information System (INIS)
Chou, C.
1992-01-01
In the paper, a non-anyonic generalization of quantum statistics is presented, in which Fermi-Dirac statistics (FDS) and Bose-Einstein statistics (BES) appear as two special cases. The new quantum statistics, which is characterized by the dimension of its single particle Fock space, contains three consistent parts, namely the generalized bilinear quantization, the generalized quantum mechanical description and the corresponding statistical mechanics
National Statistical Commission and Indian Official Statistics
Indian Academy of Sciences (India)
Author affiliations: T. J. Rao, C. R. Rao Advanced Institute of Mathematics, Statistics and Computer Science (AIMSCS), University of Hyderabad Campus, Central University Post Office, Prof. C. R. Rao Road, Hyderabad 500 046, AP, India.
Rumsey, Deborah
2011-01-01
The fun and easy way to get down to business with statistics. Stymied by statistics? No fear: this friendly guide offers clear, practical explanations of statistical ideas, techniques, formulas, and calculations, with lots of examples that show you how these concepts apply to your everyday life. Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tracks to a typical first semester statistics cou
Industrial statistics with Minitab
Cintas, Pere Grima; Llabres, Xavier Tort-Martorell
2012-01-01
Industrial Statistics with MINITAB demonstrates the use of MINITAB as a tool for performing statistical analysis in an industrial context. This book covers introductory industrial statistics, exploring the most commonly used techniques alongside those that serve to give an overview of more complex issues. A plethora of examples in MINITAB are featured along with case studies for each of the statistical techniques presented. Industrial Statistics with MINITAB: Provides comprehensive, user-friendly practical guidance on the essential statistical methods applied in industry. Explores
Recreational Boating Statistics 2012
Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...
Recreational Boating Statistics 2013
Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...
Statistical data analysis handbook
National Research Council Canada - National Science Library
Wall, Francis J
1986-01-01
It must be emphasized that this is not a textbook on statistics. Instead, it is a working tool that presents data analysis in clear, concise terms which can be readily understood even by those without formal training in statistics...
U.S. Department of Health & Human Services — The CMS Office of Enterprise Data and Analytics has developed CMS Program Statistics, which includes detailed summary statistics on national health care, Medicare...
Recreational Boating Statistics 2011
Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...
Uterine Cancer Statistics. Uterine cancer is the most commonly diagnosed gynecologic cancer. The U.S. Cancer Statistics Data Visualizations Tool makes ...
Tuberculosis Data and Statistics
National transportation statistics 2011
2011-04-01
Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety reco...
National Transportation Statistics 2008
2009-01-08
Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record...
School Violence: Data & Statistics
Caregiver Statistics: Demographics
Because caregiver needs and services are wide-ranging and complex, statistics may vary from study to study. Sources for ...
National Transportation Statistics 2009
2010-01-21
Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record, ...
National transportation statistics 2010
2010-01-01
National Transportation Statistics presents statistics on the U.S. transportation system, including its physical components, safety record, economic performance, the human and natural environment, and national security. This is a large online documen...
DEFF Research Database (Denmark)
Lindström, Erik; Madsen, Henrik; Nielsen, Jan Nygaard
Statistics for Finance develops students’ professional skills in statistics with applications in finance. Developed from the authors’ courses at the Technical University of Denmark and Lund University, the text bridges the gap between classical, rigorous treatments of financial mathematics...
Principles of applied statistics
National Research Council Canada - National Science Library
Cox, D. R; Donnelly, Christl A
2011-01-01
.... David Cox and Christl Donnelly distil decades of scientific experience into usable principles for the successful application of statistics, showing how good statistical strategy shapes every stage of an investigation...
Applying contemporary statistical techniques
Wilcox, Rand R
2003-01-01
Applying Contemporary Statistical Techniques explains why traditional statistical methods are often inadequate or outdated when applied to modern problems. Wilcox demonstrates how new and more powerful techniques address these problems far more effectively, making these modern robust methods understandable, practical, and easily accessible.* Assumes no previous training in statistics * Explains how and why modern statistical methods provide more accurate results than conventional methods* Covers the latest developments on multiple comparisons * Includes recent advanc
Mouser, P. J.
2010-12-01
In order to develop decision-making tools for the prediction and optimization of subsurface bioremediation strategies, we must be able to link the molecular-scale activity of microorganisms involved in remediation processes with biogeochemical processes observed at the field scale. This requires the ability to quantify changes in the in situ metabolic condition of dominant microbes and to associate these changes with fluctuations in nutrient levels throughout the bioremediation process. It also necessitates an understanding of the spatiotemporal variability of the molecular-scale information, in order to develop meaningful parameters and constraint ranges in complex bio-physio-chemical models. The expression of three Geobacter species genes (ammonium transporter (amtB), nitrogen fixation (nifD), and a housekeeping gene (recA)) was tracked at two monitoring locations that differed significantly in ammonium (NH4+) concentrations during a field-scale experiment where acetate was injected into the subsurface to stimulate Geobacteraceae in a uranium-contaminated aquifer. Analysis of amtB and nifD mRNA transcript levels indicated that NH4+ was the primary form of fixed nitrogen during bioremediation. Overall expression levels of amtB were on average 8-fold higher at NH4+ concentrations of 300 μM or more than at lower NH4+ levels (average 60 μM). The degree of temporal correlation in Geobacter species mRNA expression levels was calculated at both locations using autocorrelation methods that describe the relationship between sample semi-variance and time lag. At the monitoring location with lower NH4+, a temporal correlation lag of 8 days was observed for both amtB and nifD transcript patterns. At the location where higher NH4+ levels were observed, no discernable temporal correlation lag above the sampling frequency (approximately every 2 days) was observed for amtB or nifD transcript fluctuations. Autocorrelation trends in recA expression levels at both locations indicated that
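The semi-variance-versus-time-lag analysis mentioned above can be sketched as an empirical temporal semivariogram: for each lag h, average 0.5·(z_i − z_j)² over sample pairs separated by roughly h. The sketch below is illustrative (function names, tolerance, and the toy data are assumptions, not the study's expression measurements); the lag at which the semivariance levels off plays the role of the reported correlation lag.

```python
import numpy as np

def semivariogram(times, values, lags, tol=0.5):
    """Empirical temporal semivariance gamma(h): mean of 0.5*(z_i - z_j)^2
    over pairs whose time separation is within tol of lag h. Handles
    irregular sampling, as in field campaigns (illustrative sketch)."""
    t = np.asarray(list(times), dtype=float)
    z = np.asarray(list(values), dtype=float)
    gamma = []
    for h in lags:
        num, count = 0.0, 0
        for i in range(len(t)):
            for j in range(i + 1, len(t)):
                if abs(abs(t[j] - t[i]) - h) <= tol:
                    num += 0.5 * (z[i] - z[j]) ** 2
                    count += 1
        gamma.append(num / count if count else float("nan"))
    return np.array(gamma)

# Alternating toy "expression levels": maximal dissimilarity at lag 1,
# perfect similarity at lag 2.
gam = semivariogram(range(10), [0, 1] * 5, lags=[1, 2], tol=0.1)
```

With real transcript data one would scan a range of lags and read off where gamma(h) reaches its sill, subject to the sampling frequency floor noted in the abstract.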
Interactive statistics with ILLMO
Martens, J.B.O.S.
2014-01-01
Progress in empirical research relies on adequate statistical analysis and reporting. This article proposes an alternative approach to statistical modeling that is based on an old but mostly forgotten idea, namely Thurstone modeling. Traditional statistical methods assume that either the measured
Lenard, Christopher; McCarthy, Sally; Mills, Terence
2014-01-01
There are many different aspects of statistics. Statistics involves mathematics, computing, and applications to almost every field of endeavour. Each aspect provides an opportunity to spark someone's interest in the subject. In this paper we discuss some ethical aspects of statistics, and describe how an introduction to ethics has been…
Youth Sports Safety Statistics
... 6):794-799. 31 American Heart Association. CPR statistics. www.heart.org/HEARTORG/CPRAndECC/WhatisCPR/CPRFactsandStats/CPR%20Statistics_ ... Mental Health Services Administration, Center for Behavioral Health Statistics and Quality. (January 10, 2013). The DAWN Report: ...
Solar radiation data - statistical analysis and simulation models
Energy Technology Data Exchange (ETDEWEB)
Mustacchi, C; Cena, V; Rocchi, M; Haghigat, F
1984-01-01
The activities consisted of collecting meteorological data on magnetic tape for ten European locations (with latitudes ranging from 42° to 56° N), analysing the multi-year sequences, developing mathematical models to generate synthetic sequences having the same statistical properties as the original data sets, and producing one or more Short Reference Years (SRYs) for each location. The meteorological parameters examined were (for all the locations) global + diffuse radiation on a horizontal surface, dry bulb temperature, and sunshine duration. For some of the locations additional parameters were available, namely global, beam and diffuse radiation on surfaces other than horizontal, wet bulb temperature, wind velocity, cloud type, and cloud cover. The statistical properties investigated were the mean, variance, autocorrelation, cross-correlation with selected parameters, and probability density function. For all the meteorological parameters, various mathematical models were built: linear regression and stochastic models of the AR and DAR type. In each case, the model with the best statistical behaviour was selected for the production of an SRY for the relevant parameter/location.
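As a loose illustration of the stochastic-modelling step (not the report's actual AR/DAR fits), a first-order autoregressive model can be fitted to a measured series and then used to generate a synthetic sequence sharing its mean, variance, and lag-1 autocorrelation; all names and the stand-in data below are illustrative.

```python
import numpy as np

def fit_ar1(x):
    """Estimate mean, variance and lag-1 autocorrelation of a series."""
    mu, var = float(np.mean(x)), float(np.var(x))
    xc = x - mu
    rho1 = float(np.dot(xc[:-1], xc[1:]) / np.dot(xc, xc))
    return mu, var, rho1

def synthesize_ar1(mu, var, rho1, n, rng):
    """Generate a synthetic AR(1) sequence with the given statistics."""
    sigma_e = np.sqrt(var * (1.0 - rho1 ** 2))   # innovation std dev
    y = np.empty(n)
    y[0] = mu + np.sqrt(var) * rng.standard_normal()
    for t in range(1, n):
        y[t] = mu + rho1 * (y[t - 1] - mu) + sigma_e * rng.standard_normal()
    return y

# Stand-in for a measured meteorological series (AR(1), phi = 0.7, mean 20)
rng = np.random.default_rng(1)
obs = np.zeros(5000)
for t in range(1, 5000):
    obs[t] = 0.7 * obs[t - 1] + rng.standard_normal()
obs += 20.0

mu, var, rho1 = fit_ar1(obs)
syn = synthesize_ar1(mu, var, rho1, 5000, rng)
```

A synthetic sequence generated this way reproduces the fitted mean, variance, and autocorrelation up to sampling noise, which is the defining property of the synthetic sequences described in the abstract.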
Statistical evidence for common ancestry: Application to primates.
Baum, David A; Ané, Cécile; Larget, Bret; Solís-Lemus, Claudia; Ho, Lam Si Tung; Boone, Peggy; Drummond, Chloe P; Bontrager, Martin; Hunter, Steven J; Saucier, William
2016-06-01
Since Darwin, biologists have come to recognize that the theory of descent from common ancestry (CA) is very well supported by diverse lines of evidence. However, while the qualitative evidence is overwhelming, we also need formal methods for quantifying the evidential support for CA over the alternative hypothesis of separate ancestry (SA). In this article, we explore a diversity of statistical methods using data from the primates. We focus on two alternatives to CA, species SA (the separate origin of each named species) and family SA (the separate origin of each family). We implemented statistical tests based on morphological, molecular, and biogeographic data and developed two new methods: one that tests for phylogenetic autocorrelation while correcting for variation due to confounding ecological traits and a method for examining whether fossil taxa have fewer derived differences than living taxa. We overwhelmingly rejected both species and family SA with infinitesimal P values. We compare these results with those from two companion papers, which also found tremendously strong support for the CA of all primates, and discuss future directions and general philosophical issues that pertain to statistical testing of historical hypotheses such as CA. © 2016 The Author(s). Evolution © 2016 The Society for the Study of Evolution.
Simultaneous ocular and muscle artifact removal from EEG data by exploiting diverse statistics.
Chen, Xun; Liu, Aiping; Chen, Qiang; Liu, Yu; Zou, Liang; McKeown, Martin J
2017-09-01
Electroencephalography (EEG) recordings are frequently contaminated by both ocular and muscle artifacts. These are normally dealt with separately, by employing blind source separation (BSS) techniques relying on either second-order or higher-order statistics (SOS and HOS, respectively). When HOS-based methods are used, it is usually under the assumption that artifacts are statistically independent of the EEG. When SOS-based methods are used, it is assumed that artifacts have autocorrelation characteristics distinct from the EEG. In reality, ocular and muscle artifacts neither follow the assumption of strict temporal independence from the EEG nor have completely distinct autocorrelation characteristics, suggesting that exploiting HOS or SOS alone may be insufficient to remove these artifacts. Here we employ a novel BSS technique, independent vector analysis (IVA), to exploit HOS and SOS jointly in removing ocular and muscle artifacts. Numerical simulations and application to real EEG recordings were used to explore the utility of the IVA approach. IVA was superior in isolating both ocular and muscle artifacts, especially for raw EEG data with a low signal-to-noise ratio, and also integrated the usually separate SOS and HOS steps into a single unified step. Copyright © 2017 Elsevier Ltd. All rights reserved.
Dowdy, Shirley; Chilko, Daniel
2011-01-01
Praise for the Second Edition "Statistics for Research has other fine qualities besides superior organization. The examples and the statistical methods are laid out with unusual clarity by the simple device of using special formats for each. The book was written with great care and is extremely user-friendly."-The UMAP Journal Although the goals and procedures of statistical research have changed little since the Second Edition of Statistics for Research was published, the almost universal availability of personal computers and statistical computing application packages have made it possible f
Boslaugh, Sarah
2013-01-01
Need to learn statistics for your job? Want help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference for anyone new to the subject. Thoroughly revised and expanded, this edition helps you gain a solid understanding of statistics without the numbing complexity of many college texts. Each chapter presents easy-to-follow descriptions, along with graphics, formulas, solved examples, and hands-on exercises. If you want to perform common statistical analyses and learn a wide range of techniques without getting in over your head, this is your book.
Statistics & probability for dummies
Rumsey, Deborah J
2013-01-01
Two complete eBooks for one low price! Created and compiled by the publisher, this Statistics I and Statistics II bundle brings together two math titles in one, e-only bundle. With this special bundle, you'll get the complete text of the following two titles: Statistics For Dummies, 2nd Edition Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tra
Nonparametric statistical inference
Gibbons, Jean Dickinson
2010-01-01
Overall, this remains a very fine book suitable for a graduate-level course in nonparametric statistics. I recommend it for all people interested in learning the basic ideas of nonparametric statistical inference.-Eugenia Stoimenova, Journal of Applied Statistics, June 2012… one of the best books available for a graduate (or advanced undergraduate) text for a theory course on nonparametric statistics. … a very well-written and organized book on nonparametric statistics, especially useful and recommended for teachers and graduate students.-Biometrics, 67, September 2011This excellently presente
Business statistics for dummies
Anderson, Alan
2013-01-01
Score higher in your business statistics course? Easy. Business statistics is a common course for business majors and MBA candidates. It examines common data sets and the proper way to use such information when conducting research and producing informational reports such as profit and loss statements, customer satisfaction surveys, and peer comparisons. Business Statistics For Dummies tracks to a typical business statistics course offered at the undergraduate and graduate levels and provides clear, practical explanations of business statistical ideas, techniques, formulas, and calculations, w
Griffiths, Dawn
2009-01-01
Wouldn't it be great if there were a statistics book that made histograms, probability distributions, and chi square analysis more enjoyable than going to the dentist? Head First Statistics brings this typically dry subject to life, teaching you everything you want and need to know about statistics through engaging, interactive, and thought-provoking material, full of puzzles, stories, quizzes, visual aids, and real-world examples. Whether you're a student, a professional, or just curious about statistical analysis, Head First's brain-friendly formula helps you get a firm grasp of statistics
Lectures on algebraic statistics
Drton, Mathias; Sullivant, Seth
2009-01-01
How does an algebraic geometer studying secant varieties further the understanding of hypothesis tests in statistics? Why would a statistician working on factor analysis raise open problems about determinantal varieties? Connections of this type are at the heart of the new field of "algebraic statistics". In this field, mathematicians and statisticians come together to solve statistical inference problems using concepts from algebraic geometry as well as related computational and combinatorial techniques. The goal of these lectures is to introduce newcomers from the different camps to algebraic statistics. The introduction will be centered around the following three observations: many important statistical models correspond to algebraic or semi-algebraic sets of parameters; the geometry of these parameter spaces determines the behaviour of widely used statistical inference procedures; computational algebraic geometry can be used to study parameter spaces and other features of statistical models.
Naghshpour, Shahdad
2012-01-01
Statistics is the branch of mathematics that deals with real-life problems. As such, it is an essential tool for economists. Unfortunately, the way you and many other economists learn the concept of statistics is not compatible with the way economists think and learn. The problem is worsened by the use of mathematical jargon and complex derivations. Here's a book that proves none of this is necessary. All the examples and exercises in this book are constructed within the field of economics, thus eliminating the difficulty of learning statistics with examples from fields that have no relation to business, politics, or policy. Statistics is, in fact, not more difficult than economics. Anyone who can comprehend economics can understand and use statistics successfully within this field, including you! This book utilizes Microsoft Excel to obtain statistical results, as well as to perform additional necessary computations. Microsoft Excel is not the software of choice for performing sophisticated statistical analy...
Baseline Statistics of Linked Statistical Data
Scharnhorst, Andrea; Meroño-Peñuela, Albert; Guéret, Christophe
2014-01-01
We are surrounded by an ever increasing ocean of information, everybody will agree to that. We build sophisticated strategies to govern this information: design data models, develop infrastructures for data sharing, building tool for data analysis. Statistical datasets curated by National
International Nuclear Information System (INIS)
Behringer, K.
2001-08-01
A novel auto-correlation function (ACF) method has been investigated for determining the oscillation frequency and the decay ratio in BWR stability analyses. The report not only describes the method but also comprehensively documents the FORTRAN codes used and developed. The neutron signals are band-pass filtered to separate the oscillation peak in the power spectral density (PSD) from the background. Two linear second-order oscillation models are considered. The ACF of each model, corrected for signal filtering and with the inclusion of a background term under the peak in the PSD, is then least-squares fitted to the ACF estimated from the previously filtered neutron signals, in order to determine the oscillation frequency and the decay ratio. The procedures of filtering and ACF estimation use fast Fourier transform techniques with signal segmentation. Gliding 'short-time' ACF estimates along a signal record allow the evaluation of uncertainties. Some numerical results are given which have been obtained from neutron signal data offered by the recent Forsmark I and Forsmark II NEA benchmark project. They are compared with those from other benchmark participants using other analysis methods. (author)
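The FFT-based ACF estimation can be illustrated with a short sketch. This is a simplified stand-in, not the report's fitted-model FORTRAN procedure: the decay ratio is taken here as the amplitude ratio of the second to the first positive ACF peak, and the peak search is an assumption of this sketch.

```python
import numpy as np

def acf_fft(x, max_lag):
    """ACF estimate via FFT (Wiener-Khinchin), zero-padded to avoid circular
    wrap-around and normalised so that acf[0] = 1."""
    x = np.asarray(x, float) - np.mean(x)
    psd = np.abs(np.fft.rfft(x, 2 * len(x))) ** 2
    acf = np.fft.irfft(psd)[: max_lag + 1]
    return acf / acf[0]

def positive_peak_after(r, start):
    """Index of the maximum of r over the first positive stretch at/after start."""
    i = start
    while i < len(r) and r[i] <= 0:   # skip the negative stretch
        i += 1
    j = i
    while j < len(r) and r[j] > 0:    # extent of the positive stretch
        j += 1
    return i + int(np.argmax(r[i:j]))

def frequency_and_decay_ratio(x, dt, max_lag):
    """Oscillation frequency from the first ACF maximum; decay ratio as the
    ratio of the second to the first positive ACF peak."""
    r = acf_fft(x, max_lag)
    k1 = positive_peak_after(r, int(np.nonzero(r < 0)[0][0]))
    k2 = positive_peak_after(r, k1 + int(np.nonzero(r[k1:] < 0)[0][0]))
    return 1.0 / (k1 * dt), r[k2] / r[k1]

# AR(2) test signal: a damped 0.5 Hz oscillation sampled at 25 Hz
dt, rpole, theta = 0.04, 0.99, 2.0 * np.pi * 0.5 * 0.04
a1, a2 = 2.0 * rpole * np.cos(theta), -rpole ** 2
rng = np.random.default_rng(2)
e = rng.standard_normal(20000)
x = np.zeros(20000)
for t in range(2, 20000):
    x[t] = a1 * x[t - 1] + a2 * x[t - 2] + e[t]

freq, dr = frequency_and_decay_ratio(x, dt, 200)
```

For this AR(2) process the ACF is a damped cosine, so the recovered frequency is close to 0.5 Hz and the peak ratio is close to the pole radius raised to one oscillation period.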
Singh, Jaskaran; Darpe, A. K.; Singh, S. P.
2018-02-01
Local damage in rolling element bearings usually generates periodic impulses in vibration signals. The severity, repetition frequency, and fault-excited resonance zone of these impulses are the key indicators for diagnosing bearing faults. In this paper, a methodology based on the overcomplete rational dilation wavelet transform (ORDWT) is proposed, as it enjoys good shift invariance. ORDWT offers flexibility in partitioning the frequency spectrum to generate a number of subbands (filters) with diverse bandwidths. The selection of the optimal filter that perfectly overlaps with the bearing-fault-excited resonance zone is based on the maximization of a proposed impulse detection measure, "temporal energy operated auto correlated kurtosis". The proposed indicator is robust and consistent in evaluating the impulsiveness of fault signals in the presence of interfering vibration such as heavy background noise or sporadic shocks unrelated to the fault or normal operation. The structure of the proposed indicator enables it to be sensitive to fault severity. For enhanced fault classification, an autocorrelation of the energy time series of the signal filtered through the optimal subband is proposed. The application of the proposed methodology is validated on simulated and experimental data. The study shows that the performance of the proposed technique is more robust and consistent in comparison to the original fast kurtogram and wavelet kurtogram.
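The final classification step above (autocorrelation of the energy time series of the filtered signal) can be sketched in isolation; the synthetic fault signal and impulse period below are illustrative assumptions, not the paper's data or full ORDWT pipeline.

```python
import numpy as np

def energy_autocorrelation(x, max_lag):
    """Normalised autocorrelation of the energy (squared-amplitude) time
    series; for periodic fault impulses it peaks at the impulse period
    (and its multiples)."""
    e = np.asarray(x, float) ** 2
    e = e - e.mean()
    denom = float(np.dot(e, e))
    return np.array([np.dot(e[: len(e) - k], e[k:]) / denom
                     for k in range(1, max_lag + 1)])

# Synthetic fault signal: one impulse every 100 samples, buried in noise
rng = np.random.default_rng(8)
x = 0.3 * rng.standard_normal(10000)
x[::100] += 4.0

r = energy_autocorrelation(x, 300)
period = int(np.argmax(r)) + 1   # the impulse period (here 100) or a multiple
```

Because the energy autocorrelation ignores the carrier and keeps only the impulse repetition structure, its dominant peak recovers the fault repetition period even when the raw signal is noisy.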
Energy Technology Data Exchange (ETDEWEB)
Yamamoto, H; Iwamoto, K; Saito, T; Yoshida, A [Iwate University, Iwate (Japan). Faculty of Engineering
1997-05-27
Methods to derive underground structures by utilizing the dispersion phenomenon of surface waves contained in microtremors include the frequency-wave number analysis method (the F-K method) and the spatial autocorrelation method (SAC method). The SAC method is said to be capable of estimating structures to greater depths than the F-K method if the same seismometers are used. However, the F-K method is used more frequently. This is because the SAC method imposes a strict restriction that seismometers must be arranged evenly on the same circumference, while the F-K method allows seismometers to be arranged arbitrarily during an observation. Therefore, the present study has discussed whether the SAC method can be applied to observations with the seismometers arranged in the same way as in the F-K method, by using microtremor data acquired from actual observations. It was made clear that the SAC method can be applied with as few as three seismometers arranged on the same circumference. These seismometers do not have to be arranged evenly, but because the acquired phase velocities may vary according to wave arrival directions and seismometer arrangement, it is desirable to perform observations with seismometers arranged as evenly as possible. 13 figs.
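The finding that three unevenly spaced ring stations can still yield the azimuth-averaged coefficient, provided waves arrive from many directions, can be checked numerically: for plane surface waves, the SPAC coefficient between a centre station and a ring of radius r tends to the Bessel function J0(2πfr/c). The station layout and numbers below are illustrative assumptions, not the authors' field setup.

```python
import numpy as np

def bessel_j0(x, n=20000):
    """J0(x) via its integral form: (1/pi) * integral of cos(x sin t), t in [0, pi]."""
    t = np.linspace(0.0, np.pi, n)
    return float(np.mean(np.cos(x * np.sin(t))))

def spac_coefficient(freq, radius, velocity, wave_az, station_az):
    """Correlation between the centre station and ring stations for plane waves,
    averaged over wave directions and the (possibly uneven) station azimuths.
    A single wave from azimuth w contributes cos(k * r * cos(w - s)) at station s."""
    k = 2.0 * np.pi * freq / velocity
    vals = [np.cos(k * radius * np.cos(w - s)) for w in wave_az for s in station_az]
    return float(np.mean(vals))

rng = np.random.default_rng(3)
wave_az = rng.uniform(0.0, 2.0 * np.pi, 500)   # many incoming wave directions
stations = np.array([0.0, 1.9, 3.5])           # three uneven ring azimuths (rad)
freq, radius, velocity = 4.0, 30.0, 400.0      # Hz, m, m/s (illustrative)
rho = spac_coefficient(freq, radius, velocity, wave_az, stations)
```

Even with only three unevenly placed stations, averaging over many wave directions drives the coefficient toward J0(2πfr/c), which is why the uneven arrangement works in practice; with waves from few directions the average would depend on the station azimuths, matching the abstract's caveat.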
Isakozawa, Shigeto; Fuse, Taishi; Amano, Junpei; Baba, Norio
2018-04-01
As alternatives to the diffractogram-based method in high-resolution transmission electron microscopy, a spot auto-focusing (AF) method and a spot auto-stigmation (AS) method are presented with a unique high-definition auto-correlation function (HD-ACF). The HD-ACF clearly resolves the ACF central peak region in small amorphous-thin-film images, reflecting the phase contrast transfer function. At a 300-k magnification for a 120-kV transmission electron microscope, the smallest areas used are 64 × 64 pixels (~3 nm²) for the AF and 256 × 256 pixels for the AS. A useful advantage of these methods is that the AF function has an allowable accuracy even for a low s/n (~1.0) image. To use these methods, a reference database on the defocus dependency of the HD-ACF must be prepared beforehand by acquiring through-focus amorphous-thin-film images. This can be very beneficial because the specimens are not limited to approximations of weak phase objects but can be extended to objects outside such approximations.
Energy Technology Data Exchange (ETDEWEB)
Torrecilla, Jose S., E-mail: jstorre@quim.ucm.es [Department of Chemical Engineering, Faculty of Chemistry, University Complutense of Madrid, 28040 Madrid (Spain); Garcia, Julian; Garcia, Silvia; Rodriguez, Francisco [Department of Chemical Engineering, Faculty of Chemistry, University Complutense of Madrid, 28040 Madrid (Spain)
2011-03-04
The combination of lag-k autocorrelation coefficients (LCCs) and thermogravimetric analyzer (TGA) equipment is defined here as a tool to detect and quantify adulterations of extra virgin olive oil (EVOO) with refined olive (ROO), refined olive pomace (ROPO), sunflower (SO) or corn (CO) oils, when the adulterating agents concentration are less than 14%. The LCC is calculated from TGA scans of adulterated EVOO samples. Then, the standardized skewness of this coefficient has been applied to classify pure and adulterated samples of EVOO. In addition, this chaotic parameter has also been used to quantify the concentration of adulterant agents, by using successful linear correlation of LCCs and ROO, ROPO, SO or CO in 462 EVOO adulterated samples. In the case of detection, more than 82% of adulterated samples have been correctly classified. In the case of quantification of adulterant concentration, by an external validation process, the LCC/TGA approach estimates the adulterant agents concentration with a mean correlation coefficient (estimated versus real adulterant agent concentration) greater than 0.90 and a mean square error less than 4.9%.
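A minimal sketch of the two statistics named above, the lag-k autocorrelation coefficient (LCC) of a scan and a standardized skewness used as a classification score, follows. The simulated "scans" and the interpretation thresholds are illustrative assumptions, not the authors' TGA data or calibration.

```python
import numpy as np

def lag_k_autocorr(x, k):
    """Lag-k autocorrelation coefficient (LCC) of a 1-D scan."""
    xc = np.asarray(x, float) - np.mean(x)
    return float(np.dot(xc[:-k], xc[k:]) / np.dot(xc, xc))

def standardized_skewness(values):
    """Sample skewness divided by its approximate standard error sqrt(6/n);
    magnitudes beyond roughly 2 flag a markedly non-normal spread."""
    v = np.asarray(values, float)
    z = (v - v.mean()) / v.std()
    return float(np.mean(z ** 3) / np.sqrt(6.0 / len(v)))

# Illustrative "scans": a smooth weight-loss curve vs. the same curve with noise
t = np.linspace(0.0, 1.0, 400)
smooth = 1.0 / (1.0 + np.exp(10.0 * (t - 0.5)))   # sigmoid-like TGA trace
rng = np.random.default_rng(4)
noisy = smooth + 0.2 * rng.standard_normal(t.size)

lcc_smooth = lag_k_autocorr(smooth, 1)
lcc_noisy = lag_k_autocorr(noisy, 1)
```

The LCC summarises how self-similar a scan is at a given lag, so systematic changes in scan shape (such as those an adulterant introduces in a TGA curve) shift the coefficient, and the distribution of LCCs across samples can then be screened with the standardized skewness.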
Statistical Physics An Introduction
Yoshioka, Daijiro
2007-01-01
This book provides a comprehensive presentation of the basics of statistical physics. The first part explains the essence of statistical physics and how it provides a bridge between microscopic and macroscopic phenomena, allowing one to derive quantities such as entropy. Here the author avoids going into details such as Liouville’s theorem or the ergodic theorem, which are difficult for beginners and unnecessary for the actual application of the statistical mechanics. In the second part, statistical mechanics is applied to various systems which, although they look different, share the same mathematical structure. In this way readers can deepen their understanding of statistical physics. The book also features applications to quantum dynamics, thermodynamics, the Ising model and the statistical dynamics of free spins.
Statistical symmetries in physics
International Nuclear Information System (INIS)
Green, H.S.; Adelaide Univ., SA
1994-01-01
Every law of physics is invariant under some group of transformations and is therefore the expression of some type of symmetry. Symmetries are classified as geometrical, dynamical or statistical. At the most fundamental level, statistical symmetries are expressed in the field theories of the elementary particles. This paper traces some of the developments from the discovery of Bose statistics, one of the two fundamental symmetries of physics. A series of generalizations of Bose statistics is described. A supersymmetric generalization accommodates fermions as well as bosons, and further generalizations, including parastatistics, modular statistics and graded statistics, accommodate particles with properties such as 'colour'. A factorization of elements of gl(n_b, n_f) can be used to define truncated boson operators. A general construction is given for q-deformed boson operators, and explicit constructions of the same type are given for various 'deformed' algebras. A summary is given of some of the applications and potential applications. 39 refs., 2 figs
The statistical stability phenomenon
Gorban, Igor I
2017-01-01
This monograph investigates violations of statistical stability of physical events, variables, and processes and develops a new physical-mathematical theory taking into consideration such violations – the theory of hyper-random phenomena. There are five parts. The first describes the phenomenon of statistical stability and its features, and develops methods for detecting violations of statistical stability, in particular when data is limited. The second part presents several examples of real processes of different physical nature and demonstrates the violation of statistical stability over broad observation intervals. The third part outlines the mathematical foundations of the theory of hyper-random phenomena, while the fourth develops the foundations of the mathematical analysis of divergent and many-valued functions. The fifth part contains theoretical and experimental studies of statistical laws where there is violation of statistical stability. The monograph should be of particular interest to engineers...
Equilibrium statistical mechanics
Jackson, E Atlee
2000-01-01
Ideal as an elementary introduction to equilibrium statistical mechanics, this volume covers both classical and quantum methodology for open and closed systems. Introductory chapters familiarize readers with probability and microscopic models of systems, while additional chapters describe the general derivation of the fundamental statistical mechanics relationships. The final chapter contains 16 sections, each dealing with a different application, ordered according to complexity, from classical through degenerate quantum statistical mechanics. Key features include an elementary introduction t
Applied statistics for economists
Lewis, Margaret
2012-01-01
This book is an undergraduate text that introduces students to commonly-used statistical methods in economics. Using examples based on contemporary economic issues and readily-available data, it not only explains the mechanics of the various methods, it also guides students to connect statistical results to detailed economic interpretations. Because the goal is for students to be able to apply the statistical methods presented, online sources for economic data and directions for performing each task in Excel are also included.
Mineral industry statistics 1975
Energy Technology Data Exchange (ETDEWEB)
1978-01-01
Production, consumption and marketing statistics are given for solid fuels (coal, peat), liquid fuels and gases (oil, natural gas), iron ore, bauxite, and other minerals quarried in France in 1975. Accident statistics are also included. Production statistics are presented for the Overseas Departments and territories (French Guiana, New Caledonia, New Hebrides). An account of modifications in the mining field in 1975 is given. Concessions, exploitation permits, and permits solely for prospecting for mineral products are discussed. (In French)
Lectures on statistical mechanics
Bowler, M G
1982-01-01
Anyone dissatisfied with the almost ritual dullness of many 'standard' texts in statistical mechanics will be grateful for the lucid explanation and generally reassuring tone. Aimed at securing firm foundations for equilibrium statistical mechanics, topics of great subtlety are presented transparently and enthusiastically. Very little mathematical preparation is required beyond elementary calculus and prerequisites in physics are limited to some elementary classical thermodynamics. Suitable as a basis for a first course in statistical mechanics, the book is an ideal supplement to more convent
Directory of Open Access Journals (Sweden)
Mirjam Nielen
2017-01-01
Always wondered why research papers often present rather complicated statistical analyses? Or wondered how to properly analyse the results of a pragmatic trial from your own practice? This talk will give an overview of basic statistical principles and focus on the why of statistics, rather than on the how. This is a podcast of Mirjam's talk at the Veterinary Evidence Today conference, Edinburgh, November 2, 2016.
Equilibrium statistical mechanics
Mayer, J E
1968-01-01
The International Encyclopedia of Physical Chemistry and Chemical Physics, Volume 1: Equilibrium Statistical Mechanics covers the fundamental principles and the development of theoretical aspects of equilibrium statistical mechanics. Statistical mechanics is the study of the connection between the macroscopic behavior of bulk matter and the microscopic properties of its constituent atoms and molecules. This book contains eight chapters, and begins with a presentation of the master equation used for the calculation of the fundamental thermodynamic functions. The succeeding chapters highlight t
Mahalanobis, P C
1965-01-01
Contributions to Statistics focuses on the processes, methodologies, and approaches involved in statistics. The book is presented to Professor P. C. Mahalanobis on the occasion of his 70th birthday. The selection first offers information on the recovery of ancillary information and combinatorial properties of partially balanced designs and association schemes. Discussions focus on combinatorial applications of the algebra of association matrices, sample size analogy, association matrices and the algebra of association schemes, and conceptual statistical experiments. The book then examines latt
Boslaugh, Sarah
2008-01-01
Need to learn statistics as part of your job, or want some help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference that's perfect for anyone with no previous background in the subject. This book gives you a solid understanding of statistics without being too simple, yet without the numbing complexity of most college texts. You get a firm grasp of the fundamentals and a hands-on understanding of how to apply them before moving on to the more advanced material that follows. Each chapter presents you with easy-to-follow descriptions illustrat
Understanding Computational Bayesian Statistics
Bolstad, William M
2011-01-01
A hands-on introduction to computational statistics from a Bayesian point of view Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistic
Annual Statistical Supplement, 2002
Social Security Administration — The Annual Statistical Supplement, 2002 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2010
Social Security Administration — The Annual Statistical Supplement, 2010 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2007
Social Security Administration — The Annual Statistical Supplement, 2007 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2001
Social Security Administration — The Annual Statistical Supplement, 2001 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2016
Social Security Administration — The Annual Statistical Supplement, 2016 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2011
Social Security Administration — The Annual Statistical Supplement, 2011 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2005
Social Security Administration — The Annual Statistical Supplement, 2005 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2015
Social Security Administration — The Annual Statistical Supplement, 2015 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2003
Social Security Administration — The Annual Statistical Supplement, 2003 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2017
Social Security Administration — The Annual Statistical Supplement, 2017 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2008
Social Security Administration — The Annual Statistical Supplement, 2008 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2014
Social Security Administration — The Annual Statistical Supplement, 2014 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2004
Social Security Administration — The Annual Statistical Supplement, 2004 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2000
Social Security Administration — The Annual Statistical Supplement, 2000 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2009
Social Security Administration — The Annual Statistical Supplement, 2009 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2006
Social Security Administration — The Annual Statistical Supplement, 2006 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Bulmer, M G
1979-01-01
There are many textbooks which describe current methods of statistical analysis, while neglecting related theory. There are equally many advanced textbooks which delve into the far reaches of statistical theory, while bypassing practical applications. But between these two approaches is an unfilled gap, in which theory and practice merge at an intermediate level. Professor M. G. Bulmer's Principles of Statistics, originally published in 1965, was created to fill that need. The new, corrected Dover edition of Principles of Statistics makes this invaluable mid-level text available once again.
Kanji, Gopal K
2006-01-01
This expanded and updated Third Edition of Gopal K. Kanji's best-selling resource on statistical tests covers all the most commonly used tests with information on how to calculate and interpret results with simple datasets. Each entry begins with a short summary statement about the test's purpose, and contains details of the test objective, the limitations (or assumptions) involved, a brief outline of the method, a worked example, and the numerical calculation. 100 Statistical Tests, Third Edition is the one indispensable guide for users of statistical materials and consumers of statistical information at all levels and across all disciplines.
Statistical distribution sampling
Johnson, E. S.
1975-01-01
Determining the distribution of statistics by sampling was investigated. Characteristic functions, the quadratic regression problem, and the differential equations for the characteristic functions are analyzed.
Indian Academy of Sciences (India)
IAS Admin
Keywords: Pauli exclusion principle, Fermi–Dirac statistics, identical and indistinguishable particles, Fermi gas. Fermi–Dirac Statistics: Derivation and Consequences. S Chaturvedi and Shyamal Biswas. Subhash Chaturvedi is at the University of Hyderabad; his current research interests include phase space descriptions.
Generalized interpolative quantum statistics
International Nuclear Information System (INIS)
Ramanathan, R.
1992-01-01
A generalized interpolative quantum statistics is presented by conjecturing a certain reordering of phase space due to the presence of possible exotic objects other than bosons and fermions. Such an interpolation achieved through a Bose-counting strategy predicts the existence of an infinite quantum Boltzmann-Gibbs statistics akin to the one discovered by Greenberg recently
Handbook of Spatial Statistics
Gelfand, Alan E
2010-01-01
Offers an introduction detailing the evolution of the field of spatial statistics. This title focuses on the three main branches of spatial statistics: continuous spatial variation (point referenced data); discrete spatial variation, including lattice and areal unit data; and, spatial point patterns.
Statistics 101 for Radiologists.
Anvari, Arash; Halpern, Elkan F; Samir, Anthony E
2015-10-01
Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
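The diagnostic-test measures this review describes (sensitivity, specificity, likelihood ratios) can be computed directly from a 2x2 table. A minimal numeric sketch follows; all counts are invented for illustration:

```python
# 2x2 diagnostic table: hypothetical counts, for illustration only
tp, fn = 90, 10    # diseased patients: test positive / test negative
fp, tn = 20, 80    # healthy patients: test positive / test negative

sensitivity = tp / (tp + fn)               # P(test+ | disease)
specificity = tn / (tn + fp)               # P(test- | no disease)
accuracy = (tp + tn) / (tp + fn + fp + tn)
lr_pos = sensitivity / (1 - specificity)   # positive likelihood ratio
lr_neg = (1 - sensitivity) / specificity   # negative likelihood ratio

print(sensitivity, specificity, round(lr_pos, 2), round(lr_neg, 3))
```

With these counts, sensitivity is 0.9, specificity 0.8, and the likelihood ratios come out near 4.5 and 0.125; in practice the counts come from a study's confusion table rather than being chosen by hand.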
International Nuclear Information System (INIS)
2003-01-01
The energy statistical table is a selection of statistical data for energies and countries from 1997 to 2002. It covers petroleum, natural gas, coal and electric power: production, external trade, consumption per sector, the 2002 energy accounts, and graphs of long-term forecasts. (A.L.B.)
Bayesian statistical inference
Directory of Open Access Journals (Sweden)
Bruno De Finetti
2017-04-01
Full Text Available This work was translated into English and published in the volume: Bruno De Finetti, Induction and Probability, Biblioteca di Statistica, eds. P. Monari, D. Cocchi, Clueb, Bologna, 1993.Bayesian statistical Inference is one of the last fundamental philosophical papers in which we can find the essential De Finetti's approach to the statistical inference.
Practical statistics for educators
Ravid, Ruth
2014-01-01
Practical Statistics for Educators, Fifth Edition, is a clear and easy-to-follow text written specifically for education students in introductory statistics courses and in action research courses. It is also a valuable resource and guidebook for educational practitioners who wish to study their own settings.
DEFF Research Database (Denmark)
Lauritzen, Steffen Lilholt
This book studies the brilliant Danish 19th-century astronomer T.N. Thiele, who made important contributions to statistics, actuarial science, astronomy and mathematics. The most important of these contributions in statistics are translated into English for the first time, and the text includes...
Huizingh, Eelko K. R. E.
2007-01-01
Accessibly written and easy to use, "Applied Statistics Using SPSS" is an all-in-one self-study guide to SPSS and do-it-yourself guide to statistics. What is unique about Eelko Huizingh's approach is that this book is based around the needs of undergraduate students embarking on their own research project, and its self-help style is designed to…
This tool allows users to animate cancer trends over time by cancer site and cause of death, race, and sex. Provides access to incidence, mortality, and survival. Select the type of statistic, variables, format, and then extract the statistics in a delimited format for further analyses.
Energy statistics yearbook 2002
International Nuclear Information System (INIS)
2005-01-01
The Energy Statistics Yearbook 2002 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-sixth in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities
Howard Stauffer; Nadav Nur
2005-01-01
The papers included in the Advances in Statistics section of the Partners in Flight (PIF) 2002 Proceedings represent a small sample of statistical topics of current importance to Partners In Flight research scientists: hierarchical modeling, estimation of detection probabilities, and Bayesian applications. Sauer et al. (this volume) examines a hierarchical model...
Energy statistics yearbook 2001
International Nuclear Information System (INIS)
2004-01-01
The Energy Statistics Yearbook 2001 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-fifth in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities
Energy statistics yearbook 2000
International Nuclear Information System (INIS)
2002-01-01
The Energy Statistics Yearbook 2000 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-third in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities
Temperature dependent anomalous statistics
International Nuclear Information System (INIS)
Das, A.; Panda, S.
1991-07-01
We show that the anomalous statistics which arises in 2 + 1 dimensional Chern-Simons gauge theories can become temperature dependent in the most natural way. We analyze and show that a statistics-changing phase transition can happen in these theories only as T → ∞. (author). 14 refs
Introduction to Bayesian statistics
Bolstad, William M
2017-01-01
There is a strong upsurge in the use of Bayesian methods in applied statistical analysis, yet most introductory statistics texts only present frequentist methods. Bayesian statistics has many important advantages that students should learn about if they are going into fields where statistics will be used. In this Third Edition, four newly-added chapters address topics that reflect the rapid advances in the field of Bayesian statistics. The author continues to provide a Bayesian treatment of introductory statistical topics, such as scientific data gathering, discrete random variables, robust Bayesian methods, and Bayesian approaches to inference for discrete random variables, binomial proportion, Poisson, normal mean, and simple linear regression. In addition, newly-developing topics in the field are presented in four new chapters: Bayesian inference with unknown mean and variance; Bayesian inference for a Multivariate Normal mean vector; Bayesian inference for the Multiple Linear Regression Model; and Computati...
Understanding advanced statistical methods
Westfall, Peter
2013-01-01
Introduction: Probability, Statistics, and Science; Reality, Nature, Science, and Models; Statistical Processes: Nature, Design and Measurement, and Data; Models; Deterministic Models; Variability; Parameters; Purely Probabilistic Statistical Models; Statistical Models with Both Deterministic and Probabilistic Components; Statistical Inference; Good and Bad Models; Uses of Probability Models. Random Variables and Their Probability Distributions: Introduction; Types of Random Variables: Nominal, Ordinal, and Continuous; Discrete Probability Distribution Functions; Continuous Probability Distribution Functions; Some Calculus: Derivatives and Least Squares; More Calculus: Integrals and Cumulative Distribution Functions. Probability Calculation and Simulation: Introduction; Analytic Calculations, Discrete and Continuous Cases; Simulation-Based Approximation; Generating Random Numbers. Identifying Distributions: Introduction; Identifying Distributions from Theory Alone; Using Data: Estimating Distributions via the Histogram; Quantiles: Theoretical and Data-Based Estimates...
Feld, R.; Slob, E. C.; Thorbecke, J.
2015-12-01
Creating virtual sources at locations where physical receivers have measured a response is known as seismic interferometry. A much appreciated benefit of interferometry is its independence of the actual source locations. The use of ambient noise as the actual source is therefore not uncommon in this field. Ambient noise can be commercial noise, for example mobile phone signals. For GPR this can be useful in cases where it is not possible to place a source, for instance when it is prohibited by laws and regulations. A mono-static GPR antenna can measure ambient noise. Interferometry by auto-correlation (AC) places a virtual source at this antenna's position, without actually transmitting anything. This can be used for pavement damage inspection. Earlier work showed very promising results with 2D numerical models of damaged pavement. 1D and 2D heterogeneities were compared, both modelled in a 2D pavement world. In a 1D heterogeneous model energy leaks away to the sides, whereas in a 2D heterogeneous model rays can reflect and therefore still add to the signal reconstruction (see illustration). In the first case the number of stationary points is strictly limited, while in the other case the number of stationary points is very large. We extend these models to a 3D world and optimise an experimental configuration. The illustration originates from the journal article under submission 'Non-destructive pavement damage inspection by mono-static GPR without transmitting anything' by R. Feld, E.C. Slob, and J.W. Thorbecke. (a) 2D heterogeneous pavement model with three irregular-shaped misalignments between the base and subbase layer (marked by arrows). Mono-antenna B-scan positions are shown schematically. (b) Ideal output: a real source at the receiver's position. The difference w.r.t. the trace found in the middle is shown. (c) AC output: a virtual source at the receiver's position. There is a clear overlap with the ideal output.
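The auto-correlation principle behind this abstract can be sketched with a toy 1D model: a white-noise source filtered by an impulse response with a single reflector. Auto-correlating the recorded trace recovers the reflector's two-way traveltime as a virtual-source response. All values (sample count, reflection coefficient, lag) are invented, and this is far simpler than the 2D/3D pavement models the authors use:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
noise = rng.standard_normal(n)   # ambient-noise source signature (unknown in practice)

# Toy medium response: direct arrival plus one reflector at 50 samples two-way time
impulse = np.zeros(200)
impulse[0] = 1.0
impulse[50] = 0.4                # reflection coefficient, assumed

# What the passive mono-static antenna records: noise filtered by the medium
trace = np.convolve(noise, impulse)[:n]

# Interferometry by auto-correlation: the zero-offset reflection response
# emerges at positive lags, as if a source had fired at the receiver.
ac = np.correlate(trace, trace, mode="full")[n - 1:]
ac /= ac[0]                      # normalise the zero-lag peak

lag = int(np.argmax(ac[10:150])) + 10   # strongest non-trivial peak
print(lag)                       # reflector lag, expected near 50 samples
```

The noise signature itself never needs to be known: its auto-correlation is approximately a delta function, so only the medium's reflectivity survives in the correlation.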
Directory of Open Access Journals (Sweden)
E. D. Beaton
2017-09-01
Full Text Available Proposed radioactive waste repositories require long residence times within deep geological settings for which we have little knowledge of local or regional subsurface dynamics that could affect the transport of hazardous species over the period of radioactive decay. Given the role of microbial processes on element speciation and transport, knowledge and understanding of local microbial ecology within geological formations being considered as host formations can aid predictions for long term safety. In this relatively unexplored environment, sampling opportunities are few and opportunistic. We combined the data collected for geochemistry and microbial abundances from multiple sampling opportunities from within a proposed host formation and performed multivariate mixing and mass balance (M3 modeling, spatial analysis and generalized linear modeling to address whether recharge can explain how subsurface communities assemble within fracture water obtained from multiple saturated fractures accessed by boreholes drilled into the crystalline formation underlying the Chalk River Laboratories site (Deep River, ON, Canada. We found that three possible source waters, each of meteoric origin, explained 97% of the samples, these are: modern recharge, recharge from the period of the Laurentide ice sheet retreat (ca. ∼12000 years before present and a putative saline source assigned as Champlain Sea (also ca. 12000 years before present. The distributed microbial abundances and geochemistry provide a conceptual model of two distinct regions within the subsurface associated with bicarbonate – used as a proxy for modern recharge – and manganese; these regions occur at depths relevant to a proposed repository within the formation. At the scale of sampling, the associated spatial autocorrelation means that abundances linked with geochemistry were not unambiguously discerned, although fine scale Moran’s eigenvector map (MEM coefficients were correlated with
Statistical inference of level densities from resolved resonance parameters
International Nuclear Information System (INIS)
Froehner, F.H.
1983-08-01
Level densities are most directly obtained by counting the resonances observed in the resolved resonance range. Even in the best measurements, however, weak levels are invariably missed, so that one has to estimate their number and add it to the raw count. The main categories of missing-level estimators are discussed in the present review, viz. (I) ladder methods, including those based on the theory of Hamiltonian matrix ensembles (Dyson-Mehta statistics); (II) methods based on comparison with artificial cross section curves (Monte Carlo simulation, Garrison's autocorrelation method); (III) methods exploiting the observed neutron width distribution by means of Bayesian or more approximate procedures such as maximum-likelihood, least-squares or moment methods, with various recipes for the treatment of detection thresholds and resolution effects. The language of mathematical statistics is employed to clarify the basis of, and the relationship between, the various techniques. Recent progress in the treatment of resolution effects, detection thresholds and p-wave admixture is described. (orig.) [de]
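Category (III) above can be illustrated with a toy Porter-Thomas simulation: reduced neutron widths follow a chi-squared distribution with one degree of freedom, so the fraction of levels expected above a detection threshold is analytic and the raw resonance count can be corrected. This is a simplified sketch with the mean width assumed known, not the review's full Bayesian treatment:

```python
import math
import numpy as np

rng = np.random.default_rng(1)
n_true = 500                  # true number of levels (toy value)
mean_width = 1.0              # average reduced neutron width, assumed known here

# Porter-Thomas: widths are chi-squared with one degree of freedom,
# i.e. the square of a standard normal variate, scaled by the mean width.
widths = mean_width * rng.standard_normal(n_true) ** 2

threshold = 0.1 * mean_width  # experimental detection threshold (assumed)
n_obs = int(np.sum(widths > threshold))

# Expected fraction of levels exceeding the threshold under Porter-Thomas:
# P(w > T) = erfc( sqrt( T / (2 <w>) ) )
frac_seen = math.erfc(math.sqrt(threshold / (2.0 * mean_width)))

n_est = n_obs / frac_seen     # threshold-corrected level count
print(n_obs, round(n_est))    # corrected count should be near n_true
```

Real analyses must additionally estimate the mean width from the truncated sample and handle resolution effects, which is where the maximum-likelihood and Bayesian machinery of the review comes in.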
Model output statistics applied to wind power prediction
Energy Technology Data Exchange (ETDEWEB)
Joensen, A; Giebel, G; Landberg, L [Risoe National Lab., Roskilde (Denmark); Madsen, H; Nielsen, H A [The Technical Univ. of Denmark, Dept. of Mathematical Modelling, Lyngby (Denmark)
1999-03-01
Being able to predict the output of a wind farm online for a day or two in advance has significant advantages for utilities, such as a better ability to schedule fossil-fuelled power plants and a better position on electricity spot markets. In this paper prediction methods based on Numerical Weather Prediction (NWP) models are considered. The spatial resolution used in NWP models implies that these predictions are not valid locally at a specific wind farm. Furthermore, due to the non-stationary nature and complexity of the processes in the atmosphere, and occasional changes of NWP models, the deviation between the predicted and the measured wind will be time dependent. If observational data are available, and if the deviation between the predictions and the observations exhibits systematic behavior, this should be corrected for; if statistical methods are used, this approach is usually referred to as MOS (Model Output Statistics). The influence of atmospheric turbulence intensity, topography, prediction horizon length and auto-correlation of wind speed and power is considered, and to take the time variations into account, adaptive estimation methods are applied. Three estimation techniques are considered and compared: Extended Kalman Filtering, recursive least squares and a new modified recursive least squares algorithm. (au) EU-JOULE-3. 11 refs.
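One of the adaptive estimators mentioned, recursive least squares with a forgetting factor, can be sketched on a synthetic MOS correction. The linear bias model and all data below are invented for illustration; the paper's actual models and parameters may differ:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy data: NWP-predicted wind speed and a biased local measurement
n = 1000
nwp = 5.0 + 2.0 * rng.standard_normal(n)                   # hypothetical forecasts
measured = 1.5 + 0.8 * nwp + 0.3 * rng.standard_normal(n)  # hypothetical local truth

# Recursive least squares with forgetting factor: y ≈ theta[0] + theta[1] * x
lam = 0.99                     # forgetting factor: smaller = more adaptive
theta = np.zeros(2)            # [intercept, slope]
P = 1e3 * np.eye(2)            # large initial covariance: weak prior

for x, y in zip(nwp, measured):
    phi = np.array([1.0, x])               # regressor: intercept + NWP forecast
    k = P @ phi / (lam + phi @ P @ phi)    # gain vector
    theta = theta + k * (y - phi @ theta)  # update with the prediction error
    P = (P - np.outer(k, phi @ P)) / lam   # covariance update with forgetting

print(theta)                   # should approach [1.5, 0.8]
```

The forgetting factor discounts old observations exponentially, which is what lets the correction track slow drifts such as seasonal effects or NWP model changes.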
Ector, Hugo
2010-12-01
I still remember my first book on statistics: "Elementary Statistics with Applications in Medicine and the Biological Sciences" by Frederick E. Croxton. For me, it was the start of pursuing an understanding of statistics in daily life and in medical practice. It was the first volume in a long row of books. In his introduction, Croxton contends that "nearly everyone involved in any aspect of medicine needs to have some knowledge of statistics". The reality is that for many clinicians, statistics are limited to a P value; they are not familiar with statistical methods and have never had the opportunity to learn concise and clear descriptions of the key features. I have experienced how some authors can describe difficult methods in well understandable language; others fail completely. As a teacher, I tell my students that life is impossible without a basic knowledge of statistics. This feeling has resulted in an annual seminar of 90 minutes. This tutorial is the summary of that seminar; it is a summary and a transcription of the best pages I have detected.
International Nuclear Information System (INIS)
Tonchev, N.; Shumovskij, A.S.
1986-01-01
The history of investigations conducted at the JINR in the field of statistical mechanics, beginning with the fundamental works by N.N. Bogolyubov on the microscopic theory of superconductivity, is presented. The ideas introduced in these works, and the methods developed in them, have largely determined the development of statistical mechanics at the JINR, and the Hartree-Fock-Bogolyubov variational principle has become an important method of modern nuclear theory. A brief review of the main achievements connected with the development of statistical mechanics methods and their application in different fields of physical science is given.
Wallis, W Allen
2014-01-01
Focusing on everyday applications as well as those of scientific research, this classic of modern statistical methods requires little to no mathematical background. Readers develop basic skills for evaluating and using statistical data. Lively, relevant examples include applications to business, government, social and physical sciences, genetics, medicine, and public health. ""W. Allen Wallis and Harry V. Roberts have made statistics fascinating."" - The New York Times. ""The authors have set out, with considerable success, to write a text which would be of interest and value to the student who...
D'Alessio, Michael
2012-01-01
AP Statistics Crash Course - Gets You a Higher Advanced Placement Score in Less Time. Crash Course is perfect for the time-crunched student, the last-minute studier, or anyone who wants a refresher on the subject. AP Statistics Crash Course gives you: Targeted, Focused Review - Study Only What You Need to Know. Crash Course is based on an in-depth analysis of the AP Statistics course description outline and actual Advanced Placement test questions. It covers only the information tested on the exam, so you can make the most of your valuable study time. Our easy-to-read format covers: exploring data...
Mauro, John
2013-01-01
Written to reveal statistical deceptions often thrust upon unsuspecting journalists, this book views the use of numbers from a public perspective. Illustrating how the statistical naivete of journalists often nourishes quantitative misinformation, the author's intent is to make journalists more critical appraisers of numerical data so that in reporting them they do not deceive the public. The book frequently uses actual examples of misused statistical data reported by mass media and describes how journalists can avoid being taken in by them. Because reports of survey findings seldom...
Liao, Tim Futing
2011-01-01
An incomparably useful examination of statistical methods for comparison. The nature of doing science, be it natural or social, inevitably calls for comparison. Statistical methods are at the heart of such comparison, for they not only help us gain understanding of the world around us but often define how our research is to be carried out. The need to compare between groups is best exemplified by experiments, which have clearly defined statistical methods. However, true experiments are not always possible. What complicates the matter more is a great deal of diversity in factors that are not independent...
Statistical Pattern Recognition
Webb, Andrew R
2011-01-01
Statistical pattern recognition relates to the use of statistical techniques for analysing data measurements in order to extract information and make justified decisions. It is a very active area of study and research, which has seen many advances in recent years. Applications such as data mining, web searching, multimedia data retrieval, face recognition, and cursive handwriting recognition, all require robust and efficient pattern recognition techniques. This third edition provides an introduction to statistical pattern theory and techniques, with material drawn from a wide range of fields,
Mineral statistics yearbook 1994
International Nuclear Information System (INIS)
1994-01-01
A summary of mineral production in Saskatchewan was compiled and presented as a reference manual. Statistical information on fuel minerals such as crude oil, natural gas, liquefied petroleum gas and coal, and on industrial and metallic minerals such as potash, sodium sulphate, salt and uranium, is provided in a wide variety of tables. Production statistics, disposition and value of sales of industrial and metallic minerals are also given, as are statistical data on the drilling of oil and gas reservoirs and crown land disposition. figs., tabs.
Evolutionary Statistical Procedures
Baragona, Roberto; Poli, Irene
2011-01-01
This proposed text appears to be a good introduction to evolutionary computation for use in applied statistics research. The authors draw from a vast base of knowledge about the current literature in both the design of evolutionary algorithms and statistical techniques. Modern statistical research is on the threshold of solving increasingly complex problems in high dimensions, and the generalization of its methodology to parameters whose estimators do not follow mathematically simple distributions is underway. Many of these challenges involve optimizing functions for which analytic solutions are...
Methods of statistical physics
Akhiezer, Aleksandr I
1981-01-01
Methods of Statistical Physics is an exposition of the tools of statistical mechanics, which evaluates the kinetic equations of classical and quantized systems. The book also analyzes the equations of macroscopic physics, such as the equations of hydrodynamics for normal and superfluid liquids and macroscopic electrodynamics. The text gives particular attention to the study of quantum systems. This study begins with a discussion of problems of quantum statistics with a detailed description of the basics of quantum mechanics along with the theory of measurement. An analysis of the asymptotic behavior...
Cancer Data and Statistics Tools
Cancer Statistics Tools: United States Cancer Statistics Data Visualizations.
Elements of statistical thermodynamics
Nash, Leonard K
2006-01-01
Encompassing essentially all aspects of statistical mechanics that appear in undergraduate texts, this concise, elementary treatment shows how an atomic-molecular perspective yields new insights into macroscopic thermodynamics. 1974 edition.
Department of Veterans Affairs — National-level, VISN-level, and/or VAMC-level statistics on the numbers and percentages of users of VHA care from the Northeast Program Evaluation Center (NEPEC)....
International Nuclear Information System (INIS)
Hilaire, S.
2001-01-01
A review of the statistical model of nuclear reactions is presented. The main relations are described, together with the ingredients necessary to perform practical calculations. In addition, a substantial overview of the width fluctuation correction factor is given. (author)
Plague in the United States: Plague was first introduced ... them at higher risk. Reported Cases of Human Plague, United States, 1970-2016: Since the mid-20th ...
Statistical Measures of Marksmanship
National Research Council Canada - National Science Library
Johnson, Richard
2001-01-01
This report describes objective statistical procedures to measure both rifle marksmanship accuracy, the proximity of an array of shots to the center of mass of a target, and marksmanship precision...
Titanic: A Statistical Exploration.
Takis, Sandra L.
1999-01-01
Uses the available data about the Titanic's passengers to interest students in exploring categorical data and the chi-square distribution. Describes activities incorporated into a statistics class and gives additional resources for collecting information about the Titanic. (ASK)
Sickle cell ... 1999 through 2002. This drop coincided with the introduction in 2000 of a vaccine that protects against ...
U.S. Department of Health & Human Services — The United States Cancer Statistics (USCS) online databases in WONDER provide cancer incidence and mortality data for the United States for the years since 1999, by...
Probability and Statistical Inference
Prosper, Harrison B.
2006-01-01
These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.
On quantum statistical inference
Barndorff-Nielsen, O.E.; Gill, R.D.; Jupp, P.E.
2003-01-01
Interest in problems of statistical inference connected to measurements of quantum systems has recently increased substantially, in step with dramatic new developments in experimental techniques for studying small quantum systems. Furthermore, developments in the theory of quantum measurements have
CMS Statistics Reference Booklet
U.S. Department of Health & Human Services — The annual CMS Statistics reference booklet provides a quick reference for summary information about health expenditures and the Medicare and Medicaid health...
Statistical mechanics of superconductivity
Kita, Takafumi
2015-01-01
This book provides a theoretical, step-by-step comprehensive explanation of superconductivity for undergraduate and graduate students who have completed elementary courses on thermodynamics and quantum mechanics. To this end, it adopts the unique approach of starting with the statistical mechanics of quantum ideal gases and successively adding and clarifying elements and techniques indispensable for understanding it. They include the spin-statistics theorem, second quantization, density matrices, the Bloch–De Dominicis theorem, the variational principle in statistical mechanics, attractive interaction, and bound states. Ample examples of their usage are also provided in terms of topics from advanced statistical mechanics such as two-particle correlations of quantum ideal gases, derivation of the Hartree–Fock equations, and Landau’s Fermi-liquid theory, among others. With these preliminaries, the fundamental mean-field equations of superconductivity are derived with maximum mathematical clarity based on ...
Statistical electromagnetics: Complex cavities
Naus, H.W.L.
2008-01-01
A selection of the literature on the statistical description of electromagnetic fields and complex cavities is concisely reviewed. Some essential concepts, for example, the application of the central limit theorem and the maximum entropy principle, are scrutinized. Implicit assumptions, biased
Davison, Anthony C.; Huser, Raphaël
2015-01-01
Statistics of extremes concerns inference for rare events. Often the events have never yet been observed, and their probabilities must therefore be estimated by extrapolation of tail models fitted to available data. Because data concerning the event
Statistics: a Bayesian perspective
National Research Council Canada - National Science Library
Berry, Donald A
1996-01-01
...: it is the only introductory textbook based on Bayesian ideas, it combines concepts and methods, it presents statistics as a means of integrating data into the scientific process, it develops ideas...
Saffran, Jenny R.; Kirkham, Natasha Z.
2017-01-01
Perception involves making sense of a dynamic, multimodal environment. In the absence of mechanisms capable of exploiting the statistical patterns in the natural world, infants would face an insurmountable computational problem. Infant statistical learning mechanisms facilitate the detection of structure. These abilities allow the infant to compute across elements in their environmental input, extracting patterns for further processing and subsequent learning. In this selective review, we summarize findings that show that statistical learning is both a broad and flexible mechanism (supporting learning from different modalities across many different content areas) and input specific (shifting computations depending on the type of input and goal of learning). We suggest that statistical learning not only provides a framework for studying language development and object knowledge in constrained laboratory settings, but also allows researchers to tackle real-world problems, such as multilingualism, the role of ever-changing learning environments, and differential developmental trajectories. PMID:28793812
CSIR Research Space (South Africa)
Shepperson, L
1997-12-01
Full Text Available This publication contains transport and related statistics on roads, vehicles, infrastructure, passengers, freight, rail, air, maritime and road traffic, and international comparisons. The information compiled in this publication has been gathered...
Müller-Kirsten, Harald J W
2013-01-01
Statistics links microscopic and macroscopic phenomena, and requires for this reason a large number of microscopic elements like atoms. The results are values of maximum probability or of averaging. This introduction to statistical physics concentrates on the basic principles, and attempts to explain these in simple terms supplemented by numerous examples. These basic principles include the difference between classical and quantum statistics, a priori probabilities as related to degeneracies, the vital aspect of indistinguishability as compared with distinguishability in classical physics, the differences between conserved and non-conserved elements, the different ways of counting arrangements in the three statistics (Maxwell-Boltzmann, Fermi-Dirac, Bose-Einstein), the difference between maximization of the number of arrangements of elements, and averaging in the Darwin-Fowler method. Significant applications to solids, radiation and electrons in metals are treated in separate chapters, as well as Bose-Eins...
Genton, Marc G.; Castruccio, Stefano; Crippa, Paola; Dutta, Subhajit; Huser, Raphaël; Sun, Ying; Vettori, Sabrina
2015-01-01
This paper explores the use of visualization through animations, coined visuanimation, in the field of statistics. In particular, it illustrates the embedding of animations in the paper itself and the storage of larger movies in the online
Playing at Statistical Mechanics
Clark, Paul M.; And Others
1974-01-01
Discussed are the applications of counting techniques of a sorting game to distributions and concepts in statistical mechanics. Included are the following distributions: Fermi-Dirac, Bose-Einstein, and most probable. (RH)
Illinois travel statistics, 2008
2009-01-01
The 2008 Illinois Travel Statistics publication is assembled to provide detailed traffic information to the different users of traffic data. While most users of traffic data at this level of detail are within the Illinois Department of Transporta...
Illinois travel statistics, 2009
2010-01-01
The 2009 Illinois Travel Statistics publication is assembled to provide detailed traffic information to the different users of traffic data. While most users of traffic data at this level of detail are within the Illinois Department of Transporta...
Cholesterol Facts and Statistics
... Deo R, et al. Heart disease and stroke statistics—2017 update: a report from the American Heart ...
Illinois travel statistics, 2010
2011-01-01
The 2010 Illinois Travel Statistics publication is assembled to provide detailed traffic information to the different users of traffic data. While most users of traffic data at this level of detail are within the Illinois Department of Transporta...
U.S. Department of Health & Human Services — This section contains statistical information and reports related to the percentage of electronic transactions being sent to Medicare contractors in the formats...
Information theory and statistics
Kullback, Solomon
1968-01-01
Highly useful text studies logarithmic measures of information and their application to testing statistical hypotheses. Includes numerous worked examples and problems. References. Glossary. Appendix. 1968 2nd, revised edition.
Department of Homeland Security — Accident statistics available on the Coast Guard’s website by state, year, and one variable to obtain tables and/or graphs. Data from reports has been loaded for...
Medicaid Drug Claims Statistics
U.S. Department of Health & Human Services — The Medicaid Drug Claims Statistics CD is a useful tool that conveniently breaks up Medicaid claim counts and separates them by quarter and includes an annual count.
Scheck, Florian
2016-01-01
Scheck’s textbook starts with a concise introduction to classical thermodynamics, including geometrical aspects. Then a short introduction to probabilities and statistics lays the basis for the statistical interpretation of thermodynamics. Phase transitions, discrete models and the stability of matter are explained in great detail. Thermodynamics has a special role in theoretical physics: due to its general approach, it has a bridging function between several areas, such as the theory of condensed matter, elementary particle physics, astrophysics and cosmology. Classical thermodynamics describes predominantly averaged properties of matter, reaching from few-particle systems and states of matter to stellar objects. Statistical thermodynamics covers the same fields, but explores them in greater depth and unifies classical statistical mechanics with the quantum theory of multiple-particle systems. The content is presented as two tracks: the fast track for master students, providing the essen...
Record Statistics and Dynamics
DEFF Research Database (Denmark)
Sibani, Paolo; Jensen, Henrik J.
2009-01-01
with independent random increments. The term record dynamics covers the rather new idea that records may, in special situations, have measurable dynamical consequences. The approach applies to the aging dynamics of glasses and other systems with multiple metastable states. The basic idea is that record sizes...... fluctuations of e.g. the energy are able to push the system past some sort of ‘edge of stability’, inducing irreversible configurational changes, whose statistics then closely follows the statistics of record fluctuations....
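The record statistics invoked here have a compact classical illustration: for a sequence of independent, identically distributed continuous draws, the probability that step n sets a new record is 1/n, so the expected number of records after n steps is the harmonic number H_n ≈ ln n. A minimal simulation sketch (illustrative only; the function and parameter names are not from the paper):

```python
import random

def count_records(xs):
    """Count upper records: entries strictly greater than every earlier entry."""
    best = float("-inf")
    records = 0
    for x in xs:
        if x > best:
            best = x
            records += 1
    return records

# For i.i.d. continuous draws, P(record at step n) = 1/n, so the
# expected record count after n steps is the harmonic number H_n ~ ln n.
random.seed(0)
n, trials = 2_000, 200
avg = sum(count_records([random.random() for _ in range(n)])
          for _ in range(trials)) / trials
harmonic = sum(1.0 / k for k in range(1, n + 1))
print(round(avg, 2), round(harmonic, 2))  # the two should be close
```

The logarithmically slowing record rate is what makes record-driven dynamics a natural model for aging systems.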
Introductory statistical inference
Mukhopadhyay, Nitis
2014-01-01
This gracefully organized text reveals the rigorous theory of probability and statistical inference in the style of a tutorial, using worked examples, exercises, figures, tables, and computer simulations to develop and illustrate concepts. Drills and boxed summaries emphasize and reinforce important ideas and special techniques.Beginning with a review of the basic concepts and methods in probability theory, moments, and moment generating functions, the author moves to more intricate topics. Introductory Statistical Inference studies multivariate random variables, exponential families of dist
Statistical mechanics rigorous results
Ruelle, David
1999-01-01
This classic book marks the beginning of an era of vigorous mathematical progress in equilibrium statistical mechanics. Its treatment of the infinite system limit has not been superseded, and the discussion of thermodynamic functions and states remains basic for more recent work. The conceptual foundation provided by the Rigorous Results remains invaluable for the study of the spectacular developments of statistical mechanics in the second half of the 20th century.
Statistical mechanics of anyons
International Nuclear Information System (INIS)
Arovas, D.P.
1985-01-01
We study the statistical mechanics of a two-dimensional gas of free anyons - particles which interpolate between Bose-Einstein and Fermi-Dirac character. Thermodynamic quantities are discussed in the low-density regime. In particular, the second virial coefficient is evaluated by two different methods and is found to exhibit a simple, periodic, but nonanalytic behavior as a function of the statistics determining parameter. (orig.)
Mulholland, Henry
1968-01-01
Fundamentals of Statistics covers topics on the introduction, fundamentals, and science of statistics. The book discusses the collection, organization and representation of numerical data; elementary probability; the binomial and Poisson distributions; and the measures of central tendency. The text describes measures of dispersion for measuring the spread of a distribution; continuous distributions for measuring on a continuous scale; the properties and use of the normal distribution; and tests involving the normal or Student's t distributions. The use of control charts for sample means; the ranges
Business statistics I essentials
Clark, Louise
2014-01-01
REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Business Statistics I includes descriptive statistics, introduction to probability, probability distributions, sampling and sampling distributions, interval estimation, and hypothesis t
Johnson, Norman
This is the third volume of a collection of seminal papers in the statistical sciences written during the past 110 years. These papers have each had an outstanding influence on the development of statistical theory and practice over the last century. Each paper is preceded by an introduction written by an authority in the field providing background information and assessing its influence. Volume III concentrates on articles from the 1980s while including some earlier articles not included in Volumes I and II. Samuel Kotz is Professor of Statistics in the College of Business and Management at the University of Maryland. Norman L. Johnson is Professor Emeritus of Statistics at the University of North Carolina. Also available: Breakthroughs in Statistics Volume I: Foundations and Basic Theory Samuel Kotz and Norman L. Johnson, Editors 1993. 631 pp. Softcover. ISBN 0-387-94037-5 Breakthroughs in Statistics Volume II: Methodology and Distribution Samuel Kotz and Norman L. Johnson, Edi...
UN Data- Environmental Statistics: Waste
World Wide Human Geography Data Working Group — The Environment Statistics Database contains selected water and waste statistics by country. Statistics on water and waste are based on official statistics supplied...
UN Data: Environment Statistics: Waste
World Wide Human Geography Data Working Group — The Environment Statistics Database contains selected water and waste statistics by country. Statistics on water and waste are based on official statistics supplied...
Frydel, Derek; Rice, Stuart A
2007-12-01
We report a hydrodynamic analysis of the long-time behavior of the linear and angular velocity autocorrelation functions of an isolated colloid particle constrained to have quasi-two-dimensional motion, and compare the predicted behavior with the results of lattice-Boltzmann simulations. Our analysis uses the singularity method to characterize unsteady linear motion of an incompressible fluid. For bounded fluids we construct an image system with a discrete set of fundamental solutions of the Stokes equation from which we extract the long-time decay of the velocity. For the case that there are free-slip boundary conditions at walls separated by H particle diameters, the time evolution of the parallel linear velocity and the perpendicular rotational velocity following impulsive excitation both correspond to the time evolution of a two-dimensional (2D) fluid with effective density ρ_2D = ρH. For the case that there are no-slip boundary conditions at the walls, the same types of motion correspond to 2D fluid motions with a coefficient of friction ξ = π²ν/H², modulo a prefactor of order 1, with ν the kinematic viscosity. The linear particle motion perpendicular to the walls also experiences an effective frictional force, but the time dependence is proportional to t⁻², which cannot be related to either pure 3D or pure 2D fluid motion. Our incompressible fluid model predicts correct self-diffusion constants but it does not capture all of the effects of the fluid confinement on the particle motion. In particular, the linear motion of a particle perpendicular to the walls is influenced by coupling between the density flux and the velocity field, which leads to damped velocity oscillations whose frequency is proportional to c_s/H, with c_s the velocity of sound. For particle motion parallel to no-slip walls there is a slowing down of a density flux that spreads diffusively, which generates a long-time decay proportional to t⁻¹.
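The velocity autocorrelation function central to this analysis can be estimated directly from a discretized trajectory by averaging v(0)·v(t) over time origins. The sketch below is illustrative only: it feeds the estimator a simple 1D Langevin-type velocity rather than the paper's lattice-Boltzmann data, and all names and parameters are assumptions:

```python
import numpy as np

def vacf(v, max_lag):
    """Velocity autocorrelation C(t) = <v(0) . v(t)>, averaged over time origins.

    v: array of shape (n_steps, dim), one particle's velocity trajectory.
    """
    n = len(v)
    return np.array([np.mean(np.sum(v[: n - lag] * v[lag:], axis=1))
                     for lag in range(max_lag)])

# Toy trajectory: a 1D exponentially relaxing (Langevin-type) velocity,
# so C(t) should decay monotonically on average.
rng = np.random.default_rng(1)
dt, gamma = 0.01, 1.0
v = np.zeros((20_000, 1))
for i in range(1, len(v)):
    v[i] = v[i - 1] - gamma * v[i - 1] * dt + np.sqrt(dt) * rng.normal()
c = vacf(v, 200)
print(c[0] > c[50] > c[150] > 0)
```

Distinguishing an exponential decay from the algebraic t⁻¹ or t⁻² tails discussed in the abstract requires much longer trajectories and careful averaging; this sketch only shows the estimator itself.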
Conformity and statistical tolerancing
Leblond, Laurent; Pillet, Maurice
2018-02-01
Statistical tolerancing was first proposed by Shewhart (Economic Control of Quality of Manufactured Product, 1931; reprinted 1980 by ASQC); in spite of this long history, its use remains moderate. One of the probable reasons for this low utilization is undoubtedly the difficulty for designers to anticipate the risks of this approach. Arithmetic (worst-case) tolerancing allows a simple interpretation: conformity is defined by the presence of the characteristic in an interval. Statistical tolerancing is more complex in its definition: an interval is not sufficient to define conformity. To justify the statistical tolerancing formula used by designers, a tolerance interval should be interpreted as the interval where most of the parts produced should probably be located. This tolerance is justified by considering a conformity criterion for the parts that guarantees low offsets on the latter characteristics. Unlike traditional arithmetic tolerancing, statistical tolerancing requires a sustained exchange of information between design and manufacturing to be used safely. This paper proposes a formal definition of conformity, which we apply successively to quadratic and arithmetic tolerancing. We introduce a concept of concavity, which helps us demonstrate the link between the tolerancing approach and conformity. We use this concept to demonstrate the various acceptable propositions of statistical tolerancing (in the (decentring, dispersion) space).
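The contrast between arithmetic (worst-case) and quadratic (statistical) stack-up can be made concrete. The sketch below uses hypothetical component tolerances; the root-sum-square formula assumes independent, centred component deviations, which is the textbook justification rather than the paper's formal conformity criterion:

```python
import math

def arithmetic_tolerance(tols):
    """Worst-case stack-up: component tolerances add linearly."""
    return sum(tols)

def statistical_tolerance(tols):
    """Quadratic (root-sum-square) stack-up, assuming independent,
    centred component deviations."""
    return math.sqrt(sum(t * t for t in tols))

# Hypothetical +/- tolerances of four stacked parts
tols = [0.1, 0.1, 0.1, 0.1]
print(round(arithmetic_tolerance(tols), 3))   # 0.4
print(round(statistical_tolerance(tols), 3))  # 0.2
```

The quadratic stack-up is markedly tighter, which is exactly why its safe use depends on the design-manufacturing information exchange the abstract emphasizes: the independence and centring assumptions must actually hold in production.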
Intuitive introductory statistics
Wolfe, Douglas A
2017-01-01
This textbook is designed to give an engaging introduction to statistics and the art of data analysis. The unique scope includes, but also goes beyond, classical methodology associated with the normal distribution. What if the normal model is not valid for a particular data set? This cutting-edge approach provides the alternatives. It is an introduction to the world and possibilities of statistics that uses exercises, computer analyses, and simulations throughout the core lessons. These elementary statistical methods are intuitive. Counting and ranking feature prominently in the text. Nonparametric methods, for instance, are often based on counts and ranks and are very easy to integrate into an introductory course. The ease of computation with advanced calculators and statistical software, both of which factor into this text, allows important techniques to be introduced earlier in the study of statistics. This book's novel scope also includes measuring symmetry with Walsh averages, finding a nonp...
International Nuclear Information System (INIS)
Holttinen, H.; Tammelin, B.; Hyvoenen, R.
1997-01-01
The recording, analyzing and publishing of statistics of wind energy production has been reorganized in cooperation between VTT Energy, the Finnish Meteorological Institute (FMI Energy) and the Finnish Wind Energy Association (STY), supported by the Ministry of Trade and Industry (KTM). VTT Energy has developed a database that contains both monthly data and information on the wind turbines, sites and operators involved. The monthly production figures together with component failure statistics are collected from the operators by VTT Energy, who produces the final wind energy statistics to be published in Tuulensilmä and reported to energy statistics in Finland and abroad (Statistics Finland, Eurostat, IEA). To verify the annual and monthly wind energy production against the average wind energy climate, a production index is adopted. The index gives the expected wind energy production at various areas in Finland, calculated using real wind speed observations, air density and a power curve for a typical 500 kW wind turbine. FMI Energy has produced the average figures for four weather stations using the data from 1985-1996, and produces the monthly figures. (orig.)
Probability and Bayesian statistics
1987-01-01
This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...
Multivariate Statistical Process Control
DEFF Research Database (Denmark)
Kulahci, Murat
2013-01-01
As sensor and computer technology continues to improve, it has become a normal occurrence that we are confronted with high-dimensional data sets. As in many areas of industrial statistics, this brings forth various challenges in statistical process control (SPC) and monitoring, for which the aim...... is to identify the “out-of-control” state of a process using control charts in order to reduce the excessive variation caused by so-called assignable causes. In practice, the most common method of monitoring multivariate data is through a statistic akin to Hotelling’s T2. For high-dimensional data with an excessive...... amount of cross correlation, practitioners are often recommended to use latent structures methods such as Principal Component Analysis to summarize the data in only a few linear combinations of the original variables that capture most of the variation in the data. Applications of these control charts...
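The Hotelling-type monitoring statistic mentioned above can be sketched in a few lines: each observation's squared Mahalanobis distance from the in-control mean is computed, and unusually large values flag potential out-of-control states. The data and the injected out-of-control point below are hypothetical:

```python
import numpy as np

def hotelling_t2(X):
    """Per-observation Hotelling-type T2 statistic against the sample
    mean and covariance. X: (n, p) array of multivariate observations."""
    mean = X.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
    d = X - mean
    # Quadratic form d_i^T C^{-1} d_i for every row i
    return np.einsum("ij,jk,ik->i", d, cov_inv, d)

rng = np.random.default_rng(0)
# Correlated in-control process, plus one injected out-of-control point
# that violates the correlation structure without extreme marginal values
X = rng.multivariate_normal([0, 0], [[1, 0.8], [0.8, 1]], size=200)
X[100] = [4, -4]
t2 = hotelling_t2(X)
print(int(np.argmax(t2)))  # index of the injected point
```

In a production control chart the threshold would come from an F or chi-squared reference distribution rather than a simple argmax, and for many variables the covariance would first be summarized by PCA, as the abstract notes.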
DEFF Research Database (Denmark)
Lindström, Erik; Madsen, Henrik; Nielsen, Jan Nygaard
Statistics for Finance develops students’ professional skills in statistics with applications in finance. Developed from the authors’ courses at the Technical University of Denmark and Lund University, the text bridges the gap between classical, rigorous treatments of financial mathematics...... that rarely connect concepts to data and books on econometrics and time series analysis that do not cover specific problems related to option valuation. The book discusses applications of financial derivatives pertaining to risk assessment and elimination. The authors cover various statistical...... and mathematical techniques, including linear and nonlinear time series analysis, stochastic calculus models, stochastic differential equations, Itō’s formula, the Black–Scholes model, the generalized method-of-moments, and the Kalman filter. They explain how these tools are used to price financial derivatives...
1992 Energy Statistics Yearbook
International Nuclear Information System (INIS)
1994-01-01
The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from annual questionnaires distributed by the United Nations Statistical Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistical Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities
Energy Technology Data Exchange (ETDEWEB)
NONE
2010-07-01
Detailed, complete, timely and reliable statistics are essential to monitor the energy situation at a country level as well as at an international level. Energy statistics on supply, trade, stocks, transformation and demand are indeed the basis for any sound energy policy decision. For instance, the market of oil -- which is the largest traded commodity worldwide -- needs to be closely monitored in order for all market players to know at any time what is produced, traded, stocked and consumed, and by whom. In view of the role and importance of energy in world development, one would expect basic energy information to be readily available and reliable. This is not always the case, and one can even observe a decline in the quality, coverage and timeliness of energy statistics over the last few years.
Statistical Engine Knock Control
DEFF Research Database (Denmark)
Stotsky, Alexander A.
2008-01-01
A new statistical concept of the knock control of a spark ignition automotive engine is proposed. The control aim is associated with the statistical hypothesis test which compares the threshold value to the average value of the maximal amplitude of the knock sensor signal at a given frequency....... Control algorithm which is used for minimization of the regulation error realizes a simple count-up-count-down logic. A new adaptation algorithm for the knock detection threshold is also developed. Confidence interval method is used as the basis for adaptation. A simple statistical model...... which includes generation of the amplitude signals, a threshold value determination and a knock sound model is developed for evaluation of the control concept....
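The count-up/count-down logic can be illustrated with a toy sketch. Everything below (counter limit, step, threshold, the amplitude sequence) is a hypothetical choice for illustration, not the paper's calibration or exact control law:

```python
def count_up_down_controller(amplitudes, threshold, step=1, limit=3):
    """Hypothetical count-up/count-down knock logic: count up on each knock
    event (amplitude above the threshold), count down otherwise; flag a
    correction whenever the counter saturates at its limit."""
    counter, decisions = 0, []
    for a in amplitudes:
        if a > threshold:
            counter = min(counter + step, limit)
        else:
            counter = max(counter - step, 0)
        decisions.append(counter == limit)  # True -> retard ignition this cycle
    return decisions

signal = [0.2, 1.4, 1.6, 1.8, 0.3, 0.2, 0.1, 1.5]  # hypothetical amplitudes
decisions = count_up_down_controller(signal, threshold=1.0)
print(decisions)
```

The counter acts as a crude filter: isolated threshold crossings do not trigger a correction, only a sustained run of knock events does, which is what makes the scheme robust to single-cycle noise.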
Forster, Malcolm R
2011-01-01
Statisticians and philosophers of science have many common interests but restricted communication with each other. This volume aims to remedy these shortcomings. It provides state-of-the-art research in the area of philosophy of statistics by encouraging numerous experts to communicate with one another without feeling “restricted” by their disciplines or thinking “piecemeal” in their treatment of issues. A second goal of this book is to present work in the field without bias toward any particular statistical paradigm. Broadly speaking, the essays in this Handbook are concerned with problems of induction, statistics and probability. For centuries, foundational problems like induction have been among philosophers' favorite topics; recently, however, non-philosophers have increasingly taken a keen interest in these issues. This volume accordingly contains papers by both philosophers and non-philosophers, including scholars from nine academic disciplines.
Perception in statistical graphics
VanderPlas, Susan Ruth
There has been quite a bit of research on statistical graphics and visualization, generally focused on new types of graphics, new software to create graphics, interactivity, and usability studies. Our ability to interpret and use statistical graphics hinges on the interface between the graph itself and the brain that perceives and interprets it, and there is substantially less research on the interplay between graph, eye, brain, and mind than is sufficient to understand the nature of these relationships. The goal of the work presented here is to further explore the interplay between a static graph, the translation of that graph from paper to mental representation (the journey from eye to brain), and the mental processes that operate on that graph once it is transferred into memory (mind). Understanding the perception of statistical graphics should allow researchers to create more effective graphs which produce fewer distortions and viewer errors while reducing the cognitive load necessary to understand the information presented in the graph. Taken together, these experiments should lay a foundation for exploring the perception of statistical graphics. There has been considerable research into the accuracy of numerical judgments viewers make from graphs, and these studies are useful, but it is more effective to understand how errors in these judgments occur so that the root cause of the error can be addressed directly. Understanding how visual reasoning relates to the ability to make judgments from graphs allows us to tailor graphics to particular target audiences. In addition, understanding the hierarchy of salient features in statistical graphics allows us to clearly communicate the important message from data or statistical models by constructing graphics which are designed specifically for the perceptual system.
READING STATISTICS AND RESEARCH
Directory of Open Access Journals (Sweden)
Reviewed by Yavuz Akbulut
2008-10-01
Full Text Available The book demonstrates the best and most conservative ways to decipher and critique research reports, particularly for social science researchers. In addition, new editions of the book are always better organized, effectively structured and meticulously updated in line with the developments in the field of research statistics. Even the most trivial issues are revisited and updated in new editions. For instance, purchasers of the previous editions might check the interpretation of skewness and kurtosis indices in the third edition (p. 34) and in the fifth edition (p. 29) to see how the author revisits every single detail. Theory and practice always go hand in hand in all editions of the book. Re-reading previous editions (e.g. the third edition) before reading the fifth edition gives the impression that the author never stops ameliorating his instructional text writing methods. In brief, “Reading Statistics and Research” is among the best sources showing research consumers how to understand and critically assess the statistical information and research results contained in technical research reports. In this respect, the review written by Mirko Savić in Panoeconomicus (2008, 2, pp. 249-252) will help the readers to get a more detailed overview of each chapter. I cordially urge beginning researchers to pick a highlighter to conduct a detailed reading of the book. A thorough reading of the source will make researchers quite selective in appreciating the harmony between the data analysis, results and discussion sections of typical journal articles. If interested, beginning researchers might begin with this book to grasp the basics of research statistics, and prop up their critical research reading skills with some statistics package applications through the help of Dr. Andy Field’s book, Discovering Statistics using SPSS (second edition, published by Sage in 2005).
Waller, Derek L
2008-01-01
Statistical analysis is essential to business decision-making and management, but the underlying theory of data collection, organization and analysis is one of the most challenging topics for business students and practitioners. This user-friendly text and CD-ROM package will help you to develop strong skills in presenting and interpreting statistical information in a business or management environment. Based entirely on using Microsoft Excel rather than more complicated applications, it includes a clear guide to using Excel with the key functions employed in the book, a glossary of terms and
Statistics As Principled Argument
Abelson, Robert P
2012-01-01
In this illuminating volume, Robert P. Abelson delves into the too-often dismissed problems of interpreting quantitative data and then presenting them in the context of a coherent story about one's research. Unlike too many books on statistics, this is a remarkably engaging read, filled with fascinating real-life (and real-research) examples rather than with recipes for analysis. It will be of true interest and lasting value to beginning graduate students and seasoned researchers alike. The focus of the book is that the purpose of statistics is to organize a useful argument from quantitative
International Nuclear Information System (INIS)
1998-01-01
The international office of energy information and studies (Enerdata) has published the second edition of its 1997 statistical yearbook, which includes consolidated 1996 data with respect to the previous version from June 1997. The CD-Rom comprises the annual worldwide petroleum, natural gas, coal and electricity statistics from 1991 to 1996, with information about production, external trade, consumption, market shares, sectoral distribution of consumption and energy balance sheets. The world is divided into 12 zones (52 countries available). It also contains energy indicators: production and consumption tendencies, supply and production structures, security of supplies, energy efficiency, and CO2 emissions. (J.S.)
Einstein's statistical mechanics
Energy Technology Data Exchange (ETDEWEB)
Baracca, A; Rechtman S, R
1985-08-01
The foundations of equilibrium classical statistical mechanics were laid down in 1902 independently by Gibbs and Einstein. The latter's contribution, developed in three papers published between 1902 and 1904, is usually forgotten and, when not, rapidly dismissed as equivalent to Gibbs's. We review in detail Einstein's ideas on the foundations of statistical mechanics and show that they constitute the beginning of a research program that led Einstein to quantum theory. We also show how these ideas may be used as a starting point for an introductory course on the subject.
Einstein's statistical mechanics
International Nuclear Information System (INIS)
Baracca, A.; Rechtman S, R.
1985-01-01
The foundations of equilibrium classical statistical mechanics were laid down in 1902 independently by Gibbs and Einstein. The latter's contribution, developed in three papers published between 1902 and 1904, is usually forgotten and, when not, rapidly dismissed as equivalent to Gibbs's. We review in detail Einstein's ideas on the foundations of statistical mechanics and show that they constitute the beginning of a research program that led Einstein to quantum theory. We also show how these ideas may be used as a starting point for an introductory course on the subject. (author)
Neave, Henry R
2012-01-01
This book, designed for students taking a basic introductory course in statistical analysis, is far more than just a book of tables. Each table is accompanied by a careful but concise explanation and useful worked examples. Requiring little mathematical background, Elementary Statistics Tables is thus not just a reference book but a positive and user-friendly teaching and learning aid. The new edition contains a new and comprehensive "teach-yourself" section on a simple but powerful approach, now well-known in parts of industry but less so in academia, to analysing and interpreting process dat
Energy Technology Data Exchange (ETDEWEB)
Suh, M. Y.; Jee, K. Y.; Park, K. K.; Park, Y. J.; Kim, W. H.
1999-08-01
This report is intended to describe the statistical methods necessary to design and conduct radiation counting experiments and evaluate the data from the experiment. The methods are described for the evaluation of the stability of a counting system and the estimation of the precision of counting data by application of probability distribution models. The methods for the determination of the uncertainty of the results calculated from the number of counts, as well as various statistical methods for the reduction of counting error are also described. (Author). 11 refs., 8 tabs., 8 figs.
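The uncertainty methods described in this report rest on standard Poisson counting statistics, under which the variance of a raw count equals the count itself. As a hedged illustration (this sketch is not taken from the report; the function name and the equal-live-time simplification are assumptions of mine), the 1-sigma uncertainty of a net count rate can be computed as:

```python
import math

def counting_uncertainty(gross_counts, background_counts, live_time_s):
    """Net count rate and its 1-sigma uncertainty, assuming Poisson statistics.

    Gross and background are assumed to be raw counts accumulated over the
    same live time (an illustrative simplification).
    """
    net = gross_counts - background_counts
    # Variances of independent Poisson counts add: sigma^2 = N_gross + N_bkg.
    sigma_net = math.sqrt(gross_counts + background_counts)
    rate = net / live_time_s
    sigma_rate = sigma_net / live_time_s
    return rate, sigma_rate

rate, sigma = counting_uncertainty(10_000, 400, live_time_s=100.0)
# net = 9600 counts over 100 s -> rate = 96.0 counts/s
```

Longer counting times reduce the relative error of the rate, which is one of the error-reduction strategies such reports typically quantify.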
Energy Technology Data Exchange (ETDEWEB)
Suh, M. Y.; Jee, K. Y.; Park, K. K. [Korea Atomic Energy Research Institute, Taejon (Korea)
1999-08-01
This report is intended to describe the statistical methods necessary to design and conduct radiation counting experiments and evaluate the data from the experiments. The methods are described for the evaluation of the stability of a counting system and the estimation of the precision of counting data by application of probability distribution models. The methods for the determination of the uncertainty of the results calculated from the number of counts, as well as various statistical methods for the reduction of counting error are also described. 11 refs., 6 figs., 8 tabs. (Author)
Bayesian statistics an introduction
Lee, Peter M
2012-01-01
Bayesian Statistics is the school of thought that combines prior beliefs with the likelihood of a hypothesis to arrive at posterior beliefs. The first edition of Peter Lee’s book appeared in 1989, but the subject has moved ever onwards, with increasing emphasis on Monte Carlo based techniques. This new fourth edition looks at recent techniques such as variational methods, Bayesian importance sampling, approximate Bayesian computation and Reversible Jump Markov Chain Monte Carlo (RJMCMC), providing a concise account of the way in which the Bayesian approach to statistics develops as wel
Milewski, Emil G
2012-01-01
REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Statistics II discusses sampling theory, statistical inference, independent and dependent variables, correlation theory, experimental design, count data, chi-square test, and time se
Search Databases and Statistics
DEFF Research Database (Denmark)
Refsgaard, Jan C; Munk, Stephanie; Jensen, Lars J
2016-01-01
...having strengths and weaknesses that must be considered for the individual needs. These are reviewed in this chapter. Equally critical for generating highly confident output datasets is the application of sound statistical criteria to limit the inclusion of incorrect peptide identifications from database searches. Additionally, careful filtering and use of appropriate statistical tests on the output datasets affects the quality of all downstream analyses and interpretation of the data. Our considerations and general practices on these aspects of phosphoproteomics data processing are presented here.
Environmental accounting and statistics
International Nuclear Information System (INIS)
Bartelmus, P.L.P.
1992-01-01
The objective of sustainable development is to integrate environmental concerns with mainstream socio-economic policies. Integrated policies need to be supported by integrated data. Environmental accounting achieves this integration by incorporating environmental costs and benefits into conventional national accounts. Modified accounting aggregates can thus be used in defining and measuring environmentally sound and sustainable economic growth. Further development objectives need to be assessed by more comprehensive, though necessarily less integrative, systems of environmental statistics and indicators. Integrative frameworks for the different statistical systems in the fields of economy, environment and population would facilitate the provision of comparable data for the analysis of integrated development. (author). 19 refs, 2 figs, 2 tabs
International Nuclear Information System (INIS)
Suh, M. Y.; Jee, K. Y.; Park, K. K.; Park, Y. J.; Kim, W. H.
1999-08-01
This report is intended to describe the statistical methods necessary to design and conduct radiation counting experiments and evaluate the data from the experiment. The methods are described for the evaluation of the stability of a counting system and the estimation of the precision of counting data by application of probability distribution models. The methods for the determination of the uncertainty of the results calculated from the number of counts, as well as various statistical methods for the reduction of counting error are also described. (Author). 11 refs., 8 tabs., 8 figs
Fisher's Contributions to Statistics
Indian Academy of Sciences (India)
Fisher's Contributions to Statistics. T Krishnan. General Article, Resonance – Journal of Science Education, Volume 2, Issue 9, September 1997, pp. 32-37. Permanent link: https://www.ias.ac.in/article/fulltext/reso/002/09/0032-0037.
ASURV: Astronomical SURVival Statistics
Feigelson, E. D.; Nelson, P. I.; Isobe, T.; LaValley, M.
2014-06-01
ASURV (Astronomical SURVival Statistics) provides astronomy survival analysis for right- and left-censored data including the maximum-likelihood Kaplan-Meier estimator and several univariate two-sample tests, bivariate correlation measures, and linear regressions. ASURV is written in FORTRAN 77, and is stand-alone and does not call any specialized libraries.
Elementary statistical physics
Kittel, C
1965-01-01
This book is intended to help physics students attain a modest working knowledge of several areas of statistical mechanics, including stochastic processes and transport theory. The areas discussed are among those forming a useful part of the intellectual background of a physicist.
Whither Statistics Education Research?
Watson, Jane
2016-01-01
This year marks the 25th anniversary of the publication of a "National Statement on Mathematics for Australian Schools", which was the first curriculum statement this country had including "Chance and Data" as a significant component. It is hence an opportune time to survey the history of the related statistics education…
African Journals Online (AJOL)
'Incidents' refer to 'incidents such as labour disputes and dissatisfaction with service delivery in which violence erupted and SAPS action was required to restore peace and order'. It is apparent from both the SAPS statistics and those provided by the Municipal IQ Hotspots Monitor that public protests and gatherings are.
International Nuclear Information System (INIS)
Oelkers, E.; Heller, A.S.; Farnsworth, D.A.; Kearfott, K.J.
1978-01-01
The report describes the statistical analysis of the DNBR thermal-hydraulic margin of a 3800 MWt, 205-FA core under design overpower conditions. The analysis used LYNX-generated data at predetermined values of the input variables whose uncertainties were to be statistically combined. LYNX data were used to construct an efficient response surface model in the region of interest; the statistical analysis was accomplished through the evaluation of core reliability, utilizing propagation of the uncertainty distributions of the inputs. The response surface model was implemented in both the analytical error propagation and Monte Carlo techniques. The basic structural units relating to the acceptance criteria are fuel pins. Therefore, the statistical population of pins with minimum DNBR values smaller than specified values is determined. The specified values are designated relative to the most probable and maximum design DNBR values on the power-limiting pin used in present design analysis, so that gains over the present design criteria could be assessed for specified probabilistic acceptance criteria. The results are equivalent to gains ranging from 1.2 to 4.8 percent of rated power, dependent on the acceptance criterion. The corresponding acceptance criteria range from 95 percent confidence that no pin will be in DNB to 99.9 percent of the pins being expected to avoid DNB.
Illinois forest statistics, 1985.
Jerold T. Hahn
1987-01-01
The third inventory of the timber resource of Illinois shows a 1% increase in commercial forest area and a 40% gain in growing-stock volume between 1962 and 1985. Presented are highlights and statistics on area, volume, growth, mortality, removals, utilization, and biomass.
Air Carrier Traffic Statistics.
2013-11-01
This report contains airline operating statistics for large certificated air carriers based on data reported to U.S. Department of Transportation (DOT) by carriers that hold a certificate issued under Section 401 of the Federal Aviation Act of 1958 a...
CERN. Geneva
2005-01-01
The three lectures will present an introduction to statistical methods as used in High Energy Physics. As the time will be very limited, the course will seek mainly to define the important issues and to introduce the most widely used tools. Topics will include the interpretation and use of probability, estimation of parameters and testing of hypotheses.
Introduction to Statistics course
CERN. Geneva HR-RFA
2006-01-01
The four lectures will present an introduction to statistical methods as used in High Energy Physics. As the time will be very limited, the course will seek mainly to define the important issues and to introduce the most widely used tools. Topics will include the interpretation and use of probability, estimation of parameters and testing of hypotheses.
Fisher's Contributions to Statistics
Indian Academy of Sciences (India)
...of statistics are multifarious, profound and long-lasting. In fact, he can be ... that it is not even possible to mention them all in this short article. ... At that time the term 'likelihood' as oppo- ... Dedicated to the memory of Fisher soon after his death, ...
Michigan forest statistics, 1980.
Gerhard K. Raile; W. Brad Smith
1983-01-01
The fourth inventory of the timber resource of Michigan shows a 7% decline in commercial forest area and a 27% gain in growing-stock volume between 1966 and 1980. Highlights and statistics are presented on area, volume, growth, mortality, removals, utilization, and biomass.
Air Carrier Traffic Statistics.
2012-07-01
This report contains airline operating statistics for large certificated air carriers based on data reported to U.S. Department of Transportation (DOT) by carriers that hold a certificate issued under Section 401 of the Federal Aviation Act of 1958 a...
Geometric statistical inference
International Nuclear Information System (INIS)
Periwal, Vipul
1999-01-01
A reparametrization-covariant formulation of the inverse problem of probability is explicitly solved for finite sample sizes. The inferred distribution is explicitly continuous for finite sample size. A geometric solution of the statistical inference problem in higher dimensions is outlined
CERN. Geneva
2004-01-01
The three lectures will present an introduction to statistical methods as used in High Energy Physics. As the time will be very limited, the course will seek mainly to define the important issues and to introduce the most widely used tools. Topics will include the interpretation and use of probability, estimation of parameters and testing of hypotheses.
Hemophilia Data and Statistics
... genetic testing is done to diagnose hemophilia before birth. For the one-third ... rates and hospitalization rates for bleeding complications from hemophilia ...
Statistical learning and prejudice.
Madison, Guy; Ullén, Fredrik
2012-12-01
Human behavior is guided by evolutionarily shaped brain mechanisms that make statistical predictions based on limited information. Such mechanisms are important for facilitating interpersonal relationships, avoiding dangers, and seizing opportunities in social interaction. We thus suggest that it is essential for analyses of prejudice and prejudice reduction to take the predictive accuracy and adaptivity of the studied prejudices into account.
Mosteller, Frederick; Hoaglin, David C; Tanur, Judith M
2010-01-01
Includes chapter-length insider accounts of work on the pre-election polls of 1948, statistical aspects of the Kinsey report on sexual behavior in the human male, mathematical learning theory, authorship of the disputed Federalist papers, safety of anesthetics, and an examination of the Coleman report on equality of educational opportunity
Juvenile Court Statistics - 1972.
Office of Youth Development (DHEW), Washington, DC.
This report is a statistical study of juvenile court cases in 1972. The data demonstrates how the court is frequently utilized in dealing with juvenile delinquency by the police as well as by other community agencies and parents. Excluded from this report are the ordinary traffic cases handled by juvenile court. The data indicate that: (1) in…
Juvenile Court Statistics, 1974.
Corbett, Jacqueline; Vereb, Thomas S.
This report presents information on juvenile court processing of youth in the U.S. during 1974. It is based on data gathered under the National Juvenile Court Statistical Reporting System. Findings can be summarized as follows: (1) 1,252,700 juvenile delinquency cases, excluding traffic offenses, were handled by courts in the U.S. in 1974; (2) the…
International Nuclear Information System (INIS)
Kim, Jin Gyeong; Park, Jin Ho; Park, Hyeon Jin; Lee, Jae Jun; Jun, Whong Seok; Whang, Jin Su
2009-08-01
This book explains statistics for engineers using MATLAB. It covers the arrangement and summary of data, probability, probability distributions, sampling distributions, estimation, hypothesis testing, analysis of variance, regression analysis, categorical data analysis, quality assurance (including the concept of control charts, consecutive control charts, breakthrough strategy and analysis using MATLAB), reliability analysis (measurement of reliability and analysis with MATLAB), and Markov chains.
Thermodynamics for Fractal Statistics
da Cruz, Wellington
1998-01-01
We consider the thermodynamic properties of an anyon gas, taking into account the fractal statistics obtained by us recently. This approach describes the anyonic excitations in terms of equivalence classes labeled by a fractal parameter or Hausdorff dimension $h$. An exact equation of state is obtained in the high-temperature and low-temperature limits, for gases with a constant density of states.
Statistical Hadronization and Holography
DEFF Research Database (Denmark)
Bechi, Jacopo
2009-01-01
In this paper we consider some issues about the statistical model of the hadronization in a holographic approach. We introduce a Rindler like horizon in the bulk and we understand the string breaking as a tunneling event under this horizon. We calculate the hadron spectrum and we get a thermal...
Per Object statistical analysis
DEFF Research Database (Denmark)
2008-01-01
...of a specific class in turn, and uses a pair of PPO stages to derive the statistics and then assign them to the objects' Object Variables. It may be that this could all be done in some other, simpler way, but several other ways that were tried did not succeed. The procedure output has been tested against...
Beyond quantum microcanonical statistics
International Nuclear Information System (INIS)
Fresch, Barbara; Moro, Giorgio J.
2011-01-01
Descriptions of molecular systems usually refer to two distinct theoretical frameworks. On the one hand the quantum pure state, i.e., the wavefunction, of an isolated system is determined to calculate molecular properties and their time evolution according to the unitary Schroedinger equation. On the other hand a mixed state, i.e., a statistical density matrix, is the standard formalism to account for thermal equilibrium, as postulated in the microcanonical quantum statistics. In the present paper an alternative treatment relying on a statistical analysis of the possible wavefunctions of an isolated system is presented. In analogy with the classical ergodic theory, the time evolution of the wavefunction determines the probability distribution in the phase space pertaining to an isolated system. However, this alone cannot account for a well defined thermodynamical description of the system in the macroscopic limit, unless a suitable probability distribution for the quantum constants of motion is introduced. We present a workable formalism assuring the emergence of typical values of thermodynamic functions, such as the internal energy and the entropy, in the large size limit of the system. This allows the identification of macroscopic properties independently of the specific realization of the quantum state. A description of material systems in agreement with equilibrium thermodynamics is then derived without constraints on the physical constituents and interactions of the system. Furthermore, the canonical statistics is recovered in all generality for the reduced density matrix of a subsystem.
Statistical Analysis and validation
Hoefsloot, H.C.J.; Horvatovich, P.; Bischoff, R.
2013-01-01
In this chapter guidelines are given for the selection of a few biomarker candidates from a large number of compounds with a relative low number of samples. The main concepts concerning the statistical validation of the search for biomarkers are discussed. These complicated methods and concepts are
DEFF Research Database (Denmark)
Larsen, Gunner Chr.; Bierbooms, W.; Hansen, Kurt Schaldemose
2003-01-01
A theoretical expression for the probability density function associated with local extremes of a stochastic process is presented. The expression is based on the lower four statistical moments and a bandwidth parameter. The theoretical expression is subsequently verified by comparison with simulated...
Kleibergen, F.R.; Kleijn, R.; Paap, R.
2000-01-01
We propose a novel Bayesian test under a (noninformative) Jeffreys' prior specification. We check whether the fixed scalar value of the so-called Bayesian Score Statistic (BSS) under the null hypothesis is a plausible realization from its known and standardized distribution under the alternative. Unlike
Topics in Statistical Calibration
2014-03-27
Natural cubic spline ... B.2 The calibrate function ... The most basic calibration problem, the one often encountered in more advanced ... A. M. Mood, F. A. Graybill, and D. C. Boes. Introduction to the Theory of Statistics. McGraw-Hill, Auckland, 1974.
On quantum statistical inference
Barndorff-Nielsen, O.E.; Gill, R.D.; Jupp, P.E.
2001-01-01
Recent developments in the mathematical foundations of quantum mechanics have brought the theory closer to that of classical probability and statistics. On the other hand, the unique character of quantum physics sets many of the questions addressed apart from those met classically in stochastics.
Statistical mechanics of solitons
International Nuclear Information System (INIS)
Bishop, A.
1980-01-01
The status of statistical mechanics theory (classical and quantum, statics and dynamics) is reviewed for 1-D soliton or solitary-wave-bearing systems. Primary attention is given to (i) perspective for existing results with evaluation and representative literature guide; (ii) motivation and status report for remaining problems; (iii) discussion of connections with other 1-D topics
Statistics for Learning Genetics
Charles, Abigail Sheena
This study investigated the knowledge and skills that biology students may need to help them understand statistics/mathematics as it applies to genetics. The data are based on analyses of current representative genetics texts, practicing genetics professors' perspectives, and, more directly, students' perceptions of, and performance in, doing statistically-based genetics problems. This issue is at the emerging edge of modern college-level genetics instruction, and this study attempts to identify key theoretical components for creating a specialized biological statistics curriculum. The goal of this curriculum will be to prepare biology students with the skills for assimilating quantitatively-based genetic processes, increasingly at the forefront of modern genetics. To fulfill this, two college-level classes at two universities were surveyed. One university was located in the northeastern US and the other in the West Indies. There was a sample size of 42 students, and a supplementary interview was administered to a select 9 students. Interviews were also administered to professors in the field in order to gain insight into the teaching of statistics in genetics. Key findings indicated that students had very little to no background in statistics (55%). Although students did perform well on exams, with 60% of the population receiving an A or B grade, 77% of them did not offer good explanations on a probability question associated with the normal distribution provided in the survey. The scope and presentation of the applicable statistics/mathematics in some of the most used textbooks in genetics teaching, as well as in the genetics syllabi used by instructors, do not help the issue. It was found that the textbooks oftentimes either did not give effective explanations for students, or completely left out certain topics. The omission of certain statistical/mathematical topics was also true of the genetics syllabi reviewed for this study. Nonetheless
Statistical microeconomics and commodity prices: theory and empirical results.
Baaquie, Belal E
2016-01-13
A review is made of the statistical generalization of microeconomics by Baaquie (Baaquie 2013 Phys. A 392, 4400-4416. (doi:10.1016/j.physa.2013.05.008)), where the market price of every traded commodity, at each instant of time, is considered to be an independent random variable. The dynamics of commodity market prices is given by the unequal time correlation function and is modelled by the Feynman path integral based on an action functional. The correlation functions of the model are defined using the path integral. The existence of the action functional for commodity prices that was postulated in Baaquie (2013) has been empirically ascertained in Baaquie et al. (2015 Phys. A 428, 19-37. (doi:10.1016/j.physa.2015.02.030)). The model's action functionals for different commodities have been empirically determined and calibrated using the unequal time correlation functions of the market commodity prices, using a perturbation expansion (Baaquie et al. 2015). Nine commodities drawn from the energy, metal and grain sectors are empirically studied, and their auto-correlation for up to 300 days is described by the model to an accuracy of R² > 0.90, using only six parameters. © 2015 The Author(s).
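On the empirical side, the unequal time correlation functions used for calibration are sample auto-correlations of price series. A minimal sketch of such an estimate (illustrative only: the AR(1) toy series stands in for a real commodity record, and nothing here reproduces the paper's path-integral model):

```python
import numpy as np

def autocorrelation(prices, max_lag):
    """Sample auto-correlation of a 1-D series for lags 0..max_lag."""
    x = np.asarray(prices, dtype=float)
    x = x - x.mean()
    var = np.dot(x, x) / len(x)
    # Biased estimator: divide every lag by the full length and variance.
    return np.array([np.dot(x[:len(x) - k], x[k:]) / (len(x) * var)
                     for k in range(max_lag + 1)])

rng = np.random.default_rng(0)
# AR(1) toy series with coefficient 0.9, standing in for a price record.
x = np.zeros(5000)
for t in range(1, 5000):
    x[t] = 0.9 * x[t - 1] + rng.normal()
acf = autocorrelation(x, 10)
# acf[0] is exactly 1 by construction; acf decays roughly like 0.9**k
```

A fitted model such as the one reviewed here would then be judged by how well its predicted correlation function tracks `acf` over the lags of interest.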
Statistical properties of turbulence in a toroidal magnetized ECR plasma
International Nuclear Information System (INIS)
Yu Yi; Lu Ronghua; Wang Zhijiang; Wen Yizhi; Yu Changxuan; Wan Shude; Liu, Wandong
2008-01-01
The statistical analyses of fluctuation data measured by electrostatic-probe arrays clearly show that self-organized criticality (SOC) avalanches are not the dominant behavior in a toroidal ECR plasma in the SMT (Simple Magnetic Torus) mode of the KT-5D device. The f^{-1} region in the auto-correlation spectra of the floating potential V_f and the ion saturation current I_s, which is a fingerprint of a SOC system, spans only a narrow frequency band. By investigating the Hurst exponents of increasingly coarse-grained time series, we find that at time scales τ > 100 μs there exists no, or only a very weak, long-range correlation over two decades in τ. The difference between the PDFs of I_s and V_f clearly shows the more global nature of the latter. The transport flux induced by the turbulence suggests that the natural intermittency of turbulent transport may be independent of the avalanches induced by near-criticality. The drift instability is dominant in a SMT plasma generated by means of ECR discharges.
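The Hurst-exponent analysis mentioned above is commonly performed with a rescaled-range (R/S) estimator; the following is an illustrative implementation (not the authors' code, and the window sizes are my choice), where H near 0.5 signals the absence of long-range correlation:

```python
import numpy as np

def hurst_rs(series, window_sizes):
    """Estimate the Hurst exponent by rescaled-range (R/S) analysis."""
    x = np.asarray(series, dtype=float)
    log_n, log_rs = [], []
    for n in window_sizes:
        rs_vals = []
        for start in range(0, len(x) - n + 1, n):   # non-overlapping windows
            w = x[start:start + n]
            z = np.cumsum(w - w.mean())              # cumulative deviation
            r = z.max() - z.min()                    # range of the deviation
            s = w.std()
            if s > 0:
                rs_vals.append(r / s)
        log_n.append(np.log(n))
        log_rs.append(np.log(np.mean(rs_vals)))
    # The slope of log(R/S) versus log(n) estimates the Hurst exponent.
    return np.polyfit(log_n, log_rs, 1)[0]

rng = np.random.default_rng(1)
h = hurst_rs(rng.normal(size=20000), [16, 32, 64, 128, 256, 512])
# uncorrelated white noise should give an estimate in the vicinity of 0.5
```

Long-range-correlated signals push H well above 0.5 even after coarse-graining, which is the effect the abstract reports not finding at τ > 100 μs.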
Statistical Analysis of Environmental Tritium around Wolsong Site
Energy Technology Data Exchange (ETDEWEB)
Kim, Ju Youl [FNC Technology Co., Yongin (Korea, Republic of)
2010-04-15
To find the relationship among airborne tritium, tritium in rainwater, TFWT (Tissue Free Water Tritium) and TBT (Tissue Bound Tritium), statistical analysis was conducted on tritium data measured at KHNP employees' houses around the Wolsong nuclear power plants during the 10 years from 1999 to 2008. The results show that tritium in these media exhibits strong seasonal and annual periodicity. Tritium concentration in rainwater is observed to be highly correlated with TFWT and is transmitted directly to TFWT without delay. The response of environmental tritium radioactivity around the Wolsong site is analyzed using time-series techniques and non-parametric trend analysis. Tritium in the atmosphere and rainwater is strongly auto-correlated, with seasonal and annual periodicity. The TFWT concentration in pine needles proves to be more sensitive to rainfall than to other weather variables. Non-parametric trend analysis of the TFWT concentration in pine needles shows an increasing slope at the 95% confidence level. This study demonstrates the usefulness of time-series and trend analysis for interpreting the relationship of environmental radioactivity with various environmental media.
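The abstract does not name its non-parametric trend test, but such analyses are typically done with the Mann-Kendall test; the following is a minimal sketch under that assumption (ties are ignored, and the sample series is invented for illustration):

```python
import math

def mann_kendall(series):
    """Mann-Kendall trend test: returns the S statistic and a normal z-score.

    Positive z indicates an increasing trend; |z| > 1.96 is significant at
    the 95% confidence level. Ties are ignored in this sketch.
    """
    n = len(series)
    # S counts concordant minus discordant pairs over all i < j.
    s = sum(
        (series[j] > series[i]) - (series[j] < series[i])
        for i in range(n - 1)
        for j in range(i + 1, n)
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

s, z = mann_kendall([1.0, 2.1, 1.9, 3.2, 3.8, 4.5, 5.1, 6.0, 6.3, 7.2])
# a clearly increasing series: z lands well above the 1.96 threshold
```

Because the test uses only pairwise orderings, it needs no distributional assumption about the concentration data, which is why it suits environmental monitoring series like these.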
Software Used to Generate Cancer Statistics - SEER Cancer Statistics
Videos that highlight topics and trends in cancer statistics and definitions of statistical terms. Also software tools for analyzing and reporting cancer statistics, which are used to compile SEER's annual reports.
Maccone, C.
In this paper the statistical generalization of the Fermi paradox is provided. The statistics of habitable planets may be based on a set of ten (and possibly more) astrobiological requirements first pointed out by Stephen H. Dole in his book Habitable Planets for Man (1964). The statistical generalization of the original, and by now too simplistic, Dole equation is provided by replacing a product of ten positive numbers by the product of ten positive random variables. This is denoted the SEH, an acronym standing for “Statistical Equation for Habitables”. The proof in this paper is based on the Central Limit Theorem (CLT) of statistics, stating that the sum of any number of independent random variables, each of which may be arbitrarily distributed, approaches a Gaussian (i.e. normal) random variable (Lyapunov form of the CLT). It is then shown that: 1. The new random variable NHab, yielding the number of habitables (i.e. habitable planets) in the Galaxy, follows the log-normal distribution. By construction, the mean value of this log-normal distribution is the total number of habitable planets as given by the statistical Dole equation. 2. The ten (or more) astrobiological factors are now positive random variables. The probability distribution of each random variable may be arbitrary. The CLT in the so-called Lyapunov or Lindeberg forms (both of which do not assume the factors to be identically distributed) allows for that. In other words, the CLT "translates" into the SEH by allowing an arbitrary probability distribution for each factor. This is both astrobiologically realistic and useful for any further investigations. 3. By applying the SEH it is shown that the (average) distance between any two nearby habitable planets in the Galaxy is inversely proportional to the cubic root of NHab. This distance is denoted by the new random variable D. The relevant probability density function is derived, which was named the "Maccone distribution" by Paul Davies in
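The central claim, that a product of many independent positive random factors tends toward a log-normal, can be checked numerically. The following simulation is illustrative only (the uniform factor distribution, sample sizes and seed are my choices, not Maccone's): the log of the product is a sum of logs, which the CLT drives toward a Gaussian, so its skewness should be small compared with the heavy right skew of the products themselves.

```python
import numpy as np

rng = np.random.default_rng(42)

# Ten positive, arbitrarily (here uniformly) distributed factors per trial,
# mimicking the ten astrobiological factors of the SEH.
factors = rng.uniform(0.5, 2.0, size=(100_000, 10))
products = factors.prod(axis=1)

def skewness(x):
    """Sample skewness (third standardized central moment)."""
    x = np.asarray(x, dtype=float)
    return ((x - x.mean()) ** 3).mean() / x.std() ** 3

# log(product) = sum of logs -> approximately Gaussian by the CLT,
# so its skewness is far smaller than that of the products themselves.
skew_logs = skewness(np.log(products))
skew_products = skewness(products)
```

Increasing the number of factors per trial shrinks `skew_logs` further (roughly as one over the square root of the factor count), which is exactly the Lyapunov/Lindeberg convergence the paper invokes.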
International Nuclear Information System (INIS)
Venkataraman, G.
1992-01-01
Treating the radiation gas as a classical gas, Einstein derived Planck's law of radiation by considering the dynamic equilibrium between atoms and radiation. Dissatisfied with this treatment, S.N. Bose derived Planck's law in another, original way. He treated the problem in full generality: he counted how many cells were available for the photon gas in phase space and distributed the photons into these cells. In this manner of distribution there were three radically new ideas: the indistinguishability of particles, the spin of the photon (with only two possible orientations) and the non-conservation of photon number. This gave rise to the new discipline of quantum statistical mechanics. The physics underlying Bose's discovery, its significance and its role in the development of the concept of the ideal gas, the spin-statistics theorem and spin particles are described. The book has been written in simple and direct language, in an informal style, aiming to stimulate the curiosity of the reader. (M.G.B.)
Genton, Marc G.
2015-04-14
This paper explores the use of visualization through animations, coined visuanimation, in the field of statistics. In particular, it illustrates the embedding of animations in the paper itself and the storage of larger movies in the online supplemental material. We present results from statistics research projects using a variety of visuanimations, ranging from exploratory data analysis of image data sets to spatio-temporal extreme event modelling; these include a multiscale analysis of classification methods, the study of the effects of a simulated explosive volcanic eruption and an emulation of climate model output. This paper serves as an illustration of visuanimation for future publications in Stat. Copyright © 2015 John Wiley & Sons, Ltd.
Fermions from classical statistics
International Nuclear Information System (INIS)
Wetterich, C.
2010-01-01
We describe fermions in terms of a classical statistical ensemble. The states τ of this ensemble are characterized by a sequence of values one or zero or a corresponding set of two-level observables. Every classical probability distribution can be associated to a quantum state for fermions. If the time evolution of the classical probabilities p_τ amounts to a rotation of the wave function q_τ(t) = ±√(p_τ(t)), we infer the unitary time evolution of a quantum system of fermions according to a Schroedinger equation. We establish how such classical statistical ensembles can be mapped to Grassmann functional integrals. Quantum field theories for fermions arise for a suitable time evolution of classical probabilities for generalized Ising models.
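The key consistency requirement, that a rotation of the wave function q_τ = ±√(p_τ) again yields valid classical probabilities, is easy to verify numerically. The sketch below is a generic illustration (a random orthogonal matrix standing in for one hypothetical time-evolution step, with all signs chosen +1), not the paper's construction:

```python
import numpy as np

rng = np.random.default_rng(1)

# Classical probabilities p_tau over 2^n two-level states (here n = 2 -> 4 states)
p = rng.random(4)
p /= p.sum()

# Wave function q_tau = sqrt(p_tau), taking the + sign throughout for the sketch
q = np.sqrt(p)

# A rotation of the wave function: an orthogonal matrix obtained from the QR
# decomposition of a random matrix (a stand-in for one evolution step)
O, _ = np.linalg.qr(rng.standard_normal((4, 4)))
q_t = O @ q

# The rotated amplitudes again define classical probabilities p_tau(t) = q_tau(t)^2;
# rotations preserve the norm, so the probabilities still sum to one
p_t = q_t ** 2
print(p_t.sum())
```

Because rotations preserve the Euclidean norm of q, the squared amplitudes remain nonnegative and normalized, which is what lets a classical probabilistic time evolution be read as unitary quantum evolution.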
Applied statistical thermodynamics
Lucas, Klaus
1991-01-01
The book guides the reader from the foundations of statistical thermodynamics, including the theory of intermolecular forces, to modern computer-aided applications in chemical engineering and physical chemistry. The approach is new. The foundations of quantum and statistical mechanics are presented in a simple way and their applications to the prediction of fluid phase behavior of real systems are demonstrated. A particular effort is made to introduce the reader to explicit formulations of intermolecular interaction models and to show how these models influence the properties of fluid systems. The established methods of statistical mechanics - computer simulation, perturbation theory, and numerical integration - are discussed in a style appropriate for newcomers and are extensively applied. Numerous worked examples illustrate how practical calculations should be carried out.
Diffeomorphic Statistical Deformation Models
DEFF Research Database (Denmark)
Hansen, Michael Sass; Hansen, Mads Fogtmann; Larsen, Rasmus
2007-01-01
In this paper we present a new method for constructing diffeomorphic statistical deformation models in arbitrary dimensional images with a nonlinear generative model and a linear parameter space. Our deformation model is a modified version of the diffeomorphic model introduced by Cootes et al. ... The modifications ensure that no boundary restriction has to be enforced on the parameter space to prevent folds or tears in the deformation field. For straightforward statistical analysis, principal component analysis and sparse methods, we assume that the parameters for a class of deformations lie on a linear ... with ground truth in form of manual expert annotations, and compared to Cootes's model. We anticipate applications in unconstrained diffeomorphic synthesis of images, e.g. for tracking, segmentation, registration or classification purposes.
Statistics for experimentalists
Cooper, B E
2014-01-01
Statistics for Experimentalists aims to provide experimental scientists with a working knowledge of statistical methods and research approaches to the analysis of data. The book first elaborates on probability and continuous probability distributions. Discussions focus on properties of continuous random variables and normal variables, independence of two random variables, central moments of a continuous distribution, prediction from a normal distribution, binomial probabilities, and multiplication of probabilities and independence. The text then examines estimation and tests of significance. Topics include estimators and estimates, expected values, minimum variance linear unbiased estimators, sufficient estimators, methods of maximum likelihood and least squares, and the test of significance method. The manuscript ponders on distribution-free tests, Poisson process and counting problems, correlation and function fitting, balanced incomplete randomized block designs and the analysis of covariance, and experiment...
Statistical theory and inference
Olive, David J
2014-01-01
This text is for a one semester graduate course in statistical theory and covers minimal and complete sufficient statistics, maximum likelihood estimators, method of moments, bias and mean square error, uniform minimum variance estimators and the Cramer-Rao lower bound, an introduction to large sample theory, likelihood ratio tests and uniformly most powerful tests and the Neyman-Pearson Lemma. A major goal of this text is to make these topics much more accessible to students by using the theory of exponential families. Exponential families, indicator functions and the support of the distribution are used throughout the text to simplify the theory. More than 50 "brand name" distributions are used to illustrate the theory with many examples of exponential families, maximum likelihood estimators and uniformly minimum variance unbiased estimators. There are many homework problems with over 30 pages of solutions.
Classical and statistical thermodynamics
Rizk, Hanna A
2016-01-01
This is a textbook of thermodynamics for the student who seeks thorough training in science or engineering. The emphasis is on a systematic and thorough treatment of the fundamental principles rather than on presenting a large mass of facts. The book includes some of the historical and humanistic background of thermodynamics, but without affecting the continuity of the analytical treatment. For a clearer and more profound understanding of thermodynamics this book is highly recommended. In this respect, the author believes that a sound grounding in classical thermodynamics is an essential prerequisite for the understanding of statistical thermodynamics. A single book comprising these two wide branches of thermodynamics is in fact unprecedented. As a work dealing systematically with the two main branches, namely classical thermodynamics and statistical thermodynamics, together with some important indexes under one cover, this treatise is eminently useful.
A Separation Algorithm for Sources with Temporal Structure Only Using Second-order Statistics
Directory of Open Access Journals (Sweden)
J.G. Wang
2013-09-01
Unlike conventional blind source separation (BSS), which deals with independent identically distributed (i.i.d.) sources, this paper addresses the separation of mixtures of sources with temporal structure, such as linear autocorrelations. Many sequential extraction algorithms have been reported, but the deflation scheme they rely on inevitably accumulates errors. We propose a robust separation algorithm that recovers the original sources simultaneously, through a joint diagonalizer of several averaged delayed covariance matrices at positions of the optimal time delay and its integer multiples. The proposed algorithm is computationally simple and efficient, since it is based on second-order statistics only. Extensive simulation results confirm the validity and high performance of the algorithm. Compared with related extraction algorithms, its separation signal-to-noise ratio for a desired source can be up to 20 dB higher, and it is rather insensitive to estimation error in the time delay.
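The core idea, that second-order statistics at nonzero lags suffice to separate temporally structured sources, can be sketched with a simplified AMUSE-style procedure: whiten with the zero-lag covariance, then eigendecompose a single symmetrized delayed covariance. This is an illustration of the general principle, not the authors' joint-diagonalization algorithm; the sources, mixing matrix and delay below are all hypothetical.

```python
import numpy as np

n = 5000
t = np.arange(n)

# Two toy sources with distinct temporal structure (different autocorrelations)
s = np.vstack([np.sin(2 * np.pi * 0.01 * t),
               np.sign(np.sin(2 * np.pi * 0.003 * t))])
A = np.array([[1.0, 0.6],
              [0.5, 1.0]])                 # mixing matrix (unknown in practice)
x = A @ s                                  # observed mixtures

# Step 1: whiten the mixtures using the zero-lag covariance
x = x - x.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(x @ x.T / n)
W = E @ np.diag(d ** -0.5) @ E.T
z = W @ x

# Step 2: eigendecompose one symmetrized delayed covariance; its eigenvectors
# separate sources whose lag-tau autocorrelations differ
tau = 50                                   # a hypothetical time delay
Ct = z[:, :-tau] @ z[:, tau:].T / (n - tau)
_, V = np.linalg.eigh((Ct + Ct.T) / 2)
y = V.T @ z                                # recovered sources (up to sign/order)

# Each recovered signal should match one true source almost perfectly
corr = np.abs(np.corrcoef(np.vstack([y, s]))[:2, 2:])
print(np.round(corr, 2))
```

Using several delays jointly, as the paper proposes, makes the method robust when a single delay happens to give nearly equal autocorrelations for two sources.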
Irregular Liesegang-type patterns in gas phase revisited. II. Statistical correlation analysis
Torres-Guzmán, José C.; Martínez-Mekler, Gustavo; Müller, Markus F.
2016-05-01
We present a statistical analysis of Liesegang-type patterns formed in a gaseous HCl-NH3 system by ammonium chloride precipitation along glass tubes, as described in Paper I [J. C. Torres-Guzmán et al., J. Chem. Phys. 144, 174701 (2016)] of this work. We focus on the detection and characterization of short- and long-range correlations within the non-stationary sequence of apparently irregular precipitation bands. To this end we applied several techniques, stemming from different fields, to estimate spatial correlations: linear auto-correlation via the power spectral density, detrended fluctuation analysis (DFA), and methods developed in the context of random matrix theory (RMT). In particular, the RMT methods disclose well pronounced long-range correlations over at least 40 bands in terms of both band positions and intensity values. By using a variant of the DFA we furnish proof of the nonlinear nature of the detected long-range correlations.
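Of the techniques listed, detrended fluctuation analysis is the easiest to sketch. The minimal DFA-1 implementation below is a generic illustration (not the paper's variant): applied to an uncorrelated test signal it recovers the expected scaling exponent α ≈ 0.5, whereas α > 0.5 would indicate persistent long-range correlations of the kind reported for the band sequences.

```python
import numpy as np

def dfa(signal, scales):
    """DFA-1: fluctuation F(s) for each window size s."""
    profile = np.cumsum(signal - np.mean(signal))   # integrated profile
    flucts = []
    for s in scales:
        n_seg = len(profile) // s
        segs = profile[:n_seg * s].reshape(n_seg, s)
        x = np.arange(s)
        ms = []
        for seg in segs:
            coef = np.polyfit(x, seg, 1)            # linear detrend per window
            ms.append(np.mean((seg - np.polyval(coef, x)) ** 2))
        flucts.append(np.sqrt(np.mean(ms)))
    return np.array(flucts)

rng = np.random.default_rng(3)
sig = rng.standard_normal(4096)                     # uncorrelated test signal
scales = np.array([16, 32, 64, 128, 256])
F = dfa(sig, scales)

# Scaling exponent alpha = slope of log F(s) vs log s
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
print(f"alpha = {alpha:.2f}")
```

Replacing the test signal with the sequence of band positions or intensities, after removal of the non-stationary trend, is the kind of analysis the paper carries out with its DFA variant.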
Evaluating statistical cloud schemes
Grützun, Verena; Quaas, Johannes; Morcrette, Cyril J.; Ament, Felix
2015-01-01
Statistical cloud schemes with prognostic probability distribution functions have become more important in atmospheric modeling, especially since they are in principle scale adaptive and capture cloud physics in more detail. While in theory the schemes have a great potential, their accuracy is still questionable. High-resolution three-dimensional observational data of water vapor and cloud water, which could be used for testing them, are missing. We explore the potential of ground-based re...
1979 DOE statistical symposium
Energy Technology Data Exchange (ETDEWEB)
Gardiner, D.A.; Truett, T. (comps. and eds.)
1980-09-01
The 1979 DOE Statistical Symposium was the fifth in the series of annual symposia designed to bring together statisticians and other interested parties who are actively engaged in helping to solve the nation's energy problems. The program included presentations of technical papers centered around exploration and disposal of nuclear fuel, general energy-related topics, and health-related issues, and workshops on model evaluation, risk analysis, analysis of large data sets, and resource estimation.
Milewski, Emil G
2012-01-01
REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams and doing homework, and will remain a lasting reference source for students, teachers, and professionals. Topics covered in Statistics I include frequency distributions, numerical methods of describing data, measures of variability, parameters of distributions, probability theory, and distributions.