Generalized quantum statistics
International Nuclear Information System (INIS)
Chou, C.
1992-01-01
In this paper, a non-anyonic generalization of quantum statistics is presented, in which Fermi-Dirac statistics (FDS) and Bose-Einstein statistics (BES) appear as two special cases. The new quantum statistics, which is characterized by the dimension of its single particle Fock space, contains three consistent parts, namely the generalized bilinear quantization, the generalized quantum mechanical description and the corresponding statistical mechanics.
Generalized Statistical Mechanics at the Onset of Chaos
Directory of Open Access Journals (Sweden)
Alberto Robledo
2013-11-01
Transitions to chaos in archetypal low-dimensional nonlinear maps offer real and precise model systems in which to assess proposed generalizations of statistical mechanics. The known association of chaotic dynamics with the structure of Boltzmann–Gibbs (BG) statistical mechanics has suggested the potential verification of these generalizations at the onset of chaos, when the only Lyapunov exponent vanishes and ergodic and mixing properties cease to hold. There are three well-known routes to chaos in these deterministic dissipative systems, period-doubling, quasi-periodicity and intermittency, which provide the setting in which to explore the limit of validity of the standard BG structure. It has been shown that there is a rich and intricate behavior for both the dynamics within and towards the attractors at the onset of chaos and that these two kinds of properties are linked via generalized statistical-mechanical expressions. Amongst the topics presented are: (i) permanently growing sensitivity fluctuations and their infinite family of generalized Pesin identities; (ii) the emergence of statistical-mechanical structures in the dynamics along the routes to chaos; (iii) dynamical hierarchies with modular organization; and (iv) limit distributions of sums of deterministic variables. The occurrence of generalized entropy properties in condensed-matter physical systems is illustrated by considering critical fluctuations, localization transition and glass formation. We complete our presentation with the description of the manifestations of the dynamics at the transitions to chaos in various kinds of complex systems, such as frequency and size rank distributions and complex network images of time series. We discuss the results.
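As a concrete backdrop for the vanishing Lyapunov exponent mentioned above, the sketch below (not from the paper; the function name, burn-in length and iteration count are our choices) estimates the Lyapunov exponent of the logistic map and shows it changing sign across the period-doubling accumulation point r_c ≈ 3.5699456, the canonical onset of chaos.

```python
import numpy as np

def lyapunov_logistic(r, n_iter=100_000, n_burn=1_000, x0=0.5):
    """Average log-derivative along an orbit of x -> r*x*(1-x)."""
    x = x0
    for _ in range(n_burn):                      # discard the transient
        x = r * x * (1.0 - x)
    acc = 0.0
    for _ in range(n_iter):
        x = r * x * (1.0 - x)
        acc += np.log(abs(r * (1.0 - 2.0 * x)) + 1e-300)  # guard against log(0)
    return acc / n_iter

r_c = 3.5699456                                  # period-doubling accumulation point
for r in (3.5, r_c, 3.8):
    print(f"r = {r:.7f}: lambda ~ {lyapunov_logistic(r):+.4f}")
# expected: negative (periodic), ~0 at r_c (onset of chaos), positive (chaotic)
```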
A Concise Introduction to the Statistical Physics of Complex Systems
Bertin, Eric
2012-01-01
This concise primer (based on lectures given at summer schools on complex systems and on a master's degree course in complex systems modeling) will provide graduate students and newcomers to the field with the basic knowledge of the concepts and methods of statistical physics and its potential for application to interdisciplinary topics. Indeed, in recent years, statistical physics has begun to attract the interest of a broad community of researchers in the field of complex system sciences, ranging from biology to the social sciences, economics and computer science. More generally, a growing number of graduate students and researchers feel the need to learn some basic concepts and questions originating in other disciplines without necessarily having to master all of the corresponding technicalities and jargon. Generally speaking, the goals of statistical physics may be summarized as follows: on the one hand to study systems composed of a large number of interacting ‘entities’, and on the other to predict...
Statistical physics of complex systems a concise introduction
Bertin, Eric
2016-01-01
This course-tested primer provides graduate students and non-specialists with a basic understanding of the concepts and methods of statistical physics and demonstrates their wide range of applications to interdisciplinary topics in the field of complex system sciences, including selected aspects of theoretical modeling in biology and the social sciences. Generally speaking, the goals of statistical physics may be summarized as follows: on the one hand to study systems composed of a large number of interacting units, and on the other to predict the macroscopic, collective behavior of the system considered from the perspective of the microscopic laws governing the dynamics of the individual entities. These two goals are essentially also shared by what is now called 'complex systems science', and as such, systems studied in the framework of statistical physics may be considered to be among the simplest examples of complex systems – while also offering a rather well developed mathematical treatment. The second ...
Generalized $L$-, $M$-, and $R$-Statistics
Serfling, Robert J.
1984-01-01
A class of statistics generalizing $U$-statistics and $L$-statistics, and containing other varieties of statistic as well, such as trimmed $U$-statistics, is studied. Using the differentiable statistical function approach, differential approximations are obtained and the influence curves of these generalized $L$-statistics are derived. These results are employed to establish asymptotic normality for such statistics. Parallel generalizations of $M$- and $R$-statistics are noted. Strong converg...
A Statistical Evaluation of Atmosphere-Ocean General Circulation Models: Complexity vs. Simplicity
Robert K. Kaufmann; David I. Stern
2004-01-01
The principal tools used to model future climate change are General Circulation Models which are deterministic high resolution bottom-up models of the global atmosphere-ocean system that require large amounts of supercomputer time to generate results. But are these models a cost-effective way of predicting future climate change at the global level? In this paper we use modern econometric techniques to evaluate the statistical adequacy of three general circulation models (GCMs) by testing thre...
A generalization of Friedman's rank statistic
Kroon, de J.; Laan, van der P.
1983-01-01
In this paper a very natural generalization of the two-way analysis of variance rank statistic of Friedman is given. The general distribution-free test procedure based on this statistic for the effect of J treatments in a random block design can be applied in general two-way layouts without
Statistical physics of networks, information and complex systems
Energy Technology Data Exchange (ETDEWEB)
Ecke, Robert E [Los Alamos National Laboratory
2009-01-01
In this project we explore the mathematical methods and concepts of statistical physics that are finding abundant applications across the scientific and technological spectrum from soft condensed matter systems and bio-informatics to economic and social systems. Our approach exploits the considerable similarity of concepts between statistical physics and computer science, allowing for a powerful multi-disciplinary approach that draws its strength from cross-fertilization and multiple interactions of researchers with different backgrounds. The work on this project takes advantage of the newly appreciated connection between computer science and statistics and addresses important problems in data storage, decoding, optimization, the information processing properties of the brain, the interface between quantum and classical information science, the verification of large software programs, modeling of complex systems including disease epidemiology, resource distribution issues, and the nature of highly fluctuating complex systems. Common themes that the project has been emphasizing are (i) neural computation, (ii) network theory and its applications, and (iii) a statistical physics approach to information theory. The project's efforts focus on the general problem of optimization and variational techniques, algorithm development and information theoretic approaches to quantum systems. These efforts are responsible for fruitful collaborations and the nucleation of science efforts that span multiple divisions such as EES, CCS, D, T, ISR and P. This project supports the DOE mission in Energy Security and Nuclear Non-Proliferation by developing novel information science tools for communication, sensing, and interacting complex networks such as the internet or energy distribution system. The work also supports programs in Threat Reduction and Homeland Security.
Generalized interpolative quantum statistics
International Nuclear Information System (INIS)
Ramanathan, R.
1992-01-01
A generalized interpolative quantum statistics is presented by conjecturing a certain reordering of phase space due to the presence of possible exotic objects other than bosons and fermions. Such an interpolation achieved through a Bose-counting strategy predicts the existence of an infinite quantum Boltzmann-Gibbs statistics akin to the one discovered by Greenberg recently.
Polychronakos fractional statistics with a complex-valued parameter
International Nuclear Information System (INIS)
Rovenchak, Andrij
2012-01-01
A generalization of quantum statistics is proposed in a fashion similar to the suggestion of Polychronakos [Phys. Lett. B 365, 202 (1996)], with the parameter α varying between −1 (fermionic case) and +1 (bosonic case). However, unlike the original formulation, it is suggested that intermediate values are located on the unit circle in the complex plane. In doing so one can avoid the case α = 0 corresponding to Boltzmann statistics, which is not a quantum one. The limits α → +1 and α → −1, reproducing small deviations from the Bose and Fermi statistics, respectively, are studied in detail. The equivalence between the statistics parameter and a possible dissipative part of the excitation spectrum is established. The case of a non-conserved number of excitations is analyzed; the number of excitations is defined from the condition that the real part of the chemical potential equals zero. Thermodynamic quantities of a model system of two-dimensional harmonic oscillators are calculated.
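The statistical content of such an interpolation can be made concrete with a toy evaluation of Polychronakos-type occupation numbers, n(ε) = 1/(e^{(ε−μ)/T} − α). Placing α = e^{iφ} on the unit circle, with φ = π giving the Fermi case and φ = 0 the Bose case, is our illustrative reading of the abstract, not the paper's exact parameterization.

```python
import numpy as np

def occupation(eps, mu, T, alpha):
    """Polychronakos-type mean occupation n = 1/(exp((eps - mu)/T) - alpha).
    alpha = -1 gives Fermi-Dirac, alpha = +1 gives Bose-Einstein; a complex
    alpha on the unit circle interpolates while avoiding alpha = 0."""
    return 1.0 / (np.exp((eps - mu) / T) - alpha)

eps = np.linspace(0.0, 3.0, 4)
for phi in (np.pi, 0.75 * np.pi, 0.25 * np.pi, 0.0):
    alpha = np.exp(1j * phi)                     # hypothetical parameterization
    n = occupation(eps, mu=-0.1, T=1.0, alpha=alpha)
    print(f"phi/pi = {phi / np.pi:.2f}: Re n = {np.round(n.real, 3)}")
```

The imaginary part of n that appears for intermediate φ is the kind of quantity the paper relates to a dissipative part of the excitation spectrum.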
Generalized bond percolation and statistical mechanics
International Nuclear Information System (INIS)
Tsallis, C.
1978-05-01
A generalization of traditional bond percolation is performed, in the sense that bonds now have the possibility of partially transmitting the information (a fact which leads to the concept of 'fidelity' of the bond), and also in the sense that, besides the normal tendency to equiprobability, the bonds are allowed to substantially change the information. Furthermore, the fidelity is allowed to become a random variable, and the operational rules concerning the associated distribution laws are determined. Thermally quenched random bonds and the whole body of Statistical Mechanics become particular cases of this formalism, which is in general adapted to the treatment of all problems whose main characteristic is to preserve a part of the information through a long path or array (critical phenomena, regime changes, thermal random models, etc). Operationally it provides a quick method for the calculation of the equivalent probability of complex clusters within the traditional bond percolation problem.
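A minimal numerical illustration of the fidelity idea (our toy construction, not Tsallis's formalism): in ordinary bond percolation a bond transmits all or nothing, while a partially transmitting bond carries a random fidelity in [0, 1], and the information surviving a path is the product of the bond fidelities along it.

```python
import numpy as np

rng = np.random.default_rng(0)

def path_fidelity(n_bonds, p, partial=False):
    """Information surviving a chain of bonds.
    Standard percolation: each bond transmits fully (prob. p) or not at all.
    Partial transmission: each bond carries a random fidelity in [0, 1]."""
    if partial:
        f = rng.uniform(0.0, 1.0, n_bonds)        # fidelity as a random variable
    else:
        f = (rng.random(n_bonds) < p).astype(float)
    return f.prod()                               # fidelities multiply along a path

trials = 10_000
full = np.mean([path_fidelity(5, 0.9) for _ in range(trials)])
part = np.mean([path_fidelity(5, 0.9, partial=True) for _ in range(trials)])
print(f"0/1 bonds:     mean path fidelity {full:.3f} (exact 0.9**5 = {0.9**5:.3f})")
print(f"partial bonds: mean path fidelity {part:.3f} (exact 0.5**5 = {0.5**5:.3f})")
```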
A weighted generalized score statistic for comparison of predictive values of diagnostic tests.
Kosinski, Andrzej S
2013-03-15
Positive and negative predictive values are important measures of a medical diagnostic test performance. We consider testing equality of two positive or two negative predictive values within a paired design in which all patients receive two diagnostic tests. The existing statistical tests for testing equality of predictive values are either Wald tests based on the multinomial distribution or the empirical Wald and generalized score tests within the generalized estimating equations (GEE) framework. As presented in the literature, these test statistics have considerably complex formulas without clear intuitive insight. We propose their re-formulations that are mathematically equivalent but algebraically simple and intuitive. As is clearly seen with a new re-formulation we presented, the generalized score statistic does not always reduce to the commonly used score statistic in the independent samples case. To alleviate this, we introduce a weighted generalized score (WGS) test statistic that incorporates empirical covariance matrix with newly proposed weights. This statistic is simple to compute, always reduces to the score statistic in the independent samples situation, and preserves type I error better than the other statistics as demonstrated by simulations. Thus, we believe that the proposed WGS statistic is the preferred statistic for testing equality of two predictive values and for corresponding sample size computations. The new formulas of the Wald statistics may be useful for easy computation of confidence intervals for difference of predictive values. The introduced concepts have potential to lead to development of the WGS test statistic in a general GEE setting. Copyright © 2012 John Wiley & Sons, Ltd.
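For orientation, the snippet below sets up the paired design the abstract describes, with simulated data and both tests applied to every patient, and contrasts the two correlated PPVs with a simple bootstrap z-score. It is a baseline sketch only, not the proposed WGS statistic, and the prevalence and sensitivities are made-up numbers.

```python
import numpy as np

rng = np.random.default_rng(5)

# Paired design: every patient receives both diagnostic tests.
n = 500
D = rng.random(n) < 0.3                      # disease status (30% prevalence)
T1 = np.where(D, rng.random(n) < 0.85, rng.random(n) < 0.15)
T2 = np.where(D, rng.random(n) < 0.80, rng.random(n) < 0.10)

ppv = lambda t, d: d[t].mean()               # P(disease | test positive)
diff_obs = ppv(T1, D) - ppv(T2, D)

# Bootstrap patients (keeping pairs intact) to get a reference distribution
# for the difference of the two correlated predictive values.
boot = np.empty(2000)
for b in range(boot.size):
    idx = rng.integers(0, n, n)
    boot[b] = ppv(T1[idx], D[idx]) - ppv(T2[idx], D[idx])
print(f"PPV1 - PPV2 = {diff_obs:.3f}, bootstrap z = {diff_obs / boot.std(ddof=1):.2f}")
```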
Statistical electromagnetics: Complex cavities
Naus, H.W.L.
2008-01-01
A selection of the literature on the statistical description of electromagnetic fields and complex cavities is concisely reviewed. Some essential concepts, for example, the application of the central limit theorem and the maximum entropy principle, are scrutinized. Implicit assumptions, biased
On Generalized Stam Inequalities and Fisher–Rényi Complexity Measures
Directory of Open Access Journals (Sweden)
Steeve Zozor
2017-09-01
Information-theoretic inequalities play a fundamental role in numerous scientific and technological areas (e.g., estimation and communication theories, signal and information processing, quantum physics, …) as they generally express the impossibility of having a complete description of a system via a finite number of information measures. In particular, they gave rise to the design of various quantifiers (statistical complexity measures) of the internal complexity of a (quantum) system. In this paper, we introduce a three-parametric Fisher–Rényi complexity, named the (p, β, λ)-Fisher–Rényi complexity, based on both a two-parametric extension of the Fisher information and the Rényi entropies of a probability density function ρ characteristic of the system. This complexity measure quantifies the combined balance of the spreading and the gradient contents of ρ, and has the three main properties of a statistical complexity: invariance under translation and scaling transformations, and a universal bound from below. The latter is proved by generalizing the Stam inequality, which lower-bounds the product of the Shannon entropy power and the Fisher information of a probability density function. An extension of this inequality was already proposed by Bercher and Lutwak; it is a particular case of the general one, in which the three parameters are linked, allowing one to determine the sharp lower bound and the associated probability density with minimal complexity. Using the notion of differential-escort deformation, we are able to determine the sharp bound of the complexity measure even when the three parameters are decoupled (in a certain range). We determine as well the distribution that saturates the inequality: the (p, β, λ)-Gaussian distribution, which involves an inverse incomplete beta function. Finally, the complexity measure is calculated for various quantum-mechanical states of the harmonic and hydrogenic systems, which are the two main
Statistical distribution for generalized ideal gas of fractional-statistics particles
International Nuclear Information System (INIS)
Wu, Y.
1994-01-01
We derive the occupation-number distribution in a generalized ideal gas of particles obeying fractional statistics, including mutual statistics, by adopting a state-counting definition. When there is no mutual statistics, the statistical distribution interpolates between bosons and fermions, and respects a fractional exclusion principle (except for bosons). Anyons in a strong magnetic field at low temperatures constitute such a physical system. Applications to the thermodynamic properties of quasiparticle excitations in the Laughlin quantum Hall fluid are discussed
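The distribution derived in this line of work can be written in the Wu form: the mean occupation is n(x) = 1/(w(x) + g) with x = (ε − μ)/T, where w solves w^g (1 + w)^(1−g) = e^x and g is the exclusion-statistics parameter. A short solver (our sketch; the grid of g values is illustrative) shows the interpolation between bosons (g = 0) and fermions (g = 1):

```python
import numpy as np
from scipy.optimize import brentq

def wu_occupation(x, g):
    """Mean occupation for fractional exclusion statistics (no mutual statistics).
    Solves w**g * (1 + w)**(1 - g) = exp(x), then returns n = 1/(w + g)."""
    if g == 0.0:
        return 1.0 / (np.exp(x) - 1.0)           # Bose-Einstein (x > 0)
    if g == 1.0:
        return 1.0 / (np.exp(x) + 1.0)           # Fermi-Dirac
    f = lambda w: g * np.log(w) + (1.0 - g) * np.log1p(w) - x
    w = brentq(f, 1e-12, 1e12)                   # f is monotone: unique positive root
    return 1.0 / (w + g)

for g in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"g = {g:.2f}: n(x=1) = {wu_occupation(1.0, g):.4f}")
```

Since w > 0, the occupation obeys n ≤ 1/g for g > 0, which is the fractional exclusion principle mentioned in the abstract.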
Actuarial statistics with generalized linear mixed models
Antonio, K.; Beirlant, J.
2007-01-01
Over the last decade the use of generalized linear models (GLMs) in actuarial statistics has received a lot of attention, starting from the actuarial illustrations in the standard text by McCullagh and Nelder [McCullagh, P., Nelder, J.A., 1989. Generalized linear models. In: Monographs on Statistics
Statistical analysis of complex systems with nonclassical invariant measures
Fratalocchi, Andrea
2011-02-28
I investigate the problem of finding a statistical description of a complex many-body system whose invariant measure cannot be constructed stemming from classical thermodynamics ensembles. By taking solitons as a reference system and by employing a general formalism based on the Ablowitz-Kaup-Newell-Segur scheme, I demonstrate how to build an invariant measure and, within a one-dimensional phase space, how to develop a suitable thermodynamics. A detailed example is provided with a universal model of wave propagation, with reference to a transparent potential sustaining gray solitons. The system shows a rich thermodynamic scenario, with a free-energy landscape supporting phase transitions and controllable emergent properties. I finally discuss the origin of such behavior, trying to identify common denominators in the area of complex dynamics.
Statistics of Shared Components in Complex Component Systems
Mazzolini, Andrea; Gherardi, Marco; Caselle, Michele; Cosentino Lagomarsino, Marco; Osella, Matteo
2018-04-01
Many complex systems are modular. Such systems can be represented as "component systems," i.e., sets of elementary components, such as LEGO bricks in LEGO sets. The bricks found in a LEGO set reflect a target architecture, which can be built following a set-specific list of instructions. In other component systems, instead, the underlying functional design and constraints are not obvious a priori, and their detection is often a challenge of both scientific and practical importance, requiring a clear understanding of component statistics. Importantly, some quantitative invariants appear to be common to many component systems, most notably a common broad distribution of component abundances, which often resembles the well-known Zipf's law. Such "laws" affect in a general and nontrivial way the component statistics, potentially hindering the identification of system-specific functional constraints or generative processes. Here, we specifically focus on the statistics of shared components, i.e., the distribution of the number of components shared by different system realizations, such as the common bricks found in different LEGO sets. To account for the effects of component heterogeneity, we consider a simple null model, which builds system realizations by random draws from a universe of possible components. Under general assumptions on abundance heterogeneity, we provide analytical estimates of component occurrence, which quantify exhaustively the statistics of shared components. Surprisingly, this simple null model can positively explain important features of empirical component-occurrence distributions obtained from large-scale data on bacterial genomes, LEGO sets, and book chapters. Specific architectural features and functional constraints can be detected from occurrence patterns as deviations from these null predictions, as we show for the illustrative case of the "core" genome in bacteria.
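A minimal version of the null model described above (the universe size, set size, and Zipf-like abundance law are illustrative choices) draws each "realization" from a universe of components with heterogeneous abundances and then tabulates how often each component is shared across realizations:

```python
import numpy as np

rng = np.random.default_rng(1)

# Universe of components with Zipf-like abundances.
n_components = 1000
freq = 1.0 / np.arange(1, n_components + 1)
prob = freq / freq.sum()

# Each system realization is a random draw of distinct components.
n_realizations, set_size = 200, 50
occurrence = np.zeros(n_components)
for _ in range(n_realizations):
    members = rng.choice(n_components, size=set_size, replace=False, p=prob)
    occurrence[members] += 1
occurrence /= n_realizations                 # fraction of sets sharing each component

hist, edges = np.histogram(occurrence, bins=10, range=(0.0, 1.0))
for count, lo in zip(hist, edges[:-1]):
    print(f"occurrence in [{lo:.1f}, {lo + 0.1:.1f}): {count} components")
```

Deviations of empirical occurrence histograms from such a null prediction are what the authors use to flag architectural features and functional constraints.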
Concepts and recent advances in generalized information measures and statistics
Kowalski, Andres M
2013-01-01
Since the introduction of the information measure widely known as Shannon entropy, quantifiers based on information theory and concepts such as entropic forms and statistical complexities have proven to be useful in diverse scientific research fields. This book contains introductory tutorials suitable for the general reader, together with chapters dedicated to the basic concepts of the most frequently employed information measures or quantifiers and their recent applications to different areas, including physics, biology, medicine, economics, communication and social sciences. As these quantif
Order, disorder and generalized statistics
International Nuclear Information System (INIS)
Marino, E.C.; Swieca, J.A.
1980-06-01
We generalize the prescription of Kadanoff and Ceva for the computation of disorder variables correlation functions in the Ising model for continuous field theories with U(1) symmetry. By considering the product of order and disorder variables, we obtain a path integral representation for fields with generalized statistics. We discuss in detail the cases of massless Thirring and Schwinger models. (Author)
Glushak, P. A.; Markiv, B. B.; Tokarchuk, M. V.
2018-01-01
We present a generalization of Zubarev's nonequilibrium statistical operator method based on the principle of maximum Rényi entropy. In the framework of this approach, we obtain transport equations for the basic set of parameters of the reduced description of nonequilibrium processes in a classical system of interacting particles using Liouville equations with fractional derivatives. For a classical system of particles in a medium with a fractal structure, we obtain a non-Markovian diffusion equation with fractional spatial derivatives. For a concrete model of the frequency dependence of a memory function, we obtain a generalized Cattaneo-type diffusion equation with the spatial and temporal fractality taken into account. We present a generalization of nonequilibrium thermofield dynamics in Zubarev's nonequilibrium statistical operator method in the framework of Rényi statistics.
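For reference, the Rényi entropy whose maximization underlies this construction is, in our transcription (not the paper's notation),

```latex
S^{(R)}_q[\rho] \;=\; \frac{1}{1-q}\,\ln \operatorname{Tr} \rho^{\,q},
\qquad q > 0,\; q \neq 1,
```

which reduces to the Gibbs-Shannon entropy as q → 1 and, through the nonequilibrium statistical operator built on it, feeds the memory kernels of the fractional transport equations mentioned above.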
Statistical mechanics of complex networks
Rubi, Miguel; Diaz-Guilera, Albert
2003-01-01
Networks can provide a useful model and graphic image for the description of a wide variety of web-like structures in the physical and man-made realms, e.g. protein networks, food webs and the Internet. The contributions gathered in the present volume provide both an introduction to, and an overview of, the multifaceted phenomenology of complex networks. Statistical Mechanics of Complex Networks also provides a state-of-the-art picture of current theoretical methods and approaches.
Second-Order Statistics for Wave Propagation through Complex Optical Systems
DEFF Research Database (Denmark)
Yura, H.T.; Hanson, Steen Grüner
1989-01-01
Closed-form expressions are derived for various statistical functions that arise in optical propagation through arbitrary optical systems that can be characterized by a complex ABCD matrix in the presence of distributed random inhomogeneities along the optical path. Specifically, within the second-order Rytov approximation, explicit general expressions are presented for the mutual coherence function, the log-amplitude and phase correlation functions, and the mean-square irradiance that are obtained in propagation through an arbitrary paraxial ABCD optical system containing Gaussian-shaped limiting...
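The complex ABCD bookkeeping such derivations rest on can be illustrated with the standard Gaussian-beam transformation q' = (Aq + B)/(Cq + D). The sketch below (element values are arbitrary, and this is textbook beam propagation, not the paper's statistical functions) composes free space, a thin lens, and free space:

```python
import numpy as np

def abcd_propagate(q, M):
    """Transform the complex beam parameter: q' = (A q + B) / (C q + D)."""
    (A, B), (C, D) = M
    return (A * q + B) / (C * q + D)

free_space = lambda d: np.array([[1.0, d], [0.0, 1.0]])
thin_lens = lambda f: np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

# 1 m of free space, a thin lens with f = 0.5 m, then another 1 m.
M = free_space(1.0) @ thin_lens(0.5) @ free_space(1.0)   # rightmost acts first
wavelength, w0 = 633e-9, 1e-3                            # HeNe beam, 1 mm waist
q0 = 1j * np.pi * w0**2 / wavelength                     # waist at the input plane
q1 = abcd_propagate(q0, M)
w1 = np.sqrt(-wavelength / (np.pi * (1.0 / q1).imag))    # output spot size
print(f"output spot size: {w1 * 1e3:.3f} mm")
```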
A generalized complexity measure based on Rényi entropy
Sánchez-Moreno, Pablo; Angulo, Juan Carlos; Dehesa, Jesus S.
2014-08-01
The intrinsic statistical complexities of finite many-particle systems (i.e., those defined in terms of the single-particle density) quantify the degree of structure or patterns, far beyond the entropy measures. They are intuitively constructed to be minima at the opposite extremes of perfect order and maximal randomness. Starting from the pioneering LMC measure, which satisfies these requirements, some extensions of LMC-Rényi type have been published in the literature. The latter measures were shown to describe a variety of physical aspects of the internal disorder in atomic and molecular systems (e.g., quantum phase transitions, atomic shell filling) which are not grasped by their mother LMC quantity. However, they are not minimal for maximal randomness in general. In this communication, we propose a generalized LMC-Rényi complexity which overcomes this problem. Some applications which illustrate this fact are given.
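One common LMC-Rényi-type construction of the kind the abstract refers to is C(α, β) = exp(R_α − R_β) with α < β. For discrete distributions this simple variant is minimal (equal to 1) at the uniform distribution, but in the continuous settings the paper addresses such minimality can fail, which is what motivates their generalization. A small sketch of the discrete case (our illustration, not the paper's measure):

```python
import numpy as np

def renyi_entropy(p, q):
    """Renyi entropy R_q = ln(sum p_i**q) / (1 - q); Shannon entropy as q -> 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))
    return np.log(np.sum(p ** q)) / (1.0 - q)

def lmc_renyi(p, a, b):
    """LMC-Renyi-type complexity C = exp(R_a - R_b), a < b (always >= 1)."""
    return np.exp(renyi_entropy(p, a) - renyi_entropy(p, b))

uniform = np.full(8, 1.0 / 8.0)
peaked = np.array([0.65, 0.15, 0.10, 0.05, 0.02, 0.01, 0.01, 0.01])
for name, p in (("uniform", uniform), ("peaked", peaked)):
    print(f"{name:8s}: C(0.5, 2) = {lmc_renyi(p, 0.5, 2.0):.4f}")
```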
Generalized Combination Complex Synchronization for Fractional-Order Chaotic Complex Systems
Directory of Open Access Journals (Sweden)
Cuimei Jiang
2015-07-01
Based on two fractional-order chaotic complex drive systems and one fractional-order chaotic complex response system with different dimensions, we propose generalized combination complex synchronization. In this new synchronization scheme, there are two complex scaling matrices that are non-square matrices. On the basis of the stability theory of fractional-order linear systems, we design a general controller via active control. Additionally, by virtue of two complex scaling matrices, generalized combination complex synchronization between fractional-order chaotic complex systems and real systems is investigated. Finally, three typical examples are given to demonstrate the effectiveness and feasibility of the schemes.
Fitting statistical distributions
Karian, Zaven A
2000-01-01
Throughout the physical and social sciences, researchers face the challenge of fitting statistical distributions to their data. Although the study of statistical modelling has made great strides in recent years, the number and variety of distributions to choose from, all with their own formulas, tables, diagrams, and general properties, continue to create problems. For a specific application, which of the dozens of distributions should one use? What if none of them fit well? Fitting Statistical Distributions helps answer those questions. Focusing on techniques used successfully across many fields, the authors present all of the relevant results related to the Generalized Lambda Distribution (GLD), the Generalized Bootstrap (GB), and Monte Carlo simulation (MC). They provide the tables, algorithms, and computer programs needed for fitting continuous probability distributions to data in a wide variety of circumstances, covering bivariate as well as univariate distributions, and including situations where moments do...
Limit temperature for entanglement in generalized statistics
International Nuclear Information System (INIS)
Rossignoli, R.; Canosa, N.
2004-01-01
We discuss the main properties of general thermal states derived from non-additive entropic forms and their use for studying quantum entanglement. It is shown that all these states become more mixed as the temperature increases, approaching the full random state for T→∞. The formalism is then applied to examine the limit temperature for entanglement in a two-qubit XXZ Heisenberg chain, which exhibits the peculiar feature of being independent of the applied magnetic field in the conventional von Neumann based statistics. In contrast, this temperature is shown to be field dependent in a generalized statistics, even for small deviations from the standard form. Results for the Tsallis-based statistics are examined in detail
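For reference, the non-additive entropic form and the associated generalized thermal state in the Tsallis case read, in our transcription (a standard statement, not the paper's specific equations),

```latex
S_q(\rho) = \frac{1 - \operatorname{Tr} \rho^{\,q}}{q - 1}, \qquad
\rho_q \propto \bigl[\, 1 - (1 - q)\,\beta H \,\bigr]_{+}^{\frac{1}{1 - q}},
```

with [x]_+ = max(x, 0), recovering the von Neumann entropy and the Gibbs state e^{−βH}/Z in the limit q → 1; the field dependence of the limit temperature discussed above arises for q ≠ 1.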
The statistical-inference approach to generalized thermodynamics
International Nuclear Information System (INIS)
Lavenda, B.H.; Scherer, C.
1987-01-01
Limit theorems, such as the central-limit theorem and the weak law of large numbers, are applicable to statistical thermodynamics for sufficiently large sample size of independent and identically distributed observations performed on extensive thermodynamic (chance) variables. The estimation of the intensive thermodynamic quantities is a problem in parametric statistical estimation. The normal approximation to the Gibbs' distribution is justified by the analysis of large deviations. Statistical thermodynamics is generalized to include the statistical estimation of variance as well as mean values
The Generalized Quantum Statistics
Hwang, WonYoung; Ji, Jeong-Young; Hong, Jongbae
1999-01-01
The concept of wavefunction reduction should be introduced to standard quantum mechanics in any physical processes where effective reduction of wavefunction occurs, as well as in the measurement processes. When the overlap is negligible, each particle obeys Maxwell-Boltzmann statistics even if the particles are in principle described by a totally symmetrized wavefunction [P.R. Holland, The Quantum Theory of Motion, Cambridge University Press, 1993, p. 293]. We generalize the conjecture. That is, par...
Infants generalize representations of statistically segmented words
Directory of Open Access Journals (Sweden)
Katharine eGraf Estes
2012-10-01
The acoustic variation in language presents learners with a substantial challenge. To learn by tracking statistical regularities in speech, infants must recognize words across tokens that differ based on characteristics such as the speaker’s voice, affect, or the sentence context. Previous statistical learning studies have not investigated how these types of surface form variation affect learning. The present experiments used tasks tailored to two distinct developmental levels to investigate the robustness of statistical learning to variation. Experiment 1 examined statistical word segmentation in 11-month-olds and found that infants can recognize statistically segmented words across a change in the speaker’s voice from segmentation to testing. The direction of infants’ preferences suggests that recognizing words across a voice change is more difficult than recognizing them in a consistent voice. Experiment 2 tested whether 17-month-olds can generalize the output of statistical learning across variation to support word learning. The infants were successful in their generalization; they associated referents with statistically defined words despite a change in voice from segmentation to label learning. Infants’ learning patterns also indicate that they formed representations of across-word syllable sequences during segmentation. Thus, low probability sequences can act as object labels in some conditions. The findings of these experiments suggest that the units that emerge during statistical learning are not perceptually constrained, but rather are robust to naturalistic acoustic variation.
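The segmentation mechanism assumed in such studies is often summarized by transitional probabilities, TP(x→y) = P(y | x), which dip at word boundaries. A toy sketch (the nonsense-syllable stream and the threshold are invented for illustration):

```python
from collections import Counter

# Stream built from three nonsense words; within-word transitions are
# deterministic, across-word transitions are not.
stream = "golabu padoti bidaku golabu bidaku padoti golabu".replace(" ", "")
syllables = [stream[i:i + 2] for i in range(0, len(stream), 2)]

pairs = Counter(zip(syllables, syllables[1:]))
firsts = Counter(syllables[:-1])
tp = {(a, b): n / firsts[a] for (a, b), n in pairs.items()}

words, current = [], [syllables[0]]
for a, b in zip(syllables, syllables[1:]):
    if tp[(a, b)] < 0.75:            # low transitional probability -> boundary
        words.append("".join(current))
        current = []
    current.append(b)
words.append("".join(current))
print(words)                         # recovers the golabu / padoti / bidaku tokens
```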
14 CFR 291.41 - Financial and statistical reporting-general.
2010-01-01
Title 14 (Aeronautics and Space), Economic Regulations (Aviation Proceedings), Part 291: Cargo Operations in Interstate Air Transportation, Reporting Rules, § 291.41 Financial and statistical reporting—general. (a) Carriers providing cargo operations in...
Generalized complex geometry, generalized branes and the Hitchin sigma model
International Nuclear Information System (INIS)
Zucchini, Roberto
2005-01-01
Hitchin's generalized complex geometry has been shown to be relevant in compactifications of superstring theory with fluxes and is expected to lead to a deeper understanding of mirror symmetry. Gualtieri's notion of generalized complex submanifold seems to be a natural candidate for the description of branes in this context. Recently, we introduced a Batalin-Vilkovisky field theoretic realization of generalized complex geometry, the Hitchin sigma model, extending the well known Poisson sigma model. In this paper, exploiting Gualtieri's formalism, we incorporate branes into the model. A detailed study of the boundary conditions obeyed by the world sheet fields is provided. Finally, it is found that, when branes are present, the classical Batalin-Vilkovisky cohomology contains an extra sector that is related nontrivially to a novel cohomology associated with the branes as generalized complex submanifolds. (author)
General quadrupolar statistical anisotropy: Planck limits
Energy Technology Data Exchange (ETDEWEB)
Ramazanov, S. [Gran Sasso Science Institute (INFN), Viale Francesco Crispi 7, I-67100 L'Aquila (Italy); Rubtsov, G. [Institute for Nuclear Research of the Russian Academy of Sciences, Prospect of the 60th Anniversary of October 7a, 117312 Moscow (Russian Federation); Thorsrud, M. [Faculty of Engineering, Østfold University College, P.O. Box 700, 1757 Halden (Norway); Urban, F.R., E-mail: sabir.ramazanov@gssi.infn.it, E-mail: grisha@ms2.inr.ac.ru, E-mail: mikjel.thorsrud@hiof.no, E-mail: federico.urban@kbfi.ee [National Institute of Chemical Physics and Biophysics, Rävala 10, 10143 Tallinn (Estonia)
2017-03-01
Several early Universe scenarios predict a direction-dependent spectrum of primordial curvature perturbations. This translates into the violation of the statistical isotropy of cosmic microwave background radiation. Previous searches for statistical anisotropy mainly focussed on a quadrupolar direction-dependence characterised by a single multipole vector and an overall amplitude g_*. Generically, however, the quadrupole has a more complicated geometry described by two multipole vectors and g_*. This is the subject of the present work. In particular, we limit the amplitude g_* for different shapes of the quadrupole by making use of Planck 2015 maps. We also constrain certain inflationary scenarios which predict this kind of more general quadrupolar statistical anisotropy.
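The single-vector quadrupolar parameterization that previous searches targeted is, schematically (our transcription of the standard form, not the paper's full two-vector expression),

```latex
\mathcal{P}(\mathbf{k}) \;=\; P_0(k)\,\Bigl[\, 1 + g_*\,(\hat{\mathbf{k}}\cdot\hat{\mathbf{n}})^2 \,\Bigr],
```

for a preferred direction n̂; the present work replaces the single multipole vector by the general two-vector quadrupole with overall amplitude g_*.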
Kolmogorov complexity, pseudorandom generators and statistical models testing
Czech Academy of Sciences Publication Activity Database
Šindelář, Jan; Boček, Pavel
2002-01-01
Roč. 38, č. 6 (2002), s. 747-759 ISSN 0023-5954 R&D Projects: GA ČR GA102/99/1564 Institutional research plan: CEZ:AV0Z1075907 Keywords : Kolmogorov complexity * pseudorandom generators * statistical models testing Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.341, year: 2002
Statistical complexity without explicit reference to underlying probabilities
Pennini, F.; Plastino, A.
2018-06-01
We show that extremely simple systems of a not too large number of particles can be simultaneously thermally stable and complex. To such an end, we extend the notion of statistical complexity to simple configurations of non-interacting particles, without appeal to probabilities, and discuss configurational properties.
Using the Statistical Indicators for the General Insurances Activity
Directory of Open Access Journals (Sweden)
Ion Partachi
2007-04-01
The statistics of the general insurances activity is largely used in actuarial calculations. Actuarial analyses are achieved exclusively on the basis of primary and derived indicators, which are drawn up by various statistical methods. The statistical indicators used in this respect are obtained on the basis of the factors and conditions allowing the compensation cases to occur. The actuarial analysis is performed over time as well, by using chronological series, which allow the decomposition of the phenomenon being studied by its factors of influence. In this article, after briefly presenting a number of points of view regarding the utilization of statistical indicators in actuarial analysis, we have analyzed, successively, a series of issues, such as: the statistical indicators regarding the forming of the general insurances fund, expressed in physical and value units, or as absolute, relative and average volumes; the statistical indicators of the utilization of the general insurances funds (with the same diversified forms of expression); and the statistical indicators of the outcomes of the general insurances activity. A particular accent was put on underlining certain methodological aspects regarding the calculation of the above mentioned indicators, emphasizing certain particular characteristics concerning their utilization in the frame of the actuarial analysis. The article stresses the fact that these indicators are used in the actuarial analysis as a real system. The respective proportions are enumerated, by underlining the concrete possibilities of computation, which secure the possibility of performing the necessary analysis involved by a decisional process.
The applications of Complexity Theory and Tsallis Non-extensive Statistics at Solar Plasma Dynamics
Pavlos, George
2015-04-01
As the solar plasma lives far from equilibrium, it is an excellent laboratory for testing complexity theory and non-equilibrium statistical mechanics. In this study, we present the highlights of complexity theory and Tsallis non-extensive statistical mechanics concerning their applications to solar plasma dynamics, especially sunspot, solar flare and solar wind phenomena. Generally, when a physical system is driven far from equilibrium states, some novel characteristics can be observed related to the nonlinear character of dynamics. In particular, the nonlinearity in space plasma dynamics can generate intermittent turbulence with the typical characteristics of the anomalous diffusion process and strange topologies of stochastic space plasma fields (velocity and magnetic fields) caused by the strange dynamics and strange kinetics (Zaslavsky, 2002). In addition, according to Zelenyi and Milovanov (2004), the complex character of the space plasma system includes the existence of non-equilibrium (quasi-)stationary states (NESS) having the topology of a percolating fractal set. The stabilization of a system near the NESS is perceived as a transition into a turbulent state determined by self-organization processes. The long-range correlation effects manifest themselves as a strange non-Gaussian behavior of kinetic processes near the NESS plasma state. The complex character of space plasma can also be described by the non-extensive statistical thermodynamics pioneered by Tsallis, which offers a consistent and effective theoretical framework, based on a generalization of Boltzmann–Gibbs (BG) entropy, to describe far from equilibrium nonlinear complex dynamics (Tsallis, 2009). In a series of recent papers, the hypothesis of Tsallis non-extensive statistics in magnetosphere, sunspot dynamics, solar flares, solar wind and space plasma in general was tested and verified (Karakatsanis et al., 2013; Pavlos et al., 2014; 2015). Our study includes the analysis of solar plasma time...
International Nuclear Information System (INIS)
Nemnes, G A; Anghel, D V
2010-01-01
We present a stochastic method for the simulation of the time evolution in systems which obey generalized statistics, namely fractional exclusion statistics and Gentile's statistics. The transition rates are derived in the framework of canonical ensembles. This approach introduces a tool for describing interacting fermionic and bosonic systems in non-equilibrium as ideal FES systems, in a computationally efficient manner. The two types of statistics are analyzed comparatively, indicating their intrinsic thermodynamic differences and revealing key aspects related to the species size
Effective control of complex turbulent dynamical systems through statistical functionals.
Majda, Andrew J; Qi, Di
2017-05-30
Turbulent dynamical systems characterized by both a high-dimensional phase space and a large number of instabilities are ubiquitous among complex systems in science and engineering, including climate, material, and neural science. Control of these complex systems is a grand challenge, for example, in mitigating the effects of climate change or safe design of technology with fully developed shear turbulence. Control of flows in the transition to turbulence, where there is a small dimension of instabilities about a basic mean state, is an important and successful discipline. In complex turbulent dynamical systems, it is impossible to track and control the large dimension of instabilities, which strongly interact and exchange energy, and new control strategies are needed. The goal of this paper is to propose an effective statistical control strategy for complex turbulent dynamical systems based on a recent statistical energy principle and statistical linear response theory. We illustrate the potential practical efficiency and verify this effective statistical control strategy on the 40D Lorenz 1996 model in forcing regimes with various types of fully turbulent dynamics with nearly one-half of the phase space unstable.
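The 40-dimensional Lorenz 1996 test bed used above is easy to reproduce. The sketch below (our integration settings, not the authors' control algorithm) integrates the model at several forcings and tracks the kind of statistical functional, here the mean statistical energy, that the control strategy acts on:

```python
import numpy as np

def lorenz96_rhs(x, forcing):
    """Lorenz 1996 model: dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

def mean_energy(x, forcing, dt=0.01, n_steps=5000):
    """RK4 integration; returns the late-time mean of E = mean(x**2) / 2."""
    energies = []
    for _ in range(n_steps):
        k1 = lorenz96_rhs(x, forcing)
        k2 = lorenz96_rhs(x + 0.5 * dt * k1, forcing)
        k3 = lorenz96_rhs(x + 0.5 * dt * k2, forcing)
        k4 = lorenz96_rhs(x + dt * k3, forcing)
        x = x + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0
        energies.append(0.5 * np.mean(x ** 2))
    return np.mean(energies[n_steps // 2:])

rng = np.random.default_rng(2)
x0 = rng.standard_normal(40)                 # 40D phase space, as in the paper
for F in (5.0, 8.0, 16.0):                   # stronger forcing, more turbulence
    print(f"F = {F:5.1f}: mean statistical energy ~ {mean_energy(x0.copy(), F):.2f}")
```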
Complex Data Modeling and Computationally Intensive Statistical Methods
Mantovan, Pietro
2010-01-01
Recent years have seen the advent and development of many devices able to record and store an ever-increasing amount of complex and high-dimensional data: 3D images generated by medical scanners or satellite remote sensing, DNA microarrays, real-time financial data, system control datasets. The analysis of these data poses new challenging problems and requires the development of novel statistical models and computational methods, fueling many fascinating and fast-growing research areas of modern statistics. The book offers a wide variety of statistical methods and is addressed to statistici...
Nonequilibrium Statistical Operator Method and Generalized Kinetic Equations
Kuzemsky, A. L.
2018-01-01
We consider some principal problems of nonequilibrium statistical thermodynamics in the framework of the Zubarev nonequilibrium statistical operator approach. We present a brief comparative analysis of some approaches to describing irreversible processes based on the concept of nonequilibrium Gibbs ensembles and their applicability to describing nonequilibrium processes. We discuss the derivation of generalized kinetic equations for a system in a heat bath. We obtain and analyze a damped Schrödinger-type equation for a dynamical system in a heat bath. We study the dynamical behavior of a particle in a medium taking the dissipation effects into account. We consider the scattering problem for neutrons in a nonequilibrium medium and derive a generalized Van Hove formula. We show that the nonequilibrium statistical operator method is an effective, convenient tool for describing irreversible processes in condensed matter.
Information Geometric Complexity of a Trivariate Gaussian Statistical Model
Directory of Open Access Journals (Sweden)
Domenico Felice
2014-05-01
We evaluate the information geometric complexity of entropic motion on low-dimensional Gaussian statistical manifolds in order to quantify how difficult it is to make macroscopic predictions about systems in the presence of limited information. Specifically, we observe that the complexity of such entropic inferences not only depends on the amount of available pieces of information but also on the manner in which such pieces are correlated. Finally, we uncover that, for certain correlational structures, the impossibility of reaching the most favorable configuration from an entropic inference viewpoint seems to lead to an information geometric analog of the well-known frustration effect that occurs in statistical physics.
Generalized statistics and the rishon hypothesis
Energy Technology Data Exchange (ETDEWEB)
Jarvis, P.D. (Tasmania Univ., Sandy Bay (Australia). Dept. of Physics); Green, H.S. (Adelaide Univ. (Australia). Dept. of Mathematical Physics)
1983-01-01
It is pointed out that the proposal of Harari and others, that leptons and quarks should be regarded as composites, consisting of rishons or quips, can be formulated as a field theory in terms of two fundamental spinor fields which satisfy a new generalization of quantum statistics. The requirement of macroscopic causality determines which of the many combinations of rishons may be observed as isolated particles.
PREFACE: Statistical Physics of Complex Fluids
Golestanian, R.; Khajehpour, M. R. H.; Kolahchi, M. R.; Rouhani, S.
2005-04-01
The field of complex fluids is a rapidly developing, highly interdisciplinary field that brings together people from a plethora of backgrounds such as mechanical engineering, chemical engineering, materials science, applied mathematics, physics, chemistry and biology. In this melting pot of science, the traditional boundaries of various scientific disciplines have been set aside. It is this very property of the field that has guaranteed its richness and prosperity since the final decade of the 20th century and into the 21st. The C3 Commission of the International Union of Pure and Applied Physics (IUPAP), which is the commission for statistical physics that organizes the international STATPHYS conferences, encourages various, more focused, satellite meetings to complement the main event. For the STATPHYS22 conference in Bangalore (July 2004), Iran was recognized by the STATPHYS22 organizers as suitable to host such a satellite meeting and the Institute for Advanced Studies in Basic Sciences (IASBS) was chosen to be the site of this meeting. It was decided to organize a meeting in the field of complex fluids, which is a fairly developed field in Iran. This international meeting, and an accompanying summer school, were intended to boost international connections for both the research groups working in Iran, and several other groups working in the Middle East, South Asia and North Africa. The meeting, entitled `Statistical Physics of Complex Fluids' was held at the Institute for Advanced Studies in Basic Sciences (IASBS) in Zanjan, Iran, from 27 June to 1 July 2004. The main topics discussed at the meeting included: biological statistical physics, wetting and microfluidics, transport in complex media, soft and granular matter, and rheology of complex fluids. At this meeting, 22 invited lectures by eminent scientists were attended by 107 participants from different countries. The poster session consisted of 45 presentations which, in addition to the main topics of the
Statistical Physics of Complex Substitutive Systems
Jin, Qing
Diffusion processes are central to human interactions. Despite extensive studies that span multiple disciplines, our knowledge is limited to spreading processes in non-substitutive systems. Yet, a considerable number of ideas, products, and behaviors spread by substitution; to adopt a new one, agents must give up an existing one. This captures the spread of scientific constructs--forcing scientists to choose, for example, a deterministic or probabilistic worldview, as well as the adoption of durable items, such as mobile phones, cars, or homes. In this dissertation, I develop a statistical physics framework to describe, quantify, and understand substitutive systems. By empirically exploring three collected high-resolution datasets pertaining to such systems, I build a mechanistic model describing substitutions, which not only analytically predicts the universal macroscopic phenomenon discovered in the collected datasets, but also accurately captures the trajectories of individual items in a complex substitutive system, demonstrating a high degree of regularity and universality in substitutive systems. I also discuss the origins and insights of the parameters in the substitution model and possible generalization form of the mathematical framework. The systematical study of substitutive systems presented in this dissertation could potentially guide the understanding and prediction of all spreading phenomena driven by substitutions, from electric cars to scientific paradigms, and from renewable energy to new healthy habits.
Statistical screening of input variables in a complex computer code
International Nuclear Information System (INIS)
Krieger, T.J.
1982-01-01
A method is presented for "statistical screening" of input variables in a complex computer code. The object is to determine the "effective" or important input variables by estimating the relative magnitudes of their associated sensitivity coefficients. This is accomplished by performing a numerical experiment consisting of a relatively small number of computer runs with the code followed by a statistical analysis of the results. A formula for estimating the sensitivity coefficients is derived. Reference is made to an earlier work in which the method was applied to a complex reactor code with good results.
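One common way to realize such a screening (a sketch under our assumptions; the paper's derived formula is not reproduced here) is to make a modest number of randomized runs and read off standardized regression coefficients as estimates of the relative sensitivity magnitudes:

```python
import numpy as np

rng = np.random.default_rng(3)

def black_box(u):
    """Stand-in for the complex code: only inputs 1 and 4 matter strongly."""
    return 3.0 * u[..., 0] - 2.0 * u[..., 3] + 0.05 * u[..., 1] ** 2

n_inputs, n_runs = 8, 40                     # few runs relative to a full scan
U = rng.uniform(-1.0, 1.0, (n_runs, n_inputs))
y = black_box(U)

# Standardized regression coefficients estimate the relative magnitudes of
# the sensitivity coefficients; large |beta_i| flags an "effective" variable.
X = (U - U.mean(axis=0)) / U.std(axis=0)
beta, *_ = np.linalg.lstsq(X, y - y.mean(), rcond=None)
for i, b in enumerate(beta, start=1):
    print(f"input {i}: beta = {b:+.3f}")
```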
a Statistical Dynamic Approach to Structural Evolution of Complex Capital Market Systems
Shao, Xiao; Chai, Li H.
As an important part of modern financial systems, the capital market has played a crucial role in diverse social resource allocations and economic exchanges. Beyond traditional models and/or theories based on neoclassical economics, considering capital markets as typical complex open systems, this paper attempts to develop a new approach to overcome some shortcomings of the available research. By defining the generalized entropy of capital market systems, a theoretical model and nonlinear dynamic equation on the operations of capital market are proposed from statistical dynamic perspectives. The US security market from 1995 to 2001 is then simulated and analyzed as a typical case. Some instructive results are discussed and summarized.
Energy Technology Data Exchange (ETDEWEB)
Paik, Joongcheol [University of Minnesota; Sotiropoulos, Fotis [University of Minnesota; Sale, Michael J [ORNL
2005-06-01
A numerical method is developed for carrying out unsteady Reynolds-averaged Navier-Stokes (URANS) simulations and detached-eddy simulations (DESs) in complex 3D geometries. The method is applied to simulate incompressible swirling flow in a typical hydroturbine draft tube, which consists of a strongly curved 90 degree elbow and two piers. The governing equations are solved with a second-order-accurate, finite-volume, dual-time-stepping artificial compressibility approach for a Reynolds number of 1.1 million on a mesh with 1.8 million nodes. The geometrical complexities of the draft tube are handled using domain decomposition with overset (chimera) grids. Numerical simulations show that unsteady statistical turbulence models can capture very complex 3D flow phenomena dominated by geometry-induced, large-scale instabilities and unsteady coherent structures such as the onset of vortex breakdown and the formation of the unsteady rope vortex downstream of the turbine runner. Both URANS and DES appear to yield the general shape and magnitude of mean velocity profiles in reasonable agreement with measurements. Significant discrepancies among the DES and URANS predictions of the turbulence statistics are also observed in the straight downstream diffuser.
Generalized t-statistic for two-group classification.
Komori, Osamu; Eguchi, Shinto; Copas, John B
2015-06-01
In the classic discriminant model of two multivariate normal distributions with equal variance matrices, the linear discriminant function is optimal both in terms of the log likelihood ratio and in terms of maximizing the standardized difference (the t-statistic) between the means of the two distributions. In a typical case-control study, normality may be sensible for the control sample but heterogeneity and uncertainty in diagnosis may suggest that a more flexible model is needed for the cases. We generalize the t-statistic approach by finding the linear function which maximizes a standardized difference but with data from one of the groups (the cases) filtered by a possibly nonlinear function U. We study conditions for consistency of the method and find the function U which is optimal in the sense of asymptotic efficiency. Optimality may also extend to other measures of discriminatory efficiency such as the area under the receiver operating characteristic curve. The optimal function U depends on a scalar probability density function which can be estimated non-parametrically using a standard numerical algorithm. A lasso-like version for variable selection is implemented by adding L1-regularization to the generalized t-statistic. Two microarray data sets in the study of asthma and various cancers are used as motivating examples. © 2014, The International Biometric Society.
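As a baseline for the generalization described, the classic linear function maximizing the t-statistic between two groups is w = S⁻¹(μ₁ − μ₀), with S the pooled covariance. The sketch below (simulated data, illustrative parameters, no filtering function U) computes it directly:

```python
import numpy as np

rng = np.random.default_rng(4)

cov = [[1.0, 0.3], [0.3, 1.0]]
controls = rng.multivariate_normal([0.0, 0.0], cov, 200)
cases = np.vstack([                          # heterogeneous case sample
    rng.multivariate_normal([2.0, 1.0], cov, 100),
    rng.multivariate_normal([0.2, 0.1], cov, 100),
])

def t_statistic(w, x0, x1):
    """Standardized difference of the projected group means."""
    p0, p1 = x0 @ w, x1 @ w
    se = np.sqrt(p0.var(ddof=1) / len(p0) + p1.var(ddof=1) / len(p1))
    return (p1.mean() - p0.mean()) / se

centered = np.vstack([controls - controls.mean(0), cases - cases.mean(0)])
S = np.cov(centered.T)                       # pooled covariance estimate
w = np.linalg.solve(S, cases.mean(0) - controls.mean(0))
print(f"linear discriminant t-statistic: {t_statistic(w, controls, cases):.2f}")
```

The paper's generalized t-statistic instead filters the case sample through a possibly nonlinear function U, which can improve discrimination when the cases are heterogeneous, as in this simulated mixture.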
Energy Technology Data Exchange (ETDEWEB)
Wang, Shi-bing, E-mail: wang-shibing@dlut.edu.cn, E-mail: wangxy@dlut.edu.cn [School of Computer and Information Engineering, Fuyang Normal University, Fuyang 236041 (China); Faculty of Electronic Information and Electrical Engineering, Dalian University of Technology, Dalian 116024 (China); Wang, Xing-yuan, E-mail: wang-shibing@dlut.edu.cn, E-mail: wangxy@dlut.edu.cn [Faculty of Electronic Information and Electrical Engineering, Dalian University of Technology, Dalian 116024 (China); Wang, Xiu-you [School of Computer and Information Engineering, Fuyang Normal University, Fuyang 236041 (China); Zhou, Yu-fei [College of Electrical Engineering and Automation, Anhui University, Hefei 230601 (China)
2016-04-15
With comprehensive consideration of generalized synchronization, combination synchronization and adaptive control, this paper investigates a novel adaptive generalized combination complex synchronization (AGCCS) scheme for different real and complex nonlinear systems with unknown parameters. On the basis of Lyapunov stability theory and adaptive control, an AGCCS controller and parameter update laws are derived to achieve synchronization and parameter identification of two real drive systems and a complex response system, as well as two complex drive systems and a real response system. Two simulation examples, namely, AGCCS for chaotic real Lorenz and Chen systems driving a hyperchaotic complex Lü system, and hyperchaotic complex Lorenz and Chen systems driving a real chaotic Lü system, are presented to verify the feasibility and effectiveness of the proposed scheme.
Low-complexity blind equalization for OFDM systems with general constellations
Al-Naffouri, Tareq Y.
2012-12-01
This paper proposes a low-complexity algorithm for blind equalization of data in orthogonal frequency division multiplexing (OFDM)-based wireless systems with general constellations. The proposed algorithm is able to recover the transmitted data even when the channel changes on a symbol-by-symbol basis, making it suitable for fast fading channels. The proposed algorithm does not require any statistical information about the channel and thus does not suffer from latency normally associated with blind methods. The paper demonstrates how to reduce the complexity of the algorithm, which becomes especially low at high signal-to-noise ratio (SNR). Specifically, it is shown that in the high SNR regime, the number of operations is of the order O(LN), where L is the cyclic prefix length and N is the total number of subcarriers. Simulation results confirm the favorable performance of the proposed algorithm. © 2012 IEEE.
Generalized Optical Theorem Detection in Random and Complex Media
Tu, Jing
The problem of detecting changes of a medium or environment based on active, transmit-plus-receive wave sensor data is at the heart of many important applications including radar, surveillance, remote sensing, nondestructive testing, and cancer detection. This is a challenging problem because both the change or target and the surrounding background medium are in general unknown and can be quite complex. This Ph.D. dissertation presents a new wave physics-based approach for the detection of targets or changes in rather arbitrary backgrounds. The proposed methodology is rooted in a fundamental result of wave theory called the optical theorem, which gives real physical energy meaning to the statistics used for detection. This dissertation is composed of two main parts. The first part significantly expands the theory and understanding of the optical theorem for arbitrary probing fields and arbitrary media including nonreciprocal media, active media, as well as time-varying and nonlinear scatterers. The proposed formalism addresses both scalar and full vector electromagnetic fields. The second contribution of this dissertation is the application of the optical theorem to change detection with particular emphasis on random, complex, and active media, including single frequency probing fields and broadband probing fields. The first part of this work focuses on the generalization of the existing theoretical repertoire and interpretation of the scalar and electromagnetic optical theorem. Several fundamental generalizations of the optical theorem are developed. A new theory is developed for the optical theorem for scalar fields in nonhomogeneous media which can be bounded or unbounded. The bounded media context is essential for applications such as intrusion detection and surveillance in enclosed environments such as indoor facilities, caves, tunnels, as well as for nondestructive testing and communication systems based on wave-guiding structures. The developed scalar
Statistical analysis of complex systems with nonclassical invariant measures
Fratalocchi, Andrea
2011-01-01
I investigate the problem of finding a statistical description of a complex many-body system whose invariant measure cannot be constructed stemming from classical thermodynamics ensembles. By taking solitons as a reference system and by employing a
Generalized field quantization and statistics of elementary particles
International Nuclear Information System (INIS)
Govorkov, A.V.
1994-01-01
Generalized schemes for the quantization of free fields based on the deformed trilinear relations of Green are investigated. A theorem shows that continuous deformation is in fact impossible. In particular, it is shown that a "small" violation of the ordinary Fermi and Bose statistics is impossible both in the framework of local field theory, corresponding to parastatistics of finite orders, and in the framework of nonlocal field theory, corresponding to infinite statistics. The existence of antiparticles plays a decisive role in establishing the latter case. 23 refs
Statistical emission of complex fragments from highly excited compound nucleus
International Nuclear Information System (INIS)
Matsuse, T.
1991-01-01
A full statistical analysis has been given in terms of the Extended Hauser-Feshbach method. The charge and kinetic energy distributions of the ³⁵Cl + ¹²C reaction at E_lab = 180 and 200 MeV and the ²³Na + ²⁴Mg reaction at E_lab = 89 MeV, which form the ⁴⁷V compound nucleus, are investigated as a prototype of the light mass system. The measured kinetic energy distributions of the complex fragments are shown to be well reproduced by the Extended Hauser-Feshbach method, so the observed complex fragment production is understood as the statistical binary decay from the compound nucleus induced by the heavy-ion reaction. Next, this method is applied to the study of complex fragment production from the ¹¹¹In compound nucleus which is formed by the ⁸⁴Kr + ²⁷Al reaction at E_lab = 890 MeV. (K.A.) 18 refs., 10 figs
Pseudo-complex general relativity
Hess, Peter O; Greiner, Walter
2016-01-01
General Relativity is a beautiful and well tested theory of gravitation. Nevertheless, it implies conceptual problems such as the creation of singularities (black holes) as a result of the collapse of large masses, or the appearance of event horizons which exclude parts of space-time from the observation of external observers. This volume presents a pseudo-complex extension of General Relativity which addresses these issues and presents proposals for experimental examination in strong fields near a large mass. The mathematical and geometrical foundations of this extension are displayed in detail, and applications including orbits and accretion disks around large central masses, neutron stars and cosmological models are introduced. Calculations for both classical and extended applications are often executed in the form of problems with extensive solutions, which makes this volume also a valuable resource for any student of General Relativity.
Measuring streetscape complexity based on the statistics of local contrast and spatial frequency.
Directory of Open Access Journals (Sweden)
André Cavalcante
Full Text Available Streetscapes are basic urban elements which play a major role in the livability of a city. The visual complexity of streetscapes is known to influence how people behave in such built spaces. However, how and which characteristics of a visual scene influence our perception of complexity have yet to be fully understood. This study proposes a method to evaluate the complexity perceived in streetscapes based on the statistics of local contrast and spatial frequency. Here, 74 streetscape images from four cities, including daytime and nighttime scenes, were ranked for complexity by 40 participants. Image processing was then used to locally segment contrast and spatial frequency in the streetscapes. The statistics of these characteristics were extracted and later combined to form a single objective measure. The direct use of statistics revealed structural or morphological patterns in streetscapes related to the perception of complexity. Furthermore, in comparison to conventional measures of visual complexity, the proposed objective measure exhibits a higher correlation with the opinion of the participants. The performance of the method is also more robust across the different time-of-day scenarios.
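As a hedged illustration of the kind of pipeline the abstract describes (the patch size, the spectral-centroid summary, and the way the two statistics are combined are assumptions, not the authors' exact choices):

```python
import numpy as np

def streetscape_complexity(img, patch=16):
    """Toy complexity measure from local contrast and spatial frequency statistics."""
    h, w = (d - d % patch for d in img.shape)
    blocks = img[:h, :w].reshape(h // patch, patch, w // patch, patch).swapaxes(1, 2)
    local_contrast = blocks.std(axis=(2, 3))                 # RMS contrast per local patch

    spectrum = np.fft.fftshift(np.abs(np.fft.fft2(img)))
    fy, fx = np.indices(spectrum.shape)
    cy, cx = spectrum.shape[0] // 2, spectrum.shape[1] // 2
    radius = np.hypot(fy - cy, fx - cx)
    mean_freq = (radius * spectrum).sum() / spectrum.sum()   # spectral centroid

    return local_contrast.mean() * mean_freq                 # combine into one measure

print(streetscape_complexity(np.random.rand(256, 256)))      # grayscale image in [0, 1]
```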
Impulsive generalized function synchronization of complex dynamical networks
International Nuclear Information System (INIS)
Zhang, Qunjiao; Chen, Juan; Wan, Li
2013-01-01
This Letter investigates generalized function synchronization of continuous and discrete complex networks by impulsive control. By constructing suitable corresponding impulsively controlled response networks, some criteria and corollaries are derived for generalized function synchronization between the impulsively controlled complex networks; both continuous and discrete networks are covered. Furthermore, generalized linear synchronization and generalized nonlinear synchronization are each illustrated by several examples. All the numerical simulations demonstrate the correctness of the theoretical results.
Operation Statistics of the CERN Accelerators Complex for 2003
CERN. Geneva; Baird, S A; Rey, A; Steerenberg, R; CERN. Geneva. AB Department
2004-01-01
This report gives an overview of the performance of the different accelerators (Linacs, PS Booster, PS, AD and SPS) of the CERN Accelerator Complex for 2003. It includes scheduled activities, beam availabilities, beam intensities and an analysis of faults and breakdowns by system and by beam. MORE INFORMATION by using the OP Statistics Tool: http://eLogbook.web.cern.ch/eLogbook/statistics.php and on the SPS HomePage: http://ab-div-op-sps.web.cern.ch/ab-div-op-sps/SPSss.html
Equilibrium statistical mechanics
Jackson, E Atlee
2000-01-01
Ideal as an elementary introduction to equilibrium statistical mechanics, this volume covers both classical and quantum methodology for open and closed systems. Introductory chapters familiarize readers with probability and microscopic models of systems, while additional chapters describe the general derivation of the fundamental statistical mechanics relationships. The final chapter contains 16 sections, each dealing with a different application, ordered according to complexity, from classical through degenerate quantum statistical mechanics. Key features include an elementary introduction t
On the Limit Distribution of Lower Extreme Generalized Order Statistics
Indian Academy of Sciences (India)
In a wide subclass of generalized order statistics (gOs), which contains most of the known and important models of ordered random variables, weak convergence of lower extremes is developed. A recent result of extreme value theory of m-gOs (as well as the classical extreme value theory of ordinary order statistics) ...
Foundations of Complex Systems Nonlinear Dynamics, Statistical Physics, and Prediction
Nicolis, Gregoire
2007-01-01
Complexity is emerging as a post-Newtonian paradigm for approaching a large body of phenomena of concern at the crossroads of physical, engineering, environmental, life and human sciences from a unifying point of view. This book outlines the foundations of modern complexity research as it arose from the cross-fertilization of ideas and tools from nonlinear science, statistical physics and numerical simulation. It is shown how these developments lead to an understanding, both qualitative and quantitative, of the complex systems encountered in nature and in everyday experience and, conversely, h
Domain general constraints on statistical learning.
Thiessen, Erik D
2011-01-01
All theories of language development suggest that learning is constrained. However, theories differ on whether these constraints arise from language-specific processes or have domain-general origins such as the characteristics of human perception and information processing. The current experiments explored constraints on statistical learning of patterns, such as the phonotactic patterns of an infant's native language. Infants in these experiments were presented with a visual analog of a phonotactic learning task used by J. R. Saffran and E. D. Thiessen (2003). Saffran and Thiessen found that infants' phonotactic learning was constrained such that some patterns were learned more easily than other patterns. The current results indicate that infants' learning of visual patterns shows the same constraints as infants' learning of phonotactic patterns. This is consistent with theories suggesting that constraints arise from domain-general sources and, as such, should operate over many kinds of stimuli in addition to linguistic stimuli. © 2011 The Author. Child Development © 2011 Society for Research in Child Development, Inc.
Spinor formalism and complex-vector formalism of general relativity
International Nuclear Information System (INIS)
Han-ying, G.; Yong-shi, W.; Gendao, L.
1974-01-01
In this paper, using E. Cartan's exterior calculus, we give the spinor form of the structure equations, which leads naturally to the Newman-Penrose equations. Furthermore, starting from the spinor spaces and the sl(2,C) algebra, we construct the general complex-vector formalism of general relativity. We find that both the Cahen-Debever-Defrise complex-vector formalism and that of Brans are its special cases. Thus, the spinor formalism and the complex-vector formalism of general relativity are unified on the basis of the unimodular group SL(2,C) and its Lie algebra.
Geometric Transitions, Topological Strings, and Generalized Complex Geometry
International Nuclear Information System (INIS)
Chuang, Wu-yen
2007-01-01
Mirror symmetry is one of the most beautiful symmetries in string theory. It helps us very effectively gain insights into non-perturbative worldsheet instanton effects. It was also shown that the study of mirror symmetry for Calabi-Yau flux compactification leads us to the territory of 'non-Kaehlerity'. In this thesis we demonstrate how to construct a new class of symplectic non-Kaehler and complex non-Kaehler string theory vacua via generalized geometric transitions. The class admits a mirror pairing by construction. From a variety of sources, including supergravity analysis and KK reduction on SU(3) structure manifolds, we conclude that string theory connects Calabi-Yau spaces to both complex non-Kaehler and symplectic non-Kaehler manifolds, and that the resulting manifolds lie in generalized complex geometry. We go on to study the topologically twisted models on a class of generalized complex geometry, bi-Hermitian geometry, which is the most general target space for (2, 2) worldsheet theory with non-trivial H flux turned on. We show that the usual Kaehler A and B models are generalized in a natural way. Since gauged supergravity is the low energy effective theory for compactifications on generalized geometries, we study the fate of flux-induced isometry gauging in N = 2 IIA and heterotic strings under non-perturbative instanton effects. Interestingly, we find protection mechanisms preventing corrections to the hyper moduli spaces. Besides generalized geometries, we also discuss the possibility of new NS-NS fluxes in a new doubled formalism.
International Nuclear Information System (INIS)
Onchi, T; Fujisawa, A; Sanpei, A; Himura, H; Masamune, S
2017-01-01
Permutation entropy and statistical complexity are measures for complex time series. The Bandt–Pompe methodology evaluates the probability distribution of ordinal patterns obtained by permutation; the method is robust and effective for quantifying the information content of time series data. Statistical complexity is the product of the Jensen–Shannon divergence and the permutation entropy. These physical parameters are introduced to analyse time series of emission and magnetic fluctuations in low-aspect-ratio reversed-field pinch (RFP) plasma. The observed time-series data aggregate in a region of the plane determined by entropy versus complexity, the so-called C–H plane. The C–H plane is a representation space used for distinguishing periodic, chaotic, stochastic and noisy processes in time series data. The characteristics of the emissions and magnetic fluctuations change under different RFP-plasma conditions. The statistical complexities of soft x-ray emissions and magnetic fluctuations depend on the relationships between the reversal and pinch parameters. (paper)
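Since the Bandt–Pompe construction is fully algorithmic, a compact sketch is easy to give. The embedding dimension d and the normalization of the Jensen–Shannon disequilibrium below follow common conventions in the C–H plane literature; treat them as assumptions rather than this paper's exact choices.

```python
import itertools
import numpy as np

def ordinal_probs(x, d=4, tau=1):
    """Bandt-Pompe probabilities of ordinal patterns (dimension d, delay tau)."""
    counts = {p: 0 for p in itertools.permutations(range(d))}
    for i in range(len(x) - (d - 1) * tau):
        counts[tuple(np.argsort(x[i:i + d * tau:tau]))] += 1
    p = np.array(list(counts.values()), dtype=float)
    return p / p.sum()

def shannon(p):
    p = p[p > 0]
    return -(p * np.log(p)).sum()

def entropy_complexity(x, d=4):
    """Point (H, C) on the C-H plane: normalized permutation entropy H and
    statistical complexity C = Q_J * H with Jensen-Shannon disequilibrium Q_J."""
    p = ordinal_probs(x, d)
    n = len(p)
    u = np.full(n, 1.0 / n)                      # uniform reference distribution
    H = shannon(p) / np.log(n)
    js = shannon(0.5 * (p + u)) - 0.5 * shannon(p) - 0.5 * shannon(u)
    js_max = -0.5 * ((n + 1) / n * np.log(n + 1) + np.log(n) - 2 * np.log(2 * n))
    return H, (js / js_max) * H

print(entropy_complexity(np.random.default_rng(0).standard_normal(10000)))
# white noise should land near (H, C) = (1, 0) on the C-H plane
```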
Evolutionary Statistical Procedures
Baragona, Roberto; Poli, Irene
2011-01-01
This proposed text appears to be a good introduction to evolutionary computation for use in applied statistics research. The authors draw from a vast base of knowledge about the current literature in both the design of evolutionary algorithms and statistical techniques. Modern statistical research is on the threshold of solving increasingly complex problems in high dimensions, and the generalization of its methodology to parameters whose estimators do not follow mathematically simple distributions is underway. Many of these challenges involve optimizing functions for which analytic solutions a
Generalized ensemble theory with non-extensive statistics
Shen, Ke-Ming; Zhang, Ben-Wei; Wang, En-Ke
2017-12-01
The non-extensive canonical ensemble theory is reconsidered with the method of Lagrange multipliers by maximizing the Tsallis entropy, with the constraint that the normalized term of Tsallis' q-average of physical quantities, the sum ∑_j p_j^q, is independent of the probability p_i for Tsallis parameter q. The self-referential problem in the deduced probability and thermal quantities in non-extensive statistics is thus avoided, and thermodynamical relationships are obtained in a consistent and natural way. We also extend the study to the non-extensive grand canonical ensemble theory and obtain the q-deformed Bose-Einstein distribution as well as the q-deformed Fermi-Dirac distribution. The theory is further applied to the generalized Planck law to demonstrate the distinct behaviors of the various generalized q-distribution functions discussed in the literature.
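For orientation, one common form of the q-exponential and the resulting q-deformed occupation numbers is sketched below; normalization conventions for q-averages differ across the literature, so this is an assumption-laden illustration rather than the paper's exact expressions.

```python
import numpy as np

def exp_q(x, q):
    """Tsallis q-exponential; reduces to exp(x) as q -> 1.
    Valid where 1 + (1 - q) * x > 0 (the usual Tsallis cutoff)."""
    if np.isclose(q, 1.0):
        return np.exp(x)
    return (1.0 + (1.0 - q) * x) ** (1.0 / (1.0 - q))

def occupation(eps, mu, T, q, kind="bose"):
    """q-deformed Bose-Einstein (kind='bose') or Fermi-Dirac (kind='fermi')
    mean occupation number, in one common convention."""
    sign = -1.0 if kind == "bose" else 1.0
    return 1.0 / (exp_q((eps - mu) / T, q) + sign)

eps = np.linspace(0.1, 5.0, 5)
print(occupation(eps, mu=0.0, T=1.0, q=1.05, kind="fermi"))
print(occupation(eps, mu=-0.5, T=1.0, q=1.05, kind="bose"))
```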
Statistical equilibrium and symplectic geometry in general relativity
International Nuclear Information System (INIS)
Iglesias, P.
1981-09-01
A geometrical construction is given of the statistical equilibrium states of a system of particles in a gravitational field in general relativity. By a method of localization of variables, expressions for thermodynamic quantities are given, and this description is shown to be compatible with a macroscopic model of a relativistic continuous medium for a given value of the free-energy function.
Statistical complexity is maximized in a small-world brain.
Directory of Open Access Journals (Sweden)
Teck Liang Tan
Full Text Available In this paper, we study a network of Izhikevich neurons to explore what it means for a brain to be at the edge of chaos. To do so, we first constructed the phase diagram of a single Izhikevich excitatory neuron, and identified a small region of the parameter space where we find a large number of phase boundaries to serve as our edge of chaos. We then couple the outputs of these neurons directly to the parameters of other neurons, so that the neuron dynamics can drive transitions from one phase to another on an artificial energy landscape. Finally, we measure the statistical complexity of the parameter time series, while the network is tuned from a regular network to a random network using the Watts-Strogatz rewiring algorithm. We find that the statistical complexity of the parameter dynamics is maximized when the neuron network is most small-world-like. Our results suggest that the small-world architecture of neuron connections in brains is not accidental, but may be related to the information processing that they do.
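The network-tuning step of such a study is straightforward to reproduce. The sketch below (using networkx, with illustrative sizes) sweeps the Watts-Strogatz rewiring probability and prints the two diagnostics whose trade-off defines the small-world regime; the neuron dynamics and the statistical-complexity measurement are omitted.

```python
import networkx as nx

# Sweep from a regular ring lattice (p=0) to a random network (p=1).  The
# small-world regime combines high clustering C with short path length L.
n, k = 200, 8
for p in [0.0, 0.01, 0.05, 0.1, 0.5, 1.0]:
    G = nx.connected_watts_strogatz_graph(n, k, p, tries=100, seed=42)
    L = nx.average_shortest_path_length(G)
    C = nx.average_clustering(G)
    print(f"p={p:4.2f}  L={L:5.2f}  C={C:5.3f}")
```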
Generalized statistical mechanics approaches to earthquakes and tectonics
Papadakis, Giorgos; Michas, Georgios
2016-01-01
Despite the extreme complexity that characterizes the mechanism of the earthquake generation process, simple empirical scaling relations apply to the collective properties of earthquakes and faults in a variety of tectonic environments and scales. The physical characterization of those properties and the scaling relations that describe them attract wide scientific interest and are incorporated in the probabilistic forecasting of seismicity on local, regional and planetary scales. Considerable progress has been made in the analysis of the statistical mechanics of earthquakes, which, based on the principle of entropy, can provide a physical rationale for the macroscopic properties frequently observed. The scale-invariant properties, the (multi) fractal structures and the long-range interactions that have been found to characterize fault and earthquake populations have recently led to the consideration of non-extensive statistical mechanics (NESM) as a consistent statistical mechanics framework for the description of seismicity. The consistency between NESM and observations has been demonstrated in a series of publications on seismicity, faulting, rock physics and other fields of geosciences. The aim of this review is to present in a concise manner the fundamental macroscopic properties of earthquakes and faulting and how these can be derived by using the notions of statistical mechanics and NESM, providing further insights into earthquake physics and fault growth processes. PMID:28119548
Directory of Open Access Journals (Sweden)
Shibing Wang
2016-02-01
Full Text Available This paper introduces a new memristor-based hyperchaotic complex Lü system (MHCLS) and investigates its adaptive complex generalized synchronization (ACGS). Firstly, the complex system is constructed based on a memristor-based hyperchaotic real Lü system, and its properties are analyzed theoretically. Secondly, its dynamical behaviors, including hyperchaos, chaos, transient phenomena, as well as periodic behaviors, are explored numerically by means of bifurcation diagrams, Lyapunov exponents, phase portraits, and time history diagrams. Thirdly, an adaptive controller and a parameter estimator are proposed to realize complex generalized synchronization and parameter identification of two identical MHCLSs with unknown parameters based on Lyapunov stability theory. Finally, the numerical simulation results of ACGS and its applications to secure communication are presented to verify the feasibility and effectiveness of the proposed method.
PREFACE: Counting Complexity: An international workshop on statistical mechanics and combinatorics
de Gier, Jan; Warnaar, Ole
2006-07-01
On 10-15 July 2005 the conference 'Counting Complexity: An international workshop on statistical mechanics and combinatorics' was held on Dunk Island, Queensland, Australia in celebration of Tony Guttmann's 60th birthday. Dunk Island provided the perfect setting for engaging in almost all of Tony's life-long passions: swimming, running, food, wine and, of course, plenty of mathematics and physics. The conference was attended by many of Tony's close scientific friends from all over the world, and most talks were presented by his past and present collaborators. This volume contains the proceedings of the meeting and consists of 24 refereed research papers in the fields of statistical mechanics, condensed matter physics and combinatorics. These papers provide an excellent illustration of the breadth and scope of Tony's work. The very first contribution, written by Stu Whittington, contains an overview of the many scientific achievements of Tony over the past 40 years in mathematics and physics. The organizing committee, consisting of Richard Brak, Aleks Owczarek, Jan de Gier, Emma Lockwood, Andrew Rechnitzer and Ole Warnaar, gratefully acknowledges the Australian Mathematical Society (AustMS), the Australian Mathematical Sciences Institute (AMSI), the ARC Centre of Excellence for Mathematics and Statistics of Complex Systems (MASCOS), the ARC Complex Open Systems Research Network (COSNet), the Institute of Physics (IOP) and the Department of Mathematics and Statistics of The University of Melbourne for financial support in organizing the conference. Tony, we hope that your future years in mathematics will be numerous. Count yourself lucky! Tony Guttmann
Gardenier, John S
2012-12-01
This paper recommends how authors of statistical studies can communicate to general audiences fully, clearly, and comfortably. The studies may use statistical methods to explore issues in science, engineering, and society or they may address issues in statistics specifically. In either case, readers without explicit statistical training should have no problem understanding the issues, the methods, or the results at a non-technical level. The arguments for those results should be clear, logical, and persuasive. This paper also provides advice for editors of general journals on selecting high quality statistical articles without the need for exceptional work or expense. Finally, readers are also advised to watch out for some common errors or misuses of statistics that can be detected without a technical statistical background.
A κ-generalized statistical mechanics approach to income analysis
Clementi, F.; Gallegati, M.; Kaniadakis, G.
2009-02-01
This paper proposes a statistical mechanics approach to the analysis of income distribution and inequality. A new distribution function, having its roots in the framework of κ-generalized statistics, is derived that is particularly suitable for describing the whole spectrum of incomes, from the low-middle income region up to the high income Pareto power-law regime. Analytical expressions for the shape, moments and some other basic statistical properties are given. Furthermore, several well-known econometric tools for measuring inequality, which all exist in a closed form, are considered. A method for parameter estimation is also discussed. The model is shown to fit remarkably well the data on personal income for the United States, and the analysis of inequality performed in terms of its parameters is revealed as very powerful.
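A small sketch of the building blocks, following the standard κ-exponential of the κ-generalized literature; the parameter values are illustrative, and the tail check simply confirms the Pareto exponent α/κ quoted in that literature.

```python
import numpy as np

def exp_k(x, kappa):
    """kappa-exponential; reduces to exp(x) as kappa -> 0."""
    if np.isclose(kappa, 0.0):
        return np.exp(x)
    return (np.sqrt(1.0 + kappa**2 * x**2) + kappa * x) ** (1.0 / kappa)

def survival(x, alpha, beta, kappa):
    """kappa-generalized complementary CDF: P(X > x) = exp_k(-(x/beta)^alpha)."""
    return exp_k(-((x / beta) ** alpha), kappa)

x = np.linspace(0.1, 50.0, 500)                  # income in arbitrary units
s = survival(x, alpha=2.0, beta=1.5, kappa=0.7)
slope = np.polyfit(np.log(x[-100:]), np.log(s[-100:]), 1)[0]
print(f"fitted tail exponent {slope:.2f} vs -alpha/kappa = {-2.0/0.7:.2f}")
```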
Rényi statistics for testing composite hypotheses in general exponential models
Czech Academy of Sciences Publication Activity Database
Morales, D.; Pardo, L.; Pardo, M. C.; Vajda, Igor
2004-01-01
Vol. 38, No. 2 (2004), pp. 133-147 ISSN 0233-1888 R&D Projects: GA ČR GA201/02/1391 Grant - others: BMF(ES) 2003-00892; BMF(ES) 2003-04820 Institutional research plan: CEZ:AV0Z1075907 Keywords: natural exponential models * Levy processes * generalized Wald statistics Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.323, year: 2004
Statistical mechanics of complex neural systems and high dimensional data
International Nuclear Information System (INIS)
Advani, Madhu; Lahiri, Subhaneil; Ganguli, Surya
2013-01-01
Recent experimental advances in neuroscience have opened new vistas into the immense complexity of neuronal networks. This proliferation of data challenges us on two parallel fronts. First, how can we form adequate theoretical frameworks for understanding how dynamical network processes cooperate across widely disparate spatiotemporal scales to solve important computational problems? Second, how can we extract meaningful models of neuronal systems from high dimensional datasets? To aid in these challenges, we give a pedagogical review of a collection of ideas and theoretical methods arising at the intersection of statistical physics, computer science and neurobiology. We introduce the interrelated replica and cavity methods, which originated in statistical physics as powerful ways to quantitatively analyze large highly heterogeneous systems of many interacting degrees of freedom. We also introduce the closely related notion of message passing in graphical models, which originated in computer science as a distributed algorithm capable of solving large inference and optimization problems involving many coupled variables. We then show how both the statistical physics and computer science perspectives can be applied in a wide diversity of contexts to problems arising in theoretical neuroscience and data analysis. Along the way we discuss spin glasses, learning theory, illusions of structure in noise, random matrices, dimensionality reduction and compressed sensing, all within the unified formalism of the replica method. Moreover, we review recent conceptual connections between message passing in graphical models, and neural computation and learning. Overall, these ideas illustrate how statistical physics and computer science might provide a lens through which we can uncover emergent computational functions buried deep within the dynamical complexities of neuronal networks. (paper)
Statistics without Tears: Complex Statistics with Simple Arithmetic
Smith, Brian
2011-01-01
One of the often overlooked aspects of modern statistics is the analysis of time series data. Modern introductory statistics courses tend to rush to probabilistic applications involving risk and confidence. Rarely does the first level course linger on such useful and fascinating topics as time series decomposition, with its practical applications…
Kittisuwan, Pichid
2015-03-01
The application of image processing in industry has shown remarkable success over the last decade, for example, in security and telecommunication systems. The denoising of natural images corrupted by Gaussian noise is a classical problem in image processing, and it is an indispensable processing step. This paper is concerned with dual-tree complex wavelet-based image denoising using Bayesian techniques. One of the cruxes of Bayesian image denoising algorithms is estimating the statistical parameters of the image. Here, we employ maximum a posteriori (MAP) estimation to calculate the local observed variance, using a generalized Gamma density as the prior for the local observed variance and a Laplacian or Gaussian distribution for the noisy wavelet coefficients. Our selection of the prior distribution is motivated by the efficient and flexible properties of the generalized Gamma density. The experimental results show that the proposed method yields good denoising results.
Murayama, Shogo; Kinugawa, Hikaru; Tokuda, Isao T.; Gotoda, Hiroshi
2018-02-01
We present an experimental study on the characterization of dynamic behavior of flow velocity field during thermoacoustic combustion oscillations in a turbulent confined combustor from the viewpoints of statistical complexity and complex-network theory, involving detection of a precursor of thermoacoustic combustion oscillations. The multiscale complexity-entropy causality plane clearly shows the possible presence of two dynamics, noisy periodic oscillations and noisy chaos, in the shear layer regions (1) between the outer recirculation region in the dump plate and a recirculation flow in the wake of the centerbody and (2) between the outer recirculation region in the dump plate and a vortex breakdown bubble away from the centerbody. The vertex strength in the turbulence network and the community structure of the vorticity field can identify the vortical interactions during thermoacoustic combustion oscillations. Sequential horizontal visibility graph motifs are useful for capturing a precursor of thermoacoustic combustion oscillations.
Complexity analysis based on generalized deviation for financial markets
Li, Chao; Shang, Pengjian
2018-03-01
In this paper, a new modified method, complexity analysis based on generalized deviation, is proposed as a measure to investigate the correlation between past price and future volatility for financial time series. In comparison with the earlier retarded volatility model, the new approach is both simple and computationally efficient. The method, based on the generalized deviation function, presents an exhaustive way of quantifying the rules of the financial market. Robustness of this method is verified by numerical experiments with both artificial and financial time series. Results show that the generalized deviation complexity analysis method not only identifies the volatility of financial time series, but also provides a comprehensive way of distinguishing the different characteristics of stock indices and individual stocks. Exponential functions can be used to fit the volatility curves and quantify the changes of complexity for stock market data. We then study the influence of the negative domain of the deviation coefficient and the differences between volatile and calm periods. After analyzing the experimental model, we find that the generalized deviation model has definite advantages in exploring the relationship between historical returns and future volatility.
Structural Behavioral Study on the General Aviation Network Based on Complex Network
Zhang, Liang; Lu, Na
2017-12-01
The general aviation system is an open and dissipative system with complex structures and behavioral features. This paper establishes the system model and network model for general aviation. We analyze integral and individual attributes by applying complex network theory and conclude that the general aviation network has influential enterprise factors and node relations. By applying degree distribution functions, we check whether the network has the small-world effect, the scale-free property and the network centrality property that a complex network should have, and prove that the general aviation network is a complex network. Therefore, we propose to advance the evolution of the general aviation industrial chain toward a collaborative innovation cluster of advanced-form industries by strengthening network multiplication effects, stimulating innovation performance and spanning structural holes.
Application of statistical physics approaches to complex organizations
Matia, Kaushik
The first part of this thesis studies two different kinds of financial markets, namely, the stock market and the commodity market. Stock price fluctuations display certain scale-free statistical features that are not unlike those found in strongly-interacting physical systems. The possibility that new insights can be gained using concepts and methods developed to understand scale-free physical phenomena has stimulated considerable research activity in the physics community. In the first part of this thesis a comparative study of stocks and commodities is performed in terms of probability density function and correlations of stock price fluctuations. It is found that the probability density of the stock price fluctuation has a power law functional form with an exponent 3, which is similar across different markets around the world. We present an autoregressive model to explain the origin of the power law functional form of the probability density function of the price fluctuation. The first part also presents the discovery of unique features of the Indian economy, which we find displays a scale-dependent probability density function. In the second part of this thesis we quantify the statistical properties of fluctuations of complex systems like business firms and world scientific publications. We analyze class size of these systems mentioned above where units agglomerate to form classes. We find that the width of the probability density function of growth rate decays with the class size as a power law with an exponent beta which is universal in the sense that beta is independent of the system studied. We also identify two other scaling exponents, gamma connecting the unit size to the class size and gamma connecting the number of units to the class size, where products are units and firms are classes. Finally we propose a generalized preferential attachment model to describe the class size distribution. This model is successful in explaining the growth rate and class
Directory of Open Access Journals (Sweden)
Jacobo Pardo-Seco
Full Text Available BACKGROUND: Mitochondrial DNA (mtDNA) variation (i.e. haplogroups) has been analyzed in regards to a number of multifactorial diseases. The statistical power of a case-control study determines the a priori probability of rejecting the null hypothesis of homogeneity between cases and controls. METHODS/PRINCIPAL FINDINGS: We critically review previous approaches to the estimation of statistical power based on the restricted scenario where the number of cases equals the number of controls, and propose a methodology that broadens procedures to more general situations. We developed statistical procedures that consider different disease scenarios, variable sample sizes in cases and controls, and variable numbers of haplogroups and effect sizes. The results indicate that the statistical power of a particular study can improve substantially by increasing the number of controls with respect to cases. In the opposite direction, the power decreases substantially when testing a growing number of haplogroups. We developed mitPower (http://bioinformatics.cesga.es/mitpower/), a web-based interface that implements the new statistical procedures and allows for the computation of the a priori statistical power in variable scenarios of case-control study designs, or e.g. the number of controls needed to reach fixed effect sizes. CONCLUSIONS/SIGNIFICANCE: The present study provides statistical procedures for the computation of statistical power in common as well as complex case-control study designs involving 2×k tables, with special application (but not exclusive) to mtDNA studies. In order to reach a wide range of researchers, we also provide a friendly web-based tool, mitPower, that can be used in both retrospective and prospective case-control disease studies.
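The power computation itself is easy to approximate by Monte Carlo. The sketch below (illustrative haplogroup frequencies and sample sizes, not mitPower's algorithm) estimates the power of the chi-square test of homogeneity for a 2×k haplogroup table with twice as many controls as cases.

```python
import numpy as np
from scipy.stats import chi2_contingency

def power_2xk(p_cases, p_controls, n_cases, n_controls,
              alpha=0.05, sims=2000, seed=1):
    """Monte Carlo power of the chi-square test for a 2 x k frequency table."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(sims):
        table = np.vstack([rng.multinomial(n_cases, p_cases),
                           rng.multinomial(n_controls, p_controls)])
        hits += chi2_contingency(table)[1] < alpha      # [1] is the p-value
    return hits / sims

# k = 4 haplogroups, one enriched among cases, 2 controls per case (assumed values)
p_controls = [0.40, 0.30, 0.20, 0.10]
p_cases = [0.30, 0.30, 0.25, 0.15]
print(power_2xk(p_cases, p_controls, n_cases=300, n_controls=600))
```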
General statistical data structure for epidemiologic studies of DOE workers
International Nuclear Information System (INIS)
Frome, E.L.; Hudson, D.R.
1981-01-01
Epidemiologic studies to evaluate the occupational risks associated with employment in the nuclear industry are currently being conducted by the Department of Energy. Data that have potential value in evaluating any long-term health effects of occupational exposure to low levels of radiation are obtained for each individual at a given facility. We propose a general data structure for statistical analysis that is used to define transformations from the data management system into the data analysis system. Statistical methods of interest in epidemiologic studies include contingency table analysis and survival analysis procedures that can be used to evaluate potential associations between occupational radiation exposure and mortality. The purposes of this paper are to discuss (1) the adequacy of this data structure for single- and multiple-facility analysis and (2) the statistical computing problems encountered in dealing with large populations over extended periods of time
International Nuclear Information System (INIS)
Xu Yuhua; Zhou Wuneng; Fang Jian'an; Lu Hongqian
2009-01-01
This Letter proposes an approach to identify the topological structure and unknown parameters for uncertain general complex networks simultaneously. By designing effective adaptive controllers, we achieve synchronization between two complex networks. The unknown network topological structure and system parameters of uncertain general complex dynamical networks are identified simultaneously in the process of synchronization. Several useful criteria for synchronization are given. Finally, an illustrative example is presented to demonstrate the application of the theoretical results.
Systems and complexity thinking in general practice: part 1 - clinical application.
Sturmberg, Joachim P
2007-03-01
Many problems encountered in general practice cannot be sufficiently explained within the Newtonian reductionist paradigm. Systems and complexity thinking - already widely adopted in most nonmedical disciplines - describes and explores the contextual nature of questions posed in medicine, and in general practice in particular. This article briefly describes the framework underpinning systems and complexity sciences. A case study illustrates how systems and complexity thinking can help to better understand the contextual nature of patient presentations, and how different approaches will lead to different outcomes.
STATISTICAL ANALYSIS OF RAW SUGAR MATERIAL FOR SUGAR PRODUCER COMPLEX
A. A. Gromkovskii; O. I. Sherstyuk
2015-01-01
Summary. The article examines statistical data on the development of the average weight and average sugar content of sugar beet roots. The successful solution of the problem of forecasting these raw-material indices is essential for solving control problems in the sugar-producing complex. By calculating the autocorrelation function, the paper demonstrates that a trend component predominates in the raw-material characteristics. To construct the prediction model, it is proposed to use an autoregressive fir...
Unifying Complexity and Information
Ke, Da-Guan
2013-04-01
Complex systems, arising in many contexts in the computer, life, social, and physical sciences, have not shared a generally-accepted complexity measure playing a fundamental role analogous to that of the Shannon entropy H in statistical mechanics. Superficially-conflicting criteria of complexity measurement, i.e. complexity-randomness (C-R) relations, have given rise to a special measure intrinsically adaptable to more than one criterion. However, the deep causes of the conflict and of this adaptability are not very clear. Here I trace the root of each representative or adaptable measure to its particular universal data-generating or -regenerating model (UDGM or UDRM). A representative measure for deterministic dynamical systems is found as a counterpart of the H for random processes, clearly redefining the boundary of different criteria. And a specific UDRM achieving the intrinsic adaptability enables a general information measure that ultimately resolves all major disputes. This work encourages a single framework covering deterministic systems, statistical mechanics and real-world living organisms.
A generalized statistical model for the size distribution of wealth
International Nuclear Information System (INIS)
Clementi, F; Gallegati, M; Kaniadakis, G
2012-01-01
In a recent paper in this journal (Clementi et al 2009 J. Stat. Mech. P02037), we proposed a new, physically motivated, distribution function for modeling individual incomes, having its roots in the framework of the κ-generalized statistical mechanics. The performance of the κ-generalized distribution was checked against real data on personal income for the United States in 2003. In this paper we extend our previous model so as to be able to account for the distribution of wealth. Probabilistic functions and inequality measures of this generalized model for wealth distribution are obtained in closed form. In order to check the validity of the proposed model, we analyze the US household wealth distributions from 1984 to 2009 and conclude an excellent agreement with the data that is superior to any other model already known in the literature. (paper)
Spectral statistics and scattering resonances of complex primes arrays
Wang, Ren; Pinheiro, Felipe A.; Dal Negro, Luca
2018-01-01
We introduce a class of aperiodic arrays of electric dipoles generated from the distribution of prime numbers in complex quadratic fields (Eisenstein and Gaussian primes) as well as quaternion primes (Hurwitz and Lifschitz primes), and study the nature of their scattering resonances using the vectorial Green's matrix method. In these systems we demonstrate several distinctive spectral properties, such as the absence of level repulsion in the strongly scattering regime, critical statistics of level spacings, and the existence of critical modes, which are extended fractal modes with long lifetimes not supported by either random or periodic systems. Moreover, we show that one can predict important physical properties, such as the existence of spectral gaps, by analyzing the eigenvalue distribution of the Green's matrix of the arrays in the complex plane. Our results unveil the importance of aperiodic correlations in prime number arrays for the engineering of gapped photonic media that support far richer mode localization and spectral properties compared to usual periodic and random media.
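A scalar toy version of the construction is compact enough to sketch: generate Gaussian primes in a window, build the free-space Green's matrix of the array, and inspect its complex eigenvalues. The paper's vectorial (dipolar) Green's matrix, and the choices of window size and wavenumber below, are simplifications and assumptions.

```python
import numpy as np
from sympy import isprime

# Gaussian primes a+bi in a square window: off the axes the norm a^2 + b^2
# must be a rational prime; on the axes |a| must be a prime congruent to 3 mod 4.
pts = []
N = 15
for a in range(-N, N + 1):
    for b in range(-N, N + 1):
        if a and b:
            if isprime(a * a + b * b):
                pts.append((a, b))
        elif (a or b) and abs(a or b) % 4 == 3 and isprime(abs(a or b)):
            pts.append((a, b))
pts = np.asarray(pts, dtype=float)

# Scalar Green's matrix at wavenumber k; its complex eigenvalues encode the
# detunings and decay rates of the scattering resonances of the array.
k = 1.0
r = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
np.fill_diagonal(r, 1.0)                 # placeholder to avoid division by zero
G = np.exp(1j * k * r) / (k * r)
np.fill_diagonal(G, 0.0)                 # no self-interaction term
ev = np.linalg.eigvals(G)
print(f"{len(pts)} scatterers; eigenvalue imaginary parts span "
      f"[{ev.imag.min():.2f}, {ev.imag.max():.2f}]")
```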
Tayurskii, Dmitrii; Abe, Sumiyoshi; Alexandre Wang, Q.
2012-11-01
The 3rd International Workshop on Statistical Physics and Mathematics for Complex Systems (SPMCS2012) was held between 25-30 August at Kazan (Volga Region) Federal University, Kazan, Russian Federation. This workshop was jointly organized by Kazan Federal University and Institut Supérieur des Matériaux et Mécaniques Avancées (ISMANS), France. The series of SPMCS workshops was created in 2008 with the aim to be an interdisciplinary incubator for the worldwide exchange of innovative ideas and information about the latest results. The first workshop was held at ISMANS, Le Mans (France) in 2008, and the third at Huazhong Normal University, Wuhan (China) in 2010. At SPMCS2012, we wished to bring together a broad community of researchers from the different branches of the rapidly developing complexity science to discuss the fundamental theoretical challenges (geometry/topology, number theory, statistical physics, dynamical systems, etc) as well as experimental and applied aspects of many practical problems (condensed matter, disordered systems, financial markets, chemistry, biology, geoscience, etc). The program of SPMCS2012 was prepared based on three categories: (i) physical and mathematical studies (quantum mechanics, generalized nonequilibrium thermodynamics, nonlinear dynamics, condensed matter physics, nanoscience); (ii) natural complex systems (physical, geophysical, chemical and biological); (iii) social, economical, political agent systems and man-made complex systems. The conference attracted 64 participants from 10 countries. There were 10 invited lectures, 12 invited talks and 28 regular oral talks in the morning and afternoon sessions. The book of Abstracts is available from the conference website (http://www.ksu.ru/conf/spmcs2012/?id=3). A round table was also held, the topic of which was 'Recent and Anticipated Future Progress in Science of Complexity', discussing a variety of questions and opinions important for the understanding of the concept of
Canary, Jana D; Blizzard, Leigh; Barry, Ronald P; Hosmer, David W; Quinn, Stephen J
2016-05-01
Generalized linear models (GLM) with a canonical logit link function are the primary modeling technique used to relate a binary outcome to predictor variables. However, noncanonical links can offer more flexibility, producing convenient analytical quantities (e.g., probit GLMs in toxicology) and desired measures of effect (e.g., relative risk from log GLMs). Many summary goodness-of-fit (GOF) statistics exist for logistic GLM. Their properties make the development of GOF statistics relatively straightforward, but it can be more difficult under noncanonical links. Although GOF tests for logistic GLM with continuous covariates (GLMCC) have been applied to GLMCCs with log links, we know of no GOF tests in the literature specifically developed for GLMCCs that can be applied regardless of the link function chosen. We generalize the Tsiatis GOF statistic (TG), originally developed for logistic GLMCCs, so that it can be applied under any link function. Further, we show that the algebraically related Hosmer-Lemeshow (HL) and Pigeon-Heyse (J²) statistics can be applied directly. In a simulation study, TG, HL, and J² were used to evaluate the fit of probit, log-log, complementary log-log, and log models, all calculated with a common grouping method. The TG statistic consistently maintained Type I error rates, while those of HL and J² were often lower than expected if terms with little influence were included. Generally, the statistics had similar power to detect an incorrect model. An exception occurred when a log GLMCC was incorrectly fit to data generated from a logistic GLMCC. In this case, TG had more power than HL or J². © 2015 John Wiley & Sons Ltd/London School of Economics.
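For readers who want the flavor of the HL statistic under a noncanonical link, here is a minimal sketch with decile-of-risk grouping, applied to a log-link (relative-risk) binomial GLM fitted with statsmodels; the TG and J² statistics and the paper's common grouping method are not reproduced. The simulated data and group count are assumptions.

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import chi2

def hosmer_lemeshow(y, p_hat, groups=10):
    """HL statistic with decile-of-risk grouping; reference chi2 on groups-2 df."""
    order = np.argsort(p_hat)
    y, p_hat = np.asarray(y)[order], np.asarray(p_hat)[order]
    stat = 0.0
    for g in np.array_split(np.arange(len(y)), groups):
        n_g, obs, exp = len(g), y[g].sum(), p_hat[g].sum()
        pbar = exp / n_g
        stat += (obs - exp) ** 2 / (n_g * pbar * (1.0 - pbar))
    return stat, chi2.sf(stat, groups - 2)

rng = np.random.default_rng(3)
x = rng.uniform(0.1, 1.0, 500)
y = rng.binomial(1, np.exp(-1.5 + 0.8 * x))            # true log (relative-risk) model
fit = sm.GLM(y, sm.add_constant(x),
             family=sm.families.Binomial(link=sm.families.links.Log())).fit()
print(hosmer_lemeshow(y, fit.fittedvalues))            # should show no lack of fit
```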
Robustness of pinning a general complex dynamical network
International Nuclear Information System (INIS)
Wang Lei; Sun Youxian
2010-01-01
This Letter studies the robustness problem of pinning a general complex dynamical network toward an assigned synchronous evolution. Several synchronization criteria are presented to guarantee the convergence of the pinning process, locally and globally, by construction of Lyapunov functions. In particular, if a pinning strategy has been designed for synchronization of a given complex dynamical network, then no matter what uncertainties occur among the pinned nodes, synchronization can still be guaranteed through the pinning. The analytical results show that pinning control has a certain robustness against perturbations of the network architecture: adding, deleting and changing the weights of edges. Numerical simulations on scale-free complex networks verify the theoretical results obtained above.
Capturing rogue waves by multi-point statistics
International Nuclear Information System (INIS)
Hadjihosseini, A; Wächter, Matthias; Peinke, J; Hoffmann, N P
2016-01-01
As an example of a complex system with extreme events, we investigate ocean wave states exhibiting rogue waves. We present a statistical method of data analysis based on multi-point statistics which for the first time allows extreme rogue wave events to be captured in a statistically satisfactory manner. The key to the success of the approach is mapping the complexity of multi-point data onto the statistics of hierarchically ordered height increments for different time scales, for which we can show that a stochastic cascade process with Markov properties is governed by a Fokker–Planck equation. Conditional probabilities as well as the Fokker–Planck equation itself can be estimated directly from the available observational data. With this stochastic description, surrogate data sets can in turn be generated, which makes it possible to work out arbitrary statistical features of the complex sea state in general, and extreme rogue wave events in particular. The results also open up new perspectives for forecasting the occurrence probability of extreme rogue wave events, and even for forecasting the occurrence of individual rogue waves based on precursory dynamics. (paper)
Generalized Hamiltonians, functional integration and statistics of continuous fluids and plasmas
International Nuclear Information System (INIS)
Tasso, H.
1985-05-01
Generalized Hamiltonian formalism including generalized Poisson brackets and Lie-Poisson brackets is presented in Section II. Gyroviscous magnetohydrodynamics is treated as a relevant example in Euler and Clebsch variables. Section III is devoted to a short review of functional integration containing the definition and a discussion of ambiguities and methods of evaluation. The main part of the contribution is given in Section IV, where some of the content of the previous sections is applied to Gibbs statistics of continuous fluids and plasmas. In particular, exact fluctuation spectra are calculated for relevant equations in fluids and plasmas. (orig.)
Two statistical mechanics aspects of complex networks
Thurner, Stefan; Biely, Christoly
2006-12-01
By adopting an ensemble interpretation of non-growing rewiring networks, network theory can be reduced to a counting problem of possible network states and an identification of their associated probabilities. We present two scenarios of how different rewiring schemes can be used to control the state probabilities of the system. In particular, we review how, by generalizing the linking rules of random graphs in combination with superstatistics and quantum mechanical concepts, one can establish an exact relation between the degree distribution of any given network and the nodes' linking probability distributions. In a second approach, we control state probabilities by a network Hamiltonian, whose characteristics are motivated by biological and socio-economical statistical systems. We demonstrate that a thermodynamics of networks becomes a fully consistent concept, allowing one to study, e.g., 'phase transitions' and to compute entropies through thermodynamic relations.
Quantum mechanics as a natural generalization of classical statistical mechanics
International Nuclear Information System (INIS)
Xu Laizi; Qian Shangwu
1994-01-01
By comparison between the equations of motion of geometrical optics (GO) and those of classical statistical mechanics (CSM), it is found that there should be an analogy between GO and CSM instead of GO and classical mechanics (CM). Furthermore, by comparison between the classical limit (CL) of quantum mechanics (QM) and CSM, the authors find that the CL of QM is CSM, not CM; hence they demonstrate that QM is a natural generalization of CSM rather than CM.
Generalized Majority Logic Criterion to Analyze the Statistical Strength of S-Boxes
Hussain, Iqtadar; Shah, Tariq; Gondal, Muhammad Asif; Mahmood, Hasan
2012-05-01
The majority logic criterion is applicable in the evaluation process of substitution boxes used in the advanced encryption standard (AES). The performance of modified or advanced substitution boxes is predicted by processing the results of statistical analyses with the majority logic criterion. In this paper, we use the majority logic criterion to analyze some popular and prevailing substitution boxes used in encryption processes. In particular, the majority logic criterion is applied to AES, affine power affine (APA), Gray, Lui J, residue prime, S8 AES, Skipjack, and Xyi substitution boxes. The majority logic criterion is further extended into a generalized majority logic criterion which has a broader spectrum for analyzing the effectiveness of substitution boxes in image encryption applications. The integral components of the statistical analyses used for the generalized majority logic criterion are derived from the results of entropy analysis, contrast analysis, correlation analysis, homogeneity analysis, energy analysis, and mean of absolute deviation (MAD) analysis.
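The component analyses are standard image statistics, so a sketch is easy to assemble with scikit-image's gray-level co-occurrence utilities (graycomatrix/graycoprops, assuming scikit-image >= 0.19). The voting rule at the end is a plain-majority assumption for illustration, not the paper's exact decision procedure.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def mlc_statistics(img):
    """Component statistics aggregated by the majority logic criterion."""
    glcm = graycomatrix(img, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    stats = {p: graycoprops(glcm, p)[0, 0]
             for p in ("contrast", "correlation", "homogeneity", "energy")}
    hist = np.bincount(img.ravel(), minlength=256) / img.size
    hist = hist[hist > 0]
    stats["entropy"] = -(hist * np.log2(hist)).sum()
    stats["mad"] = np.abs(img - img.mean()).mean()      # mean absolute deviation
    return stats

def majority_vote(plain, encrypted):
    """A good S-box should raise entropy/contrast and lower correlation,
    energy and homogeneity; pass if most component analyses agree."""
    p, e = mlc_statistics(plain), mlc_statistics(encrypted)
    votes = [e["entropy"] > p["entropy"], e["contrast"] > p["contrast"],
             e["correlation"] < p["correlation"], e["energy"] < p["energy"],
             e["homogeneity"] < p["homogeneity"]]
    return sum(votes) > len(votes) // 2

plain = np.tile(np.arange(256, dtype=np.uint8), (256, 1))          # structured image
cipher = np.random.default_rng(7).integers(0, 256, (256, 256), dtype=np.uint8)
print(majority_vote(plain, cipher))                                # expect True
```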
Common Fixed Points of Generalized Cocyclic Mappings in Complex Valued Metric Spaces
Directory of Open Access Journals (Sweden)
Mujahid Abbas
2015-01-01
Full Text Available We present fixed point results of mappings satisfying generalized contractive conditions in complex valued metric spaces. As an application, we obtain a common fixed point of a pair of weakly compatible mappings. Some common fixed point results of generalized contractive-type mappings involved in cocyclic representation of a nonempty subset of a complex valued metric space are also obtained. Some examples are also presented to support the results proved herein. These results extend and generalize many results in the existing literature.
Biparametric complexities and generalized Planck radiation law
Puertas-Centeno, David; Toranzo, I. V.; Dehesa, J. S.
2017-12-01
Complexity theory embodies some of the hardest, most fundamental and most challenging open problems in modern science. The very term complexity is elusive, so the main goal of this theory is to find meaningful quantifiers for it. In fact, we need various measures to take into account the multiple facets of this term. Here, some biparametric Cramér-Rao and Heisenberg-Rényi measures of complexity of continuous probability distributions are defined and discussed. Then, they are applied to blackbody radiation at temperature T in a d-dimensional universe. It is found that these dimensionless quantities depend neither on T nor on any physical constants. So, they have a universal character in the sense that they only depend on the spatial dimensionality. To determine these complexity quantifiers, we have calculated their dispersion (typical deviations) and entropy (Rényi entropies and the generalized Fisher information) constituents. They are found to have a temperature-dependent behavior similar to the celebrated Wien displacement law of the dominant frequency ν_max at which the spectrum reaches its maximum. Moreover, they allow us to gain insight into new aspects of the d-dimensional blackbody spectrum and into the quantification of quantum effects associated with space dimensionality.
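The Wien-type behavior mentioned here is easy to check numerically. Assuming the d-dimensional spectral density is proportional to x^d/(e^x - 1) with x = hν/kT, the peak frequency solves d(1 - e^(-x)) = x; a minimal sketch:

```python
import numpy as np
from scipy.optimize import brentq

# Peak of x**d / (exp(x) - 1), x = h*nu/(k*T): setting the derivative to
# zero gives the Wien-type condition d * (1 - exp(-x)) = x.
for d in (2, 3, 4, 5):
    x_max = brentq(lambda x: d * (1.0 - np.exp(-x)) - x, 1e-6, 50.0)
    print(f"d = {d}: h*nu_max / (k*T) = {x_max:.4f}")
# d = 3 recovers the familiar x_max ~ 2.8214 of Wien's displacement law.
```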
Online Statistical Modeling (Regression Analysis) for Independent Responses
Made Tirta, I.; Anggraeni, Dian; Pandutama, Martinus
2017-06-01
Regression analysis (statistical modelling) is among the statistical methods most frequently needed in analyzing quantitative data, especially for modelling the relationship between response and explanatory variables. Statistical models have been developed in various directions to handle various types of data and complex relationships among them. Rich varieties of advanced and recent statistical models are mostly available in open source software (one of them being R). However, these advanced statistical models are not very friendly to novice R users, since they are based on programming scripts or a command line interface. Our research aims to develop a web interface (based on R and Shiny) so that the most recent and advanced statistical models are readily available, accessible and applicable on the web. We have previously built interfaces in the form of e-tutorials for several modern and advanced statistical models in R, especially for independent responses (including linear models/LM, generalized linear models/GLM, generalized additive models/GAM and generalized additive models for location, scale and shape/GAMLSS). In this research we unified them in the form of a data analysis workflow, including models using computer-intensive statistics (bootstrap and Markov chain Monte Carlo/MCMC). All are readily accessible in our online Virtual Statistics Laboratory. The web interface makes statistical modelling easier to apply and makes it easier to compare models in order to find the most appropriate one for the data.
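The record's tools are R/Shiny-based; as a language-neutral illustration of the kind of model the interface exposes, here is a minimal GLM fit in Python's statsmodels, on simulated data with illustrative parameters:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)

# Simulated count data: Poisson response with log link, one predictor.
x = rng.uniform(0, 2, size=200)
y = rng.poisson(np.exp(0.3 + 0.8 * x))

X = sm.add_constant(x)                   # design matrix with an intercept
fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
print(fit.params)                        # approximately (0.3, 0.8)
```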
Sturmberg, Joachim P; Martin, Carmel M; Katerndahl, David A
2014-01-01
Over the past 7 decades, theories in the systems and complexity sciences have had a major influence on academic thinking and research. We assessed the impact of complexity science on general practice/family medicine. We performed a historical integrative review using the following systematic search strategy: medical subject heading [humans] combined in turn with the terms complex adaptive systems, nonlinear dynamics, systems biology, and systems theory, limited to general practice/family medicine and published before December 2010. A total of 16,242 articles were retrieved, of which 49 were published in general practice/family medicine journals. Hand searches and snowballing retrieved another 35. After a full-text review, we included 56 articles dealing specifically with systems sciences and general/family practice. General practice/family medicine engaged with the emerging systems and complexity theories in 4 stages. Before 1995, articles tended to explore common phenomenologic general practice/family medicine experiences. Between 1995 and 2000, articles described the complex adaptive nature of this discipline. Those published between 2000 and 2005 focused on describing the system dynamics of medical practice. After 2005, articles increasingly applied the breadth of complex science theories to health care, health care reform, and the future of medicine. This historical review describes the development of general practice/family medicine in relation to complex adaptive systems theories, and shows how systems sciences more accurately reflect the discipline's philosophy and identity. Analysis suggests that general practice/family medicine first embraced systems theories through conscious reorganization of its boundaries and scope, before applying empirical tools. Future research should concentrate on applying nonlinear dynamics and empirical modeling to patient care, and to organizing and developing local practices, engaging in community development, and influencing
Kruger, Uwe
2012-01-01
The development and application of multivariate statistical techniques in process monitoring has gained substantial interest over the past two decades in academia and industry alike. Initially developed for monitoring and fault diagnosis in complex systems, such techniques have been refined and applied in various engineering areas, for example mechanical and manufacturing, chemical, electrical and electronic, and power engineering. The recipe for the tremendous interest in multivariate statistical techniques lies in their simplicity and adaptability for developing monitoring applications.
Care complexity in the general hospital - Results from a European study
de Jonge, P; Huyse, FJ; Slaets, JPJ; Herzog, T; Lobo, A; Lyons, JS; Opmeer, BC; Stein, B; Arolt; Balogh, N; Cardoso, G; Fink, P; Rigatelli, M; van Dijck, R; Mellenbergh, GJ
2001-01-01
There is increasing pressure to effectively treat patients with complex care needs from the moment of admission to the general hospital. In this study, the authors developed a measurement strategy for hospital-based care complexity. The authors' four-factor model describes the interrelations between
Outer synchronization between two different fractional-order general complex dynamical networks
International Nuclear Information System (INIS)
Xiang-Jun, Wu; Hong-Tao, Lu
2010-01-01
Outer synchronization between two different fractional-order general complex dynamical networks is investigated in this paper. Based on the stability theory of fractional-order systems, sufficient criteria for outer synchronization are derived analytically by applying nonlinear control and bidirectional coupling methods. The proposed synchronization method is applicable to almost all kinds of coupled fractional-order general complex dynamical networks. Neither a symmetric nor an irreducible coupling configuration matrix is required. In addition, no constraint is imposed on the inner-coupling matrix. Numerical examples are provided to demonstrate the validity of the presented synchronization scheme. Numerical evidence shows that both the feedback strength k and the fractional order α can be chosen appropriately to tune the synchronization effect. (general)
International Nuclear Information System (INIS)
Sewell, G.L.
1986-01-01
The author shows how the basic axioms of quantum field theory, general relativity and statistical thermodynamics lead, in a model-independent way, to a generalized Hawking-Unruh effect, whereby the gravitational fields carried by a class of space-time manifolds with event horizons thermalize ambient quantum fields. The author is concerned with a quantum field on a space-time X containing a submanifold X' bounded by event horizons. The objective is to show that, for a wide class of space-times, the global vacuum state of the field reduces, in X', to a thermal state whose temperature depends on the geometry. The statistical thermodynamical, geometrical, and quantum field theoretical ingredients essential for the reduction of the vacuum state are discussed.
On the general procedure for modelling complex ecological systems
International Nuclear Information System (INIS)
He Shanyu.
1987-12-01
In this paper, the principle of a general procedure for modelling complex ecological systems, i.e. the Adaptive Superposition Procedure (ASP), is briefly stated. The result of applying ASP in a national project for ecological regionalization is also described. (author). 3 refs
Learning with Generalization Capability by Kernel Methods of Bounded Complexity
Czech Academy of Sciences Publication Activity Database
Kůrková, Věra; Sanguineti, M.
2005-01-01
Roč. 21, č. 3 (2005), s. 350-367 ISSN 0885-064X R&D Projects: GA AV ČR 1ET100300419 Institutional research plan: CEZ:AV0Z10300504 Keywords : supervised learning * generalization * model complexity * kernel methods * minimization of regularized empirical errors * upper bounds on rates of approximate optimization Subject RIV: BA - General Mathematics Impact factor: 1.186, year: 2005
Complexity control in statistical learning
Indian Academy of Sciences (India)
complexity of the class of models from which we are to choose our model. In this ... As is explained in §2, we use the concept of covering numbers to quantify the complexity of a class of ..... called structural risk minimization (SRM). Vapnik ...
Zikou, Anastasia K; Xydis, Vasileios G; Astrakas, Loukas G; Nakou, Iliada; Tzarouchi, Loukia C; Tzoufi, Meropi; Argyropoulou, Maria I
2016-07-01
There is evidence of microstructural changes in normal-appearing white matter of patients with tuberous sclerosis complex. To evaluate major white matter tracts in children with tuberous sclerosis complex using tract-based spatial statistics diffusion tensor imaging (DTI) analysis. Eight children (mean age ± standard deviation: 8.5 ± 5.5 years) with an established diagnosis of tuberous sclerosis complex and 8 age-matched controls were studied. The imaging protocol consisted of T1-weighted high-resolution 3-D spoiled gradient-echo sequence and a spin-echo, echo-planar diffusion-weighted sequence. Differences in the diffusion indices were evaluated using tract-based spatial statistics. Tract-based spatial statistics showed increased axial diffusivity in the children with tuberous sclerosis complex in the superior and anterior corona radiata, the superior longitudinal fascicle, the inferior fronto-occipital fascicle, the uncinate fascicle and the anterior thalamic radiation. No significant differences were observed in fractional anisotropy, mean diffusivity and radial diffusivity between patients and control subjects. No difference was found in the diffusion indices between the baseline and follow-up examination in the patient group. Patients with tuberous sclerosis complex have increased axial diffusivity in major white matter tracts, probably related to reduced axonal integrity.
International Nuclear Information System (INIS)
Zikou, Anastasia K.; Xydis, Vasileios G.; Tzarouchi, Loukia C.; Argyropoulou, Maria I.; Astrakas, Loukas G.; Nakou, Iliada; Tzoufi, Meropi
2016-01-01
There is evidence of microstructural changes in normal-appearing white matter of patients with tuberous sclerosis complex. To evaluate major white matter tracts in children with tuberous sclerosis complex using tract-based spatial statistics diffusion tensor imaging (DTI) analysis. Eight children (mean age ± standard deviation: 8.5 ± 5.5 years) with an established diagnosis of tuberous sclerosis complex and 8 age-matched controls were studied. The imaging protocol consisted of T1-weighted high-resolution 3-D spoiled gradient-echo sequence and a spin-echo, echo-planar diffusion-weighted sequence. Differences in the diffusion indices were evaluated using tract-based spatial statistics. Tract-based spatial statistics showed increased axial diffusivity in the children with tuberous sclerosis complex in the superior and anterior corona radiata, the superior longitudinal fascicle, the inferior fronto-occipital fascicle, the uncinate fascicle and the anterior thalamic radiation. No significant differences were observed in fractional anisotropy, mean diffusivity and radial diffusivity between patients and control subjects. No difference was found in the diffusion indices between the baseline and follow-up examination in the patient group. Patients with tuberous sclerosis complex have increased axial diffusivity in major white matter tracts, probably related to reduced axonal integrity. (orig.)
Autonomous Modeling, Statistical Complexity and Semi-annealed Treatment of Boolean Networks
Gong, Xinwei
This dissertation presents three studies on Boolean networks. Boolean networks are a class of mathematical systems consisting of interacting elements with binary state variables. Each element is a node with a Boolean logic gate, and the presence of interactions between any two nodes is represented by directed links. Boolean networks that implement the logic structures of real systems are studied as coarse-grained models of the real systems. Large random Boolean networks are studied with mean field approximations and used to provide a baseline of possible behaviors of large real systems. This dissertation presents one study of the former type, concerning the stable oscillation of a yeast cell-cycle oscillator, and two studies of the latter type, respectively concerning the statistical complexity of large random Boolean networks and an extension of traditional mean field techniques that accounts for the presence of short loops. In the cell-cycle oscillator study, a novel autonomous update scheme is introduced to study the stability of oscillations in small networks. A motif that corrects pulse-growing perturbations and a motif that grows pulses are identified. A combination of the two motifs is capable of sustaining stable oscillations. Examining a Boolean model of the yeast cell-cycle oscillator using an autonomous update scheme yields evidence that it is endowed with such a combination. Random Boolean networks are classified as ordered, critical or disordered based on their response to small perturbations. In the second study, random Boolean networks are taken as prototypical cases for the evaluation of two measures of complexity based on a criterion for optimal statistical prediction. One measure, defined for homogeneous systems, does not distinguish between the static spatial inhomogeneity in the ordered phase and the dynamical inhomogeneity in the disordered phase. A modification in which complexities of individual nodes are calculated yields vanishing
Landsman, V; Lou, W Y W; Graubard, B I
2015-05-20
We present a two-step approach for estimating hazard rates and, consequently, survival probabilities, by levels of general categorical exposure. The resulting estimator utilizes three sources of data: vital statistics data and census data are used at the first step to estimate the overall hazard rate for a given combination of gender and age group, and cohort data constructed from a nationally representative complex survey with linked mortality records, are used at the second step to divide the overall hazard rate by exposure levels. We present an explicit expression for the resulting estimator and consider two methods for variance estimation that account for complex multistage sample design: (1) the leaving-one-out jackknife method, and (2) the Taylor linearization method, which provides an analytic formula for the variance estimator. The methods are illustrated with smoking and all-cause mortality data from the US National Health Interview Survey Linked Mortality Files, and the proposed estimator is compared with a previously studied crude hazard rate estimator that uses survey data only. The advantages of a two-step approach and possible extensions of the proposed estimator are discussed. Copyright © 2015 John Wiley & Sons, Ltd.
Small nodule detectability evaluation using a generalized scan-statistic model
International Nuclear Information System (INIS)
Popescu, Lucretiu M; Lewitt, Robert M
2006-01-01
In this paper, the use of the scan statistic for evaluating the detectability of small nodules in medical images is investigated. The scan-statistic method is often used in applications in which random fields must be searched for abnormal local features. Several results of the detection with localization theory are reviewed and a generalization is presented using the noise nodule distribution obtained by scanning arbitrary areas. One benefit of the noise nodule model is that it enables determination of the scan-statistic distribution by using only a few image samples in a way suitable both for simulation and experimental setups. Also, based on the noise nodule model, the case of multiple targets per image is addressed and an image abnormality test using the likelihood ratio and an alternative test using multiple decision thresholds are derived. The results obtained reveal that in the case of low contrast nodules or multiple nodules the usual test strategy based on a single decision threshold underperforms compared with the alternative tests. That is a consequence of the fact that not only the contrast or the size, but also the number of suspicious nodules is a clue indicating the image abnormality. In the case of the likelihood ratio test, the multiple clues are unified in a single decision variable. Other tests that process multiple clues differently do not necessarily produce a unique ROC curve, as shown in examples using a test involving two decision thresholds. We present examples with two-dimensional time-of-flight (TOF) and non-TOF PET image sets analysed using the scan statistic for different search areas, as well as the fixed position observer.
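A minimal sketch of the basic ingredient, assuming a square sliding-window scan statistic on white-noise images: the null distribution is estimated by Monte Carlo, from which a single-threshold detection rule follows. The window size, image size and the 95% level are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def scan_statistic(img, w=5):
    # Maximum summed intensity over all w-by-w windows, via an integral image.
    s = np.pad(img.cumsum(0).cumsum(1), ((1, 0), (1, 0)))
    win = s[w:, w:] - s[:-w, w:] - s[w:, :-w] + s[:-w, :-w]
    return win.max()

# Null distribution on pure-noise images, by Monte Carlo; a single
# decision threshold at the 95% quantile then controls the per-image
# false-positive rate of the resulting detection test.
null = [scan_statistic(rng.normal(size=(64, 64))) for _ in range(500)]
print(f"95% scan-statistic threshold: {np.quantile(null, 0.95):.2f}")
```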
Statistics of fermions in the Randall-Wilkins model for kinetics of general order
International Nuclear Information System (INIS)
Nieto H, B.; Azorin N, J.; Vazquez C, G.A.
2004-01-01
As theoretical groundwork for the thermoluminescence (TL) phenomenon, we study the behavior of systems formed by fermions, which are related to this phenomenon, establishing a generalization of the Randall-Wilkins model both for first-order kinetics and for general-order kinetics (the equation of May and Partridge), in which we consider Fermi-Dirac statistics. As a consequence of this study a new variable appears: the chemical potential; we also establish its relationship with some of the other magnitudes already known in TL. (Author)
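For background, the classical first-order Randall-Wilkins glow curve that this record generalizes can be computed directly. The sketch below uses illustrative trap parameters and a trapezoidal quadrature for the exponential integral; it does not include the paper's Fermi-Dirac generalization or the chemical potential.

```python
import numpy as np

# First-order Randall-Wilkins glow curve:
#   I(T) = n0 * s * exp(-E/kT) * exp(-(s/beta) * Int_{T0}^{T} exp(-E/kT') dT')
k = 8.617e-5            # Boltzmann constant (eV/K)
E, s = 1.0, 1e12        # trap depth (eV), frequency factor (1/s); illustrative
beta, n0 = 1.0, 1.0     # heating rate (K/s) and initial trapped population

T = np.linspace(300.0, 550.0, 2000)
rate = s * np.exp(-E / (k * T))     # escape rate s * exp(-E/kT)
integral = np.concatenate(
    ([0.0], np.cumsum(0.5 * (rate[1:] + rate[:-1]) * np.diff(T))))
I = n0 * rate * np.exp(-integral / beta)
print(f"glow peak at T = {T[np.argmax(I)]:.1f} K")   # ~388 K for these values
```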
Noble, J. H.; Lubasch, M.; Stevens, J.; Jentschura, U. D.
2017-12-01
We describe a matrix diagonalization algorithm for complex symmetric (not Hermitian) matrices, A = A^T, which is based on a two-step algorithm involving generalized Householder reflections based on the indefinite inner product ⟨u, v⟩_* = Σ_i u_i v_i. This inner product is linear in both arguments and avoids complex conjugation. The complex symmetric input matrix is transformed to tridiagonal form using generalized Householder transformations (first step). An iterative, generalized QL decomposition of the tridiagonal matrix employing an implicit shift converges toward diagonal form (second step). The QL algorithm employs iterative deflation techniques when a machine-precision zero is encountered "prematurely" on the super-/sub-diagonal. The algorithm allows for a reliable and computationally efficient computation of resonance and antiresonance energies which emerge from complex-scaled Hamiltonians, and for the numerical determination of the real energy eigenvalues of pseudo-Hermitian and PT-symmetric Hamilton matrices. Numerical reference values are provided.
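A minimal sketch of the first step, assuming generalized Householder reflections H = I - 2ww^T/⟨w,w⟩_* built from the bilinear form above; the breakdown case ⟨w,w⟩_* ≈ 0, which production codes must handle with a recovery transformation, is only flagged here.

```python
import numpy as np

def bilinear(u, v):
    # Indefinite inner product <u, v>_* = sum_i u_i v_i (no conjugation).
    return np.dot(u, v)

def tridiagonalize_complex_symmetric(A):
    # Reduce a complex symmetric matrix (A == A.T, not Hermitian) to
    # tridiagonal form with generalized Householder reflections
    # H = I - 2 w w^T / <w, w>_*, which satisfy H = H^T and H @ H = I.
    A = np.array(A, dtype=complex)
    n = A.shape[0]
    for k in range(n - 2):
        x = A[k + 1:, k].copy()
        sigma = np.sqrt(bilinear(x, x))          # complex square root
        if abs(x[0] + sigma) > abs(x[0] - sigma):
            sigma = -sigma                       # sign choice against cancellation
        w = x.copy()
        w[0] -= sigma                            # w = x - sigma * e1, so H x = sigma e1
        beta = bilinear(w, w)
        if abs(beta) < 1e-14:                    # quasi-null vector: a real code
            continue                             # needs a recovery transformation
        H = np.eye(n, dtype=complex)
        H[k + 1:, k + 1:] -= 2.0 * np.outer(w, w) / beta
        A = H @ A @ H                            # similarity transform (H is its own inverse)
    return A

# Quick check on a random complex symmetric matrix:
rng = np.random.default_rng(0)
M = rng.normal(size=(6, 6)) + 1j * rng.normal(size=(6, 6))
M = M + M.T
print(np.round(tridiagonalize_complex_symmetric(M), 8))  # numerically tridiagonal
```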
International Nuclear Information System (INIS)
Tiana-Alsina, J.; Torrent, M. C.; Masoller, C.; Garcia-Ojalvo, J.; Rosso, O. A.
2010-01-01
Low-frequency fluctuations (LFFs) represent a dynamical instability that occurs in semiconductor lasers when they are operated near the lasing threshold and subject to moderate optical feedback. LFFs consist of sudden power dropouts followed by gradual, stepwise recoveries. We analyze experimental time series of intensity dropouts and quantify the complexity of the underlying dynamics employing two tools from information theory, namely, Shannon's entropy and the Martin, Plastino, and Rosso statistical complexity measure. These measures are computed using a method based on ordinal patterns, by which the relative length and ordering of consecutive interdropout intervals (i.e., the time intervals between consecutive intensity dropouts) are analyzed, disregarding the precise timing of the dropouts and the absolute durations of the interdropout intervals. We show that this methodology is suitable for quantifying subtle characteristics of the LFFs, and in particular the transition to fully developed chaos that takes place when the laser's pump current is increased. Our method shows that the statistical complexity of the laser does not increase continuously with the pump current, but levels off before reaching the coherence collapse regime. This behavior coincides with that of the first- and second-order correlations of the interdropout intervals, suggesting that these correlations, and not the chaotic behavior, are what determine the level of complexity of the laser's dynamics. These results hold for two different dynamical regimes, namely, sustained LFFs and coexistence between LFFs and steady-state emission.
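Both quantifiers are straightforward to compute from the ordinal-pattern distribution. The sketch below implements Bandt-Pompe ordinal patterns, the normalized Shannon entropy and the Martin-Plastino-Rosso (MPR) statistical complexity; the embedding dimension and the white-noise test signal are illustrative choices, not the interdropout data of the experiment.

```python
import numpy as np
from itertools import permutations

def ordinal_distribution(x, d=4):
    # Bandt-Pompe ordinal patterns of embedding dimension d.
    counts = {p: 0 for p in permutations(range(d))}
    for i in range(len(x) - d + 1):
        counts[tuple(np.argsort(x[i:i + d]))] += 1
    p = np.array(list(counts.values()), dtype=float)
    return p / p.sum()

def shannon(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def mpr_complexity(p):
    # Martin-Plastino-Rosso complexity: normalized Jensen-Shannon
    # disequilibrium to the uniform distribution, times normalized entropy.
    n = len(p)
    u = np.full(n, 1.0 / n)
    js = shannon((p + u) / 2) - 0.5 * shannon(p) - 0.5 * shannon(u)
    js_max = -0.5 * ((n + 1) / n * np.log(n + 1) - 2 * np.log(2 * n) + np.log(n))
    return (js / js_max) * (shannon(p) / np.log(n))

x = np.random.default_rng(3).normal(size=10000)   # white noise: H ~ 1, C ~ 0
p = ordinal_distribution(x)
print(shannon(p) / np.log(len(p)), mpr_complexity(p))
```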
Low-complexity blind equalization for OFDM systems with general constellations
Al-Naffouri, Tareq Y.; Dahman, Ala A.; Sohail, Muhammad Sadiq; Xu, Weiyu; Hassibi, Babak
2012-01-01
This paper proposes a low-complexity algorithm for blind equalization of data in orthogonal frequency division multiplexing (OFDM)-based wireless systems with general constellations. The proposed algorithm is able to recover the transmitted data
Statistical mechanics of learning orthogonal signals for general covariance models
International Nuclear Information System (INIS)
Hoyle, David C
2010-01-01
Statistical mechanics techniques have proved to be useful tools in quantifying the accuracy with which signal vectors are extracted from experimental data. However, analysis has previously been limited to specific model forms for the population covariance C, which may be inappropriate for real world data sets. In this paper we obtain new statistical mechanical results for a general population covariance matrix C. For data sets consisting of p sample points in R^N we use the replica method to study the accuracy of orthogonal signal vectors estimated from the sample data. In the asymptotic limit of N, p → ∞ at fixed α = p/N, we derive analytical results for the signal direction learning curves. In the asymptotic limit the learning curves follow a single universal form, each displaying a retarded learning transition. An explicit formula for the location of the retarded learning transition is obtained and we find marked variation in the location of the retarded learning transition dependent on the distribution of population covariance eigenvalues. The results of the replica analysis are confirmed against simulation.
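The retarded learning transition is easy to observe in a direct simulation. Assuming a single signal direction with a rank-one spiked covariance C = I + s vv^T (a simplification of the general C studied in the paper), the overlap of the leading sample eigenvector with v stays near zero below a critical α and grows beyond it:

```python
import numpy as np

rng = np.random.default_rng(7)

# Sample p = alpha*N points with covariance C = I + s * v v^T and measure
# the overlap of the top sample eigenvector with the planted direction v.
# For this rank-one model the transition is expected near alpha_c = 1/s^2.
N, s = 400, 2.0
v = rng.normal(size=N)
v /= np.linalg.norm(v)
for alpha in (0.1, 0.25, 0.5, 1.0, 2.0):
    p = int(alpha * N)
    X = rng.normal(size=(p, N)) + np.outer(rng.normal(size=p) * np.sqrt(s), v)
    C_hat = X.T @ X / p                      # sample covariance
    eigvals, eigvecs = np.linalg.eigh(C_hat)
    print(f"alpha={alpha:>4}: overlap={abs(eigvecs[:, -1] @ v):.2f}")
```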
Directory of Open Access Journals (Sweden)
Xin Jin
2012-02-01
Full Text Available This study focuses on the preparation and enzymic hydrolysis of an icariin/β-cyclodextrin inclusion complex to efficiently generate icaritin. The physical characteristics of the inclusion complex were evaluated by differential scanning calorimetry (DSC). Enzymatic hydrolysis was optimized for the conversion of the icariin/β-cyclodextrin complex to icaritin by Box–Behnken statistical design. The inclusion complex formulation increased the solubility of icariin approximately 17-fold, from 29.2 to 513.5 μg/mL at 60 °C. The optimum conditions predicted by the Box–Behnken statistical design were as follows: 60 °C, pH 7.0, an enzyme/substrate ratio of 1:1.1 and a reaction time of 7 h. Under the optimal conditions the conversion of icariin was 97.91% and the reaction time was decreased by 68% compared with that without β-CD inclusion. Product analysis by melting point, ESI-MS, UV, IR, 1H NMR and 13C NMR confirmed the authenticity of icaritin, with a purity of 99.3% and a yield of 473 mg of icaritin from 1.1 g of icariin.
2010-10-01
Condition: Laboratories performing high complexity testing; cytology general supervisor (Section 493.1467, Public Health, Centers for Medicare...). For the subspecialty of cytology, the laboratory must have a general supervisor who meets the qualification...
Multivariate statistical modelling based on generalized linear models
Fahrmeir, Ludwig
1994-01-01
This book is concerned with the use of generalized linear models for univariate and multivariate regression analysis. Its emphasis is to provide a detailed introductory survey of the subject based on the analysis of real data drawn from a variety of subjects including the biological sciences, economics, and the social sciences. Where possible, technical details and proofs are deferred to an appendix in order to provide an accessible account for non-experts. Topics covered include: models for multi-categorical responses, model checking, time series and longitudinal data, random effects models, and state-space models. Throughout, the authors have taken great pains to discuss the underlying theoretical ideas in ways that relate well to the data at hand. As a result, numerous researchers whose work relies on the use of these models will find this an invaluable account to have on their desks. "The basic aim of the authors is to bring together and review a large part of recent advances in statistical modelling of m...
Characterization of time series via Rényi complexity-entropy curves
Jauregui, M.; Zunino, L.; Lenzi, E. K.; Mendes, R. S.; Ribeiro, H. V.
2018-05-01
One of the most useful tools for distinguishing between chaotic and stochastic time series is the so-called complexity-entropy causality plane. This diagram involves two complexity measures: the Shannon entropy and the statistical complexity. Recently, this idea has been generalized by considering the Tsallis monoparametric generalization of the Shannon entropy, yielding complexity-entropy curves. These curves have proven to enhance the discrimination among different time series related to stochastic and chaotic processes of numerical and experimental nature. Here we further explore these complexity-entropy curves in the context of the Rényi entropy, which is another monoparametric generalization of the Shannon entropy. By combining the Rényi entropy with the proper generalization of the statistical complexity, we associate a parametric curve (the Rényi complexity-entropy curve) with a given time series. We explore this approach in a series of numerical and experimental applications, demonstrating the usefulness of this new technique for time series analysis. We show that the Rényi complexity-entropy curves enable the differentiation among time series of chaotic, stochastic, and periodic nature. In particular, time series of stochastic nature are associated with curves displaying positive curvature in a neighborhood of their initial points, whereas curves related to chaotic phenomena have a negative curvature; finally, periodic time series are represented by vertical straight lines.
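A minimal sketch of the entropy half of the construction: sweeping the Rényi order α for a peaked versus a flat ordinal distribution already separates the two. The full method pairs each H_α with a properly generalized statistical complexity, which is omitted here; the distributions below are illustrative stand-ins for ordinal-pattern histograms.

```python
import numpy as np

def renyi_entropy(p, alpha):
    # H_alpha(p) = log(sum_i p_i**alpha) / (1 - alpha); -> Shannon as alpha -> 1.
    p = p[p > 0]
    if abs(alpha - 1.0) < 1e-12:
        return -np.sum(p * np.log(p))
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

# Normalized Renyi entropy curves for a peaked ("chaotic-like") and a flat
# ("stochastic-like") ordinal distribution over n patterns.
n = 24
peaked = np.array([0.5] + [0.5 / (n - 1)] * (n - 1))
flat = np.full(n, 1.0 / n)
for alpha in (0.2, 0.5, 1.0, 2.0, 5.0):
    h_peaked = renyi_entropy(peaked, alpha) / np.log(n)
    h_flat = renyi_entropy(flat, alpha) / np.log(n)
    print(f"alpha={alpha:>4}: H(peaked)={h_peaked:.3f}  H(flat)={h_flat:.3f}")
```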
Yilmaz, Ferkan
2012-12-01
The higher-order statistics (HOS) of the channel capacity μ_n = E[log^n(1 + γ_end)], where n ∈ N denotes the order of the statistics, has received relatively little attention in the literature, due in part to the intractability of its analysis. In this letter, we propose a novel and unified analysis, which is based on the moment generating function (MGF) technique, to exactly compute the HOS of the channel capacity. More precisely, our mathematical formalism can be readily applied to maximal-ratio-combining (MRC) receivers operating in generalized fading environments. The mathematical formalism is illustrated by some numerical examples focusing on the correlated generalized fading environments. © 2012 IEEE.
Yilmaz, Ferkan; Alouini, Mohamed-Slim
2012-01-01
The higher-order statistics (HOS) of the channel capacity μ_n = E[log^n(1 + γ_end)], where n ∈ N denotes the order of the statistics, has received relatively little attention in the literature, due in part to the intractability of its analysis. In this letter, we propose a novel and unified analysis, which is based on the moment generating function (MGF) technique, to exactly compute the HOS of the channel capacity. More precisely, our mathematical formalism can be readily applied to maximal-ratio-combining (MRC) receivers operating in generalized fading environments. The mathematical formalism is illustrated by some numerical examples focusing on the correlated generalized fading environments. © 2012 IEEE.
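For a single Rayleigh-fading branch the HOS can be checked by direct numerical integration, since the SNR γ is then exponentially distributed. This sketch computes μ_n = E[ln^n(1+γ)] by quadrature rather than the letter's MGF formalism, and the average SNR is an illustrative value.

```python
import numpy as np
from scipy.integrate import quad

def capacity_hos(n, snr_mean):
    # mu_n = E[ln^n(1 + gamma)] for Rayleigh fading: gamma ~ Exp(mean snr_mean).
    # (Single branch only; the cited letter covers general MRC/fading via the MGF.)
    pdf = lambda g: np.exp(-g / snr_mean) / snr_mean
    val, _ = quad(lambda g: np.log1p(g) ** n * pdf(g), 0, np.inf)
    return val

snr = 10.0                                   # average SNR (linear scale)
mu1, mu2 = capacity_hos(1, snr), capacity_hos(2, snr)
print(f"mean capacity     = {mu1:.4f} nats")
print(f"capacity variance = {mu2 - mu1 ** 2:.4f}")
```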
Statistical modelling with quantile functions
Gilchrist, Warren
2000-01-01
Galton used quantiles more than a hundred years ago in describing data. Tukey and Parzen used them in the 60s and 70s in describing populations. Since then, the authors of many papers, both theoretical and practical, have used various aspects of quantiles in their work. Until now, however, no one had put all the ideas together to form what turns out to be a general approach to statistics. Statistical Modelling with Quantile Functions does just that. It systematically examines the entire process of statistical modelling, starting with using the quantile function to define continuous distributions. The author shows that by using this approach, it becomes possible to develop complex distributional models from simple components. A modelling kit can be developed that applies to the whole model - deterministic and stochastic components - and this kit operates by adding, multiplying, and transforming distributions rather than data. Statistical Modelling with Quantile Functions adds a new dimension to the practice of stati...
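The book's central trick, building new models by adding quantile functions of simple components, fits in a few lines. The sketch below combines an exponential tail and a reflected-exponential tail; with w = 0.5 the sum -ln(1-p) + ln(p) is exactly the logistic quantile function. The function name and parameter values are illustrative.

```python
import numpy as np

# Q(p) = lam + eta * (-(1 - w) * log(1 - p) + w * log(p)): the sum of an
# exponential quantile function and a reflected-exponential one; w = 0.5
# gives (up to scale) the logistic quantile function log(p / (1 - p)).
def q_skew(p, lam=0.0, eta=1.0, w=0.3):
    return lam + eta * (-(1 - w) * np.log1p(-p) + w * np.log(p))

rng = np.random.default_rng(4)
u = rng.uniform(size=100000)
sample = q_skew(u)                     # inverse-transform sampling for free
print(np.quantile(sample, [0.25, 0.5, 0.75]))   # ~ q_skew at those p values
print(q_skew(np.array([0.25, 0.5, 0.75])))
```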
International Nuclear Information System (INIS)
Tokuyama, M.; Stanley, H.E.
2000-01-01
The main purpose of the Tohwa University International Conference on Statistical Physics is to provide an opportunity for an international group of experimentalists, theoreticians, and computational scientists working in various fields of statistical physics to gather and discuss their recent advances. The conference covered six topics: complex systems, general methods of statistical physics, biological physics, cross-disciplinary physics, information science, and econophysics.
Attempt to generalize fractional-order electric elements to complex-order ones
Si, Gangquan; Diao, Lijie; Zhu, Jianwei; Lei, Yuhang; Zhang, Yanbin
2017-06-01
The complex derivative D^(α ± jβ), with α, β ∈ R+, is a generalization of the concept of the integer derivative, where α = 1, β = 0. Fractional-order electric elements and circuits are becoming more and more attractive. In this paper, the concept of complex-order electric elements is proposed for the first time, and the complex-order elements are modeled and analyzed. Some interesting phenomena are found: the real part of the order affects the phase of the output signal, and the imaginary part affects the amplitude, for both the complex-order capacitor and the complex-order memristor. More interestingly, the complex-order capacitor performs well when fitting electrochemical impedance spectra. The complex-order memristor is also analyzed. The area inside the hysteresis loops increases with the imaginary part of the order and decreases with the real part. Finally, some complex cases of complex-order memristor hysteresis loops are analyzed, in which the loops have touching points beyond the origin of the coordinate system.
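The roles of the two parts of the order follow from the identity (jω)^(α+jβ) = ω^α e^(-βπ/2) · e^(j(απ/2 + β ln ω)). A minimal numeric check for a hypothetical complex-order capacitor with impedance Z(ω) = 1/(C(jω)^(α+jβ)); the element values are illustrative assumptions:

```python
import numpy as np

# Impedance of a hypothetical complex-order capacitor:
#   Z(w) = 1 / (C * (j*w)**(alpha + 1j*beta))
# The magnitude of (j*w)^(alpha+j*beta) scales as w**alpha * exp(-beta*pi/2),
# while its phase is alpha*pi/2 + beta*ln(w).
C, w = 1.0, 10.0                         # capacitance and probe frequency (rad/s)
for alpha, beta in [(1.0, 0.0), (0.8, 0.0), (0.8, 0.2)]:
    Z = 1.0 / (C * (1j * w) ** (alpha + 1j * beta))
    print(f"alpha={alpha}, beta={beta}: |Z|={abs(Z):.4f}, "
          f"phase={np.degrees(np.angle(Z)):.2f} deg")
```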
Ma, Chuang; Chen, Han-Shuang; Lai, Ying-Cheng; Zhang, Hai-Feng
2018-02-01
Complex networks hosting binary-state dynamics arise in a variety of contexts. In spite of previous works, to fully reconstruct the network structure from observed binary data remains challenging. We articulate a statistical inference based approach to this problem. In particular, exploiting the expectation-maximization (EM) algorithm, we develop a method to ascertain the neighbors of any node in the network based solely on binary data, thereby recovering the full topology of the network. A key ingredient of our method is the maximum-likelihood estimation of the probabilities associated with actual or nonexistent links, and we show that the EM algorithm can distinguish the two kinds of probability values without any ambiguity, insofar as the length of the available binary time series is reasonably long. Our method does not require any a priori knowledge of the detailed dynamical processes, is parameter-free, and is capable of accurate reconstruction even in the presence of noise. We demonstrate the method using combinations of distinct types of binary dynamical processes and network topologies, and provide a physical understanding of the underlying reconstruction mechanism. Our statistical inference based reconstruction method contributes an additional piece to the rapidly expanding "toolbox" of data based reverse engineering of complex networked systems.
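A minimal sketch of the EM separation step: treat pairwise interaction scores as a two-component mixture and let EM pull apart the "link" and "non-link" values. The Gaussian mixture on synthetic scores is a simplifying assumption made here for illustration; the paper's estimator works on the binary time series directly.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic pairwise scores: most pairs are non-links, a minority are links.
scores = np.concatenate([rng.normal(0.1, 0.05, 900),    # non-links
                         rng.normal(0.6, 0.10, 100)])   # true links

def em_two_gaussians(x, iters=100):
    mu = np.array([x.min(), x.max()])
    sig = np.array([x.std(), x.std()])
    pi = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibilities of each component for each score
        pdf = np.exp(-0.5 * ((x[:, None] - mu) / sig) ** 2) / (sig * np.sqrt(2 * np.pi))
        r = pi * pdf
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means and standard deviations
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sig = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return pi, mu, sig

pi, mu, sig = em_two_gaussians(scores)
print(mu)   # ~ [0.1, 0.6]: the two kinds of values separate without ambiguity
```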
On the role of complex phases in the quantum statistics of weak measurements
International Nuclear Information System (INIS)
Hofmann, Holger F
2011-01-01
Weak measurements carried out between quantum state preparation and post-selection result in complex values for self-adjoint operators, corresponding to complex conditional probabilities for the projections on specific eigenstates. In this paper it is shown that the complex phases of these weak conditional probabilities describe the dynamic response of the system to unitary transformations. Quantum mechanics thus unifies the statistical overlap of different states with the dynamical structure of transformations between these states. Specifically, it is possible to identify the phase of weak conditional probabilities directly with the action of a unitary transform that maximizes the overlap of initial and final states. This action provides a quantitative measure of how much quantum correlations can diverge from the deterministic relations between physical properties expected from classical physics or hidden variable theories. In terms of quantum information, the phases of weak conditional probabilities thus represent the logical tension between sets of three quantum states that is at the heart of quantum paradoxes. (paper)
Son, Ji Y; Ramos, Priscilla; DeWolf, Melissa; Loftus, William; Stigler, James W
2018-01-01
In this article, we begin to lay out a framework and approach for studying how students come to understand complex concepts in rich domains. Grounded in theories of embodied cognition, we advance the view that understanding of complex concepts requires students to practice, over time, the coordination of multiple concepts, and the connection of this system of concepts to situations in the world. Specifically, we explore the role that a teacher's gesture might play in supporting students' coordination of two concepts central to understanding in the domain of statistics: mean and standard deviation. In Study 1 we show that university students who have just taken a statistics course nevertheless have difficulty taking both mean and standard deviation into account when thinking about a statistical scenario. In Study 2 we show that presenting the same scenario with an accompanying gesture to represent variation significantly impacts students' interpretation of the scenario. Finally, in Study 3 we present evidence that instructional videos on the internet fail to leverage gesture as a means of facilitating understanding of complex concepts. Taken together, these studies illustrate an approach to translating current theories of cognition into principles that can guide instructional design.
Li, Jie; Li, Rui; You, Leiming; Xu, Anlong; Fu, Yonggui; Huang, Shengfeng
2015-01-01
Switching between different alternative polyadenylation (APA) sites plays an important role in the fine tuning of gene expression. New technologies for the execution of 3’-end enriched RNA-seq allow genome-wide detection of the genes that exhibit significant APA site switching between different samples. Here, we show that the independence test gives better results than the linear trend test in detecting APA site-switching events. Further examination suggests that the discrepancy between these two statistical methods arises from complex APA site-switching events that cannot be represented by a simple change of average 3’-UTR length. In theory, the linear trend test is only effective in detecting these simple changes. We classify the switching events into four switching patterns: two simple patterns (3’-UTR shortening and lengthening) and two complex patterns. By comparing the results of the two statistical methods, we show that complex patterns account for 1/4 of all observed switching events that happen between normal and cancerous human breast cell lines. Because simple and complex switching patterns may convey different biological meanings, they merit separate study. We therefore propose to combine both the independence test and the linear trend test in practice. First, the independence test should be used to detect APA site switching; second, the linear trend test should be invoked to identify simple switching events; and third, those complex switching events that pass independence testing but fail linear trend testing can be identified. PMID:25875641
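The contrast between the two tests can be reproduced on a toy 2 x 3 table in which usage moves from the middle 3'-UTR segment to both ends, so the average length barely changes: the independence (χ²) test flags the switch while a Cochran-Armitage-style linear trend test does not. The counts and segment scores are illustrative assumptions.

```python
import numpy as np
from scipy.stats import chi2_contingency, norm

# Read counts in k = 3 ordered 3'-UTR segments for two samples; a "complex"
# switching event redistributes usage symmetrically, with no net length shift.
table = np.array([[100, 300, 100],     # sample 1
                  [220,  60, 220]])    # sample 2

chi2, p_ind, _, _ = chi2_contingency(table)    # independence test

def cochran_armitage(table, scores=None):
    # Linear trend test across ordered categories (two-sided z statistic).
    if scores is None:
        scores = np.arange(table.shape[1], dtype=float)
    n1, n = table[0].sum(), table.sum()
    p = n1 / n
    cols = table.sum(axis=0)
    t = np.dot(scores, table[0]) - p * np.dot(scores, cols)
    var = p * (1 - p) * (np.dot(scores ** 2, cols) - np.dot(scores, cols) ** 2 / n)
    z = t / np.sqrt(var)
    return z, 2 * norm.sf(abs(z))

z, p_trend = cochran_armitage(table)
print(f"independence p = {p_ind:.2e}, trend p = {p_trend:.2f}")
# Independence flags the switch; the trend test misses it (no net shift).
```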
Generalized Projective Synchronization between Two Complex Networks with Time-Varying Coupling Delay
International Nuclear Information System (INIS)
Mei, Sun; Chang-Yan, Zeng; Li-Xin, Tian
2009-01-01
Generalized projective synchronization (GPS) between two complex networks with time-varying coupling delay is investigated. Based on the Lyapunov stability theory, a nonlinear controller and adaptive updated laws are designed. Feasibility of the proposed scheme is proven in theory. Moreover, two numerical examples are presented, using the energy resource system and Lü's system [Physica A 382 (2007) 672] as the nodes of the networks. GPS between two energy resource complex networks with time-varying coupling delay is achieved. This study can widen the application range of the generalized synchronization methods and will be instructive for the demand–supply of energy resource in some regions of China
Generalized Projective Synchronization between Two Complex Networks with Time-Varying Coupling Delay
Sun, Mei; Zeng, Chang-Yan; Tian, Li-Xin
2009-01-01
Generalized projective synchronization (GPS) between two complex networks with time-varying coupling delay is investigated. Based on the Lyapunov stability theory, a nonlinear controller and adaptive updated laws are designed. Feasibility of the proposed scheme is proven in theory. Moreover, two numerical examples are presented, using the energy resource system and Lü's system [Physica A 382 (2007) 672] as the nodes of the networks. GPS between two energy resource complex networks with time-varying coupling delay is achieved. This study can widen the application range of the generalized synchronization methods and will be instructive for the demand-supply of energy resource in some regions of China.
Quantum communication complexity advantage implies violation of a Bell inequality
H. Buhrman (Harry); L. Czekaj (Łukasz); A. Grudka (Andrzej); M. Horodecki (Michał); P. Horodecki (Paweł); M. Markiewicz (Marcin); F. Speelman (Florian); S. Strelchuk (Sergii)
2016-01-01
We obtain a general connection between a large quantum advantage in communication complexity and Bell nonlocality. We show that given any protocol offering a sufficiently large quantum advantage in communication complexity, there exists a way of obtaining measurement statistics that
Quantum communication complexity advantage implies violation of a Bell inequality
H. Buhrman (Harry); L. Czekaj (Łukasz); A. Grudka (Andrzej); M. Horodecki (Michał); P. Horodecki (Paweł); M. Markiewicz (Marcin); F. Speelman (Florian); S. Strelchuk (Sergii)
2015-01-01
We obtain a general connection between a quantum advantage in communication complexity and non-locality. We show that given any protocol offering a (sufficiently large) quantum advantage in communication complexity, there exists a way of obtaining measurement statistics which violate
Stochastic electromagnetic radiation of complex sources
Naus, H.W.L.
2007-01-01
The emission of electromagnetic radiation by localized complex electric charge and current distributions is studied. A statistical formalism in terms of general dynamical multipole fields is developed. The appearing coefficients are treated as stochastic variables. Hereby as much as possible a
Industrial commodity statistics yearbook 2001. Production statistics (1992-2001)
International Nuclear Information System (INIS)
2003-01-01
This is the thirty-fifth in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to Industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title The Growth of World Industry and the next eight editions under the title Yearbook of Industrial Statistics. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. The statistics refer to the ten-year period 1992-2001 for about 200 countries and areas.
Industrial commodity statistics yearbook 2002. Production statistics (1993-2002)
International Nuclear Information System (INIS)
2004-01-01
This is the thirty-sixth in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to Industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title 'The Growth of World Industry' and the next eight editions under the title 'Yearbook of Industrial Statistics'. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. The statistics refer to the ten-year period 1993-2002 for about 200 countries and areas.
Industrial commodity statistics yearbook 2000. Production statistics (1991-2000)
International Nuclear Information System (INIS)
2002-01-01
This is the thirty-third in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to Industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title The Growth of World Industry and the next eight editions under the title Yearbook of Industrial Statistics. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. Most of the statistics refer to the ten-year period 1991-2000 for about 200 countries and areas.
Equivalence of the generalized and complex Kohn variational methods
Energy Technology Data Exchange (ETDEWEB)
Cooper, J N; Armour, E A G [School of Mathematical Sciences, University Park, Nottingham NG7 2RD (United Kingdom); Plummer, M, E-mail: pmxjnc@googlemail.co [STFC Daresbury Laboratory, Daresbury, Warrington, Cheshire WA4 4AD (United Kingdom)
2010-04-30
For Kohn variational calculations on low energy (e⁺-H₂) elastic scattering, we prove that the phase shift approximation, obtained using the complex Kohn method, is precisely equal to a value which can be obtained immediately via the real-generalized Kohn method. Our treatment is sufficiently general to be applied directly to arbitrary potential scattering or single open channel scattering problems, with exchange if required. In the course of our analysis, we develop a framework formally to describe the anomalous behaviour of our generalized Kohn calculations in the regions of the well-known Schwartz singularities. This framework also explains the mathematical origin of the anomaly-free singularities we reported in a previous article. Moreover, we demonstrate a novelty: that explicit solutions of the Kohn equations are not required in order to calculate optimal phase shift approximations. We relate our rigorous framework to earlier descriptions of the Kohn-type methods.
Equivalence of the generalized and complex Kohn variational methods
International Nuclear Information System (INIS)
Cooper, J N; Armour, E A G; Plummer, M
2010-01-01
For Kohn variational calculations on low energy (e⁺-H₂) elastic scattering, we prove that the phase shift approximation, obtained using the complex Kohn method, is precisely equal to a value which can be obtained immediately via the real-generalized Kohn method. Our treatment is sufficiently general to be applied directly to arbitrary potential scattering or single open channel scattering problems, with exchange if required. In the course of our analysis, we develop a framework formally to describe the anomalous behaviour of our generalized Kohn calculations in the regions of the well-known Schwartz singularities. This framework also explains the mathematical origin of the anomaly-free singularities we reported in a previous article. Moreover, we demonstrate a novelty: that explicit solutions of the Kohn equations are not required in order to calculate optimal phase shift approximations. We relate our rigorous framework to earlier descriptions of the Kohn-type methods.
Chemometric and Statistical Analyses of ToF-SIMS Spectra of Increasingly Complex Biological Samples
Energy Technology Data Exchange (ETDEWEB)
Berman, E S; Wu, L; Fortson, S L; Nelson, D O; Kulp, K S; Wu, K J
2007-10-24
Characterizing and classifying molecular variation within biological samples is critical for determining fundamental mechanisms of biological processes that will lead to new insights including improved disease understanding. Towards these ends, time-of-flight secondary ion mass spectrometry (ToF-SIMS) was used to examine increasingly complex samples of biological relevance, including monosaccharide isomers, pure proteins, complex protein mixtures, and mouse embryo tissues. The complex mass spectral data sets produced were analyzed using five common statistical and chemometric multivariate analysis techniques: principal component analysis (PCA), linear discriminant analysis (LDA), partial least squares discriminant analysis (PLSDA), soft independent modeling of class analogy (SIMCA), and decision tree analysis by recursive partitioning. PCA was found to be a valuable first step in multivariate analysis, providing insight both into the relative groupings of samples and into the molecular basis for those groupings. For the monosaccharides, pure proteins and protein mixture samples, all of LDA, PLSDA, and SIMCA were found to produce excellent classification given a sufficient number of compound variables calculated. For the mouse embryo tissues, however, SIMCA did not produce as accurate a classification. The decision tree analysis was found to be the least successful for all the data sets, providing neither as accurate a classification nor chemical insight for any of the tested samples. Based on these results we conclude that as the complexity of the sample increases, so must the sophistication of the multivariate technique used to classify the samples. PCA is a preferred first step for understanding ToF-SIMS data that can be followed by either LDA or PLSDA for effective classification analysis. This study demonstrates the strength of ToF-SIMS combined with multivariate statistical and chemometric techniques to classify increasingly complex biological samples
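The recommended workflow, PCA as an exploratory first step followed by LDA for classification, is a one-liner with scikit-learn; the synthetic "spectra" below merely stand in for ToF-SIMS peak intensities, and all dimensions and class structure are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)

# Stand-in for ToF-SIMS spectra: 60 samples x 500 peak intensities,
# three classes separated along a handful of latent directions.
n, d, classes = 60, 500, 3
y = np.repeat(np.arange(classes), n // classes)
loadings = rng.normal(size=(classes, d))
X = loadings[y] + 0.8 * rng.normal(size=(n, d))

# PCA for dimension reduction and insight, then LDA on the reduced scores.
clf = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
print(cross_val_score(clf, X, y, cv=5).mean())   # near-perfect on this toy data
```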
Narayanan, Roshni; Nugent, Rebecca; Nugent, Kenneth
2015-10-01
Accreditation Council for Graduate Medical Education guidelines require internal medicine residents to develop skills in the interpretation of medical literature and to understand the principles of research. A necessary component is the ability to understand the statistical methods used and their results, material that is not an in-depth focus of most medical school curricula and residency programs. Given the breadth and depth of the current medical literature and an increasing emphasis on complex, sophisticated statistical analyses, the statistical foundation and education necessary for residents are uncertain. We reviewed the statistical methods and terms used in 49 articles discussed at the journal club in the Department of Internal Medicine residency program at Texas Tech University between January 1, 2013 and June 30, 2013. We collected information on the study type and on the statistical methods used for summarizing and comparing samples, determining the relations between independent variables and dependent variables, and estimating models. We then identified the typical statistics education level at which each term or method is learned. A total of 14 articles came from the Journal of the American Medical Association Internal Medicine, 11 from the New England Journal of Medicine, 6 from the Annals of Internal Medicine, 5 from the Journal of the American Medical Association, and 13 from other journals. Twenty reported randomized controlled trials. Summary statistics included mean values (39 articles), category counts (38), and medians (28). Group comparisons were based on t tests (14 articles), χ2 tests (21), and nonparametric ranking tests (10). The relations between dependent and independent variables were analyzed with simple regression (6 articles), multivariate regression (11), and logistic regression (8). Nine studies reported odds ratios with 95% confidence intervals, and seven analyzed test performance using sensitivity and specificity calculations.
Beginning R The Statistical Programming Language
Gardener, Mark
2012-01-01
Conquer the complexities of this open source statistical language R is fast becoming the de facto standard for statistical computing and analysis in science, business, engineering, and related fields. This book examines this complex language using simple statistical examples, showing how R operates in a user-friendly context. Both students and workers in fields that require extensive statistical analysis will find this book helpful as they learn to use R for simple summary statistics, hypothesis testing, creating graphs, regression, and much more. It covers formula notation, complex statistics
Solution of generalized shifted linear systems with complex symmetric matrices
International Nuclear Information System (INIS)
Sogabe, Tomohiro; Hoshi, Takeo; Zhang, Shao-Liang; Fujiwara, Takeo
2012-01-01
We develop the shifted COCG method [R. Takayama, T. Hoshi, T. Sogabe, S.-L. Zhang, T. Fujiwara, Linear algebraic calculation of Green’s function for large-scale electronic structure theory, Phys. Rev. B 73 (165108) (2006) 1–9] and the shifted WQMR method [T. Sogabe, T. Hoshi, S.-L. Zhang, T. Fujiwara, On a weighted quasi-residual minimization strategy of the QMR method for solving complex symmetric shifted linear systems, Electron. Trans. Numer. Anal. 31 (2008) 126–140] for solving generalized shifted linear systems with complex symmetric matrices that arise from the electronic structure theory. The complex symmetric Lanczos process with a suitable bilinear form plays an important role in the development of the methods. The numerical examples indicate that the methods are highly attractive when the inner linear systems can efficiently be solved.
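The structural idea behind COCG is small enough to sketch: it is CG with every inner product replaced by the bilinear form u^T v appropriate to complex symmetric matrices. The sketch below solves a single shifted system; the shifted methods of the paper additionally reuse one Krylov subspace across all shifts. The test matrix and the shift are illustrative assumptions.

```python
import numpy as np

def cocg(A, b, tol=1e-10, maxit=1000):
    # Conjugate Orthogonal Conjugate Gradient for complex symmetric A
    # (A == A.T): structurally CG, but every inner product is the
    # bilinear form u.T @ v -- no complex conjugation anywhere.
    x = np.zeros_like(b, dtype=complex)
    r = b.astype(complex)
    p = r.copy()
    rho = r @ r
    for _ in range(maxit):
        Ap = A @ p
        alpha = rho / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            break
        rho, rho_old = r @ r, rho
        p = r + (rho / rho_old) * p
    return x

# Demo on one shifted system (A + sigma*I) x = b.
rng = np.random.default_rng(1)
n = 200
B = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
A = (B + B.T) / 2 + n * np.eye(n)            # complex symmetric test matrix
sigma, b = 0.5 + 0.1j, rng.normal(size=n)
x = cocg(A + sigma * np.eye(n), b)
print(np.linalg.norm((A + sigma * np.eye(n)) @ x - b))   # small residual
```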
International Nuclear Information System (INIS)
Toppan, Francesco
2004-06-01
Relying upon the division-algebra classification of Clifford algebras and spinors, a classification of generalized supersymmetries (or, with a slight abuse of language, 'generalized super translations') is provided. In each given space-time the maximal, saturated, generalized supersymmetry, compatible with the division-algebra constraint that can be consistently imposed on spinors and on superalgebra generators, is furnished. Constraining the superalgebra generators in both the complex and the quaternionic cases gives rise to the two classes of constrained hermitian and holomorphic generalized supersymmetries. In the complex case these two classes of generalized supersymmetries can be regarded as complementary. The quaternionic holomorphic supersymmetry only exists in certain space-time dimensions and can admit at most a single bosonic scalar central charge. The results here presented pave the way for a better understanding of the various M algebra-type of structures which can be introduced in different space-time signatures and in association with different division algebras, as well as their mutual relations. In a previous work, e.g., the introduction of a complex holomorphic generalized supersymmetry was shown to be necessary in order to perform the analytic continuation of the standard M-theory to the 11-dimensional Euclidean space. As an application of the present results, it is shown that the above algebra also admits a 12-dimensional, Euclidean, F-algebra presentation. (author)
Statistical mechanics of sparse generalization and graphical model selection
International Nuclear Information System (INIS)
Lage-Castellanos, Alejandro; Pagnani, Andrea; Weigt, Martin
2009-01-01
One of the crucial tasks in many inference problems is the extraction of an underlying sparse graphical model from a given number of high-dimensional measurements. In machine learning, this is frequently achieved using, as a penalty term, the Lp-norm of the model parameters, with p≤1 for efficient dilution. Here we propose a statistical mechanics analysis of the problem in the setting of perceptron memorization and generalization. Using a replica approach, we are able to evaluate the relative performance of naive dilution (obtained by learning without dilution, followed by applying a threshold to the model parameters), L1 dilution (which is frequently used in convex optimization) and L0 dilution (which is optimal but computationally hard to implement). Whereas both Lp-diluted approaches clearly outperform the naive approach, we find a small region where L0 works almost perfectly and strongly outperforms the simpler-to-implement L1 dilution
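The dilution schemes being compared can be mimicked on a synthetic sparse teacher; a sketch with scikit-learn, where the threshold and penalty strength are arbitrary illustrative values and exhaustive L0 search is omitted as computationally hard:

```python
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(0)
n, d, k = 100, 50, 5
w_true = np.zeros(d)
w_true[:k] = rng.standard_normal(k)          # sparse teacher weights
X = rng.standard_normal((n, d))
y = X @ w_true + 0.1 * rng.standard_normal(n)

w_dense = LinearRegression().fit(X, y).coef_
w_naive = np.where(np.abs(w_dense) > 0.1, w_dense, 0.0)  # naive dilution
w_l1 = Lasso(alpha=0.05).fit(X, y).coef_                 # L1 dilution

for name, w in (("naive", w_naive), ("L1", w_l1)):
    print(name, np.count_nonzero(w), np.linalg.norm(w - w_true))
```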
Right-sizing statistical models for longitudinal data.
Wood, Phillip K; Steinley, Douglas; Jackson, Kristina M
2015-12-01
Arguments are proposed that researchers using longitudinal data should consider more and less complex statistical model alternatives to their initially chosen techniques in an effort to "right-size" the model to the data at hand. Such model comparisons may alert researchers who use poorly fitting, overly parsimonious models to more complex, better-fitting alternatives and, alternatively, may identify more parsimonious alternatives to overly complex (and perhaps empirically underidentified and/or less powerful) statistical models. A general framework is proposed for considering (often nested) relationships between a variety of psychometric and growth curve models. A 3-step approach is proposed in which models are evaluated based on the number and patterning of variance components prior to selection of better-fitting growth models that explain both mean and variation-covariation patterns. The orthogonal free curve slope intercept (FCSI) growth model is considered a general model that includes, as special cases, many models, including the factor mean (FM) model (McArdle & Epstein, 1987), McDonald's (1967) linearly constrained factor model, hierarchical linear models (HLMs), repeated-measures multivariate analysis of variance (MANOVA), and the linear slope intercept (linearSI) growth model. The FCSI model, in turn, is nested within the Tuckerized factor model. The approach is illustrated by comparing alternative models in a longitudinal study of children's vocabulary and by comparing several candidate parametric growth and chronometric models in a Monte Carlo study. (c) 2015 APA, all rights reserved.
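One step of the kind of comparison recommended here, a random-intercept model against a random-slope (linearSI-like) alternative on long-format data, can be sketched with statsmodels; the simulated subjects below are illustrative, not the study's data or exact models:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_id, n_t = 100, 5
df = pd.DataFrame({"id": np.repeat(np.arange(n_id), n_t),
                   "time": np.tile(np.arange(n_t), n_id)})
u0 = rng.normal(0, 1.0, n_id)[df["id"]]      # subject-level intercepts
u1 = rng.normal(0, 0.3, n_id)[df["id"]]      # subject-level slopes
df["y"] = 2 + u0 + (0.5 + u1) * df["time"] + rng.normal(0, 0.5, len(df))

m1 = smf.mixedlm("y ~ time", df, groups=df["id"]).fit(reml=False)
m2 = smf.mixedlm("y ~ time", df, groups=df["id"],
                 re_formula="~time").fit(reml=False)
print(m1.llf, m2.llf)  # compare fits, e.g., via a likelihood-ratio test
```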
On Nonextensive Statistics, Chaos and Fractal Strings
Castro, C
2004-01-01
Motivated by the growing evidence of universality and chaos in QFT and string theory, we study the Tsallis non-extensive statistics (with a non-additive q-entropy) of an ensemble of fractal strings and branes of different dimensionalities. Non-equilibrium systems with complex dynamics in stationary states may exhibit large fluctuations of intensive quantities, which are described in terms of generalized statistics. Tsallis statistics is a particular representative of such a class. The non-extensive entropy and probability distribution of a canonical ensemble of fractal strings and branes are studied in terms of their dimensional spectrum, which leads to a natural upper cutoff in energy and establishes a direct correlation among dimensions, energy and temperature. The absolute zero temperature (Kelvin) corresponds to zero dimensions (energy) and an infinite temperature corresponds to infinite dimensions. In the concluding remarks some applications of fractal statistics, quasi-particles, knot theory, quantum...
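For reference, the non-additive q-entropy underlying Tsallis statistics, its Boltzmann-Gibbs limit, and the composition rule that makes it non-extensive (standard definitions, not expressions specific to this paper):

```latex
S_q = k\,\frac{1-\sum_i p_i^{\,q}}{q-1}, \qquad
\lim_{q\to 1} S_q = -k\sum_i p_i \ln p_i, \qquad
S_q(A+B) = S_q(A) + S_q(B) + \frac{1-q}{k}\,S_q(A)\,S_q(B).
```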
Simulations with complex measure
International Nuclear Information System (INIS)
Markham, J.K.; Kieu, T.D.
1997-01-01
A method is proposed to handle the sign problem in the simulation of systems having indefinite or complex-valued measures. In general, this new approach, which is based on renormalisation blocking, is shown to yield statistical errors smaller than those of the crude Monte Carlo method that uses absolute values of the original measures. The improved method is applied to the 2D Ising model with temperature generalised to take on complex values. It is also adapted to implement Monte Carlo Renormalisation Group calculations of the magnetic and thermal critical exponents. 10 refs., 4 tabs., 7 figs
Renyi statistics in equilibrium statistical mechanics
International Nuclear Information System (INIS)
Parvan, A.S.; Biro, T.S.
2010-01-01
The Renyi statistics in the canonical and microcanonical ensembles is examined both in general and in particular for the ideal gas. In the microcanonical ensemble the Renyi statistics is equivalent to the Boltzmann-Gibbs statistics. Using exact analytical results for the ideal gas, it is shown that in the canonical ensemble, taking the thermodynamic limit, the Renyi statistics is also equivalent to the Boltzmann-Gibbs statistics. Furthermore, it satisfies the requirements of equilibrium thermodynamics, i.e., the thermodynamical potential of the statistical ensemble is a first-degree homogeneous function of its extensive variables of state. We conclude that the Renyi statistics arrives at the same thermodynamical relations as those stemming from the Boltzmann-Gibbs statistics in this limit.
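The Renyi entropy in question, and its q → 1 reduction to the Boltzmann-Gibbs form (standard definitions):

```latex
S_q^{(R)} = \frac{k}{1-q}\,\ln\Big(\sum_i p_i^{\,q}\Big), \qquad
\lim_{q\to 1} S_q^{(R)} = -k\sum_i p_i \ln p_i .
```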
Decaying states as complex energy eigenvectors in generalized quantum mechanics
International Nuclear Information System (INIS)
Sudarshan, E.C.G.; Chiu, C.B.; Gorini, V.
1977-04-01
The problem of particle decay is reexamined within the Hamiltonian formalism. By deforming contours of integration, the survival amplitude is expressed as a sum of purely exponential contributions arising from the simple poles of the resolvent on the second sheet, plus a background integral along a complex contour Γ running below the location of the poles. One observes that the time dependence of the survival amplitude in the small-time region is strongly correlated with the asymptotic behavior of the energy spectrum of the system; one computes the small-time behavior of the survival amplitude for a wide variety of asymptotic behaviors. In the special case of the Lee model, using a formal procedure of analytic continuation, it is shown that a complete set of complex energy eigenvectors of the Hamiltonian can be associated with the poles of the resolvent and with the background contour Γ. These poles and points along Γ correspond to the discrete and the continuum states, respectively. In this context, each unstable particle is associated with a well-defined object, which is a discrete generalized eigenstate of the Hamiltonian having a complex eigenvalue, with its real and negative imaginary parts being the mass and half-width of the particle, respectively. Finally, one briefly discusses the analytic continuation of the scattering amplitude within this generalized scheme, and notes the appearance of "redundant poles" which do not correspond to discrete solutions of the modified eigenvalue problem.
Global synchronization of general delayed complex networks with stochastic disturbances
International Nuclear Information System (INIS)
Tu Li-Lan
2011-01-01
In this paper, global synchronization of general delayed complex networks with stochastic disturbances, modelled as a zero-mean real scalar Wiener process, is investigated. The networks under consideration are continuous-time networks with time-varying delay. Based on the stochastic Lyapunov stability theory, Itô's differential rule and the linear matrix inequality (LMI) optimization technique, several delay-dependent synchronization criteria are established, which guarantee the asymptotic mean-square synchronization of drive networks and response networks with stochastic disturbances. The criteria are expressed in terms of LMIs, which can be easily solved using the Matlab LMI Control Toolbox. Finally, two examples show the effectiveness and feasibility of the proposed synchronization conditions. (general)
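The criteria are stated as LMIs and solved in the paper with the Matlab LMI Control Toolbox; as a minimal open-source analogue, here is a generic Lyapunov-stability LMI feasibility problem in cvxpy (an illustrative example, not the paper's synchronization criteria):

```python
import cvxpy as cp
import numpy as np

A = np.array([[-2.0, 1.0],
              [0.0, -1.0]])                 # illustrative stable matrix
n = A.shape[0]
eps = 1e-6
P = cp.Variable((n, n), symmetric=True)
constraints = [P >> eps * np.eye(n),        # P positive definite
               A.T @ P + P @ A << -eps * np.eye(n)]
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve(solver=cp.SCS)
print(prob.status)
print(P.value)
```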
Spreading dynamics on complex networks: a general stochastic approach.
Noël, Pierre-André; Allard, Antoine; Hébert-Dufresne, Laurent; Marceau, Vincent; Dubé, Louis J
2014-12-01
Dynamics on networks is considered from the perspective of Markov stochastic processes. We partially describe the state of the system through network motifs and infer any missing data using the available information. This versatile approach is especially well adapted for modelling spreading processes and/or population dynamics. In particular, the generality of our framework and the fact that its assumptions are explicitly stated suggest that it could be used as a common ground for comparing existing epidemic models too complex for direct comparison, such as agent-based computer simulations. We provide many examples for the special cases of susceptible-infectious-susceptible and susceptible-infectious-removed dynamics (e.g., epidemic propagation) and we observe multiple situations where accurate results may be obtained at low computational cost. Our perspective reveals a subtle balance between the complex requirements of a realistic model and its basic assumptions.
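The agent-level process that such motif-based frameworks approximate can be simulated directly; a Gillespie-style SIR simulation on a random graph (rates and network parameters are illustrative):

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
G = nx.erdos_renyi_graph(500, 0.01, seed=0)
beta, gamma = 0.5, 1.0                  # infection / recovery rates
state = {v: "S" for v in G}
state[0] = "I"
t, infected = 0.0, {0}

while infected:
    # competing events: infection along S-I edges, recovery of I nodes
    si_edges = [(i, j) for i in infected for j in G[i] if state[j] == "S"]
    rate = beta * len(si_edges) + gamma * len(infected)
    t += rng.exponential(1.0 / rate)
    if rng.random() < beta * len(si_edges) / rate:
        _, j = si_edges[rng.integers(len(si_edges))]
        state[j] = "I"
        infected.add(j)
    else:
        i = list(infected)[rng.integers(len(infected))]
        state[i] = "R"
        infected.discard(i)

print(f"final size: {sum(s == 'R' for s in state.values())} at t = {t:.2f}")
```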
Sampling, Probability Models and Statistical Reasoning: Statistical Inference
Indian Academy of Sciences (India)
Sampling, Probability Models and Statistical Reasoning: Statistical Inference. Mohan Delampady and V R Padmawar. General Article, Resonance – Journal of Science Education, Volume 1, Issue 5, May 1996, pp. 49-58.
Generalized projective synchronization of two coupled complex networks of different sizes
International Nuclear Information System (INIS)
Li Ke-Zan; He En; Zeng Zhao-Rong; Tse Chi K.
2013-01-01
We investigate a new generalized projective synchronization between two complex dynamical networks of different sizes. To the best of our knowledge, most of the current studies on projective synchronization have dealt with coupled networks of the same size. By generalized projective synchronization, we mean that the states of the nodes in each network can realize complete synchronization, and the states of a pair of nodes from both networks can achieve projective synchronization. Using the stability theory of the dynamical system, several sufficient conditions for guaranteeing the existence of the generalized projective synchronization under feedback control and adaptive control are obtained. As an example, we use Chua's circuits to demonstrate the effectiveness of our proposed approach
International Nuclear Information System (INIS)
Gross, D.H.E.
2006-01-01
Heat can flow from cold to hot at any phase separation, even in macroscopic systems. Therefore Lynden-Bell's famous gravo-thermal catastrophe must also be reconsidered. In contrast to traditional canonical Boltzmann-Gibbs statistics, this is correctly described only by microcanonical statistics. Systems studied in chemical thermodynamics (ChTh) by using canonical statistics consist of several homogeneous macroscopic phases. Evidently, macroscopic statistics as in chemistry cannot and should not be applied to non-extensive or inhomogeneous systems like nuclei or galaxies. Nuclei are small and inhomogeneous. Multifragmented nuclei are even more inhomogeneous, and the fragments even smaller. Phase transitions of first order, and especially phase separations, therefore cannot be described by a (homogeneous) canonical ensemble. Taking this seriously, fascinating perspectives open up for statistical nuclear fragmentation as a test ground for the basic principles of statistical mechanics, especially of phase transitions, without the use of the thermodynamic limit. Moreover, there is also a lot of similarity between the accessible phase space of fragmenting nuclei and inhomogeneous multistellar systems. This underlines the fundamental significance for statistical physics in general. (orig.)
DbAccess: Interactive Statistics and Graphics for Plasma Physics Databases
International Nuclear Information System (INIS)
Davis, W.; Mastrovito, D.
2003-01-01
DbAccess is an X-windows application, written in IDL®, meeting many specialized statistical and graphical needs of NSTX [National Spherical Torus Experiment] plasma physicists, such as regression statistics and the analysis of variance. Flexible "views" and "joins," which include options for complex SQL expressions, facilitate mixing data from different database tables. General Atomics Plot Objects add extensive graphical and interactive capabilities. An example is included for plasma confinement-time scaling analysis using a multiple linear regression least-squares power fit.
A generalization of random matrix theory and its application to statistical physics.
Wang, Duan; Zhang, Xin; Horvatic, Davor; Podobnik, Boris; Eugene Stanley, H
2017-02-01
To study the statistical structure of cross-correlations in empirical data, we generalize random matrix theory and propose a new method of cross-correlation analysis, known as autoregressive random matrix theory (ARRMT). ARRMT takes into account the influence of auto-correlations in the study of cross-correlations in multiple time series. We first analytically and numerically determine how auto-correlations affect the eigenvalue distribution of the correlation matrix. Then we introduce ARRMT with a detailed procedure of how to implement the method. Finally, we illustrate the method using two examples, taken from inflation rates and from air pressure data for 95 US cities.
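The first step of the analysis, that auto-correlations widen the eigenvalue spectrum of an empirical correlation matrix beyond the i.i.d. Marchenko-Pastur edge, is easy to reproduce numerically (parameter values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, phi = 95, 1000, 0.7          # number of series, length, AR(1) coefficient

def max_corr_eig(data):
    return np.linalg.eigvalsh(np.corrcoef(data)).max()

iid = rng.standard_normal((N, T))
ar1 = np.zeros((N, T))
for t in range(1, T):              # AR(1) series with common coefficient phi
    ar1[:, t] = phi * ar1[:, t - 1] + rng.standard_normal(N)

q = N / T
print("Marchenko-Pastur upper edge:", (1 + np.sqrt(q)) ** 2)
print("largest eigenvalue, iid data:", max_corr_eig(iid))
print("largest eigenvalue, AR(1) data:", max_corr_eig(ar1))
```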
Central Limit Theorem for Exponentially Quasi-local Statistics of Spin Models on Cayley Graphs
Reddy, Tulasi Ram; Vadlamani, Sreekar; Yogeshwaran, D.
2018-04-01
Central limit theorems for linear statistics of lattice random fields (including spin models) are usually proven under suitable mixing conditions or quasi-associativity. Many interesting examples of spin models do not satisfy mixing conditions, and on the other hand, it does not seem easy to show a central limit theorem for local statistics via quasi-associativity. In this work, we prove general central limit theorems for local statistics and exponentially quasi-local statistics of spin models on discrete Cayley graphs with polynomial growth. Further, we supplement these results by proving similar central limit theorems for random fields on discrete Cayley graphs taking values in a countable space, but under the stronger assumptions of α-mixing (for local statistics) and exponential α-mixing (for exponentially quasi-local statistics). All our central limit theorems assume a suitable variance lower bound like many others in the literature. We illustrate our general central limit theorem with specific examples of lattice spin models and statistics arising in computational topology, statistical physics and random networks. Examples of clustering spin models include quasi-associated spin models with fast decaying covariances like the off-critical Ising model, level sets of Gaussian random fields with fast decaying covariances like the massive Gaussian free field, and determinantal point processes with fast decaying kernels. Examples of local statistics include intrinsic volumes, face counts, component counts of random cubical complexes, while exponentially quasi-local statistics include nearest neighbour distances in spin models and Betti numbers of sub-critical random cubical complexes.
Statistical symmetries in physics
International Nuclear Information System (INIS)
Green, H.S.; Adelaide Univ., SA
1994-01-01
Every law of physics is invariant under some group of transformations and is therefore the expression of some type of symmetry. Symmetries are classified as geometrical, dynamical or statistical. At the most fundamental level, statistical symmetries are expressed in the field theories of the elementary particles. This paper traces some of the developments from the discovery of Bose statistics, one of the two fundamental symmetries of physics. A series of generalizations of Bose statistics is described. A supersymmetric generalization accommodates fermions as well as bosons, and further generalizations, including parastatistics, modular statistics and graded statistics, accommodate particles with properties such as 'colour'. A factorization of elements of ggl(n_b, n_f) can be used to define truncated boson operators. A general construction is given for q-deformed boson operators, and explicit constructions of the same type are given for various 'deformed' algebras. A summary is given of some of the applications and potential applications. 39 refs., 2 figs
General practice and the new science emerging from the theories of 'chaos' and complexity.
Griffiths, F; Byrne, D
1998-01-01
This paper outlines the general practice world view and introduces the main features of the theories of 'chaos' and complexity. From this, analogies are drawn between general practice and the theories, which suggest a different way of understanding general practice and point to future developments in general practice research. A conceptual and practical link between qualitative and quantitative methods of research is suggested. Methods of combining data about social context with data about in...
Complexity control in statistical learning
Indian Academy of Sciences (India)
Then we describe how the method of regularization is used to control complexity in learning. We discuss two examples of regularization, one in which the function space used is finite dimensional, and another in which it is a reproducing kernel Hilbert space. Our exposition follows the formulation of Cucker and Smale.
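The two regularization settings mentioned, a finite-dimensional function space and a reproducing kernel Hilbert space, in sketch form (scikit-learn, with arbitrary penalty and kernel parameters):

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, (80, 1))
y = np.sin(3 * x[:, 0]) + 0.1 * rng.standard_normal(80)

# finite-dimensional function space: polynomial features with an L2 penalty
P = np.hstack([x ** k for k in range(1, 8)])
poly = Ridge(alpha=1e-3).fit(P, y)

# RKHS: kernel ridge regression with an RBF kernel
krr = KernelRidge(alpha=1e-3, kernel="rbf", gamma=5.0).fit(x, y)
print(poly.score(P, y), krr.score(x, y))
```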
MAGMA: Generalized Gene-Set Analysis of GWAS Data
de Leeuw, C.A.; Mooij, J.M.; Heskes, T.; Posthuma, D.
2015-01-01
By aggregating data for complex traits in a biologically meaningful way, gene and gene-set analysis constitute a valuable addition to single-marker analysis. However, although various methods for gene and gene-set analysis currently exist, they generally suffer from a number of issues.
Operationalization of biopsychosocial case complexity in general health care : the INTERMED project
de Jonge, P; Huyse, FJ; Slaets, JPJ; Sollner, W; Stiefel, FC
Objective: Lack of operationalization of the biopsychosocial model hinders its effective application to the increasingly prevalent problems of comorbidities in clinical presentations. Here, we describe the INTERMED, an instrument to assess biopsychosocial case complexity in general health care.
Yilmaz, Ferkan
2012-06-01
The exact analysis of the higher-order statistics of the channel capacity (i.e., higher-order ergodic capacity) often leads to complicated expressions involving advanced special functions. In this paper, we provide a generic framework for the computation of the higher-order statistics of the channel capacity over generalized fading channels. As such, this novel framework for the higher-order statistics results in simple, closed-form expressions which are shown to be asymptotically tight bounds in the high signal-to-noise ratio (SNR) regime of a variety of fading environments. In addition, it reveals the existence of differences (i.e., constant capacity gaps in the log-domain) among different fading environments. By an asymptotically tight bound we mean that the high-SNR limit of the difference between the actual higher-order statistics of the channel capacity and its asymptotic bound (i.e., lower bound) tends to zero. The mathematical formalism is illustrated with some selected numerical examples that validate the correctness of our newly derived results. © 2012 IEEE.
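In the usual notation for this problem, with C = log2(1 + γ) the instantaneous capacity and γ the SNR, the quantities being bounded are the capacity moments; the following is the baseline definition and the high-SNR behavior described in the abstract, not the paper's closed-form results:

```latex
\mu_n = \mathbb{E}\left[C^n\right]
      = \mathbb{E}\left[\log_2^{\,n}(1+\gamma)\right],
\qquad
\mu_n \approx \mathbb{E}\left[\log_2^{\,n}\gamma\right] + \text{constant gap}
\quad (\gamma \gg 1).
```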
International Nuclear Information System (INIS)
Dai, Wu-Sheng; Xie, Mi
2013-01-01
In this paper, we give a general discussion on the calculation of the statistical distribution from a given operator relation of creation, annihilation, and number operators. Our result shows that as long as the relation between the number operator and the creation and annihilation operators can be expressed as a†b = Λ(N) or N = Λ⁻¹(a†b), where N, a†, and b denote the number, creation, and annihilation operators, i.e., N is a function of the quadratic product of the creation and annihilation operators, the corresponding statistical distribution is the Gentile distribution, a statistical distribution in which the maximum occupation number is an arbitrary integer. As examples, we discuss the statistical distributions corresponding to various operator relations. In particular, besides the Bose–Einstein and Fermi–Dirac cases, we discuss the statistical distributions for various schemes of intermediate statistics, especially various q-deformation schemes. Our result shows that the statistical distributions corresponding to various q-deformation schemes are various Gentile distributions with different maximum occupation numbers which are determined by the deformation parameter q. This result shows that the results given in much literature on the q-deformation distribution are inaccurate or incomplete. Highlights: • A general discussion on calculating the statistical distribution from relations of creation, annihilation, and number operators. • A systematic study of the statistical distributions corresponding to various q-deformation schemes. • An argument that many results on q-deformation distributions in the literature are inaccurate or incomplete.
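The Gentile distribution referred to, for maximum occupation number m (a standard result; m = 1 recovers Fermi-Dirac and m → ∞ recovers Bose-Einstein):

```latex
\bar n(\varepsilon) = \frac{1}{e^{\beta(\varepsilon-\mu)}-1}
- \frac{m+1}{e^{(m+1)\beta(\varepsilon-\mu)}-1}.
```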
Overdispersion in nuclear statistics
International Nuclear Information System (INIS)
Semkow, Thomas M.
1999-01-01
The modern statistical distribution theory is applied to the development of the overdispersion theory in ionizing-radiation statistics for the first time. The physical nuclear system is treated as a sequence of binomial processes, each depending on a characteristic probability, such as probability of decay, detection, etc. The probabilities fluctuate in the course of a measurement, and the physical reasons for that are discussed. If the average values of the probabilities change from measurement to measurement, which originates from the random Lexis binomial sampling scheme, then the resulting distribution is overdispersed. The generating functions and probability distribution functions are derived, followed by a moment analysis. The Poisson and Gaussian limits are also given. The distribution functions belong to a family of generalized hypergeometric factorial moment distributions by Kemp and Kemp, and can serve as likelihood functions for the statistical estimations. An application to radioactive decay with detection is described and working formulae are given, including a procedure for testing the counting data for overdispersion. More complex experiments in nuclear physics (such as solar neutrino experiments) can be handled by this model, as can the task of distinguishing between source and background.
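A minimal version of the overdispersion test mentioned at the end: under the Poisson hypothesis the index of dispersion of repeated counts follows a chi-square law (the counts below are hypothetical):

```python
import numpy as np
from scipy import stats

counts = np.array([102, 98, 110, 95, 130, 89, 121, 104, 99, 140])  # hypothetical
n = len(counts)
d = (n - 1) * counts.var(ddof=1) / counts.mean()   # index of dispersion
p = stats.chi2.sf(d, df=n - 1)                     # chi-square tail probability
print(f"D = {d:.1f}, p = {p:.4f}  (small p suggests overdispersion)")
```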
Caregiver Statistics: Demographics
Because caregiving needs and services are wide-ranging and complex, statistics may vary from study to study.
International Nuclear Information System (INIS)
Androsenko, A.A.; Androsenko, P.A.
1983-01-01
A description is given of the structure, input procedure and recording rules of initial data for the BRAND programme complex, intended for the Monte Carlo simulation of neutron physics experiments. The BRAND complex ideology is based on non-analogue simulation of the neutron and photon transport process (statistical weights are used; absorption and escape of particles from the region considered are taken into account; biased sampling of the coordinate part of the transition kernel density is applied; local estimators are used; etc.). The preparation of initial data is described in detail for three sections: general information for the Monte Carlo calculation, source definition, and data describing the geometry of the system. The complex runs on the BESM-6 computer; the basic programming language is FORTRAN, and the volume is more than 8000 statements.
On the advisability of developing automatic complexes for radiation flaw detection
International Nuclear Information System (INIS)
Akopov, V.S.; Voronin, S.A.; Meshalkin, I.A.
1976-01-01
On the basis of mathematical treatment of statistical data obtained by surveying specialists from a number of factories, problems associated with determining the most acceptable efficiency of automated radiation defectoscopy complexes are considered. Production requirements for the sensitivity of radiation inspection are generalized. The advisability of equipping the complexes with computer technology is substantiated.
Denker, Manfred
2017-01-01
Introductory Statistics and Random Phenomena integrates traditional statistical data analysis with new computational experimentation capabilities and concepts of algorithmic complexity and chaotic behavior in nonlinear dynamic systems. This was the first advanced text/reference to bring together such a comprehensive variety of tools for the study of random phenomena occurring in engineering and the natural, life, and social sciences. The crucial computer experiments are conducted using the readily available computer program Mathematica® Uncertain Virtual Worlds™ software packages which optimize and facilitate the simulation environment. Brief tutorials are included that explain how to use the Mathematica® programs for effective simulation and computer experiments. Large and original real-life data sets are introduced and analyzed as a model for independent study. This is an excellent classroom tool and self-study guide. The material is presented in a clear and accessible style providing numerous...
Bruña, Ricardo; Poza, Jesús; Gómez, Carlos; García, María; Fernández, Alberto; Hornero, Roberto
2012-06-01
Alzheimer's disease (AD) is the most common cause of dementia. Over the last few years, a considerable effort has been devoted to exploring new biomarkers. Nevertheless, a better understanding of brain dynamics is still required to optimize therapeutic strategies. In this regard, the characterization of mild cognitive impairment (MCI) is crucial, due to the high conversion rate from MCI to AD. However, only a few studies have focused on the analysis of magnetoencephalographic (MEG) rhythms to characterize AD and MCI. In this study, we assess the ability of several parameters derived from information theory to describe spontaneous MEG activity from 36 AD patients, 18 MCI subjects and 26 controls. Three entropies (Shannon, Tsallis and Rényi entropies), one disequilibrium measure (based on the Euclidean distance, ED) and three statistical complexities (based on the Lopez-Ruiz-Mancini-Calbet complexity, LMC) were used to estimate the irregularity and statistical complexity of MEG activity. Statistically significant differences between AD patients and controls were obtained with all parameters. To evaluate discrimination ability, a validation procedure was applied; the accuracies reached 83.9% and 65.9% to discriminate AD and MCI subjects from controls, respectively. Our findings suggest that MCI subjects exhibit an intermediate pattern of abnormalities between normal aging and AD. Furthermore, the proposed parameters provide a new description of brain dynamics in AD and MCI.
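The irregularity and complexity measures used can all be computed from a discrete probability distribution, e.g. a normalized power spectrum; a sketch follows, where the LMC normalization chosen is one common variant and not necessarily the paper's:

```python
import numpy as np

def measures(p, q=2.0):
    p = p / p.sum()                                # normalize to a distribution
    n = len(p)
    h_shannon = -np.sum(p * np.log(p + 1e-300))    # Shannon entropy
    h_tsallis = (1 - np.sum(p ** q)) / (q - 1)     # Tsallis entropy
    h_renyi = np.log(np.sum(p ** q)) / (1 - q)     # Renyi entropy
    d_euclid = np.sum((p - 1.0 / n) ** 2)          # disequilibrium (ED)
    c_lmc = (h_shannon / np.log(n)) * d_euclid     # LMC-type complexity
    return h_shannon, h_tsallis, h_renyi, d_euclid, c_lmc

# example: power spectrum of a random signal as the distribution
sig = np.random.default_rng(0).standard_normal(512)
p = np.abs(np.fft.rfft(sig)) ** 2
print(measures(p))
```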
Rajab, Ghada Z; Suh, Soh Youn; Demer, Joseph L
2017-06-01
Dissociated strabismus complex (DSC) is an enigmatic form of strabismus that includes dissociated vertical deviation (DVD) and dissociated horizontal deviation (DHD). We employed magnetic resonance imaging (MRI) to evaluate the extraocular muscles in DSC. We studied 5 patients with DSC, with a mean age of 25 years (range, 12-42 years), and 15 age-matched, orthotropic control subjects. All patients had DVD; 4 also had DHD. We employed high-resolution, surface-coil MRI with thin, 2 mm slices and central target fixation. Volumes of the rectus and superior oblique muscles in the region 12 mm posterior to 4 mm anterior to the globe-optic nerve junction were measured in quasi-coronal planes in central gaze. Patients with DSC had no structural abnormalities of the rectus muscles, rectus pulleys or superior oblique muscle, but exhibited a modest, statistically significant increase in the volume of all rectus muscles, ranging from 20% for the medial rectus to 9% for the lateral rectus. DSC is associated with generalized rectus extraocular muscle hypertrophy in the absence of other orbital abnormalities. Copyright © 2017 American Association for Pediatric Ophthalmology and Strabismus. Published by Elsevier Inc. All rights reserved.
Pérez, Darío G; Funes, Gustavo
2012-12-03
Under the geometric optics approximation it is possible to estimate the covariance between the displacements of two thin beams after they have propagated through a turbulent medium. Previous works have concentrated on long propagation distances to provide models for the wandering statistics. These models are useful when the separation between beams is smaller than the propagation path, regardless of the characteristic scales of the turbulence. In this work we give a complete model for the behavior of these covariances, introducing absolute limits to the validity of the former approximations. Moreover, these generalizations are established for non-Kolmogorov atmospheric models.
Low Complexity Sparse Bayesian Learning for Channel Estimation Using Generalized Mean Field
DEFF Research Database (Denmark)
Pedersen, Niels Lovmand; Manchón, Carles Navarro; Fleury, Bernard Henri
2014-01-01
We derive low complexity versions of a wide range of algorithms for sparse Bayesian learning (SBL) in underdetermined linear systems. The proposed algorithms are obtained by applying the generalized mean field (GMF) inference framework to a generic SBL probabilistic model. In the GMF framework, we...
Heath, Elizabeth M; English, Jeryl D; Johnson, Cleverick D; Swearingen, Elizabeth B; Akyalcin, Sercan
2017-02-01
Our aims were to assess the perceptions of orthodontic case complexity among orthodontists, general dentists, orthodontic residents, and dental students and to compare their perceptions with the American Board of Orthodontics Discrepancy Index (DI). Orthodontists, general dentists, orthodontic residents, and dental students (n = 343) participated in a Web-based survey. Pretreatment orthodontic records of 29 cases with varying DI scores were obtained. Respondents were asked to evaluate case complexity on a 100-point visual analog scale. Additional information was collected on participants' orthodontic education and orthodontic treatment preferences. Pearson correlation coefficients were used to assess the relationship between the average complexity score and the DI score. Repeated measures analysis with linear mixed models was used to assess the association between the average complexity score and the DI score and whether the association between the 2 scores varied by level of difficulty or panel group. The level of significance was set a priori for all analyses. DI score was significantly associated with complexity perceptions (P = 0.0168). Associations between average complexity and DI score varied significantly by provider group (P = 0.0033), with orthodontists and residents showing the strongest associations. When the DI score was greater than 15, orthodontists and residents perceived cases as more complex than did the other provider groups. Orthodontists and orthodontic residents had better judgments for evaluating orthodontic case complexity. The high correlation between orthodontic professionals' perceptions and DI scores suggested that additional orthodontic education and training have an influence on the ability to recognize case complexity. Copyright © 2017 American Association of Orthodontists. Published by Elsevier Inc. All rights reserved.
Energy Technology Data Exchange (ETDEWEB)
Al Mouhamed, Mayez
1977-09-15
In a number of complex physical systems the accessible signals are often characterized by random fluctuations about a mean value. The fluctuations (the signature) often transmit information about the state of the system that the mean value cannot predict. This study was undertaken to elaborate statistical methods of anomaly detection on the basis of signature analysis of the noise inherent in the process. The algorithm presented first learns the characteristics of normal operation of a complex process; it then detects small deviations from the normal behavior. The algorithm can be implemented on a medium-sized computer for on-line application. (author)
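A toy version of the two stages described, learning normal-operation statistics of signature features and flagging small deviations with a Mahalanobis-distance test (features and threshold are illustrative choices, not the report's algorithm):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
normal = rng.normal(0, 1, (500, 4))      # hypothetical signature features
mu = normal.mean(axis=0)                 # learned normal-operation statistics
cov_inv = np.linalg.inv(np.cov(normal, rowvar=False))

def is_anomalous(x, alpha=0.001):
    d2 = (x - mu) @ cov_inv @ (x - mu)   # squared Mahalanobis distance
    return d2 > stats.chi2.ppf(1 - alpha, df=len(mu))

print(is_anomalous(mu))                  # False: nominal operation
print(is_anomalous(mu + 5))              # True: systematic shift detected
```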
Complex networks from multivariate time series
Czech Academy of Sciences Publication Activity Database
Paluš, Milan; Hartman, David; Vejmelka, Martin
2010-01-01
Roč. 12, - (2010), A-14382 ISSN 1607-7962. [General Assembly of the European Geophysical Society. 02.05.2010-07.05.2010, Vienna] R&D Projects: GA AV ČR IAA300420805 Institutional research plan: CEZ:AV0Z10300504 Keywords: complex network * surface air temperature * reanalysis data * global change Subject RIV: BB - Applied Statistics, Operational Research
International Nuclear Information System (INIS)
Masoud, M.S.; Motaweh, H.A.; Ali, A.E.
1999-01-01
Full text: The electronic absorption spectra of the octahedral complexes containing monoethanolamine were recorded in different solvents (dioxane, chloroform, ethanol, dimethylformamide, dimethylsulfoxide and water). The data were analyzed with a multiple linear regression technique, using an equation in which the measured quantity is expressed as a regression intercept plus a linear combination of various empirical solvent polarity parameters; the constants were calculated using a statistics program on a PC. The solvent spectral data of the complexes are compared to those in Nujol; the solvents shift the spectral bands to the red. In the case of the Mn(MEA)Cl complex, numerous bands appear in the presence of CHCl3, DMF and DMSO solvents, probably due to the numerous oxidation states. The solvent parameters E (solvent-solute hydrogen bond and dipolar interaction), the dielectric-constant-related dipolar interaction parameter, M (solute permanent dipole-solvent induced dipole) and N (solute permanent dipole-solvent permanent dipole) are correlated with the structure of the complexes. In hydrogen bonding solvents, as the dielectric constant increases, a blue shift occurs due to conjugation with high stability; the data in DMF and DMSO solvents are nearly the same, probably due to their similarity.
Subjective randomness as statistical inference.
Griffiths, Thomas L; Daniels, Dylan; Austerweil, Joseph L; Tenenbaum, Joshua B
2018-06-01
Some events seem more random than others. For example, when tossing a coin, a sequence of eight heads in a row does not seem very random. Where do these intuitions about randomness come from? We argue that subjective randomness can be understood as the result of a statistical inference assessing the evidence that an event provides for having been produced by a random generating process. We show how this account provides a link to previous work relating randomness to algorithmic complexity, in which random events are those that cannot be described by short computer programs. Algorithmic complexity is both incomputable and too general to capture the regularities that people can recognize, but viewing randomness as statistical inference provides two paths to addressing these problems: considering regularities generated by simpler computing machines, and restricting the set of probability distributions that characterize regularity. Building on previous work exploring these different routes to a more restricted notion of randomness, we define strong quantitative models of human randomness judgments that apply not just to binary sequences - which have been the focus of much of the previous work on subjective randomness - but also to binary matrices and spatial clustering. Copyright © 2018 Elsevier Inc. All rights reserved.
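The inference view can be made concrete for binary sequences: score a sequence by the evidence it gives for a fair random source against a simple "regular" source. The repeat/alternate mixture below is an illustrative choice of regular model, not the models defined in the paper:

```python
import numpy as np

def log2_p_random(s):
    return -float(len(s))                     # each symbol costs 1 bit

def log2_p_regular(s, stay=0.9):
    # Markov source that repeats the previous symbol with probability
    # `stay`; uniform mixture of the repeating and alternating variants.
    def ll(p_stay):
        out = -1.0                            # first symbol: 1 bit
        for a, b in zip(s, s[1:]):
            out += np.log2(p_stay if a == b else 1 - p_stay)
        return out
    return np.logaddexp2(ll(stay), ll(1 - stay)) - 1.0

for s in ("11111111", "10101010", "10010111"):
    r = log2_p_random(s) - log2_p_regular(s)  # log-likelihood ratio in bits
    print(s, f"randomness = {r:+.2f} bits")   # negative: looks non-random
```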
Directory of Open Access Journals (Sweden)
Swarna Weerasinghe
2017-03-01
Conclusion: This study demonstrated the importance of using complex statistical models that account for data structures in public health risk assessments, and the consequences of the lack of such modelling.
Statistical mechanics for a class of quantum statistics
International Nuclear Information System (INIS)
Isakov, S.B.
1994-01-01
Generalized statistical distributions for identical particles are introduced for the case where filling a single-particle quantum state by particles depends on filling states of different momenta. The system of one-dimensional bosons with a two-body potential that can be solved by means of the thermodynamic Bethe ansatz is shown to be equivalent thermodynamically to a system of free particles obeying statistical distributions of the above class. The quantum statistics arising in this way are completely determined by the two-particle scattering phases of the corresponding interacting systems. An equation determining the statistical distributions for these statistics is derived
International Nuclear Information System (INIS)
Beck, W.
1984-01-01
The complexity of computer programs for the solution of scientific and technical problems gives rise to many questions. Typical questions concern the strengths and weaknesses of computer programs, the propagation of uncertainties among the input data, the sensitivity of the output data to the input data, and the substitution of complex models by simpler ones that provide equivalent results in certain ranges. These questions have general practical significance, and answers may in principle be found by statistical methods based on the Monte Carlo method. In this report the statistical methods are chosen, described and evaluated. They are implemented in the modular program system STAR, which is a separate component of the program system RSYST. The design of STAR takes into account: users with different knowledge of data processing and statistics; the variety of statistical methods and of generating and evaluating procedures; the processing of large data sets in complex structures; the coupling to other components of RSYST and to programs outside RSYST; and the requirement that the system can be easily modified and enlarged. Four examples are given, which demonstrate the application of STAR. (orig.) [de]
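The kind of Monte Carlo study STAR automates, in miniature: sample uncertain inputs, propagate them through a model, and rank the sensitivity of the output to each input (the model and input distributions below are toy stand-ins):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 2000
x1 = rng.normal(1.0, 0.1, n)       # uncertain input 1
x2 = rng.uniform(0.5, 1.5, n)      # uncertain input 2
x3 = rng.normal(0.0, 1.0, n)       # uncertain input 3 (weakly coupled)

y = x1 ** 2 * x2 + 0.05 * x3       # toy surrogate for the complex program

print("output mean/std:", y.mean(), y.std(ddof=1))
for name, x in (("x1", x1), ("x2", x2), ("x3", x3)):
    rho, _ = stats.spearmanr(x, y)             # rank-correlation sensitivity
    print(f"sensitivity of y to {name}: rho = {rho:+.2f}")
```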
Exact solutions of the one-dimensional generalized modified complex Ginzburg-Landau equation
International Nuclear Information System (INIS)
Yomba, Emmanuel; Kofane, Timoleon Crepin
2003-01-01
The one-dimensional (1D) generalized modified complex Ginzburg-Landau (MCGL) equation for traveling wave systems is analytically studied. Exact solutions of this equation are obtained using a method which combines the Painleve test for integrability in the formalism of Weiss-Tabor-Carnevale and the Hirota bilinearization technique. We show that pulses, fronts, periodic unbounded waves, sources, sinks and a solution describing the collision between two fronts are the important coherent structures that organize much of the dynamical behavior of these traveling wave systems. The degeneracies of the 1D generalized MCGL equation are examined, as well as several of their solutions. These degeneracies include two important equations: the 1D generalized modified Schroedinger equation and the 1D generalized real modified Ginzburg-Landau equation. We find that the one-parameter family of traveling localized source solutions called 'Nozaki-Bekki holes' becomes a subfamily of the dark soliton solutions in the 1D generalized modified Schroedinger limit.
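For orientation, the baseline cubic complex Ginzburg-Landau equation of which the generalized MCGL equation studied here is an extension (the paper's equation carries additional terms not shown):

```latex
A_t = A + (1+i\alpha)\,A_{xx} - (1+i\beta)\,|A|^2 A .
```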
Petersson, K M; Nichols, T E; Poline, J B; Holmes, A P
1999-01-01
Functional neuroimaging (FNI) provides experimental access to the intact living brain making it possible to study higher cognitive functions in humans. In this review and in a companion paper in this issue, we discuss some common methods used to analyse FNI data. The emphasis in both papers is on assumptions and limitations of the methods reviewed. There are several methods available to analyse FNI data indicating that none is optimal for all purposes. In order to make optimal use of the methods available it is important to know the limits of applicability. For the interpretation of FNI results it is also important to take into account the assumptions, approximations and inherent limitations of the methods used. This paper gives a brief overview over some non-inferential descriptive methods and common statistical models used in FNI. Issues relating to the complex problem of model selection are discussed. In general, proper model selection is a necessary prerequisite for the validity of the subsequent statistical inference. The non-inferential section describes methods that, combined with inspection of parameter estimates and other simple measures, can aid in the process of model selection and verification of assumptions. The section on statistical models covers approaches to global normalization and some aspects of univariate, multivariate, and Bayesian models. Finally, approaches to functional connectivity and effective connectivity are discussed. In the companion paper we review issues related to signal detection and statistical inference. PMID:10466149
[''R"--project for statistical computing
DEFF Research Database (Denmark)
Dessau, R.B.; Pipper, Christian Bressen
2008-01-01
An introduction to the R project for statistical computing (www.R-project.org) is presented. The main topics are: 1. To make the professional community aware of "R" as a potent and free software for graphical and statistical analysis of medical data; 2. Simple well-known statistical tests are fairly easy to perform in R, but more complex modelling requires programming skills; 3. R is seen as a tool for teaching statistics and implementing complex modelling of medical data among medical professionals. Publication date: 2008/1/28.
Sinnott, Jan D.
This paper discusses the utility of a general systems theory paradigm for psychology. The paradigm can be used for conceptualizing such complex phenomena as change over time in living systems, person-society interactions, and the epistemology of multiply determined changes. Consideration is also given to applications of the approach to…
General classical solutions of the complex Grassmannian and CP^(N-1) sigma models
International Nuclear Information System (INIS)
Sasaki, Ryu.
1983-05-01
General classical solutions are constructed for the complex Grassmannian non-linear sigma models in two Euclidean dimensions in terms of holomorphic functions. The Grassmannian sigma models are a simple generalization of the well-known CP^(N-1) model in two dimensions and they share various interesting properties: existence of (anti-)instantons, an infinite number of conserved quantities and complete integrability. (author)
Universal Poisson Statistics of mRNAs with Complex Decay Pathways.
Thattai, Mukund
2016-01-19
Messenger RNA (mRNA) dynamics in single cells are often modeled as a memoryless birth-death process with a constant probability per unit time that an mRNA molecule is synthesized or degraded. This predicts a Poisson steady-state distribution of mRNA number, in close agreement with experiments. This is surprising, since mRNA decay is known to be a complex process. The paradox is resolved by realizing that the Poisson steady state generalizes to arbitrary mRNA lifetime distributions. A mapping between mRNA dynamics and queueing theory highlights an identifiability problem: a measured Poisson steady state is consistent with a large variety of microscopic models. Here, I provide a rigorous and intuitive explanation for the universality of the Poisson steady state. I show that the mRNA birth-death process and its complex decay variants all take the form of the familiar Poisson law of rare events, under a nonlinear rescaling of time. As a corollary, not only steady-states but also transients are Poisson distributed. Deviations from the Poisson form occur only under two conditions, promoter fluctuations leading to transcriptional bursts or nonindependent degradation of mRNA molecules. These results place severe limits on the power of single-cell experiments to probe microscopic mechanisms, and they highlight the need for single-molecule measurements. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
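The queueing-theory point is easy to check by simulation: an M/G/∞ process (Poisson synthesis, arbitrary i.i.d. lifetimes) gives a Poisson copy number, here with markedly non-exponential lognormal lifetimes (rates are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
k, T = 10.0, 50.0                        # synthesis rate, observation time
counts = []
for _ in range(5000):
    n_births = rng.poisson(k * T)
    birth_times = rng.uniform(0, T, n_births)
    lifetimes = rng.lognormal(0.0, 1.0, n_births)   # non-exponential decay
    counts.append(np.sum(birth_times + lifetimes > T))  # molecules alive at T

counts = np.asarray(counts)
print("mean:", counts.mean(), "Fano factor:", counts.var() / counts.mean())
# Fano factor close to 1, as expected for a Poisson distribution
```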
Nonequilibrium statistical mechanics in the general theory of relativity. I. A general formalism
International Nuclear Information System (INIS)
Israel, W.; Kandrup, H.E.
1984-01-01
This is the first in a series of papers, the overall objective of which is the formulation of a new covariant approach to nonequilibrium statistical mechanics in classical general relativity. The object here is the development of a tractable theory for self-gravitating systems. It is argued that the "state" of an N-particle system may be characterized by an N-particle distribution function, defined in an 8N-dimensional phase space, which satisfies a collection of N conservation equations. By mapping the true physics onto a fictitious "background" spacetime, which may be chosen to satisfy some "average" field equations, one then obtains a useful covariant notion of "evolution" in response to a fluctuating "gravitational force." For many cases of practical interest, one may suppose (i) that these fluctuating forces satisfy linear field equations and (ii) that they may be modeled by a direct interaction. In this case, one can use a relativistic projection operator formalism to derive exact closed equations for the evolution of such objects as an appropriately defined reduced one-particle distribution function. By capturing, in a natural way, the notion of a dilute gas, or impulse, approximation, one is then led to a comparatively simple equation for the one-particle distribution. If, furthermore, one treats the effects of the fluctuating forces as "localized" in space and time, one obtains a tractable kinetic equation which reduces, in the Newtonian limit, to the standard Landau equation.
Normality of raw data in general linear models: The most widespread myth in statistics
Kery, Marc; Hatfield, Jeff S.
2003-01-01
In years of statistical consulting for ecologists and wildlife biologists, by far the most common misconception we have come across has been the one about normality in general linear models. These comprise a very large part of the statistical models used in ecology and include t tests, simple and multiple linear regression, polynomial regression, and analysis of variance (ANOVA) and covariance (ANCOVA). There is a widely held belief that the normality assumption pertains to the raw data rather than to the model residuals. We suspect that this error may also occur in countless published studies, whenever the normality assumption is tested prior to analysis. This may lead to the use of nonparametric alternatives (if there are any), when parametric tests would indeed be appropriate, or to use of transformations of raw data, which may introduce hidden assumptions such as multiplicative effects on the natural scale in the case of log-transformed data. Our aim here is to dispel this myth. We very briefly describe relevant theory for two cases of general linear models to show that the residuals need to be normally distributed if tests requiring normality are to be used, such as t and F tests. We then give two examples demonstrating that the distribution of the response variable may be nonnormal, and yet the residuals are well behaved. We do not go into the issue of how to test normality; instead we display the distributions of response variables and residuals graphically.
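The point is quickly demonstrated by simulation: with a strong binary predictor the raw response is bimodal and fails a normality test, while the residuals pass it (a minimal sketch):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
group = np.repeat([0.0, 10.0], 200)          # two-group design (ANOVA-like)
y = group + rng.normal(0, 1, 400)            # response: clearly bimodal
fitted = np.array([y[group == g].mean() for g in group])
resid = y - fitted                           # model residuals

print("raw data:  p =", stats.shapiro(y).pvalue)      # rejects normality
print("residuals: p =", stats.shapiro(resid).pvalue)  # consistent with it
```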
A statistical mechanical approach for the computation of the climatic response to general forcings
Directory of Open Access Journals (Sweden)
V. Lucarini
2011-01-01
The climate belongs to the class of non-equilibrium forced and dissipative systems, for which most results of quasi-equilibrium statistical mechanics, including the fluctuation-dissipation theorem, do not apply. In this paper we show for the first time how the Ruelle linear response theory, developed for studying rigorously the impact of perturbations on general observables of non-equilibrium statistical mechanical systems, can be applied with great success to analyze the climatic response to general forcings. The crucial value of the Ruelle theory lies in the fact that it allows one to compute the response of the system in terms of expectation values of explicit and computable functions of the phase space, averaged over the invariant measure of the unperturbed state. We choose as a test bed a classical version of the Lorenz 96 model, which, in spite of its simplicity, has a well-recognized prototypical value, as it is a spatially extended one-dimensional model and presents the basic ingredients, such as dissipation, advection and the presence of an external forcing, of the actual atmosphere. We recapitulate the main aspects of the general response theory and propose some new general results. We then analyze the frequency dependence of the response of both local and global observables to perturbations having localized as well as global spatial patterns. We derive analytically several properties of the corresponding susceptibilities, such as asymptotic behavior, validity of Kramers-Kronig relations, and sum rules, whose main ingredient is the causality principle. We show that all the coefficients of the leading asymptotic expansions, as well as the integral constraints, can be written as linear functions of parameters that describe the unperturbed properties of the system, such as its average energy. Some newly obtained empirical closure equations for such parameters allow one to define such properties as an explicit function of the unperturbed forcing.
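The test bed itself is compact enough to state in full: the Lorenz 96 model with forcing F, dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F, integrated here with a basic fourth-order Runge-Kutta step (step size and system size are illustrative):

```python
import numpy as np

def l96(x, F=8.0):
    # dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F, periodic indices
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def rk4_step(x, dt, F=8.0):
    k1 = l96(x, F)
    k2 = l96(x + 0.5 * dt * k1, F)
    k3 = l96(x + 0.5 * dt * k2, F)
    k4 = l96(x + dt * k3, F)
    return x + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6

x = 8.0 * np.ones(40)
x[0] += 0.01                      # small perturbation of the rest state
for _ in range(5000):             # integrate past the initial transient
    x = rk4_step(x, 0.01)
print("mean energy per site:", 0.5 * np.mean(x ** 2))
```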
Hanel, Rudolf; Thurner, Stefan; Gell-Mann, Murray
2014-05-13
The maximum entropy principle (MEP) is a method for obtaining the most likely distribution functions of observables from statistical systems by maximizing entropy under constraints. The MEP has found hundreds of applications in ergodic and Markovian systems in statistical mechanics, information theory, and statistics. For several decades there has been an ongoing controversy over whether the notion of the maximum entropy principle can be extended in a meaningful way to nonextensive, nonergodic, and complex statistical systems and processes. In this paper we start by reviewing how Boltzmann-Gibbs-Shannon entropy is related to multiplicities of independent random processes. We then show how the relaxation of independence naturally leads to the most general entropies that are compatible with the first three Shannon-Khinchin axioms, the (c,d)-entropies. We demonstrate that the MEP is a perfectly consistent concept for nonergodic and complex statistical systems if their relative entropy can be factored into a generalized multiplicity and a constraint term. The problem of finding such a factorization reduces to finding an appropriate representation of relative entropy in a linear basis. In a particular example we show that path-dependent random processes with memory naturally require specific generalized entropies. The example is to our knowledge the first exact derivation of a generalized entropy from the microscopic properties of a path-dependent random process.
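The ergodic baseline being generalized: maximizing Boltzmann-Gibbs-Shannon entropy under normalization and mean-energy constraints yields the exponential (Boltzmann) distribution:

```latex
\max_{\{p_i\}} \Big(-\sum_i p_i \ln p_i\Big)
\quad \text{s.t.} \quad \sum_i p_i = 1, \;\; \sum_i p_i E_i = U
\;\;\Longrightarrow\;\;
p_i = \frac{e^{-\beta E_i}}{\sum_j e^{-\beta E_j}} .
```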
Correlated electrons and generalized statistics
International Nuclear Information System (INIS)
Wang, Q.A.
2003-01-01
Several important generalizations of the Fermi-Dirac distribution are compared to numerical and experimental results for correlated electron systems. It is found that the quantum distributions based on the incomplete information hypothesis can be useful for describing this kind of system. We show that the additive incomplete fermion distribution gives a very good description of weakly correlated electrons and that the non-additive one is suitable for very strongly correlated cases. (author)
Generalized statistics and the formation of a quark-gluon plasma
International Nuclear Information System (INIS)
Teweldeberhan, A.M.; Miller, H.G.; Tegen, R.
2003-01-01
The aim of this paper is to investigate the effect of a non-extensive form of statistical mechanics proposed by Tsallis on the formation of a quark-gluon plasma (QGP). We suggest accounting for the effects of the dominant part of the long-range interactions among the constituents of the QGP by a change in the statistics of the system in this phase, and we study the relevance of this statistics for the phase transition. The results show that small deviations (≈ 10%) from Boltzmann–Gibbs statistics in the QGP produce a noticeable change in the phase diagram, which can, in principle, be tested experimentally. (author)
Chang, Pao-Erh Paul; Yang, Jen-Chih Rena; Den, Walter; Wu, Chang-Fu
2014-09-01
Emissions of volatile organic compounds (VOCs) are among the most frequent causes of environmental nuisance complaints in urban areas, especially where industrial districts are nearby. Unfortunately, identifying the responsible emission sources of VOCs is an inherently difficult task. In this study, we proposed a dynamic approach to gradually confine the location of potential VOC emission sources in an industrial complex, by combining multi-path open-path Fourier transform infrared spectrometry (OP-FTIR) measurement and the statistical method of principal component analysis (PCA). Closed-cell FTIR was further used to verify the VOC emission sources by measuring emitted VOCs from selected exhaust stacks at factories in the confined areas. Multiple open-path monitoring lines were deployed during a 3-month monitoring campaign in a complex industrial district. The emission patterns were identified and the locations of emissions were confined using the wind data collected simultaneously. N,N-Dimethylformamide (DMF), 2-butanone, toluene, and ethyl acetate, with mean concentrations of 80.0 ± 1.8, 34.5 ± 0.8, 103.7 ± 2.8, and 26.6 ± 0.7 ppbv, respectively, were identified as the major VOC mixture at all times of the day around the receptor site. As a toxic air pollutant, DMF was found in air samples at concentrations exceeding the ambient standard, despite the path-averaging effect of OP-FTIR on concentration levels. The PCA identified three major emission sources: the PU coating, chemical packaging, and lithographic printing industries. Applying instrumental measurement and statistical modeling, this study has established a systematic approach for locating emission sources. Statistical modeling (PCA) plays an important role in reducing the dimensionality of a large measured dataset and identifying underlying emission sources. Instrumental measurement, however, helps verify the outcomes of the statistical modeling. The field study has demonstrated the feasibility of the proposed approach.
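A sketch of the statistical step: PCA applied to multi-species concentration time series, where a sharp drop in explained variance after a few components points to a small number of underlying emission factors (synthetic three-source data; only the four compound names come from the study):

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
hours, n_species = 500, 4        # DMF, 2-butanone, toluene, ethyl acetate
sources = rng.lognormal(0, 1, (hours, 3))           # three hidden sources
mixing = rng.uniform(0, 1, (3, n_species))          # source signatures
X = sources @ mixing + 0.05 * rng.standard_normal((hours, n_species))

pca = PCA().fit(X)
print("explained variance ratios:", pca.explained_variance_ratio_.round(3))
# a sharp drop after the third component suggests ~3 emission sources
```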
Tenenbaum, Joel
This thesis applies statistical physics concepts and methods to quantitatively analyze complex systems. It is separated into four parts: (i) characteristics of earthquake systems; (ii) memory and volatility in data time series; (iii) the application of part (ii) to world financial markets; and (iv) statistical observations on the evolution of word usage. In Part I, we observe statistical patterns in the occurrence of earthquakes. We select a 14-year earthquake catalog covering the archipelago of Japan. We find that regions traditionally thought of as being too distant from one another for causal contact display remarkably high correlations, and the networks that result have a tendency to link highly connected areas with other highly connected areas. In Part II, we introduce and apply the concept of "volatility asymmetry", whose primary use is in financial data. We explain the relation between memory and volatility asymmetry in terms of an asymmetry parameter lambda, define a litmus test for determining whether lambda is statistically significant, and propose a stochastic model based on this parameter, using the model to further explain empirical data. In Part III, we expand on volatility asymmetry. Importing the concepts of time dependence and universality from physics, we explore the aspects of emerging (or "transition") economies in Eastern Europe as they relate to asymmetry. We find that these emerging markets in some instances behave like developed markets and in other instances do not, and that the distinction is a matter both of country and of time period, with crisis periods showing different asymmetry characteristics than "healthy" periods. In Part IV, we take note of a series of findings in econophysics showing statistical growth similarities between a variety of different areas that all have in common the fact of taking place in settings that are both (i) competing and (ii) dynamic. We show that this same growth distribution can be...
Industrial statistics with Minitab
Cintas, Pere Grima; Llabres, Xavier Tort-Martorell
2012-01-01
Industrial Statistics with MINITAB demonstrates the use of MINITAB as a tool for performing statistical analysis in an industrial context. This book covers introductory industrial statistics, exploring the most commonly used techniques alongside those that serve to give an overview of more complex issues. A plethora of examples in MINITAB are featured, along with case studies for each of the statistical techniques presented. Industrial Statistics with MINITAB provides comprehensive coverage of user-friendly practical guidance on the essential statistical methods applied in industry, and explores...
Complex Quantum Network Manifolds in Dimension d > 2 are Scale-Free
Bianconi, Ginestra; Rahmede, Christoph
2015-09-01
In quantum gravity, several approaches have been proposed until now for the quantum description of discrete geometries. These theoretical frameworks include loop quantum gravity, causal dynamical triangulations, causal sets, quantum graphity, and energetic spin networks. Most of these approaches describe discrete spaces as homogeneous network manifolds. Here we define Complex Quantum Network Manifolds (CQNM) describing the evolution of quantum network states, constructed from growing simplicial complexes of dimension d. We show that in d = 2 CQNM are homogeneous networks, while for d > 2 they are scale-free, i.e., they are characterized by large inhomogeneities of degrees, like most complex networks. From the self-organized evolution of CQNM, quantum statistics emerge spontaneously. Here we define the generalized degrees associated with the δ-faces of the d-dimensional CQNMs, and we show that the statistics of these generalized degrees can follow either Fermi-Dirac, Boltzmann, or Bose-Einstein distributions, depending on the dimension δ of the faces.
Directory of Open Access Journals (Sweden)
A. V. Skrypnikov
2015-01-01
Full Text Available Summary. This work develops the general ideas of V. I. Skurikhin's method, taking the specified features into account, and considers in more detail the analysis and synthesis of a complex of technical means, bringing them to a level suitable for use in the engineering practice of designing information management systems. A general system approach to selecting the technical means of an information management system is formulated, and a general technique is developed for the system analysis and synthesis of the complex of technical means and its subsystems, aimed at achieving an extreme value of the criterion for the efficiency of operation of the technical complex of the information management system. The main attention is paid to the applied side of systems research on complex technical support: defining quality criteria for the operation of the technical complex, developing methods for analyzing the information base of the information management system and deriving requirements for the technical means, and developing methods for the structural synthesis of the main subsystems of the complex technical support. The purpose is thus to study, on the basis of a system approach, the complex technical support of the information management system and to develop a number of analysis and synthesis methods suitable for use in the engineering practice of systems design. A well-known paradox in the development of management information systems is that the parameters of the system, and consequently the requirements on the hardware complex, cannot be rigorously justified before the algorithms and programs are developed, and vice versa. A possible way to overcome this difficulty is to forecast the structure and parameters of the hardware complex for a given management information system at the early stages of development, with subsequent clarification and refinement.
Statistical learning and the challenge of syntax: Beyond finite state automata
Elman, Jeff
2003-10-01
Over the past decade, it has become clear that even very young infants are sensitive to the statistical structure of the language input presented to them, and use the distributional regularities to induce simple grammars. But can such statistically driven learning also explain the acquisition of more complex grammar, particularly when the grammar includes recursion? Recent claims (e.g., Hauser, Chomsky, and Fitch, 2002) have suggested that the answer is no, and that recursion at least must be an innate capacity of the human language acquisition device. In this talk, evidence will be presented that statistically driven learning (embodied in recurrent neural networks) can indeed enable the learning of complex grammatical patterns, including those that involve recursion. When the results are generalized to idealized machines, it is found that the networks are at least equivalent to Push-Down Automata. Perhaps more interestingly, with limited and finite resources (such as are presumed to exist in the human brain), these systems demonstrate patterns of performance that resemble those in humans.
Szulc, Stefan
1965-01-01
Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field. Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then...
PAFit: A Statistical Method for Measuring Preferential Attachment in Temporal Complex Networks.
Directory of Open Access Journals (Sweden)
Thong Pham
Full Text Available Preferential attachment is a stochastic process that has been proposed to explain certain topological features characteristic of complex networks from diverse domains. The systematic investigation of preferential attachment is an important area of research in network science, not only for the theoretical matter of verifying whether this hypothesized process is operative in real-world networks, but also for the practical insights that follow from knowledge of its functional form. Here we describe a maximum likelihood based estimation method for the measurement of preferential attachment in temporal complex networks. We call the method PAFit, and implement it in an R package of the same name. PAFit constitutes an advance over previous methods primarily because we base it on a nonparametric statistical framework that enables attachment kernel estimation free of any assumptions about its functional form. We show that this results in PAFit outperforming the popular methods of Jeong and Newman in Monte Carlo simulations. What is more, we found that the application of PAFit to a publicly available Flickr social network dataset yielded clear evidence of a deviation of the attachment kernel from the popularly assumed log-linear form. Independent of our main work, we provide a correction to a consequential error in Newman's original method, which had evidently gone unnoticed since its publication over a decade ago.
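The nonparametric idea is easy to demonstrate outside of R. The Python sketch below is the simple histogram estimator in the spirit of Jeong et al., not PAFit's maximum likelihood treatment (PAFit itself is an R package): it grows a network with a known kernel A_k = k^alpha and recovers the kernel by normalizing attachment events by degree exposure. All sizes and parameters are illustrative.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(1)

alpha = 1.0                    # true kernel A_k = k^alpha
degree = {0: 1, 1: 1}          # start from a single edge 0-1
events = Counter()             # attachments that landed on degree k
exposure = Counter()           # node-steps during which degree k was on offer

for new in range(2, 5000):
    nodes = list(degree)
    w = np.array([degree[v] for v in nodes], dtype=float) ** alpha
    target = nodes[rng.choice(len(nodes), p=w / w.sum())]
    events[degree[target]] += 1
    for k, cnt in Counter(degree.values()).items():
        exposure[k] += cnt
    degree[target] += 1
    degree[new] = 1            # newcomer arrives with one edge

ks = sorted(k for k in events if exposure[k] > 100)
A = np.array([events[k] / exposure[k] for k in ks])
print({k: round(a, 2) for k, a in zip(ks, A / A[0])})  # grows roughly like k
```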
Generalization of the Activated Complex Theory of Reaction Rates. II. Classical Mechanical Treatment
Marcus, R. A.
1964-01-01
In its usual classical form activated complex theory assumes a particular expression for the kinetic energy of the reacting system -- one associated with a rectilinear motion along the reaction coordinate. The derivation of the rate expression given in the present paper is based on the general kinetic energy expression.
Reichert, B.K.; Bengtsson, L.; Oerlemans, J.
2001-01-01
A process-oriented modeling approach is applied in order to simulate glacier mass balance for individual glaciers using statistically downscaled general circulation models (GCMs). Glacier-specific seasonal sensitivity characteristics based on a mass balance model of intermediate complexity are used
Scalable Algorithms for Adaptive Statistical Designs
Directory of Open Access Journals (Sweden)
Robert Oehmke
2000-01-01
Full Text Available We present a scalable, high-performance solution to multidimensional recurrences that arise in adaptive statistical designs. Adaptive designs are an important class of learning algorithms for a stochastic environment, and we focus on the problem of optimally assigning patients to treatments in clinical trials. While adaptive designs have significant ethical and cost advantages, they are rarely utilized because of the complexity of optimizing and analyzing them. Computational challenges include massive memory requirements, few calculations per memory access, and multiply-nested loops with dynamic indices. We analyze the effects of various parallelization options, and while standard approaches do not work well, with effort an efficient, highly scalable program can be developed. This allows us to solve problems thousands of times more complex than those solved previously, which helps make adaptive designs practical. Further, our work applies to many other problems involving neighbor recurrences, such as generalized string matching.
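As a toy version of the recurrences involved, the hedged sketch below computes the optimal adaptive design for a two-armed Bernoulli bandit with uniform priors by backward induction. The horizon N and the priors are illustrative, and the paper's parallel algorithms tackle state spaces far beyond what this memoized recursion can reach.

```python
from functools import lru_cache

# Optimal adaptive allocation for a two-armed Bernoulli bandit with
# Beta(1,1) priors: backward induction over the sufficient statistics
# (successes, failures) of each arm, a nested 4-dimensional recurrence.
N = 20  # number of patients (kept tiny on purpose)

@lru_cache(maxsize=None)
def value(s1, f1, s2, f2):
    if s1 + f1 + s2 + f2 == N:
        return 0.0
    best = 0.0
    for s, f, arm in ((s1, f1, 1), (s2, f2, 2)):
        p = (s + 1) / (s + f + 2)        # posterior mean success rate
        if arm == 1:
            v = p * (1 + value(s1 + 1, f1, s2, f2)) \
                + (1 - p) * value(s1, f1 + 1, s2, f2)
        else:
            v = p * (1 + value(s1, f1, s2 + 1, f2)) \
                + (1 - p) * value(s1, f1, s2, f2 + 1)
        best = max(best, v)
    return best

print(f"expected successes under the optimal design: {value(0, 0, 0, 0):.3f}")
print(f"(non-adaptive allocation gives N/2 = {N / 2:.1f} in expectation)")
```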
Energy Technology Data Exchange (ETDEWEB)
Nieto H, B; Azorin N, J; Vazquez C, G A [UAM-I, 09340 Mexico D.F. (Mexico)
2004-07-01
As part of the theoretical groundwork for the thermoluminescence (TL) phenomenon, we study the behavior of systems formed by fermions, which are related to this phenomenon, establishing a generalization of the Randall-Wilkins model both for first-order kinetics and for general-order kinetics (the equation of May and Partridge), in which we assume Fermi-Dirac statistics. As a consequence of this study a new variable emerges, the chemical potential; we also establish its relationship with some of the other quantities already known in TL. (Author)
General-Purpose Computation with Neural Networks: A Survey of Complexity Theoretic Results
Czech Academy of Sciences Publication Activity Database
Šíma, Jiří; Orponen, P.
2003-01-01
Roč. 15, č. 12 (2003), s. 2727-2778 ISSN 0899-7667 R&D Projects: GA AV ČR IAB2030007; GA ČR GA201/02/1456 Institutional research plan: AV0Z1030915 Keywords: computational power * computational complexity * perceptrons * radial basis functions * spiking neurons * feedforward networks * recurrent networks * probabilistic computation * analog computation Subject RIV: BA - General Mathematics Impact factor: 2.747, year: 2003
Boslaugh, Sarah
2013-01-01
Need to learn statistics for your job? Want help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference for anyone new to the subject. Thoroughly revised and expanded, this edition helps you gain a solid understanding of statistics without the numbing complexity of many college texts. Each chapter presents easy-to-follow descriptions, along with graphics, formulas, solved examples, and hands-on exercises. If you want to perform common statistical analyses and learn a wide range of techniques without getting in over your head, this is your book.
Characterizing time series via complexity-entropy curves
Ribeiro, Haroldo V.; Jauregui, Max; Zunino, Luciano; Lenzi, Ervin K.
2017-06-01
The search for patterns in time series is a very common task when dealing with complex systems. This is usually accomplished by employing a complexity measure such as entropies and fractal dimensions. However, such measures usually only capture a single aspect of the system dynamics. Here, we propose a family of complexity measures for time series based on a generalization of the complexity-entropy causality plane. By replacing the Shannon entropy by a monoparametric entropy (Tsallis q entropy) and after considering the proper generalization of the statistical complexity (q complexity), we build up a parametric curve (the q -complexity-entropy curve) that is used for characterizing and classifying time series. Based on simple exact results and numerical simulations of stochastic processes, we show that these curves can distinguish among different long-range, short-range, and oscillating correlated behaviors. Also, we verify that simulated chaotic and stochastic time series can be distinguished based on whether these curves are open or closed. We further test this technique in experimental scenarios related to chaotic laser intensity, stock price, sunspot, and geomagnetic dynamics, confirming its usefulness. Finally, we prove that these curves enhance the automatic classification of time series with long-range correlations and interbeat intervals of healthy subjects and patients with heart disease.
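To make the construction concrete, here is a minimal Python sketch of the ordinary (Shannon) complexity-entropy point built from Bandt-Pompe ordinal patterns; the paper's q-complexity-entropy curve is obtained by replacing the Shannon entropy with the Tsallis q-entropy and sweeping q, which this sketch deliberately omits. Embedding dimension and series lengths are illustrative.

```python
import numpy as np
from itertools import permutations
from math import factorial, log

def ordinal_distribution(x, d=4):
    # Bandt-Pompe: count the rank pattern of each length-d window
    patterns = {p: 0 for p in permutations(range(d))}
    for i in range(len(x) - d + 1):
        patterns[tuple(np.argsort(x[i:i + d]))] += 1
    p = np.array(list(patterns.values()), dtype=float)
    return p / p.sum()

def shannon(p):
    p = p[p > 0]
    return -(p * np.log(p)).sum()

def complexity_entropy(x, d=4):
    P = ordinal_distribution(x, d)
    n = factorial(d)
    U = np.full(n, 1.0 / n)                 # uniform reference
    H = shannon(P) / log(n)                 # normalized permutation entropy
    js = shannon((P + U) / 2) - shannon(P) / 2 - shannon(U) / 2
    delta = np.zeros(n); delta[0] = 1.0     # most ordered distribution
    js_max = shannon((delta + U) / 2) - shannon(U) / 2
    return H, H * js / js_max               # (entropy, statistical complexity)

rng = np.random.default_rng(0)
print("white noise  (H, C):",
      np.round(complexity_entropy(rng.normal(size=20000)), 3))
x = np.empty(20000); x[0] = 0.4
for i in range(1, len(x)):                  # chaotic logistic map
    x[i] = 4 * x[i - 1] * (1 - x[i - 1])
print("logistic map (H, C):", np.round(complexity_entropy(x), 3))
```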
New Hybrid Monte Carlo methods for efficient sampling. From physics to biology and statistics
International Nuclear Information System (INIS)
Akhmatskaya, Elena; Reich, Sebastian
2011-01-01
We introduce a class of novel hybrid methods for detailed simulations of large complex systems in physics, biology, materials science and statistics. These generalized shadow Hybrid Monte Carlo (GSHMC) methods combine the advantages of stochastic and deterministic simulation techniques. They utilize a partial momentum update to retain some of the dynamical information, employ modified Hamiltonians to overcome exponential performance degradation with the system's size, and make use of the multi-scale nature of complex systems. Variants of GSHMC were developed for atomistic simulation, particle simulation and statistics: GSHMC (a thermodynamically consistent implementation of constant-temperature molecular dynamics), MTS-GSHMC (multiple-time-stepping GSHMC), meso-GSHMC (a Metropolis-corrected dissipative particle dynamics (DPD) method), and a generalized shadow Hamiltonian Monte Carlo, GSHmMC (a GSHMC for statistical simulations). All of these are compatible with other enhanced sampling techniques and suitable for massively parallel computing, allowing for a range of multi-level parallel strategies. A brief description of the GSHMC approach, examples of its application on high-performance computers, and a comparison with other existing techniques are given. Our approach is shown to resolve such problems as resonance instabilities of the MTS methods and non-preservation of thermodynamic equilibrium properties in DPD, and to outperform known methods in sampling efficiency by an order of magnitude. (author)
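The partial momentum update is the easiest ingredient to isolate. The sketch below is a bare generalized HMC (in the Horowitz sense) on a one-dimensional Gaussian: momenta are partially refreshed through an angle phi rather than redrawn, with a momentum flip on rejection. Shadow Hamiltonians, reweighting, and the MTS/meso machinery of GSHMC are all omitted, and phi, the step size, and the trajectory length are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def grad_U(q):                      # target exp(-U) with U(q) = q^2 / 2
    return q

def leapfrog(q, u, eps, steps):     # standard velocity-Verlet integrator
    u = u - 0.5 * eps * grad_U(q)
    for i in range(steps):
        q = q + eps * u
        if i < steps - 1:
            u = u - eps * grad_U(q)
    u = u - 0.5 * eps * grad_U(q)
    return q, u

phi, eps, steps = 0.3, 0.2, 10      # small phi: momentum mostly retained
q, u, samples = 0.0, rng.normal(), []
for _ in range(20000):
    # partial momentum refreshment, the hallmark of generalized HMC
    u = np.cos(phi) * u + np.sin(phi) * rng.normal()
    H0 = 0.5 * u * u + 0.5 * q * q
    q_new, u_new = leapfrog(q, u, eps, steps)
    H1 = 0.5 * u_new * u_new + 0.5 * q_new * q_new
    if rng.random() < np.exp(H0 - H1):   # Metropolis accept/reject
        q, u = q_new, u_new
    else:
        u = -u                           # momentum flip keeps detailed balance
    samples.append(q)

print("mean ~ 0, var ~ 1:", np.mean(samples), np.var(samples))
```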
Thermodynamic Bethe ansatz with Haldane statistics
International Nuclear Information System (INIS)
Bytsko, A.G.; Fring, A.
1998-01-01
We derive the thermodynamic Bethe ansatz equation for the situation in which the statistical interaction of a multi-particle system is governed by Haldane statistics. We formulate a macroscopical equivalence principle for such systems. Particular CDD ambiguities play a distinguished role in compensating the ambiguity in the exclusion statistics. We derive Y-systems related to generalized statistics. We discuss several fermionic, bosonic and anyonic versions of affine Toda field theories and Calogero-Sutherland type models in the context of generalized statistics. (orig.)
2010-12-02
COMPARING THEORY AND PRACTICE: AN APPLICATION OF COMPLEXITY THEORY TO GENERAL RIDGWAY'S [...] (Approved for Public Release; Distribution is Unlimited). Subject terms: complexity theory, history, practice, military theory, leadership. The report examines key subcomponents of complexity theory, namely scale, adaptive leadership, and bottom-up feedback from the agents (the soldiers in the field), in preparing for the challenges that will be faced in an uncertain future.
Quantum communication complexity advantage implies violation of a Bell inequality
Buhrman, Harry; Czekaj, Łukasz; Grudka, Andrzej; Horodecki, Michał; Horodecki, Paweł; Markiewicz, Marcin; Speelman, Florian; Strelchuk, Sergii
2016-01-01
We obtain a general connection between a large quantum advantage in communication complexity and Bell nonlocality. We show that given any protocol offering a sufficiently large quantum advantage in communication complexity, there exists a way of obtaining measurement statistics that violate some Bell inequality. Our main tool is port-based teleportation. If the gap between quantum and classical communication complexity can grow arbitrarily large, the ratio of the quantum value to the classical value of the Bell quantity becomes unbounded with the increase in the number of inputs and outputs. PMID:26957600
Kassem, M.; Soize, C.; Gagliardini, L.
2009-06-01
In this paper, an energy-density field approach applied to the vibroacoustic analysis of complex industrial structures in the low- and medium-frequency ranges is presented. This approach uses a statistical computational model. The analyzed system consists of an automotive vehicle structure coupled with its internal acoustic cavity. The objective of this paper is to make use of the statistical properties of the frequency response functions of the vibroacoustic system observed from previous experimental and numerical work. The frequency response functions are expressed in terms of a dimensionless matrix which is estimated using the proposed energy approach. Using this dimensionless matrix, a simplified vibroacoustic model is proposed.
Polarimetric Segmentation Using Wishart Test Statistic
DEFF Research Database (Denmark)
Skriver, Henning; Schou, Jesper; Nielsen, Allan Aasbjerg
2002-01-01
A newly developed test statistic for the equality of two complex covariance matrices following the complex Wishart distribution, together with an associated asymptotic probability for the test statistic, has been used in a segmentation algorithm. The segmentation algorithm is based on the MUM (merge using moments) approach, which is a merging algorithm for single-channel SAR images. The polarimetric version described in this paper uses the above-mentioned test statistic for merging. The segmentation algorithm has been applied to polarimetric SAR data from the Danish dual-frequency, airborne polarimetric SAR, EMISAR...
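For orientation, the sketch below computes the log likelihood-ratio statistic for equality of two complex Wishart matrices in the equal-looks case, following the form derived by Conradsen et al. (2003) for polarimetric SAR; the chi-square approximation constants and the full merging logic of the MUM algorithm are not reproduced, and the dimensions and look counts are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_wishart(p, n, rng, scale=1.0):
    # sum of n outer products of circular complex Gaussian vectors
    Z = (rng.normal(size=(p, n)) + 1j * rng.normal(size=(p, n))) / np.sqrt(2)
    return scale * (Z @ Z.conj().T)

def lnQ(X, Y, n):
    # equal-looks form: lnQ = n(2p ln2 + ln|X| + ln|Y| - 2 ln|X+Y|)
    p = X.shape[0]
    ld = lambda M: np.log(np.linalg.det(M).real)
    return n * (2 * p * np.log(2) + ld(X) + ld(Y) - 2 * ld(X + Y))

p, n = 3, 64
X = sample_wishart(p, n, rng)
Y = sample_wishart(p, n, rng)
print("same covariance,    -2lnQ:", round(-2 * lnQ(X, Y, n), 2))   # small
Y2 = sample_wishart(p, n, rng, scale=3.0)
print("changed covariance, -2lnQ:", round(-2 * lnQ(X, Y2, n), 2))  # large
```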
Statistical physics of an anyon gas
International Nuclear Information System (INIS)
Dasnieres de Veigy, A.
1994-01-01
In quantum two-dimensional physics, anyons are particles which have an intermediate statistics between Bose-Einstein and Fermi-Dirac statistics. The wave amplitude can change by an arbitrary phase under particle exchanges. Contrary to bosons or fermions, the permutation group cannot uniquely characterize this phase and one must introduce the braid group. One shows that the statistical "interaction" is equivalent to an Aharonov-Bohm interaction which derives from a Chern-Simons Lagrangian. The main subject of this thesis is the thermodynamics of an anyon gas. Since the complete spectrum of N anyons seems out of reach, we have done a perturbative computation of the equation of state at second order near Bose or Fermi statistics. One avoids ultraviolet divergences by noticing that the short-range singularities of the statistical interaction force the wave functions to vanish when two particles approach each other (statistical exclusion). The gas is confined in a harmonic well in order to obtain the thermodynamic limit when the harmonic attraction goes to zero. Infrared divergences thus cancel in this limit and a finite virial expansion is obtained. The complexity of the anyon model appears in this result. We have also computed the equation of state of an anyon gas in a magnetic field strong enough to project the system onto its degenerate ground state. This result concerns anyons with any statistics. One then finds an exclusion principle generalizing the Pauli principle to anyons. On the other hand, we have defined a model of two-dimensional particles topologically interacting at a distance. The anyon model is recovered as a particular case where all particles are identical. (orig.)
Complex Environmental Data Modelling Using Adaptive General Regression Neural Networks
Kanevski, Mikhail
2015-04-01
The research deals with an adaptation and application of Adaptive General Regression Neural Networks (GRNN) to high-dimensional environmental data. GRNN [1,2,3] are efficient modelling tools both for spatial and temporal data and are based on nonparametric kernel methods closely related to the classical Nadaraya-Watson estimator. Adaptive GRNN, using anisotropic kernels, can also be applied to feature selection tasks when working with high-dimensional data [1,3]. In the present research, Adaptive GRNN are used to study geospatial data predictability and relevant feature selection using both simulated and real data case studies. The original raw data were either three-dimensional monthly precipitation data or monthly wind speeds embedded into a 13-dimensional space constructed from geographical coordinates and geo-features calculated from a digital elevation model. GRNN were applied in two different ways: 1) adaptive GRNN with the resulting list of features ordered according to their relevance; and 2) adaptive GRNN applied to evaluate all possible models N [in the case of wind fields N = (2^13 - 1) = 8191] and rank them according to the cross-validation error. In both cases, training was carried out using a leave-one-out procedure. An important result of the study is that the set of the most relevant features depends on the month (strong seasonal effect) and year. The predictabilities of precipitation and wind field patterns, estimated using the cross-validation and testing errors of raw and shuffled data, were studied in detail. The results of both approaches were qualitatively and quantitatively compared. In conclusion, Adaptive GRNN, with their ability to select features and efficiently model complex high-dimensional data, can be widely used in automatic/on-line mapping and as an integrated part of environmental decision support systems. 1. Kanevski M., Pozdnoukhov A., Timonin V. Machine Learning for Spatial Environmental Data. Theory, applications and software. EPFL Press
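At its core a GRNN is Nadaraya-Watson kernel regression, and the anisotropic-kernel feature selection trick is visible in a few lines. In the hedged Python sketch below (toy data with two features, one of which is pure noise; bandwidths set by hand rather than by the adaptive optimization the paper uses), inflating the bandwidth of the irrelevant feature effectively switches it off and improves the test error.

```python
import numpy as np

rng = np.random.default_rng(0)

def grnn_predict(Xtr, ytr, Xte, sigmas):
    # per-feature bandwidths: a very large sigma effectively removes a
    # feature from the distance, which is how adaptive GRNN ranks relevance
    d2 = (((Xte[:, None, :] - Xtr[None, :, :]) / sigmas) ** 2).sum(-1)
    W = np.exp(-0.5 * d2)
    return (W @ ytr) / W.sum(axis=1)

# toy data: y depends on feature 0 only; feature 1 is noise
Xtr = rng.uniform(-1, 1, size=(300, 2))
ytr = np.sin(3 * Xtr[:, 0]) + 0.1 * rng.normal(size=300)
Xte = rng.uniform(-1, 1, size=(100, 2))
yte = np.sin(3 * Xte[:, 0])

for sig in ([0.1, 0.1], [0.1, 10.0]):     # second choice switches feature 1 off
    mse = np.mean((grnn_predict(Xtr, ytr, Xte, np.array(sig)) - yte) ** 2)
    print("sigmas =", sig, " test MSE:", round(mse, 4))
```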
Software Alchemy: Turning Complex Statistical Computations into Embarrassingly-Parallel Ones
Directory of Open Access Journals (Sweden)
Norman Matloff
2016-07-01
Full Text Available The growth in the use of computationally intensive statistical procedures, especially with big data, has necessitated the use of parallel computation on diverse platforms such as multicore, GPUs, clusters and clouds. However, slowdown due to interprocess communication costs typically limits such methods to "embarrassingly parallel" (EP) algorithms, especially on non-shared-memory platforms. This paper develops a broadly applicable method for converting many non-EP algorithms into statistically equivalent EP ones. The method is shown to yield excellent levels of speedup for a variety of statistical computations. It also overcomes certain problems of memory limitations.
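The core trick, as described in the paper, is chunking plus averaging: compute the estimator independently on data chunks and average the results. A minimal Python sketch follows, with ordinary least squares as the example estimator; the chunk count, data sizes, and the use of multiprocessing.Pool are illustrative choices, not the paper's software.

```python
import numpy as np
from multiprocessing import Pool

rng = np.random.default_rng(0)

n, p, r = 100_000, 3, 8
X = rng.normal(size=(n, p))
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + rng.normal(size=n)

def ols(chunk):
    # the non-EP estimator, applied independently to one chunk
    Xc, yc = chunk
    return np.linalg.lstsq(Xc, yc, rcond=None)[0]

if __name__ == "__main__":
    chunks = list(zip(np.array_split(X, r), np.array_split(y, r)))
    with Pool(4) as pool:                     # embarrassingly parallel map
        estimates = pool.map(ols, chunks)
    print("chunk-averaged:", np.round(np.mean(estimates, axis=0), 4))
    print("full-data OLS :", np.round(ols((X, y)), 4))
```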
International Nuclear Information System (INIS)
Dai Hao; Jia Li-Xin; Zhang Yan-Bin
2012-01-01
The adaptive generalized matrix projective lag synchronization between two different complex networks with non-identical nodes and different dimensions is investigated in this paper. Based on Lyapunov stability theory and Barbalat's lemma, generalized matrix projective lag synchronization criteria are derived by using the adaptive control method. Furthermore, each network can be undirected or directed, connected or disconnected, and nodes in either network may have identical or different dynamics. The proposed strategy is applicable to almost all kinds of complex networks. In addition, numerical simulation results are presented to illustrate the effectiveness of this method, showing that the synchronization speed is sensitively influenced by the adaptive law strength, the network size, and the network topological structure. (general)
A combined statistical model for multiple motifs search
International Nuclear Information System (INIS)
Gao Lifeng; Liu Xin; Guan Shan
2008-01-01
Transcription factor binding sites (TFBS) play key roles in gene expression and regulation. They are short sequence segments with definite structure and can be recognized correctly by the corresponding transcription factors. From the viewpoint of statistics, candidate TFBSs should be quite different from segments in which nucleotides are randomly combined. This paper proposes a combined statistical model for finding over-represented short sequence segments in different kinds of data sets. When the over-represented short sequence segment is described by a position weight matrix, the nucleotide distribution at most sites of the segment should be far from the background nucleotide distribution. The central idea of this approach is to search for such signals. The algorithm is tested on three data sets: the binding sites of the cyclic AMP receptor protein in E. coli, PlantProm DB, which is a non-redundant collection of proximal promoter sequences from different species, and the collection of intergenic sequences of the whole E. coli genome. Even though the complexity of these three data sets is quite different, the results show that this model is rather general and sensible. (general)
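For readers unfamiliar with position weight matrices, the hedged sketch below shows the basic PWM machinery the model builds on; the toy aligned sites and the uniform background are invented for illustration, and the paper's combined statistical model layers more on top.

```python
import numpy as np

# Build a PWM from aligned sites (add-one pseudocounts against a uniform
# background), then scan a sequence with the log-odds score.
sites = ["TTGACA", "TTGACT", "TTTACA", "CTGACA"]   # toy aligned sites
alphabet = "ACGT"
idx = {b: i for i, b in enumerate(alphabet)}

counts = np.ones((4, len(sites[0])))                # pseudocounts
for s in sites:
    for j, b in enumerate(s):
        counts[idx[b], j] += 1
pwm = np.log2((counts / counts.sum(axis=0)) / 0.25) # uniform background

def scan(seq, pwm):
    L = pwm.shape[1]
    return [(i, sum(pwm[idx[b], j] for j, b in enumerate(seq[i:i + L])))
            for i in range(len(seq) - L + 1)]

seq = "ACGTTTGACAGGCT"
pos, score = max(scan(seq, pwm), key=lambda t: t[1])
print(f"best hit at position {pos} with log-odds {score:.2f}")
```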
Naghshpour, Shahdad
2012-01-01
Statistics is the branch of mathematics that deals with real-life problems. As such, it is an essential tool for economists. Unfortunately, the way you and many other economists learn the concept of statistics is not compatible with the way economists think and learn. The problem is worsened by the use of mathematical jargon and complex derivations. Here's a book that proves none of this is necessary. All the examples and exercises in this book are constructed within the field of economics, thus eliminating the difficulty of learning statistics with examples from fields that have no relation to business, politics, or policy. Statistics is, in fact, not more difficult than economics. Anyone who can comprehend economics can understand and use statistics successfully within this field, including you! This book utilizes Microsoft Excel to obtain statistical results, as well as to perform additional necessary computations. Microsoft Excel is not the software of choice for performing sophisticated statistical analy...
Quantify the complexity of turbulence
Tao, Xingtian; Wu, Huixuan
2017-11-01
Many researchers have used Reynolds stresses, power spectra and Shannon entropy to characterize a turbulent flow, but few have measured the complexity of turbulence. Yet, as this study shows, conventional turbulence statistics and Shannon entropy have limits when quantifying flow complexity. It is therefore necessary to introduce new complexity measures, such as topological complexity and excess information, to describe turbulence. Our test flow is a classic turbulent cylinder wake at Reynolds number 8100. Along the stream-wise direction, the flow becomes more isotropic and the magnitudes of the normal Reynolds stresses decrease monotonically. These observations seem to indicate that the flow dynamics becomes simpler downstream. However, the Shannon entropy keeps increasing along the flow direction, and the dynamics seems to become more complex, because the large-scale vortices cascade to small eddies and the flow becomes less correlated and more unpredictable. In fact, these two contradictory observations each describe only part of the complexity of a turbulent wake. Our measurements (up to 40 diameters downstream of the cylinder) show that the flow's degree of complexity actually increases at first and then becomes constant (or drops slightly) along the stream-wise direction. University of Kansas General Research Fund.
Complexity of Economical Systems
Directory of Open Access Journals (Sweden)
G. P. Pavlos
2015-01-01
Full Text Available In this study new theoretical concepts are described concerning the interpretation of economical complex dynamics. In addition, a summary of an extended algorithm of nonlinear time series analysis is provided, which is applied not only to economical time series but also to other physical complex systems (e.g., [22, 24]). In general, an economy is a vast and complicated set of arrangements and actions wherein agents—consumers, firms, banks, investors, government agencies—buy and sell, speculate, trade, oversee, bring products into being, offer services, invest in companies, strategize, explore, forecast, compete, learn, innovate, and adapt. As a result, economic and financial variables such as foreign exchange rates, gross domestic product, interest rates, production, stock market prices and unemployment exhibit the large-amplitude, aperiodic fluctuations evident in complex systems. Thus, the economy can be considered a spatially distributed non-equilibrium complex system, for which new theoretical concepts, such as Tsallis nonextensive statistical mechanics and strange dynamics, percolation, and non-Gaussian, multifractal and multiscale dynamics related to fractional Langevin equations, can be used for modeling and understanding economic complexity locally or globally.
Statistical Analysis of Big Data on Pharmacogenomics
Fan, Jianqing; Liu, Han
2013-01-01
This paper discusses statistical methods for estimating complex correlation structure from large pharmacogenomic datasets. We selectively review several prominent statistical methods for estimating large covariance matrix for understanding correlation structure, inverse covariance matrix for network modeling, large-scale simultaneous tests for selecting significantly differently expressed genes and proteins and genetic markers for complex diseases, and high dimensional variable selection for identifying important molecules for understanding molecule mechanisms in pharmacogenomics. Their applications to gene network estimation and biomarker selection are used to illustrate the methodological power. Several new challenges of Big data analysis, including complex data distribution, missing data, measurement error, spurious correlation, endogeneity, and the need for robust statistical methods, are also discussed. PMID:23602905
Learning Predictive Statistics: Strategies and Brain Mechanisms.
Wang, Rui; Shen, Yuan; Tino, Peter; Welchman, Andrew E; Kourtzi, Zoe
2017-08-30
When immersed in a new environment, we are challenged to decipher initially incomprehensible streams of sensory information. However, quite rapidly, the brain finds structure and meaning in these incoming signals, helping us to predict and prepare ourselves for future actions. This skill relies on extracting the statistics of event streams in the environment that contain regularities of variable complexity from simple repetitive patterns to complex probabilistic combinations. Here, we test the brain mechanisms that mediate our ability to adapt to the environment's statistics and predict upcoming events. By combining behavioral training and multisession fMRI in human participants (male and female), we track the corticostriatal mechanisms that mediate learning of temporal sequences as they change in structure complexity. We show that learning of predictive structures relates to individual decision strategy; that is, selecting the most probable outcome in a given context (maximizing) versus matching the exact sequence statistics. These strategies engage distinct human brain regions: maximizing engages dorsolateral prefrontal, cingulate, sensory-motor regions, and basal ganglia (dorsal caudate, putamen), whereas matching engages occipitotemporal regions (including the hippocampus) and basal ganglia (ventral caudate). Our findings provide evidence for distinct corticostriatal mechanisms that facilitate our ability to extract behaviorally relevant statistics to make predictions. SIGNIFICANCE STATEMENT Making predictions about future events relies on interpreting streams of information that may initially appear incomprehensible. Past work has studied how humans identify repetitive patterns and associative pairings. However, the natural environment contains regularities that vary in complexity from simple repetition to complex probabilistic combinations. Here, we combine behavior and multisession fMRI to track the brain mechanisms that mediate our ability to adapt to
Ghikas, Demetris P. K.; Oikonomou, Fotios D.
2018-04-01
Using the generalized entropies which depend on two parameters, we propose a set of quantitative characteristics derived from the Information Geometry based on these entropies. Our aim, at this stage, is to first construct some fundamental geometric objects which will be used in the development of our geometrical framework. We first establish the existence of a two-parameter family of probability distributions. Then, using this family, we derive the associated metric and we state a generalized Cramér-Rao inequality. This gives a first two-parameter classification of complex systems. Finally, computing the scalar curvature of the information manifold, we obtain a further discrimination of the corresponding classes. Our analysis is based on the two-parameter family of generalized entropies of Hanel and Thurner (2011).
Boslaugh, Sarah
2008-01-01
Need to learn statistics as part of your job, or want some help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference that's perfect for anyone with no previous background in the subject. This book gives you a solid understanding of statistics without being too simple, yet without the numbing complexity of most college texts. You get a firm grasp of the fundamentals and a hands-on understanding of how to apply them before moving on to the more advanced material that follows. Each chapter presents you with easy-to-follow descriptions illustrat
Entropy: A Unifying Path for Understanding Complexity in Natural, Artificial and Social Systems
2011-07-01
[Figure 24 caption: Rank-frequency functions of plays (Shakespeare) and books (Dickens) fitted by the generalizations of the...] [...0-387-75888-6]. [4] Tsallis, C. (2009b). Introduction to Nonextensive Statistical Mechanics: Approaching a Complex World. Springer, New York.
Directory of Open Access Journals (Sweden)
Young, Robin L; Weinberg, Janice; Vieira, Verónica; Ozonoff, Al; Webster, Thomas F
2010-07-01
Full Text Available Abstract. Background: A common, important problem in spatial epidemiology is measuring and identifying variation in disease risk across a study region. In application of statistical methods, the problem has two parts. First, spatial variation in risk must be detected across the study region and, second, areas of increased or decreased risk must be correctly identified. The location of such areas may give clues to environmental sources of exposure and disease etiology. One statistical method applicable in spatial epidemiologic settings is a generalized additive model (GAM), which can be applied with a bivariate LOESS smoother to account for geographic location as a possible predictor of disease status. A natural hypothesis when applying this method is whether the residential location of subjects is associated with the outcome, i.e., whether the smoothing term is necessary. Permutation tests are a reasonable hypothesis testing method and provide adequate power under a simple alternative hypothesis. These tests have yet to be compared to other spatial statistics. Results: This research uses simulated point data generated under three alternative hypotheses to evaluate the properties of the permutation methods and compare them to the popular spatial scan statistic in a case-control setting. Case 1 was a single circular cluster centered in a circular study region. The spatial scan statistic had the highest power, though the GAM method estimates did not fall far behind. Case 2 was a single point source located at the center of a circular cluster and Case 3 was a line source at the center of the horizontal axis of a square study region. Each had linearly decreasing log odds with distance from the point. The GAM methods outperformed the scan statistic in Cases 2 and 3. Comparing sensitivity, measured as the proportion of the exposure source correctly identified as high or low risk, the GAM methods outperformed the scan statistic in all three cases. Conclusions: The GAM permutation testing methods provide a regression...
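To make the hypothesis-testing logic concrete, here is a heavily simplified Python sketch of a spatial permutation test; Gaussian kernel smoothing stands in for the bivariate LOESS/GAM fit, and the cluster geometry, risks, and permutation count are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_risk(xy, case, grid, h=0.15):
    # kernel-smoothed local case proportion on a grid of locations
    d2 = ((grid[:, None, :] - xy[None, :, :]) ** 2).sum(-1)
    W = np.exp(-0.5 * d2 / h ** 2)
    return (W @ case) / W.sum(axis=1)

def statistic(xy, case, grid):
    # deviation of the smoothed surface from the constant global risk
    r = local_risk(xy, case, grid)
    return ((r - case.mean()) ** 2).sum()

n = 400
xy = rng.uniform(0, 1, size=(n, 2))
inside = ((xy - 0.5) ** 2).sum(axis=1) < 0.04      # circular high-risk zone
case = (rng.random(n) < np.where(inside, 0.7, 0.3)).astype(float)
grid = np.stack(np.meshgrid(np.linspace(0, 1, 15),
                            np.linspace(0, 1, 15)), -1).reshape(-1, 2)

obs = statistic(xy, case, grid)
null = [statistic(xy, rng.permutation(case), grid) for _ in range(199)]
pval = (1 + sum(s >= obs for s in null)) / 200
print(f"observed = {obs:.3f}, permutation p-value = {pval:.3f}")
```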
Latour-Delfgaauw, C.H.M.; van der Windt, D.A.W.M.; de Jonge, P.; Riphagen, II; Vos, R.; Huyse, F.J.; Stalman, W.A.B.
2007-01-01
Objective: The aim of this study was to summarize the available literature on the effectiveness of ambulatory nurse-led case management for complex patients in general health care. Method: We searched MEDLINE, EMBASE, the Cochrane Controlled Trials Register, and Cinahl. We included randomized
Energy Technology Data Exchange (ETDEWEB)
Kuhn, Tilmann E.; Herkel, Sebastian [Fraunhofer Institute for Solar Energy Systems ISE, Heidenhofstr. 2, 79110 Freiburg (Germany); Frontini, Francesco [Fraunhofer Institute for Solar Energy Systems ISE, Heidenhofstr. 2, 79110 Freiburg (Germany); Politecnico di Milano, Dipartimento BEST, Via Bonardi 9, 20133 Milano (Italy); Strachan, Paul; Kokogiannakis, Georgios [ESRU, Dept. of Mechanical Eng., University of Strathclyde, Glasgow G1 1XJ (United Kingdom)
2011-01-15
This paper describes a new general method for building simulation programs which is intended to be used for the modelling of complex facades. The term 'complex facades' designates facades with venetian blinds, prismatic layers, light-redirecting surfaces, etc. In all these cases, the facade properties have a complex angular dependence. In addition, such facades very often have non-airtight layers and/or imperfect components (e.g., non-ideal sharp edges, non-flat surfaces, etc.). Therefore, building planners often had to neglect some of the innovative features and to use 'work-arounds' in order to approximate the properties of complex facades in building simulation programs. A well-defined methodology for these cases was missing. This paper presents such a general methodology. The main advantage of the new method is that it only uses measurable quantities of the transparent or translucent part of the facade as a whole. This is the main difference in comparison with state-of-the-art modelling based on the characteristics of the individual subcomponents, which is often impossible due to non-existing heat- and/or light-transfer models within the complex facade. It is shown that the new method can significantly increase the accuracy of heating/cooling loads and room temperatures. (author)
Statistics and Data Interpretation for Social Work
Rosenthal, James
2011-01-01
"Without question, this text will be the most authoritative source of information on statistics in the human services. From my point of view, it is a definitive work that combines a rigorous pedagogy with a down to earth (commonsense) exploration of the complex and difficult issues in data analysis (statistics) and interpretation. I welcome its publication.". -Praise for the First Edition. Written by a social worker for social work students, this is a nuts and bolts guide to statistics that presents complex calculations and concepts in clear, easy-to-understand language. It includes
Editorial to: Six papers on Dynamic Statistical Models
DEFF Research Database (Denmark)
2014-01-01
The following six papers are based on invited lectures at the satellite meeting held at the University of Copenhagen before the 58th World Statistics Congress of the International Statistical Institute in Dublin in 2011. At the invitation of the Bernoulli Society, the satellite meeting was organized around the theme "Dynamic Statistical Models" as a part of the Program of Excellence at the University of Copenhagen on "Statistical methods for complex and high dimensional models" (http://statistics.ku.dk/). The Excellence Program in Statistics was a research project to develop and investigate statistical methodology and theory for large and complex data sets that included biostatisticians and mathematical statisticians from three faculties at the University of Copenhagen. The satellite meeting took place August 17–19, 2011. Its purpose was to bring together researchers in statistics and related...
Exclusion statistics and integrable models
International Nuclear Information System (INIS)
Mashkevich, S.
1998-01-01
The definition of exclusion statistics, as given by Haldane, allows for a statistical interaction between distinguishable particles (multi-species statistics). The thermodynamic quantities for such statistics can be evaluated exactly. The explicit expressions for the cluster coefficients are presented. Furthermore, single-species exclusion statistics is realized in one-dimensional integrable models. The interesting questions of generalizing this correspondence to the higher-dimensional and the multi-species cases remain essentially open.
Comparative Study of Complex Survey Estimation Software in ONS
Directory of Open Access Journals (Sweden)
Andy Fallows
2015-09-01
Full Text Available Many official statistics across the UK Government Statistical Service (GSS) are produced using data collected from sample surveys. These survey data are used to estimate population statistics through weighting and calibration techniques. For surveys with complex or unusual sample designs, the weighting can be fairly complicated. Even in more simple cases, appropriate software is required to implement survey weighting and estimation. As with other stages of the survey process, it is preferable to use a standard, generic calibration tool wherever possible. Standard tools allow for efficient use of resources and assist with the harmonisation of methods. In the case of calibration, the Office for National Statistics (ONS) has experience of using the Statistics Canada Generalized Estimation System (GES) across a range of business and social surveys. GES is a SAS-based system and so is only available in conjunction with an appropriate SAS licence. Given recent initiatives and encouragement to investigate open source solutions across government, it is appropriate to determine whether there are any open source calibration tools available that can provide the same service as GES. This study compares the use of GES with the calibration tool 'R Evolved Generalized software for sampling estimates and errors in surveys' (ReGenesees), available in R, an open source statistical programming language which is beginning to be used in many statistical offices. ReGenesees is a free R package which has been developed by the Italian statistics office (Istat) and includes functionality to calibrate survey estimates using similar techniques to GES. This report describes analysis of the performance of ReGenesees in comparison to GES in calibrating a representative selection of ONS surveys. Section 1.1 provides a brief introduction to the current use of SAS and R in ONS. Section 2 describes GES and ReGenesees in more detail. Sections 3.1 and 3.2 consider methods for...
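Both tools center on calibration weighting. As a minimal, hedged illustration of the underlying linear (chi-square distance) calibration step, the Python sketch below uses the textbook Deville-Sarndal closed form; GES and ReGenesees add raking, weight bounds, and variance estimation on top, and all numbers here are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

def calibrate(d, X, t):
    """Adjust design weights d so that X' w = t, minimizing the
    chi-square distance between w and d (linear calibration)."""
    T = X.T @ (d[:, None] * X)                  # X' diag(d) X
    lam = np.linalg.solve(T, t - X.T @ d)       # Lagrange multipliers
    return d * (1 + X @ lam)

n = 1000
X = np.column_stack([np.ones(n), rng.integers(0, 2, n)])  # intercept + sex
d = np.full(n, 50.0)                 # design weights (population of 50,000)
t = np.array([50_000.0, 26_000.0])   # known totals: persons, females

w = calibrate(d, X, t)
print("calibrated totals :", X.T @ w)            # matches t exactly
print("weight ratio range:", w.min() / d[0], w.max() / d[0])
```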
Rivas, Elena; Lang, Raymond; Eddy, Sean R
2012-02-01
The standard approach for single-sequence RNA secondary structure prediction uses a nearest-neighbor thermodynamic model with several thousand experimentally determined energy parameters. An attractive alternative is to use statistical approaches with parameters estimated from growing databases of structural RNAs. Good results have been reported for discriminative statistical methods using complex nearest-neighbor models, including CONTRAfold, Simfold, and ContextFold. Little work has been reported on generative probabilistic models (stochastic context-free grammars [SCFGs]) of comparable complexity, although probabilistic models are generally easier to train and to use. To explore a range of probabilistic models of increasing complexity, and to directly compare probabilistic, thermodynamic, and discriminative approaches, we created TORNADO, a computational tool that can parse a wide spectrum of RNA grammar architectures (including the standard nearest-neighbor model and more) using a generalized super-grammar that can be parameterized with probabilities, energies, or arbitrary scores. By using TORNADO, we find that probabilistic nearest-neighbor models perform comparably to (but not significantly better than) discriminative methods. We find that complex statistical models are prone to overfitting RNA structure and that evaluations should use structurally nonhomologous training and test data sets. Overfitting has affected at least one published method (ContextFold). The most important barrier to improving statistical approaches for RNA secondary structure prediction is the lack of diversity of well-curated single-sequence RNA secondary structures in current RNA databases.
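As a point of reference for grammar-based folding, here is the simplest single-sequence dynamic program, Nussinov-style base-pair maximization, in Python. It is a toy compared with the SCFG and nearest-neighbor models TORNADO parameterizes, and the sequence and minimum loop length are illustrative.

```python
# Nussinov recursion: dp[i][j] = max pairs in subsequence i..j, combining
# "i unpaired", "j unpaired", "i pairs j", and bifurcation at k.
PAIRS = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"), ("G", "U"), ("U", "G")}

def nussinov(seq, min_loop=3):
    n = len(seq)
    dp = [[0] * n for _ in range(n)]
    for span in range(min_loop + 1, n):
        for i in range(n - span):
            j = i + span
            cand = [dp[i + 1][j], dp[i][j - 1]]
            if (seq[i], seq[j]) in PAIRS:
                cand.append(dp[i + 1][j - 1] + 1)
            cand += [dp[i][k] + dp[k + 1][j] for k in range(i + 1, j)]
            dp[i][j] = max(cand)
    return dp[0][n - 1]

print("maximum base pairs:", nussinov("GGGAAAUCC"))   # expect 3
```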
Energy Technology Data Exchange (ETDEWEB)
Toh, K.C.; Trefethen, L.N. [Cornell Univ., Ithaca, NY (United States)
1994-12-31
What properties of a nonsymmetric matrix A determine the convergence rate of iterations such as GMRES, QMR, and Arnoldi? If A is far from normal, should one replace the usual Ritz values → eigenvalues notion of convergence of Arnoldi by alternative notions such as Arnoldi lemniscates → pseudospectra? Since Krylov subspace iterations can be interpreted as minimization processes involving polynomials of matrices, the answers to questions such as these depend upon mathematical problems of the following kind. Given a polynomial p(z), how can one bound the norm of p(A) in terms of (1) the size of p(z) on various sets in the complex plane, and (2) the locations of the spectrum and pseudospectra of A? This talk reports some progress towards solving these problems. In particular, the authors present theorems that generalize the Kreiss matrix theorem from the unit disk (for the monomial A^n) to a class of general complex domains (for polynomials p(A)).
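The phenomenon behind these questions is easy to reproduce numerically. The sketch below compares the spectral norm of p(A) with the maximum of |p(z)| on the spectrum for a normal and a nonnormal matrix with identical eigenvalues (matrix entries and the polynomial are invented for illustration); the large gap in the nonnormal case is what pseudospectral bounds aim to capture.

```python
import numpy as np

def p_of_A(A, coeffs):
    # Horner evaluation of the matrix polynomial p(A), highest degree first
    P = np.zeros_like(A)
    for c in coeffs:
        P = P @ A + c * np.eye(A.shape[0])
    return P

coeffs = [1.0, 0.0, 0.0]                  # p(z) = z^2
eigs = np.array([0.9, 0.5])
normal = np.diag(eigs)
nonnormal = np.array([[0.9, 100.0],       # same eigenvalues, far from normal
                      [0.0,   0.5]])

for name, A in [("normal", normal), ("nonnormal", nonnormal)]:
    bound = max(abs(z) ** 2 for z in eigs)        # max |p| on the spectrum
    norm = np.linalg.norm(p_of_A(A, coeffs), 2)   # spectral norm of p(A)
    print(f"{name:9s} ||p(A)|| = {norm:9.2f}   max |p| on spectrum = {bound:.2f}")
```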
The complexity of computing the MCD-estimator
DEFF Research Database (Denmark)
Bernholt, T.; Fischer, Paul
2004-01-01
In modem statistics the robust estimation of parameters is a central problem, i.e., an estimation that is not or only slightly affected by outliers in the data. The minimum covariance determinant (MCD) estimator (J. Amer. Statist. Assoc. 79 (1984) 871) is probably one of the most important robust...... estimators of location and scatter. The complexity of computing the MCD, however, was unknown and generally thought to be exponential even if the dimensionality of the data is fixed. Here we present a polynomial time algorithm for MCD for fixed dimension of the data. In contrast we show that computing...... the MCD-estimator is NP-hard if the dimension varies. (C) 2004 Elsevier B.V. All rights reserved....
On generalized entropies, Bayesian decisions and statistical diversity
Czech Academy of Sciences Publication Activity Database
Vajda, Igor; Zvárová, Jana
2007-01-01
Roč. 43, č. 5 (2007), s. 675-696 ISSN 0023-5954 R&D Projects: GA ČR(CZ) GA102/07/1131; GA MŠk(CZ) 1M06014 Institutional research plan: CEZ:AV0Z10750506; CEZ:AV0Z10300504 Keywords : Generalized information * Generalized entropy * Power entropy * Bayes error * Simpson diversity * Emlen diversity Subject RIV: BD - Theory of Information Impact factor: 0.552, year: 2007
International Nuclear Information System (INIS)
Beedgen, R.
1988-03-01
The computer program PROSA (PROgram for Statistical Analysis of near-real-time accountancy data) was developed as a tool for applying statistical test procedures to a sequence of materials balance results in order to detect losses of material. First applications of PROSA to model facility data and real plant data showed that PROSA is also usable as a tool for process or measurement control. To deepen experience with the application of PROSA to real data from bulk-handling facilities, we applied it to uranium data from the Allied General Nuclear Services miniruns, where accountancy data were collected on a near-real-time basis. Minirun 6 in particular was considered, and the pulsed columns were chosen as the materials balance area. The structures of the measurement models for flow-sheet data and actual operation data are compared, and methods are studied to reduce the error of inventory measurements of the columns.
Blind identification and separation of complex-valued signals
Moreau, Eric
2013-01-01
Blind identification consists of estimating a multi-dimensional system only through the use of its output, and source separation is the blind estimation of the inverse of the system. Estimation is generally carried out using different statistics of the output. The authors of this book consider the blind identification and source separation problem in the complex domain, where the available statistical properties are richer and include non-circularity of the sources (the underlying components). They define identifiability conditions and present state-of-the-art algorithms that are based on algebraic methods as well as iterative algorithms based on maximum likelihood theory. Contents: 1. Mathematical Preliminaries. 2. Estimation by Joint Diagonalization. 3. Maximum Likelihood ICA. About the Authors: Eric Moreau is Professor of Electrical Engineering at the University of Toulon, France. His research interests concern statistical signal processing, higher-order statistics and matrix/tensor decompositions with applic...
UPPAAL-SMC: Statistical Model Checking for Priced Timed Automata
DEFF Research Database (Denmark)
Bulychev, Petr; David, Alexandre; Larsen, Kim Guldstrand
2012-01-01
UPPAAL-SMC rests on a series of extensions of the statistical model checking approach, generalized to handle real-time systems and estimate undecidable problems. UPPAAL-SMC comes together with a friendly user interface that allows a user to specify complex problems in an efficient manner, to get feedback in the form of probability distributions, and to compare probabilities in order to analyze performance aspects of systems. The focus of the survey is on the evolution of the tool – including modeling and specification formalisms as well as techniques applied – together with applications of the tool to case studies.
Rohatgi, Vijay K
2003-01-01
Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth
A general exit strategy of monoheme cytochromes c and c2 in electron transfer complexes?
De March, Matteo; Brancatelli, Giovanna; Demitri, Nicola; De Zorzi, Rita; Hickey, Neal; Geremia, Silvano
2015-09-01
Using our previously reported maps of the electrostatic surface of horse heart ferri- and ferro-cyt c, comparisons were made between the complementary electrostatic surfaces of three cyt c peroxidase-cyt c complexes and the photosynthetic reaction center-cyt c complex, considering both iron oxidation states. The results obtained were consistent with a sliding mechanism for the electron shuttle on the surface of the protein complexes, promoted by the change in iron oxidation state. This mechanism was found to be in agreement with theoretical and NMR studies reported in the literature. Importantly, the analysis also provided a rationale for recognition of nonproductive associations. As we have previously reported the same conclusion on examination of redox partners of cyt c in the mitochondrial respiratory pathway, our hypothesis is that the proposed mechanism could represent a general exit strategy of monoheme cyts c and c2 in electron transfer complexes. © 2015 International Union of Biochemistry and Molecular Biology.
Complex network approach to characterize the statistical features of the sunspot series
International Nuclear Information System (INIS)
Zou, Yong; Liu, Zonghua; Small, Michael; Kurths, Jürgen
2014-01-01
Complex network approaches have been recently developed as an alternative framework to study the statistical features of time-series data. We perform a visibility-graph analysis on both the daily and monthly sunspot series. Based on the data, we propose two ways to construct the network: one is from the original observable measurements and the other is from a negative-inverse-transformed series. The degree distribution of the derived networks for the strong maxima has clear non-Gaussian properties, while the degree distribution for minima is bimodal. The long-term variation of the cycles is reflected by hubs in the network that span relatively large time intervals. Based on standard network structural measures, we propose to characterize the long-term correlations by waiting times between two subsequent events. The persistence range of the solar cycles has been identified over 15–1000 days by a power-law regime with scaling exponent γ = 2.04 of the occurrence time of two subsequent strong minima. In contrast, a persistent trend is not present in the maximal numbers, although maxima do have significant deviations from an exponential form. Our results suggest some new insights for evaluating existing models. (paper)
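The graph construction at the heart of this analysis fits in a dozen lines. Below is a hedged Python sketch of the natural visibility criterion on a toy series; the sunspot data themselves, and the paper's negative-inverse transformation, are not reproduced here.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(0)

def visibility_edges(y):
    # natural visibility: i "sees" j if the straight line between
    # (i, y_i) and (j, y_j) clears every intermediate point
    n = len(y)
    for i in range(n):
        for j in range(i + 1, n):
            k = np.arange(i + 1, j)
            if np.all(y[k] < y[i] + (y[j] - y[i]) * (k - i) / (j - i)):
                yield i, j

# toy series with "cycles", loosely mimicking quasi-periodic activity
y = np.abs(rng.normal(size=500)) + np.sin(np.arange(500) / 10)

degree = Counter()
for i, j in visibility_edges(y):
    degree[i] += 1
    degree[j] += 1

hist = Counter(degree.values())
print("degree distribution (degree: count):", dict(sorted(hist.items())[:10]))
```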
Functional statistics and related fields
Bongiorno, Enea; Cao, Ricardo; Vieu, Philippe
2017-01-01
This volume collects the latest methodological and applied contributions on functional, high-dimensional and other complex data, related statistical models and tools as well as on operator-based statistics. It contains selected and refereed contributions presented at the Fourth International Workshop on Functional and Operatorial Statistics (IWFOS 2017) held in A Coruña, Spain, from 15 to 17 June 2017. The series of IWFOS workshops was initiated by the Working Group on Functional and Operatorial Statistics at the University of Toulouse in 2008. Since then, many of the major advances in functional statistics and related fields have been periodically presented and discussed at the IWFOS workshops.
Koorehdavoudi, Hana; Bogdan, Paul
2016-06-01
Biological systems are frequently categorized as complex systems due to their capabilities of generating spatio-temporal structures from apparent random decisions. In spite of research on analyzing biological systems, we lack a quantifiable framework for measuring their complexity. To fill this gap, in this paper, we develop a new paradigm to study a collective group of N agents moving and interacting in a three-dimensional space. Our paradigm helps to identify the spatio-temporal states of the motion of the group and their associated transition probabilities. This framework enables the estimation of the free energy landscape corresponding to the identified states. Based on the energy landscape, we quantify missing information, emergence, self-organization and complexity for a collective motion. We show that the collective motion of the group of agents evolves to reach the most probable state, with the lowest energy level and the least missing information among the possible states. Our analysis demonstrates that natural groups of animals exhibit a higher degree of emergence, self-organization and complexity over time. Consequently, this algorithm can be integrated into new frameworks to engineer collective motions to achieve certain degrees of emergence, self-organization and complexity.
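As a rough illustration of the state-probability step described above, the sketch below estimates occupation probabilities of discrete motion states from a trajectory and maps them to free energies via F_i = -ln p_i (in units of kT). The state labels and trajectory are hypothetical; the paper's actual state identification is more involved.

```python
import numpy as np
from collections import Counter

def free_energy_landscape(states):
    """Map empirical state probabilities p_i to free energies F_i = -ln p_i."""
    counts = Counter(states)
    total = sum(counts.values())
    return {s: -np.log(c / total) for s, c in counts.items()}

# hypothetical coarse-grained states of a collective-motion trajectory
traj = ["aligned"] * 60 + ["milling"] * 25 + ["swarm"] * 15
for state, F in sorted(free_energy_landscape(traj).items(), key=lambda kv: kv[1]):
    print(f"{state:8s} F = {F:.2f} kT")
```

The most frequently visited state ("aligned" here) sits at the bottom of the landscape, matching the abstract's observation that the group evolves toward the most probable, lowest-energy state.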
Critical statistics for non-Hermitian matrices
International Nuclear Information System (INIS)
Garcia-Garcia, A.M.; Verbaarschot, J.J.M.; Nishigaki, S.M.
2002-01-01
We introduce a generalized ensemble of non-Hermitian matrices interpolating between the Gaussian Unitary Ensemble, the Ginibre ensemble, and the Poisson ensemble. The joint eigenvalue distribution of this model is obtained by means of an extension of the Itzykson-Zuber formula to general complex matrices. Its correlation functions are studied both in the case of weak non-Hermiticity and in the case of strong non-Hermiticity. In the weak non-Hermiticity limit we show that the spectral correlations in the bulk of the spectrum display critical statistics: the asymptotic linear behavior of the number variance is already approached for energy differences of the order of the eigenvalue spacing. To lowest order, its slope does not depend on the degree of non-Hermiticity. Close to the edge, the spectral correlations are similar to the Hermitian case. In the strong non-Hermiticity limit the crossover behavior from the Ginibre ensemble to the Poisson ensemble first appears close to the surface of the spectrum. Our model may be relevant for the description of the spectral correlations of an open disordered system close to an Anderson transition.
Statistical benchmark for BosonSampling
International Nuclear Information System (INIS)
Walschaers, Mattia; Mayer, Klaus; Buchleitner, Andreas; Kuipers, Jack; Urbina, Juan-Diego; Richter, Klaus; Tichy, Malte Christopher
2016-01-01
Boson samplers—set-ups that generate complex many-particle output states through the transmission of elementary many-particle input states across a multitude of mutually coupled modes—promise the efficient quantum simulation of a classically intractable computational task, and challenge the extended Church–Turing thesis, one of the fundamental dogmas of computer science. However, as in all experimental quantum simulations of truly complex systems, one crucial problem remains: how to certify that a given experimental measurement record unambiguously results from enforcing the claimed dynamics, on bosons, fermions or distinguishable particles? Here we offer a statistical solution to the certification problem, identifying an unambiguous statistical signature of many-body quantum interference upon transmission across a multimode, random scattering device. We show that statistical analysis of only partial information on the output state allows one to characterise the imparted dynamics through particle type-specific features of the emerging interference patterns. The relevant statistical quantifiers are classically computable, define a falsifiable benchmark for BosonSampling, and reveal distinctive features of many-particle quantum dynamics, which go much beyond mere bunching or anti-bunching effects. (fast track communication)
Quantum mechanics as applied mathematical statistics
International Nuclear Information System (INIS)
Skala, L.; Cizek, J.; Kapsa, V.
2011-01-01
Basic mathematical apparatus of quantum mechanics like the wave function, probability density, probability density current, coordinate and momentum operators, corresponding commutation relation, Schroedinger equation, kinetic energy, uncertainty relations and continuity equation is discussed from the point of view of mathematical statistics. It is shown that the basic structure of quantum mechanics can be understood as generalization of classical mechanics in which the statistical character of results of measurement of the coordinate and momentum is taken into account and the most important general properties of statistical theories are correctly respected.
Facts and Findings on the Statistical Discrepancies of the Korean Balance of Payments
Directory of Open Access Journals (Sweden)
Sangyong Joo
2000-03-01
Korea has experienced drastic increases in the statistical discrepancy of its balance of payments since 1997. In general, the expansion and growing complexity of external transactions induced by capital account liberalization have contributed to this. The abolition of the export/import approval system seems to have accelerated the mismeasurement and omissions in external transactions. The influence of the currency crisis cannot be ruled out either. Among economic factors, Won/dollar exchange rate volatility is found to have significant explanatory power for the magnitude of the statistical discrepancy, while exchange rate returns do not. We interpret this as the demand for a relatively safe currency rising in the presence of uncertainty in domestic currency values.
Parsons, Nick R; Price, Charlotte L; Hiskens, Richard; Achten, Juul; Costa, Matthew L
2012-04-25
The application of statistics in reported research in trauma and orthopaedic surgery has become ever more important and complex. Despite the extensive use of statistical analysis, it is still a subject which is often not conceptually well understood, resulting in clear methodological flaws and inadequate reporting in many papers. A detailed statistical survey sampled 100 representative orthopaedic papers using a validated questionnaire that assessed the quality of the trial design and statistical analysis methods. The survey found evidence of failings in study design, statistical methodology and presentation of the results. Overall, in 17% (95% confidence interval; 10-26%) of the studies investigated the conclusions were not clearly justified by the results, in 39% (30-49%) of studies a different analysis should have been undertaken and in 17% (10-26%) a different analysis could have made a difference to the overall conclusions. It is only by an improved dialogue between statistician, clinician, reviewer and journal editor that the failings in design methodology and analysis highlighted by this survey can be addressed.
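The quoted "17% (10-26%)" is what an exact binomial confidence interval gives for 17 papers out of 100. A minimal check, assuming a Clopper-Pearson interval (the survey's exact method is not stated):

```python
from scipy.stats import beta

def clopper_pearson(k, n, alpha=0.05):
    """Exact two-sided confidence interval for a binomial proportion."""
    lo = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    return lo, hi

lo, hi = clopper_pearson(17, 100)  # 17 of 100 papers
print(f"17% (95% CI {lo:.0%}-{hi:.0%})")  # about 10%-26%, as reported
```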
Exclusion statistics and integrable models
International Nuclear Information System (INIS)
Mashkevich, S.
1998-01-01
The definition of exclusion statistics that was given by Haldane admits a 'statistical interaction' between distinguishable particles (multispecies statistics). For such statistics, thermodynamic quantities can be evaluated exactly; explicit expressions are presented here for cluster coefficients. Furthermore, single-species exclusion statistics is realized in one-dimensional integrable models of the Calogero-Sutherland type. The interesting questions of generalizing this correspondence to the higher-dimensional and the multispecies cases remain essentially open; however, our results provide some hints as to searches for the models in question.
Mathematical statistics and stochastic processes
Bosq, Denis
2013-01-01
Generally, books on mathematical statistics are restricted to the case of independent identically distributed random variables. In this book however, both this case AND the case of dependent variables, i.e. statistics for discrete and continuous time processes, are studied. This second case is very important for today's practitioners.Mathematical Statistics and Stochastic Processes is based on decision theory and asymptotic statistics and contains up-to-date information on the relevant topics of theory of probability, estimation, confidence intervals, non-parametric statistics and rob
Goodman, J. W.
This book is based on the thesis that some training in the area of statistical optics should be included as a standard part of any advanced optics curriculum. Random variables are discussed, taking into account definitions of probability and random variables, distribution functions and density functions, an extension to two or more random variables, statistical averages, transformations of random variables, sums of real random variables, Gaussian random variables, complex-valued random variables, and random phasor sums. Other subjects examined are related to random processes, some first-order properties of light waves, the coherence of optical waves, some problems involving high-order coherence, effects of partial coherence on imaging systems, imaging in the presence of randomly inhomogeneous media, and fundamental limits in photoelectric detection of light. Attention is given to deterministic versus statistical phenomena and models, the Fourier transform, and the fourth-order moment of the spectrum of a detected speckle image.
Kalegowda, Yogesh; Harmer, Sarah L
2012-03-20
Time-of-flight secondary ion mass spectrometry (TOF-SIMS) spectra of mineral samples are complex, comprised of large mass ranges and many peaks. Consequently, characterization and classification analysis of these systems is challenging. In this study, different chemometric and statistical data evaluation methods, based on monolayer sensitive TOF-SIMS data, have been tested for the characterization and classification of copper-iron sulfide minerals (chalcopyrite, chalcocite, bornite, and pyrite) at different flotation pulp conditions (feed, conditioned feed, and Eh modified). The complex mass spectral data sets were analyzed using the following chemometric and statistical techniques: principal component analysis (PCA); principal component-discriminant functional analysis (PC-DFA); soft independent modeling of class analogy (SIMCA); and k-Nearest Neighbor (k-NN) classification. PCA was found to be an important first step in multivariate analysis, providing insight into both the relative grouping of samples and the elemental/molecular basis for those groupings. For samples exposed to oxidative conditions (at Eh ~430 mV), each technique (PCA, PC-DFA, SIMCA, and k-NN) was found to produce excellent classification. For samples at reductive conditions (at Eh ~ -200 mV SHE), k-NN and SIMCA produced the most accurate classification. Phase identification of particles that contain the same elements but a different crystal structure in a mixed multimetal mineral system has been achieved.
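A minimal sketch of the PCA-then-classification pipeline the study evaluates, using scikit-learn; the synthetic "spectra" and all parameter choices below are illustrative assumptions, not the authors' settings.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

# synthetic stand-in for TOF-SIMS spectra: 80 samples x 500 mass channels,
# four "mineral" classes distinguished by boosted intensity bands
rng = np.random.default_rng(1)
X = rng.lognormal(size=(80, 500))
y = np.repeat(np.arange(4), 20)
for cls in range(1, 4):
    X[y == cls, (cls - 1) * 50:cls * 50] *= 1.5

# PCA for dimensionality reduction, then k-NN classification
clf = make_pipeline(PCA(n_components=10), KNeighborsClassifier(n_neighbors=3))
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```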
Geographical National Condition and Complex System
Directory of Open Access Journals (Sweden)
WANG Jiayao
2016-01-01
The significance of studying the complex system of geographical national conditions lies in rationally expressing the complex relationships of the "resources-environment-ecology-economy-society" system. Aiming at the problems faced by the statistical analysis of geographical national conditions, including the disunity of research contents, the inconsistency of range, the uncertainty of goals, etc., the present paper conducted a range of discussions from the perspectives of concept, theory and method, and designed some solutions based on complex system theory and coordination degree analysis methods. By analyzing the concepts of geographical national conditions, the geographical national conditions survey and geographical national conditions statistical analysis, as well as investigating the relationships between them, the statistical contents and the analytical range of geographical national conditions are clarified and defined. This investigation also clarifies the goals of the statistical analysis by analyzing the basic characteristics of geographical national conditions and the complex system, and the consistency between coordination degree analysis and statistical analysis. It outlines these goals, proposes a concept for the complex system of geographical national conditions, and describes it. Complex system theory provides new theoretical guidance for the statistical analysis of geographical national conditions. The degree of coordination offers new approaches for undertaking the analysis, based on the measurement method and decision-making analysis scheme upon which the complex system of geographical national conditions is based. It analyzes the overall trend via the degree of coordination of the complex system on a macro level, and it determines the direction of remediation on a micro level based on the degree of coordination among various subsystems and of single systems. These results establish
General renormalized statistical approach with finite cross-field correlations
International Nuclear Information System (INIS)
Vakulenko, M.O.
1992-01-01
The renormalized statistical approach is proposed, accounting for finite correlations of potential and magnetic fluctuations. It may be used for analysis of a wide class of nonlinear model equations describing cross-correlated plasma states. The influence of the cross spectrum on the stationary potential and magnetic spectra is investigated. 10 refs. (author)
Directory of Open Access Journals (Sweden)
Laura Badenes-Ribera
2018-06-01
Introduction: Publications arguing against the null hypothesis significance testing (NHST) procedure and in favor of good statistical practices have increased. The most frequently mentioned alternatives to NHST are effect size statistics (ES), confidence intervals (CIs), and meta-analyses. A recent survey conducted in Spain found that academic psychologists have poor knowledge about effect size statistics, confidence intervals, and graphic displays for meta-analyses, which might lead to a misinterpretation of the results. It also found that, although the use of ES is becoming generalized, the same is not true for CIs. Finally, academics with greater knowledge about ES statistics presented a profile closer to good statistical practice and research design. Our main purpose was to analyze the extension of these results to a different geographical area through a replication study. Methods: For this purpose, we elaborated an on-line survey that included the same items as the original research, and we asked academic psychologists to indicate their level of knowledge about ES, CIs, and meta-analyses, and how they use them. The sample consisted of 159 Italian academic psychologists (54.09% women; mean age 47.65 years). The mean number of years in the position of professor was 12.90 (SD = 10.21). Results: As in the original research, the results showed that, although the use of effect size estimates is becoming generalized, an under-reporting of CIs for ES persists. The most frequently mentioned ES statistics were Cohen's d and R2/η2, which can have outliers or show non-normality or violate statistical assumptions. In addition, academics showed poor knowledge about meta-analytic displays (e.g., forest plot and funnel plot) and quality checklists for studies. Finally, academics with higher-level knowledge about ES statistics seem to have a profile closer to good statistical practices. Conclusions: Changing statistical practice is not
A unifying framework for k-statistics, polykays and their multivariate generalizations.
DI NARDO, Elvira; GUARINO G, G.; Senato, D.
2008-01-01
Through the classical umbral calculus, we provide a unifying syntax for single and multivariate $k$-statistics, polykays and multivariate polykays. From a combinatorial point of view, we revisit the theory as exposed by Stuart and Ord, taking into account the Doubilet approach to symmetric functions. Moreover, by using exponential polynomials rather than set partitions, we provide a new formula for $k$-statistics that results in a very fast algorithm to generate such estimators.
Advances in dynamics, patterns, cognition challenges in complexity
Pikovsky, Arkady; Rulkov, Nikolai; Tsimring, Lev
2017-01-01
This book focuses on recent progress in complexity research based on the fundamental nonlinear dynamical and statistical theory of oscillations, waves, chaos, and structures far from equilibrium. Celebrating seminal contributions to the field by Prof. M. I. Rabinovich of the University of California at San Diego, this volume brings together perspectives on both the fundamental aspects of complexity studies and applications in different fields, ranging from granular patterns to understanding of the cognitive brain and mind dynamics. A slate of world-class authors reviews recent achievements that together present broad and coherent coverage of modern research in complexity, greater than the sum of its parts. The book presents the most up-to-date developments in the studies of complexity, combines basic and applied aspects, links the background nonlinear theory of oscillations and waves with modern approaches, and allows readers to recognize general dynamical principles across application fields.
Bayesian Statistics and Uncertainty Quantification for Safety Boundary Analysis in Complex Systems
He, Yuning; Davies, Misty Dawn
2014-01-01
The analysis of a safety-critical system often requires detailed knowledge of safe regions and their high-dimensional non-linear boundaries. We present a statistical approach to iteratively detect and characterize the boundaries, which are provided as parameterized shape candidates. Using methods from uncertainty quantification and active learning, we incrementally construct a statistical model from only a few simulation runs and obtain statistically sound estimates of the shape parameters for safety boundaries.
Statistical Tutorial | Center for Cancer Research
Recent advances in cancer biology have resulted in the need for increased statistical analysis of research data. ST is designed as a follow up to Statistical Analysis of Research Data (SARD) held in April 2018. The tutorial will apply the general principles of statistical analysis of research data including descriptive statistics, z- and t-tests of means and mean
Lectures on statistical mechanics
Bowler, M G
1982-01-01
Anyone dissatisfied with the almost ritual dullness of many 'standard' texts in statistical mechanics will be grateful for the lucid explanation and generally reassuring tone. Aimed at securing firm foundations for equilibrium statistical mechanics, topics of great subtlety are presented transparently and enthusiastically. Very little mathematical preparation is required beyond elementary calculus and prerequisites in physics are limited to some elementary classical thermodynamics. Suitable as a basis for a first course in statistical mechanics, the book is an ideal supplement to more convent
Gibbs' theorem for open systems with incomplete statistics
International Nuclear Information System (INIS)
Bagci, G.B.
2009-01-01
Gibbs' theorem, which is originally intended for canonical ensembles with complete statistics, has been generalized to open systems with incomplete statistics. As a result of this generalization, it is shown that the stationary equilibrium distribution of inverse power-law form associated with incomplete statistics has maximum entropy even for open systems with energy or matter influx. The renormalized entropy definition given in this paper can also serve as a measure of self-organization in open systems described by incomplete statistics.
Practical statistics a handbook for business projects
Buglear, John
2013-01-01
Practical Statistics is a hands-on guide to statistics, progressing by complexity of data (univariate, bivariate, multivariate) and analysis (portray, summarise, generalise) in order to give the reader a solid understanding of the fundamentals and how to apply them.
Mandl, Franz
1988-01-01
The Manchester Physics Series. General Editors: D. J. Sandiford, F. Mandl and A. C. Phillips, Department of Physics and Astronomy, University of Manchester. Titles in the series: Properties of Matter (B. H. Flowers and E. Mendoza); Optics, Second Edition (F. G. Smith and J. H. Thomson); Statistical Physics, Second Edition (F. Mandl); Electromagnetism, Second Edition (I. S. Grant and W. R. Phillips); Statistics (R. J. Barlow); Solid State Physics, Second Edition (J. R. Hook and H. E. Hall); Quantum Mechanics (F. Mandl); Particle Physics, Second Edition (B. R. Martin and G. Shaw); The Physics of Stars, Second Edition (A. C. Phillips); Computing for Scient
Frontiers in statistical quality control
Wilrich, Peter-Theodor
2004-01-01
This volume treats the four main categories of Statistical Quality Control: General SQC Methodology; On-line Control, including Sampling Inspection and Statistical Process Control; Off-line Control, with Data Analysis and Experimental Design; and fields related to Reliability. Experts with international reputations present their newest contributions.
Statistics for non-statisticians
Madsen, Birger Stjernholm
2016-01-01
This book was written for those who need to know how to collect, analyze and present data. It is meant to be a first course for practitioners, a book for private study or brush-up on statistics, and supplementary reading for general statistics classes. The book is untraditional, both with respect to the choice of topics and the presentation: Topics were determined by what is most useful for practical statistical work, and the presentation is as non-mathematical as possible. The book contains many examples using statistical functions in spreadsheets. In this second edition, new topics have been included e.g. within the area of statistical quality control, in order to make the book even more useful for practitioners working in industry.
Quantum Kolmogorov complexity and the quantum Turing machine
Energy Technology Data Exchange (ETDEWEB)
Mueller, M.
2007-08-31
The purpose of this thesis is to give a formal definition of quantum Kolmogorov complexity and rigorous mathematical proofs of its basic properties. Classical Kolmogorov complexity is a well-known and useful measure of randomness for binary strings. In recent years, several different quantum generalizations of Kolmogorov complexity have been proposed. The most natural generalization is due to A. Berthiaume et al. (2001), defining the complexity of a quantum bit (qubit) string as the length of the shortest quantum input for a universal quantum computer that outputs the desired string. Except for slight modifications, it is this definition of quantum Kolmogorov complexity that we study in this thesis. We start by analyzing certain aspects of the underlying quantum Turing machine (QTM) model with more formal rigour than was done previously. Afterwards, we apply these results to quantum Kolmogorov complexity. Our first result is a proof of the existence of a universal QTM which simulates every other QTM for an arbitrary number of time steps and then halts with probability one. In addition, we show that every input that makes a QTM almost halt can be modified to make the universal QTM halt entirely, by adding at most a constant number of qubits. It follows that quantum Kolmogorov complexity has the invariance property, i.e. it depends on the choice of the universal QTM only up to an additive constant. Moreover, the quantum complexity of classical strings agrees with classical complexity, again up to an additive constant. The proofs are based on several analytic estimates. Furthermore, we prove several incompressibility theorems for quantum Kolmogorov complexity. Finally, we show that for ergodic quantum information sources, complexity rate and entropy rate coincide with probability one. The thesis concludes with an outlook on a possible application of quantum Kolmogorov complexity in statistical mechanics. (orig.)
Statistical physics of non-thermal phase transitions from foundations to applications
Abaimov, Sergey G
2015-01-01
Statistical physics can be used to better understand non-thermal complex systems—phenomena such as stock-market crashes, revolutions in society and in science, fractures in engineered materials and in the Earth’s crust, catastrophes, traffic jams, petroleum clusters, polymerization, self-organized criticality and many others exhibit behaviors resembling those of thermodynamic systems. In particular, many of these systems possess phase transitions identical to critical or spinodal phenomena in statistical physics. The application of the well-developed formalism of statistical physics to non-thermal complex systems may help to predict and prevent such catastrophes as earthquakes, snow-avalanches and landslides, failure of engineering structures, or economical crises. This book addresses the issue step-by-step, from phenomenological analogies between complex systems and statistical physics to more complex aspects, such as correlations, fluctuation-dissipation theorem, susceptibility, the concept of free ener...
Statistical principles for prospective study protocols:
DEFF Research Database (Denmark)
Christensen, Robin; Langberg, Henning
2012-01-01
In the design of scientific studies it is essential to decide which scientific questions one aims to answer, just as it is important to decide on the correct statistical methods to use to answer these questions. The correct use of statistical methods is crucial in all aspects of research … to quantify relationships in data. Despite an increased focus on the statistical content and complexity of biomedical research, these topics remain difficult for most researchers. Statistical methods enable researchers to condense large spreadsheets with data into means, proportions, and differences between means … the statistical principles for trial protocols in terms of design, analysis, and reporting of findings.
Joint probability of statistical success of multiple phase III trials.
Zhang, Jianliang; Zhang, Jenny J
2013-01-01
In drug development, after completion of phase II proof-of-concept trials, the sponsor needs to make a go/no-go decision to start expensive phase III trials. The probability of statistical success (PoSS) of the phase III trials based on data from earlier studies is an important factor in that decision-making process. Instead of statistical power, the predictive power of a phase III trial, which takes into account the uncertainty in the estimation of treatment effect from earlier studies, has been proposed to evaluate the PoSS of a single trial. However, regulatory authorities generally require statistical significance in two (or more) trials for marketing licensure. We show that the predictive statistics of two future trials are statistically correlated through use of the common observed data from earlier studies. Thus, the joint predictive power should not be evaluated as a simplistic product of the predictive powers of the individual trials. We develop the relevant formulae for the appropriate evaluation of the joint predictive power and provide numerical examples. Our methodology is further extended to the more complex phase III development scenario comprising more than two (K > 2) trials, that is, the evaluation of the PoSS of at least k₀ (k₀≤ K) trials from a program of K total trials. Copyright © 2013 John Wiley & Sons, Ltd.
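The correlation the authors highlight can be made concrete with a small Monte Carlo sketch: draw plausible true effects from the phase II sampling distribution, compute each phase III trial's conditional power, and average. All numbers below are hypothetical, and the normal-normal setup is our simplifying assumption, not the paper's formulae.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)

delta_hat, se2 = 0.3, 0.15   # hypothetical phase II estimate and its SE
se3 = 0.08                   # assumed SE of each phase III trial
z = norm.ppf(0.975)          # two-sided 5% significance hurdle per trial

delta = rng.normal(delta_hat, se2, 200_000)  # uncertainty about the true effect
p_sig = norm.cdf(delta / se3 - z)            # conditional power of one trial

single = p_sig.mean()            # predictive power of a single trial
joint = (p_sig ** 2).mean()      # both trials significant (shared uncertainty)
print(f"single: {single:.3f}, joint: {joint:.3f}, naive product: {single**2:.3f}")
```

Because E[p^2] >= (E[p])^2, the joint PoSS always exceeds the naive product of the individual predictive powers, which is exactly the correlation effect described in the abstract.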
Proof of the Spin Statistics Connection 2: Relativistic Theory
Santamato, Enrico; De Martini, Francesco
2017-12-01
The traditional standard theory of quantum mechanics is unable to solve the spin-statistics problem, i.e. to justify the utterly important "Pauli Exclusion Principle", except by the adoption of the complex standard relativistic quantum field theory. In a recent paper (Santamato and De Martini in Found Phys 45(7):858-873, 2015) we presented a proof of the spin-statistics connection in the nonrelativistic approximation on the basis of the "Conformal Quantum Geometrodynamics". In the present paper, by the same theory the proof of the spin-statistics theorem is extended to the relativistic domain in the general scenario of curved spacetime. The relativistic approach allows one to formulate a manifestly step-by-step Weyl gauge invariant theory and to emphasize some fundamental aspects of group theory in the demonstration. No relativistic quantum field operators are used and the particle exchange properties are drawn from the conservation of the intrinsic helicity of elementary particles. It is therefore this property, not considered in the standard quantum mechanics, which determines the correct spin-statistics connection observed in Nature (Santamato and De Martini in Found Phys 45(7):858-873, 2015). The present proof of the spin-statistics theorem is simpler than the one presented in Santamato and De Martini (Found Phys 45(7):858-873, 2015), because it is based on symmetry group considerations only, without having recourse to frames attached to the particles. Second quantization and anticommuting operators are not necessary.
A New Look at Worst Case Complexity: A Statistical Approach
Directory of Open Access Journals (Sweden)
Niraj Kumar Singh
2014-01-01
We present a new and improved worst case complexity model for quick sort as y_worst(n, t_d) = b_0 + b_1 n^2 + g(n, t_d) + ε, where the LHS gives the worst case time complexity, n is the input size, t_d is the frequency of sample elements, and g(n, t_d) is a function of both the input size n and the parameter t_d. The rest of the terms arising due to linear regression have their usual meanings. We claim this to be an improvement over the conventional model, namely, y_worst(n) = b_0 + b_1 n + b_2 n^2 + ε, which stems from the worst case O(n^2) complexity for this algorithm.
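To make the regression concrete, the sketch below times a first-element-pivot quicksort on its worst-case (already sorted) inputs and fits the conventional model y_worst(n) = b0 + b1 n + b2 n^2 + ε by least squares; the paper's extra g(n, t_d) term is omitted here.

```python
import sys
import time
import numpy as np

sys.setrecursionlimit(20000)  # sorted input drives recursion depth ~ n

def quicksort(a):
    """First-element pivot: sorted input triggers the O(n^2) worst case."""
    if len(a) <= 1:
        return a
    pivot, rest = a[0], a[1:]
    return (quicksort([x for x in rest if x < pivot]) + [pivot]
            + quicksort([x for x in rest if x >= pivot]))

sizes = np.array([200.0, 400.0, 600.0, 800.0, 1000.0])
times = []
for n in sizes:
    t0 = time.perf_counter()
    quicksort(list(range(int(n))))   # already sorted: worst case
    times.append(time.perf_counter() - t0)

# least-squares fit of y = b0 + b1*n + b2*n^2
X = np.column_stack([np.ones_like(sizes), sizes, sizes ** 2])
b, *_ = np.linalg.lstsq(X, np.array(times), rcond=None)
print("b0, b1, b2 =", b)
```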
Statistics in action a Canadian outlook
Lawless, Jerald F
2014-01-01
Commissioned by the Statistical Society of Canada (SSC), Statistics in Action: A Canadian Outlook helps both general readers and users of statistics better appreciate the scope and importance of statistics. It presents the ways in which statistics is used while highlighting key contributions that Canadian statisticians are making to science, technology, business, government, and other areas. The book emphasizes the role and impact of computing in statistical modeling and analysis, including the issues involved with the huge amounts of data being generated by automated processes.The first two c
Unsupervised Learning and Generalization
DEFF Research Database (Denmark)
Hansen, Lars Kai; Larsen, Jan
1996-01-01
The concept of generalization is defined for a general class of unsupervised learning machines. The generalization error is a straightforward extension of the corresponding concept for supervised learning, and may be estimated empirically using a test set or by statistical means, in close analogy with supervised learning. The empirical and analytical estimates are compared for principal component analysis and for K-means clustering based density estimation.
International Nuclear Information System (INIS)
Zhang, Xiaoguang; Varga, Kalman; Pantelides, Sokrates T
2007-01-01
Band-theoretic methods with periodically repeated supercells have been a powerful approach for ground-state electronic structure calculations, but have not so far been adapted for quantum transport problems with open boundary conditions. Here we introduce a generalized Bloch theorem for complex periodic potentials and use a transfer-matrix formulation to cast the transmission probability in a scattering problem with open boundary conditions in terms of the complex wave vectors of a periodic system with absorbing layers, allowing a band technique for quantum transport calculations. The accuracy and utility of the method are demonstrated by the model problems of the transmission of an electron over a square barrier and the scattering of a phonon in an inhomogeneous nanowire. Application to the resistance of a twin boundary in nanocrystalline copper yields excellent agreement with recent experimental data.
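The first model problem mentioned, transmission of an electron over a square barrier, can be reproduced with ordinary 2x2 transfer matrices. This sketch uses units with m = ħ = 1 and standard plane-wave matching rather than the authors' complex band-structure formulation; it is a baseline the paper's method can be checked against.

```python
import numpy as np

def barrier_transmission(E, V0, a):
    """Transmission probability for a square barrier (height V0, width a),
    via plane-wave matching with 2x2 transfer matrices; m = hbar = 1."""
    k = np.sqrt(2 * E + 0j)          # wave vector outside the barrier
    q = np.sqrt(2 * (E - V0) + 0j)   # inside; imaginary if E < V0 (tunneling)

    def step(k1, k2):
        # continuity of psi and psi' at an interface between k1 and k2 regions
        r = k1 / k2
        return 0.5 * np.array([[1 + r, 1 - r], [1 - r, 1 + r]])

    def free(kk, d):
        # carry plane-wave amplitudes across a region of width d
        return np.array([[np.exp(1j * kk * d), 0], [0, np.exp(-1j * kk * d)]])

    M = step(q, k) @ free(q, a) @ step(k, q)
    return float(1 / abs(M[1, 1]) ** 2)   # det M = 1, so T = 1/|M22|^2

for E in (0.5, 1.5, 2.0, 5.0):
    print(f"E = {E:.1f}: T = {barrier_transmission(E, V0=1.0, a=2.0):.4f}")
```

For E > V0 this reproduces the textbook result T = [1 + V0^2 sin^2(qa) / (4E(E - V0))]^(-1), and for E < V0 the imaginary q yields the expected tunneling decay.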
Statistical inference for template aging
Schuckers, Michael E.
2006-04-01
A change in the classification error rates of a biometric device over time is often referred to as template aging. Here we offer two methods for determining whether the effect of time is statistically significant. The first is the use of a generalized linear model to determine whether these error rates change linearly over time. This approach generalizes previous work assessing the impact of covariates using generalized linear models. The second approach uses likelihood ratio test methodology. The focus here is on statistical methods for estimation, not the underlying cause of the change in error rates over time. These methodologies are applied to data from the National Institute of Standards and Technology Biometric Score Set Release 1. The results of these applications are discussed.
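A minimal sketch of the first approach, a binomial generalized linear model testing whether error rates drift with time (on the logit scale); the data below are simulated stand-ins, not the NIST score set.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)

# hypothetical monthly data: attempts and false rejections per month
months = np.arange(1, 25, dtype=float)
attempts = np.full(months.shape, 500, dtype=int)
errors = rng.binomial(attempts, 0.02 + 0.001 * months)  # drifting error rate

# GLM: logit(error rate) ~ time; a significant slope suggests template aging
X = sm.add_constant(months)
glm = sm.GLM(np.column_stack([errors, attempts - errors]), X,
             family=sm.families.Binomial())
print(glm.fit().summary().tables[1])  # inspect the time coefficient's p-value
```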
A General Linear Model (GLM) was used to evaluate the deviation of predicted values from expected values for a complex environmental model. For this demonstration, we used the default level interface of the Regional Mercury Cycling Model (R-MCM) to simulate epilimnetic total mer...
A general statistical test for correlations in a finite-length time series.
Hanson, Jeffery A; Yang, Haw
2008-06-07
The statistical properties of the autocorrelation function of a time series composed of independently and identically distributed stochastic variables have been studied. Analytical expressions for the autocorrelation function's variance have been derived. It has been found that two common ways of calculating the autocorrelation, moving-average and Fourier transform, exhibit different uncertainty characteristics. For periodic time series, the Fourier transform method is preferred because it gives smaller uncertainties that are uniform through all time lags. Based on these analytical results, a statistically robust method has been proposed to test the existence of correlations in a time series. The statistical test is verified by computer simulations and an application to single-molecule fluorescence spectroscopy is discussed.
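The Fourier-transform route to the autocorrelation, and a simple version of such a test, can be sketched as follows; the 1.96/sqrt(N) null band below is the common i.i.d. approximation, not the paper's exact variance expressions.

```python
import numpy as np

def autocorr_fft(x):
    """Normalized autocorrelation via the Wiener-Khinchin theorem."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    f = np.fft.rfft(x, 2 * n)                  # zero-pad to avoid circular wrap
    acf = np.fft.irfft(f * np.conj(f))[:n]
    return acf / acf[0]

rng = np.random.default_rng(3)
x = rng.normal(size=2000)                      # i.i.d. noise: no true correlation
r = autocorr_fft(x)

band = 1.96 / np.sqrt(len(x))                  # approximate 95% null band
exceed = int(np.sum(np.abs(r[1:50]) > band))
print(f"lags 1-49 exceeding the null band: {exceed} (about 2-3 expected by chance)")
```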
Statistical and Visualization Data Mining Tools for Foundry Production
Directory of Open Access Journals (Sweden)
M. Perzyk
2007-07-01
Recent years have seen the rapid development of a new, interdisciplinary knowledge area called data mining. Its main task is extracting useful information from large amounts of previously collected data. The main possibilities and potential applications of data mining in the manufacturing industry are characterized. The main types of data mining techniques are briefly discussed, including statistical, artificial intelligence, data base and visualization tools. The statistical and visualization methods are presented in more detail, showing their general possibilities and advantages, as well as characteristic examples of applications in foundry production. Results of the author's research are presented, aimed at validation of selected statistical tools which can be easily and effectively used in the manufacturing industry. A performance analysis of ANOVA- and contingency-table-based methods, dedicated to determining the most significant process parameters and to detecting possible interactions among them, has been made. Several numerical tests have been performed using simulated data sets with assumed hidden relationships, as well as some real data, related to the strength of ductile cast iron, collected in a foundry. It is concluded that the statistical methods offer relatively easy and fairly reliable tools for extraction of this type of knowledge about foundry manufacturing processes. However, further research is needed, aimed at explaining some imperfections of the investigated tools and assessing their validity for more complex tasks.
Imperfect generalized transmit beamforming with co-channel interference cancelation
Radaydeh, Redha Mahmoud Mesleh
2010-10-01
The performance of a generalized single-stream transmit beamforming scheme employing receive co-channel interference-steering algorithms in slowly varying and flat fading channels is analyzed. The impact of imperfect prediction of channel state information (CSI) for the desired user's spatially uncorrelated transmit channels is considered. Both dominant interference cancelation and adaptive arbitrary interference cancelation algorithms for closely spaced receive antennas are used. The impact of outdated statistical ordering of the interferers' instantaneous powers on the effectiveness of dominant interference cancelation is investigated against the less complex adaptive arbitrary cancelation scheme. For the system models described above, new exact formulas for the statistics of the combined signal-to-interference-plus-noise ratio (SINR) are derived, from which results for conventional maximum ratio transmission (MRT) and best transmit channel selection schemes can be deduced as limiting cases. The results presented herein can be used to obtain quantitative measures for various performance metrics, and in addition to investigate the performance-complexity tradeoff for different multiple-antenna system models. © 2010 IEEE.
Bowe, Conor M; Gargan, Mary Louise; Kearns, Gerard J; Stassen, Leo F A
2015-01-01
This is a retrospective study reviewing the treatment and management of patients presenting with odontogenic infections in a large urban teaching hospital over a four-year period, comparing the number and complexity of odontogenic infections presenting to an acute general hospital in two periods: Group A (January 2008 to March 2010) versus Group B (April 2010 to December 2011). The background to the study is an alteration in patient access to primary dental care instituted by the Department of Health in April 2010. The aims were: a) to identify any alteration in the pattern and complexity of patients' presentation with odontogenic infections following recent changes in access to treatment via the Dental Treatment Services Scheme (DTSS) and the Dental Treatment Benefit Scheme (DTBS) in April 2010; and b) to evaluate the management of severe odontogenic infections. Data were collated by a combination of a comprehensive chart review and electronic patient record analysis, based on the primary discharge diagnosis as recorded in the Hospital In-Patient Enquiry (HIPE) system. Fifty patients were admitted to the National Maxillofacial Unit, St James's Hospital, under the oral and maxillofacial service over the four-year period with an odontogenic infection as the primary diagnosis. More patients presented with odontogenic infections during the Group B period of the study, and these patients showed increased complexity and severity of infection. Although there was an upward trend in the number and complexity of infections, this trend did not reach statistical significance. The primary cause of infection was dental caries in all patients. Dental caries is a preventable and treatable disease. Increased resources should be made available to support access to dental care, and thereby lessen the potential for the morbidity and mortality associated with serious odontogenic infections. The study continues at present as a prospective study.
Statistical methods and materials characterisation
International Nuclear Information System (INIS)
Wallin, K.R.W.
2010-01-01
Statistics is a wide mathematical area covering a myriad of analysis and estimation options, some of which suit special cases better than others. Comprehensive coverage of the whole area of statistics would be an enormous effort and would also be outside the capabilities of this author. Therefore, this text does not intend to be a textbook on statistical methods available for general data analysis and decision making. Instead, it highlights a certain special statistical case applicable to mechanical materials characterization. The methods presented here do not in any way rule out other statistical methods by which to analyze mechanical property data. (orig.)
Annual Bulletin of General Energy Statistics for Europe. V. 23, 1990
International Nuclear Information System (INIS)
1992-01-01
The purpose of the Bulletin is to provide basic data on the energy situation as a whole in European countries, Canada and the United States of America. This publication is purely statistical in character. From the 1980 edition of the bulletin onward, the scope of the statistics comprises production of energy by form, overall energy balance sheets and deliveries of petroleum products for inland consumption. While fewer details are given for solid and gaseous fuels as sources of energy than in previous editions of the bulletin, more information is available for liquid fuels and nuclear, hydro- and geothermal energy.
Kowalski, Jeanne
2008-01-01
A timely and applied approach to the newly discovered methods and applications of U-statistics. Built on years of collaborative research and academic experience, Modern Applied U-Statistics successfully presents a thorough introduction to the theory of U-statistics using in-depth examples and applications that address contemporary areas of study including biomedical and psychosocial research. Utilizing a "learn by example" approach, this book provides an accessible, yet in-depth, treatment of U-statistics, as well as addresses key concepts in asymptotic theory by integrating translational and cross-disciplinary research. The authors begin with an introduction of the essential and theoretical foundations of U-statistics such as the notion of convergence in probability and distribution, basic convergence results, stochastic Os, inference theory, generalized estimating equations, as well as the definition and asymptotic properties of U-statistics. With an emphasis on nonparametric applications when and where applic
String theory flux vacua on twisted tori and generalized complex geometry
International Nuclear Information System (INIS)
Andriot, David
2010-01-01
This thesis is devoted to the study of flux vacua of string theory, with the ten-dimensional space-time split into a four-dimensional maximally symmetric space-time, and a six-dimensional internal manifold M, taken to be a solv-manifold (twisted torus). Such vacua are of particular interest when trying to relate string theory to supersymmetric (SUSY) extensions of the standard model of particles, or to cosmological models. For SUSY solutions of type II supergravities, allowing for fluxes on M helps to solve the moduli problem. Then, a broader class of manifolds than just the Calabi-Yau can be considered for M, and a general characterization is given in terms of Generalized Complex Geometry: M has to be a Generalized Calabi-Yau (GCY). A subclass of solv-manifolds have been proven to be GCY, so we look for solutions with such M. To do so, we use an algorithmic resolution method. Then we focus on specific new solutions: those admitting an intermediate SU(2) structure. A transformation named the twist is then discussed. It relates solutions on torus to solutions on solv-manifolds. Working out constraints on the twist to generate solutions, we can relate known solutions, and find a new one. We also use the twist to relate flux vacua of heterotic string. Finally we consider ten-dimensional de Sitter solutions. Looking for such solutions is difficult, because of several problems among which the breaking of SUSY. We propose an Ansatz for SUSY breaking sources which helps to overcome these difficulties. We give an explicit solution on a solv-manifold, and discuss partially its four-dimensional stability. (author)
Statistical Problems In Medical Research | Okeh | East African ...
African Journals Online (AJOL)
Given the main role of a general practitioner as a biostatistician, I thought it would be of interest to enumerate statistical problems in assessing methods of medical diagnosis in general terms. In conducting and reporting of medical research, there are some common problems in using statistical methodology which may result ...
Statistical distribution of resonance parameters for inelastic scattering of fast neutrons
International Nuclear Information System (INIS)
Radunovic, J.
1973-01-01
This paper deals with the application of statistical methods to the analysis of nuclear reactions involving complex nuclei. It is shown that inelastic neutron scattering, which occurs through the creation of a complex nucleus in the higher energy range, can be treated by a statistical approach.
Wave Mechanics or Wave Statistical Mechanics
International Nuclear Information System (INIS)
Qian Shangwu; Xu Laizi
2007-01-01
By comparing the equations of motion of geometrical optics with those of classical statistical mechanics, this paper finds that there should be an analogy between geometrical optics and classical statistical mechanics, rather than between geometrical optics and classical mechanics. Furthermore, by comparing the classical limit of quantum mechanics with classical statistical mechanics, it finds that the classical limit of quantum mechanics is classical statistical mechanics, not classical mechanics; hence it demonstrates that quantum mechanics is a natural generalization of classical statistical mechanics instead of classical mechanics. Thus quantum mechanics in its true appearance is a wave statistical mechanics rather than a wave mechanics.
Poot, Antonius J; den Elzen, Wendy P J; Blom, Jeanet W; Gussekloo, Jacobijn
2014-01-01
Satisfaction is widely used to evaluate and direct delivery of medical care; a complicated relationship exists between patient satisfaction, morbidity and age. This study investigates the relationships between complexity of health problems and level of patient satisfaction of older persons with their general practitioner (GP) and practice. This study is embedded in the ISCOPE (Integrated Systematic Care for Older Persons) study. Enlisted patients aged ≥75 years from 59 practices received a written questionnaire to screen for complex health problems (somatic, functional, psychological and social). For 2664 randomly chosen respondents (median age 82 years; 68% female) information was collected on level of satisfaction (satisfied, neutral, dissatisfied) with their GP and general practice, and on demographic and clinical characteristics including complexity of health problems. Of all participants, 4% were dissatisfied with their GP care, 59% neutral and 37% satisfied. Between these three categories no differences were observed in age, gender, country of birth or education level. The percentage of participants dissatisfied with their GP care increased from 0.4% in those with 0 problem domains to 8% in those with 4 domains, i.e. having complex health problems (p<0.001). Per additional health domain with problems, the risk of being dissatisfied increased 1.7 times (95% CI 1.4-2.14; p<0.001), independent of age, gender, and demographic and clinical parameters (adjusted OR 1.4, 95% CI 1.1-1.8; p = 0.021). In older persons, dissatisfaction with general practice is strongly correlated with rising complexity of health problems, independent of age, demographic and clinical parameters. It remains unclear whether complexity of health problems is a patient characteristic influencing the perception of care, or whether the care is unable to handle the demands of these patients. Prospective studies are needed to investigate the causal associations between care organization, patient characteristics, indicators of quality, and patient perceptions.
Quantifying spatial and temporal trends in beach-dune volumetric changes using spatial statistics
Eamer, Jordan B. R.; Walker, Ian J.
2013-06-01
Spatial statistics are generally underutilized in coastal geomorphology, despite offering great potential for identifying and quantifying spatial-temporal trends in landscape morphodynamics. In particular, local Moran's Ii provides a statistical framework for detecting clusters of significant change in an attribute (e.g., surface erosion or deposition) and quantifying how this changes over space and time. This study analyzes and interprets spatial-temporal patterns in sediment volume changes in a beach-foredune-transgressive dune complex following removal of invasive marram grass (Ammophila spp.). Results are derived by detecting significant changes in post-removal repeat DEMs derived from topographic surveys and airborne LiDAR. The study site was separated into discrete, linked geomorphic units (beach, foredune, transgressive dune complex) to facilitate sub-landscape scale analysis of volumetric change and sediment budget responses. Difference surfaces derived from a pixel-subtraction algorithm between interval DEMs and the LiDAR baseline DEM were filtered using the local Moran's Ii method and two different spatial weights (1.5 and 5 m) to detect statistically significant change. Moran's Ii results were compared with those derived from a more spatially uniform statistical method that uses a simpler student's t distribution threshold for change detection. Morphodynamic patterns and volumetric estimates were similar between the uniform geostatistical method and Moran's Ii at a spatial weight of 5 m while the smaller spatial weight (1.5 m) consistently indicated volumetric changes of less magnitude. The larger 5 m spatial weight was most representative of broader site morphodynamics and spatial patterns while the smaller spatial weight provided volumetric changes consistent with field observations. All methods showed foredune deflation immediately following removal with increased sediment volumes into the spring via deposition at the crest and on lobes in the lee
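A minimal numpy sketch of the local Moran's I_i statistic on a gridded difference surface, with binary distance-band weights echoing the study's smaller (1.5 m) spatial weight; real analyses would add permutation-based significance testing (e.g., via PySAL), and all data here are synthetic.

```python
import numpy as np

def local_morans_i(values, coords, bandwidth):
    """Local Moran's I_i with row-standardized binary distance-band weights."""
    z = (values - values.mean()) / values.std()
    I = np.empty(len(z))
    for i in range(len(z)):
        d = np.linalg.norm(coords - coords[i], axis=1)
        w = ((d > 0) & (d <= bandwidth)).astype(float)
        if w.sum() > 0:
            w /= w.sum()
        I[i] = z[i] * w.dot(z)     # I_i = z_i * sum_j w_ij z_j
    return I

# toy DEM-of-difference on a 20 x 20 grid (elevation change, metres)
rng = np.random.default_rng(5)
dz = rng.normal(0, 0.1, (20, 20))
dz[5:9, 5:9] += 0.5                # an artificial deposition cluster
ys, xs = np.mgrid[0:20, 0:20]
coords = np.column_stack([xs.ravel(), ys.ravel()]).astype(float)
I = local_morans_i(dz.ravel(), coords, bandwidth=1.5).reshape(20, 20)
print("mean local I inside the cluster:", I[5:9, 5:9].mean())
```

High positive I_i values flag cells embedded in clusters of similar change, which is how significant erosion or deposition patches are isolated from background noise.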
[The research protocol VI: How to choose the appropriate statistical test. Inferential statistics].
Flores-Ruiz, Eric; Miranda-Novales, María Guadalupe; Villasís-Keever, Miguel Ángel
2017-01-01
The statistical analysis can be divided in two main components: descriptive analysis and inferential analysis. An inference is to elaborate conclusions from the tests performed with the data obtained from a sample of a population. Statistical tests are used in order to establish the probability that a conclusion obtained from a sample is applicable to the population from which it was obtained. However, choosing the appropriate statistical test in general poses a challenge for novice researchers. To choose the statistical test it is necessary to take into account three aspects: the research design, the number of measurements and the scale of measurement of the variables. Statistical tests are divided into two sets, parametric and nonparametric. Parametric tests can only be used if the data show a normal distribution. Choosing the right statistical test will make it easier for readers to understand and apply the results.
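For the simplest case, two independent groups measured on a continuous scale, the decision logic reduces to a normality check followed by a parametric or nonparametric test. A minimal sketch (the cut-offs and tests chosen here are common conventions, not prescriptions from the article):

```python
from scipy import stats

def compare_two_groups(a, b, alpha=0.05):
    """Welch's t-test if both groups pass Shapiro-Wilk, else Mann-Whitney U."""
    normal = (stats.shapiro(a).pvalue > alpha and
              stats.shapiro(b).pvalue > alpha)
    if normal:
        name, res = "Welch t-test", stats.ttest_ind(a, b, equal_var=False)
    else:
        name, res = "Mann-Whitney U", stats.mannwhitneyu(a, b, alternative="two-sided")
    return name, res.pvalue

a = [5.1, 4.9, 5.4, 5.0, 5.2, 4.8, 5.3]
b = [5.6, 5.8, 5.5, 5.9, 5.7, 6.0, 5.4]
print(compare_two_groups(a, b))
```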
More than a sum of its parts: A Keynesian epistemology of statistics
Directory of Open Access Journals (Sweden)
Nicholas Werle
2011-05-01
Full Text Available The major theoretical insight of Keynes’ General Theory is that aggregate quantities describing the state of an economy as a whole are irreducible to arithmetic summations of individual decisions. This breaks with the logic of classical political economy and establishes macroeconomics as the study of economy-wide dynamics, logically independent from any underlying theory of individual rationality. However, Keynes does have a theory of individual psychology that links expectations back up to aggregate quantities with robust statistical methods, which account for the fundamental uncertainty one faces in predicting the future. By comparing the theoretical structure of macroeconomics to that of thermodynamics and statistical mechanics, this essay proposes a novel reading of Keynes’ epistemology of statistical laws. On this view, statistical methods allow theoreticians to connect the mechanics of vast numbers of micro-scale entities to a macroscale dynamics, even in the absence of a fully determinate causal story. Keynes’ belief that organic wholes emerge from the interactions of complex systems is a product of his early work on the development of statistical mechanics from kinetic theory. In light of this epistemological foundation, this essay shows how the neoclassical idea of supplying macroeconomics with microfoundations is inherently contradictory.
50 CFR 600.415 - Access to statistics.
2010-10-01
... 50 Wildlife and Fisheries 8 2010-10-01 2010-10-01 false Access to statistics. 600.415 Section 600... ADMINISTRATION, DEPARTMENT OF COMMERCE MAGNUSON-STEVENS ACT PROVISIONS Confidentiality of Statistics § 600.415 Access to statistics. (a) General. In determining whether to grant a request for access to confidential...
Gessner, Manuel; Breuer, Heinz-Peter
2013-04-01
We obtain exact analytic expressions for a class of functions expressed as integrals over the Haar measure of the unitary group in d dimensions. Based on these general mathematical results, we investigate generic dynamical properties of complex open quantum systems, employing arguments from ensemble theory. We further generalize these results to arbitrary eigenvalue distributions, allowing a detailed comparison of typical regular and chaotic systems with the help of concepts from random matrix theory. To illustrate the physical relevance and the general applicability of our results we present a series of examples related to the fields of open quantum systems and nonequilibrium quantum thermodynamics. These include the effect of initial correlations, the average quantum dynamical maps, the generic dynamics of system-environment pure state entanglement and, finally, the equilibration of generic open and closed quantum systems.
Directory of Open Access Journals (Sweden)
Wright F
2012-10-01
Full Text Available Abstract Background Electronic linkage to routine administrative datasets, such as the Hospital Episode Statistics (HES) in England, is increasingly used in medical research. Relatively little is known about the reliability of HES diagnostic information for epidemiological studies. In the United Kingdom (UK), general practitioners hold comprehensive records for individuals relating to their primary, secondary and tertiary care. For a random sample of participants in a large UK cohort, we compared vascular disease diagnoses in HES and general practice records to assess agreement between the two sources. Methods Million Women Study participants with a HES record of hospital admission with vascular disease (ischaemic heart disease [ICD-10 codes I20-I25], cerebrovascular disease [G45, I60-I69] or venous thromboembolism [I26, I80-I82]) between April 1st 1997 and March 31st 2005 were identified. In each broad diagnostic group and in women with no such HES diagnoses, a random sample of about a thousand women was selected for study. We asked each woman’s general practitioner to provide information on her history of vascular disease and this information was compared with the HES diagnosis record. Results Over 90% of study forms sent to general practitioners were returned and 88% of these contained analysable data. For the vast majority of study participants for whom information was available, diagnostic information from general practice and HES records was consistent. Overall, for 93% of women with a HES diagnosis of vascular disease, general practice records agreed with the HES diagnosis; and for 97% of women with no HES diagnosis of vascular disease, the general practitioner had no record of a diagnosis of vascular disease. For severe vascular disease, including myocardial infarction (I21-22), stroke, both overall (I60-64) and by subtype, and pulmonary embolism (I26), HES records appeared to be both reliable and complete. Conclusion Hospital admission data
Statistical selection : a way of thinking !
Laan, van der P.; Aarts, E.H.L.; Eikelder, ten H.M.M.; Hemerik, C.; Rem, M.
1995-01-01
Statistical selection of the best population is discussed in general terms and the principles of statistical selection procedures are presented. Advantages and disadvantages of Subset Selection, one of the main approaches, are indicated. The selection of an almost best population is considered and
DEFF Research Database (Denmark)
Lindström, Erik; Madsen, Henrik; Nielsen, Jan Nygaard
Statistics for Finance develops students’ professional skills in statistics with applications in finance. Developed from the authors’ courses at the Technical University of Denmark and Lund University, the text bridges the gap between classical, rigorous treatments of financial mathematics...... that rarely connect concepts to data and books on econometrics and time series analysis that do not cover specific problems related to option valuation. The book discusses applications of financial derivatives pertaining to risk assessment and elimination. The authors cover various statistical...... and mathematical techniques, including linear and nonlinear time series analysis, stochastic calculus models, stochastic differential equations, Itō’s formula, the Black–Scholes model, the generalized method-of-moments, and the Kalman filter. They explain how these tools are used to price financial derivatives...
International Nuclear Information System (INIS)
Brekke, L.; Imbo, T.D.
1992-01-01
The authors study the inequivalent quantizations of (1 + 1)-dimensional nonlinear sigma models with space manifold S¹ and target manifold X. If X is multiply connected, these models possess topological solitons. After providing a definition of spin and statistics for these solitons and demonstrating a spin-statistics correlation, we give various examples where the solitons can have exotic statistics. In some of these models, the solitons may obey a generalized version of fractional statistics called ambistatistics. In this paper the relevance of these 2d models to the statistics of vortices in (2 + 1)-dimensional spontaneously broken gauge theories is discussed. The authors close with a discussion concerning the extension of our results to higher dimensions
Towards better error statistics for atmospheric inversions of methane surface fluxes
Directory of Open Access Journals (Sweden)
A. Berchet
2013-07-01
Full Text Available We adapt general statistical methods to estimate the optimal error covariance matrices in a regional inversion system inferring methane surface emissions from atmospheric concentrations. Using a minimal set of physical hypotheses on the patterns of errors, we compute a guess of the error statistics that is optimal in regard to objective statistical criteria for the specific inversion system. With this very general approach applied to a real-data case, we recover sources of errors in the observations and in the prior state of the system that are consistent with expert knowledge while being inferred from objective criteria and with affordable computation costs. By not assuming any specific error patterns, our results depict the variability and the inter-dependency of errors induced by complex factors such as the misrepresentation of the observations in the transport model or the inability of the model to reproduce well the situations of steep gradients of concentrations. Situations with probable significant biases (e.g., during the night when vertical mixing is ill-represented by the transport model can also be diagnosed by our methods in order to point to necessary improvements in the model. By additionally analysing the sensitivity of the inversion to each observation, guidelines to enhance data selection in regional inversions are also proposed. We applied our method to a recent significant accidental methane release from an offshore platform in the North Sea and found methane fluxes of the same magnitude as what was officially declared.
Palozzi, Jason; Pantopoulos, George; Maravelis, Angelos G.; Nordsvan, Adam; Zelilidis, Avraam
2018-02-01
This investigation presents an outcrop-based integrated study of internal division analysis and statistical treatment of turbidite bed thickness applied to a Carboniferous deep-water channel-levee complex in the Myall Trough, southeast Australia. Turbidite beds of the studied succession are characterized by a range of sedimentary structures grouped into two main associations, a thick-bedded and a thin-bedded one, that reflect channel-fill and overbank/levee deposits, respectively. Three vertically stacked channel-levee cycles have been identified. Results of statistical analysis of bed thickness, grain-size and internal division patterns applied to the studied channel-levee succession indicate that turbidite bed thickness data seem to be well characterized by a bimodal lognormal distribution, possibly reflecting the difference between deposition from lower-density flows (in a levee/overbank setting) and very high-density flows (in a channel-fill setting). Power law and exponential distributions were observed to hold only for the thick-bedded parts of the succession and cannot characterize the whole bed thickness range of the studied sediments. The succession also exhibits non-random clustering of bed thickness and grain-size measurements. The studied sediments are also characterized by the presence of statistically detected fining-upward sandstone packets. A novel quantitative approach (change-point analysis) is proposed for the detection of those packets. Markov permutation statistics also revealed the existence of order in the alternation of internal divisions in the succession, expressed by an optimal internal division cycle reflecting two main types of gravity flow events deposited within both thick-bedded conglomeratic and thin-bedded sandstone associations. The analytical methods presented in this study can be used as additional tools for quantitative analysis and recognition of depositional environments in hydrocarbon-bearing research of ancient
Learning the Language of Statistics: Challenges and Teaching Approaches
Dunn, Peter K.; Carey, Michael D.; Richardson, Alice M.; McDonald, Christine
2016-01-01
Learning statistics requires learning the language of statistics. Statistics draws upon words from general English, mathematical English, discipline-specific English and words used primarily in statistics. This leads to many linguistic challenges in teaching statistics and the way in which the language is used in statistics creates an extra layer…
Best Statistical Distribution of flood variables for Johor River in Malaysia
Salarpour Goodarzi, M.; Yusop, Z.; Yusof, F.
2012-12-01
A complex flood event is always characterized by a few characteristics such as flood peak, flood volume, and flood duration, which might be mutually correlated. This study explored the statistical distribution of peakflow, flood duration and flood volume at Rantau Panjang gauging station on the Johor River in Malaysia. Hourly data were recorded for 45 years. The data were analysed based on water year (July - June). Five distributions, namely Log Normal, Generalized Pareto, Log Pearson, Normal and Generalized Extreme Value (GEV), were used to model the distribution of all three variables. Anderson-Darling and Kolmogorov-Smirnov goodness-of-fit tests were used to evaluate the best fit. Goodness-of-fit tests at the 5% level of significance indicate that all the models can be used to model the distribution of peakflow, flood duration and flood volume. However, the Generalized Pareto distribution is found to be the most suitable model when tested with the Anderson-Darling test, while the Kolmogorov-Smirnov test suggested that GEV is best for peakflow. The results of this research can be used to improve flood frequency analysis. (Figure: comparison of the Generalized Extreme Value, Generalized Pareto and Log Pearson cumulative distribution functions of peakflow.)
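A hedged sketch of the model-comparison step this abstract describes: fit candidate distributions to a peakflow series and rank them with a Kolmogorov-Smirnov test. The data here are a synthetic stand-in for the study's 45 annual peaks, and the Anderson-Darling comparison and the log transform behind Log Pearson are omitted for brevity.

```python
# Fit several candidate distributions and compare goodness of fit via KS test.
import numpy as np
from scipy import stats

# Synthetic stand-in for 45 years of annual peak flows (m^3/s).
peakflow = stats.genextreme.rvs(-0.1, loc=300, scale=120, size=45, random_state=0)

candidates = {
    "Generalized Extreme Value": stats.genextreme,
    "Generalized Pareto": stats.genpareto,
    "Log Normal": stats.lognorm,
    "Normal": stats.norm,
}
for name, dist in candidates.items():
    params = dist.fit(peakflow)                        # maximum-likelihood fit
    ks = stats.kstest(peakflow, dist.cdf, args=params) # one-sample KS test
    print(f"{name:26s} KS = {ks.statistic:.3f}  p = {ks.pvalue:.3f}")
```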
International Nuclear Information System (INIS)
Khrennikov, A.
2005-01-01
We constructed the representation of contextual probabilistic dynamics in the complex Hilbert space. Thus the dynamics of the wave function can be considered as the Hilbert space projection of realistic dynamics in a prespace. The basic condition for representing the prespace dynamics is the law of statistical conservation of energy (conservation of probabilities). The construction of the dynamical representation is an important step in the development of the contextual statistical viewpoint of quantum processes. But the contextual statistical model is essentially more general than the quantum one. Therefore in general the Hilbert space projection of the prespace dynamics can be nonlinear and even irreversible (but it is always unitary). Conditions for linearity and reversibility of the Hilbert space dynamical projection were found. We also found conditions for the conventional Schrödinger dynamics (including time-dependent Hamiltonians). We remark that in general even the Schrödinger dynamics is based just on the statistical conservation of energy; for individual systems the law of conservation of energy can be violated (at least in our theoretical model)
29 CFR 1614.601 - EEO group statistics.
2010-07-01
... 29 Labor 4 2010-07-01 2010-07-01 false EEO group statistics. 1614.601 Section 1614.601 Labor... EMPLOYMENT OPPORTUNITY Matters of General Applicability § 1614.601 EEO group statistics. (a) Each agency... provided by an employee is inaccurate, the agency shall advise the employee about the solely statistical...
12 CFR 268.601 - EEO group statistics.
2010-01-01
... 12 Banks and Banking 3 2010-01-01 2010-01-01 false EEO group statistics. 268.601 Section 268.601... RULES REGARDING EQUAL OPPORTUNITY Matters of General Applicability § 268.601 EEO group statistics. (a... solely statistical purpose for which the data is being collected, the need for accuracy, the Board's...
Statistical Data Processing with R – Metadata Driven Approach
Directory of Open Access Journals (Sweden)
Rudi SELJAK
2016-06-01
Full Text Available In recent years the Statistical Office of the Republic of Slovenia has put a lot of effort into re-designing its statistical process. We replaced the classical stove-pipe oriented production system with general software solutions, based on the metadata driven approach. This means that one general program code, which is parametrized with process metadata, is used for data processing for a particular survey. Currently, the general program code is entirely based on SAS macros, but in the future we would like to explore how successfully the statistical software R can be used for this approach. The paper describes the metadata driven principle for data validation, the generic software solution and the main issues connected with the use of the statistical software R for this approach.
The Entropy of Non-Ergodic Complex Systems — a Derivation from First Principles
Thurner, Stefan; Hanel, Rudolf
In information theory the four Shannon-Khinchin (SK) axioms determine Boltzmann-Gibbs entropy, S = -∑_i p_i log p_i, as the unique entropy. Physics is different from information in the sense that physical systems can be non-ergodic or non-Markovian. To characterize such strongly interacting, statistical systems - complex systems in particular - within a thermodynamical framework it might be necessary to introduce generalized entropies. A series of such entropies have been proposed in the past decades. Until now the understanding of their fundamental origin and their deeper relations to complex systems remains unclear. To clarify the situation we note that non-ergodicity explicitly violates the fourth SK axiom. We show that by relaxing this axiom the entropy generalizes to S = ∑_i Γ(d + 1, 1 - c log p_i), where Γ is the incomplete Gamma function, and c and d are scaling exponents. All recently proposed entropies compatible with the first 3 SK axioms appear to be special cases. We prove that each statistical system is uniquely characterized by the pair of the two scaling exponents (c, d), which defines equivalence classes for all systems. The corresponding distribution functions are special forms of Lambert-W exponentials containing, as special cases, Boltzmann, stretched exponential and Tsallis distributions (power-laws) - all widely abundant in nature. This derivation is the first ab initio justification for generalized entropies. We next show how the phasespace volume of a system is related to its generalized entropy, and provide a concise criterion when it is not of Boltzmann-Gibbs type but assumes a generalized form. We show that generalized entropies only become relevant when the dynamically (statistically) relevant fraction of degrees of freedom in a system vanishes in the thermodynamic limit. These are systems where the bulk of the degrees of freedom is frozen. Systems governed by generalized entropies are therefore systems whose phasespace volume effectively
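A minimal numerical sketch (ours, not the authors') of the generalized (c,d)-entropy quoted above, using SciPy's regularized upper incomplete gamma function, with Γ(a, x) = gamma(a)·gammaincc(a, x):

```python
# Sketch of S_{c,d} ~ sum_i Gamma(d+1, 1 - c*log(p_i)).
import numpy as np
from scipy.special import gamma, gammaincc

def entropy_cd(p, c, d):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # ignore zero-probability states
    x = 1.0 - c * np.log(p)            # argument of the incomplete gamma (x >= 1 here)
    return np.sum(gamma(d + 1.0) * gammaincc(d + 1.0, x))

# Sanity check: (c, d) = (1, 1) gives (2 + S_BG)/e for a uniform distribution,
# i.e. Boltzmann-Gibbs entropy up to additive and multiplicative constants.
p = np.full(8, 1 / 8)
print(entropy_cd(p, c=1.0, d=1.0))          # ~1.50
print((2.0 + np.log(8.0)) / np.e)           # ~1.50
```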
The Generalization Complexity Measure for Continuous Input Data
Directory of Open Access Journals (Sweden)
Iván Gómez
2014-01-01
The Generalization Complexity measure, defined in Boolean space, quantifies the complexity of data in relation to the prediction accuracy that can be expected when using a supervised classifier like a neural network, SVM, and so forth. We first extend the original measure for use with continuous functions and then, using an approach based on the set of Walsh functions, consider the case of having a finite number of data points (input/output pairs), which is usually the practical case. Using a set of trigonometric functions, a model is constructed that relates the complexity to the size of the hidden layer of a neural network. Finally, we demonstrate the application of the introduced complexity measure, by using the generated model, to the problem of estimating an adequate neural network architecture for real-world data sets.
Statistical assessment on a combined analysis of GRYN-ROMN-UCBN upland vegetation vital signs
Irvine, Kathryn M.; Rodhouse, Thomas J.
2014-01-01
As of 2013, Rocky Mountain and Upper Columbia Basin Inventory and Monitoring Networks have multiple years of vegetation data and Greater Yellowstone Network has three years of vegetation data and monitoring is ongoing in all three networks. Our primary objective is to assess whether a combined analysis of these data aimed at exploring correlations with climate and weather data is feasible. We summarize the core survey design elements across protocols and point out the major statistical challenges for a combined analysis at present. The dissimilarity in response designs between ROMN and UCBN-GRYN network protocols presents a statistical challenge that has not been resolved yet. However, the UCBN and GRYN data are compatible as they implement a similar response design; therefore, a combined analysis is feasible and will be pursued in future. When data collected by different networks are combined, the survey design describing the merged dataset is (likely) a complex survey design. A complex survey design is the result of combining datasets from different sampling designs. A complex survey design is characterized by unequal probability sampling, varying stratification, and clustering (see Lohr 2010, Chapter 7, for a general overview). Statistical analysis of complex survey data requires modifications to standard methods, one of which is to include survey design weights within a statistical model. We focus on this issue for a combined analysis of upland vegetation from these networks, leaving other topics for future research. We conduct a simulation study on the possible effects of equal versus unequal probability selection of points on parameter estimates of temporal trend using available packages within the R statistical computing environment. We find that, as written, using lmer or lm for trend detection in a continuous response and clm and clmm for visually estimated cover classes with “raw” GRTS design weights specified for the weight argument leads to substantially
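The simulation study above works in R (lm/lmer, clm/clmm with GRTS weights); purely as an illustration of the underlying idea of carrying survey design weights into a trend model, here is a sketch in Python using weighted least squares on synthetic data.

```python
# Design-weighted trend estimation sketch (Python analogue; data are hypothetical).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
year = np.repeat(np.arange(2010, 2014), 25)            # 4 annual panels of 25 sites
cover = 30 + 0.8 * (year - 2010) + rng.normal(0, 5, year.size)
w = rng.uniform(0.5, 2.0, year.size)                   # unequal inclusion -> design weights

X = sm.add_constant(year - 2010)
fit = sm.WLS(cover, X, weights=w).fit()
print(fit.params)    # intercept and yearly trend estimate
print(fit.bse)       # naive SEs; full complex-design variance needs more care
```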
Statistical downscaling of precipitation using long short-term memory recurrent neural networks
Misra, Saptarshi; Sarkar, Sudeshna; Mitra, Pabitra
2017-11-01
Hydrological impacts of global climate change on a regional scale are generally assessed by downscaling large-scale climatic variables, simulated by General Circulation Models (GCMs), to regional, small-scale hydrometeorological variables like precipitation, temperature, etc. In this study, we propose a new statistical downscaling model based on a Recurrent Neural Network with Long Short-Term Memory which captures the spatio-temporal dependencies in local rainfall. Previous studies have used several other methods such as linear regression, quantile regression, kernel regression, beta regression, and artificial neural networks. Deep neural networks and recurrent neural networks have been shown to be highly promising in modeling complex and highly non-linear relationships between input and output variables in different domains and hence we investigated their performance in the task of statistical downscaling. We have tested this model on two datasets—one on precipitation in the Mahanadi basin in India and the second on precipitation in the Campbell River basin in Canada. Our autoencoder coupled long short-term memory recurrent neural network model performs the best compared to other existing methods on both the datasets with respect to temporal cross-correlation, mean squared error, and capturing the extremes.
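A minimal sketch of the kind of sequence model described, assuming PyTorch; the window length, predictor count and hidden size are illustrative, and the paper's autoencoder coupling is omitted.

```python
# LSTM that maps a window of large-scale GCM predictors to local precipitation.
import torch
import torch.nn as nn

class DownscalingLSTM(nn.Module):
    def __init__(self, n_predictors, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_predictors, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)       # local precipitation for the last day

    def forward(self, x):                      # x: (batch, time, n_predictors)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :]).squeeze(-1)

model = DownscalingLSTM(n_predictors=16)
x = torch.randn(8, 30, 16)                     # 8 samples, 30-day windows
print(model(x).shape)                          # torch.Size([8])
```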
50 CFR 600.410 - Collection and maintenance of statistics.
2010-10-01
... 50 Wildlife and Fisheries 8 2010-10-01 2010-10-01 false Collection and maintenance of statistics... of Statistics § 600.410 Collection and maintenance of statistics. (a) General. (1) All statistics..., the Assistant Administrator will remove all identifying particulars from the statistics if doing so is...
Generalized massive optimal data compression
Alsing, Justin; Wandelt, Benjamin
2018-05-01
In this paper, we provide a general procedure for optimally compressing N data down to n summary statistics, where n is equal to the number of parameters of interest. We show that compression to the score function - the gradient of the log-likelihood with respect to the parameters - yields n compressed statistics that are optimal in the sense that they preserve the Fisher information content of the data. Our method generalizes earlier work on linear Karhunen-Loève compression for Gaussian data whilst recovering both lossless linear compression and quadratic estimation as special cases when they are optimal. We give a unified treatment that also includes the general non-Gaussian case as long as mild regularity conditions are satisfied, producing optimal non-linear summary statistics when appropriate. As a worked example, we derive explicitly the n optimal compressed statistics for Gaussian data in the general case where both the mean and covariance depend on the parameters.
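For the simplest special case covered by this framework (N Gaussian draws with unknown mean and known variance), compression to the score function reduces to a one-line statistic; a sketch under those assumptions:

```python
# Score compression: N data points -> one statistic per parameter of interest.
import numpy as np

def score_compress(x, mu0, sigma):
    # d/dmu log L = sum_i (x_i - mu0) / sigma^2  -> a single number for mu
    return np.sum(x - mu0) / sigma**2

rng = np.random.default_rng(0)
x = rng.normal(3.0, 2.0, size=1000)            # N = 1000 data points
t = score_compress(x, mu0=2.5, sigma=2.0)      # n = 1 compressed statistic
print(t)   # one summary carrying the Fisher information about mu
```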
Microcanonical ensemble and algebra of conserved generators for generalized quantum dynamics
International Nuclear Information System (INIS)
Adler, S.L.; Horwitz, L.P.
1996-01-01
It has recently been shown, by application of statistical mechanical methods to determine the canonical ensemble governing the equilibrium distribution of operator initial values, that complex quantum field theory can emerge as a statistical approximation to an underlying generalized quantum dynamics. This result was obtained by an argument based on a Ward identity analogous to the equipartition theorem of classical statistical mechanics. We construct here a microcanonical ensemble which forms the basis of this canonical ensemble. This construction enables us to define the microcanonical entropy and free energy of the field configuration of the equilibrium distribution and to study the stability of the canonical ensemble. We also study the algebraic structure of the conserved generators from which the microcanonical and canonical ensembles are constructed, and the flows they induce on the phase space. copyright 1996 American Institute of Physics
Statistical Literacy in the Data Science Workplace
Grant, Robert
2017-01-01
Statistical literacy, the ability to understand and make use of statistical information including methods, has particular relevance in the age of data science, when complex analyses are undertaken by teams from diverse backgrounds. Not only is it essential to communicate to the consumers of information but also within the team. Writing from the…
POWER ANALYSIS FOR COMPLEX MEDIATIONAL DESIGNS USING MONTE CARLO METHODS
Thoemmes, Felix; MacKinnon, David P.; Reiser, Mark R.
2010-01-01
Applied researchers often include mediation effects in applications of advanced methods such as latent variable models and linear growth curve models. Guidance on how to estimate statistical power to detect mediation for these models has not yet been addressed in the literature. We describe a general framework for power analyses for complex mediational models. The approach is based on the well known technique of generating a large number of samples in a Monte Carlo study, and estimating power...
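A hedged sketch of the Monte Carlo approach described, for the simplest single-mediator model and a joint-significance test of the indirect effect; effect sizes, sample size and replication count are illustrative.

```python
# Monte Carlo power for the indirect effect in X -> M -> Y.
import numpy as np
import statsmodels.api as sm
from scipy import stats

def mediation_power(n=100, a=0.3, b=0.3, reps=1000, alpha=0.05, seed=1):
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(reps):
        x = rng.normal(size=n)
        m = a * x + rng.normal(size=n)          # path a
        y = b * m + rng.normal(size=n)          # path b
        p_a = stats.linregress(x, m).pvalue
        fit = sm.OLS(y, sm.add_constant(np.column_stack([m, x]))).fit()
        p_b = fit.pvalues[1]                    # coefficient of m, given x
        hits += (p_a < alpha) and (p_b < alpha) # joint significance
    return hits / reps                           # estimated power

print(mediation_power())
```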
Nigmatullin, Raoul R.; Maione, Guido; Lino, Paolo; Saponaro, Fabrizio; Zhang, Wei
2017-01-01
In this paper, we suggest a general theory that makes it possible to describe experiments associated with reproducible or quasi-reproducible data reflecting the dynamical and self-similar properties of a wide class of complex systems. By a complex system we understand a system for which a model based on microscopic principles and suppositions about the nature of the matter is absent. Such a microscopic model is usually identified as "the best fit" model. The behavior of the complex system relative to a control variable (time, frequency, wavelength, etc.) can be described in terms of the so-called intermediate model (IM). One can prove that the fitting parameters of the IM are associated with the amplitude-frequency response of the segment of the Prony series. The segment of the Prony series, including the set of the decomposition coefficients and the set of the exponential functions (with k = 1,2,…,K), is limited by the final mode K. The exponential functions of this decomposition depend on time and are found by the original algorithm described in the paper. This approach serves as a logical continuation of the results obtained earlier in [Nigmatullin RR, Zhang W and Striccoli D. General theory of experiment containing reproducible data: The reduction to an ideal experiment. Commun Nonlinear Sci Numer Simul, 27, (2015), pp 175-192] for reproducible experiments and includes the previous results as a partial case. In this paper, we consider a more complex case when the available data can create short samplings or exhibit some instability during the process of measurements. We give some justified evidence and conditions proving the validity of this theory for the description of a wide class of complex systems in terms of the reduced set of the fitting parameters belonging to the segment of the Prony series. The elimination of uncontrollable factors expressed in the form of the apparatus function is discussed. To illustrate how to apply the theory and take advantage of its
Arizona Public Library Statistics, 1999-2000.
Arizona State Dept. of Library, Archives and Public Records, Phoenix.
These statistics were compiled from information supplied by Arizona's public libraries. The document is divided according to the following county groups: Apache, Cochise; Coconino, Gila; Graham, Greenlee, La Paz; Maricopa; Mohave, Navajo; Pima, Pinal; Santa Cruz, Yavapai; Yuma. Statistics are presented on the following: general information;…
Research design and statistical analysis
Myers, Jerome L; Lorch Jr, Robert F
2013-01-01
Research Design and Statistical Analysis provides comprehensive coverage of the design principles and statistical concepts necessary to make sense of real data. The book's goal is to provide a strong conceptual foundation to enable readers to generalize concepts to new research situations. Emphasis is placed on the underlying logic and assumptions of the analysis and what it tells the researcher, the limitations of the analysis, and the consequences of violating assumptions. Sampling, design efficiency, and statistical models are emphasized throughout. As per APA recommendations
Targeted learning in data science causal inference for complex longitudinal studies
van der Laan, Mark J
2018-01-01
This textbook for graduate students in statistics, data science, and public health deals with the practical challenges that come with big, complex, and dynamic data. It presents a scientific roadmap to translate real-world data science applications into formal statistical estimation problems by using the general template of targeted maximum likelihood estimators. These targeted machine learning algorithms estimate quantities of interest while still providing valid inference. Targeted learning methods within data science are a critical component for solving scientific problems in the modern age. The techniques can answer complex questions including optimal rules for assigning treatment based on longitudinal data with time-dependent confounding, as well as other estimands in dependent data structures, such as networks. Included in Targeted Learning in Data Science are demonstrations with software packages and real data sets that present a case that targeted learning is crucial for the next generatio...
Directory of Open Access Journals (Sweden)
Amany E. Aly
2016-04-01
Full Text Available In a system consisting of independent components of the same type, appropriate actions may be taken as soon as a portion of the components have failed. It is, therefore, important to be able to predict later failure times from earlier ones. One of the well-known failure distributions commonly used to model component life is the modified Weibull distribution (MWD). In this paper, two pivotal quantities are proposed to construct prediction intervals for future unobservable lifetimes based on generalized order statistics (gos) from the MWD. Moreover, a pivotal quantity is developed to reconstruct missing observations at the beginning of the experiment. Furthermore, Monte Carlo simulation studies are conducted and numerical computations are carried out to investigate the efficiency of the presented results. Finally, two illustrative examples for real data sets are analyzed.
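The exact pivotal construction is specific to the paper; as a rough illustration of the prediction problem it addresses, the sketch below predicts the next failure time from the first r order statistics using a plug-in Weibull fit and the fact that, conditional on X_(r), the remaining lifetimes are i.i.d. from the fitted law truncated at X_(r). The ordinary two-parameter Weibull stands in for the modified Weibull.

```python
# Monte Carlo prediction interval for the next failure time X_(r+1).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, r = 20, 8                                      # n units on test, first r failures seen
true = stats.weibull_min(c=1.5, scale=100.0)
x = np.sort(true.rvs(n, random_state=rng))[:r]    # observed early failures

c_hat, _, sc_hat = stats.weibull_min.fit(x, floc=0)   # naive plug-in fit
fitted = stats.weibull_min(c=c_hat, scale=sc_hat)

# Conditional on X_(r), remaining lifetimes follow the fitted law truncated at X_(r):
u_lo = fitted.cdf(x[-1])
sims = []
for _ in range(5000):
    u = rng.uniform(u_lo, 1.0, size=n - r)        # inverse-CDF sampling above X_(r)
    sims.append(np.min(fitted.ppf(u)))            # next failure time
lo, hi = np.percentile(sims, [2.5, 97.5])
print(f"95% prediction interval for the next failure: ({lo:.1f}, {hi:.1f})")
```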
Aspects of modern fracture statistics
International Nuclear Information System (INIS)
Tradinik, W.; Pabst, R.F.; Kromp, K.
1981-01-01
This contribution begins with introductory general remarks about fracture statistics. Then the fundamentals of the distribution of fracture probability are described. In the following part the application of the Weibull statistics is justified. In the fourth chapter the microstructure of the material is considered in connection with calculations made in order to determine the fracture probability or risk of fracture.
Computational statistics handbook with Matlab
Martinez, Wendy L
2007-01-01
Prefaces Introduction What Is Computational Statistics? An Overview of the Book Probability Concepts Introduction Probability Conditional Probability and Independence Expectation Common Distributions Sampling Concepts Introduction Sampling Terminology and Concepts Sampling Distributions Parameter Estimation Empirical Distribution Function Generating Random Variables Introduction General Techniques for Generating Random Variables Generating Continuous Random Variables Generating Discrete Random Variables Exploratory Data Analysis Introduction Exploring Univariate Data Exploring Bivariate and Trivariate Data Exploring Multidimensional Data Finding Structure Introduction Projecting Data Principal Component Analysis Projection Pursuit EDA Independent Component Analysis Grand Tour Nonlinear Dimensionality Reduction Monte Carlo Methods for Inferential Statistics Introduction Classical Inferential Statistics Monte Carlo Methods for Inferential Statist...
Directory of Open Access Journals (Sweden)
Peter W. Egolf
2018-02-01
Full Text Available The extended thermodynamics of Tsallis is reviewed in detail and applied to turbulence. It is based on a generalization of the exponential and logarithmic functions with a parameter q. By applying this nonequilibrium thermodynamics, the Boltzmann-Gibbs thermodynamic approach of Kraichnan to 2-d turbulence is generalized. This physical modeling implies fractional calculus methods, obeying anomalous diffusion, described by Lévy statistics with q < 5/3 (sub-diffusion), q = 5/3 (normal or Brownian diffusion) and q > 5/3 (super-diffusion). The generalized energy spectrum of Kraichnan, occurring at small wave numbers k, now reveals the more general and precise result k^(-q). This corresponds well for q = 5/3 with the Kolmogorov-Oboukhov energy spectrum and for q > 5/3 to turbulence with intermittency. The enstrophy spectrum, occurring at large wave numbers k, leads to a k^(-3q) power law, suggesting that large wave-number eddies are in thermodynamic equilibrium, which is characterized by q = 1, finally resulting in Kraichnan’s correct k^(-3) enstrophy spectrum. The theory reveals in a natural manner a generalized temperature of turbulence, which in the non-equilibrium energy transfer domain decreases with wave number and shows an energy equipartition law with a constant generalized temperature in the equilibrium enstrophy transfer domain. The article contains numerous new results; some are stated in the form of eight new (proven) propositions.
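The q-generalization rests on Tsallis' deformed logarithm and exponential, which reduce to the ordinary log and exp as q → 1; a small self-contained sketch:

```python
# Tsallis q-logarithm and q-exponential (inverse pair for admissible arguments).
import numpy as np

def q_log(x, q):
    x = np.asarray(x, dtype=float)
    if np.isclose(q, 1.0):
        return np.log(x)
    return (x**(1.0 - q) - 1.0) / (1.0 - q)

def q_exp(x, q):
    x = np.asarray(x, dtype=float)
    if np.isclose(q, 1.0):
        return np.exp(x)
    base = np.maximum(1.0 + (1.0 - q) * x, 0.0)   # Tsallis cutoff
    return base**(1.0 / (1.0 - q))

print(q_log(np.e, 1.0), q_exp(1.0, 1.0))   # ordinary log/exp at q = 1
print(q_exp(q_log(2.0, 5/3), 5/3))         # inverse pair: recovers 2.0
```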
International Nuclear Information System (INIS)
Har-Shemesh, Omri; Quax, Rick; Hoekstra, Alfons G; Sloot, Peter M A
2016-01-01
The Fisher–Rao metric from information geometry is related to phase transition phenomena in classical statistical mechanics. Several studies propose to extend the use of information geometry to study more general phase transitions in complex systems. However, it is unclear whether the Fisher–Rao metric does indeed detect these more general transitions, especially in the absence of a statistical model. In this paper we study the transitions between patterns in the Gray-Scott reaction–diffusion model using Fisher information. We describe the system by a probability density function that represents the size distribution of blobs in the patterns and compute its Fisher information with respect to changing the two rate parameters of the underlying model. We estimate the distribution non-parametrically so that we do not assume any statistical model. The resulting Fisher map can be interpreted as a phase-map of the different patterns. Lines with high Fisher information can be considered as boundaries between regions of parameter space where patterns with similar characteristics appear. These lines of high Fisher information can be interpreted as phase transitions between complex patterns.
How to construct the statistic network? An association network of herbaceous
Directory of Open Access Journals (Sweden)
WenJun Zhang
2012-06-01
constructed network increases. Network compactness also follows the trend. In addition, as network compactness and connectance increase, the proportion and number of negative associations decline dramatically. (2) In an association (interaction) network, only a few connections follow the linear relationship. Most connections follow quasi-linear or non-linear relationships. (3) The association networks constructed from partial linear correlation and linear correlation measures are generally scale-free complex networks. The degree of these networks is power-law distributed. (4) Isolated species (families, etc.) are likely important in the statistic network. They are the sink species for shaping a new network after a community is seriously disturbed. (5) Between-taxa connections at a higher taxonomic level are generally weaker than those at a lower taxonomic level.
Learning predictive statistics from temporal sequences: Dynamics and strategies.
Wang, Rui; Shen, Yuan; Tino, Peter; Welchman, Andrew E; Kourtzi, Zoe
2017-10-01
Human behavior is guided by our expectations about the future. Often, we make predictions by monitoring how event sequences unfold, even though such sequences may appear incomprehensible. Event structures in the natural environment typically vary in complexity, from simple repetition to complex probabilistic combinations. How do we learn these structures? Here we investigate the dynamics of structure learning by tracking human responses to temporal sequences that change in structure unbeknownst to the participants. Participants were asked to predict the upcoming item following a probabilistic sequence of symbols. Using a Markov process, we created a family of sequences, from simple frequency statistics (e.g., some symbols are more probable than others) to context-based statistics (e.g., symbol probability is contingent on preceding symbols). We demonstrate the dynamics with which individuals adapt to changes in the environment's statistics; that is, they extract the behaviorally relevant structures to make predictions about upcoming events. Further, we show that this structure learning relates to individual decision strategy; faster learning of complex structures relates to selection of the most probable outcome in a given context (maximizing) rather than matching of the exact sequence statistics. Our findings provide evidence for alternate routes to learning of behaviorally relevant statistics that facilitate our ability to predict future events in variable environments.
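A sketch of the two stimulus families the study contrasts, generated with a zeroth-order (frequency) and a first-order (context-based) Markov process; the alphabet and transition values are illustrative, not the experiment's.

```python
# Frequency statistics vs context-based (first-order Markov) sequence generation.
import numpy as np

rng = np.random.default_rng(7)
symbols = np.array(list("ABCD"))

def frequency_sequence(n, p=(0.4, 0.3, 0.2, 0.1)):
    # symbol probability independent of context
    return "".join(rng.choice(symbols, size=n, p=p))

def context_sequence(n, T):
    # symbol probability contingent on the preceding symbol (rows of T)
    seq = [0]
    for _ in range(n - 1):
        seq.append(rng.choice(4, p=T[seq[-1]]))
    return "".join(symbols[seq])

T = np.array([[0.1, 0.7, 0.1, 0.1],
              [0.1, 0.1, 0.7, 0.1],
              [0.1, 0.1, 0.1, 0.7],
              [0.7, 0.1, 0.1, 0.1]])
print(frequency_sequence(20))
print(context_sequence(20, T))
```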
Multiple commodities in statistical microeconomics: Model and market
Baaquie, Belal E.; Yu, Miao; Du, Xin
2016-11-01
A statistical generalization of microeconomics has been made in Baaquie (2013). In Baaquie et al. (2015), the market behavior of single commodities was analyzed and it was shown that market data provides strong support for the statistical microeconomic description of commodity prices. The case of multiple commodities is studied and a parsimonious generalization of the single commodity model is made for the multiple commodities case. Market data shows that the generalization can accurately model the simultaneous correlation functions of up to four commodities. To accurately model five or more commodities, further terms have to be included in the model. This study shows that the statistical microeconomics approach is a comprehensive and complete formulation of microeconomics, one that is independent of the mainstream formulation of microeconomics.
DEFF Research Database (Denmark)
Bergholdt, Stinne Holm; Gilså Hansen, Dorte; Larsen, Pia Veldt
2013-01-01
' rehabilitation course. OUTCOME MEASURES: 6 months after inclusion of the patient, patient satisfaction with their GP during the last 12 months in five different dimensions of GP care was assessed using the Danish version of the EuroPEP (European Patients Evaluate General Practice Care) questionnaire (DanPEP). 14....... Subgroup analysis of the patients with breast cancer showed statistically significant improvement of satisfaction with the GP in two of the five DanPEP dimensions. CONCLUSIONS: This complex intervention aiming at improving GPs' services in cancer rehabilitation had no impact on patient satisfaction. TRIAL...
Arizona Public Library Statistics, 2000-2001.
Elliott, Jan, Comp.
These statistics were compiled from information supplied by Arizona's public libraries. The document is divided according to the following county groups: Apache, Cochise; Coconino, Gila; Graham, Greenlee, La Paz; Maricopa; Mohave, Navajo; Pima, Pinal; Santa Cruz, Yavapai; and Yuma. Statistics are presented on the following: general information;…
DEFF Research Database (Denmark)
Zhang, Shun; Yang, Yi; Hanson, Steen Grüner
2015-01-01
for the superiority of the proposed PSVC technique, we study the statistical properties of the spatial derivatives of the complex signal representation generated from the Riesz transform. Under the assumption of a Gaussian random process, a theoretical analysis for the pseudo Stokes vector correlation has been...... provided. Based on these results, we show mathematically that PSVC has a performance advantage over conventional intensity-based correlation technique....
Directory of Open Access Journals (Sweden)
Brion Philippe
2015-12-01
Full Text Available Using as much administrative data as possible is a general trend among most national statistical institutes. Different kinds of administrative sources, from tax authorities or other administrative bodies, are very helpful material in the production of business statistics. However, these sources often have to be completed by information collected through statistical surveys. This article describes the way Insee has implemented such a strategy in order to produce French structural business statistics. The originality of the French procedure is that administrative and survey variables are used jointly for the same enterprises, unlike the majority of multisource systems, in which the two kinds of sources generally complement each other for different categories of units. The idea is to use, as much as possible, the richness of the administrative sources combined with the timeliness of a survey, even if the latter is conducted only on a sample of enterprises. One main issue is the classification of enterprises within the NACE nomenclature, which is a cornerstone variable in producing the breakdown of the results by industry. At a given date, two values of the corresponding code may coexist: the value of the register, not necessarily up to date, and the value resulting from the data collected via the survey, but only from a sample of enterprises. Using all this information together requires the implementation of specific statistical estimators combining some properties of the difference estimators with calibration techniques. This article presents these estimators, as well as their statistical properties, and compares them with those of other methods.
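A toy sketch of the difference estimator the article refers to, combining an administrative variable known for the whole register with a survey variable observed on a simple random sample; all values are synthetic and the calibration refinements are omitted.

```python
# Difference estimator: register total of x plus an expanded sample correction.
import numpy as np

rng = np.random.default_rng(3)
N, n = 10000, 500
x = rng.lognormal(3.0, 1.0, N)                 # admin variable, all enterprises
y = x * rng.normal(1.05, 0.1, N)               # survey variable ("truth", unseen)

sample = rng.choice(N, size=n, replace=False)  # simple random sample
t_diff = x.sum() + (N / n) * (y[sample] - x[sample]).sum()
t_naive = (N / n) * y[sample].sum()            # plain expansion estimator
print(f"difference: {t_diff:,.0f}  expansion: {t_naive:,.0f}  truth: {y.sum():,.0f}")
```

Because the administrative variable is strongly correlated with the survey variable, the difference estimator typically has much smaller variance than the plain expansion estimator.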
DESIGNING ENVIRONMENTAL MONITORING DATABASES FOR STATISTIC ASSESSMENT
Databases designed for statistical analyses have characteristics that distinguish them from databases intended for general use. EMAP uses a probabilistic sampling design to collect data to produce statistical assessments of environmental conditions. In addition to supporting the ...
Emergence of biological complexity: Criticality, renewal and memory
International Nuclear Information System (INIS)
Grigolini, Paolo
2015-01-01
The key purpose of this article is to establish a connection between two emerging fields of research in theoretical biology. The former focuses on the concept of criticality borrowed from physics, which is expected to be extensible to biology through a robust theoretical approach that, although not yet available, should eventually shed light on the origin of cognition. The latter, largely based on the tracking of single molecules diffusing in biological cells, is bringing to general attention the need to go beyond the ergodic assumption currently made in traditional statistical physics. We show that replacing critical slowing down with temporal complexity explains why biological systems at criticality are resilient and why long-range correlations are compatible with the free-will condition necessary for the emergence of cognition. Temporal complexity generates ergodicity breakdown and requires new forms of response of complex systems to external stimuli. We concisely illustrate these new forms of information transport and we also address the challenging issue of combining temporal complexity with coherence and renewal with infinite memory.
Fish: A New Computer Program for Friendly Introductory Statistics Help
Brooks, Gordon P.; Raffle, Holly
2005-01-01
All introductory statistics students must master certain basic descriptive statistics, including means, standard deviations and correlations. Students must also gain insight into such complex concepts as the central limit theorem and standard error. This article introduces and describes the Friendly Introductory Statistics Help (FISH) computer…
COMPARATIVE STATISTICAL ANALYSIS OF GENOTYPES’ COMBINING
Directory of Open Access Journals (Sweden)
V. Z. Stetsyuk
2015-05-01
The program provides a desktop software suite for statistical calculations on a physician's personal computer. Modern methods and tools for the development of information systems are described for creating the program.
Path integral molecular dynamics for exact quantum statistics of multi-electronic-state systems.
Liu, Xinzijian; Liu, Jian
2018-03-14
An exact approach to compute physical properties for general multi-electronic-state (MES) systems in thermal equilibrium is presented. The approach is extended from our recent progress on path integral molecular dynamics (PIMD), Liu et al. [J. Chem. Phys. 145, 024103 (2016)] and Zhang et al. [J. Chem. Phys. 147, 034109 (2017)], for quantum statistical mechanics when a single potential energy surface is involved. We first define an effective potential function that is numerically favorable for MES-PIMD and then derive corresponding estimators in MES-PIMD for evaluating various physical properties. Its application to several representative one-dimensional and multi-dimensional models demonstrates that MES-PIMD in principle offers a practical tool in either of the diabatic and adiabatic representations for studying exact quantum statistics of complex/large MES systems when the Born-Oppenheimer approximation, Condon approximation, and harmonic bath approximation are broken.
Statistical Techniques for Project Control
Badiru, Adedeji B
2012-01-01
A project can be simple or complex. In each case, proven project management processes must be followed. In all cases of project management implementation, control must be exercised in order to assure that project objectives are achieved. Statistical Techniques for Project Control seamlessly integrates qualitative and quantitative tools and techniques for project control. It fills the void that exists in the application of statistical techniques to project control. The book begins by defining the fundamentals of project management then explores how to temper quantitative analysis with qualitati
A Multi-Class, Interdisciplinary Project Using Elementary Statistics
Reese, Margaret
2012-01-01
This article describes a multi-class project that employs statistical computing and writing in a statistics class. Three courses, General Ecology, Meteorology, and Introductory Statistics, cooperated on a project for the EPA's Student Design Competition. The continuing investigation has also spawned several undergraduate research projects in…
The Scythe Statistical Library: An Open Source C++ Library for Statistical Computation
Directory of Open Access Journals (Sweden)
Daniel Pemstein
2011-08-01
Full Text Available The Scythe Statistical Library is an open source C++ library for statistical computation. It includes a suite of matrix manipulation functions, a suite of pseudo-random number generators, and a suite of numerical optimization routines. Programs written using Scythe are generally much faster than those written in commonly used interpreted languages, such as R and MATLAB, and can be compiled on any system with the GNU GCC compiler (and perhaps with other C++ compilers). One of the primary design goals of the Scythe developers has been ease of use for non-expert C++ programmers. Ease of use is provided through three primary mechanisms: (1) operator and function overloading, (2) numerous pre-fabricated utility functions, and (3) clear documentation and example programs. Additionally, Scythe is quite flexible and entirely extensible because the source code is available to all users under the GNU General Public License.
Statistical Model of Extreme Shear
DEFF Research Database (Denmark)
Larsen, Gunner Chr.; Hansen, Kurt Schaldemose
2004-01-01
In order to continue cost-optimisation of modern large wind turbines, it is important to continuously increase the knowledge of wind field parameters relevant to design loads. This paper presents a general statistical model that offers site-specific prediction of the probability density function...... by a model that, on a statistically consistent basis, describes the most likely spatial shape of an extreme wind shear event. Predictions from the model have been compared with results from an extreme value data analysis, based on a large number of high-sampled full-scale time series measurements...... are consistent, given the inevitable uncertainties associated with the model as well as with the extreme value data analysis. Keywords: statistical model, extreme wind conditions, statistical analysis, turbulence, wind loading, wind shear, wind turbines.
International Nuclear Information System (INIS)
Chen Kouping; Jiao, Jiu J.; Huang Jianmin; Huang Runqiu
2007-01-01
Multivariate statistical techniques are efficient ways to display complex relationships among many objects. An attempt was made to study the data on trace elements in groundwater using multivariate statistical techniques such as principal component analysis (PCA), Q-mode factor analysis and cluster analysis. The original matrix consisted of 17 trace elements estimated from 55 groundwater samples collected in 27 wells located in a coastal area in Shenzhen, China. PCA results show that the trace elements V, Cr, As, Mo, W, and U, with the greatest positive loadings, typically occur as soluble oxyanions in oxidizing waters, while Mn and Co, with the greatest negative loadings, are generally more soluble within oxygen-depleted groundwater. Cluster analyses demonstrate that most groundwater samples collected from the same well in the study area during summer and winter still fall into the same group. This study also demonstrates the usefulness of multivariate statistical analysis in hydrochemical studies. - Multivariate statistical analysis was used to investigate relationships among trace elements and factors controlling trace element distribution in groundwater
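A minimal sketch of the PCA step described, assuming scikit-learn; the data matrix is a random stand-in for the 55 samples x 17 elements, and the log transform plus standardization are common geochemical practice rather than details taken from the paper.

```python
# PCA on standardized (log-transformed) trace-element concentrations.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
X = rng.lognormal(0.0, 1.0, size=(55, 17))      # 55 samples x 17 elements (placeholder)

Z = StandardScaler().fit_transform(np.log(X))   # log + standardize each element
pca = PCA(n_components=3).fit(Z)
print(pca.explained_variance_ratio_)            # variance captured by PC1-PC3
print(pca.components_[0])                       # PC1 loadings; in the study, their sign
                                                # pattern separated oxyanion-forming from
                                                # redox-sensitive elements
```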
Scheck, Florian
2016-01-01
Scheck’s textbook starts with a concise introduction to classical thermodynamics, including geometrical aspects. Then a short introduction to probabilities and statistics lays the basis for the statistical interpretation of thermodynamics. Phase transitions, discrete models and the stability of matter are explained in great detail. Thermodynamics has a special role in theoretical physics. Due to the general approach of thermodynamics the field has a bridging function between several areas like the theory of condensed matter, elementary particle physics, astrophysics and cosmology. The classical thermodynamics describes predominantly averaged properties of matter, reaching from few particle systems and state of matter to stellar objects. Statistical Thermodynamics covers the same fields, but explores them in greater depth and unifies classical statistical mechanics with quantum theory of multiple particle systems. The content is presented as two tracks: the fast track for master students, providing the essen...
Introduction to generalized linear models
Dobson, Annette J
2008-01-01
Introduction Background Scope Notation Distributions Related to the Normal Distribution Quadratic Forms Estimation Model Fitting Introduction Examples Some Principles of Statistical Modeling Notation and Coding for Explanatory Variables Exponential Family and Generalized Linear Models Introduction Exponential Family of Distributions Properties of Distributions in the Exponential Family Generalized Linear Models Examples Estimation Introduction Example: Failure Times for Pressure Vessels Maximum Likelihood Estimation Poisson Regression Example Inference Introduction Sampling Distribution for Score Statistics Taylor Series Approximations Sampling Distribution for MLEs Log-Likelihood Ratio Statistic Sampling Distribution for the Deviance Hypothesis Testing Normal Linear Models Introduction Basic Results Multiple Linear Regression Analysis of Variance Analysis of Covariance General Linear Models Binary Variables and Logistic Regression Probability Distributions ...
Cohn, T.A.; England, J.F.; Berenbrock, C.E.; Mason, R.R.; Stedinger, J.R.; Lamontagne, J.R.
2013-01-01
The Grubbs-Beck test is recommended by the federal guidelines for detection of low outliers in flood flow frequency computation in the United States. This paper presents a generalization of the Grubbs-Beck test for normal data (similar to the Rosner (1983) test; see also Spencer and McCuen (1996)) that can provide a consistent standard for identifying multiple potentially influential low flows. In cases where low outliers have been identified, they can be represented as “less-than” values, and a frequency distribution can be developed using censored-data statistical techniques, such as the Expected Moments Algorithm. This approach can improve the fit of the right-hand tail of a frequency distribution and provide protection from lack-of-fit due to unimportant but potentially influential low flows (PILFs) in a flood series, thus making the flood frequency analysis procedure more robust.
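As a rough illustration only (the published multiple Grubbs-Beck procedure differs in its critical values and sweep order), the sketch below screens the k smallest log-flows against a threshold computed from the remaining observations.

```python
# Illustrative low-outlier screening on log-transformed flood flows.
import numpy as np
from scipy import stats

def low_outlier_count(flows, alpha=0.10):
    z = np.sort(np.log(flows))
    n = len(z)
    k_out = 0
    for k in range(1, n // 2):                 # try flagging the k smallest
        rest = z[k:]
        m, s = rest.mean(), rest.std(ddof=1)
        # Bonferroni-style one-sided normal quantile as an approximate
        # critical multiplier (an assumption, not the published MGBT values).
        kn = stats.norm.ppf(alpha / n)
        if z[k - 1] < m + kn * s:              # far below the bulk of the data
            k_out = k
        else:
            break
    return k_out

flows = np.array([12, 15, 300, 420, 510, 640, 800, 950, 1200, 1500.0])
print(low_outlier_count(flows), "potentially influential low flows")   # -> 2
```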
Statistics: The stethoscope of a thinking urologist
Directory of Open Access Journals (Sweden)
Arun S Sivanandam
2009-01-01
Full Text Available Understanding statistical terminology and the ability to appraise clinical research findings and statistical tests are critical to the practice of evidence-based medicine. Urologists require statistics in their toolbox of skills in order to successfully sift through increasingly complex studies and realize the drawbacks of statistical tests. Currently, the level of evidence in urology literature is low and the majority of research abstracts published for the American Urological Association (AUA) meetings lag behind for full-text publication because of a lack of statistical reporting. Underlying these issues is a distinct deficiency in solid comprehension of statistics in the literature and a discomfort with the application of statistics for clinical decision-making. This review examines the plight of statistics in urology and investigates the reason behind the white-coat aversion to biostatistics. Resources such as evidence-based medicine websites, primers in statistics, and guidelines for statistical reporting exist for quick reference by urologists. Ultimately, educators should take charge of monitoring statistical knowledge among trainees by bolstering competency requirements and creating sustained opportunities for statistics and methodology exposure.
Introductory statistics for engineering experimentation
Nelson, Peter R; Coffin, Marie
2003-01-01
The Accreditation Board for Engineering and Technology (ABET) introduced a criterion starting with their 1992-1993 site visits that "Students must demonstrate a knowledge of the application of statistics to engineering problems." Since most engineering curricula are filled with requirements in their own discipline, they generally do not have time for a traditional two semesters of probability and statistics. Attempts to condense that material into a single semester often result in so much time being spent on probability that the statistics useful for designing and analyzing engineering/scientific experiments is never covered. In developing a one-semester course whose purpose was to introduce engineering/scientific students to the most useful statistical methods, this book was created to satisfy those needs. - Provides the statistical design and analysis of engineering experiments & problems - Presents a student-friendly approach through providing statistical models for advanced learning techniques - Cove...
The statistical decay of very hot nuclei: from sequential decay to multifragmentation
International Nuclear Information System (INIS)
Carlson, B.V.; Donangelo, R.; Universidad de la Republica, Montevideo; Souza, S.R.; Universidade Federal do Rio Grande do Sul; Lynch, W.G.; Steiner, A.W.; Tsang, M.B.
2010-01-01
Full text. At low excitation energies, the compound nucleus typically decays through the sequential emission of light particles. As the energy increases, the emission probability of heavier fragments increases until, at sufficiently high energies, several heavy complex fragments are emitted during the decay. The extent to which this fragment emission is simultaneous or sequential has been a subject of theoretical and experimental study for almost 30 years. The Statistical Multifragmentation Model, an equilibrium model of simultaneous fragment emission, uses the configurations of a statistical ensemble to determine the distribution of primary fragments of a compound nucleus. The primary fragments are then assumed to decay by sequential compound emission or Fermi breakup. As the first step toward a more unified model of these processes, we demonstrate the equivalence of a generalized Fermi breakup model, in which densities of excited states are taken into account, to the microcanonical version of the statistical multifragmentation model. We then establish a link between this unified Fermi breakup / statistical multifragmentation model and the well-known process of compound nucleus emission, which permits treating simultaneous and sequential emission on the same footing. Within this unified framework, we analyze the increasing importance of simultaneous, multifragment decay with increasing excitation energy and decreasing lifetime of the compound nucleus. (author)
Teaching Statistics in Language Testing Courses
Brown, James Dean
2013-01-01
The purpose of this article is to examine the literature on teaching statistics for useful ideas that teachers of language testing courses can draw on and incorporate into their teaching toolkits as they see fit. To those ends, the article addresses eight questions: What is known generally about teaching statistics? Why are students so anxious…
Search Databases and Statistics
DEFF Research Database (Denmark)
Refsgaard, Jan C; Munk, Stephanie; Jensen, Lars J
2016-01-01
having strengths and weaknesses that must be considered for the individual needs. These are reviewed in this chapter. Equally critical for generating highly confident output datasets is the application of sound statistical criteria to limit the inclusion of incorrect peptide identifications from database...... searches. Additionally, careful filtering and use of appropriate statistical tests on the output datasets affects the quality of all downstream analyses and interpretation of the data. Our considerations and general practices on these aspects of phosphoproteomics data processing are presented here....
Directory of Open Access Journals (Sweden)
Theocharis Theofanidis
2016-01-01
Full Text Available Real hypersurfaces satisfying the condition φl = lφ, where l = R(·, ξ)ξ, have been studied by many authors under at least one additional condition, since the class of these hypersurfaces is quite difficult to classify. The aim of the present paper is the classification of real hypersurfaces in the complex projective plane CP2 satisfying a generalization of φl = lφ under an additional restriction on a specific function.
Simple statistical methods for software engineering data and patterns
Pandian, C Ravindranath
2015-01-01
Although there are countless books on statistics, few are dedicated to the application of statistical methods to software engineering. Simple Statistical Methods for Software Engineering: Data and Patterns fills that void. Instead of delving into overly complex statistics, the book details simpler solutions that are just as effective and connect with the intuition of problem solvers. Sharing valuable insights into software engineering problems and solutions, the book not only explains the required statistical methods, but also provides many examples, review questions, and case studies that prov...
28 CFR 0.93 - Bureau of Justice Statistics.
2010-07-01
28 CFR 0.93 (Judicial Administration, 2010-07-01): Office of Justice Programs and Related Agencies, § 0.93 Bureau of Justice Statistics. The Bureau of Justice Statistics is headed by a Director appointed by the President. Under the general authority of the...
Reliability and Efficiency of Generalized Rumor Spreading Model on Complex Social Networks
International Nuclear Information System (INIS)
Naimi, Yaghoob; Naimi, Mohammad
2013-01-01
We introduce a generalized rumor spreading model and investigate some of its properties on different complex social networks. Whereas previous rumor models assign the same rate α to both the spreader-spreader (SS) and the spreader-stifler (SR) interactions, we define separate rates α(1) and α(2) for the SS and SR interactions, respectively. The effect of varying α(1) and α(2) on the final density of stiflers is investigated. Furthermore, the influence of the topological structure of the network on rumor spreading is studied by analyzing the behavior of several global parameters such as reliability and efficiency. Our results show that while networks with homogeneous connectivity patterns reach a higher reliability, scale-free topologies need less time to reach a steady state with respect to the rumor. (interdisciplinary physics and related areas of science and technology)
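As a rough illustration of such a two-rate rumor model, the following Monte Carlo sketch separates the spreader-spreader and spreader-stifler stifling rates; the parameter names (lam, alpha1, alpha2) and the update scheme are illustrative assumptions, not the authors' exact dynamics.

import random
import networkx as nx

def spread_rumor(G, lam=1.0, alpha1=0.5, alpha2=0.5, seed=0):
    """Return the final density of stiflers for one realization."""
    rng = random.Random(seed)
    state = {n: "I" for n in G}               # I: ignorant, S: spreader, R: stifler
    first = rng.choice(list(G))
    state[first] = "S"
    spreaders = {first}
    while spreaders:
        n = rng.choice(tuple(spreaders))
        nbrs = list(G[n])
        if not nbrs:                          # an isolated spreader simply stifles
            state[n] = "R"; spreaders.discard(n); continue
        m = rng.choice(nbrs)
        if state[m] == "I" and rng.random() < lam:
            state[m] = "S"; spreaders.add(m)          # rumor transmitted
        elif state[m] == "S" and rng.random() < alpha1:
            state[n] = "R"; spreaders.discard(n)      # SS stifling at rate alpha1
        elif state[m] == "R" and rng.random() < alpha2:
            state[n] = "R"; spreaders.discard(n)      # SR stifling at rate alpha2
    return sum(s == "R" for s in state.values()) / len(G)

print(spread_rumor(nx.erdos_renyi_graph(1000, 0.01, seed=1)))

Sweeping alpha1 and alpha2 separately, and swapping in a scale-free graph, reproduces the kind of comparison the abstract describes.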
Qi, Di
Turbulent dynamical systems are ubiquitous in science and engineering. Uncertainty quantification (UQ) in turbulent dynamical systems is a grand challenge where the goal is to obtain statistical estimates for key physical quantities. In the development of a proper UQ scheme for systems characterized by both a high-dimensional phase space and a large number of instabilities, significant model errors compared with the true natural signal are always unavoidable due to both the imperfect understanding of the underlying physical processes and the limited computational resources available. One central issue in contemporary research is the development of a systematic methodology for reduced order models that can recover the crucial features both with model fidelity in statistical equilibrium and with model sensitivity in response to perturbations. In the first part, we discuss a general mathematical framework to construct statistically accurate reduced-order models that have skill in capturing the statistical variability in the principal directions of a general class of complex systems with quadratic nonlinearity. A systematic hierarchy of simple statistical closure schemes, which are built through new global statistical energy conservation principles combined with statistical equilibrium fidelity, are designed and tested for UQ of these problems. Second, the capacity of imperfect low-order stochastic approximations to model extreme events in a passive scalar field advected by turbulent flows is investigated. The effects in complicated flow systems are considered including strong nonlinear and non-Gaussian interactions, and much simpler and cheaper imperfect models with model error are constructed to capture the crucial statistical features in the stationary tracer field. Several mathematical ideas are introduced to improve the prediction skill of the imperfect reduced-order models. Most importantly, empirical information theory and statistical linear response theory are
Multidimensional generalized-ensemble algorithms for complex systems.
Mitsutake, Ayori; Okamoto, Yuko
2009-06-07
We give general formulations of the multidimensional multicanonical algorithm, simulated tempering, and replica-exchange method. We generalize the original potential energy function E(0) by adding any physical quantity V of interest as a new energy term. These multidimensional generalized-ensemble algorithms then perform a random walk not only in E(0) space but also in V space. Among the three algorithms, the replica-exchange method is the easiest to perform because the weight factor is just a product of regular Boltzmann-like factors, while the weight factors for the multicanonical algorithm and simulated tempering are not a priori known. We give a simple procedure for obtaining the weight factors for these two latter algorithms, which uses a short replica-exchange simulation and the multiple-histogram reweighting techniques. As an example of applications of these algorithms, we have performed a two-dimensional replica-exchange simulation and a two-dimensional simulated-tempering simulation using an alpha-helical peptide system. From these simulations, we study the helix-coil transitions of the peptide in gas phase and in aqueous solution.
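A toy sketch of the replica-exchange step the abstract singles out as the easiest to perform (the swap weight is a product of ordinary Boltzmann factors); the one-dimensional double-well potential stands in for the peptide system and is purely an assumption for illustration.

import math
import random

def energy(x):
    return (x * x - 1.0) ** 2          # double-well toy potential

def parallel_tempering(betas, n_sweeps=10000, seed=0):
    rng = random.Random(seed)
    xs = [0.0] * len(betas)
    es = [energy(x) for x in xs]
    for _ in range(n_sweeps):
        for i, beta in enumerate(betas):           # local Metropolis moves
            x_new = xs[i] + rng.uniform(-0.5, 0.5)
            e_new = energy(x_new)
            if e_new <= es[i] or rng.random() < math.exp(-beta * (e_new - es[i])):
                xs[i], es[i] = x_new, e_new
        i = rng.randrange(len(betas) - 1)          # attempt a swap of replicas i, i+1
        delta = (betas[i] - betas[i + 1]) * (es[i] - es[i + 1])
        if delta >= 0 or rng.random() < math.exp(delta):
            xs[i], xs[i + 1] = xs[i + 1], xs[i]
            es[i], es[i + 1] = es[i + 1], es[i]
    return xs

print(parallel_tempering([0.5, 1.0, 2.0, 4.0]))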
Generalized structured component analysis a component-based approach to structural equation modeling
Hwang, Heungsun
2014-01-01
Winner of the 2015 Sugiyama Meiko Award (Publication Award) of the Behaviormetric Society of Japan Developed by the authors, generalized structured component analysis is an alternative to two longstanding approaches to structural equation modeling: covariance structure analysis and partial least squares path modeling. Generalized structured component analysis allows researchers to evaluate the adequacy of a model as a whole, compare a model to alternative specifications, and conduct complex analyses in a straightforward manner. Generalized Structured Component Analysis: A Component-Based Approach to Structural Equation Modeling provides a detailed account of this novel statistical methodology and its various extensions. The authors present the theoretical underpinnings of generalized structured component analysis and demonstrate how it can be applied to various empirical examples. The book enables quantitative methodologists, applied researchers, and practitioners to grasp the basic concepts behind this new a...
Statistical inference using weak chaos and infinite memory
International Nuclear Information System (INIS)
Welling, Max; Chen Yutian
2010-01-01
We describe a class of deterministic weakly chaotic dynamical systems with infinite memory. These 'herding systems' combine learning and inference into one algorithm, where moments or data-items are converted directly into an arbitrarily long sequence of pseudo-samples. This sequence has infinite range correlations and as such is highly structured. We show that its information content, as measured by sub-extensive entropy, can grow as fast as K log T, which is faster than the usual 1/2 K log T for exchangeable sequences generated by random posterior sampling from a Bayesian model. In one dimension we prove that herding sequences are equivalent to Sturmian sequences which have complexity exactly log(T + 1). More generally, we advocate the application of the rich theoretical framework around nonlinear dynamical systems, chaos theory and fractal geometry to statistical learning.
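A minimal sketch of a herding update on a finite state space, following the standard greedy formulation (emit the state maximizing <w, phi(x)>, then update w <- w + mu - phi(x)); the feature matrix and target moments below are illustrative.

import numpy as np

def herding(phi, mu, n_steps=1000):
    """phi: (n_states, n_features) feature matrix; mu: target moment vector."""
    w = mu.copy()
    samples = []
    for _ in range(n_steps):
        x = int(np.argmax(phi @ w))     # greedy, deterministic state selection
        samples.append(x)
        w = w + mu - phi[x]             # herding weight update
    return samples

# Tiny example: 3 states, identity features, target marginal (0.2, 0.3, 0.5).
phi = np.eye(3)
mu = np.array([0.2, 0.3, 0.5])
s = herding(phi, mu)
print(np.bincount(s, minlength=3) / len(s))   # empirical moments approach mu at O(1/T)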
Statistical inference based on divergence measures
Pardo, Leandro
2005-01-01
The idea of using functionals of Information Theory, such as entropies or divergences, in statistical inference is not new. However, in spite of the fact that divergence statistics have become a very good alternative to the classical likelihood ratio test and the Pearson-type statistic in discrete models, many statisticians remain unaware of this powerful approach. Statistical Inference Based on Divergence Measures explores classical problems of statistical inference, such as estimation and hypothesis testing, on the basis of measures of entropy and divergence. The first two chapters form an overview, from a statistical perspective, of the most important measures of entropy and divergence and study their properties. The author then examines the statistical analysis of discrete multivariate data, with emphasis on problems in contingency tables and loglinear models, using phi-divergence test statistics as well as minimum phi-divergence estimators. The final chapter looks at testing in general populations, prese...
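For a concrete flavor of the approach, the Cressie-Read power-divergence family (a subfamily of the phi-divergences the book works with) is available in SciPy; lambda_ = 1 recovers Pearson's chi-square statistic and lambda_ -> 0 the likelihood-ratio statistic. The counts below are made up for illustration.

from scipy.stats import power_divergence

observed = [89, 37, 30, 28, 2]
expected = [102.8, 27.9, 23.2, 25.9, 6.2]   # model-based expected counts (same total)

for lam, name in [(1.0, "Pearson"), (0.0, "log-likelihood"), (2 / 3, "Cressie-Read")]:
    stat, p = power_divergence(observed, f_exp=expected, lambda_=lam)
    print(f"{name:>15}: statistic={stat:.3f}, p={p:.4f}")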
Skinner, Carl G; Patel, Manish M; Thomas, Jerry D; Miller, Michael A
2011-01-01
Statistical methods are pervasive in medical research and general medical literature. Understanding general statistical concepts will enhance our ability to critically appraise the current literature and ultimately improve the delivery of patient care. This article intends to provide an overview of the common statistical methods relevant to medicine.
Lepore, Natasha; Brun, Caroline; Chou, Yi-Yu; Chiang, Ming-Chang; Dutton, Rebecca A.; Hayashi, Kiralee M.; Luders, Eileen; Lopez, Oscar L.; Aizenstein, Howard J.; Toga, Arthur W.; Becker, James T.; Thompson, Paul M.
2008-01-01
This paper investigates the performance of a new multivariate method for tensor-based morphometry (TBM). Statistics on Riemannian manifolds are developed that exploit the full information in deformation tensor fields. In TBM, multiple brain images are warped to a common neuroanatomical template via 3-D nonlinear registration; the resulting deformation fields are analyzed statistically to identify group differences in anatomy. Rather than study the Jacobian determinant (volume expansion factor...
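A minimal sketch of the underlying idea, under the assumption (standard in this literature) that the deformation tensors are mapped to their matrix logarithms and compared with a multivariate test such as Hotelling's T^2; this is an illustration, not the paper's pipeline, and the simulated Jacobians are made up.

import numpy as np
from scipy.linalg import logm, sqrtm
from scipy.stats import f as f_dist

def log_tensor(J):
    S = np.real(sqrtm(J.T @ J))          # deformation tensor (SPD) from Jacobian J
    L = np.real(logm(S))                 # log-Euclidean mapping
    return L[np.triu_indices(3)]         # 6 independent entries in 3-D

def hotelling_two_sample(X, Y):
    n1, n2, p = len(X), len(Y), X.shape[1]
    d = X.mean(0) - Y.mean(0)
    S = ((n1 - 1) * np.cov(X.T) + (n2 - 1) * np.cov(Y.T)) / (n1 + n2 - 2)  # pooled cov
    t2 = (n1 * n2) / (n1 + n2) * d @ np.linalg.solve(S, d)
    F = t2 * (n1 + n2 - p - 1) / (p * (n1 + n2 - 2))                        # F approximation
    return F, 1 - f_dist.cdf(F, p, n1 + n2 - p - 1)

rng = np.random.default_rng(0)
grp1 = np.array([log_tensor(np.eye(3) + 0.05 * rng.standard_normal((3, 3))) for _ in range(20)])
grp2 = np.array([log_tensor(np.eye(3) + 0.05 * rng.standard_normal((3, 3))) for _ in range(20)])
print(hotelling_two_sample(grp1, grp2))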
Statistics 101 for Radiologists.
Anvari, Arash; Halpern, Elkan F; Samir, Anthony E
2015-10-01
Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
Surfaces allowing for fractional statistics
International Nuclear Information System (INIS)
Aneziris, Charilaos.
1992-07-01
In this paper we give a necessary condition for a geometrical surface to allow for Abelian fractional statistics. In particular, we show that such statistics is possible only for two-dimensional oriented surfaces of genus zero, namely the sphere S^2, the plane R^2 and the cylindrical surface R^1 × S^1, and in general the connected sum of n planes R^2 # R^2 # ... # R^2. (Author)
Directory of Open Access Journals (Sweden)
Ayhan Esi
2013-01-01
Full Text Available In this article, we introduce the idea of the statistically convergent generalized difference lacunary double sequence spaces [w̄2(M, Δn, p, q)]θ and [w̄2(M, Δn, p, q)]θ, defined over a seminormed space (X, q). We also study some basic properties and obtain some inclusion relations between them.
Chemometrics as a tool to analyse complex chemical mixtures
DEFF Research Database (Denmark)
Christensen, J. H.
Chemical characterisation of contaminant mixtures is important for environmental forensics and risk assessment. The great challenge in future research lies in developing suitable, rapid, reliable and objective methods for analysis of the composition of complex chemical mixtures. This thesis...... describes the development of such methods for assessing the identity (chemical fingerprinting) and fate (e.g. biodegradation) of petroleum hydrocarbon mixtures. The methods comply with the general concept that suitable methods must be rapid and inexpensive, objective with limited human intervention...... and at the same time must consider a substantial fraction of compounds in the complex mixture. A combination of a) limited sample preparation, b) rapid chemical screening analysis, c) fast and semi-automatic pre-processing, d) comprehensive multivariate statistical data analysis and e) objective data evaluation...
Exclusion Statistics in Conformal Field Theory Spectra
International Nuclear Information System (INIS)
Schoutens, K.
1997-01-01
We propose a new method for investigating the exclusion statistics of quasiparticles in conformal field theory (CFT) spectra. The method leads to one-particle distribution functions, which generalize the Fermi-Dirac distribution. For the simplest SU(n)-invariant CFTs we find a generalization of Gentile parafermions, and we obtain new distributions for the simplest Z_N-invariant CFTs. In special examples, our approach reproduces distributions based on 'fractional exclusion statistics' in the sense of Haldane. We comment on applications to fractional quantum Hall effect edge theories. copyright 1997 The American Physical Society
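For reference, the one-particle distribution of fractional exclusion statistics in the sense of Haldane, which the abstract says is reproduced in special examples, takes the Haldane-Wu form (a standard result assumed here, not quoted from the paper):

n(\epsilon) = \frac{1}{w(\epsilon) + g}, \qquad
w(\epsilon)^{g}\,\bigl[1 + w(\epsilon)\bigr]^{1-g} = e^{(\epsilon-\mu)/k_{B}T},

with g = 1 giving the Fermi-Dirac distribution and g = 0 the Bose-Einstein distribution as limiting cases.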
Variance estimation for complex indicators of poverty and inequality using linearization techniques
Directory of Open Access Journals (Sweden)
Guillaume Osier
2009-12-01
Full Text Available The paper presents the Eurostat experience in calculating measures of precision, including standard errors, confidence intervals and design effect coefficients - the ratio of the variance of a statistic with the actual sample design to the variance of that statistic with a simple random sample of the same size - for the "Laeken" indicators, that is, a set of complex indicators of poverty and inequality which had been set out in the framework of the EU-SILC project (European Statistics on Income and Living Conditions). The Taylor linearization method (Tepping, 1968; Woodruff, 1971; Wolter, 1985; Tille, 2000) is a well-established method to obtain variance estimators for nonlinear statistics such as ratios, correlation or regression coefficients. It consists of approximating a nonlinear statistic with a linear function of the observations by using first-order Taylor series expansions. Then, an easily found variance estimator of the linear approximation is used as an estimator of the variance of the nonlinear statistic. Although the Taylor linearization method handles all the nonlinear statistics which can be expressed as a smooth function of estimated totals, the approach fails to encompass the "Laeken" indicators since the latter have more complex mathematical expressions. Consequently, a generalized linearization method (Deville, 1999), which relies on the concept of influence function (Hampel, Ronchetti, Rousseeuw and Stahel, 1986), has been implemented. After presenting the EU-SILC instrument and the main target indicators for which variance estimates are needed, the paper elaborates on the main features of the linearization approach based on influence functions. Ultimately, estimated standard errors, confidence intervals and design effect coefficients obtained from this approach are presented and discussed.
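As a minimal sketch of the Taylor linearization idea for the simplest nonlinear statistic, a ratio under simple random sampling (the Laeken indicators themselves require the influence-function generalization described above), the estimator and its linearized variance can be coded as follows; the data are simulated for illustration.

import numpy as np

def ratio_variance_srs(y, x, N=None):
    """Linearization variance of r = ybar/xbar under simple random sampling."""
    y, x = np.asarray(y, float), np.asarray(x, float)
    n = len(y)
    r = y.mean() / x.mean()
    z = (y - r * x) / x.mean()          # linearized variable z_i
    fpc = 1.0 - n / N if N else 1.0     # finite population correction
    var = fpc * z.var(ddof=1) / n
    return r, var

rng = np.random.default_rng(42)
x = rng.uniform(1, 5, size=500)
y = 2.0 * x + rng.normal(0, 0.5, size=500)
r, v = ratio_variance_srs(y, x, N=10000)
print(f"ratio={r:.4f}, linearized SE={np.sqrt(v):.4f}")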
A new approach to spin and statistics
International Nuclear Information System (INIS)
Kuckert, B.
1994-11-01
We give an algebraic proof of the spin-statistics connection for the parabosonic and parafermionic quantum topological charges of a theory of local observables with a modular P_1CT-symmetry. The argument avoids the use of the spinor calculus and also works in 1+2 dimensions. It is expected to be progress towards a general spin-statistics theorem including also (1+2)-dimensional theories with braid group statistics. (orig.)
Statistical principles for prospective study protocols:
DEFF Research Database (Denmark)
Christensen, Robin; Langberg, Henning
2012-01-01
In the design of scientific studies it is essential to decide which scientific questions one aims to answer, just as it is important to decide on the correct statistical methods to use to answer these questions. The correct use of statistical methods is crucial in all aspects of research...... to quantify relationships in data. Despite an increased focus on statistical content and complexity of biomedical research, these topics remain difficult for most researchers. Statistical methods enable researchers to condense large spreadsheets with data into means, proportions, and differences between means......, risk differences, and other quantities that convey information. One of the goals in biomedical research is to develop parsimonious models - meaning as simple as possible. This approach is valid if the subsequent research report (the article) is written independently of whether the results
Statistical and Computational Techniques in Manufacturing
2012-01-01
In recent years, interest in developing statistical and computational techniques for applied manufacturing engineering has increased. Today, due to the great complexity of manufacturing engineering and the high number of parameters used, conventional approaches are no longer sufficient. Therefore, statistical and computational techniques have found several applications in manufacturing, namely, modelling and simulation of manufacturing processes, optimization of manufacturing parameters, monitoring and control, computer-aided process planning, etc. The present book aims to provide recent information on statistical and computational techniques applied in manufacturing engineering. The content is suitable for final undergraduate engineering courses or as a subject on manufacturing at the postgraduate level. This book serves as a useful reference for academics, statistical and computational science researchers, mechanical, manufacturing and industrial engineers, and professionals in industries related to manu...
FUNSTAT and statistical image representations
Parzen, E.
1983-01-01
General ideas of functional statistical inference for one-sample and two-sample analyses, univariate and bivariate, are outlined. The ONESAM program is applied to analyze the univariate probability distributions of multi-spectral image data.
33 CFR 209.320 - Policy on release of commercial statistics.
2010-07-01
33 CFR § 209.320 (Navigation and Navigable Waters; Corps of Engineers, Department of the...): Policy on release of commercial statistics. The collection of commercial statistics pertaining to rivers, harbors, and waterways, and annual... waterway statistics. In case Federal or State agencies or local interests request other than general...
Conformity and statistical tolerancing
Leblond, Laurent; Pillet, Maurice
2018-02-01
Statistical tolerancing was first proposed by Shewhart (Economic Control of Quality of Manufactured Product, 1931; reprinted 1980 by ASQC). In spite of this long history, its use remains moderate. One probable reason for this low utilization is the difficulty for designers to anticipate the risks of this approach. Arithmetic (worst-case) tolerancing allows a simple interpretation: conformity is defined by the presence of the characteristic in an interval. Statistical tolerancing is more complex in its definition: an interval is not sufficient to define conformance. To justify the statistical tolerancing formula used by designers, a tolerance interval should be interpreted as the interval where most of the parts produced should probably be located. This tolerance is justified by considering a conformity criterion for the parts that guarantees low offsets on the latter characteristics. Unlike traditional arithmetic tolerancing, statistical tolerancing requires a sustained exchange of information between design and manufacture to be used safely. This paper proposes a formal definition of conformity, which we apply successively to quadratic and arithmetic tolerancing. We introduce a concept of concavity, which helps us to demonstrate the link between the tolerancing approach and conformity. We use this concept to demonstrate the various acceptable propositions of statistical tolerancing (in the decentring-dispersion space).
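A toy numeric contrast between the two approaches discussed, for a stack of independent characteristics (the tolerance values are made up): arithmetic tolerancing adds the tolerances, while quadratic (statistical) tolerancing combines them in root-sum-square fashion.

import math

tolerances = [0.10, 0.05, 0.08, 0.12]                     # +/- tolerances of 4 parts

worst_case = sum(tolerances)                              # arithmetic (worst-case) stack
statistical = math.sqrt(sum(t * t for t in tolerances))   # quadratic (RSS) stack

print(f"worst-case stack:  +/-{worst_case:.3f}")
print(f"statistical stack: +/-{statistical:.3f}")

The statistical stack is tighter, which is exactly why its safe use depends on assumptions about the production dispersion, the point the paper formalizes.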
Network geometry with flavor: From complexity to quantum geometry
Bianconi, Ginestra; Rahmede, Christoph
2016-03-01
Network geometry is attracting increasing attention because it has a wide range of applications, ranging from data mining to routing protocols in the Internet. At the same time, advances in the understanding of the geometrical properties of networks are essential for further progress in quantum gravity. In network geometry, simplicial complexes describing the interaction between two or more nodes play a special role. In fact, these structures can be used to discretize a geometrical d-dimensional space, and for this reason they have already been widely used in quantum gravity. Here we introduce the network geometry with flavor s = -1, 0, 1 (NGF), describing simplicial complexes defined in arbitrary dimension d and evolving by a nonequilibrium dynamics. The NGF can generate discrete geometries of different natures, ranging from chains and higher-dimensional manifolds to scale-free networks with small-world properties, scale-free degree distribution, and nontrivial community structure. The NGF admits as limiting cases the Bianconi-Barabási models for complex networks, the stochastic Apollonian network, and the recently introduced model for complex quantum network manifolds. The thermodynamic properties of the NGF reveal that it obeys a generalized area law, opening a new scenario for formulating its coarse-grained limit. The structure of the NGF is strongly dependent on the dimensionality d. In d = 1 NGFs grow complex networks for which the preferential attachment mechanism is necessary in order to obtain a scale-free degree distribution. Instead, for NGFs with dimension d > 1 it is not necessary to have an explicit preferential attachment rule to generate scale-free topologies. We also show that the NGF admits a quantum mechanical description in terms of associated quantum network states. Quantum network states evolve by a Markovian dynamics and a quantum network state at time t encodes all possible NGF evolutions up to time t. Interestingly, the NGF remains fully classical but
Moments of generalized Husimi distributions and complexity of many-body quantum states
International Nuclear Information System (INIS)
Sugita, Ayumu
2003-01-01
We consider generalized Husimi distributions for many-body systems, and show that their moments are good measures of complexity of many-body quantum states. Our construction of the Husimi distribution is based on the coherent state of the single-particle transformation group. Then the coherent states are independent-particle states, and, at the same time, the most localized states in the Husimi representation. Therefore delocalization of the Husimi distribution, which can be measured by the moments, is a sign of many-body correlation (entanglement). Since the delocalization of the Husimi distribution is also related to chaoticity of the dynamics, it suggests a relation between entanglement and chaos. Our definition of the Husimi distribution can be applied not only to systems of distinguishable particles, but also to those of identical particles, i.e., fermions and bosons. We derive an algebraic formula to evaluate the moments of the Husimi distribution
Statistical black-hole thermodynamics
International Nuclear Information System (INIS)
Bekenstein, J.D.
1975-01-01
Traditional methods from statistical thermodynamics, with appropriate modifications, are used to study several problems in black-hole thermodynamics. Jaynes's maximum-uncertainty method for computing probabilities is used to show that the earlier-formulated generalized second law is respected in statistically averaged form in the process of spontaneous radiation by a Kerr black hole discovered by Hawking, and also in the case of a Schwarzschild hole immersed in a bath of black-body radiation, however cold. The generalized second law is used to motivate a maximum-entropy principle for determining the equilibrium probability distribution for a system containing a black hole. As an application we derive the distribution for the radiation in equilibrium with a Kerr hole (it is found to agree with what would be expected from Hawking's results) and the form of the associated distribution among Kerr black-hole solution states of definite mass. The same results are shown to follow from a statistical interpretation of the concept of black-hole entropy as the natural logarithm of the number of possible interior configurations that are compatible with the given exterior black-hole state. We also formulate a Jaynes-type maximum-uncertainty principle for black holes, and apply it to obtain the probability distribution among Kerr solution states for an isolated radiating Kerr hole
Directory of Open Access Journals (Sweden)
Kuo Zhang
2018-01-01
Full Text Available The mechanisms of acupuncture are still unclear. In order to reveal the regulatory effect of manual acupuncture (MA) on the neuroendocrine-immune (NEI) network and to identify the key signaling molecules through which MA modulates the NEI network, we used a rat complete Freund's adjuvant (CFA) model to observe the analgesic and anti-inflammatory effects of MA, and we used statistical and complex network methods to analyze the data on the expression of 55 common signaling molecules of the NEI network in the ST36 (Zusanli) acupoint, serum, and hind foot pad tissue. The results indicate that MA had significant analgesic and anti-inflammatory effects on CFA rats; the key signaling molecules may play a key role while MA regulates the NEI network, but further research is needed.
A statistical mechanical model of economics
Lubbers, Nicholas Edward Williams
Statistical mechanics pursues low-dimensional descriptions of systems with a very large number of degrees of freedom. I explore this theme in two contexts. The main body of this dissertation explores and extends the Yard Sale Model (YSM) of economic transactions using a combination of simulations and theory. The YSM is a simple interacting model for wealth distributions which has the potential to explain the empirical observation of Pareto distributions of wealth. I develop the link between wealth condensation and the breakdown of ergodicity due to nonlinear diffusion effects which are analogous to the geometric random walk. Using this, I develop a deterministic effective theory of wealth transfer in the YSM that is useful for explaining many quantitative results. I introduce various forms of growth to the model, paying attention to the effect of growth on wealth condensation, inequality, and ergodicity. Arithmetic growth is found to partially break condensation, and geometric growth is found to completely break condensation. Further generalizations of geometric growth with growth inequality show that the system is divided into two phases by a tipping point in the inequality parameter. The tipping point marks the line between systems which are ergodic and systems which exhibit wealth condensation. I explore generalizations of the YSM transaction scheme to arbitrary betting functions to develop notions of universality in YSM-like models. I find that wealth condensation is universal to a large class of models which can be divided into two phases. The first exhibits slow, power-law condensation dynamics, and the second exhibits fast, finite-time condensation dynamics. I find that the YSM, which exhibits exponential dynamics, is the critical, self-similar model which marks the dividing line between the two phases. The final chapter develops a low-dimensional approach to materials microstructure quantification. Modern materials design harnesses complex
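A minimal YSM sketch under stated assumptions (fixed wager fraction, fair coin, no growth term); repeated exchanges of this kind drive the wealth-condensation behavior analyzed above.

import random

def yard_sale(n_agents=100, n_steps=100000, frac=0.1, seed=0):
    rng = random.Random(seed)
    w = [1.0] * n_agents
    for _ in range(n_steps):
        i, j = rng.sample(range(n_agents), 2)
        stake = frac * min(w[i], w[j])        # wager a fraction of the poorer agent's wealth
        if rng.random() < 0.5:                # fair coin flip decides the winner
            w[i] += stake; w[j] -= stake
        else:
            w[i] -= stake; w[j] += stake
    return sorted(w, reverse=True)

w = yard_sale()
total = sum(w)
print("top 5 wealth shares:", [round(x / total, 3) for x in w[:5]])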
A general theory of quantum relativity
International Nuclear Information System (INIS)
Minic, Djordje; Tze, C.-H.
2004-01-01
The geometric form of standard quantum mechanics is compatible with the two postulates: (1) the laws of physics are invariant under the choice of experimental setup and (2) every quantum observation or event is intrinsically statistical. These postulates remain compatible within a background independent extension of quantum theory with a local intrinsic time implying the relativity of the concept of a quantum event. In this extension the space of quantum events becomes dynamical and only individual quantum events make sense observationally. At the core of such a general theory of quantum relativity is the three-way interplay between the symplectic form, the dynamical metric and non-integrable almost complex structure of the space of quantum events. Such a formulation provides a missing conceptual ingredient in the search for a background independent quantum theory of gravity and matter. The crucial new technical element in our scheme derives from a set of recent mathematical results on certain infinite-dimensional almost Kähler manifolds which replace the complex projective spaces of standard quantum mechanics
Huppert, Theodore J
2016-01-01
Functional near-infrared spectroscopy (fNIRS) is a noninvasive neuroimaging technique that uses low levels of light to measure changes in cerebral blood oxygenation levels. In the majority of NIRS functional brain studies, analysis of this data is based on a statistical comparison of hemodynamic levels between a baseline and task or between multiple task conditions by means of a linear regression model: the so-called general linear model. Although these methods are similar to their implementation in other fields, particularly for functional magnetic resonance imaging, the specific application of these methods in fNIRS research differs in several key ways related to the sources of noise and artifacts unique to fNIRS. In this brief communication, we discuss the application of linear regression models in fNIRS and the modifications needed to generalize these models in order to deal with structured (colored) noise due to systemic physiology and noise heteroscedasticity due to motion artifacts. The objective of this work is to present an overview of these noise properties in the context of the linear model as it applies to fNIRS data. This work is aimed at explaining these mathematical issues to the general fNIRS experimental researcher but is not intended to be a complete mathematical treatment of these concepts.
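A schematic sketch of the kind of generalization discussed: an ordinary least-squares GLM fit compared with a fit after AR(1) prewhitening of both data and design, which addresses serially correlated (colored) physiological noise. The toy block design and the AR coefficient are illustrative assumptions (in practice the coefficient would be estimated iteratively from the residuals); none of this is the authors' actual toolbox API.

import numpy as np

rng = np.random.default_rng(1)
T = 500
X = np.column_stack([np.ones(T), (np.arange(T) % 100 < 50).astype(float)])  # toy block design
beta_true = np.array([0.0, 1.0])

e = np.zeros(T)                          # AR(1) noise with rho = 0.8
for t in range(1, T):
    e[t] = 0.8 * e[t - 1] + rng.normal(0, 0.3)
y = X @ beta_true + e

beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)   # naive GLM fit

rho = 0.8                                # assumed known for this sketch
yw = y[1:] - rho * y[:-1]                # prewhitened data
Xw = X[1:] - rho * X[:-1]                # prewhitened design
beta_gls, *_ = np.linalg.lstsq(Xw, yw, rcond=None)

print("OLS:", beta_ols, " prewhitened:", beta_gls)

Both estimates are unbiased here; the practical difference is that inference (standard errors, t statistics) is only valid after the whitening step.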
Statistical mechanics in the context of special relativity. II.
Kaniadakis, G
2005-09-01
The special relativity laws emerge as one-parameter (light speed) generalizations of the corresponding laws of classical physics. These generalizations, imposed by the Lorentz transformations, affect both the definition of the various physical observables (e.g., momentum, energy, etc.), as well as the mathematical apparatus of the theory. Here, following the general lines of [Phys. Rev. E 66, 056125 (2002)], we show that the Lorentz transformations impose also a proper one-parameter generalization of the classical Boltzmann-Gibbs-Shannon entropy. The obtained relativistic entropy permits us to construct a coherent and self-consistent relativistic statistical theory, preserving the main features of the ordinary statistical theory, which is recovered in the classical limit. The predicted distribution function is a one-parameter continuous deformation of the classical Maxwell-Boltzmann distribution and has a simple analytic form, showing power law tails in accordance with the experimental evidence. Furthermore, this statistical mechanics can be obtained as the stationary case of a generalized kinetic theory governed by an evolution equation obeying the H theorem and reproducing the Boltzmann equation of the ordinary kinetics in the classical limit.
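The one-parameter deformation in question is usually written through the kappa-exponential (a standard form assumed here, not quoted from the paper):

\exp_{\kappa}(x) = \left(\sqrt{1+\kappa^{2}x^{2}} + \kappa x\right)^{1/\kappa}, \qquad
n(\epsilon) \propto \exp_{\kappa}\!\left(-\frac{\epsilon-\mu}{k_{B}T}\right),

which reduces to the ordinary exponential, and hence to the Maxwell-Boltzmann distribution, as kappa -> 0, and exhibits the power-law tails mentioned above for large arguments.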
A goodness of fit statistic for the geometric distribution
Ferreira, J.A.
2003-01-01
We propose a goodness of fit statistic for the geometric distribution and compare it in terms of power, via simulation, with the chi-square statistic. The statistic is based on the Lau-Rao theorem and can be seen as a discrete analogue of the total time on test statistic. The results suggest that the test based on the new statistic is generally superior to the chi-square test.
The CEO performance effect : Statistical issues and a complex fit perspective
Blettner, D.P.; Chaddad, F.R.; Bettis, R.
2012-01-01
How CEOs affect strategy and performance is important to strategic management research. We show that sophisticated statistical analysis alone is problematic for establishing the magnitude and causes of CEO impact on performance. We discuss three problem areas that substantially distort the
International Nuclear Information System (INIS)
Zhou Sumin; Das, Shiva; Wang Zhiheng; Marks, Lawrence B.
2004-01-01
The generalized equivalent uniform dose (GEUD) model uses a power-law formalism, where the outcome is related to the dose via a power law. We herein investigate the mathematical compatibility between this GEUD model and the Poisson statistics based tumor control probability (TCP) model. The GEUD and TCP formulations are combined and subjected to a compatibility constraint equation. This compatibility constraint equates tumor control probability from the original heterogeneous target dose distribution to that from the homogeneous dose from the GEUD formalism. It is shown that this constraint equation possesses a unique, analytical closed-form solution which relates radiation dose to the tumor cell survival fraction. It is further demonstrated that, when there is no positive threshold or finite critical dose in the tumor response to radiation, this relationship is not bounded within the realistic cell survival limits of 0%-100%. Thus, the GEUD and TCP formalisms are, in general, mathematically inconsistent. However, when a threshold dose or finite critical dose exists in the tumor response to radiation, there is a unique mathematical solution for the tumor cell survival fraction that allows the GEUD and TCP formalisms to coexist, provided that all portions of the tumor are confined within certain specific dose ranges
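In standard notation (assumed here; the paper's own symbols may differ), the two formalisms being combined are

\mathrm{gEUD} = \left(\sum_{i} v_{i}\, D_{i}^{\,a}\right)^{1/a}, \qquad
\mathrm{TCP} = \prod_{i} \exp\!\bigl(-N_{0}\, v_{i}\, \mathrm{SF}(D_{i})\bigr),

where v_i is the fractional volume receiving dose D_i, a is the GEUD power-law parameter, N_0 the initial clonogen number and SF the cell survival fraction; the compatibility constraint equates this heterogeneous-dose TCP to exp(-N_0 SF(gEUD)).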
The spin-statistics connection in quantum gravity
International Nuclear Information System (INIS)
Balachandran, A.P.; Batista, E.; Costa e Silva, I.P.; Teotonio-Sobrinho, P.
2000-01-01
It is well known that in spite of sharing some properties with conventional particles, topological geons in general violate the spin-statistics theorem. On the other hand, it is generally believed that in quantum gravity theories allowing for topology change, using pair creation and annihilation of geons, one should be able to recover this theorem. In this paper, we take an alternative route, and use an algebraic formalism developed in previous work. We give a description of topological geons where an algebra of 'observables' is identified and quantized. Different irreducible representations of this algebra correspond to different kinds of geons, and are labeled by a non-abelian 'charge' and 'magnetic flux'. We then find that the usual spin-statistics theorem is indeed violated, but a new spin-statistics relation arises, when we assume that the fluxes are superselected. This assumption can be proved if all observables are local, as is generally the case in physical theories. Finally, we also discuss how our approach fits into conventional formulations of quantum gravity
FAA statistical handbook of aviation
1994-01-01
This report presents statistical information pertaining to the Federal Aviation Administration, the National Airspace System, Airports, Airport Activity, U.S. Civil Air Carrier Fleet, U.S. Civil Air Carrier Operating Data, Airmen, General Aviation Ai...
Natrella, Mary Gibbons
1963-01-01
Formulated to assist scientists and engineers engaged in army ordnance research and development programs, this well-known and highly regarded handbook is a ready reference for advanced undergraduate and graduate students as well as for professionals seeking engineering information and quantitative data for designing, developing, constructing, and testing equipment. Topics include characterizing and comparing the measured performance of a material, product, or process; general considerations in planning experiments; statistical techniques for analyzing extreme-value data; use of transformations
Design research in statistics education : on symbolizing and computer tools
Bakker, A.
2004-01-01
The present knowledge society requires statistical literacy-the ability to interpret, critically evaluate, and communicate about statistical information and messages (Gal, 2002). However, research shows that students generally do not gain satisfactory statistical understanding. The research
The Statistics of wood assays for preservative retention
Patricia K. Lebow; Scott W. Conklin
2011-01-01
This paper covers general statistical concepts that apply to interpreting wood assay retention values. In particular, since wood assays are typically obtained from a single composited sample, the statistical aspects, including advantages and disadvantages, of simple compositing are covered.
Heuristic versus statistical physics approach to optimization problems
International Nuclear Information System (INIS)
Jedrzejek, C.; Cieplinski, L.
1995-01-01
Optimization is a crucial ingredient of many calculation schemes in science and engineering. In this paper we assess several classes of methods: heuristic algorithms; methods directly relying on statistical physics, such as the mean-field method and simulated annealing; and Hopfield-type neural networks and genetic algorithms, partly related to statistical physics. We perform the analysis for three types of problems: (1) the Travelling Salesman Problem, (2) vector quantization, and (3) the traffic control problem in multistage interconnection networks. In general, heuristic algorithms perform better (except for genetic algorithms) and much faster, but have to be specific to every problem. The key to improving the performance could be to include heuristic features in general purpose statistical physics methods. (author)
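A minimal simulated-annealing sketch for the first of the three benchmark problems, using 2-opt moves and the Metropolis acceptance rule under geometric cooling; the cooling schedule and move set are illustrative choices.

import math
import random

def tour_length(tour, pts):
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def anneal_tsp(pts, t0=1.0, cooling=0.999, n_steps=20000, seed=0):
    rng = random.Random(seed)
    tour = list(range(len(pts)))
    cur = tour_length(tour, pts)
    best = cur
    t = t0
    for _ in range(n_steps):
        i, j = sorted(rng.sample(range(len(pts)), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]   # 2-opt segment reversal
        delta = tour_length(cand, pts) - cur
        if delta < 0 or rng.random() < math.exp(-delta / t):   # Metropolis acceptance
            tour, cur = cand, cur + delta
            best = min(best, cur)
        t *= cooling                                           # geometric cooling
    return best

pts = [(random.random(), random.random()) for _ in range(30)]
print(anneal_tsp(pts))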
Do doctors need statistics? Doctors' use of and attitudes to probability and statistics.
Swift, Louise; Miles, Susan; Price, Gill M; Shepstone, Lee; Leinster, Sam J
2009-07-10
There is little published evidence on what doctors do in their work that requires probability and statistics, yet the General Medical Council (GMC) requires new doctors to have these skills. This study investigated doctors' use of and attitudes to probability and statistics with a view to informing undergraduate teaching. An email questionnaire was sent to 473 clinicians with an affiliation to the University of East Anglia's Medical School. Of 130 respondents, approximately 90 per cent of doctors who performed each of the following activities found probability and statistics useful for that activity: accessing clinical guidelines and evidence summaries, explaining levels of risk to patients, assessing medical marketing and advertising material, interpreting the results of a screening test, reading research publications for general professional interest, and using research publications to explore non-standard treatment and management options. Seventy-nine per cent (103/130, 95 per cent CI 71 per cent, 86 per cent) of participants considered probability and statistics important in their work. Sixty-three per cent (78/124, 95 per cent CI 54 per cent, 71 per cent) said that there were activities that they could do better or start doing if they had an improved understanding of these areas, and 74 of these participants elaborated on this. Themes highlighted by participants included: being better able to critically evaluate other people's research; becoming more research-active; having a better understanding of risk; and being better able to explain things to, or teach, other people. Our results can be used to inform how probability and statistics should be taught to medical undergraduates and should convince today's medical students of the subjects' relevance to their future careers. Copyright 2009 John Wiley & Sons, Ltd.
Towards an Information Theory of Complex Networks
Dehmer, Matthias; Mehler, Alexander
2011-01-01
For over a decade, complex networks have steadily grown as an important tool across a broad array of academic disciplines, with applications ranging from physics to social media. A tightly organized collection of carefully-selected papers on the subject, Towards an Information Theory of Complex Networks: Statistical Methods and Applications presents theoretical and practical results about information-theoretic and statistical models of complex networks in the natural sciences and humanities. The book's major goal is to advocate and promote a combination of graph-theoretic, information-theoreti
Fermi-Dirac statistics and traffic in complex networks.
de Moura, Alessandro P S
2005-06-01
We propose an idealized model for traffic in a network, in which many particles move randomly from node to node, following the network's links, and it is assumed that at most one particle can occupy any given node. This is intended to mimic the finite forwarding capacity of nodes in communication networks, thereby allowing the possibility of congestion and jamming phenomena. We show that the particles behave like free fermions, with appropriately defined energy-level structure and temperature. The statistical properties of this system are thus given by the corresponding Fermi-Dirac distribution. We use this to obtain analytical expressions for dynamical quantities of interest, such as the mean occupation of each node and the transport efficiency, for different network topologies and particle densities. We show that the subnetwork of free nodes always fragments into small isolated clusters for a sufficiently large number of particles, implying a communication breakdown at some density for all network topologies. These results are compared to direct simulations.
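A rough simulation sketch of the model as described, with at most one particle per node and hops onto occupied nodes rejected; the efficiency proxy below (accepted-move fraction) is an illustrative stand-in for the paper's transport efficiency.

import random
import networkx as nx

def run_traffic(G, n_particles, n_steps=10000, seed=0):
    rng = random.Random(seed)
    nodes = list(G)
    occupied = set(rng.sample(nodes, n_particles))
    moves = blocked = 0
    for _ in range(n_steps):
        n = rng.choice(tuple(occupied))
        target = rng.choice(list(G[n]))
        if target in occupied:
            blocked += 1                          # exclusion: hop rejected
        else:
            occupied.discard(n); occupied.add(target); moves += 1
    return moves / (moves + blocked)              # crude transport-efficiency proxy

G = nx.barabasi_albert_graph(500, 3, seed=1)
for rho in (0.1, 0.5, 0.9):
    print(rho, run_traffic(G, int(rho * 500)))

Raising the particle density rho increases the rejected-hop fraction, the onset of the congestion behavior the abstract analyzes via the Fermi-Dirac distribution.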
Statistical and particle physics: Common problems and techniques
International Nuclear Information System (INIS)
Bowler, K.C.; Mc Kane, A.J.
1984-01-01
These proceedings contain statistical mechanical studies in condensed matter physics; interfacial problems in statistical physics; string theory; general Monte Carlo methods and their application to lattice gauge theories; topological excitations in field theory; phase transformation kinetics; and studies of chaotic systems
2012 Aerospace Medical Certification Statistical Handbook
2013-12-01
2012 Aerospace Medical Certification Statistical Handbook. Valerie J. Skaggs and Ann I. Norris, Civil Aerospace Medical Institute, Federal Aviation Administration, December 2013. [Report documentation page residue; recoverable table fragment: hay fever 14,477 (2.49%); asthma 12,558 (2.16%); other general heart pathology (abnormal ECG, open heart surgery, etc.); Wolff-Parkinson-White syndrome.]
Ignorability in Statistical and Probabilistic Inference
DEFF Research Database (Denmark)
Jaeger, Manfred
2005-01-01
When dealing with incomplete data in statistical learning, or incomplete observations in probabilistic inference, one needs to distinguish the fact that a certain event is observed from the fact that the observed event has happened. Since the modeling and computational complexities entailed...
The epistemological status of general circulation models
Loehle, Craig
2018-03-01
Forecasts of both likely anthropogenic effects on climate and consequent effects on nature and society are based on large, complex software tools called general circulation models (GCMs). Forecasts generated by GCMs have been used extensively in policy decisions related to climate change. However, the relation between underlying physical theories and results produced by GCMs is unclear. In the case of GCMs, many discretizations and approximations are made, and simulating Earth system processes is far from simple and currently leads to some results with unknown energy balance implications. Statistical testing of GCM forecasts for degree of agreement with data would facilitate assessment of fitness for use. If model results need to be put on an anomaly basis due to model bias, then both visual and quantitative measures of model fit depend strongly on the reference period used for normalization, making testing problematic. Epistemology is here applied to problems of statistical inference during testing, the relationship between the underlying physics and the models, the epistemic meaning of ensemble statistics, problems of spatial and temporal scale, the existence or not of an unforced null for climate fluctuations, the meaning of existing uncertainty estimates, and other issues. Rigorous reasoning entails carefully quantifying levels of uncertainty.
Statistical Inferences from the Topology of Complex Networks
2016-10-04
stable, does not lose any information, has continuous and discrete versions, and obeys a strong law of large numbers and a central limit theorem. The...paper (with J.A. Scott) "Categorification of persistent homology" [7] in the journal Discrete and Computational Geometry and the paper "Metrics for...Generalized Persistence Modules" (with J.A. Scott and V. de Silva) in the journal Foundations of Computational Mathematics [5]. These papers develop
Dynamics, stability, and statistics on lattices and networks
International Nuclear Information System (INIS)
Livi, Roberto
2014-01-01
These lectures aim at surveying some dynamical models that have been widely explored in the recent scientific literature as case studies of complex dynamical evolution, emerging from the spatio-temporal organization of several coupled dynamical variables. The first message is that a suitable mathematical description of such models needs tools and concepts borrowed from the general theory of dynamical systems and from out-of-equilibrium statistical mechanics. The second message is that the overall scenario is definitely richer than the standard problems in these fields. For instance, systems exhibiting complex unpredictable evolution do not necessarily exhibit deterministic chaotic behavior (i.e., Lyapunov chaos) as happens for dynamical models made of a few degrees of freedom. In fact, a very large number of spatially organized dynamical variables may yield unpredictable evolution even in the absence of Lyapunov instability. Such a mechanism may emerge from the combination of spatial extension and nonlinearity. Moreover, spatial extension allows one to naturally introduce disorder, or heterogeneity of the interactions, as important ingredients for complex evolution. It is worth pointing out that the models discussed in these lectures share such features, even though they have been inspired by quite different physical and biological problems. Throughout these lectures we also describe some of the technical tools employed for the study of such models, e.g., Lyapunov stability analysis, unpredictability indicators for "stable chaos," hydrodynamic description of transport in low spatial dimension, spectral decomposition of stochastic dynamics on directed networks, etc
A statistical physics perspective on criticality in financial markets
International Nuclear Information System (INIS)
Bury, Thomas
2013-01-01
Stock markets are complex systems exhibiting collective phenomena and particular features such as synchronization, fluctuations distributed as power-laws, non-random structures and similarity to neural networks. Such specific properties suggest that markets operate at a very special point. Financial markets are believed to be critical by analogy to physical systems, but little statistically founded evidence has been given. Through a data-based methodology and comparison to simulations inspired by the statistical physics of complex systems, we show that the Dow Jones and index sets are not rigorously critical. However, financial systems are closer to criticality in the neighborhood of a crash. (paper)
Statistical learning across development: Flexible yet constrained
Directory of Open Access Journals (Sweden)
Lauren eKrogh
2013-01-01
Full Text Available Much research in the past two decades has documented infants' and adults' ability to extract statistical regularities from auditory input. Importantly, recent research has extended these findings to the visual domain, demonstrating learners' sensitivity to statistical patterns within visual arrays and sequences of shapes. In this review we discuss both auditory and visual statistical learning to elucidate both the generality of and constraints on statistical learning. The review first outlines the major findings of the statistical learning literature with infants, followed by discussion of statistical learning across domains, modalities, and development. The second part of this review considers constraints on statistical learning. The discussion focuses on two categories of constraint: constraints on the types of input over which statistical learning operates and constraints based on the state of the learner. The review concludes with a discussion of possible mechanisms underlying statistical learning.
International Nuclear Information System (INIS)
Weathers, J.B.; Luck, R.; Weathers, J.W.
2009-01-01
The complexity of mathematical models used by practicing engineers is increasing due to the growing availability of sophisticated mathematical modeling tools and ever-improving computational power. For this reason, the need to define a well-structured process for validating these models against experimental results has become a pressing issue in the engineering community. This validation process is partially characterized by the uncertainties associated with the modeling effort as well as the experimental results. The net impact of the uncertainties on the validation effort is assessed through the 'noise level of the validation procedure', which can be defined as an estimate of the 95% confidence uncertainty bounds for the comparison error between actual experimental results and model-based predictions of the same quantities of interest. Although general descriptions associated with the construction of the noise level using multivariate statistics exists in the literature, a detailed procedure outlining how to account for the systematic and random uncertainties is not available. In this paper, the methodology used to derive the covariance matrix associated with the multivariate normal pdf based on random and systematic uncertainties is examined, and a procedure used to estimate this covariance matrix using Monte Carlo analysis is presented. The covariance matrices are then used to construct approximate 95% confidence constant probability contours associated with comparison error results for a practical example. In addition, the example is used to show the drawbacks of using a first-order sensitivity analysis when nonlinear local sensitivity coefficients exist. Finally, the example is used to show the connection between the noise level of the validation exercise calculated using multivariate and univariate statistics.
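A schematic Monte Carlo sketch of the covariance-estimation step described above, with one shared (systematic) and one independent (random) error source for two quantities of interest; the uncertainty magnitudes and the two-quantity setup are illustrative assumptions.

import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(7)
n_mc, n_q = 20000, 2                   # Monte Carlo draws; 2 quantities of interest

sys_err = rng.normal(0, 0.5, size=(n_mc, 1)) * np.ones((1, n_q))  # shared (systematic) source
rand_err = rng.normal(0, 0.3, size=(n_mc, n_q))                   # independent (random) source
E = sys_err + rand_err                 # samples of the comparison error E = D - S

cov = np.cov(E.T)                      # estimated covariance of the comparison error
r95 = np.sqrt(chi2.ppf(0.95, df=n_q))  # Mahalanobis radius of the 95% contour
print("estimated covariance:\n", cov)
print("95% contour scale:", r95)

The systematic source induces the off-diagonal covariance terms, which is why the univariate (diagonal-only) treatment understates the noise level of the validation exercise.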
Generalized statistical convergence of order β for sequences of fuzzy numbers
Altınok, Hıfsı; Karakaş, Abdulkadir; Altın, Yavuz
2018-01-01
In the present paper, we introduce the concepts of Δm-statistical convergence of order β and strong Δm-summability of order β for sequences of fuzzy numbers, using a modulus function f and taking the supremum on the metric d for 0 < β ≤ 1, and we give some inclusion relations between them.
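For orientation, the following sketch states the standard definition underlying this notion in the ordinary real-valued setting (a paraphrase of the common definition in this literature, not a quotation from the paper; the fuzzy-number version replaces |x_k - L| by d(Δm X_k, X_0)):

```latex
% Statistical convergence of order \beta (0 < \beta \le 1):
% a sequence x = (x_k) converges statistically of order \beta to L if,
% for every \varepsilon > 0, the density of "bad" indices vanishes
% relative to n^{\beta}:
\[
  \lim_{n \to \infty} \frac{1}{n^{\beta}}
  \bigl|\{ k \le n : |x_k - L| \ge \varepsilon \}\bigr| = 0 .
\]
```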
Statistical physics and computational methods for evolutionary game theory
Javarone, Marco Alberto
2018-01-01
This book presents an introduction to Evolutionary Game Theory (EGT), which is an emerging field in the area of complex systems attracting the attention of researchers from disparate scientific communities. EGT allows one to represent and study several complex phenomena, such as the emergence of cooperation in social systems, the role of conformity in shaping the equilibrium of a population, and the dynamics in biological and ecological systems. Since EGT models belong to the area of complex systems, statistical physics constitutes a fundamental ingredient for investigating their behavior. At the same time, the complexity of some EGT models, such as those realized by means of agent-based methods, often requires the implementation of numerical simulations. Therefore, beyond providing an introduction to EGT, this book gives a brief overview of the main statistical physics tools (such as phase transitions and the Ising model) and computational strategies for simulating evolutionary games (such as Monte Carlo algorithms).
Statistical Challenges in Modeling Big Brain Signals
Yu, Zhaoxia; Pluta, Dustin; Shen, Tong; Chen, Chuansheng; Xue, Gui; Ombao, Hernando
2017-01-01
Brain signal data are inherently big: massive in amount, complex in structure, and high in dimensions. These characteristics impose great challenges for statistical inference and learning. Here we review several key challenges and discuss possible solutions.
Statistical learning methods: Basics, control and performance
Energy Technology Data Exchange (ETDEWEB)
Zimmermann, J. [Max-Planck-Institut fuer Physik, Foehringer Ring 6, 80805 Munich (Germany)]. E-mail: zimmerm@mppmu.mpg.de
2006-04-01
The basics of statistical learning are reviewed with a special emphasis on general principles and problems common to all types of learning methods. Different aspects of controlling these methods in a physically adequate way will be discussed. All principles and guidelines will be exercised on examples of statistical learning methods in high energy physics and astrophysics. These examples also show that statistical learning methods very often lead to a remarkable performance gain compared with competing classical algorithms.
Statistical learning methods: Basics, control and performance
International Nuclear Information System (INIS)
Zimmermann, J.
2006-01-01
The basics of statistical learning are reviewed with a special emphasis on general principles and problems common to all types of learning methods. Different aspects of controlling these methods in a physically adequate way will be discussed. All principles and guidelines will be exercised on examples of statistical learning methods in high energy physics and astrophysics. These examples also show that statistical learning methods very often lead to a remarkable performance gain compared with competing classical algorithms.
Dexter, Franklin; Shafer, Steven L
2017-03-01
Considerable attention has been drawn to poor reproducibility in the biomedical literature. One explanation is inadequate reporting of statistical methods by authors and inadequate assessment of statistical reporting and methods during peer review. In this narrative review, we examine scientific studies of several well-publicized efforts to improve statistical reporting. We also review several retrospective assessments of the impact of these efforts. First, these studies show that instructions to authors and statistical checklists are not sufficient; no findings suggested that either improves the quality of statistical methods and reporting. Second, even basic statistics, such as power analyses, are frequently missing or incorrectly performed. Third, statistical review is needed for all papers that involve data analysis. A consistent finding in the studies was that nonstatistical reviewers (eg, "scientific reviewers") and journal editors generally assess statistical quality poorly. We finish by discussing our experience with statistical review at Anesthesia & Analgesia from 2006 to 2016.
Statistics a guide to the use of statistical methods in the physical sciences
Barlow, Roger J
1989-01-01
The Manchester Physics Series General Editors: D. J. Sandiford; F. Mandl; A. C. Phillips Department of Physics and Astronomy, University of Manchester Properties of Matter B. H. Flowers and E. Mendoza Optics Second Edition F. G. Smith and J. H. Thomson Statistical Physics Second Edition F. Mandl Electromagnetism Second Edition I. S. Grant and W. R. Phillips Statistics R. J. Barlow Solid State Physics Second Edition J. R. Hook and H. E. Hall Quantum Mechanics F. Mandl Particle Physics Second Edition B. R. Martin and G. Shaw The Physics of Stars Second Edition A.C. Phillips Computing for Scientists
Introduction to Focus Issue: Complex network perspectives on flow systems.
Donner, Reik V; Hernández-García, Emilio; Ser-Giacomi, Enrico
2017-03-01
During the last few years, complex network approaches have demonstrated their great potential as versatile tools for exploring the structural as well as dynamical properties of dynamical systems from a variety of different fields. Among others, recent successful examples include (i) functional (correlation) network approaches to infer hidden statistical interrelationships between macroscopic regions of the human brain or the Earth's climate system, (ii) Lagrangian flow networks allowing one to trace dynamically relevant fluid-flow structures in the atmosphere, the ocean or, more generally, the phase space of complex systems, and (iii) time series networks unveiling fundamental organization principles of dynamical systems. In this spirit, complex network approaches have proven useful for data-driven learning of dynamical processes (like those acting within and between sub-components of the Earth's climate system) that are hidden to other analysis techniques. This Focus Issue presents a collection of contributions addressing the description of flows and associated transport processes from the network point of view and its relationship to other approaches which deal with fluid transport and mixing and/or use complex network techniques.
Refined generalized multiscale entropy analysis for physiological signals
Liu, Yunxiao; Lin, Youfang; Wang, Jing; Shang, Pengjian
2018-01-01
Multiscale entropy analysis has become a prevalent complexity measure and has been successfully applied in various fields. However, it only takes into account the information of mean values (first moment) in the coarse-graining procedure. Generalized multiscale entropy (MSEn), which considers higher moments to coarse-grain a time series, was therefore proposed, and MSEσ2 has been implemented. However, MSEσ2 may sometimes yield an imprecise or undefined estimation of entropy, and the statistical reliability of its sample entropy estimation decreases as the scale factor increases. For this purpose, we developed a refined model, RMSEσ2, to improve MSEσ2. Simulations on both white noise and 1/f noise show that RMSEσ2 provides higher entropy reliability, reduces the occurrence of undefined entropy, and is especially suitable for short time series. Besides, we discuss the effect on RMSEσ2 analysis of outliers, data loss and other concepts in signal processing. We apply the proposed model to evaluate the complexity of heartbeat interval time series derived from healthy young and elderly subjects, patients with congestive heart failure and patients with atrial fibrillation, respectively, and compare it to several popular complexity metrics. The results demonstrate that the complexity measured by RMSEσ2 (a) decreases with aging and disease, and (b) gives significant discrimination between different physiological/pathological states, which may facilitate clinical application.
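A minimal sketch of the variance-based coarse-graining step on which MSEσ2-type measures are built (function names are ours, and the subsequent sample-entropy computation is omitted for brevity):

```python
import numpy as np

def coarse_grain_variance(x, scale):
    """Second-moment coarse-graining: each coarse-grained point is the
    variance of one non-overlapping window of length `scale`. Standard
    multiscale entropy uses the window mean (first moment) instead."""
    n = len(x) // scale
    windows = np.reshape(x[: n * scale], (n, scale))
    return windows.var(axis=1, ddof=1)

rng = np.random.default_rng(1)
white = rng.normal(size=10_000)
print(coarse_grain_variance(white, scale=5)[:5])
```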
DEFF Research Database (Denmark)
Sjöstrand, Karl; Cardenas, Valerie A.; Larsen, Rasmus
2008-01-01
Whole-brain morphometry denotes a group of methods with the aim of relating clinical and cognitive measurements to regions of the brain. Typically, such methods require the statistical analysis of a data set with many variables (voxels and exogenous variables) paired with few observations (subjects)... regression to address this issue, allowing for a gradual introduction of correlation information into the model. We make the connections between ridge regression and voxel-wise procedures explicit and discuss relations to other statistical methods. Results are given on an in-vivo data set of deformation...
Generalized memory associativity in a network model for the neuroses
Wedemann, Roseli S.; Donangelo, Raul; de Carvalho, Luís A. V.
2009-03-01
We review concepts introduced in earlier work, where a neural network mechanism describes some mental processes in neurotic pathology and psychoanalytic working-through, as associative memory functioning, according to the findings of Freud. We developed a complex network model, where modules corresponding to sensorial and symbolic memories interact, representing unconscious and conscious mental processes. The model illustrates Freud's idea that consciousness is related to symbolic and linguistic memory activity in the brain. We have introduced a generalization of the Boltzmann machine to model memory associativity. Model behavior is illustrated with simulations and some of its properties are analyzed with methods from statistical mechanics.
A Model Fit Statistic for Generalized Partial Credit Model
Liang, Tie; Wells, Craig S.
2009-01-01
Investigating the fit of a parametric model is an important part of the measurement process when implementing item response theory (IRT), but research examining it is limited. A general nonparametric approach for detecting model misfit, introduced by J. Douglas and A. S. Cohen (2001), has exhibited promising results for the two-parameter logistic…
1989 lectures in complex systems
International Nuclear Information System (INIS)
Jen, E.
1990-01-01
This report contains papers on the following topics: Lectures on a Theory of Computation and Complexity over the Reals; Algorithmic Information Content, Church-Turing Thesis, Physical Entropy, and Maxwell's Demon; Physical Measures of Complexity; An Introduction to Chaos and Prediction; Hamiltonian Chaos in Nonlinear Polarized Optical Beam; Chemical Oscillators and Nonlinear Chemical Dynamics; Isotropic Navier-Stokes Turbulence. I. Qualitative Features and Basic Equations; Isotropic Navier-Stokes Turbulence. II. Statistical Approximation Methods; Lattice Gases; Data-Parallel Computation and the Connection Machine; Preimages and Forecasting for Cellular Automata; Lattice-Gas Models for Multiphase Flows and Magnetohydrodynamics; Probabilistic Cellular Automata: Some Statistical Mechanical Considerations; Complexity Due to Disorder and Frustration; Self-Organization by Simulated Evolution; Theoretical Immunology; Morphogenesis by Cell Intercalation; and Theoretical Physics Meets Experimental Neurobiology
Statistical hypothesis testing with SAS and R
Taeger, Dirk
2014-01-01
A comprehensive guide to statistical hypothesis testing with examples in SAS and R. When analyzing datasets the following questions often arise: Is there a shorthand procedure for a statistical test available in SAS or R? If so, how do I use it? If not, how do I program the test myself? This book answers these questions and provides an overview of the most common statistical test problems in a comprehensive way, making it easy to find and perform an appropriate statistical test. A general summary of statistical test theory is presented, along with a basic description for each test, including the
Estimation of global network statistics from incomplete data.
Directory of Open Access Journals (Sweden)
Catherine A Bliss
Full Text Available Complex networks underlie an enormous variety of social, biological, physical, and virtual systems. A profound complication for the science of complex networks is that in most cases, observing all nodes and all network interactions is impossible. Previous work addressing the impacts of partial network data is surprisingly limited, focuses primarily on missing nodes, and suggests that network statistics derived from subsampled data are not suitable estimators for the same network statistics describing the overall network topology. We generate scaling methods to predict true network statistics, including the degree distribution, from only partial knowledge of nodes, links, or weights. Our methods are transparent and do not assume a known generating process for the network, thus enabling prediction of network statistics for a wide variety of applications. We validate analytical results on four simulated network classes and empirical data sets of various sizes. We perform subsampling experiments by varying proportions of sampled data and demonstrate that our scaling methods can provide very good estimates of true network statistics while acknowledging limits. Lastly, we apply our techniques to a set of rich and evolving large-scale social networks, Twitter reply networks. Based on 100 million tweets, we use our scaling techniques to propose a statistical characterization of the Twitter Interactome from September 2008 to November 2008. Our treatment allows us to find support for Dunbar's hypothesis in detecting an upper threshold for the number of active social contacts that individuals maintain over the course of one week.
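As a toy illustration of why naive statistics from partial data mislead, and how a scaling correction can help, consider uniform node sampling (our own minimal example using networkx; this is not the authors' estimator, and the edge-survival argument in the comment is only a first-order heuristic):

```python
import networkx as nx
import numpy as np

rng = np.random.default_rng(2)
G = nx.barabasi_albert_graph(10_000, 5, seed=2)
true_mean_deg = 2 * G.number_of_edges() / G.number_of_nodes()

p = 0.3  # fraction of nodes observed
sampled = rng.choice(list(G.nodes()), size=int(p * G.number_of_nodes()),
                     replace=False)
H = G.subgraph(sampled)
naive = 2 * H.number_of_edges() / H.number_of_nodes()

# Under uniform node sampling each edge survives with probability ~p^2 and
# each node with probability p, so E[naive] ~ p * (true mean degree).
corrected = naive / p
print(f"true {true_mean_deg:.2f}  naive {naive:.2f}  corrected {corrected:.2f}")
```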
Generalized connectivity of graphs
Li, Xueliang
2016-01-01
Noteworthy results, proof techniques, open problems and conjectures in generalized (edge-) connectivity are discussed in this book. Both theoretical and practical analyses for generalized (edge-) connectivity of graphs are provided. Topics covered in this book include: generalized (edge-) connectivity of graph classes, algorithms, computational complexity, sharp bounds, Nordhaus-Gaddum-type results, maximum generalized local connectivity, extremal problems, random graphs, multigraphs, relations with the Steiner tree packing problem and generalizations of connectivity. This book enables graduate students to understand and master a segment of graph theory and combinatorial optimization. Researchers in graph theory, combinatorics, combinatorial optimization, probability, computer science, discrete algorithms, complexity analysis, network design, and the information transferring models will find this book useful in their studies.
Bayesian approach to inverse statistical mechanics
Habeck, Michael
2014-05-01
Inverse statistical mechanics aims to determine particle interactions from ensemble properties. This article looks at this inverse problem from a Bayesian perspective and discusses several statistical estimators to solve it. In addition, a sequential Monte Carlo algorithm is proposed that draws the interaction parameters from their posterior probability distribution. The posterior probability involves an intractable partition function that is estimated along with the interactions. The method is illustrated for inverse problems of varying complexity, including the estimation of a temperature, the inverse Ising problem, maximum entropy fitting, and the reconstruction of molecular interaction potentials.
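The Bayesian formulation can be made concrete on an exactly solvable toy case (our sketch, not the paper's sequential Monte Carlo: we pick a 1D Ising chain with free ends, whose partition function is known in closed form, so the posterior over the inverse temperature can simply be evaluated on a grid; all names below are ours):

```python
import numpy as np

rng = np.random.default_rng(9)

# Recover the inverse temperature beta of a 1D Ising chain from samples.
N, beta_true, n_samples = 100, 0.7, 200

def sample_chain(beta):
    # Exact sampling: the bond variables b_i = s_i * s_{i+1} are independent.
    p_align = np.exp(beta) / (2 * np.cosh(beta))
    bonds = np.where(rng.random(N - 1) < p_align, 1, -1)
    return np.concatenate([[1], np.cumprod(bonds)])

data = [sample_chain(beta_true) for _ in range(n_samples)]
E = np.array([-np.sum(s[:-1] * s[1:]) for s in data])   # sample energies

betas = np.linspace(0.01, 2.0, 400)
# Log-posterior (flat prior, constants dropped): sum over samples of
# [-beta*E - log Z(beta)], with log Z = N log 2 + (N-1) log cosh(beta).
log_post = -np.outer(betas, E).sum(1) - n_samples * (N - 1) * np.log(np.cosh(betas))
print("posterior mode beta ~", betas[np.argmax(log_post)])
```

In the realistic setting targeted by the paper, log Z(beta) has no closed form, which is exactly why the partition function must be estimated jointly with the interactions.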
Niemann, Brand Lee
A major field program to study beta-mesoscale transport and dispersion over complex mountainous terrain was conducted during 1969 with the cooperation of three government agencies at the White Sands Missile Range in central Utah. The purpose of the program was to measure simultaneously on a large number of days the synoptic and mesoscale wind fields, the relative dispersion between pairs of particle trajectories and the rate of small scale turbulence dissipation. The field program included measurements during more than 60 days in the months of March, June, and November. The large quantity of data generated from this program has been processed and analyzed to provide case studies and statistics to evaluate and refine Lagrangian variable trajectory models. The case studies selected to illustrate the complexities of mesoscale transport and dispersion over complex terrain include those with terrain blocking, lee waves, and stagnation, as well as those with large vertical wind shears and horizontal wind field deformation. The statistics of relative particle dispersion were computed and compared to the classical theories of Richardson and Batchelor and the more recent theories of Lin and Kao among others. The relative particle dispersion was generally found to increase with travel time in the alongwind and crosswind directions, but in a more oscillatory than sustained or even accelerated manner as predicted by most theories, unless substantial wind shears or finite vertical separations between particles were present. The relative particle dispersion in the vertical was generally found to be small and bounded even when substantial vertical motions due to lee waves were present because of the limiting effect of stable temperature stratification. The data show that velocity shears have a more significant effect than turbulence on relative particle dispersion and that sufficient turbulence may not always be present above the planetary boundary layer for "wind direction shear
Statistical convergence of a non-positive approximation process
International Nuclear Information System (INIS)
Agratini, Octavian
2011-01-01
Highlights: → A general class of approximation processes is introduced. → The A-statistical convergence is studied. → Applications in quantum calculus are delivered. - Abstract: Starting from a general sequence of linear and positive operators of discrete type, we associate its r-th order generalization. This construction involves high-order derivatives of a signal and it loses the positivity property. Considering that the initial approximation process is A-statistically uniformly convergent, we prove that the property is inherited by the new sequence. Also, our result includes information about the uniform convergence. Two applications in q-calculus are presented. We study q-analogues of both the Meyer-Koenig and Zeller operators and the Stancu operators.
Statistical Yearbook of Norway 2012
Energy Technology Data Exchange (ETDEWEB)
NONE
2012-07-01
The Statistical Yearbook of Norway 2012 contains statistics on Norway and main figures for the Nordic countries and other countries selected from international statistics. The international overviews are integrated with the other tables and figures. The selection of tables in this edition is mostly the same as in the 2011 edition. The yearbook's 480 tables and figures present the main trends in official statistics in most areas of society. The list of tables and figures and an index at the back of the book provide easy access to relevant information. In addition, source information and Internet addresses below the tables make the yearbook a good starting point for those who are looking for more detailed statistics. The statistics are based on data gathered in statistical surveys and from administrative data, which, in cooperation with other public institutions, have been made available for statistical purposes. Some tables have been prepared in their entirety by other public institutions. The statistics follow approved principles, standards and classifications that are in line with international recommendations and guidelines. Content: 00. General subjects; 01. Environment; 02. Population; 03. Health and social conditions; 04. Education; 05. Personal economy and housing conditions; 06. Labour market; 07. Recreational, cultural and sporting activities; 08. Prices and indices; 09. National Economy and external trade; 10. Industrial activities; 11. Financial markets; 12. Public finances; Geographical survey.
Quantum-statistical kinetic equations
International Nuclear Information System (INIS)
Loss, D.; Schoeller, H.
1989-01-01
Considering a homogeneous normal quantum fluid consisting of identical interacting fermions or bosons, the authors derive an exact quantum-statistical generalized kinetic equation with a collision operator given as an explicit cluster series where exchange effects are included through renormalized Liouville operators. This new result is obtained by applying a recently developed superoperator formalism (Liouville operators, cluster expansions, symmetrized projectors, P_q-rule, etc.) to nonequilibrium systems described by a density operator ρ(t) which obeys the von Neumann equation. By means of this formalism a factorization theorem is proven (being essential for obtaining closed equations), and partial resummations (leading to renormalized quantities) are performed. As an illustrative application, the quantum-statistical versions (including exchange effects due to Fermi-Dirac or Bose-Einstein statistics) of the homogeneous Boltzmann (binary collisions) and Choh-Uhlenbeck (triple collisions) equations are derived.
Inferring general relations between network characteristics from specific network ensembles.
Cardanobile, Stefano; Pernice, Volker; Deger, Moritz; Rotter, Stefan
2012-01-01
Different network models have been suggested for the topology underlying complex interactions in natural systems. These models are aimed at replicating specific statistical features encountered in real-world networks. However, it is rarely considered to which degree the results obtained for one particular network class can be extrapolated to real-world networks. We address this issue by comparing different classical and more recently developed network models with respect to their ability to generate networks with large structural variability. In particular, we consider the statistical constraints which the respective construction scheme imposes on the generated networks. After having identified the most variable networks, we address the issue of which constraints are common to all network classes and are thus suitable candidates for being generic statistical laws of complex networks. In fact, we find that generic, not model-related dependencies between different network characteristics do exist. This makes it possible to infer global features from local ones using regression models trained on networks with high generalization power. Our results confirm and extend previous findings regarding the synchronization properties of neural networks. Our method seems especially relevant for large networks, which are difficult to map completely, like the neural networks in the brain. The structure of such large networks cannot be fully sampled with the present technology. Our approach provides a method to estimate global properties of under-sampled networks in good approximation. Finally, we demonstrate on three different data sets (C. elegans neuronal network, R. prowazekii metabolic network, and a network of synonyms extracted from Roget's Thesaurus) that real-world networks have statistical relations compatible with those obtained using regression models.
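A minimal stand-in for the regression idea, assuming networkx and scikit-learn and using our own choice of local features (mean degree, mean clustering) and global target (global efficiency); the paper's feature sets and network ensembles differ:

```python
import networkx as nx
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)

def local_features(G):
    # Local characteristics: mean degree and mean clustering coefficient.
    degrees = [d for _, d in G.degree()]
    return [np.mean(degrees), nx.average_clustering(G)]

X, y = [], []
for _ in range(60):
    # Mix two network classes to give the regressor structural variability.
    if rng.random() < 0.5:
        G = nx.erdos_renyi_graph(200, rng.uniform(0.03, 0.10),
                                 seed=int(rng.integers(1_000_000)))
    else:
        G = nx.watts_strogatz_graph(200, int(rng.integers(6, 20)), 0.1,
                                    seed=int(rng.integers(1_000_000)))
    X.append(local_features(G))
    y.append(nx.global_efficiency(G))   # global characteristic to predict

model = LinearRegression().fit(X[:40], y[:40])
print("held-out R^2:", round(model.score(X[40:], y[40:]), 3))
```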
On the generalized entropy pseudoadditivity for complex systems
International Nuclear Information System (INIS)
Wang, Qiuping A.; Nivanen, Laurent; Le Mehaute, Alain; Pezeril, Michel
2002-01-01
We show that Abe's general pseudoadditivity for entropy prescribed by thermal equilibrium in nonextensive systems holds not only for entropy, but also for energy. The application of this general pseudoadditivity to Tsallis entropy tells us that the factorization of the probability of a composite system into a product of the probabilities of the subsystems is just a consequence of the existence of thermal equilibrium and not due to the independence of the subsystems. (author)
Directory of Open Access Journals (Sweden)
Rochelle E. Tractenberg
2016-12-01
Full Text Available Statistical literacy is essential to an informed citizenry, and two emerging trends highlight a growing need for training that achieves this literacy. The first trend is towards "big" data: while automated analyses can exploit massive amounts of data, the interpretation—and possibly more importantly, the replication—of results are challenging without adequate statistical literacy. The second trend is that science and scientific publishing are struggling with insufficient/inappropriate statistical reasoning in writing, reviewing, and editing. This paper describes a model for statistical literacy (SL) and its development that can support modern scientific practice. An established curriculum development and evaluation tool—the Mastery Rubric—is integrated with a new, developmental model of statistical literacy that reflects the complexity of reasoning and habits of mind that scientists need to cultivate in order to recognize, choose, and interpret statistical methods. This developmental model provides actionable evidence, and explicit opportunities for consequential assessment that serves students, instructors, developers/reviewers/accreditors of a curriculum, and institutions. By supporting the enrichment, rather than increasing the amount, of statistical training in the basic and life sciences, this approach supports curriculum development, evaluation, and delivery to promote statistical literacy for students and a collective quantitative proficiency more broadly.
Enabling quaternion derivatives: the generalized HR calculus
Xu, Dongpo; Jahanchahi, Cyrus; Took, Clive C.; Mandic, Danilo P.
2015-01-01
Quaternion derivatives exist only for a very restricted class of analytic (regular) functions; however, in many applications, functions of interest are real-valued and hence not analytic, a typical case being the standard real mean square error objective function. The recent HR calculus is a step forward and provides a way to calculate derivatives and gradients of both analytic and non-analytic functions of quaternion variables; however, the HR calculus can become cumbersome in complex optimization problems due to the lack of rigorous product and chain rules, a consequence of the non-commutativity of quaternion algebra. To address this issue, we introduce the generalized HR (GHR) derivatives which employ quaternion rotations in a general orthogonal system and provide the left- and right-hand versions of the quaternion derivative of general functions. The GHR calculus also solves the long-standing problems of product and chain rules, mean-value theorem and Taylor's theorem in the quaternion field. At the core of the proposed GHR calculus is quaternion rotation, which makes it possible to extend the principle to other functional calculi in non-commutative settings. Examples in statistical learning theory and adaptive signal processing support the analysis. PMID:26361555
Martin, E. Dale
1989-01-01
The paper introduces a new theory of N-dimensional complex variables and analytic functions which, for N greater than 2, is both a direct generalization and a close analog of the theory of ordinary complex variables. The algebra in the present theory is a commutative ring, not a field. Functions of a three-dimensional variable are defined, and the definition of the derivative then leads to analytic functions.
Occupancy statistics arising from weighted particle rearrangements
International Nuclear Information System (INIS)
Huillet, Thierry
2007-01-01
The box-occupancy distributions arising from weighted rearrangements of a particle system are investigated. In the grand-canonical ensemble, they are characterized by determinantal joint probability generating functions. For doubly non-negative weight matrices, fractional occupancy statistics, generalizing Fermi-Dirac and Bose-Einstein statistics, can be defined. A spatially extended version of these balls-in-boxes problems is investigated
Testing statistical hypotheses of equivalence
Wellek, Stefan
2010-01-01
Equivalence testing has grown significantly in importance over the last two decades, especially as its relevance to a variety of applications has become understood. Yet published work on the general methodology remains scattered in specialists' journals, and for the most part, it focuses on the relatively narrow topic of bioequivalence assessment. With a far broader perspective, Testing Statistical Hypotheses of Equivalence provides the first comprehensive treatment of statistical equivalence testing. The author addresses a spectrum of specific, two-sided equivalence testing problems, from the
Fisher's Contributions to Statistics
Indian Academy of Sciences (India)
T Krishnan. General Article, Resonance – Journal of Science Education, Volume 2, Issue 9, September 1997, pp. 32-37. Full text: https://www.ias.ac.in/article/fulltext/reso/002/09/0032-0037
Foundation of statistical energy analysis in vibroacoustics
Le Bot, A
2015-01-01
This title deals with the statistical theory of sound and vibration. The foundation of statistical energy analysis is presented in great detail. In the modal approach, an introduction to random vibration with application to complex systems having a large number of modes is provided. For the wave approach, the phenomena of propagation, group speed, and energy transport are extensively discussed. Particular emphasis is given to the emergence of the diffuse field, the central concept of the theory.
Statistical properties of deep inelastic reactions
International Nuclear Information System (INIS)
Moretto, L.G.
1983-08-01
The multifaceted aspects of deep-inelastic heavy-ion collisions are discussed in terms of the statistical equilibrium limit. It is shown that a conditional statistical equilibrium, where a number of degrees of freedom are thermalized while others are still relaxing, prevails in most of these reactions. The individual degrees of freedom that have been explored experimentally are considered in their statistical equilibrium limit, and the extent to which they appear to be thermalized is discussed. The interaction between degrees of freedom on their way towards equilibrium is shown to create complex feedback phenomena that may lead to self-regulation. A possible example of self-regulation is shown for the process of energy partition between fragments promoted by particle exchange. 35 references
A New Approach to Monte Carlo Simulations in Statistical Physics
Landau, David P.
2002-08-01
Monte Carlo simulations [1] have become a powerful tool for the study of diverse problems in statistical/condensed matter physics. Standard methods sample the probability distribution for the states of the system, most often in the canonical ensemble, and over the past several decades enormous improvements have been made in performance. Nonetheless, difficulties arise near phase transitions, due to critical slowing down near 2nd-order transitions and to metastability near 1st-order transitions, and these complications limit the applicability of the method. We shall describe a new Monte Carlo approach [2] that uses a random walk in energy space to determine the density of states directly. Once the density of states is known, all thermodynamic properties can be calculated. This approach can be extended to multi-dimensional parameter spaces and should be effective for systems with complex energy landscapes, e.g., spin glasses, protein folding models, etc. Generalizations should produce a broadly applicable optimization tool. 1. A Guide to Monte Carlo Simulations in Statistical Physics, D. P. Landau and K. Binder (Cambridge U. Press, Cambridge, 2000). 2. Fugao Wang and D. P. Landau, Phys. Rev. Lett. 86, 2050 (2001); Phys. Rev. E64, 056101-1 (2001).
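A condensed sketch of the energy-space random walk of Ref. [2] for the 2D Ising model (parameters are loosened for illustration; production runs push the modification factor much closer to 1 and use stricter flatness checks):

```python
import numpy as np

rng = np.random.default_rng(4)

def wang_landau_ising(L=8, ln_f_final=1e-4, flatness=0.8, sweep=10_000):
    """Wang-Landau sampling for the 2D periodic Ising model: a random walk
    in energy space that estimates ln g(E), the log density of states,
    from which all thermodynamic properties follow."""
    N = L * L
    spins = rng.choice([-1, 1], size=(L, L))
    E = -int(np.sum(spins * np.roll(spins, 1, 0))
             + np.sum(spins * np.roll(spins, 1, 1)))
    n_bins = N + 1                        # energies -2N, -2N+4, ..., 2N
    ln_g = np.zeros(n_bins)
    hist = np.zeros(n_bins)
    ln_f = 1.0
    while ln_f > ln_f_final:
        for _ in range(sweep):
            i, j = rng.integers(L), rng.integers(L)
            dE = 2 * spins[i, j] * (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
                                    + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
            b_old, b_new = (E + 2 * N) // 4, (E + dE + 2 * N) // 4
            # Accept the flip with probability min(1, g(E_old) / g(E_new)).
            if np.log(rng.random()) < ln_g[b_old] - ln_g[b_new]:
                spins[i, j] *= -1
                E += dE
            b = (E + 2 * N) // 4
            ln_g[b] += ln_f               # refine the density-of-states estimate
            hist[b] += 1
        visited = hist > 0
        if hist[visited].min() > flatness * hist[visited].mean():
            hist[:] = 0                   # histogram flat enough: halve ln f
            ln_f /= 2.0
    return ln_g

ln_g = wang_landau_ising(L=4)             # small lattice keeps the demo fast
```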
Inductive, Analogical, and Communicative Generalization
Directory of Open Access Journals (Sweden)
Adri Smaling
2003-03-01
Full Text Available Three forms of inductive generalization - statistical generalization, variation-based generalization and theory-carried generalization - are insufficient concerning case-to-case generalization, which is a form of analogical generalization. The quality of case-to-case generalization needs to be reinforced by setting up explicit analogical argumentation. To evaluate analogical argumentation, six criteria are discussed. Good analogical reasoning is an indispensable support to forms of communicative generalization - receptive and responsive (participative) generalization - as well as exemplary generalization.
Complexity, Metastability and Nonextensivity
Beck, C.; Benedek, G.; Rapisarda, A.; Tsallis, C.
Work and heat fluctuations in systems with deterministic and stochastic forces / E. G. D. Cohen and R. Van Zon -- Is the entropy S_q extensive or nonextensive? / C. Tsallis -- Superstatistics: recent developments and applications / C. Beck -- Two stories outside Boltzmann-Gibbs statistics: Mori's Q-phase transitions and glassy dynamics at the onset of chaos / A. Robledo, F. Baldovin and E. Mayoral -- Time-averages and the heat theorem / A. Carati -- Fundamental formulae and numerical evidences for the central limit theorem in Tsallis statistics / H. Suyari -- Generalizing the Planck distribution / A. M. C. Soma and C. Tsallis -- The physical roots of complexity: renewal or modulation? / P. Grigolini -- Nonequivalent ensembles and metastability / H. Touchette and R. S. Ellis -- Statistical physics for cosmic structures / L. Pietronero and F. Sylos Labini -- Metastability and anomalous behavior in the HMF model: connections to nonextensive thermodynamics and glassy dynamics / A. Pluchino, A. Rapisarda and V. Latora -- Vlasov analysis of relaxation and meta-equilibrium / C. Anteneodo and R. O. Vallejos -- Weak chaos in large conservative systems - infinite-range coupled standard maps / L. G. Moyano, A. P. Majtey and C. Tsallis -- Deterministic aging / E. Barkai -- Edge of chaos of the classical kicked top map: sensitivity to initial conditions / S. M. Duarte Queirós and C. Tsallis -- What entropy at the edge of chaos? / M. Lissia, M. Coraddu and R. Tonelli -- Fractal growth of carbon schwarzites / G. Benedek ... [et al.] -- Clustering and interface propagation in interacting particle dynamics / A. Provata and V. K. Noussiou -- Resonant activation and noise enhanced stability in Josephson junctions / A. L. Pankratov and B. Spagnolo -- Symmetry breaking induced directed motions / C.-H. Chang and T. Y. Tsong -- General theory of Galilean-invariant entropic lattice Boltzmann models / B. M. Boghosian -- Unifying approach to the jamming transition in granular media and
Diabetic retinopathy and complexity of retinal surgery in a general hospital.
Mijangos-Medina, Laura Fanny; Hurtado-Noriega, Blanca Esmeralda; Lima-Gómez, Virgilio
2012-01-01
Usual retinal surgery (vitrectomy or surgery for retinal detachment) may require additional procedures to deal with complex cases, which increase time and resource use and delay access to treatment. We undertook this study to identify the proportion of primary retinal surgeries that required complex procedures and the associated causes. We carried out an observational, descriptive, cross-sectional, retrospective study. Patients with primary retinal surgery were evaluated (January 2007-December 2010). The proportion and 95% confidence intervals (CI) of preoperative diagnosis and cause of the disease requiring retinal surgery, as well as the causes for complex retinal surgery, were identified. Complex retinal surgery was defined as that requiring lens extraction, intraocular lens implantation, heavy perfluorocarbon liquids, silicone oil tamponade or intravitreal drugs, in addition to the usual surgical retinal procedure. The proportion of complex retinal surgeries was compared among preoperative diagnoses and among causes (χ(2), odds ratio [OR]). We studied 338 eyes. Mean age of subjects was 53.7 years, and 49% were female. The most common diagnoses were vitreous hemorrhage (27.2%) and rhegmatogenous retinal detachment (24.6%). The most common cause was diabetes (50.6%); 273 eyes required complex surgery (80.8%, 95% CI: 76.6-85). The proportion did not differ among diagnoses but was higher in diabetic retinopathy (89%); diabetic retinopathy increased by 3-fold the probability of requiring these complex procedures. Early treatment of diabetic retinopathy may reduce the proportion of complex retinal surgery by 56%.
Gelfand, I M; Graev, M I; Vilenkin, N Y; Pyatetskii-Shapiro, I I
Volume 1 is devoted to the basics of the theory of generalized functions. The first chapter contains the main definitions and most important properties of generalized functions as functionals on the space of smooth functions with compact support. The second chapter discusses the Fourier transform of generalized functions. In Chapter 3, definitions and properties of some important classes of generalized functions are discussed; in particular, generalized functions supported on submanifolds of lower dimension, generalized functions associated with quadratic forms, and homogeneous generalized functions are studied in detail. Many simple basic examples make this book an excellent place for a novice to get acquainted with the theory of generalized functions. A long appendix presents the basics of generalized functions of complex variables.
A statistical mechanical approach to restricted integer partition functions
Zhou, Chi-Chun; Dai, Wu-Sheng
2018-05-01
The main aim of this paper is twofold: (1) suggesting a statistical mechanical approach to the calculation of the generating function of restricted integer partition functions which count the number of partitions—a way of writing an integer as a sum of other integers under certain restrictions. In this approach, the generating function of restricted integer partition functions is constructed from the canonical partition functions of various quantum gases. (2) Introducing a new type of restricted integer partition functions corresponding to general statistics which is a generalization of Gentile statistics in statistical mechanics; many kinds of restricted integer partition functions are special cases of this restricted integer partition function. Moreover, with statistical mechanics as a bridge, we reveal a mathematical fact: the generating function of restricted integer partition function is just the symmetric function which is a class of functions being invariant under the action of permutation groups. Using this approach, we provide some expressions of restricted integer partition functions as examples.
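The generating-function reading can be made concrete with a few lines of dynamic programming (our illustration; the `max_copies` bound mimics the Gentile-statistics restriction of at most a fixed number of identical parts):

```python
def restricted_partitions(n, parts, max_copies=None):
    """Number of ways to write n as a sum of the given parts, each part used
    at most `max_copies` times (None = no limit). This reads off the
    coefficient of x^n in prod_p (1 + x^p + ... + x^{c*p}), the
    canonical-partition-function form of the generating function."""
    coeffs = [1] + [0] * n                       # polynomial "1"
    for p in parts:
        new = [0] * (n + 1)
        top = n // p if max_copies is None else min(max_copies, n // p)
        for k in range(n + 1):
            if coeffs[k]:
                for c in range(top + 1):
                    if k + c * p <= n:
                        new[k + c * p] += coeffs[k]
        coeffs = new
    return coeffs[n]

print(restricted_partitions(10, range(1, 11)))                 # 42: all partitions of 10
print(restricted_partitions(10, range(1, 11), max_copies=1))   # 10: distinct parts only
```

The unlimited case corresponds to the Bose-like product 1/(1 - x^p), and max_copies=1 to the Fermi-like product (1 + x^p), matching the correspondence the paper draws between quantum gases and partition counting.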
Applied statistics for social and management sciences
Miah, Abdul Quader
2016-01-01
This book addresses the application of statistical techniques and methods across a wide range of disciplines. While its main focus is on the application of statistical methods, theoretical aspects are also provided as fundamental background information. It offers a systematic interpretation of results often discovered in general descriptions of methods and techniques such as linear and non-linear regression. SPSS is also used in all the application aspects. The presentation of data in the form of tables and graphs throughout the book not only guides users, but also explains the statistical application and assists readers in interpreting important features. The analysis of statistical data is presented consistently throughout the text. Academic researchers, practitioners and other users who work with statistical data will benefit from reading Applied Statistics for Social and Management Sciences.
Siddiqui, Maheen; Wedemann, Roseli S.; Jensen, Henrik Jeldtoft
2018-01-01
We explore statistical characteristics of avalanches associated with the dynamics of a complex-network model, where two modules corresponding to sensorial and symbolic memories interact, representing unconscious and conscious mental processes. The model illustrates Freud's ideas regarding the neuroses and the view that consciousness is related to symbolic and linguistic memory activity in the brain. It incorporates the Stariolo-Tsallis generalization of the Boltzmann Machine in order to model memory retrieval and associativity. In the present work, we define and measure avalanche size distributions during memory retrieval, in order to gain insight regarding basic aspects of the functioning of these complex networks. The avalanche sizes defined for our model should be related to the time consumed and also to the size of the neuronal region which is activated during memory retrieval. This allows the qualitative comparison of the behaviour of the distribution of cluster sizes, obtained during fMRI measurements of the propagation of signals in the brain, with the distribution of avalanche sizes obtained in our simulation experiments. This comparison corroborates the indication that the Nonextensive Statistical Mechanics formalism may indeed be better suited to model the complex networks which constitute brain and mental structure.
Statistical ecology comes of age
Gimenez, Olivier; Buckland, Stephen T.; Morgan, Byron J. T.; Bez, Nicolas; Bertrand, Sophie; Choquet, Rémi; Dray, Stéphane; Etienne, Marie-Pierre; Fewster, Rachel; Gosselin, Frédéric; Mérigot, Bastien; Monestiez, Pascal; Morales, Juan M.; Mortier, Frédéric; Munoz, François; Ovaskainen, Otso; Pavoine, Sandrine; Pradel, Roger; Schurr, Frank M.; Thomas, Len; Thuiller, Wilfried; Trenkel, Verena; de Valpine, Perry; Rexstad, Eric
2014-01-01
The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1–4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data. PMID:25540151
Wang, W. L.; Tsui, K. L.; Lo, S. M.; Liu, S. B.
2018-01-01
Crowded transportation hubs such as metro stations are thought to be ideal places for the development and spread of epidemics. However, because of their complex spatial layouts and confined environments with large numbers of highly mobile individuals, it is difficult to quantify human contacts in such settings, and disease spreading dynamics there were less explored in previous studies. Owing to the heterogeneity and dynamic nature of human interactions, a growing number of studies have shown the importance of contact distance and length of contact in transmission probabilities. In this study, we show how detailed information on contact and exposure patterns can be obtained by statistical analyses of microscopic crowd simulation data. Specifically, a pedestrian simulation model, CityFlow, was employed to reproduce individuals' movements in a metro station based on site survey data, and the values and distributions of individual contact rate and exposure in different simulation cases were obtained and analyzed. Interestingly, a Weibull distribution fitted the histogram values of individual-based exposure in each case very well. Moreover, we found that both individual contact rate and exposure had a linear relationship with the average crowd densities of the environments. The results obtained in this paper can provide a reference for epidemic studies in complex and confined transportation hubs and help refine existing disease spreading models.
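A hedged sketch of the distribution-fitting step (synthetic exposure values stand in for CityFlow output, and the Weibull parameters below are arbitrary):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Stand-in for individual exposure values from a crowd simulation
# (synthetic here; in the paper these come from CityFlow runs).
exposure = rng.weibull(1.8, size=2_000) * 3.0

# Fit a two-parameter Weibull (location fixed at 0) and test the fit.
shape, loc, scale = stats.weibull_min.fit(exposure, floc=0)
ks = stats.kstest(exposure, 'weibull_min', args=(shape, loc, scale))
print(f"shape={shape:.2f} scale={scale:.2f}  KS p-value={ks.pvalue:.3f}")
```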
Liu, Jinping; Tang, Zhaohui; Zhang, Jin; Chen, Qing; Xu, Pengfei; Liu, Wenzhong
2016-01-01
Computer vision, as a fast, low-cost, noncontact, and online monitoring technology, has been an important tool to inspect product quality, particularly on a large-scale assembly production line. However, the current industrial vision system is far from satisfactory in the intelligent perception of complex grain images, comprising a large number of local homogeneous fragmentations or patches without distinct foreground and background. We attempt to solve this problem based on the statistical modeling of spatial structures of grain images. We first present a physical explanation to indicate that the spatial structures of the complex grain images are subject to a representative Weibull distribution according to the theory of sequential fragmentation, which is well known in the continued comminution of ore grinding. To delineate the spatial structure of the grain image, we present a method of multiscale and omnidirectional Gaussian derivative filtering. Then, a product quality classifier based on sparse multikernel-least squares support vector machine is proposed to solve the low-confidence classification problem of imbalanced data distribution. The proposed method is applied on the assembly line of a food-processing enterprise to automatically classify (or identify) the production quality of rice. The experiments on the real application case, compared with the commonly used methods, illustrate the validity of our method.
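The filtering step can be sketched as follows (our own illustrative filter bank using scipy; the paper's exact filter design may differ): first-order Gaussian derivatives at several scales, steered to several orientations.

```python
import numpy as np
from scipy import ndimage

def gaussian_derivative_bank(image, sigmas=(1, 2, 4), n_orientations=4):
    """Multiscale, multi-orientation first-order Gaussian derivative
    responses: the x/y derivatives are steered to each orientation."""
    responses = []
    for sigma in sigmas:
        gx = ndimage.gaussian_filter(image, sigma, order=(0, 1))  # d/dx
        gy = ndimage.gaussian_filter(image, sigma, order=(1, 0))  # d/dy
        for k in range(n_orientations):
            theta = k * np.pi / n_orientations
            responses.append(np.cos(theta) * gx + np.sin(theta) * gy)
    return np.stack(responses)

image = np.random.default_rng(6).random((128, 128))
print(gaussian_derivative_bank(image).shape)   # (12, 128, 128)
```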
Smith, Stephen D
2011-01-01
This book is intended as an overview of a research area that combines geometries for groups (such as Tits buildings and generalizations), topological aspects of simplicial complexes from p-subgroups of a group (in the spirit of Brown, Quillen, and Webb), and combinatorics of partially ordered sets. The material is intended to serve as an advanced graduate-level text and partly as a general reference on the research area. The treatment offers optional tracks for the reader interested in buildings, geometries for sporadic simple groups, and G-equivariant equivalences and homology for subgroup complexes.
A simple and fast representation space for classifying complex time series
International Nuclear Information System (INIS)
Zunino, Luciano; Olivares, Felipe; Bariviera, Aurelio F.; Rosso, Osvaldo A.
2017-01-01
In the context of time series analysis, considerable effort has been directed towards the implementation of efficient discriminating statistical quantifiers. Very recently, a simple and fast representation space has been introduced, namely the number of turning points versus the Abbe value. It is able to separate time series from stationary and non-stationary processes with long-range dependences. In this work we show that this bidimensional approach is useful for distinguishing complex time series: different sets of financial and physiological data are efficiently discriminated. Additionally, a multiscale generalization that takes into account the multiple time scales often involved in complex systems has also been proposed. This multiscale analysis is essential to reach a higher discriminative power between physiological time series in health and disease. - Highlights: • A bidimensional scheme has been tested for classification purposes. • A multiscale generalization is introduced. • Several practical applications confirm its usefulness. • Different sets of financial and physiological data are efficiently distinguished. • This multiscale bidimensional approach has high potential as a discriminative tool.
A simple and fast representation space for classifying complex time series
Energy Technology Data Exchange (ETDEWEB)
Zunino, Luciano, E-mail: lucianoz@ciop.unlp.edu.ar [Centro de Investigaciones Ópticas (CONICET La Plata – CIC), C.C. 3, 1897 Gonnet (Argentina); Departamento de Ciencias Básicas, Facultad de Ingeniería, Universidad Nacional de La Plata (UNLP), 1900 La Plata (Argentina); Olivares, Felipe, E-mail: olivaresfe@gmail.com [Instituto de Física, Pontificia Universidad Católica de Valparaíso (PUCV), 23-40025 Valparaíso (Chile); Bariviera, Aurelio F., E-mail: aurelio.fernandez@urv.cat [Department of Business, Universitat Rovira i Virgili, Av. Universitat 1, 43204 Reus (Spain); Rosso, Osvaldo A., E-mail: oarosso@gmail.com [Instituto de Física, Universidade Federal de Alagoas (UFAL), BR 104 Norte km 97, 57072-970, Maceió, Alagoas (Brazil); Instituto Tecnológico de Buenos Aires (ITBA) and CONICET, C1106ACD, Av. Eduardo Madero 399, Ciudad Autónoma de Buenos Aires (Argentina); Complex Systems Group, Facultad de Ingeniería y Ciencias Aplicadas, Universidad de los Andes, Av. Mons. Álvaro del Portillo 12.455, Las Condes, Santiago (Chile)
2017-03-18
In the context of time series analysis, considerable effort has been directed towards the implementation of efficient discriminating statistical quantifiers. Very recently, a simple and fast representation space has been introduced, namely the number of turning points versus the Abbe value. It is able to separate time series from stationary and non-stationary processes with long-range dependences. In this work we show that this bidimensional approach is useful for distinguishing complex time series: different sets of financial and physiological data are efficiently discriminated. Additionally, a multiscale generalization that takes into account the multiple time scales often involved in complex systems has also been proposed. This multiscale analysis is essential to reach a higher discriminative power between physiological time series in health and disease. - Highlights: • A bidimensional scheme has been tested for classification purposes. • A multiscale generalization is introduced. • Several practical applications confirm its usefulness. • Different sets of financial and physiological data are efficiently distinguished. • This multiscale bidimensional approach has high potential as a discriminative tool.
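The two coordinates of this representation space are simple to compute; a minimal sketch (function names are ours) for the number of turning points and the Abbe value:

```python
import numpy as np

def turning_points(x):
    """Count local extrema: points where the sign of the increment changes."""
    dx = np.diff(x)
    dx = dx[dx != 0]                      # ignore flat steps
    return int(np.sum(dx[1:] * dx[:-1] < 0))

def abbe_value(x):
    """Abbe value: half the mean squared successive difference divided by
    the variance. Near 1 for white noise, small for strongly correlated
    or non-stationary series."""
    x = np.asarray(x, dtype=float)
    return 0.5 * np.mean(np.diff(x) ** 2) / np.var(x)

rng = np.random.default_rng(7)
white = rng.normal(size=5_000)
walk = np.cumsum(white)                   # non-stationary, long-range dependent
for name, s in [("white noise", white), ("random walk", walk)]:
    print(name, turning_points(s), round(abbe_value(s), 3))
```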
Fade statistics of M-turbulent optical links
DEFF Research Database (Denmark)
Jurado-Navas, Antonio; Maria Garrido-Balsells, Jose; Castillo-Vazquez, Miguel
2017-01-01
A new and generalized statistical model, called Malaga or simply M distribution, has been derived recently to characterize the irradiance fluctuations of an unbounded optical wavefront propagating through a turbulent medium under all irradiance fluctuation conditions. The aforementioned model extends and unifies in a simple analytical closed-form expression most of the proposed statistical models for free-space optical (FSO) communications widely employed until now in the scientific literature. Based on that M model, we have studied some important features associated with its fade statistics...
Small violations of particle statistics
International Nuclear Information System (INIS)
Greenberg, O.W.
1992-01-01
This paper reports on the particle statistics menagerie for identical particles (in 3 + 1 dimensions) which consists of fermions (all states totally antisymmetric), bosons (all states totally symmetric), parafermions of order p (all representations of the symmetric group with Young tableaux having at most p boxes in a row) and parabosons of order p (all representations with at most p boxes in a column). p = 1 for parafermions is the same as Fermi, and p = 1 for parabosons is the same as Bose. These possibilities were derived in a general way by Doplicher, Haag and Roberts, who found one other case, infinite statistics for which all representations of the symmetric group occur, but did not give an algebra which leads to this statistics
Statistical Description of Segregation in a Powder Mixture
DEFF Research Database (Denmark)
Chapiro, Alexander; Stenby, Erling Halfdan
1996-01-01
In this paper we apply the statistical mechanics of powders to describe a segregated state in a mixture of grains of different sizes. Variation of the density of a packing with depth arising due to changes of particle configurations is studied. The statistical mechanics of powders is generalized...
Statistical Model of Extreme Shear
DEFF Research Database (Denmark)
Hansen, Kurt Schaldemose; Larsen, Gunner Chr.
2005-01-01
In order to continue cost-optimisation of modern large wind turbines, it is important to continuously increase the knowledge of wind field parameters relevant to design loads. This paper presents a general statistical model that offers site-specific prediction of the probability density function...... by a model that, on a statistically consistent basis, describes the most likely spatial shape of an extreme wind shear event. Predictions from the model have been compared with results from an extreme value data analysis, based on a large number of full-scale measurements recorded with a high sampling rate...
What do we gain from simplicity versus complexity in species distribution models?
Merow, Cory; Smith, Matthew J.; Edwards, Thomas C.; Guisan, Antoine; McMahon, Sean M.; Normand, Signe; Thuiller, Wilfried; Wuest, Rafael O.; Zimmermann, Niklaus E.; Elith, Jane
2014-01-01
Species distribution models (SDMs) are widely used to explain and predict species ranges and environmental niches. They are most commonly constructed by inferring species' occurrence–environment relationships using statistical and machine-learning methods. The variety of methods that can be used to construct SDMs (e.g. generalized linear/additive models, tree-based models, maximum entropy, etc.), and the variety of ways that such models can be implemented, permits substantial flexibility in SDM complexity. Building models with an appropriate amount of complexity for the study objectives is critical for robust inference. We characterize complexity as the shape of the inferred occurrence–environment relationships and the number of parameters used to describe them, and search for insights into whether additional complexity is informative or superfluous. By building ‘under fit’ models, having insufficient flexibility to describe observed occurrence–environment relationships, we risk misunderstanding the factors shaping species distributions. By building ‘over fit’ models, with excessive flexibility, we risk inadvertently ascribing pattern to noise or building opaque models. However, model selection can be challenging, especially when comparing models constructed under different modeling approaches. Here we argue for a more pragmatic approach: researchers should constrain the complexity of their models based on study objective, attributes of the data, and an understanding of how these interact with the underlying biological processes. We discuss guidelines for balancing under fitting with over fitting and consequently how complexity affects decisions made during model building. Although some generalities are possible, our discussion reflects differences in opinions that favor simpler versus more complex models. We conclude that combining insights from both simple and complex SDM building approaches best advances our knowledge of current and future species
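To make the under-/over-fitting trade-off concrete, here is a toy sketch (not the authors' protocol): presence/absence data are simulated from a unimodal response to a single environmental gradient, and logistic models of increasing polynomial degree are compared by AIC.

```python
import numpy as np

def fit_logistic(X, y, n_iter=2000, lr=0.5):
    """Plain gradient-ascent logistic regression (illustration only)."""
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w += lr * X.T @ (y - p) / len(y)
    return w

rng = np.random.default_rng(1)
env = rng.uniform(-2, 2, 500)                  # one environmental gradient
p_true = 1 / (1 + np.exp(-(1.5 - env ** 2)))   # unimodal true response
occ = rng.binomial(1, p_true)                  # simulated presence/absence

for degree in (1, 2, 8):                       # under-, well-, over-specified
    X = np.vander(env, degree + 1, increasing=True)
    X[:, 1:] = (X[:, 1:] - X[:, 1:].mean(0)) / X[:, 1:].std(0)  # standardize
    w = fit_logistic(X, occ)
    p = np.clip(1 / (1 + np.exp(-X @ w)), 1e-9, 1 - 1e-9)
    ll = np.sum(occ * np.log(p) + (1 - occ) * np.log(1 - p))
    aic = 2 * (degree + 1) - 2 * ll
    print(f"degree {degree}: log-likelihood {ll:.1f}, AIC {aic:.1f}")
```

A degree-1 model cannot represent the unimodal niche at all, while the degree-8 model pays an AIC penalty for flexibility the data do not support.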
Statistical analysis of magnetically soft particles in magnetorheological elastomers
Gundermann, T.; Cremer, P.; Löwen, H.; Menzel, A. M.; Odenbach, S.
2017-04-01
The physical properties of magnetorheological elastomers (MRE) are a complex issue and can be influenced and controlled in many ways, e.g. by applying a magnetic field, by external mechanical stimuli, or by an electric potential. In general, the response of MRE materials to these stimuli depends crucially on the distribution of the magnetic particles inside the elastomer. Specific knowledge of the interactions between particles or particle clusters is of high relevance for understanding the macroscopic rheological properties and provides an important input for theoretical calculations. In order to gain better insight into the correlation between macroscopic effects and microstructure, and to generate a database for theoretical analysis, x-ray micro-computed tomography (X-μCT) investigations were carried out as a basis for a statistical analysis of the particle configurations. Different MREs with quantities of 2-15 wt% (0.27-2.3 vol%) of iron powder and different arrangements of the particles inside the matrix were prepared. The X-μCT results were processed with image analysis software to extract the geometrical properties of the particles, with and without the influence of an external magnetic field. Pair correlation functions for the positions of the particles inside the elastomer were calculated to statistically characterize the distributions of the particles in the samples.
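A pair correlation function of this kind can be computed directly from particle-centre coordinates. The sketch below assumes a periodic box and the usual ideal-gas normalisation; it is illustrative and not the authors' processing pipeline.

```python
import numpy as np

def pair_correlation(points, box, dr=2.0, r_max=None):
    """Radial pair correlation g(r) for particle centres in a periodic box.
    points: (N, 3) coordinates; box: (3,) edge lengths."""
    points, box = np.asarray(points, float), np.asarray(box, float)
    n = len(points)
    if r_max is None:
        r_max = box.min() / 2.0
    diff = points[:, None, :] - points[None, :, :]
    diff -= box * np.round(diff / box)               # minimum-image convention
    dist = np.sqrt((diff ** 2).sum(-1))[np.triu_indices(n, k=1)]
    edges = np.arange(0.0, r_max + dr, dr)
    counts, _ = np.histogram(dist, bins=edges)
    shell_vol = 4.0 / 3.0 * np.pi * (edges[1:] ** 3 - edges[:-1] ** 3)
    density = n / box.prod()
    g = counts / (0.5 * n * density * shell_vol)     # normalise by ideal gas
    return 0.5 * (edges[1:] + edges[:-1]), g

rng = np.random.default_rng(2)
pts = rng.uniform(0, 50, size=(400, 3))   # random, ideal-gas-like centres
r, g = pair_correlation(pts, box=(50, 50, 50))
print(np.round(g, 2))                     # approximately 1 at all r
```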
Projection operator techniques in nonequilibrium statistical mechanics
International Nuclear Information System (INIS)
Grabert, H.
1982-01-01
This book is an introduction to the application of the projection operator technique to the statistical mechanics of irreversible processes. After a general introduction to the projection operator technique and statistical thermodynamics, the Fokker-Planck and master equation approaches are described, together with response theory. Then, as applications, the damped harmonic oscillator, simple fluids, and spin relaxation are considered. (HSI)
Statistical mechanics of low-density parity-check codes
Energy Technology Data Exchange (ETDEWEB)
Kabashima, Yoshiyuki [Department of Computational Intelligence and Systems Science, Tokyo Institute of Technology, Yokohama 2268502 (Japan); Saad, David [Neural Computing Research Group, Aston University, Birmingham B4 7ET (United Kingdom)
2004-02-13
We review recent theoretical progress on the statistical mechanics of error correcting codes, focusing on low-density parity-check (LDPC) codes in general, and on Gallager and MacKay-Neal codes in particular. By exploiting the relation between LDPC codes and Ising spin systems with multi-spin interactions, one can carry out a statistical mechanics based analysis that determines the practical and theoretical limitations of various code constructions, corresponding to dynamical and thermodynamical transitions, respectively, as well as the behaviour of error-exponents averaged over the corresponding code ensemble as a function of channel noise. We also contrast the results obtained using methods of statistical mechanics with those derived in the information theory literature, and show how these methods can be generalized to include other channel types and related communication problems. (topical review)
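As a minimal illustration of the decoding task being analysed, the sketch below runs Gallager-style hard-decision bit-flipping on a toy parity-check matrix. The matrix is invented for the example; real LDPC codes are much larger and sparser, and the statistical-mechanics analysis maps the code bits onto Ising spins with multi-spin couplings.

```python
import numpy as np

# Toy parity-check matrix H: each row is one parity constraint.
H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 0, 0, 1, 1],
              [0, 0, 1, 1, 0, 1]], dtype=int)

def bit_flip_decode(H, y, max_iter=20):
    """Hard-decision bit-flipping: repeatedly flip the bits that participate
    in the largest number of unsatisfied parity checks."""
    y = y.copy()
    for _ in range(max_iter):
        syndrome = H @ y % 2              # which checks currently fail
        if not syndrome.any():
            return y, True                # all checks satisfied
        votes = H.T @ syndrome            # unsatisfied checks per bit
        y[votes == votes.max()] ^= 1      # flip the worst offenders
    return y, False

received = np.zeros(6, dtype=int)         # the all-zero word is a codeword
received[2] ^= 1                          # one channel flip
decoded, ok = bit_flip_decode(H, received)
print(decoded, ok)                        # recovers the all-zero codeword
```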
Comparison of nonstationary generalized logistic models based on Monte Carlo simulation
Directory of Open Access Journals (Sweden)
S. Kim
2015-06-01
Full Text Available Recently, evidence of climate change has been observed in hydrologic data such as rainfall and flow data. The time-dependent characteristics of statistics in hydrologic data are widely referred to as nonstationarity. Various nonstationary GEV and generalized Pareto models have therefore been suggested for frequency analysis of nonstationary annual maximum and POT (peak-over-threshold) data, respectively. However, alternative models are required for nonstationary frequency analysis in order to capture the complex characteristics of nonstationary data under climate change. This study proposes a nonstationary generalized logistic model including time-dependent parameters. The parameters of the proposed model are estimated by the method of maximum likelihood, based on the Newton-Raphson method. In addition, the proposed models are compared by Monte Carlo simulation to investigate their characteristics and applicability.
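A hedged sketch of such a model: the negative log-likelihood of a generalized logistic (GLO) distribution whose location parameter drifts linearly in time, fitted to synthetic annual maxima. Hosking's GLO parameterisation is assumed, and a derivative-free optimiser is used here for brevity in place of the paper's Newton-Raphson scheme.

```python
import numpy as np
from scipy.optimize import minimize

def glo_neg_loglik(theta, x, t):
    """Negative log-likelihood of a GLO model with a time-dependent location
    xi(t) = xi0 + xi1*t (Hosking's parameterisation, assumed here)."""
    xi0, xi1, alpha, kappa = theta
    if alpha <= 0:
        return np.inf
    z = (x - (xi0 + xi1 * t)) / alpha
    if abs(kappa) < 1e-8:
        y = z
    else:
        arg = 1.0 - kappa * z
        if np.any(arg <= 0):
            return np.inf                 # data outside the support
        y = -np.log(arg) / kappa
    return np.sum(np.log(alpha) + (1.0 - kappa) * y
                  + 2.0 * np.logaddexp(0.0, -y))

# Synthetic annual maxima with a linear trend in location (kappa = 0 case).
rng = np.random.default_rng(3)
t = np.arange(50.0)
u = rng.uniform(size=t.size)
x = (10.0 + 0.05 * t) + 2.0 * np.log(u / (1.0 - u))

res = minimize(glo_neg_loglik, x0=[x.mean(), 0.0, x.std(), 0.1],
               args=(x, t), method="Nelder-Mead")
print(res.x)   # estimates of (xi0, xi1, alpha, kappa)
```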
Density by Moduli and Lacunary Statistical Convergence
Directory of Open Access Journals (Sweden)
Vinod K. Bhardwaj
2016-01-01
Full Text Available We have introduced and studied a new concept of f-lacunary statistical convergence, where f is an unbounded modulus. It is shown that, under certain conditions on a modulus f, the concepts of lacunary strong convergence with respect to a modulus f and f-lacunary statistical convergence are equivalent on bounded sequences. We further characterize those θ for which S_θ^f = S^f, where S_θ^f and S^f denote the sets of all f-lacunary statistically convergent sequences and f-statistically convergent sequences, respectively. A general description of inclusion between two arbitrary lacunary methods of f-statistical convergence is given. Finally, we give an S_θ^f-analogue of the Cauchy criterion for convergence, and a Tauberian theorem for S_θ^f-convergence is also proved.
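For reference, the central definition can be stated as follows; the notation is reconstructed from the abstract and assumed to match the paper's.

```latex
% f-lacunary statistical convergence (reconstructed statement).
% theta = (k_r) is a lacunary sequence, I_r = (k_{r-1}, k_r],
% h_r = k_r - k_{r-1}, and f is an unbounded modulus.
\[
  x_k \to L \ (S_\theta^f) \iff
  \lim_{r \to \infty} \frac{1}{f(h_r)}\,
  f\!\left( \left| \{ k \in I_r : |x_k - L| \ge \varepsilon \} \right| \right) = 0
  \quad \text{for every } \varepsilon > 0 .
\]
```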
Mantovani, Daniela; Sutherland, Holly
2003-01-01
This paper reports an exercise to validate EUROMOD output for 1998 by comparing income statistics calculated from the baseline micro-output with comparable statistics from other sources, including the European Community Household Panel. The main potential reasons for discrepancies are identified. While there are some specific national issues that arise, there are two main general points to consider in interpreting EUROMOD estimates of social indicators across EU member States: (a) the method ...
Statistical considerations on safety analysis
International Nuclear Information System (INIS)
Pal, L.; Makai, M.
2004-01-01
The authors have investigated the statistical methods applied to safety analysis of nuclear reactors and arrived at alarming conclusions: a series of calculations with the generally appreciated safety code ATHLET was carried out to ascertain the stability of the results against input uncertainties in a simple experimental situation. Scrutinizing those calculations, the authors came to the conclusion that the ATHLET results may exhibit chaotic behavior. A further conclusion is that the technological limits are incorrectly set when the output variables are correlated. Another formerly unnoticed conclusion of the previous ATHLET calculations is that certain innocent-looking parameters (like the wall roughness factor, the number of bubbles per unit volume, or the number of droplets per unit volume) can considerably influence such output parameters as water levels. The authors are concerned with the statistical foundation of present-day safety analysis practices and can only hope that their own misjudgment will be dispelled. Until then, the authors suggest applying correct statistical methods in safety analysis even if it makes the analysis more expensive. It would be desirable to continue exploring the role of internal parameters (wall roughness factor, steam-water surface in thermal hydraulics codes, homogenization methods in neutronics codes) in system safety codes and to study their effects on the analysis. In the validation and verification process of a code one carries out a series of computations. The input data are not precisely determined: measured data have an error, and calculated data are often obtained from a more or less accurate model. Some users of large codes are content with comparing the nominal output obtained from the nominal input, whereas all the possible inputs should be taken into account when judging safety. At the same time, any statement concerning safety must be aleatory, and its merit can be judged only when the probability is known with which the
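The contrast between a single nominal run and a statistically founded analysis can be illustrated with a generic Monte Carlo propagation sketch; the "system code" below is a toy stand-in invented for the example, not ATHLET.

```python
import numpy as np

def toy_system_code(wall_roughness, bubble_density):
    """Stand-in for an expensive thermal-hydraulics run; returns a 'water
    level'. Purely illustrative: mildly nonlinear in innocuous inputs."""
    return (4.0 - 1.5 * np.tanh(8.0 * (wall_roughness - 0.02))
            + 0.3 * np.sin(bubble_density / 1e7))

rng = np.random.default_rng(8)
n_runs = 1000
# Sample the uncertain inputs instead of fixing them at nominal values.
roughness = rng.normal(0.02, 0.005, n_runs)   # nominal 0.02, 25% spread
bubbles = rng.normal(5e7, 1e7, n_runs)
levels = toy_system_code(roughness, bubbles)

nominal = toy_system_code(0.02, 5e7)
print(f"nominal output: {nominal:.3f}")
print(f"MC mean {levels.mean():.3f}, 5th-95th percentile: "
      f"{np.percentile(levels, 5):.3f} .. {np.percentile(levels, 95):.3f}")
```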
Directory of Open Access Journals (Sweden)
Peter E. Land
2018-05-01
Full Text Available Uncertainty estimation is crucial to establishing confidence in any data analysis, and this is especially true for Essential Climate Variables, including ocean colour. Methods for deriving uncertainty vary greatly across data types, so a generic statistics-based approach applicable to multiple data types is an advantage that simplifies the use and understanding of uncertainty data. Progress towards rigorous uncertainty analysis of ocean colour has been slow, in part because of the complexity of ocean colour processing. Here, we present a general approach to uncertainty characterisation, using a database of satellite-in situ matchups to generate a statistical model of satellite uncertainty as a function of its contributing variables. With an example NASA MODIS-Aqua chlorophyll-a matchup database, mostly covering the north Atlantic, we demonstrate a model that explains 67% of the squared error in log(chlorophyll-a) as a potentially correctable bias, with the remaining uncertainty characterised as a standard deviation and standard error at each pixel. The method is quite general, depending only on the existence of a suitable database of matchups or reference values, and can be applied to other sensors and data types, such as other satellite-observed Essential Climate Variables, empirical algorithms derived from in situ data, or even model data.
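A minimal sketch of the matchup-based idea on synthetic data: regress the satellite-minus-reference error on contributing variables, treat the fitted part as a correctable bias and the residual spread as the remaining uncertainty. Variable names and the linear bias model are illustrative assumptions, not the paper's actual model.

```python
import numpy as np

# Hypothetical matchup table: satellite and in situ log10(chl) plus two
# contributing variables (e.g. viewing angle, wind); names are illustrative.
rng = np.random.default_rng(4)
n = 2000
angle = rng.uniform(0, 60, n)
wind = rng.uniform(0, 15, n)
true_log_chl = rng.normal(-0.5, 0.5, n)
bias = 0.004 * angle - 0.01 * wind            # synthetic systematic error
sat_log_chl = true_log_chl + bias + rng.normal(0, 0.15, n)

# Model the error as a linear function of the contributing variables: the
# fitted part is a correctable bias, the residual spread is the uncertainty.
error = sat_log_chl - true_log_chl
X = np.column_stack([np.ones(n), angle, wind])
coef, *_ = np.linalg.lstsq(X, error, rcond=None)
residual = error - X @ coef
explained = 1 - residual.var() / error.var()
print(f"bias coefficients: {coef}")
print(f"fraction of squared error explained: {explained:.2f}")
print(f"remaining standard deviation: {residual.std():.3f}")
```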
Statistical Challenges in Modeling Big Brain Signals
Yu, Zhaoxia
2017-11-01
Brain signal data are inherently big: massive in amount, complex in structure, and high in dimensions. These characteristics impose great challenges for statistical inference and learning. Here we review several key challenges, discuss possible solutions, and highlight future research directions.
Emberson, Lauren L; Rubinstein, Dani Y
2016-08-01
The influence of statistical information on behavior (either through learning or adaptation) is quickly becoming foundational to many domains of cognitive psychology and cognitive neuroscience, from language comprehension to visual development. We investigate a central problem impacting these diverse fields: when encountering input with rich statistical information, are there any constraints on learning? This paper examines learning outcomes when adult learners are given statistical information across multiple levels of abstraction simultaneously: from abstract, semantic categories of everyday objects to individual viewpoints on these objects. After revealing statistical learning of abstract, semantic categories with scrambled individual exemplars (Exp. 1), participants viewed pictures where the categories as well as the individual objects predicted picture order (e.g., bird1-dog1, bird2-dog2). Our findings suggest that participants preferentially encode the relationships between the individual objects, even in the presence of statistical regularities linking semantic categories (Exps. 2 and 3). In a final experiment we investigate whether learners are biased towards learning object-level regularities or simply construct the most detailed model given the data (and are therefore best able to predict the specifics of the upcoming stimulus), by examining whether participants preferentially learn from the statistical regularities linking individual snapshots of objects or from the relationships between the objects themselves (e.g., bird_picture1-dog_picture1, bird_picture2-dog_picture2). We find that participants fail to learn the relationships between individual snapshots, suggesting a bias towards object-level statistical regularities as opposed to merely constructing the most complete model of the input. This work moves beyond the previous existence proofs that statistical learning is possible at both very high and very low levels of abstraction (categories vs. individual
Statistical time lags in ac discharges
Energy Technology Data Exchange (ETDEWEB)
Sobota, A; Kanters, J H M; Van Veldhuizen, E M; Haverlag, M [Eindhoven University of Technology, Department of Applied Physics, Postbus 513, 5600MB Eindhoven (Netherlands); Manders, F, E-mail: a.sobota@tue.nl [Philips Lighting, LightLabs, Mathildelaan 1, 5600JM Eindhoven (Netherlands)
2011-04-06
The paper presents statistical time lags measured for breakdown events in near-atmospheric pressure argon and xenon. Ac voltage at 100, 400 and 800 kHz was used to drive the breakdown processes, and the voltage amplitude slope was varied between 10 and 1280 V ms⁻¹. The values obtained for the statistical time lags are roughly between 1 and 150 ms. It is shown that the statistical time lags in ac-driven discharges follow the same general trends as discharges driven by a voltage of monotonic slope. In addition, the validity of the Cobine-Easton expression is tested for an alternating voltage form.
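A conventional way to estimate the mean statistical time lag from a set of measured breakdown delays is a Laue-plot fit, sketched below on synthetic data. The exponential-delay assumption is the standard one for statistical time lags; this is not necessarily the authors' exact procedure.

```python
import numpy as np

def laue_plot_fit(time_lags):
    """Estimate the mean statistical time lag via a Laue plot: for an
    exponential delay distribution, ln(fraction of events with lag > t)
    is linear in t with slope -1/tau."""
    t = np.sort(np.asarray(time_lags, float))
    surv = 1.0 - np.arange(1, t.size + 1) / t.size   # empirical N(>t)/N0
    keep = surv > 0                                  # drop the last point
    slope, _ = np.polyfit(t[keep], np.log(surv[keep]), 1)
    return -1.0 / slope

rng = np.random.default_rng(5)
lags_ms = rng.exponential(scale=40.0, size=500)  # synthetic lags, tau = 40 ms
print(f"estimated mean time lag: {laue_plot_fit(lags_ms):.1f} ms")
```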
Goodman, Joseph W.
2000-07-01
The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson, The Statistical Analysis of Time Series; T. S. Arthanari & Yadolah Dodge, Mathematical Programming in Statistics; Emil Artin, Geometric Algebra; Norman T. J. Bailey, The Elements of Stochastic Processes with Applications to the Natural Sciences; Robert G. Bartle, The Elements of Integration and Lebesgue Measure; George E. P. Box & Norman R. Draper, Evolutionary Operation: A Statistical Method for Process Improvement; George E. P. Box & George C. Tiao, Bayesian Inference in Statistical Analysis; R. W. Carter, Finite Groups of Lie Type: Conjugacy Classes and Complex Characters; R. W. Carter, Simple Groups of Lie Type; William G. Cochran & Gertrude M. Cox, Experimental Designs, Second Edition; Richard Courant, Differential and Integral Calculus, Volume I; Richard Courant, Differential and Integral Calculus, Volume II; Richard Courant & D. Hilbert, Methods of Mathematical Physics, Volume I; Richard Courant & D. Hilbert, Methods of Mathematical Physics, Volume II; D. R. Cox, Planning of Experiments; Harold S. M. Coxeter, Introduction to Geometry, Second Edition; Charles W. Curtis & Irving Reiner, Representation Theory of Finite Groups and Associative Algebras; Charles W. Curtis & Irving Reiner, Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I; Charles W. Curtis & Irving Reiner, Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II; Cuthbert Daniel, Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition; Bruno de Finetti, Theory of Probability, Volume I; Bruno de Finetti, Theory of Probability, Volume 2; W. Edwards Deming, Sample Design in Business Research
Statistical learning and selective inference.
Taylor, Jonathan; Tibshirani, Robert J
2015-06-23
We describe the problem of "selective inference." This addresses the following challenge: Having mined a set of data to find potential associations, how do we properly assess the strength of these associations? The fact that we have "cherry-picked"--searched for the strongest associations--means that we must set a higher bar for declaring significant the associations that we see. This challenge becomes more important in the era of big data and complex statistical modeling. The cherry tree (dataset) can be very large and the tools for cherry picking (statistical learning methods) are now very sophisticated. We describe some recent new developments in selective inference and illustrate their use in forward stepwise regression, the lasso, and principal components analysis.
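The cherry-picking effect is easy to demonstrate by simulation. In the sketch below every feature is pure noise, yet applying a naive single-test threshold to the strongest of 100 sample correlations declares a "significant" association almost every time.

```python
import numpy as np

rng = np.random.default_rng(6)
n, m, trials = 50, 100, 2000
naive_hits = 0
for _ in range(trials):
    y = rng.normal(size=n)
    X = rng.normal(size=(n, m))              # no feature is truly associated
    r = (X - X.mean(0)).T @ (y - y.mean()) / n
    r /= X.std(0) * y.std()                  # sample correlations
    r_best = np.abs(r).max()                 # pick the strongest association
    # naive two-sided test of a *single* correlation at alpha = 0.05
    t_stat = r_best * np.sqrt((n - 2) / (1 - r_best ** 2))
    naive_hits += t_stat > 2.01              # approx. t_{48, 0.975}
print(f"naive false-positive rate after selection: {naive_hits / trials:.2f}")
# A valid post-selection test must account for the maximum over m features,
# e.g. by comparing r_best with its null distribution or adjusting alpha.
```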
Using statistics to understand the environment
Cook, Penny A
2000-01-01
Using Statistics to Understand the Environment covers all the basic tests required for environmental practicals and projects and points the way to the more advanced techniques that may be needed in more complex research designs. Following an introduction to project design, the book covers methods to describe data, to examine differences between samples, and to identify relationships and associations between variables.Featuring: worked examples covering a wide range of environmental topics, drawings and icons, chapter summaries, a glossary of statistical terms and a further reading section, this book focuses on the needs of the researcher rather than on the mathematics behind the tests.
Nonextensive statistical mechanics of ionic solutions
International Nuclear Information System (INIS)
Varela, L.M.; Carrete, J.; Munoz-Sola, R.; Rodriguez, J.R.; Gallego, J.
2007-01-01
Classical mean-field Poisson-Boltzmann theory of ionic solutions is revisited in the theoretical framework of nonextensive Tsallis statistics. The nonextensive equivalent of the Poisson-Boltzmann equation is formulated by revisiting the statistical mechanics of liquids, and the Debye-Hueckel framework is shown to be valid for highly diluted solutions even under circumstances where nonextensive thermostatistics must be applied. The lowest-order corrections associated with nonadditive effects are identified for both symmetric and asymmetric electrolytes, and the behavior of the average electrostatic potential in a homogeneous system is analyzed analytically and numerically for various values of the nonextensive parameter q, which measures the degree of complexity
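The building block of such nonextensive distributions is the Tsallis q-exponential, sketched below. The replacement of the Boltzmann factor and the chosen symbols are illustrative assumptions, not expressions taken from the paper.

```python
import numpy as np

def q_exponential(x, q):
    """Tsallis q-exponential exp_q(x); reduces to exp(x) as q -> 1.
    The usual cutoff convention returns 0 where 1 + (1-q)x <= 0."""
    x = np.asarray(x, dtype=float)
    if abs(q - 1.0) < 1e-12:
        return np.exp(x)
    base = 1.0 + (1.0 - q) * x
    out = np.zeros_like(base)
    ok = base > 0
    out[ok] = base[ok] ** (1.0 / (1.0 - q))
    return out

# Nonextensive analogue of the Boltzmann factor exp(-e*phi/kT): replace it
# with exp_q(-e*phi/kT). Units with e/kT = 1; purely illustrative values.
phi = np.linspace(0.0, 2.0, 5)
for q in (0.8, 1.0, 1.2):
    print(q, np.round(q_exponential(-phi, q), 4))
```

For q > 1 the factor decays more slowly than the classical exponential (heavier tails), while for q < 1 it decays faster and vanishes beyond a cutoff.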
Are the products of statistical learning abstract or stimulus-specific?
Directory of Open Access Journals (Sweden)
Athena eVouloumanos
2012-03-01
Full Text Available Learners segment potential lexical units from syllable streams when statistically variable transitional probabilities between adjacent syllables are the only cues to word boundaries. Here we examine the nature of the representations that result from statistical learning by assessing learners’ ability to generalize across acoustically different stimuli. In three experiments, we investigate limitations on the outcome of statistical learning by considering two possibilities: that the products of statistical segmentation processes are abstract and generalizable representations, or, alternatively, that products of statistical learning are stimulus-bound and restricted to perceptually similar instances. In Experiment 1, learners segmented units from statistically predictable streams, and recognized these units when they were acoustically transformed by temporal reversals. In Experiment 2, learners were able to segment units from temporally reversed syllable streams, but were only able to generalize in conditions of mild acoustic transformation. In Experiment 3, learners were able to recognize statistically segmented units after a voice change but were unable to do so when the novel voice was mildly distorted. Together these results suggest that representations that result from statistical learning can be abstracted to some degree, but not in all listening conditions.
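A minimal sketch of the segmentation mechanism these experiments build on: compute transitional probabilities between adjacent syllables and posit word boundaries where the probability dips. The three nonsense words are invented for the illustration.

```python
import random
from collections import Counter

def transitional_probabilities(stream):
    """TP(a -> b) = count(a followed by b) / count(a) over adjacent syllables."""
    pair_counts = Counter(zip(stream, stream[1:]))
    first_counts = Counter(stream[:-1])
    return {(a, b): c / first_counts[a] for (a, b), c in pair_counts.items()}

def segment(stream, tps, threshold=0.5):
    """Posit a word boundary wherever the transitional probability between
    adjacent syllables falls below the threshold."""
    words, current = [], [stream[0]]
    for a, b in zip(stream, stream[1:]):
        if tps[(a, b)] < threshold:
            words.append("".join(current))
            current = []
        current.append(b)
    words.append("".join(current))
    return words

# Familiarisation stream: three tri-syllabic nonsense words in random order,
# so within-word TPs are 1.0 and across-word TPs are roughly 1/3.
random.seed(7)
lexicon = ["tupiro", "golabu", "bidaku"]
stream = []
for _ in range(200):
    w = random.choice(lexicon)
    stream += [w[0:2], w[2:4], w[4:6]]
tps = transitional_probabilities(stream)
print(sorted(set(segment(stream, tps))))   # should recover the three words
```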
A statistical approach to plasma profile analysis
International Nuclear Information System (INIS)
Kardaun, O.J.W.F.; McCarthy, P.J.; Lackner, K.; Riedel, K.S.
1990-05-01
A general statistical approach to the parameterisation and analysis of tokamak profiles is presented. The modelling of the profile dependence on both the radius and the plasma parameters is discussed, and pertinent methods of estimation, classical as well as robust, are reviewed. Special attention is given to statistical tests for discriminating between the various models, and to the construction of confidence intervals for the parameterised profiles and the associated global quantities. The statistical approach is shown to provide a rigorous approach to the empirical testing of plasma profile invariance. (orig.)
Statistical Analysis of Research Data | Center for Cancer Research
Recent advances in cancer biology have resulted in the need for increased statistical analysis of research data. The Statistical Analysis of Research Data (SARD) course will be held on April 5-6, 2018 from 9 a.m.-5 p.m. at the National Institutes of Health's Natcher Conference Center, Balcony C on the Bethesda Campus. SARD is designed to provide an overview on the general principles of statistical analysis of research data. The first day will feature univariate data analysis, including descriptive statistics, probability distributions, one- and two-sample inferential statistics.
"Statistical Techniques for Particle Physics" (2/4)
CERN. Geneva
2009-01-01
This series will consist of four 1-hour lectures on statistics for particle physics. The goal will be to build up to techniques meant for dealing with problems of realistic complexity while maintaining a formal approach. I will also try to incorporate usage of common tools like ROOT, RooFit, and the newly developed RooStats framework into the lectures. The first lecture will begin with a review of the basic principles of probability, some terminology, and the three main approaches towards statistical inference (Frequentist, Bayesian, and Likelihood-based). I will then outline the statistical basis for multivariate analysis techniques (the Neyman-Pearson lemma) and the motivation for machine learning algorithms. Later, I will extend simple hypothesis testing to the case in which the statistical model has one or many parameters (the Neyman Construction and the Feldman-Cousins technique). From there I will outline techniques to incorporate background uncertainties. If time allows, I will touch on the statist...
"Statistical Techniques for Particle Physics" (1/4)
CERN. Geneva
2009-01-01
This series will consist of four 1-hour lectures on statistics for particle physics. The goal will be to build up to techniques meant for dealing with problems of realistic complexity while maintaining a formal approach. I will also try to incorporate usage of common tools like ROOT, RooFit, and the newly developed RooStats framework into the lectures. The first lecture will begin with a review of the basic principles of probability, some terminology, and the three main approaches towards statistical inference (Frequentist, Bayesian, and Likelihood-based). I will then outline the statistical basis for multivariate analysis techniques (the Neyman-Pearson lemma) and the motivation for machine learning algorithms. Later, I will extend simple hypothesis testing to the case in which the statistical model has one or many parameters (the Neyman Construction and the Feldman-Cousins technique). From there I will outline techniques to incorporate background uncertainties. If time allows, I will touch on the statist...
"Statistical Techniques for Particle Physics" (4/4)
CERN. Geneva
2009-01-01
This series will consist of four 1-hour lectures on statistics for particle physics. The goal will be to build up to techniques meant for dealing with problems of realistic complexity while maintaining a formal approach. I will also try to incorporate usage of common tools like ROOT, RooFit, and the newly developed RooStats framework into the lectures. The first lecture will begin with a review of the basic principles of probability, some terminology, and the three main approaches towards statistical inference (Frequentist, Bayesian, and Likelihood-based). I will then outline the statistical basis for multivariate analysis techniques (the Neyman-Pearson lemma) and the motivation for machine learning algorithms. Later, I will extend simple hypothesis testing to the case in which the statistical model has one or many parameters (the Neyman Construction and the Feldman-Cousins technique). From there I will outline techniques to incorporate background uncertainties. If time allows, I will touch on the statist...