WorldWideScience

Sample records for related systems statistics

  1. Calculating statistical distributions from operator relations: The statistical distributions of various intermediate statistics

    International Nuclear Information System (INIS)

    Dai, Wu-Sheng; Xie, Mi

    2013-01-01

    In this paper, we give a general discussion of how to calculate the statistical distribution from a given operator relation between the creation, annihilation, and number operators. Our result shows that as long as the relation between the number operator and the creation and annihilation operators can be expressed as a†b = Λ(N) or N = Λ⁻¹(a†b), where N, a†, and b denote the number, creation, and annihilation operators, i.e., N is a function of the quadratic product of the creation and annihilation operators, the corresponding statistical distribution is the Gentile distribution, a statistical distribution in which the maximum occupation number is an arbitrary integer. As examples, we discuss the statistical distributions corresponding to various operator relations. In particular, besides the Bose–Einstein and Fermi–Dirac cases, we discuss the statistical distributions for various schemes of intermediate statistics, especially various q-deformation schemes. Our result shows that the statistical distributions corresponding to various q-deformation schemes are various Gentile distributions with different maximum occupation numbers, determined by the deformation parameter q. This result shows that many results given in the literature on the q-deformation distribution are inaccurate or incomplete. -- Highlights: ► A general discussion on calculating the statistical distribution from relations of creation, annihilation, and number operators. ► A systematic study of the statistical distributions corresponding to various q-deformation schemes. ► Arguing that many results on q-deformation distributions in the literature are inaccurate or incomplete.
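
    The Gentile distribution discussed in this record can be sketched numerically. The snippet below uses the standard textbook expression for the mean occupation number with maximum occupation n_max (an assumption of this illustration, not code from the paper), and checks its Fermi–Dirac (n_max = 1) and Bose–Einstein (n_max → ∞) limits.

```python
import math

def gentile_occupation(eps_minus_mu, beta, n_max):
    """Mean occupation number in the Gentile distribution, where at
    most n_max particles may occupy a single state. Symbols follow
    common textbook notation, not necessarily the paper's."""
    x = beta * eps_minus_mu
    return 1.0 / (math.exp(x) - 1.0) - (n_max + 1) / (math.exp((n_max + 1) * x) - 1.0)

def fermi_dirac(eps_minus_mu, beta):
    return 1.0 / (math.exp(beta * eps_minus_mu) + 1.0)

def bose_einstein(eps_minus_mu, beta):
    return 1.0 / (math.exp(beta * eps_minus_mu) - 1.0)
```

    Setting n_max = 1 reproduces Fermi–Dirac statistics exactly, while a large n_max makes the second term negligible and recovers Bose–Einstein statistics.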

  2. Uncertainties Related to Extreme Event Statistics of Sewer System Surcharge and Overflow

    DEFF Research Database (Denmark)

    Schaarup-Jensen, Kjeld; Johansen, C.; Thorndahl, Søren Liedtke

    2005-01-01

    Today it is common practice - in most of Europe - to base the design of sewer systems in urban areas on recommended minimum flooding frequencies related to pipe top level, basement level in buildings, or road surface level. Storm water runoff in sewer systems is thus only proceeding in an acceptable manner if flooding of these levels has an average return period greater than a predefined value. This practice is also often used in functional analyses of existing sewer systems. Whether or not a sewer system fulfils the recommended flooding frequencies can only be verified by performing long-term simulations - using a sewer flow simulation model - and drawing up extreme event statistics from the model simulations. In this context it is important to realize that uncertainties in the input parameters of rainfall-runoff models will give rise to uncertainties related...
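
    The functional-analysis practice described above boils down to comparing simulated flooding return periods against a design criterion. A minimal sketch, with hypothetical event counts rather than real simulation output:

```python
def average_return_period(event_times, record_years):
    """Average return period (in years) of level exceedances in a
    long-term simulation record: record length divided by the number
    of independent exceedance events observed."""
    if not event_times:
        return float("inf")  # level never exceeded in the record
    return record_years / len(event_times)

# Hypothetical example: 5 basement-level floodings found in a
# 20-year simulated rainfall series give a 4-year return period,
# so a design criterion of, say, T >= 10 years would be violated.
flood_years = [1, 3, 7, 12, 18]
T = average_return_period(flood_years, 20)
```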

  3. Properties of incident reporting systems in relation to statistical trend and pattern analysis

    International Nuclear Information System (INIS)

    Kalfsbeek, H.W.; Arsenis, S.P.

    1990-01-01

    This paper describes the properties deemed desirable for an incident reporting system in order to render it useful for extracting valid statistical trend and pattern information. The paper views a data collection system from the following perspective: data are essentially gathered on a set of variables describing an event or incident (the items featuring on a reporting format) in order to learn about (multiple) dependencies (called interactions) between these variables. Hence, the necessary features of the data source are highlighted, and potential problem sources limiting the validity of the results are identified. In this framework, the important issues are reporting completeness, related to the reporting criteria and reporting frequency, and of course the reporting contents and quality. The choice of the report items (the variables) and their categorization (code dictionary) may influence (bias) the insights gained from trend and pattern analyses, as may the presence or absence of a structure for correlating the reported issues within an incident. The issues addressed in this paper are related to some real-world reporting systems on safety-related events in nuclear power plants, so that their possibilities and limitations with regard to statistical trend and pattern analysis become manifest.

  4. Stochastic simulations for the time evolution of systems which obey generalized statistics: fractional exclusion statistics and Gentile's statistics

    International Nuclear Information System (INIS)

    Nemnes, G A; Anghel, D V

    2010-01-01

    We present a stochastic method for the simulation of the time evolution in systems which obey generalized statistics, namely fractional exclusion statistics and Gentile's statistics. The transition rates are derived in the framework of canonical ensembles. This approach introduces a tool for describing interacting fermionic and bosonic systems in non-equilibrium as ideal FES systems, in a computationally efficient manner. The two types of statistics are analyzed comparatively, indicating their intrinsic thermodynamic differences and revealing key aspects related to the species size

  5. Statistics and Corporate Environmental Management: Relations and Problems

    DEFF Research Database (Denmark)

    Madsen, Henning; Ulhøi, John Parm

    1997-01-01

    Statistical methods have long been used to analyse the macroeconomic consequences of environmentally damaging activities, political actions to control, prevent, or reduce these damages, and environmental problems in the natural environment. Up to now, however, they have had a limited and not very specific use in corporate environmental management systems. This paper will address some of the special problems related to the use of statistical techniques in corporate environmental management systems. One important aspect of this is the interaction of internal decisions and activities with conditions in the external environment.

  6. Functional statistics and related fields

    CERN Document Server

    Bongiorno, Enea; Cao, Ricardo; Vieu, Philippe

    2017-01-01

    This volume collects the latest methodological and applied contributions on functional, high-dimensional and other complex data, related statistical models and tools, as well as operator-based statistics. It contains selected and refereed contributions presented at the Fourth International Workshop on Functional and Operatorial Statistics (IWFOS 2017), held in A Coruña, Spain, from 15 to 17 June 2017. The series of IWFOS workshops was initiated by the Working Group on Functional and Operatorial Statistics at the University of Toulouse in 2008. Since then, many of the major advances in functional statistics and related fields have been periodically presented and discussed at the IWFOS workshops.

  7. Statistics and Corporate Environmental Management: Relations and Problems

    DEFF Research Database (Denmark)

    Madsen, Henning; Ulhøi, John Parm

    1997-01-01

    Statistical methods have long been used to analyse the macroeconomic consequences of environmentally damaging activities, political actions to control, prevent, or reduce these damages, and environmental problems in the natural environment. Up to now, however, they have had a limited and not very specific use in corporate environmental management systems. This paper will address some of the special problems related to the use of statistical techniques in corporate environmental management systems. One important aspect of this is the interaction of internal decisions and activities with conditions in the external environment. The nature and extent of the practical use of quantitative techniques in corporate environmental management systems is discussed on the basis of a number of company surveys in four European countries.

  8. Features of statistical dynamics in a finite system

    International Nuclear Information System (INIS)

    Yan, Shiwei; Sakata, Fumihiko; Zhuo Yizhong

    2002-01-01

    We study features of statistical dynamics in a finite Hamiltonian system composed of a relevant one-degree-of-freedom system coupled to an irrelevant multi-degree-of-freedom system through a weak interaction. Special attention is paid to how the statistical dynamics changes depending on the number of degrees of freedom in the irrelevant system. It is found that the macrolevel statistical aspects are strongly related to the appearance of microlevel chaotic motion, and that dissipation of the relevant motion is realized by passing through three distinct stages: the dephasing, statistical relaxation, and equilibrium regimes. It is clarified that the dynamical description and the conventional transport approach provide almost the same macrolevel and microlevel mechanisms only for systems with a very large number of irrelevant degrees of freedom. It is also shown that the statistical relaxation in the finite system is anomalous diffusion, and that the fluctuation effects have a finite correlation time.

  9. Relative effects of statistical preprocessing and postprocessing on a regional hydrological ensemble prediction system

    Science.gov (United States)

    Sharma, Sanjib; Siddique, Ridwan; Reed, Seann; Ahnert, Peter; Mendoza, Pablo; Mejia, Alfonso

    2018-03-01

    The relative roles of statistical weather preprocessing and streamflow postprocessing in hydrological ensemble forecasting at short- to medium-range forecast lead times (days 1-7) are investigated. For this purpose, a regional hydrologic ensemble prediction system (RHEPS) is developed and implemented. The RHEPS comprises the following components: (i) hydrometeorological observations (multisensor precipitation estimates, gridded surface temperature, and gauged streamflow); (ii) weather ensemble forecasts (precipitation and near-surface temperature) from the National Centers for Environmental Prediction 11-member Global Ensemble Forecast System Reforecast version 2 (GEFSRv2); (iii) NOAA's Hydrology Laboratory-Research Distributed Hydrologic Model (HL-RDHM); (iv) heteroscedastic censored logistic regression (HCLR) as the statistical preprocessor; (v) two statistical postprocessors, an autoregressive model with a single exogenous variable (ARX(1,1)) and quantile regression (QR); and (vi) a comprehensive verification strategy. To implement the RHEPS, 1- to 7-day weather forecasts from the GEFSRv2 are used to force HL-RDHM and generate raw ensemble streamflow forecasts. Forecasting experiments are conducted in four nested basins in the US Middle Atlantic region, ranging in size from 381 to 12 362 km². Results show that the HCLR-preprocessed ensemble precipitation forecasts have greater skill than the raw forecasts. These improvements are more noticeable in the warm season at the longer lead times (> 3 days). Both postprocessors, ARX(1,1) and QR, show gains in skill relative to the raw ensemble streamflow forecasts, particularly in the cool season, but QR outperforms ARX(1,1). The scenarios that implement preprocessing and postprocessing separately tend to perform similarly, although the postprocessing-alone scenario is often more effective. The scenario involving both preprocessing and postprocessing consistently outperforms the other scenarios. In some cases...
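
    Verification strategies such as the one in this record commonly score ensemble forecasts with the continuous ranked probability score (CRPS). The sketch below implements the standard ensemble estimator of the CRPS; it is a generic illustration, not code from the RHEPS.

```python
def crps_ensemble(members, obs):
    """Continuous ranked probability score for one forecast case,
    using the standard ensemble estimator:
    CRPS = mean_i |x_i - y| - 0.5 * mean_ij |x_i - x_j|.
    Lower is better; 0 means a perfect deterministic forecast."""
    m = len(members)
    term1 = sum(abs(x - obs) for x in members) / m
    term2 = sum(abs(a - b) for a in members for b in members) / (2.0 * m * m)
    return term1 - term2
```

    Averaging this score over many forecast-observation pairs, for raw versus preprocessed or postprocessed ensembles, gives the kind of skill comparison reported in the abstract.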

  10. Statistical equilibrium and symplectic geometry in general relativity

    International Nuclear Information System (INIS)

    Iglesias, P.

    1981-09-01

    A geometrical construction is given of the statistical equilibrium states of a system of particles in a gravitational field in general relativity. By a method of localization of variables, the expression of thermodynamic quantities is given, and this description is shown to be compatible with a macroscopic model of a relativistic continuous medium for a given value of the free-energy function.

  11. Statistical mechanics in the context of special relativity.

    Science.gov (United States)

    Kaniadakis, G

    2002-11-01

    In Ref. [Physica A 296, 405 (2001)], starting from the one-parameter deformation of the exponential function exp_κ(x) = (√(1 + κ²x²) + κx)^(1/κ), a statistical mechanics has been constructed which reduces to ordinary Boltzmann-Gibbs statistical mechanics as the deformation parameter κ approaches zero. The distribution f = exp_κ(−βE + βμ) obtained within this statistical mechanics shows a power-law tail and depends on the unspecified parameter β, which contains all the information about the temperature of the system. On the other hand, the entropic form S_κ = ∫ d³p (c_κ f^(1+κ) + c_(−κ) f^(1−κ)), which after maximization produces the distribution f and reduces to the standard Boltzmann-Shannon entropy S₀ as κ → 0, contains the coefficient c_κ whose expression involves, besides the Boltzmann constant, another unspecified parameter α. In the present effort we show that S_κ is the unique existing entropy obtained by a continuous deformation of S₀ that preserves unaltered its fundamental properties of concavity, additivity, and extensivity. These properties of S_κ permit us to determine unequivocally the values of the above-mentioned parameters β and α. Subsequently, we explain the origin of the deformation mechanism introduced by κ and show that this deformation emerges naturally within Einstein's special relativity. Furthermore, we extend the theory in order to treat statistical systems in a time-dependent and relativistic context. Then we show that it is possible to determine, in a self-consistent scheme within special relativity, the value of the free parameter κ, which turns out to depend on the light speed c and reduces to zero as c → ∞, recovering in this way ordinary statistical mechanics and thermodynamics. The statistical mechanics presented here contains no free parameters and preserves unaltered the mathematical and epistemological structure of...
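
    The κ-deformed exponential at the heart of this abstract is easy to check numerically. The sketch below assumes the form quoted in the abstract and verifies two of its basic properties: exp_κ(x)·exp_κ(−x) = 1, and recovery of the ordinary exponential as κ → 0.

```python
import math

def exp_kappa(x, kappa):
    """Kaniadakis kappa-deformed exponential:
    exp_kappa(x) = (sqrt(1 + kappa^2 x^2) + kappa*x)^(1/kappa),
    which reduces to exp(x) as kappa -> 0."""
    if kappa == 0:
        return math.exp(x)
    return (math.sqrt(1.0 + kappa**2 * x**2) + kappa * x) ** (1.0 / kappa)

# Property 1: the product identity exp_kappa(x) * exp_kappa(-x) = 1
# follows because (sqrt(1+k^2 x^2) + kx)(sqrt(1+k^2 x^2) - kx) = 1.
# Property 2: for small kappa the ordinary exponential is recovered.
```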

  12. Statistical evaluation of design-error related accidents

    International Nuclear Information System (INIS)

    Ott, K.O.; Marchaterre, J.F.

    1980-01-01

    In a recently published paper (Campbell and Ott, 1979), a general methodology was proposed for the statistical evaluation of design-error-related accidents. The evaluation aims at an estimate of the combined residual frequency of yet unknown types of accidents lurking in a certain technological system. Here, the original methodology is extended so as to apply to a variety of systems that evolve during the development of large-scale technologies. A special categorization of incidents and accidents is introduced to define the events that should be jointly analyzed. The resulting formalism is applied to the development of nuclear power reactor technology, considering serious accidents that involve a particular design inadequacy in the accident progression.

  13. Thermal equilibrium and statistical thermometers in special relativity.

    Science.gov (United States)

    Cubero, David; Casado-Pascual, Jesús; Dunkel, Jörn; Talkner, Peter; Hänggi, Peter

    2007-10-26

    There is an intense debate in the recent literature about the correct generalization of Maxwell's velocity distribution in special relativity. The most frequently discussed candidate distributions include the Jüttner function as well as modifications thereof. Here we report results from fully relativistic one-dimensional molecular dynamics simulations that resolve the ambiguity. The numerical evidence unequivocally favors the Jüttner distribution. Moreover, our simulations illustrate that the concept of "thermal equilibrium" extends naturally to special relativity only if a many-particle system is spatially confined. They make evident that "temperature" can be statistically defined and measured in an observer-frame-independent way.

  14. Some uncertainty results obtained by the statistical version of the KARATE code system related to core design and safety analysis

    Energy Technology Data Exchange (ETDEWEB)

    Panka, Istvan; Hegyi, Gyoergy; Maraczy, Csaba; Temesvari, Emese [Hungarian Academy of Sciences, Budapest (Hungary). Reactor Analysis Dept.

    2017-11-15

    The best-estimate KARATE code system has been widely used for core design calculations and simulations of slow transients of VVER reactors. Recently, there has been an increasing need to assess the uncertainties of such calculations by propagating the basic input uncertainties of the models through the full calculation chain. In order to determine the uncertainties of quantities of interest during burnup, the statistical version of the KARATE code system has been elaborated. In the first part of the paper, the main features of the new code system are discussed. The applied statistical method is based on Monte Carlo sampling of the considered input data, taking into account mainly the covariance matrices of the cross sections and/or the technological uncertainties. In the second part of the paper, only the uncertainties of the cross sections are considered, and an equilibrium cycle of a VVER-440 type reactor is investigated. The burnup dependence of the uncertainties of some safety-related parameters (e.g. critical boron concentration, rod worth, feedback coefficients, assembly-wise radial power and burnup distribution) is discussed and compared to the currently used limits.

  15. Statistically significant relational data mining

    Energy Technology Data Exchange (ETDEWEB)

    Berry, Jonathan W.; Leung, Vitus Joseph; Phillips, Cynthia Ann; Pinar, Ali; Robinson, David Gerald; Berger-Wolf, Tanya; Bhowmick, Sanjukta; Casleton, Emily; Kaiser, Mark; Nordman, Daniel J.; Wilson, Alyson G.

    2014-02-01

    This report summarizes the work performed under the project "Statistically significant relational data mining." The goal of the project was to add more statistical rigor to the fairly ad hoc area of data mining on graphs. Our goal was to develop better algorithms and better ways to evaluate algorithm quality. We concentrated on algorithms for community detection, approximate pattern matching, and graph similarity measures. Approximate pattern matching involves finding an instance of a relatively small pattern, expressed with tolerance, in a large graph of data observed with uncertainty. This report gathers the abstracts and references for the eight refereed publications that have appeared as part of this work. We then archive three pieces of research that have not yet been published. The first is theoretical and experimental evidence that a popular statistical measure for comparing community assignments favors over-resolved communities over approximations to a ground truth. The second is a set of statistically motivated methods for measuring the quality of an approximate match of a small pattern in a large graph. The third is a new probabilistic random graph model. Statisticians favor these models for graph analysis. The new local structure graph model overcomes some of the issues with popular models such as exponential random graph models and latent variable models.

  16. Fundamental link between system theory and statistical mechanics

    International Nuclear Information System (INIS)

    Atmanspacher, H.; Scheingraber, H.

    1987-01-01

    A fundamental link between system theory and statistical mechanics is established by the Kolmogorov entropy K. By this quantity, the temporal evolution of dynamical systems can be classified into regular, chaotic, and stochastic processes. Since K represents a measure for the internal information creation rate of dynamical systems, it provides an approach to irreversibility. The formal relationship to statistical mechanics is derived by means of an operator formalism originally introduced by Prigogine. For a Liouville operator L and an information operator M̃ acting on a distribution in phase space, it is shown that i[L, M̃] = KI (I = identity operator). As a first consequence of this equivalence, a relation is obtained between the chaotic correlation time of a system and Prigogine's concept of a finite duration of presence. Finally, the existence of chaos in quantum systems is discussed with respect to the existence of a quantum mechanical time operator.
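
    For a one-dimensional chaotic map, the Kolmogorov (KS) entropy discussed above equals the positive Lyapunov exponent, which makes the "information creation rate" interpretation concrete. A sketch for the fully chaotic logistic map at r = 4, where the exact value is ln 2 ≈ 0.693 (a standard textbook example, not taken from this paper):

```python
import math

def lyapunov_logistic(r=4.0, x0=0.2, n_iter=100_000, n_skip=100):
    """Numerical Lyapunov exponent of the logistic map x -> r*x*(1-x).
    For a one-dimensional map the KS entropy equals the positive
    Lyapunov exponent; at r = 4 the exact value is ln 2."""
    x = x0
    for _ in range(n_skip):              # discard the transient
        x = r * x * (1.0 - x)
    acc = 0.0
    for _ in range(n_iter):
        acc += math.log(abs(r * (1.0 - 2.0 * x)))   # log |f'(x)|
        x = r * x * (1.0 - x)
    return acc / n_iter
```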

  17. Nonequilibrium statistical physics of small systems: fluctuation relations and beyond (Annual Reviews of Nonlinear Dynamics and Complexity (VCH))

    CERN Document Server

    2013-01-01

    This book offers a comprehensive picture of nonequilibrium phenomena in nanoscale systems. Written by internationally recognized experts in the field, this book strikes a balance between theory and experiment, and includes in-depth introductions to nonequilibrium fluctuation relations, nonlinear dynamics and transport, single-molecule experiments, and molecular diffusion in nanopores. The authors explore the application of these concepts to nano- and biosystems by cross-linking key methods and ideas from nonequilibrium statistical physics, thermodynamics, stochastic theory, and dynamical systems.

  18. PRIS-STATISTICS: Power Reactor Information System Statistical Reports. User's Manual

    International Nuclear Information System (INIS)

    2013-01-01

    The IAEA developed the Power Reactor Information System (PRIS)-Statistics application to assist PRIS end users with generating statistical reports from PRIS data. Statistical reports provide an overview of the status, specification and performance results of every nuclear power reactor in the world. This user's manual was prepared to facilitate the use of the PRIS-Statistics application and to provide guidelines and detailed information for each report in the application. Statistical reports support analyses of nuclear power development and strategies, and the evaluation of nuclear power plant performance. The PRIS database can be used for comprehensive trend analyses and benchmarking against best performers and industrial standards.

  19. Automated statistical modeling of analytical measurement systems

    International Nuclear Information System (INIS)

    Jacobson, J.J.

    1992-01-01

    The statistical modeling of analytical measurement systems at the Idaho Chemical Processing Plant (ICPP) has been completely automated through computer software. The statistical modeling of analytical measurement systems is one part of a complete quality control program used by the Remote Analytical Laboratory (RAL) at the ICPP. The quality control program is an integration of automated data input, measurement system calibration, database management, and statistical process control. The quality control program and statistical modeling program meet the guidelines set forth by the American Society for Testing and Materials and the American National Standards Institute. A statistical model is a set of mathematical equations describing any systematic bias inherent in a measurement system and the precision of a measurement system. A statistical model is developed from data generated from the analysis of control standards. Control standards are samples which are made up at precisely known levels by an independent laboratory and submitted to the RAL. The RAL analysts who process control standards do not know the values of those control standards. The objective of statistical modeling is to describe real process samples in terms of their bias and precision and to verify that a measurement system is operating satisfactorily. The processing of control standards gives us this ability.
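
    The bias-and-precision model described above can be illustrated with a toy calculation on a single control standard: bias estimated as the mean deviation from the known reference value, precision as the sample standard deviation. This is a deliberately simplified sketch; the ICPP models are more elaborate, e.g. modeling bias as a function of level.

```python
import statistics

def measurement_model(known_value, measured_values):
    """Estimate systematic bias and precision (1-sigma) of a
    measurement system from repeated analyses of one control
    standard with a known reference value."""
    bias = statistics.mean(measured_values) - known_value
    precision = statistics.stdev(measured_values)
    return bias, precision

# Hypothetical control-standard data: reference value 10.0,
# three blind replicate measurements by the laboratory.
bias, precision = measurement_model(10.0, [10.1, 10.3, 10.2])
```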

  20. Statistical evaluation of design-error related nuclear reactor accidents

    International Nuclear Information System (INIS)

    Ott, K.O.; Marchaterre, J.F.

    1981-01-01

    In this paper, a general methodology for the statistical evaluation of design-error-related accidents is proposed that can be applied to a variety of systems that evolve during the development of large-scale technologies. The evaluation aims at an estimate of the combined "residual" frequency of yet unknown types of accidents "lurking" in a certain technological system. A special categorization of incidents and accidents is introduced to define the events that should be jointly analyzed. The resulting formalism is applied to the development of U.S. nuclear power reactor technology, considering serious accidents (category 2 events) that involved, in the accident progression, a particular design inadequacy. 9 refs

  1. Nonequilibrium thermodynamics and fluctuation relations for small systems

    International Nuclear Information System (INIS)

    Cao Liang; Ke Pu; Qiao Li-Yan; Zheng Zhi-Gang

    2014-01-01

    In this review, we give a retrospect of the recent progress in nonequilibrium statistical mechanics and thermodynamics of small dynamical systems. For systems with only a small number of particles, fluctuations and nonlinearity become significant and contribute to the nonequilibrium behavior, hence the statistical properties and thermodynamics should be carefully studied. We review recent developments of this topic starting from the Gallavotti–Cohen fluctuation theorem, and proceeding to the Evans–Searles transient fluctuation theorem, the Jarzynski free-energy equality, and the Crooks fluctuation relation. We also investigate the nonequilibrium free-energy theorem for trajectories involving changes of the heat bath temperature and propose a generalized free-energy relation. It should be noticed that the non-Markovian property of the heat bath may lead to a violation of the free-energy relation. (topical review - statistical physics and complex systems)
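
    The Jarzynski free-energy equality mentioned in this review, ⟨e^(−βW)⟩ = e^(−βΔF), can be checked in a toy model. The sketch below samples work values from a Gaussian distribution, for which the equality gives ΔF = ⟨W⟩ − βσ²/2 exactly; the numbers are illustrative, not taken from the review.

```python
import math
import random

def jarzynski_free_energy(works, beta=1.0):
    """Free-energy difference from Jarzynski's equality:
    DF = -(1/beta) * ln < exp(-beta * W) >."""
    avg = sum(math.exp(-beta * w) for w in works) / len(works)
    return -math.log(avg) / beta

# Toy model: Gaussian work distribution W ~ N(mu, sigma^2), for
# which Jarzynski's equality gives DF = mu - beta*sigma^2/2 exactly.
random.seed(0)
mu, sigma, beta = 1.0, 0.5, 1.0
works = [random.gauss(mu, sigma) for _ in range(200_000)]
df_estimate = jarzynski_free_energy(works, beta)
df_exact = mu - beta * sigma**2 / 2
```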

  2. Sex differences in discriminative power of volleyball game-related statistics.

    Science.gov (United States)

    João, Paulo Vicente; Leite, Nuno; Mesquita, Isabel; Sampaio, Jaime

    2010-12-01

    To identify sex differences in volleyball game-related statistics, the game-related statistics of several World Championships in 2007 (N=132) were analyzed using the software VIS from the International Volleyball Federation. Discriminant analysis was used to identify the game-related statistics which best discriminated performances by sex. The analysis emphasized fault serves (SC = -.40), shot spikes (SC = .40), and reception digs (SC = .31). Considerable variability was evident in the game-related statistics profiles: men's volleyball games were better characterized by terminal actions (errors of service), and women's volleyball games by continuous actions (in defense and attack). These differences may be related to the anthropometric and physiological differences between women and men and their influence on performance profiles.

  3. Statistical imitation system using relational interest points and Gaussian mixture models

    CSIR Research Space (South Africa)

    Claassens, J

    2009-11-01

    The author proposes an imitation system that uses relational interest points (RIPs) and Gaussian mixture models (GMMs) to characterize a behaviour. The system's structure is inspired by the Robot Programming by Demonstration (RPD) paradigm...

  4. Nuclear material statistical accountancy system

    International Nuclear Information System (INIS)

    Argentest, F.; Casilli, T.; Franklin, M.

    1979-01-01

    The statistical accountancy system developed at JRC Ispra is referred to as 'NUMSAS', i.e. Nuclear Material Statistical Accountancy System. The principal feature of NUMSAS is that, in addition to an ordinary material balance calculation, NUMSAS can calculate an estimate of the standard deviation of the measurement error accumulated in the material balance calculation. The purpose of the report is to describe in detail the statistical model on which the standard deviation calculation is based; the computational formula used by NUMSAS in calculating the standard deviation; and the information about nuclear material measurements and the plant measurement system which is required as input data for NUMSAS. The material balance records require processing and interpretation before the material balance calculation is begun. The material balance calculation is the last of four phases of data processing undertaken by NUMSAS. Each of these phases is implemented by a different computer program. The activities which are carried out in each phase can be summarised as follows: the pre-processing phase; the selection and update phase; the transformation phase; and the computation phase.
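
    The core of a material balance calculation like the one NUMSAS performs can be sketched in a few lines: the balance itself, and the propagated standard deviation under the simplifying assumption of independent measurement errors (the actual NUMSAS model is richer, e.g. handling error correlations across the balance period).

```python
import math

def material_balance(begin_inventory, receipts, shipments, end_inventory):
    """Material unaccounted for: MUF = BI + R - S - EI."""
    return begin_inventory + receipts - shipments - end_inventory

def sigma_muf(component_sigmas):
    """Standard deviation of MUF, assuming independent measurement
    errors: the variances of the balance components simply add."""
    return math.sqrt(sum(s * s for s in component_sigmas))

# Hypothetical balance (kg): MUF = 100 + 50 - 40 - 108 = 2, with
# sigma_MUF = sqrt(3^2 + 4^2) = 5, so the 2 kg imbalance is well
# within one standard deviation of the measurement error.
muf = material_balance(100.0, 50.0, 40.0, 108.0)
s = sigma_muf([3.0, 4.0])
```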

  5. Spectral statistics of chaotic many-body systems

    International Nuclear Information System (INIS)

    Dubertrand, Rémy; Müller, Sebastian

    2016-01-01

    We derive a trace formula that expresses the level density of chaotic many-body systems as a smooth term plus a sum over contributions associated with solutions of the nonlinear Schrödinger (or Gross–Pitaevskii) equation. Our formula applies to bosonic systems with discretised positions, such as the Bose–Hubbard model, in the semiclassical limit as well as in the limit where the number of particles is taken to infinity. We use the trace formula to investigate the spectral statistics of these systems, by studying interference between solutions of the nonlinear Schrödinger equation. We show that in the limits taken the statistics of fully chaotic many-particle systems become universal and agree with predictions from the Wigner–Dyson ensembles of random matrix theory. The conditions for Wigner–Dyson statistics involve a gap in the spectrum of the Frobenius–Perron operator, leaving open the possibility of different statistics for systems with weaker chaotic properties. (paper)
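
    Wigner–Dyson versus Poisson statistics, as contrasted in this record, are often diagnosed with the nearest-neighbour spacing-ratio statistic, which needs no spectral unfolding; its mean is ≈ 0.5307 for the GOE and 2 ln 2 − 1 ≈ 0.3863 for uncorrelated (Poisson) levels (reference values from the random-matrix literature, not from this paper). A sketch:

```python
import numpy as np

def mean_spacing_ratio(levels):
    """Mean of r_n = min(s_n, s_{n+1}) / max(s_n, s_{n+1}), where
    s_n are nearest-neighbour spacings of the sorted levels.
    ~0.5307 for GOE spectra, ~0.3863 for Poisson (uncorrelated) ones."""
    s = np.diff(np.sort(levels))
    r = np.minimum(s[:-1], s[1:]) / np.maximum(s[:-1], s[1:])
    return float(np.mean(r))

rng = np.random.default_rng(0)
n = 1500
a = rng.standard_normal((n, n))
goe_levels = np.linalg.eigvalsh((a + a.T) / 2.0)     # GOE random matrix
poisson_levels = np.cumsum(rng.exponential(size=n))  # uncorrelated levels
```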

  6. Gibbs' theorem for open systems with incomplete statistics

    International Nuclear Information System (INIS)

    Bagci, G.B.

    2009-01-01

    Gibbs' theorem, which was originally intended for canonical ensembles with complete statistics, has been generalized to open systems with incomplete statistics. As a result of this generalization, it is shown that the stationary equilibrium distribution of inverse power-law form associated with the incomplete statistics has maximum entropy even for open systems with energy or matter influx. The renormalized entropy definition given in this paper can also serve as a measure of self-organization in open systems described by incomplete statistics.

  7. Management system of occupational diseases in Korea: statistics, report and monitoring system.

    Science.gov (United States)

    Rhee, Kyung Yong; Choe, Seong Weon

    2010-12-01

    The management system of occupational diseases in Korea can be assessed from the perspective of a surveillance system. Workers' compensation insurance reports are used to produce official statistics on occupational diseases in Korea. National working conditions surveys are used to monitor the magnitude of work-related symptoms and signs in the labor force. A health examination program was introduced to detect occupational diseases through both selective and mass screening programs. The Working Environment Measurement Institution assesses workers' exposure to hazards in the workplace. Under the Occupational Safety and Health Act, the government requires employers to conduct health examinations and working-environment measurements through contracted private agencies. It is hoped that these institutions may be able to effectively detect and monitor occupational diseases and hazards in the workplace. In view of this, the occupational disease management system in Korea is well designed, except for the national survey system. In the future, national surveys for the detection of hazards and ill-health outcomes in workers should be developed. The existing surveillance system for occupational diseases can be improved by providing more refined information through statistical analysis of surveillance data.

  8. Statistical methods for spatio-temporal systems

    CERN Document Server

    Finkenstadt, Barbel

    2006-01-01

    Statistical Methods for Spatio-Temporal Systems presents current statistical research issues on spatio-temporal data modeling and will promote advances in research and a greater understanding between the mechanistic and the statistical modeling communities. Contributed by leading researchers in the field, each self-contained chapter starts with an introduction of the topic and progresses to recent research results. Presenting specific examples of epidemic data of bovine tuberculosis, gastroenteric disease, and the U.K. foot-and-mouth outbreak, the first chapter uses stochastic models, such as point process models, to provide the probabilistic backbone that facilitates statistical inference from data. The next chapter discusses the critical issue of modeling random growth objects in diverse biological systems, such as bacteria colonies, tumors, and plant populations. The subsequent chapter examines data transformation tools using examples from ecology and air quality data, followed by a chapter on space-time co...

  9. Zubarev's Nonequilibrium Statistical Operator Method in the Generalized Statistics of Multiparticle Systems

    Science.gov (United States)

    Glushak, P. A.; Markiv, B. B.; Tokarchuk, M. V.

    2018-01-01

    We present a generalization of Zubarev's nonequilibrium statistical operator method based on the principle of maximum Renyi entropy. In the framework of this approach, we obtain transport equations for the basic set of parameters of the reduced description of nonequilibrium processes in a classical system of interacting particles using Liouville equations with fractional derivatives. For a classical system of particles in a medium with a fractal structure, we obtain a non-Markovian diffusion equation with fractional spatial derivatives. For a concrete model of the frequency dependence of a memory function, we obtain a generalized Cattaneo-type diffusion equation with the spatial and temporal fractality taken into account. We present a generalization of nonequilibrium thermofield dynamics in Zubarev's nonequilibrium statistical operator method in the framework of Renyi statistics.

  10. Statistical properties of dynamical systems – Simulation and abstract computation

    International Nuclear Information System (INIS)

    Galatolo, Stefano; Hoyrup, Mathieu; Rojas, Cristóbal

    2012-01-01

    Highlights: ► A survey on results about computation and computability on the statistical properties of dynamical systems. ► Computability and non-computability results for invariant measures. ► A short proof for the computability of the convergence speed of ergodic averages. ► A kind of “constructive” version of the pointwise ergodic theorem. - Abstract: We survey an area of recent development, relating dynamics to theoretical computer science. We discuss some aspects of the theoretical simulation and computation of the long term behavior of dynamical systems. We will focus on the statistical limiting behavior and invariant measures. We present a general method allowing the algorithmic approximation at any given accuracy of invariant measures. The method can be applied in many interesting cases, as we shall explain. On the other hand, we exhibit some examples where the algorithmic approximation of invariant measures is not possible. We also explain how it is possible to compute the speed of convergence of ergodic averages (when the system is known exactly) and how this entails the computation of arbitrarily good approximations of points of the space having typical statistical behaviour (a sort of constructive version of the pointwise ergodic theorem).
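    As a concrete (if much simpler) illustration of computing ergodic averages, the sketch below estimates a space average of the fully chaotic logistic map from time averages along a single orbit; the map, the observable and the starting point are illustrative choices, not taken from the paper.

```python
def ergodic_average(x0, n_iter, observable, r=4.0):
    """Time average of an observable along an orbit of the logistic map x -> r*x*(1-x)."""
    x = x0
    total = 0.0
    for _ in range(n_iter):
        x = r * x * (1.0 - x)
        total += observable(x)
    return total / n_iter

# For r = 4 the invariant density is 1/(pi*sqrt(x*(1-x))), whose mean is exactly
# 1/2, so time averages of x along a typical orbit should converge to 0.5.
for n in (10**3, 10**4, 10**5):
    print(n, ergodic_average(0.1234, n, lambda x: x))
```

    The observed convergence rate of such averages is exactly the kind of quantity the computability results above concern.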

  11. Equilibrium statistical mechanics for self-gravitating systems: local ergodicity and extended Boltzmann-Gibbs/White-Narayan statistics

    Science.gov (United States)

    He, Ping

    2012-01-01

    The long-standing puzzle surrounding the statistical mechanics of self-gravitating systems has not yet been solved successfully. We formulate a systematic theoretical framework of entropy-based statistical mechanics for spherically symmetric collisionless self-gravitating systems. We use an approach that is very different from that of the conventional statistical mechanics of short-range interaction systems. We demonstrate that the equilibrium states of self-gravitating systems consist of both mechanical and statistical equilibria, with the former characterized by a series of velocity-moment equations and the latter by statistical equilibrium equations, which should be derived from the entropy principle. The velocity-moment equations of all orders are derived from the steady-state collisionless Boltzmann equation. We point out that the ergodicity is invalid for the whole self-gravitating system, but it can be re-established locally. Based on the local ergodicity, using Fermi-Dirac-like statistics, with the non-degenerate condition and the spatial independence of the local microstates, we rederive the Boltzmann-Gibbs entropy. This is consistent with the validity of the collisionless Boltzmann equation, and should be the correct entropy form for collisionless self-gravitating systems. Apart from the usual constraints of mass and energy conservation, we demonstrate that the series of moment or virialization equations must be included as additional constraints on the entropy functional when performing the variational calculus; this is an extension to the original prescription by White & Narayan. Any possible velocity distribution can be produced by the statistical-mechanical approach that we have developed with the extended Boltzmann-Gibbs/White-Narayan statistics. Finally, we discuss the questions of negative specific heat and ensemble inequivalence for self-gravitating systems.

  12. Topics in computer simulations of statistical systems

    International Nuclear Information System (INIS)

    Salvador, R.S.

    1987-01-01

    Several computer simulations studying a variety of topics in statistical mechanics and lattice gauge theories are performed. The first study describes a Monte Carlo simulation performed on Ising systems defined on Sierpinski carpets of dimensions between one and four. The critical coupling and the exponent γ are measured as a function of dimension. The Ising gauge theory in d = 4 − ε, for ε → 0⁺, is then studied by performing a Monte Carlo simulation for the theory defined on fractals. A high-statistics Monte Carlo simulation for the three-dimensional Ising model is presented for lattices of sizes 8³ to 44³. All the data obtained agree completely, within statistical errors, with the forms predicted by finite-size scaling. Finally, a method to estimate numerically the partition function of statistical systems is developed.
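    A minimal sketch of the kind of Metropolis Monte Carlo simulation described above, here for the two-dimensional (rather than three-dimensional or fractal) Ising model; lattice size, sweep counts and temperatures are illustrative.

```python
import numpy as np

def metropolis_ising(L=16, beta=0.3, sweeps=200, seed=0):
    """Metropolis Monte Carlo for the 2D Ising model (J = 1) on an L x L
    periodic lattice, started from the ordered state. Returns the mean
    |magnetization| per spin over the second half of the run."""
    rng = np.random.default_rng(seed)
    spins = np.ones((L, L), dtype=int)
    mags = []
    for sweep in range(sweeps):
        for _ in range(L * L):
            i, j = rng.integers(0, L, size=2)
            nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
                  + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
            dE = 2.0 * spins[i, j] * nb  # energy cost of flipping spin (i, j)
            if dE <= 0 or rng.random() < np.exp(-beta * dE):
                spins[i, j] *= -1
        if sweep >= sweeps // 2:
            mags.append(abs(spins.mean()))
    return float(np.mean(mags))

# The exact 2D critical point is at beta_c ≈ 0.4407: above it (colder) the
# system stays ordered, below it (hotter) the magnetization is small.
print(metropolis_ising(beta=0.2), metropolis_ising(beta=0.6))
```

    Production studies like the one above use far larger lattices and far longer runs; the structure of the update loop is the same.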

  13. Statistical Thermodynamics of Disperse Systems

    DEFF Research Database (Denmark)

    Shapiro, Alexander

    1996-01-01

    Principles of statistical physics are applied for the description of thermodynamic equilibrium in disperse systems. The cells of disperse systems are shown to possess a number of non-standard thermodynamic parameters. A random distribution of these parameters in the system is determined....... On the basis of this distribution, it is established that the disperse system has an additional degree of freedom called the macro-entropy. A large set of bounded ideal disperse systems allows exact evaluation of thermodynamic characteristics. The theory developed is applied to the description of equilibrium...

  14. Obtaining Internet Flow Statistics by Volunteer-Based System

    DEFF Research Database (Denmark)

    Pedersen, Jens Myrup; Bujlow, Tomasz

    2012-01-01

    In this paper we demonstrate how the Volunteer Based System for Research on the Internet, developed at Aalborg University, can be used for creating statistics of Internet usage. Since the data is collected on individual machines, the statistics can be made on the basis of both individual users......, and average flow durations. The paper is concluded with a discussion on what further statistics can be made, and the further development of the system....
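    The per-user statistics mentioned above can be sketched in a few lines of Python; the record format and field names here are hypothetical, not those of the Volunteer Based System.

```python
from collections import defaultdict

# Hypothetical flow records: (user, application, start_time_s, end_time_s)
flows = [
    ("alice", "http",  0.0,  4.2),
    ("alice", "dns",   1.0,  1.1),
    ("bob",   "http",  2.0, 30.5),
    ("bob",   "video", 3.0, 93.0),
]

def flow_stats(records):
    """Per-user flow count and mean flow duration, as a plain dict."""
    durations = defaultdict(list)
    for user, _app, start, end in records:
        durations[user].append(end - start)
    return {u: {"flows": len(d), "mean_duration_s": sum(d) / len(d)}
            for u, d in durations.items()}

print(flow_stats(flows))
```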

  15. Bridging Weighted Rules and Graph Random Walks for Statistical Relational Models

    Directory of Open Access Journals (Sweden)

    Seyed Mehran Kazemi

    2018-02-01

    The aim of statistical relational learning is to learn statistical models from relational or graph-structured data. Three main statistical relational learning paradigms include weighted rule learning, random walks on graphs, and tensor factorization. These paradigms have mostly been developed and studied in isolation for many years, with few works attempting to understand the relationships among them or to combine them. In this article, we study the relationship between the path ranking algorithm (PRA), one of the most well-known relational learning methods in the graph random walk paradigm, and relational logistic regression (RLR), one of the recent developments in weighted rule learning. We provide a simple way to normalize relations and prove that relational logistic regression using normalized relations generalizes the path ranking algorithm. This result provides a better understanding of relational learning, especially for the weighted rule learning and graph random walk paradigms. It opens up the possibility of using the more flexible RLR rules within PRA models and even generalizing both by including normalized and unnormalized relations in the same model.

  16. A statistical model for instable thermodynamical systems

    International Nuclear Information System (INIS)

    Sommer, Jens-Uwe

    2003-01-01

    A generic model is presented for statistical systems which display thermodynamic features in contrast to our everyday experience, such as infinite and negative heat capacities. Such systems are instable in terms of classical equilibrium thermodynamics. Using our statistical model, we are able to investigate states of instable systems which are undefined in the framework of equilibrium thermodynamics. We show that a region of negative heat capacity in the adiabatic environment leads to a first-order-like phase transition when the system is coupled to a heat reservoir. This phase transition takes place without phase coexistence. Nevertheless, all intermediate states are stable due to fluctuations. When two instable systems are brought into thermal contact, the temperature of the composed system is lower than the minimum temperature of the individual systems. Generally, the equilibrium states of instable systems cannot simply be decomposed into equilibrium states of the individual systems. The properties of instable systems depend on the environment; ensemble equivalence is broken.

  17. Statistical inference for noisy nonlinear ecological dynamic systems.

    Science.gov (United States)

    Wood, Simon N

    2010-08-26

    Chaotic ecological dynamic systems defy conventional statistical analysis. Systems with near-chaotic dynamics are little better. Such systems are almost invariably driven by endogenous dynamic processes plus demographic and environmental process noise, and are only observable with error. Their sensitivity to history means that minute changes in the driving noise realization, or the system parameters, will cause drastic changes in the system trajectory. This sensitivity is inherited and amplified by the joint probability density of the observable data and the process noise, rendering it useless as the basis for obtaining measures of statistical fit. Because the joint density is the basis for the fit measures used by all conventional statistical methods, this is a major theoretical shortcoming. The inability to make well-founded statistical inferences about biological dynamic models in the chaotic and near-chaotic regimes, other than on an ad hoc basis, leaves dynamic theory without the methods of quantitative validation that are essential tools in the rest of biological science. Here I show that this impasse can be resolved in a simple and general manner, using a method that requires only the ability to simulate the observed data on a system from the dynamic model about which inferences are required. The raw data series are reduced to phase-insensitive summary statistics, quantifying local dynamic structure and the distribution of observations. Simulation is used to obtain the mean and the covariance matrix of the statistics, given model parameters, allowing the construction of a 'synthetic likelihood' that assesses model fit. This likelihood can be explored using a straightforward Markov chain Monte Carlo sampler, but one further post-processing step returns pure likelihood-based inference. I apply the method to establish the dynamic nature of the fluctuations in Nicholson's classic blowfly experiments.
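    The synthetic-likelihood recipe described in the abstract (reduce data to summary statistics, simulate to get their mean and covariance, then score parameters with a Gaussian likelihood) can be sketched as follows. The AR(1) toy model and this particular choice of summary statistics are illustrative stand-ins for an ecological model, not Wood's actual blowfly analysis.

```python
import numpy as np

def summary_stats(series):
    """Phase-insensitive summaries: mean, variance, lag-1 autocorrelation."""
    x = np.asarray(series, float)
    xc = x - x.mean()
    ac1 = float((xc[1:] * xc[:-1]).sum()) / float((xc * xc).sum())
    return np.array([x.mean(), x.var(), ac1])

def synthetic_loglik(theta, observed_stats, simulate, n_sim=200, seed=0):
    """log N(s_obs | mean, cov) of the summary statistics under the model at theta."""
    rng = np.random.default_rng(seed)
    sims = np.array([summary_stats(simulate(theta, rng)) for _ in range(n_sim)])
    mu = sims.mean(axis=0)
    cov = np.cov(sims.T) + 1e-9 * np.eye(len(mu))  # jitter for numerical safety
    diff = observed_stats - mu
    _sign, logdet = np.linalg.slogdet(cov)
    return -0.5 * (diff @ np.linalg.solve(cov, diff) + logdet
                   + len(mu) * np.log(2 * np.pi))

# Toy model: AR(1) process x_t = theta * x_{t-1} + noise, "observed" at theta = 0.7.
def simulate(theta, rng, n=300):
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = theta * x[t - 1] + rng.standard_normal()
    return x

obs = summary_stats(simulate(0.7, np.random.default_rng(42)))
lls = {th: synthetic_loglik(th, obs, simulate) for th in (0.2, 0.7)}
print(lls)
```

    In Wood's method this synthetic likelihood is then explored with a Markov chain Monte Carlo sampler over theta.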

  18. TRANSIT TIMING OBSERVATIONS FROM KEPLER. VI. POTENTIALLY INTERESTING CANDIDATE SYSTEMS FROM FOURIER-BASED STATISTICAL TESTS

    International Nuclear Information System (INIS)

    Steffen, Jason H.; Ford, Eric B.; Rowe, Jason F.; Borucki, William J.; Bryson, Steve; Caldwell, Douglas A.; Jenkins, Jon M.; Koch, David G.; Sanderfer, Dwight T.; Seader, Shawn; Twicken, Joseph D.; Fabrycky, Daniel C.; Holman, Matthew J.; Welsh, William F.; Batalha, Natalie M.; Ciardi, David R.; Kjeldsen, Hans; Prša, Andrej

    2012-01-01

    We analyze the deviations of transit times from a linear ephemeris for the Kepler Objects of Interest (KOI) through quarter six of science data. We conduct two statistical tests for all KOIs and a related statistical test for all pairs of KOIs in multi-transiting systems. These tests identify several systems which show potentially interesting transit timing variations (TTVs). Strong TTV systems have been valuable for the confirmation of planets and their mass measurements. Many of the systems identified in this study should prove fruitful for detailed TTV studies.

  19. Transit timing observations from Kepler. VI. Potentially interesting candidate systems from fourier-based statistical tests

    DEFF Research Database (Denmark)

    Steffen, J.H.; Ford, E.B.; Rowe, J.F.

    2012-01-01

    We analyze the deviations of transit times from a linear ephemeris for the Kepler Objects of Interest (KOI) through quarter six of science data. We conduct two statistical tests for all KOIs and a related statistical test for all pairs of KOIs in multi-transiting systems. These tests identify...... several systems which show potentially interesting transit timing variations (TTVs). Strong TTV systems have been valuable for the confirmation of planets and their mass measurements. Many of the systems identified in this study should prove fruitful for detailed TTV studies....
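    A toy version of the first step in such a TTV analysis (fitting a linear ephemeris by least squares and examining the timing residuals) might look like this; the period, epoch and sinusoidal perturbation below are invented for illustration.

```python
import numpy as np

def ttv_residuals(transit_times):
    """Fit a linear ephemeris t_n = t0 + n * P by least squares and return
    the timing residuals (observed minus calculated, the classic 'O - C')."""
    times = np.asarray(transit_times, float)
    n = np.arange(len(times))
    period, t0 = np.polyfit(n, times, 1)
    return times - (t0 + period * n)

# Hypothetical transit times (days): a 10-day period plus a small sinusoidal TTV.
epochs = np.arange(20)
times = 100.0 + 10.0 * epochs + 0.01 * np.sin(2 * np.pi * epochs / 7.0)
resid = ttv_residuals(times)
print(resid.std())
```

    The statistical tests in the paper then ask whether such residuals show significant structure, e.g. via Fourier-based statistics.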

  20. Statistical physics of complex systems a concise introduction

    CERN Document Server

    Bertin, Eric

    2016-01-01

    This course-tested primer provides graduate students and non-specialists with a basic understanding of the concepts and methods of statistical physics and demonstrates their wide range of applications to interdisciplinary topics in the field of complex system sciences, including selected aspects of theoretical modeling in biology and the social sciences. Generally speaking, the goals of statistical physics may be summarized as follows: on the one hand to study systems composed of a large number of interacting units, and on the other to predict the macroscopic, collective behavior of the system considered from the perspective of the microscopic laws governing the dynamics of the individual entities. These two goals are essentially also shared by what is now called 'complex systems science', and as such, systems studied in the framework of statistical physics may be considered to be among the simplest examples of complex systems – while also offering a rather well developed mathematical treatment. The second ...

  1. Flow Equation Approach to the Statistics of Nonlinear Dynamical Systems

    Science.gov (United States)

    Marston, J. B.; Hastings, M. B.

    2005-03-01

    The probability distribution function of non-linear dynamical systems is governed by a linear framework that resembles quantum many-body theory, in which stochastic forcing and/or averaging over initial conditions play the role of non-zero ℏ. Besides the well-known Fokker-Planck approach, there is a related Hopf functional method [Uriel Frisch, Turbulence: The Legacy of A. N. Kolmogorov (Cambridge University Press, 1995), chapter 9.5]; in both formalisms, zero modes of linear operators describe the stationary non-equilibrium statistics. To access the statistics, we investigate the method of continuous unitary transformations [S. D. Glazek and K. G. Wilson, Phys. Rev. D 48, 5863 (1993); Phys. Rev. D 49, 4214 (1994)], also known as the flow equation approach [F. Wegner, Ann. Phys. 3, 77 (1994)], suitably generalized to the diagonalization of non-Hermitian matrices. Comparison to the more traditional cumulant expansion method is illustrated with low-dimensional attractors. The treatment of high-dimensional dynamical systems is also discussed.

  2. MANAGERIAL DECISION IN INNOVATIVE EDUCATION SYSTEMS STATISTICAL SURVEY BASED ON SAMPLE THEORY

    Directory of Open Access Journals (Sweden)

    Gheorghe SĂVOIU

    2012-12-01

    Before formulating the statistical hypotheses and carrying out the econometric testing itself, some technical issues need to be broken down. These concern managerial decisions in innovative educational systems and the educational managerial phenomenon tested through statistical and mathematical methods, namely the significant differences in the perception of current and desirable qualities, knowledge, experience, behaviour and health, obtained through a questionnaire applied to a stratified population in the educational environment, comprising respondents with either purely educational activities or with simultaneously managerial and educational activities. The details of the research, based on survey theory and turning the questionnaires and the statistical data processed from them into working tools, are summarized below.
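    As a minimal illustration of the sample-theory machinery the abstract alludes to, here is a stratified estimate of a population mean; the two strata, the scores and the sample sizes are entirely hypothetical.

```python
import random

def stratified_mean(strata, n_per_stratum, seed=0):
    """Stratified estimate of a population mean: sample each stratum,
    then weight each stratum's sample mean by its share of the population."""
    rng = random.Random(seed)
    total = sum(len(s) for s in strata)
    est = 0.0
    for s in strata:
        sample = rng.sample(s, min(n_per_stratum, len(s)))
        est += (len(s) / total) * (sum(sample) / len(sample))
    return est

# Hypothetical questionnaire scores from two strata of respondents
# (e.g. purely educational vs. combined managerial-educational staff).
educators = [70 + (i % 10) for i in range(200)]  # stratum mean 74.5
managers = [60 + (i % 5) for i in range(50)]     # stratum mean 62.0
print(stratified_mean([educators, managers], n_per_stratum=20))
```

    The weighted estimate lands near the true population mean of 72.0, which equal-probability sampling of 40 respondents would estimate with higher variance.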

  3. Haldane's statistical interactions and universal properties of anyon systems

    International Nuclear Information System (INIS)

    Protogenov, A.

    1995-03-01

    The exclusion principle of fractional statistics proposed by Haldane is applied to systems with internal degrees of freedom. The symmetry of these systems is included in the statistical interaction matrix, which contains the Cartan matrix of Lie algebras. The solutions of the equations for the statistical weights, which coincide with the thermodynamic Bethe ansatz equations, are determined in the high-temperature limit by the squares of q-deformed dimensions of irreducible representations. The entropy and other thermodynamic properties of anyon systems in this limit are completely characterized by the algebraic structure of symmetry in the universal form. (author). 39 refs

  4. A statistical investigation of the mass discrepancy-acceleration relation

    Science.gov (United States)

    Desmond, Harry

    2017-02-01

    We use the mass discrepancy-acceleration relation (the correlation between the ratio of total-to-visible mass and acceleration in galaxies; MDAR) to test the galaxy-halo connection. We analyse the MDAR using a set of 16 statistics that quantify its four most important features: shape, scatter, the presence of a 'characteristic acceleration scale', and the correlation of its residuals with other galaxy properties. We construct an empirical framework for the galaxy-halo connection in ΛCDM to generate predictions for these statistics, starting with conventional correlations (halo abundance matching; AM) and introducing more where required. Comparing to the SPARC data, we find that: (1) the approximate shape of the MDAR is readily reproduced by AM, and there is no evidence that the acceleration at which dark matter becomes negligible has less spread in the data than in AM mocks; (2) even under conservative assumptions, AM significantly overpredicts the scatter in the relation and its normalization at low acceleration, and furthermore positions dark matter too close to galaxies' centres on average; (3) the MDAR affords 2σ evidence for an anticorrelation of galaxy size and Hubble type with halo mass or concentration at fixed stellar mass. Our analysis lays the groundwork for a bottom-up determination of the galaxy-halo connection from relations such as the MDAR, provides concrete statistical tests for specific galaxy formation models, and brings into sharper focus the relative evidence accorded by galaxy kinematics to ΛCDM and modified gravity alternatives.

  5. Quantifying fluctuations in economic systems by adapting methods of statistical physics

    Science.gov (United States)

    Stanley, H. E.; Gopikrishnan, P.; Plerou, V.; Amaral, L. A. N.

    2000-12-01

    The emerging subfield of econophysics explores the degree to which certain concepts and methods from statistical physics can be appropriately modified and adapted to provide new insights into questions that have been the focus of interest in the economics community. Here we give a brief overview of two examples of research topics that are receiving recent attention. A first topic is the characterization of the dynamics of stock price fluctuations. For example, we investigate the relation between trading activity, measured by the number of transactions N_Δt, and the price change G_Δt for a given stock over a time interval [t, t + Δt]. We relate the time-dependent standard deviation of price fluctuations (volatility) to two microscopic quantities: the number of transactions N_Δt in Δt and the variance W²_Δt of the price changes for all transactions in Δt. Our work indicates that while the pronounced tails in the distribution of price fluctuations arise from W_Δt, the long-range correlations found in |G_Δt| are largely due to N_Δt. We also investigate the relation between price fluctuations and the number of shares Q_Δt traded in Δt. We find that the distribution of Q_Δt is consistent with a stable Lévy distribution, suggesting a Lévy scaling relationship between Q_Δt and N_Δt, which would provide one explanation for volume-volatility co-movement. A second topic concerns cross-correlations between the price fluctuations of different stocks. We adapt a conceptual framework, random matrix theory (RMT), first used in physics to interpret statistical properties of nuclear energy spectra. RMT makes predictions for the statistical properties of matrices that are universal, that is, do not depend on the interactions between the elements comprising the system. In physics systems, deviations from the predictions of RMT provide clues regarding the mechanisms controlling the dynamics of a given system, so this framework can be of potential value if
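    The decomposition of volatility into a number of transactions and a per-transaction variance suggests a simple null picture in which the interval price change is a sum of N_Δt i.i.d. transaction-level changes, so that volatility grows like W_Δt·√N_Δt. A quick simulation of that (purely illustrative) picture:

```python
import numpy as np

rng = np.random.default_rng(1)

def interval_changes(n_trades, w, n_intervals=20000):
    """Price changes over n_intervals intervals, each the sum of n_trades
    i.i.d. transaction-level changes with standard deviation w."""
    return w * rng.standard_normal((n_intervals, n_trades)).sum(axis=1)

# Volatility (std of the interval change) should scale like w * sqrt(n_trades).
for n in (25, 100, 400):
    print(n, float(np.std(interval_changes(n, w=0.1))))
```

    The empirical findings in the abstract go beyond this null picture: real transaction-level changes are not i.i.d. Gaussian, which is where the fat tails and long-range correlations enter.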

  6. Statistics of Shared Components in Complex Component Systems

    Science.gov (United States)

    Mazzolini, Andrea; Gherardi, Marco; Caselle, Michele; Cosentino Lagomarsino, Marco; Osella, Matteo

    2018-04-01

    Many complex systems are modular. Such systems can be represented as "component systems," i.e., sets of elementary components, such as LEGO bricks in LEGO sets. The bricks found in a LEGO set reflect a target architecture, which can be built following a set-specific list of instructions. In other component systems, instead, the underlying functional design and constraints are not obvious a priori, and their detection is often a challenge of both scientific and practical importance, requiring a clear understanding of component statistics. Importantly, some quantitative invariants appear to be common to many component systems, most notably a common broad distribution of component abundances, which often resembles the well-known Zipf's law. Such "laws" affect in a general and nontrivial way the component statistics, potentially hindering the identification of system-specific functional constraints or generative processes. Here, we specifically focus on the statistics of shared components, i.e., the distribution of the number of components shared by different system realizations, such as the common bricks found in different LEGO sets. To account for the effects of component heterogeneity, we consider a simple null model, which builds system realizations by random draws from a universe of possible components. Under general assumptions on abundance heterogeneity, we provide analytical estimates of component occurrence, which quantify exhaustively the statistics of shared components. Surprisingly, this simple null model can positively explain important features of empirical component-occurrence distributions obtained from large-scale data on bacterial genomes, LEGO sets, and book chapters. Specific architectural features and functional constraints can be detected from occurrence patterns as deviations from these null predictions, as we show for the illustrative case of the "core" genome in bacteria.
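    The null model described here (system realizations drawn at random from a heterogeneous universe of components) is easy to simulate; the universe size, set size and Zipf-like weights below are illustrative choices, not fitted to any of the paper's data sets.

```python
import numpy as np

rng = np.random.default_rng(0)

# Universe of 1000 component types with Zipf-like abundances ~ 1/rank.
n_types, n_sets, set_size = 1000, 50, 100
weights = 1.0 / np.arange(1, n_types + 1)
probs = weights / weights.sum()

# Each "set" (system realization) is a weighted random draw from the universe.
sets = [rng.choice(n_types, size=set_size, replace=False, p=probs)
        for _ in range(n_sets)]

# Occurrence of a component = number of sets that contain it.
occurrence = np.zeros(n_types, dtype=int)
for s in sets:
    occurrence[s] += 1

# High-abundance components end up shared by almost all sets (a "core"),
# while most components occur in few or no sets, purely from heterogeneity.
print(occurrence[:5], int((occurrence == 0).sum()))
```

    Deviations of real occurrence distributions from this kind of null prediction are what the paper uses to detect functional constraints.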

  7. Thermodynamic Bethe ansatz with Haldane statistics

    International Nuclear Information System (INIS)

    Bytsko, A.G.; Fring, A.

    1998-01-01

    We derive the thermodynamic Bethe ansatz equation for the situation in which the statistical interaction of a multi-particle system is governed by Haldane statistics. We formulate a macroscopical equivalence principle for such systems. Particular CDD ambiguities play a distinguished role in compensating the ambiguity in the exclusion statistics. We derive Y-systems related to generalized statistics. We discuss several fermionic, bosonic and anyonic versions of affine Toda field theories and Calogero-Sutherland type models in the context of generalized statistics. (orig.)

  8. A computerized diagnostic system for nuclear plant control rooms based on statistical quality control

    International Nuclear Information System (INIS)

    Heising, C.D.; Grenzebach, W.S.

    1990-01-01

    In engineering science, statistical quality control techniques have traditionally been applied to control manufacturing processes. An application to commercial nuclear power plant maintenance and control is presented that can greatly improve safety. As a demonstration of such an approach to plant maintenance and control, a specific system is analyzed: the reactor coolant pumps of the St. Lucie Unit 2 nuclear power plant located in Florida. A 30-day history of the four pumps prior to a plant shutdown caused by pump failure and a related fire within the containment was analyzed. Statistical quality control charts of recorded variables were constructed for each pump, which were shown to go out of statistical control many days before the plant trip. The analysis shows that statistical process control methods can be applied as an early warning system capable of identifying significant equipment problems well in advance of traditional control room alarm indicators
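    A minimal sketch of the statistical-quality-control idea described above: Shewhart-style 3σ limits estimated from an in-control baseline, then flagging out-of-control readings. The pump-vibration numbers are synthetic stand-ins, not the St. Lucie data.

```python
import numpy as np

def xbar_limits(baseline):
    """Center line and 3-sigma control limits from in-control baseline data."""
    mu, sigma = np.mean(baseline), np.std(baseline, ddof=1)
    return mu, mu - 3 * sigma, mu + 3 * sigma

def out_of_control(readings, center, lcl, ucl):
    """Indices of readings falling outside the control limits."""
    return [i for i, x in enumerate(readings) if x < lcl or x > ucl]

# Hypothetical pump readings: a stable baseline, then a slow drift that
# crosses the upper control limit well before any hard alarm threshold.
rng = np.random.default_rng(7)
baseline = 5.0 + 0.1 * rng.standard_normal(100)
center, lcl, ucl = xbar_limits(baseline)
drifting = 5.0 + 0.1 * rng.standard_normal(30) + 0.05 * np.arange(30)
print(out_of_control(drifting, center, lcl, ucl))
```

    This is the sense in which control charts act as an early warning system: the drift is flagged while the absolute readings still look unremarkable.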

  9. Quantum level statistics of pseudointegrable billiards

    International Nuclear Information System (INIS)

    Cheon, T.; Cohen, T.D.

    1989-01-01

    We study the spectral statistics of systems of two-dimensional pseudointegrable billiards. These systems are classically nonergodic, but nonseparable. It is found that such systems possess quantum spectra which are closely simulated by the Gaussian orthogonal ensemble. We discuss the implications of these results on the conjectured relation between classical chaos and quantum level statistics. We emphasize the importance of the semiclassical nature of any such relation
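    The contrast between Poisson-like and GOE-like level statistics can be seen numerically without any billiard at all, using the (unfolding-free) consecutive-gap-ratio statistic; the matrix size and seed below are arbitrary illustrative choices.

```python
import numpy as np

def mean_gap_ratio(levels):
    """Mean of min(r, 1/r) over ratios r of consecutive level spacings:
    ~0.386 for Poisson (integrable-like) spectra, ~0.531 for GOE
    (chaotic-like) spectra."""
    s = np.diff(np.sort(levels))
    r = s[1:] / s[:-1]
    return float(np.mean(np.minimum(r, 1.0 / r)))

rng = np.random.default_rng(3)

# GOE: symmetrize a Gaussian matrix; keep the central half of the spectrum.
n = 1000
a = rng.standard_normal((n, n))
goe_levels = np.linalg.eigvalsh((a + a.T) / 2.0)[n // 4: 3 * n // 4]

# Poisson: independent uniform levels, as for generic integrable systems.
poisson_levels = np.sort(rng.uniform(0.0, 1.0, n))

print(mean_gap_ratio(goe_levels), mean_gap_ratio(poisson_levels))
```

    The surprise reported in the abstract is that pseudointegrable billiards, though classically nonergodic, fall on the GOE side of this divide.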

  10. Statistical physics of networks, information and complex systems

    Energy Technology Data Exchange (ETDEWEB)

    Ecke, Robert E [Los Alamos National Laboratory

    2009-01-01

    In this project we explore the mathematical methods and concepts of statistical physics that are finding abundant applications across the scientific and technological spectrum from soft condensed matter systems and bioinformatics to economic and social systems. Our approach exploits the considerable similarity of concepts between statistical physics and computer science, allowing for a powerful multi-disciplinary approach that draws its strength from cross-fertilization and multiple interactions of researchers with different backgrounds. The work on this project takes advantage of the newly appreciated connection between computer science and statistics and addresses important problems in data storage, decoding, optimization, the information processing properties of the brain, the interface between quantum and classical information science, the verification of large software programs, modeling of complex systems including disease epidemiology, resource distribution issues, and the nature of highly fluctuating complex systems. Common themes that the project has been emphasizing are (i) neural computation, (ii) network theory and its applications, and (iii) a statistical physics approach to information theory. The project's efforts focus on the general problem of optimization and variational techniques, algorithm development and information theoretic approaches to quantum systems. These efforts are responsible for fruitful collaborations and the nucleation of science efforts that span multiple divisions such as EES, CCS, D, T, ISR and P. This project supports the DOE mission in Energy Security and Nuclear Non-Proliferation by developing novel information science tools for communication, sensing, and interacting complex networks such as the internet or energy distribution system. The work also supports programs in Threat Reduction and Homeland Security.

  11. Statistical Modeling of Large Wind Plant System's Generation - A Case Study

    International Nuclear Information System (INIS)

    Sabolic, D.

    2014-01-01

    This paper presents simple, yet very accurate, descriptive statistical models of various static and dynamic parameters of energy output from a large system of wind plants operated by Bonneville Power Administration (BPA), USA. The system's size at the end of 2013 was 4515 MW of installed capacity. The 5-minute readings from the beginning of 2007 to the end of 2013, recorded and published by BPA, were used to derive a number of experimental distributions, which were then used to devise theoretical statistical models with merely one or two parameters. In spite of the simplicity, they reproduced experimental data with great accuracy, which was checked by rigorous tests of goodness-of-fit. Statistical distribution functions were obtained for the following wind generation-related quantities: total generation as percentage of total installed capacity; change in total generation power in 5, 10, 15, 20, 25, 30, 45, and 60 minutes as percentage of total installed capacity; duration of intervals with total generated power, expressed as percentage of total installed capacity, lower than a certain pre-specified level. Limitation of total installed wind plant capacity, when it is determined by regulation demand from wind plants, is also discussed. The models presented here can be utilized in analyses related to power system economics/policy, which is also briefly discussed in the paper. (author).
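    One of the quantities listed above (duration of intervals with generation below a pre-specified level) is straightforward to extract from a 5-minute series; the synthetic random-walk series below stands in for the BPA data, which are not reproduced here.

```python
import numpy as np

def low_output_durations(power_pct, threshold, step_min=5):
    """Durations (minutes) of the intervals in which generation stays below
    `threshold` percent of installed capacity, from 5-minute readings."""
    below = power_pct < threshold
    durations, run = [], 0
    for flag in below:
        if flag:
            run += 1
        elif run:
            durations.append(run * step_min)
            run = 0
    if run:
        durations.append(run * step_min)
    return durations

# Synthetic stand-in for the 5-minute series: a bounded random walk in [0, 100].
rng = np.random.default_rng(5)
power = np.clip(30 + 0.5 * np.cumsum(rng.normal(0.0, 1.0, 5000)), 0, 100)
d = low_output_durations(power, threshold=20)
print(len(d), max(d) if d else 0)
```

    Fitting a one- or two-parameter distribution to such durations, and checking it with a goodness-of-fit test, mirrors the paper's procedure.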

  12. Statistics of Shared Components in Complex Component Systems

    Directory of Open Access Journals (Sweden)

    Andrea Mazzolini

    2018-04-01

    Many complex systems are modular. Such systems can be represented as “component systems,” i.e., sets of elementary components, such as LEGO bricks in LEGO sets. The bricks found in a LEGO set reflect a target architecture, which can be built following a set-specific list of instructions. In other component systems, instead, the underlying functional design and constraints are not obvious a priori, and their detection is often a challenge of both scientific and practical importance, requiring a clear understanding of component statistics. Importantly, some quantitative invariants appear to be common to many component systems, most notably a common broad distribution of component abundances, which often resembles the well-known Zipf’s law. Such “laws” affect in a general and nontrivial way the component statistics, potentially hindering the identification of system-specific functional constraints or generative processes. Here, we specifically focus on the statistics of shared components, i.e., the distribution of the number of components shared by different system realizations, such as the common bricks found in different LEGO sets. To account for the effects of component heterogeneity, we consider a simple null model, which builds system realizations by random draws from a universe of possible components. Under general assumptions on abundance heterogeneity, we provide analytical estimates of component occurrence, which quantify exhaustively the statistics of shared components. Surprisingly, this simple null model can positively explain important features of empirical component-occurrence distributions obtained from large-scale data on bacterial genomes, LEGO sets, and book chapters. Specific architectural features and functional constraints can be detected from occurrence patterns as deviations from these null predictions, as we show for the illustrative case of the “core” genome in bacteria.

  13. Statistical Model Checking for Biological Systems

    DEFF Research Database (Denmark)

    David, Alexandre; Larsen, Kim Guldstrand; Legay, Axel

    2014-01-01

    Statistical Model Checking (SMC) is a highly scalable simulation-based verification approach for testing and estimating the probability that a stochastic system satisfies a given linear temporal property. The technique has been applied to (discrete and continuous time) Markov chains, stochastic...

  14. An application of an optimal statistic for characterizing relative orientations

    Science.gov (United States)

    Jow, Dylan L.; Hill, Ryley; Scott, Douglas; Soler, J. D.; Martin, P. G.; Devlin, M. J.; Fissel, L. M.; Poidevin, F.

    2018-02-01

    We present the projected Rayleigh statistic (PRS), a modification of the classic Rayleigh statistic, as a test for non-uniform relative orientation between two pseudo-vector fields. In the application here, this gives an effective way of investigating whether polarization pseudo-vectors (spin-2 quantities) are preferentially parallel or perpendicular to filaments in the interstellar medium. There are other potential applications in astrophysics, e.g., when comparing small-scale orientations with larger-scale shear patterns. We compare the efficiency of the PRS against histogram-binning methods that have previously been used for characterizing the relative orientations of gas column density structures with the magnetic field projected on the plane of the sky. We examine data for the Vela C molecular cloud, where the column density is inferred from Herschel submillimetre observations, and the magnetic field from observations by the Balloon-borne Large-Aperture Submillimetre Telescope in the 250-, 350- and 500-μm wavelength bands. We find that the PRS has greater statistical power than approaches that bin the relative orientation angles, as it makes more efficient use of the information contained in the data. In particular, the use of the PRS to test for preferential alignment results in a higher statistical significance in each of the four Vela C regions, with the greatest increase being by a factor of 1.3 in the South-Nest region in the 250-μm band.
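
The statistic itself is simple to compute. The sketch below assumes the commonly quoted form Z = Σ cos(2θ_i) / sqrt(n/2), with θ_i the relative orientation angle between the two fields at sample i; the exact normalization and angle convention should be checked against the paper.

```python
import math, random

def projected_rayleigh(thetas):
    """Projected Rayleigh statistic for relative orientation angles (radians)
    between two spin-2 (pseudo-vector) fields.
    Z >> 0: preferentially parallel; Z << 0: preferentially perpendicular;
    Z ~ 0: consistent with uniform relative orientation.
    NOTE: normalization follows the common PRS definition; verify against the paper."""
    n = len(thetas)
    return sum(math.cos(2.0 * t) for t in thetas) / math.sqrt(n / 2.0)

rng = random.Random(1)
parallel = [rng.gauss(0.0, 0.2) for _ in range(1000)]            # angles near 0
perpendicular = [rng.gauss(math.pi / 2, 0.2) for _ in range(1000)]
uniform = [rng.uniform(-math.pi / 2, math.pi / 2) for _ in range(1000)]
print(projected_rayleigh(parallel), projected_rayleigh(perpendicular),
      projected_rayleigh(uniform))
```

Because every sample contributes its full cos(2θ) rather than a bin count, the statistic uses all the angular information, which is the source of the extra power reported over histogram binning.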

  15. Statistical estimation Monte Carlo for unreliability evaluation of highly reliable system

    International Nuclear Information System (INIS)

    Xiao Gang; Su Guanghui; Jia Dounan; Li Tianduo

    2000-01-01

    Based on analog Monte Carlo simulation, statistical estimation Monte Carlo methods for the unreliability evaluation of highly reliable systems are constructed, including a direct statistical estimation Monte Carlo method and a weighted statistical estimation Monte Carlo method. The basic elements are given, and the statistical estimation Monte Carlo estimators are derived. The direct Monte Carlo simulation method, the bounding-sampling method, the forced-transitions Monte Carlo method, direct statistical estimation Monte Carlo, and weighted statistical estimation Monte Carlo are used to evaluate the unreliability of the same system. By comparison, the weighted statistical estimation Monte Carlo estimator has the smallest variance and the highest computational efficiency
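
The variance advantage of weighted (importance-sampled) estimation over direct analog simulation is easy to demonstrate on a toy rare event. The sketch below is illustrative only, not the paper's system model: it estimates the small probability P(X > 4) for a standard normal X, first by direct sampling and then by sampling from a distribution shifted into the "failure" region and re-weighting by the likelihood ratio.

```python
import math, random

TRUE_P = 0.5 * math.erfc(4.0 / math.sqrt(2.0))   # exact tail probability, ~3.17e-5

rng = random.Random(42)
N = 100_000

# Direct (analog) estimator: almost every sample misses the rare region.
direct = sum(1 for _ in range(N) if rng.gauss(0.0, 1.0) > 4.0) / N

# Weighted estimator: sample from N(4, 1), weight each hit by the likelihood
# ratio phi(x) / phi(x - 4) = exp(8 - 4x).
total = 0.0
for _ in range(N):
    x = rng.gauss(4.0, 1.0)
    if x > 4.0:
        total += math.exp(8.0 - 4.0 * x)
weighted = total / N

print(TRUE_P, direct, weighted)
```

With the same number of samples, the direct estimator sees only a handful of hits (often zero), while the weighted estimator lands within a fraction of a percent of the true value, which is the variance-reduction effect the abstract reports.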

  16. Characterization of a Compton suppression system and the applicability of Poisson statistics

    International Nuclear Information System (INIS)

    Nicholson, G.; Landsberger, S.; Welch, L.

    2008-01-01

    The Compton suppression system (CSS) has been thoroughly characterized at the University of Texas' Nuclear Engineering Teaching Laboratory (NETL). The effects of dead-time, of sample displacement from the primary detector, and of the primary energy detector's position relative to the active shield detector have been measured and analyzed. The applicability of Poisson counting statistics to Compton suppression spectroscopy has also been evaluated. (author)

  17. Statistical language learning in neonates revealed by event-related brain potentials

    Directory of Open Access Journals (Sweden)

    Näätänen Risto

    2009-03-01

    Background: Statistical learning is a candidate for one of the basic prerequisites underlying the expeditious acquisition of spoken language. Infants from 8 months of age exhibit this form of learning to segment fluent speech into distinct words. To test statistical learning skills at birth, we recorded event-related brain responses of sleeping neonates while they were listening to a stream of syllables containing statistical cues to word boundaries. Results: We found evidence that sleeping neonates are able to automatically extract statistical properties of the speech input and thus detect the word boundaries in a continuous stream of syllables containing no morphological cues. Syllable-specific event-related brain responses found in two separate studies demonstrated that the neonatal brain treated the syllables differently according to their position within pseudowords. Conclusion: These results demonstrate that neonates can efficiently learn transitional probabilities or frequencies of co-occurrence between different syllables, enabling them to detect word boundaries and in this way isolate single words out of fluent natural speech. The ability to extract statistical structure from speech may play a fundamental role as one of the earliest prerequisites of language acquisition.
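
The word-boundary cue exploited in such experiments is the drop in transitional probability between syllables at word edges. A minimal sketch with made-up pseudowords (the syllables here are illustrative, not the study's stimuli): in a stream built from "tupiro", "golabu" and "bidaku", within-word transitions such as tu→pi always occur (TP = 1.0), while a word-final syllable can be followed by any of the three word-initial syllables (TP ≈ 1/3).

```python
import random
from collections import Counter

words = [("tu", "pi", "ro"), ("go", "la", "bu"), ("bi", "da", "ku")]
rng = random.Random(0)
# concatenate 3000 randomly chosen pseudowords into one continuous stream
stream = [syl for _ in range(3000) for syl in rng.choice(words)]

pair_counts = Counter(zip(stream, stream[1:]))
first_counts = Counter(stream[:-1])

def transitional_probability(a, b):
    """P(next syllable is b | current syllable is a)."""
    return pair_counts[(a, b)] / first_counts[a]

print(transitional_probability("tu", "pi"))   # within-word: always 1.0
print(transitional_probability("ro", "go"))   # across a word boundary: ~1/3
```

A learner tracking these conditional frequencies can posit a word boundary wherever the transitional probability dips, which is exactly the segmentation ability the study attributes to neonates.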

  18. Official Statistics and Statistics Education: Bridging the Gap

    Directory of Open Access Journals (Sweden)

    Gal Iddo

    2017-03-01

    This article aims to challenge official statistics providers and statistics educators to ponder how to help non-specialist adult users of statistics develop those aspects of statistical literacy that pertain to official statistics. We first document the gap in the literature in terms of the conceptual basis and educational materials needed for such an undertaking. We then review skills and competencies that may help adults to make sense of statistical information in areas of importance to society. Based on this review, we identify six elements related to official statistics about which non-specialist adult users should possess knowledge in order to be considered literate in official statistics: (1) the system of official statistics and its work principles; (2) the nature of statistics about society; (3) indicators; (4) statistical techniques and big ideas; (5) research methods and data sources; and (6) awareness and skills for citizens’ access to statistical reports. Based on this ad hoc typology, we discuss directions that official statistics providers, in cooperation with statistics educators, could take in order to (1) advance the conceptualization of skills needed to understand official statistics, and (2) expand educational activities and services, specifically by developing a collaborative digital textbook and a modular online course, to improve public capacity for understanding official statistics.

  19. Statistical mechanics of socio-economic systems with heterogeneous agents

    International Nuclear Information System (INIS)

    De Martino, Andrea; Marsili, Matteo

    2006-01-01

    We review the statistical mechanics approach to the study of the emerging collective behaviour of systems of heterogeneous interacting agents. The general framework is presented through examples in such contexts as ecosystem dynamics and traffic modelling. We then focus on the analysis of the optimal properties of large random resource-allocation problems and on Minority Games and related models of speculative trading in financial markets, discussing a number of extensions including multi-asset models, majority games and models with asymmetric information. Finally, we summarize the main conclusions and outline the major open problems and limitations of the approach. (topical review)

  20. Multivariate statistical pattern recognition system for reactor noise analysis

    International Nuclear Information System (INIS)

    Gonzalez, R.C.; Howington, L.C.; Sides, W.H. Jr.; Kryter, R.C.

    1976-01-01

    A multivariate statistical pattern recognition system for reactor noise analysis was developed. The basis of the system is a transformation for decoupling correlated variables and algorithms for inferring probability density functions. The system is adaptable to a variety of statistical properties of the data, and it has learning, tracking, and updating capabilities. System design emphasizes control of the false-alarm rate. The ability of the system to learn normal patterns of reactor behavior and to recognize deviations from these patterns was evaluated by experiments at the ORNL High-Flux Isotope Reactor (HFIR). Power perturbations of less than 0.1 percent of the mean value in selected frequency ranges were detected by the system
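
The core of such a system is the decoupling transformation: rotate the correlated variables onto the eigenvectors of their covariance matrix and rescale by the eigenvalue square roots (whitening), after which each coordinate can be monitored independently. A stdlib-only sketch for two correlated signals, using the closed-form 2x2 eigendecomposition (this is a generic illustration, not the ORNL implementation):

```python
import math, random, statistics

rng = random.Random(7)
# two correlated "sensor" signals
x = [rng.gauss(0, 1) for _ in range(20000)]
y = [0.8 * xi + 0.6 * rng.gauss(0, 1) for xi in x]

def cov(u, v):
    mu, mv = statistics.fmean(u), statistics.fmean(v)
    return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / (len(u) - 1)

a, b, c = cov(x, x), cov(x, y), cov(y, y)
# eigendecomposition of the symmetric covariance [[a, b], [b, c]] in closed form
half_trace = (a + c) / 2
disc = math.sqrt(((a - c) / 2) ** 2 + b * b)
l1, l2 = half_trace + disc, half_trace - disc        # eigenvalues, l1 >= l2
theta = 0.5 * math.atan2(2 * b, a - c)               # rotation angle of eigenbasis
ct, st = math.cos(theta), math.sin(theta)

def whiten(xi, yi):
    """Rotate into the eigenbasis, then scale each axis to unit variance."""
    p, q = ct * xi + st * yi, -st * xi + ct * yi
    return p / math.sqrt(l1), q / math.sqrt(l2)

wx, wy = zip(*(whiten(xi, yi) for xi, yi in zip(x, y)))
print(cov(wx, wx), cov(wy, wy), cov(wx, wy))  # ~1, ~1, ~0
```

After whitening, the sample covariance is the identity, so ordinary univariate density estimates and alarm limits can be applied per coordinate, which is the sense in which the transformation "decouples" the correlated plant variables.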

  1. Multivariate statistical pattern recognition system for reactor noise analysis

    International Nuclear Information System (INIS)

    Gonzalez, R.C.; Howington, L.C.; Sides, W.H. Jr.; Kryter, R.C.

    1975-01-01

    A multivariate statistical pattern recognition system for reactor noise analysis was developed. The basis of the system is a transformation for decoupling correlated variables and algorithms for inferring probability density functions. The system is adaptable to a variety of statistical properties of the data, and it has learning, tracking, and updating capabilities. System design emphasizes control of the false-alarm rate. The ability of the system to learn normal patterns of reactor behavior and to recognize deviations from these patterns was evaluated by experiments at the ORNL High-Flux Isotope Reactor (HFIR). Power perturbations of less than 0.1 percent of the mean value in selected frequency ranges were detected by the system. 19 references

  2. Statistical quasi-particle theory for open quantum systems

    Science.gov (United States)

    Zhang, Hou-Dao; Xu, Rui-Xue; Zheng, Xiao; Yan, YiJing

    2018-04-01

    This paper presents a comprehensive account of the recently developed dissipaton-equation-of-motion (DEOM) theory. This is a statistical quasi-particle theory for quantum dissipative dynamics. It accurately describes the influence of bulk environments with a small number of quasi-particles, the dissipatons. A novel dissipaton algebra then follows, which readily bridges the Schrödinger equation to the DEOM theory. As a fundamental theory of quantum mechanics in open systems, DEOM characterizes both the stationary and dynamic properties of system-and-bath interferences. It treats not only the quantum dissipative systems of primary interest, but also hybrid environment dynamics that could be experimentally measurable. Examples are the linear or nonlinear Fano interferences and the Herzberg-Teller vibronic couplings in optical spectroscopies. This review covers the DEOM construction, the underlying dissipaton algebra and theorems, the physical meanings of the dynamical variables, the possible identifications of dissipatons, and some recent advancements in efficient DEOM evaluations of various problems. The relations of the present theory to other nonperturbative methods are also critically presented.

  3. Statistical mechanics of complex neural systems and high dimensional data

    International Nuclear Information System (INIS)

    Advani, Madhu; Lahiri, Subhaneil; Ganguli, Surya

    2013-01-01

    Recent experimental advances in neuroscience have opened new vistas into the immense complexity of neuronal networks. This proliferation of data challenges us on two parallel fronts. First, how can we form adequate theoretical frameworks for understanding how dynamical network processes cooperate across widely disparate spatiotemporal scales to solve important computational problems? Second, how can we extract meaningful models of neuronal systems from high dimensional datasets? To aid in these challenges, we give a pedagogical review of a collection of ideas and theoretical methods arising at the intersection of statistical physics, computer science and neurobiology. We introduce the interrelated replica and cavity methods, which originated in statistical physics as powerful ways to quantitatively analyze large highly heterogeneous systems of many interacting degrees of freedom. We also introduce the closely related notion of message passing in graphical models, which originated in computer science as a distributed algorithm capable of solving large inference and optimization problems involving many coupled variables. We then show how both the statistical physics and computer science perspectives can be applied in a wide diversity of contexts to problems arising in theoretical neuroscience and data analysis. Along the way we discuss spin glasses, learning theory, illusions of structure in noise, random matrices, dimensionality reduction and compressed sensing, all within the unified formalism of the replica method. Moreover, we review recent conceptual connections between message passing in graphical models, and neural computation and learning. Overall, these ideas illustrate how statistical physics and computer science might provide a lens through which we can uncover emergent computational functions buried deep within the dynamical complexities of neuronal networks. (paper)

  4. Phase flow and statistical structure of Galton-board systems

    International Nuclear Information System (INIS)

    Lue, A.; Brenner, H.

    1993-01-01

    Galton boards, found in museum exhibits devoted to science and technology, are often used to demonstrate visually the ubiquity of so-called "laws of probability" via an experimental realization of normal distributions. A detailed theoretical study of Galton-board phase-space dynamics and statistical behavior is presented. The study is based on a simple inelastic-collision model employing a particle falling through a spatially periodic lattice of rigid, convex scatterers. We show that such systems exhibit indeterminate behavior through the presence of strange attractors or strange repellers in phase space; nevertheless, we also show that these systems exhibit regular and predictable behavior under specific circumstances. Phase-space strange attractors, periodic attractors, and strange repellers are present in numerical simulations, confirming results anticipated from geometric analysis. The system's geometry (dictated by lattice geometry and density as well as the direction of gravity) is observed to play a dominant role in stability, phase-flow topology, and statistical observations. Smale horseshoes appear to exist in the low-lattice-density limit and may exist in other regimes. These horseshoes are generated by homoclinic orbits whose existence is dictated by system characteristics. The horseshoes lead directly to deterministic chaos in the system. Strong evidence exists for ergodicity in all attractors. Phase-space complexities are manifested at all observed levels, particularly statistical ones. Consequently, statistical observations are critically dependent upon system details. Under well-defined circumstances, these observations display behavior which does not constitute a realization of the "laws of probability."
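
The museum-exhibit behavior the paper contrasts itself with is the ideal memoryless board, where each pin deflects the bead left or right independently with probability 1/2, so the final slot is a binomial random variable. A short sketch of that idealized case, useful as the baseline against which the deterministic-chaotic model above deviates:

```python
import random
from collections import Counter

def galton(rows=12, beads=20000, seed=3):
    """Idealized Galton board: each of `rows` pins deflects right with p = 1/2,
    so a bead's slot is the number of rightward deflections (binomial)."""
    rng = random.Random(seed)
    return Counter(sum(rng.randint(0, 1) for _ in range(rows)) for _ in range(beads))

bins = galton()
beads = sum(bins.values())
mean_slot = sum(k * n for k, n in bins.items()) / beads
print(mean_slot, bins.most_common(3))   # mean near rows/2 = 6, peak at the centre
```

The inelastic-collision model in the paper shows that a real board's statistics can depend sensitively on lattice geometry and need not realize this binomial (approximately normal) baseline at all.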

  5. Recent Advances in System Reliability Signatures, Multi-state Systems and Statistical Inference

    CERN Document Server

    Frenkel, Ilia

    2012-01-01

    Recent Advances in System Reliability discusses developments in modern reliability theory such as signatures, multi-state systems and statistical inference. It describes the latest achievements in these fields, and covers the application of these achievements to reliability engineering practice. The chapters cover a wide range of new theoretical subjects and have been written by leading experts in reliability theory and its applications. The topics include: concepts and different definitions of signatures (D-spectra), their properties and applications to reliability of coherent systems and network-type structures; Lz-transform of Markov stochastic process and its application to multi-state system reliability analysis; methods for cost-reliability and cost-availability analysis of multi-state systems; optimal replacement and protection strategy; and statistical inference. Recent Advances in System Reliability presents many examples to illustrate the theoretical results. Real world multi-state systems...

  6. Non-statistical behavior of coupled optical systems

    International Nuclear Information System (INIS)

    Perez, G.; Pando Lambruschini, C.; Sinha, S.; Cerdeira, H.A.

    1991-10-01

    We study globally coupled chaotic maps modeling an optical system, and find clear evidence of non-statistical behavior: the mean square deviation (MSD) of the mean field saturates as the number of coupled elements increases beyond a critical value, and its distribution is clearly non-Gaussian. We also find that the power spectrum of the mean field displays well-defined peaks, indicating a subtle coherence among different elements, even in the "turbulent" phase. This system is a physically realistic model that may be experimentally realizable. It is also a higher-dimensional example (as each individual element is given by a complex map). Its study confirms that the phenomena observed in a wide class of coupled one-dimensional maps are present here as well. This gives more evidence to believe that such non-statistical behavior is probably generic in globally coupled systems. We also investigate the influence of parametric fluctuations on the MSD. (author). 10 refs, 7 figs, 1 tab
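
The measurement behind this result is straightforward to reproduce in spirit. The sketch below substitutes globally coupled real logistic maps for the paper's complex maps, purely to illustrate the diagnostic: evolve N maps under mean-field coupling and compute the variance (MSD) of the instantaneous mean field; for statistically independent elements this variance would shrink like 1/N, and its failure to do so is the "non-statistical" signature.

```python
import random
from statistics import fmean, pvariance

def mean_field_msd(n_maps, eps=0.1, a=3.9, steps=1500, transient=500, seed=5):
    """MSD (variance over time) of the mean field of N globally coupled logistic maps."""
    rng = random.Random(seed)
    x = [rng.uniform(0.0, 1.0) for _ in range(n_maps)]
    fields = []
    for t in range(steps):
        fx = [a * xi * (1.0 - xi) for xi in x]           # local chaotic map
        h = fmean(fx)                                    # instantaneous mean field
        x = [(1.0 - eps) * fxi + eps * h for fxi in fx]  # global (all-to-all) coupling
        if t >= transient:
            fields.append(h)
    return pvariance(fields)

msd_by_n = {n: mean_field_msd(n) for n in (10, 100, 1000)}
for n, m in msd_by_n.items():
    print(n, m)
```

Plotting this MSD against N (and histogramming the mean field) is the simplest way to check for the saturation and non-Gaussianity reported above; the parameters here are illustrative, not the paper's.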

  7. Former Prisoner of War Statistical Tracking System

    Data.gov (United States)

    Department of Veterans Affairs — The Former Prisoner of War (POW) Statistical Tracking System database is a registry designed to comply with Public Law 97-37, the Former Prisoner of War Benefits Act...

  8. Tandem mass spectrometry of human tryptic blood peptides calculated by a statistical algorithm and captured by a relational database with exploration by a general statistical analysis system.

    Science.gov (United States)

    Bowden, Peter; Beavis, Ron; Marshall, John

    2009-11-02

    A goodness-of-fit test may be used to assign tandem mass spectra of peptides to amino acid sequences and to directly calculate the expected probability of mis-identification. The product of the peptide expectation values directly yields the probability that the parent protein has been mis-identified. A relational database can capture the mass spectral data and the best-fit results, and permit subsequent calculations by a general statistical analysis system. The many files of the HUPO blood protein data correlated by X!TANDEM against the proteins of ENSEMBL were collected into a relational database. A redundant set of 247,077 proteins and peptides was correlated by X!TANDEM, which was collapsed to a set of 34,956 peptides from 13,379 distinct proteins. About 6875 distinct proteins were represented by only a single distinct peptide, 2866 proteins showed 2 distinct peptides, and 3454 proteins showed at least three distinct peptides by X!TANDEM. More than 99% of the peptides were associated with proteins that had cumulative expectation values, i.e. probability of false positive identification, of one in one hundred or less. The distribution of peptides per protein from X!TANDEM was significantly different from that expected from random assignment of peptides.
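
The protein-level computation described here, taking the product of per-peptide expectation values, is numerically best done in log space so that many small e-values do not underflow. A minimal sketch with hypothetical peptide expectation values (the numbers are illustrative, not from the HUPO data):

```python
import math

def protein_log10_expectation(peptide_evalues):
    """Combine per-peptide expectation values into a protein-level log10
    expectation by summing logs; the product itself would underflow for
    proteins supported by many peptides."""
    return sum(math.log10(e) for e in peptide_evalues)

# three hypothetical peptides matched to one protein
evalues = [1e-3, 5e-2, 2e-4]
log10_e = protein_log10_expectation(evalues)
print(log10_e)   # log10(1e-3 * 5e-2 * 2e-4) = -8.0
```

A cumulative value of -2 or lower corresponds to the "one in one hundred or less" false-positive threshold quoted in the abstract.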

  9. Statistical mechanics for a system with imperfections: pt. 1

    International Nuclear Information System (INIS)

    Choh, S.T.; Kahng, W.H.; Um, C.I.

    1982-01-01

    Statistical mechanics is extended to treat a system where parts of the Hamiltonian are randomly varying. As the starting point of the theory, the statistical correlation among energy levels is neglected, allowing use of the central limit theorem of probability theory. (Author)

  10. Statistical analysis of the uncertainty related to flood hazard appraisal

    Science.gov (United States)

    Notaro, Vincenza; Freni, Gabriele

    2015-12-01

    The estimation of flood hazard frequency statistics for an urban catchment is of great practical interest: it provides an evaluation of potential flood risk and related damage and supports decision making for flood risk management. Flood risk is usually defined as a function of the probability that a system deficiency causes flooding (hazard) and the expected damage due to the flooding magnitude (damage), taking into account both the exposure and the vulnerability of the goods at risk. The expected flood damage can be evaluated by an a priori estimation of potential damage caused by flooding or by interpolating real damage data. With regard to flood hazard appraisal, several procedures propose to identify a hazard indicator (HI), such as flood depth or the combination of flood depth and velocity, and to assess the flood hazard of the analyzed area by comparing the HI variables with user-defined threshold values or curves (penalty curves or matrices). However, flooding data are usually unavailable, or too piecemeal to allow a reliable flood hazard analysis, so hazard analysis is often performed by means of mathematical simulations aimed at evaluating water levels and flow velocities over the catchment surface. As a result, a great part of the uncertainty intrinsic to flood risk appraisal can be traced to the hazard evaluation, owing both to the uncertainty inherent in modeling results and to the subjectivity of the user-defined hazard thresholds applied to link flood depth to a hazard level. In the present work, a statistical methodology is proposed for evaluating and reducing the uncertainties connected with hazard level estimation. The methodology has been applied to a real urban watershed as a case study.

  11. Coordination of the National Statistical System in the Information Security Context

    Directory of Open Access Journals (Sweden)

    O. H.

    2017-12-01

    The need for building the national statistical system (NSS) as the framework for coordination of statistical works is substantiated. The NSS is defined on the basis of a system approach. It is emphasized that the essential conditions underlying the NSS are strategic planning, reliance on internationally adopted methods, and due consideration of the country-specific environment. The role of the state coordination policy in organizing statistical activities in the NSS framework is highlighted, and key objectives of the integrated national policy on coordination of statistical activities are given. Threats arising from the non-existence of an NSS in a country are shown: an “irregular” pattern of statistical activities, resulting from the absence of common legal, methodological and organizational grounds; high costs of the finished information product in parallel with its low quality; and the impossibility of administering statistical information security in a coherent manner, i.e. keeping the rules on confidentiality of data, preventing intentional distortion of information, and keeping the rules of treatment of data constituting state secrets. An extensive review of NSS functional objectives is made: to ensure the systematic development of official statistics; to ensure confidentiality and protection of individual data; to establish interdepartmental mechanisms for control and protection of secret statistical information; and to broaden and regulate access to statistical data and their effective use. The need for creating a National Statistical Commission is grounded.

  12. Effective control of complex turbulent dynamical systems through statistical functionals.

    Science.gov (United States)

    Majda, Andrew J; Qi, Di

    2017-05-30

    Turbulent dynamical systems characterized by both a high-dimensional phase space and a large number of instabilities are ubiquitous among complex systems in science and engineering, including climate, material, and neural science. Control of these complex systems is a grand challenge, for example, in mitigating the effects of climate change or safe design of technology with fully developed shear turbulence. Control of flows in the transition to turbulence, where there is a small dimension of instabilities about a basic mean state, is an important and successful discipline. In complex turbulent dynamical systems, it is impossible to track and control the large dimension of instabilities, which strongly interact and exchange energy, and new control strategies are needed. The goal of this paper is to propose an effective statistical control strategy for complex turbulent dynamical systems based on a recent statistical energy principle and statistical linear response theory. We illustrate the potential practical efficiency and verify this effective statistical control strategy on the 40D Lorenz 1996 model in forcing regimes with various types of fully turbulent dynamics with nearly one-half of the phase space unstable.

  13. Statistical methods to monitor the West Valley off-gas system

    International Nuclear Information System (INIS)

    Eggett, D.L.

    1990-01-01

    This paper reports on the off-gas system for the ceramic melter operated at the West Valley Demonstration Project at West Valley, NY, which is monitored during melter operation. A one-at-a-time method of monitoring the parameters of the off-gas system is not statistically sound. Therefore, multivariate statistical methods appropriate for monitoring many correlated parameters will be used. Monitoring a large number of parameters increases the probability of a false out-of-control signal. If the parameters being monitored are statistically independent, the control limits can easily be adjusted to obtain the desired probability of a false out-of-control signal. The principal component (PC) scores have desirable statistical properties when the original variables are distributed as multivariate normals. Two statistics derived from the PC scores and used to form multivariate control charts are outlined and their distributional properties reviewed
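
The abstract does not name the two PC-score statistics; the standard pair for PC-based multivariate monitoring is Hotelling's T² on the retained scores and the squared prediction error Q on the residual, so the sketch below assumes them. The scores and variances are toy numbers, not West Valley data.

```python
def hotelling_t2(scores, variances):
    """T^2: sum of squared PC scores, each normalized by its variance (eigenvalue)."""
    return sum(t * t / lam for t, lam in zip(scores, variances))

def spe_q(residuals):
    """Q (squared prediction error): energy left outside the retained PCs."""
    return sum(r * r for r in residuals)

# one new observation projected onto 3 retained PCs with variances 4.0, 2.0, 0.5
scores, variances = [1.0, -2.0, 0.5], [4.0, 2.0, 0.5]
residuals = [0.1, -0.2]               # components outside the PC model
t2, q = hotelling_t2(scores, variances), spe_q(residuals)
print(t2, q)   # 0.25 + 2.0 + 0.5 = 2.75 ; 0.01 + 0.04 = 0.05
```

A point is flagged out-of-control when T² or Q exceeds its control limit (e.g., a chi-square upper percentile for T² when the covariance is treated as known), which is how a single multivariate chart replaces many correlated one-at-a-time charts.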

  14. Statistics of multi-tube detecting systems

    International Nuclear Information System (INIS)

    Grau Carles, P.; Grau Malonda, A.

    1994-01-01

    In this paper three new statistical theorems are demonstrated and applied. These theorems greatly simplify the derivation of the formulae used to compute the counting efficiency when the detection system is formed by several photomultipliers combined in coincidence and sum. The theorems are applied to several photomultiplier arrangements in order to show their potential and the way they are applied.

  15. Energy-level statistics and time relaxation in quantum systems

    International Nuclear Information System (INIS)

    Gruver, J.L.; Cerdeira, H.A.; Aliaga, J.; Mello, P.A.; Proto, A.N.

    1997-05-01

    We study a quantum-mechanical system, prepared, at t = 0, in a model state, that subsequently decays into a sea of other states whose energy levels form a discrete spectrum with given statistical properties. An important quantity is the survival probability P(t), defined as the probability, at time t, to find the system in the original model state. Our main purpose is to analyze the influence of the discreteness and statistical properties of the spectrum on the behavior of P(t). Since P(t) itself is a statistical quantity, we restrict our attention to its ensemble average ⟨P(t)⟩, which is calculated analytically using random-matrix techniques, within certain approximations discussed in the text. We find, for ⟨P(t)⟩, an exponential decay, followed by a revival, governed by the two-point structure of the statistical spectrum, thus giving a nonzero asymptotic value for large t's. The analytic result compares well with a number of computer simulations, over a time range discussed in the text. (author). 17 refs, 1 fig
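
For a discrete spectrum the survival probability has a transparent form: if the model state has overlaps |c_k|² with eigenstates of energy E_k, then P(t) = |Σ_k |c_k|² exp(-iE_k t)|². A sketch (ħ = 1; uniformly random levels and uniform overlaps standing in for the paper's statistical spectrum) showing the decay from P(0) = 1 and the small but nonzero long-time average:

```python
import cmath, random
from statistics import fmean

rng = random.Random(2)
n = 200
energies = sorted(rng.uniform(-1.0, 1.0) for _ in range(n))
weights = [1.0 / n] * n          # uniform overlaps |c_k|^2, summing to 1

def survival(t):
    """P(t) = |sum_k |c_k|^2 exp(-i E_k t)|^2 for a discrete spectrum."""
    amp = sum(w * cmath.exp(-1j * e * t) for w, e in zip(weights, energies))
    return abs(amp) ** 2

print(survival(0.0))   # exactly 1: the system starts in the model state
late = fmean(survival(t) for t in range(200, 400))
print(late)            # nonzero asymptotic average, ~ sum_k w_k^2 = 1/n here
```

The nonzero long-time plateau (Σ_k |c_k|⁴ for a nondegenerate spectrum) is the discreteness effect the abstract refers to; the ensemble-averaged revival structure additionally reflects the two-point level statistics.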

  16. Statistical regularities in art: Relations with visual coding and perception.

    Science.gov (United States)

    Graham, Daniel J; Redies, Christoph

    2010-07-21

    Since at least 1935, vision researchers have used art stimuli to test human response to complex scenes. This is sensible given the "inherent interestingness" of art and its relation to the natural visual world. The use of art stimuli has remained popular, especially in eye tracking studies. Moreover, stimuli in common use by vision scientists are inspired by the work of famous artists (e.g., Mondrians). Artworks are also popular in vision science as illustrations of a host of visual phenomena, such as depth cues and surface properties. However, until recently, there has been scant consideration of the spatial, luminance, and color statistics of artwork, and even less study of ways that regularities in such statistics could affect visual processing. Furthermore, the relationship between regularities in art images and those in natural scenes has received little or no attention. In the past few years, there has been a concerted effort to study statistical regularities in art as they relate to neural coding and visual perception, and art stimuli have begun to be studied in rigorous ways, as natural scenes have been. In this minireview, we summarize quantitative studies of links between regular statistics in artwork and processing in the visual stream. The results of these studies suggest that art is especially germane to understanding human visual coding and perception, and it therefore warrants wider study.

  17. Managing risk in statistics - "Relative risk" | Durrheim | South African ...

    African Journals Online (AJOL)

    South African Family Practice, Vol 45, No 8 (2003). Managing risk in statistics - "Relative risk". DN Durrheim

  18. Statistical analysis of the Ft. Calhoun reactor coolant pump system

    International Nuclear Information System (INIS)

    Patel, Bimal; Heising, C.D.

    1997-01-01

    In engineering science, statistical quality control techniques have traditionally been applied to control manufacturing processes. An application to commercial nuclear power plant maintenance and control is presented that can greatly improve plant safety. As a demonstration of such an approach, a specific system is analyzed: the reactor coolant pumps (RCPs) of the Ft. Calhoun nuclear power plant. This research uses capability analysis, Shewhart X-bar and R charts, canonical correlation methods, and design of experiments to analyze the process for the state of statistical control. The results obtained show that six out of ten parameters are within control specification limits and four parameters are not in a state of statistical control. The analysis shows that statistical process control methods can be applied as an early warning system capable of identifying significant equipment problems well in advance of traditional control room alarm indicators. Such a system would provide operators with ample time to respond to possible emergency situations and thus improve plant safety and reliability. (Author)
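
The Shewhart X-bar and R charts mentioned above follow textbook formulas: for subgroups of size n = 5, the X-bar limits are the grand mean ± A2·R-bar with A2 = 0.577, and the R-chart limits are D3·R-bar and D4·R-bar with D3 = 0 and D4 = 2.114 (standard tabulated constants). A stdlib sketch with made-up subgroup data, not the Ft. Calhoun RCP measurements:

```python
from statistics import fmean

# standard control-chart constants for subgroup size n = 5
A2, D3, D4 = 0.577, 0.0, 2.114

subgroups = [
    [10.2, 9.9, 10.1, 10.0, 9.8],
    [10.0, 10.3, 9.7, 10.1, 10.0],
    [9.9, 10.0, 10.2, 9.8, 10.1],
    [10.1, 9.9, 10.0, 10.2, 9.9],
]
xbars = [fmean(g) for g in subgroups]            # subgroup means (X-bar chart points)
ranges = [max(g) - min(g) for g in subgroups]    # subgroup ranges (R chart points)
grand_mean, rbar = fmean(xbars), fmean(ranges)

xbar_ucl, xbar_lcl = grand_mean + A2 * rbar, grand_mean - A2 * rbar
r_ucl, r_lcl = D4 * rbar, D3 * rbar
print(grand_mean, rbar, (xbar_lcl, xbar_ucl), (r_lcl, r_ucl))
```

A parameter is declared out of statistical control when its subgroup means or ranges fall outside these limits, which is the per-parameter building block behind the ten-parameter assessment reported above.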

  19. WATER POLO GAME-RELATED STATISTICS IN WOMEN'S INTERNATIONAL CHAMPIONSHIPS: DIFFERENCES AND DISCRIMINATORY POWER

    Directory of Open Access Journals (Sweden)

    Yolanda Escalante

    2012-09-01

    The aims of this study were (i) to compare women's water polo game-related statistics by match outcome (winning and losing teams) and phase (preliminary, classificatory, and semi-final/bronze medal/gold medal), and (ii) to identify characteristics that discriminate performances in each phase. The game-related statistics of the 124 women's matches played in five International Championships (World and European Championships) were analyzed. Differences between winning and losing teams in each phase were determined using the chi-squared test. A discriminant analysis was then performed according to context in each of the three phases. It was found that the game-related statistics differentiate the winning from the losing teams in each phase of an international championship. The differentiating variables were both offensive (centre goals, power-play goals, counterattack goals, assists, offensive fouls, steals, blocked shots, and won sprints) and defensive (goalkeeper-blocked shots, goalkeeper-blocked inferiority shots, and goalkeeper-blocked 5-m shots). The discriminant analysis showed the game-related statistics to discriminate performance in all phases: preliminary, classificatory, and final (92%, 90%, and 83%, respectively). Two variables were discriminatory by match outcome (winning or losing teams) in all three phases: goals and goalkeeper-blocked shots

  20. Statistical thermodynamics understanding the properties of macroscopic systems

    CERN Document Server

    Fai, Lukong Cornelius

    2012-01-01

    Basic Principles of Statistical PhysicsMicroscopic and Macroscopic Description of StatesBasic PostulatesGibbs Ergodic AssumptionGibbsian EnsemblesExperimental Basis of Statistical MechanicsDefinition of Expectation ValuesErgodic Principle and Expectation ValuesProperties of Distribution FunctionRelative Fluctuation of an Additive Macroscopic ParameterLiouville TheoremGibbs Microcanonical EnsembleMicrocanonical Distribution in Quantum MechanicsDensity MatrixDensity Matrix in Energy RepresentationEntropyThermodynamic FunctionsTemperatureAdiabatic ProcessesPressureThermodynamic IdentityLaws of Th

  1. Revisiting Classification of Eating Disorders-toward Diagnostic and Statistical Manual of Mental Disorders-5 and International Statistical Classification of Diseases and Related Health Problems-11.

    Science.gov (United States)

    Goyal, Shrigopal; Balhara, Yatan Pal Singh; Khandelwal, S K

    2012-07-01

    Two of the most commonly used nosological systems, the International Statistical Classification of Diseases and Related Health Problems (ICD)-10 and the Diagnostic and Statistical Manual of Mental Disorders (DSM)-IV, are under revision. This process has generated many interesting debates with regard to the future of the current diagnostic categories. In fact, the status of the categorical approach in the upcoming versions of ICD and DSM is also being debated. The current article focuses on the debate with regard to eating disorders. The existing classification of eating disorders has been criticized for its limitations. A host of new diagnostic categories have been recommended for inclusion in the upcoming revisions, and the structure of the existing categories has also been put under scrutiny.

  2. Infrastructure requirement of knowledge management system model of statistical learning tool (SLT) for education community

    Science.gov (United States)

    Abdullah, Rusli; Samah, Bahaman Abu; Bolong, Jusang; D'Silva, Jeffrey Lawrence; Shaffril, Hayrol Azril Mohamed

    2014-09-01

    Today, teaching and learning (T&L) using technology as a tool is becoming increasingly important, especially in the field of statistics as a subject in the higher education system. Although there are many types of statistical learning tool (SLT) technology that can be used to support and enhance the T&L environment, there is no common standard knowledge management portal to provide guidance, especially on the infrastructure requirements of SLT, for the community of users (CoU) such as educators, students, and other parties interested in applying this technology in their T&L. There is therefore a need for a common standard infrastructure requirement for a knowledge portal that helps the CoU manage statistical knowledge: acquiring, storing, disseminating, and applying it for their specific purposes. Furthermore, by serving as a guide that promotes best practices among the CoU, this infrastructure requirement of a knowledge portal model of SLT can also enhance the quality and productivity of their work towards excellence in the application of statistical knowledge in the education system environment.

  3. Statistics of multi-tube detecting systems

    International Nuclear Information System (INIS)

    Grau Carles, P.; Grau Malonda, A.

    1994-01-01

    In this paper three new statistical theorems are demonstrated and applied. These theorems greatly simplify the derivation of the formulae for computing the counting efficiency when the detection system is formed by several photomultipliers combined in coincidence and sum. The theorems are applied to several photomultiplier arrangements in order to show their potential and how they are applied. (Author) 6 refs

  4. REANALYSIS OF F-STATISTIC GRAVITATIONAL-WAVE SEARCHES WITH THE HIGHER CRITICISM STATISTIC

    International Nuclear Information System (INIS)

    Bennett, M. F.; Melatos, A.; Delaigle, A.; Hall, P.

    2013-01-01

    We propose a new method of gravitational-wave detection using a modified form of higher criticism, a statistical technique introduced by Donoho and Jin. Higher criticism is designed to detect a group of sparse, weak sources, none of which are strong enough to be reliably estimated or detected individually. We apply higher criticism as a second-pass method to synthetic F-statistic and C-statistic data for a monochromatic periodic source in a binary system and quantify the improvement relative to the first-pass methods. We find that higher criticism on C-statistic data is ∼6% more sensitive than the C-statistic alone under optimal conditions (i.e., binary orbit known exactly) and the relative advantage increases as the error in the orbital parameters increases. Higher criticism is robust even when the source is not monochromatic (e.g., phase-wandering in an accreting system). Applying higher criticism to a phase-wandering source over multiple time intervals gives a ≳30% increase in detectability with few assumptions about the frequency evolution. By contrast, in all-sky searches for unknown periodic sources, which are dominated by the brightest source, second-pass higher criticism does not provide any benefits over a first-pass search.
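
    The Donoho-Jin higher criticism statistic itself is simple to compute from a set of p-values. The sketch below is a generic implementation of the standard definition (restricted to the smallest α₀·n order statistics), not of the modified form used in the paper; the example p-values are invented.

```python
import numpy as np

def higher_criticism(pvalues, alpha0=0.5):
    """Donoho-Jin higher criticism statistic from a set of p-values.

    HC* = max over the smallest alpha0*n order statistics of
          sqrt(n) * (i/n - p_(i)) / sqrt(p_(i) * (1 - p_(i))).
    """
    p = np.sort(np.asarray(pvalues, dtype=float))
    n = p.size
    i = np.arange(1, n + 1)
    hc_terms = np.sqrt(n) * (i / n - p) / np.sqrt(p * (1.0 - p))
    k = max(1, int(alpha0 * n))  # restrict to the smallest p-values
    return hc_terms[:k].max()

# A single unusually small p-value among unremarkable ones drives HC* up,
# which is exactly the "sparse, weak sources" regime described above.
print(higher_criticism([0.01, 0.2, 0.5, 0.9]))
```

    A detection is declared when HC* exceeds a threshold calibrated by simulation under the noise-only hypothesis.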

  5. Statistical theory of signal detection

    CERN Document Server

    Helstrom, Carl Wilhelm; Costrell, L; Kandiah, K

    1968-01-01

    Statistical Theory of Signal Detection, Second Edition provides an elementary introduction to the theory of statistical testing of hypotheses that is related to the detection of signals in radar and communications technology. This book presents a comprehensive survey of digital communication systems. Organized into 11 chapters, this edition begins with an overview of the theory of signal detection and the typical detection problem. This text then examines the goals of the detection system, which are defined through an analogy with the testing of statistical hypotheses. Other chapters consider
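
    The analogy between detection and hypothesis testing that the book develops can be illustrated with the textbook case of a known signal in white Gaussian noise, where the likelihood-ratio test reduces to thresholding a matched-filter output. All numbers below (template, noise level, threshold) are illustrative assumptions, not taken from the book.

```python
import numpy as np

rng = np.random.default_rng(1)

n = 64
t = np.arange(n)
signal = 0.8 * np.sin(2 * np.pi * t / 16.0)  # known signal template

# For a known signal in white Gaussian noise, the Neyman-Pearson
# likelihood-ratio test reduces to comparing s.T @ x to a threshold.
def detect(x, template, threshold):
    return template @ x > threshold

# Under H0 the statistic is N(0, sigma^2 * ||s||^2); the threshold below
# targets a ~1% false-alarm rate (2.326 is the upper 1% normal quantile).
sigma = 1.0
threshold = 2.326 * sigma * np.linalg.norm(signal)

trials = 2000
h0 = sum(detect(rng.normal(0, sigma, n), signal, threshold) for _ in range(trials))
h1 = sum(detect(signal + rng.normal(0, sigma, n), signal, threshold) for _ in range(trials))
print(h0 / trials, h1 / trials)  # false-alarm rate near 0.01, detection rate near 1
```

    The trade-off between the two error rates, swept out by varying the threshold, is the receiver operating characteristic central to this theory.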

  6. PREFACE: Advanced many-body and statistical methods in mesoscopic systems

    Science.gov (United States)

    Anghel, Dragos Victor; Sabin Delion, Doru; Sorin Paraoanu, Gheorghe

    2012-02-01

    It has increasingly been realized in recent times that the borders separating various subfields of physics are largely artificial. This is the case for nanoscale physics, physics of lower-dimensional systems and nuclear physics, where the advanced techniques of many-body theory developed in recent times could provide a unifying framework for these disciplines under the general name of mesoscopic physics. Other fields, such as quantum optics and quantum information, are increasingly using related methods. The 6-day conference 'Advanced many-body and statistical methods in mesoscopic systems' that took place in Constanta, Romania, between 27 June and 2 July 2011 was, we believe, a successful attempt at bridging an impressive list of topical research areas: foundations of quantum physics, equilibrium and non-equilibrium quantum statistics/fractional statistics, quantum transport, phases and phase transitions in mesoscopic systems/superfluidity and superconductivity, quantum electromechanical systems, quantum dissipation, dephasing, noise and decoherence, quantum information, spin systems and their dynamics, fundamental symmetries in mesoscopic systems, phase transitions, exactly solvable methods for mesoscopic systems, various extensions of the random phase approximation, open quantum systems, clustering, decay and fission modes and systematic versus random behaviour of nuclear spectra. This event brought together participants from seventeen countries and five continents. Each of the participants brought considerable expertise in his/her field of research and, at the same time, was exposed to the newest results and methods coming from the other, seemingly remote, disciplines. The talks touched on subjects that are at the forefront of topical research areas and we hope that the resulting cross-fertilization of ideas will lead to new, interesting results from which everybody will benefit.
We are grateful for the financial and organizational support from IFIN-HH, Ovidius

  7. A new formalism for nonextensive physical systems: Tsallis thermostatistics

    International Nuclear Information System (INIS)

    Tirnakli, U.; Bueyuekkilic, F.; Demirhan, D.

    1999-01-01

    Although Boltzmann-Gibbs (BG) statistics provides a suitable tool that enables us to handle a large number of physical systems satisfactorily, it has some basic restrictions. Recently a nonextensive thermostatistics was proposed by C. Tsallis to handle nonextensive physical systems, and up to now, besides the generalization of some conventional concepts, the formalism has been successful in a number of physical applications. In this study, our aim is to introduce Tsallis thermostatistics in some detail and to emphasize its achievements on physical systems by noting the recent developments along this line.
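
    The central generalized concepts of the Tsallis formalism are the q-exponential and the q-entropy, both of which recover their BG counterparts as q → 1. A minimal sketch (the cutoff convention for a non-positive base is one common choice):

```python
import math

def exp_q(x, q):
    """Tsallis q-exponential [1 + (1-q)x]^(1/(1-q)); reduces to exp(x) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0

def tsallis_entropy(probs, q):
    """S_q = (1 - sum_i p_i^q) / (q - 1); recovers Shannon entropy as q -> 1."""
    return (1.0 - sum(p ** q for p in probs)) / (q - 1.0)

print(exp_q(0.5, 2.0))                   # (1 - 0.5)^(-1) = 2.0
print(tsallis_entropy([0.5, 0.5], 2.0))  # (1 - 0.5) / 1 = 0.5
```

    The nonextensivity shows up in S_q being non-additive for independent subsystems, which is what makes the formalism suitable for the long-range-correlated systems mentioned above.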

  8. Statistical modeling to support power system planning

    Science.gov (United States)

    Staid, Andrea

    This dissertation focuses on data-analytic approaches that improve our understanding of power system applications to promote better decision-making. It tackles issues of risk analysis, uncertainty management, resource estimation, and the impacts of climate change. Tools of data mining and statistical modeling are used to bring new insight to a variety of complex problems facing today's power system. The overarching goal of this research is to improve the understanding of the power system risk environment for improved operation, investment, and planning decisions. The first chapter introduces some challenges faced in planning for a sustainable power system. Chapter 2 analyzes the driving factors behind the disparity in wind energy investments among states with a goal of determining the impact that state-level policies have on incentivizing wind energy. Findings show that policy differences do not explain the disparities; physical and geographical factors are more important. Chapter 3 extends conventional wind forecasting to a risk-based focus of predicting maximum wind speeds, which are dangerous for offshore operations. Statistical models are presented that issue probabilistic predictions for the highest wind speed expected in a three-hour interval. These models achieve a high degree of accuracy and their use can improve safety and reliability in practice. Chapter 4 examines the challenges of wind power estimation for onshore wind farms. Several methods for wind power resource assessment are compared, and the weaknesses of the Jensen model are demonstrated. For two onshore farms, statistical models outperform other methods, even when very little information is known about the wind farm. Lastly, chapter 5 focuses on the power system more broadly in the context of the risks expected from tropical cyclones in a changing climate. Risks to U.S. 
power system infrastructure are simulated under different scenarios of tropical cyclone behavior that may result from climate

  9. Statistical mechanics in the context of special relativity. II.

    Science.gov (United States)

    Kaniadakis, G

    2005-09-01

    The special relativity laws emerge as one-parameter (light speed) generalizations of the corresponding laws of classical physics. These generalizations, imposed by the Lorentz transformations, affect both the definition of the various physical observables (e.g., momentum, energy, etc.), as well as the mathematical apparatus of the theory. Here, following the general lines of [Phys. Rev. E 66, 056125 (2002)], we show that the Lorentz transformations impose also a proper one-parameter generalization of the classical Boltzmann-Gibbs-Shannon entropy. The obtained relativistic entropy permits us to construct a coherent and self-consistent relativistic statistical theory, preserving the main features of the ordinary statistical theory, which is recovered in the classical limit. The predicted distribution function is a one-parameter continuous deformation of the classical Maxwell-Boltzmann distribution and has a simple analytic form, showing power law tails in accordance with the experimental evidence. Furthermore, this statistical mechanics can be obtained as the stationary case of a generalized kinetic theory governed by an evolution equation obeying the H theorem and reproducing the Boltzmann equation of the ordinary kinetics in the classical limit.
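
    The one-parameter deformation at the heart of this theory is the Kaniadakis κ-exponential, which replaces the ordinary exponential in the distribution function and produces the power-law tails mentioned above. A minimal numerical sketch (the weight function is an illustrative stand-in for the full relativistic distribution):

```python
import math

def exp_kappa(x, kappa):
    """Kaniadakis kappa-exponential: (sqrt(1 + k^2 x^2) + k x)^(1/k).

    Recovers the ordinary exponential in the classical limit kappa -> 0;
    its tails decay as a power law rather than exponentially.
    """
    if kappa == 0.0:
        return math.exp(x)
    return (math.sqrt(1.0 + kappa ** 2 * x ** 2) + kappa * x) ** (1.0 / kappa)

def kappa_weight(E, beta, kappa):
    """Kappa-deformed Boltzmann-like weight for energy E at inverse temperature beta."""
    return exp_kappa(-beta * E, kappa)

print(exp_kappa(1.0, 0.5))  # (sqrt(1.25) + 0.5)^2 ≈ 2.618
```

    For large E the weight falls off roughly as E^(-1/κ), which is the power-law behaviour matched against the experimental evidence cited in the abstract.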

  10. Electrical Conductivity of Charged Particle Systems and Zubarev's Nonequilibrium Statistical Operator Method

    Science.gov (United States)

    Röpke, G.

    2018-01-01

    One of the fundamental problems in physics that are not yet rigorously solved is the statistical mechanics of nonequilibrium processes. An important contribution to describing irreversible behavior starting from reversible Hamiltonian dynamics was given by D. N. Zubarev, who invented the method of the nonequilibrium statistical operator. We discuss this approach, in particular, the extended von Neumann equation, and as an example consider the electrical conductivity of a system of charged particles. We consider the selection of the set of relevant observables. We show the relation between kinetic theory and linear response theory. Using thermodynamic Green's functions, we present a systematic treatment of correlation functions, but the convergence needs investigation. We compare different expressions for the conductivity and list open questions.

  11. Research and Development of Statistical Analysis Software System of Maize Seedling Experiment

    OpenAIRE

    Hui Cao

    2014-01-01

    In this study, software engineering methods were used to develop a software system for the statistical analysis of maize seedling experiments. During development, a B/S (browser/server) structure software design method was used and a set of statistical indicators for maize seedling evaluation was established. The experimental results indicated that this software system could perform quality statistics and analysis for maize seedlings very well. The development of this software system explored a...

  12. Statistical thermodynamics

    International Nuclear Information System (INIS)

    Lim, Gyeong Hui

    2008-03-01

    This book consists of 15 chapters, which are basic conception and meaning of statistical thermodynamics, Maxwell-Boltzmann's statistics, ensemble, thermodynamics function and fluctuation, statistical dynamics with independent particle system, ideal molecular system, chemical equilibrium and chemical reaction rate in ideal gas mixture, classical statistical thermodynamics, ideal lattice model, lattice statistics and nonideal lattice model, imperfect gas theory on liquid, theory on solution, statistical thermodynamics of interface, statistical thermodynamics of a high molecule system and quantum statistics

  13. Statistical mechanics of systems of unbounded spins

    Energy Technology Data Exchange (ETDEWEB)

    Lebowitz, J L [Yeshiva Univ., New York (USA). Belfer Graduate School of Science; Presutti, E [L' Aquila Univ. (Italy). Istituto di Matematica

    1976-11-01

    We develop the statistical mechanics of unbounded n-component spin systems interacting via potentials which are superstable and strongly tempered. The uniqueness of the equilibrium state is then proven for one component ferromagnetic spins whose free energy is differentiable with respect to the magnetic field.

  14. Statistical analysis of the Ft. Calhoun reactor coolant pump system

    International Nuclear Information System (INIS)

    Heising, Carolyn D.

    1998-01-01

    In engineering science, statistical quality control techniques have traditionally been applied to control manufacturing processes. An application to commercial nuclear power plant maintenance and control is presented that can greatly improve plant safety. As a demonstration of such an approach to plant maintenance and control, a specific system is analyzed: the reactor coolant pumps (RCPs) of the Ft. Calhoun nuclear power plant. This research uses capability analysis, Shewhart X-bar, R-charts, canonical correlation methods, and design of experiments to analyze the process for the state of statistical control. The results obtained show that six out of ten parameters are under control specification limits and four parameters are not in the state of statistical control. The analysis shows that statistical process control methods can be applied as an early warning system capable of identifying significant equipment problems well in advance of traditional control room alarm indicators. Such a system would provide operators with ample time to respond to possible emergency situations and thus improve plant safety and reliability. (author)
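
    The Shewhart X-bar and R charts used in this study follow a standard recipe: estimate the process centre and spread from subgroup means and ranges, then set control limits with tabulated constants. The sketch below uses the standard SPC constants for subgroups of size 5; the sample readings are hypothetical, not Ft. Calhoun data.

```python
import numpy as np

# Control-chart constants for subgroup size n = 5 (standard SPC tables).
A2, D3, D4 = 0.577, 0.0, 2.114

def xbar_r_limits(subgroups):
    """Center lines and control limits for Shewhart X-bar and R charts."""
    sub = np.asarray(subgroups, dtype=float)
    xbar = sub.mean(axis=1)                # subgroup means
    r = sub.max(axis=1) - sub.min(axis=1)  # subgroup ranges
    xbarbar, rbar = xbar.mean(), r.mean()
    return {
        "xbar_cl": xbarbar, "xbar_ucl": xbarbar + A2 * rbar, "xbar_lcl": xbarbar - A2 * rbar,
        "r_cl": rbar, "r_ucl": D4 * rbar, "r_lcl": D3 * rbar,
    }

# Hypothetical pump-parameter readings, five measurements per subgroup.
limits = xbar_r_limits([[1, 2, 3, 4, 5],
                        [2, 3, 4, 5, 6],
                        [1, 3, 3, 4, 4]])
print(limits["xbar_cl"], limits["r_cl"])
```

    A parameter is "not in the state of statistical control" when subgroup points fall outside these limits or show non-random patterns within them.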

  15. Statistical mechanics of nonequilibrium liquids

    CERN Document Server

    Evans, Denis J; Craig, D P; McWeeny, R

    1990-01-01

    Statistical Mechanics of Nonequilibrium Liquids deals with theoretical rheology. The book discusses nonlinear response of systems and outlines the statistical mechanical theory. In discussing the framework of nonequilibrium statistical mechanics, the book explains the derivation of a nonequilibrium analogue of the Gibbsian basis for equilibrium statistical mechanics. The book reviews the linear irreversible thermodynamics, the Liouville equation, and the Irving-Kirkwood procedure. The text then explains the Green-Kubo relations used in linear transport coefficients, the linear response theory,

  16. A Concise Introduction to the Statistical Physics of Complex Systems

    CERN Document Server

    Bertin, Eric

    2012-01-01

    This concise primer (based on lectures given at summer schools on complex systems and on a master's degree course in complex systems modeling) will provide graduate students and newcomers to the field with the basic knowledge of the concepts and methods of statistical physics and its potential for application to interdisciplinary topics.  Indeed, in recent years, statistical physics has begun to attract the interest of a broad community of researchers in the field of complex system sciences, ranging from biology to the social sciences, economics and computer science. More generally, a growing number of graduate students and researchers feel the need to learn some basic concepts and questions originating in other disciplines without necessarily having to master all of the corresponding technicalities and jargon. Generally speaking, the goals of statistical physics may be summarized as follows: on the one hand to study systems composed of a large number of interacting ‘entities’, and on the other to predict...

  17. Data analysis using the Gnu R system for statistical computation

    Energy Technology Data Exchange (ETDEWEB)

    Simone, James; /Fermilab

    2011-07-01

    R is a language system for statistical computation. It is widely used in statistics, bioinformatics, machine learning, data mining, quantitative finance, and the analysis of clinical drug trials. Among the advantages of R are: it has become the standard language for developing statistical techniques, it is being actively developed by a large and growing global user community, it is open source software, it is highly portable (Linux, OS-X and Windows), it has a built-in documentation system, it produces high quality graphics and it is easily extensible with over four thousand extension library packages available covering statistics and applications. This report gives a very brief introduction to R with some examples using lattice QCD simulation results. It then discusses the development of R packages designed for chi-square minimization fits for lattice n-pt correlation functions.
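
    The report's example task, a chi-square minimization fit of a lattice two-point correlation function C(t) = A·exp(-m·t), can be sketched outside R as well. The version below (in Python, with synthetic noise-free data) linearizes the model so ordinary least squares does the minimization; the parameter values are invented for illustration.

```python
import numpy as np

# Synthetic two-point correlator C(t) = A * exp(-m * t) (noise-free for clarity).
A_true, m_true = 2.0, 0.5
t = np.arange(1, 11, dtype=float)
corr = A_true * np.exp(-m_true * t)

# Linearize: log C(t) = log A - m t, then solve the least-squares problem;
# with uncorrelated errors this is equivalent to chi-square minimization
# in log space.
slope, intercept = np.polyfit(t, np.log(corr), 1)
m_fit, A_fit = -slope, np.exp(intercept)
print(m_fit, A_fit)  # recovers 0.5 and 2.0
```

    For real, noisy correlator data with correlated errors, a full chi-square fit weighting by the inverse covariance matrix (as the R packages in the report do) is the appropriate generalization.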

  18. Statistical analysis of events related to emergency diesel generators failures in the nuclear industry

    Energy Technology Data Exchange (ETDEWEB)

    Kančev, Duško, E-mail: dusko.kancev@ec.europa.eu [European Commission, DG-JRC, Institute for Energy and Transport, P.O. Box 2, NL-1755 ZG Petten (Netherlands); Duchac, Alexander; Zerger, Benoit [European Commission, DG-JRC, Institute for Energy and Transport, P.O. Box 2, NL-1755 ZG Petten (Netherlands); Maqua, Michael [Gesellschaft für Anlagen-und-Reaktorsicherheit (GRS) mbH, Schwetnergasse 1, 50667 Köln (Germany); Wattrelos, Didier [Institut de Radioprotection et de Sûreté Nucléaire (IRSN), BP 17 - 92262 Fontenay-aux-Roses Cedex (France)

    2014-07-01

    Highlights: • Analysis of operating experience related to emergency diesel generator events at NPPs. • Four abundant operating experience databases screened. • Important insights and conclusions based on the operating experience delineated. - Abstract: This paper studies the operating experience related to emergency diesel generator (EDG) events at nuclear power plants collected over the past 20 years. Events related to EDG failures and/or unavailability, as well as all the supporting equipment, are the focus of the analysis. The selected operating experience was analyzed in detail in order to identify the types of failures and the attributes that contributed to them, identify potential or real failure modes, discuss risk relevance, summarize important lessons learned, and provide recommendations. The study in this particular paper is closely tied to the statistical analysis of the operating experience. For the purpose of this study, an EDG failure is defined as a failure of the EDG to function on demand (i.e., fail to start, fail to run) or during testing, or an unavailability of an EDG, except unavailability due to regular maintenance. The Gesellschaft für Anlagen- und Reaktorsicherheit mbH (GRS) and Institut de Radioprotection et de Sûreté Nucléaire (IRSN) databases, as well as the operating experience contained in the IAEA/NEA International Reporting System for Operating Experience and the U.S. Licensee Event Reports, were screened. The screening methodology applied for each of the four different databases is presented. Further on, analyses aimed at delineating the causes, root causes, contributing factors, and consequences are performed. A statistical analysis was performed related to the chronology of events, types of failures, the operational circumstances of detection of the failure, and the affected components/subsystems. The conclusions and results of the statistical analysis are discussed. The main findings concerning the testing

  19. Statistical analysis of events related to emergency diesel generators failures in the nuclear industry

    International Nuclear Information System (INIS)

    Kančev, Duško; Duchac, Alexander; Zerger, Benoit; Maqua, Michael; Wattrelos, Didier

    2014-01-01

    Highlights: • Analysis of operating experience related to emergency diesel generator events at NPPs. • Four abundant operating experience databases screened. • Important insights and conclusions based on the operating experience delineated. - Abstract: This paper studies the operating experience related to emergency diesel generator (EDG) events at nuclear power plants collected over the past 20 years. Events related to EDG failures and/or unavailability, as well as all the supporting equipment, are the focus of the analysis. The selected operating experience was analyzed in detail in order to identify the types of failures and the attributes that contributed to them, identify potential or real failure modes, discuss risk relevance, summarize important lessons learned, and provide recommendations. The study in this particular paper is closely tied to the statistical analysis of the operating experience. For the purpose of this study, an EDG failure is defined as a failure of the EDG to function on demand (i.e., fail to start, fail to run) or during testing, or an unavailability of an EDG, except unavailability due to regular maintenance. The Gesellschaft für Anlagen- und Reaktorsicherheit mbH (GRS) and Institut de Radioprotection et de Sûreté Nucléaire (IRSN) databases, as well as the operating experience contained in the IAEA/NEA International Reporting System for Operating Experience and the U.S. Licensee Event Reports, were screened. The screening methodology applied for each of the four different databases is presented. Further on, analyses aimed at delineating the causes, root causes, contributing factors, and consequences are performed. A statistical analysis was performed related to the chronology of events, types of failures, the operational circumstances of detection of the failure, and the affected components/subsystems. The conclusions and results of the statistical analysis are discussed. The main findings concerning the testing

  20. Statistical properties of chaotic dynamical systems which exhibit strange attractors

    International Nuclear Information System (INIS)

    Jensen, R.V.; Oberman, C.R.

    1981-07-01

    A path integral method is developed for the calculation of the statistical properties of turbulent dynamical systems. The method is applicable to conservative systems which exhibit a transition to stochasticity as well as dissipative systems which exhibit strange attractors. A specific dissipative mapping is considered in detail which models the dynamics of a Brownian particle in a wave field with a broad frequency spectrum. Results are presented for the low order statistical moments for three turbulent regimes which exhibit strange attractors corresponding to strong, intermediate, and weak collisional damping
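
    Low-order statistical moments of a chaotic dissipative system can also be estimated directly from a long orbit, which is a useful numerical check on analytic results like those above. The sketch below uses the logistic map at full chaos as a stand-in example (it is not the mapping studied in the paper); its invariant density 1/(π√(x(1-x))) gives mean 1/2 and variance 1/8 exactly.

```python
import numpy as np

# Logistic map x -> 4x(1-x): a standard chaotic system with known
# invariant density and hence known low-order moments.
def orbit_moments(x0=0.123456, n=200_000, burn=1_000):
    x = x0
    for _ in range(burn):  # discard the transient
        x = 4.0 * x * (1.0 - x)
    samples = np.empty(n)
    for i in range(n):
        x = 4.0 * x * (1.0 - x)
        samples[i] = x
    return samples.mean(), samples.var()

mean, var = orbit_moments()
print(mean, var)  # close to 0.5 and 0.125
```

    Time averages along a single orbit agree with the invariant-measure averages here because the map is ergodic; for systems with strange attractors the same sampling strategy estimates moments with respect to the attractor's natural measure.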

  1. The Structural Reforms of the Chinese Statistical System

    Directory of Open Access Journals (Sweden)

    Günter Moser

    2009-04-01

    The quality of statistical data covering the economic and social development of the People's Republic of China has been questioned by international and national data users for years. The reasons for this doubt lie mainly in the structure of the Chinese system of statistics. Two parallel systems exist which operate largely autonomously: the national system of statistics and the sectoral system of statistics. Within the national statistical system, the National Bureau of Statistics (NBS) has the authority to order and collect statistics; within the sectoral system, this competence lies with the ministries and authorities below the ministerial level. This article describes and analyses these structures, the resulting problems, and the reform measures taken to date. It also aims to provide a better understanding of the statistical data about the People's Republic of China and to enable an assessment of them within a changing structural context. In conclusion, approaches to further reforms are provided based on the author's long-standing experience in cooperation projects with the official Chinese statistics agencies.

  2. A statistical approach to root system classification.

    Directory of Open Access Journals (Sweden)

    Gernot Bodner

    2013-08-01

    Plant root systems have a key role in ecology and agronomy. In spite of the fast increase in root studies, there is still no classification that allows distinguishing among distinctive characteristics within the diversity of rooting strategies. Our hypothesis is that a multivariate approach for plant functional type identification in ecology can be applied to the classification of root systems. We demonstrate that combining principal component and cluster analysis yields a meaningful classification of rooting types based on morphological traits. The classification method presented is based on a data-defined statistical procedure without a priori decisions on the classifiers. Biplot inspection is used to determine key traits and to ensure stability in cluster-based grouping. The classification method is exemplified with simulated root architectures and morphological field data. Simulated root architectures showed that morphological attributes with spatial distribution parameters capture the most distinctive features within root system diversity. While developmental type (tap vs. shoot-borne systems) is a strong but coarse classifier, topological traits provide the most detailed differentiation among distinctive groups. The adequacy of commonly available morphological traits for classification is supported by field data. Three rooting types emerged from the measured data, distinguished by diameter/weight, density, and spatial distribution, respectively. Similarity of root systems within distinctive groups was the joint result of phylogenetic relation and environmental as well as human selection pressure. We conclude that the data-defined classification is appropriate for the integration of knowledge obtained with different root measurement methods and at various scales. Currently, root morphology is the most promising basis for classification due to widely used common measurement protocols. To capture details of root diversity efforts in architectural measurement
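
    The "principal component followed by cluster analysis" pipeline can be sketched compactly. The trait names and values below are hypothetical, and the clustering is a minimal Lloyd's-algorithm stand-in for the cluster analysis used in the study.

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Project trait data onto its leading principal components (via SVD)."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

def kmeans(X, centroids, n_iter=25):
    """Minimal Lloyd's algorithm with fixed initial centroids."""
    c = np.array(centroids, dtype=float)
    for _ in range(n_iter):
        d = np.linalg.norm(X[:, None, :] - c[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for k in range(c.shape[0]):
            if np.any(labels == k):
                c[k] = X[labels == k].mean(axis=0)
    return labels

# Hypothetical root traits per plant: [mean diameter (mm), root density, depth (cm)].
traits = np.array([[0.20, 5.0, 10.0], [0.30, 4.8, 12.0],
                   [0.25, 5.2, 9.0],  [0.22, 4.9, 11.0],
                   [1.50, 1.0, 40.0], [1.40, 1.2, 38.0],
                   [1.60, 0.9, 42.0], [1.45, 1.1, 39.0]])
scores = pca_scores(traits)
labels = kmeans(scores, [scores[0], scores[4]])
print(labels)  # first four plants fall in one rooting type, last four in the other
```

    In practice traits would be standardized first and the number of clusters checked for stability, mirroring the biplot inspection described above.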

  3. Wind energy statistics 2011; Vindkraftsstatistik 2011

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-11-01

    Wind energy statistics 2011 is the fifth publication in the annual series. The report's focus is on regional distribution, i.e. the number of plants and installed capacity allocated to counties and municipalities. The publication also reports the division between sea- and land-based plants and the size of wind farms in Sweden in terms of installed capacity. The publication appears each spring in report form, and since 2010 statistics on the number of plants, installed capacity, and regional distribution are also presented semi-annually on the Swedish Energy Agency's website. The statistics relating to installed capacity, number of wind farms, and location in this publication are taken from the electricity certificate system, introduced in May 2003. Thanks to the electricity certificate system there are, in principle, comprehensive statistics on wind energy, which are presented in this publication in different cross-sections. Statistics related to electricity production are taken from Svenska Kraftnät's [the Swedish national grid operator's] registry Cesar.

  4. Statistical analysis of complex systems with nonclassical invariant measures

    KAUST Repository

    Fratalocchi, Andrea

    2011-01-01

    I investigate the problem of finding a statistical description of a complex many-body system whose invariant measure cannot be constructed stemming from classical thermodynamics ensembles. By taking solitons as a reference system and by employing a

  5. A statistical modeling approach to build expert credit risk rating systems

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus

    2010-01-01

    This paper presents an efficient method for extracting expert knowledge when building a credit risk rating system. Experts are asked to rate a sample of counterparty cases according to creditworthiness. Next, a statistical model is used to capture the relation between the characteristics of a counterparty and the expert rating. For any counterparty the model can identify the rating which would be agreed upon by the majority of experts. Furthermore, the model can quantify the concurrence among experts. The approach is illustrated by a case study regarding the construction of an application score...
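
    The two quantities the abstract highlights, the majority rating and the concurrence among experts, are easy to illustrate. The sketch below is a simplified stand-in for the paper's model (a plain least-squares fit rather than the statistical model actually used), with entirely invented cases, features, and expert grades.

```python
import numpy as np
from collections import Counter

# Hypothetical sample: 4 counterparty cases, 2 characteristics each,
# each case rated independently by 3 experts on an integer grade scale.
features = np.array([[1.0, 1.0], [2.0, 1.0], [1.0, 3.0], [3.0, 2.0]])
expert_ratings = [[3, 3, 4], [4, 4, 4], [7, 6, 7], [7, 7, 8]]

# Majority (modal) rating per case, and the concurrence among experts.
majority = np.array([Counter(r).most_common(1)[0][0] for r in expert_ratings])
concurrence = np.mean([r.count(m) / len(r) for r, m in zip(expert_ratings, majority)])

# Linear model from characteristics to the majority rating (least squares).
X = np.column_stack([np.ones(len(features)), features])
beta, *_ = np.linalg.lstsq(X, majority.astype(float), rcond=None)

# Rating the model assigns to a new counterparty with characteristics (2, 3).
predicted = round(float(np.array([1.0, 2.0, 3.0]) @ beta))
print(predicted, concurrence)
```

    In a real rating system an ordinal regression model would be a more natural choice than rounding a linear fit, since grades are ordered categories rather than continuous scores.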

  6. Influence of Signal and Noise on Statistical Fluctuation of Single-Mode Laser System

    International Nuclear Information System (INIS)

    Xu Dahai; Cheng Qinghua; Cao Li; Wu Dajin

    2006-01-01

    On the basis of calculating the steady-state mean normalized intensity fluctuation of a single-mode laser system driven by both colored pump noise with signal modulation and quantum noise with cross-correlation between its real and imaginary parts, we analyze the influence of the modulation signal, the noise, and its correlation form on the statistical fluctuation of the laser system. We find that as the amplitude of the modulation signal decreases and its frequency increases, the statistical fluctuation is rapidly reduced. The statistical fluctuation of the laser system can be restrained by reducing the intensity of the pump noise and the quantum noise. Moreover, as the colored cross-correlation time is prolonged, the statistical fluctuation of the laser system undergoes a repeated change, that is, from decreasing to increasing, then to decreasing, and finally to increasing again. As the value of the cross-correlation coefficient decreases, the statistical fluctuation decreases too. When the cross-correlation between the real and imaginary parts of the quantum noise is zero, the statistical fluctuation of the laser system has a minimum. Compared with the influence of the intensity of the pump noise, the influence of the intensity of the quantum noise on the statistical fluctuation is smaller.

  7. Sensitivity analysis and related analysis : A survey of statistical techniques

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1995-01-01

    This paper reviews the state of the art in five related types of analysis, namely (i) sensitivity or what-if analysis, (ii) uncertainty or risk analysis, (iii) screening, (iv) validation, and (v) optimization. The main question is: when should which type of analysis be applied; which statistical

  8. Discrete changes of current statistics in periodically driven stochastic systems

    International Nuclear Information System (INIS)

    Chernyak, Vladimir Y; Sinitsyn, N A

    2010-01-01

    We demonstrate that the counting statistics of currents in periodically driven ergodic stochastic systems can show sharp changes of some of its properties in response to continuous changes of the driving protocol. To describe this effect, we introduce a new topological phase factor in the evolution of the moment generating function which is akin to the topological geometric phase in the evolution of a periodically driven quantum mechanical system with time-reversal symmetry. This phase leads to the prediction of a sign change for the difference of the probabilities to find even and odd numbers of particles transferred in a stochastic system in response to cyclic evolution of control parameters. The driving protocols that lead to this sign change should enclose specific degeneracy points in the space of control parameters. The relation between the topology of the paths in the control parameter space and the sign changes can be described in terms of the first Stiefel–Whitney class of topological invariants. (letter)

  9. The large deviation approach to statistical mechanics

    International Nuclear Information System (INIS)

    Touchette, Hugo

    2009-01-01

    The theory of large deviations is concerned with the exponential decay of probabilities of large fluctuations in random systems. These probabilities are important in many fields of study, including statistics, finance, and engineering, as they often yield valuable information about the large fluctuations of a random system around its most probable state or trajectory. In the context of equilibrium statistical mechanics, the theory of large deviations provides exponential-order estimates of probabilities that refine and generalize Einstein's theory of fluctuations. This review explores this and other connections between large deviation theory and statistical mechanics, in an effort to show that the mathematical language of statistical mechanics is the language of large deviation theory. The first part of the review presents the basics of large deviation theory, and works out many of its classical applications related to sums of random variables and Markov processes. The second part goes through many problems and results of statistical mechanics, and shows how these can be formulated and derived within the context of large deviation theory. The problems and results treated cover a wide range of physical systems, including equilibrium many-particle systems, noise-perturbed dynamics, nonequilibrium systems, as well as multifractals, disordered systems, and chaotic systems. This review also covers many fundamental aspects of statistical mechanics, such as the derivation of variational principles characterizing equilibrium and nonequilibrium states, the breaking of the Legendre transform for nonconcave entropies, and the characterization of nonequilibrium fluctuations through fluctuation relations.
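
    The exponential-order estimates described above can be made concrete with the simplest textbook example (not taken from the review itself): Cramér's theorem for fair coin flips, where the exact binomial tail probability P(S_n/n >= a) decays like exp(-n I(a)) with I the rate function.

```python
from math import comb, log

def rate(a, p=0.5):
    # Cramér rate function for Bernoulli(p): I(a) = a ln(a/p) + (1-a) ln((1-a)/(1-p))
    return a * log(a / p) + (1 - a) * log((1 - a) / (1 - p))

def tail_exponent(n, a):
    # -(1/n) ln P(S_n >= n*a), from the exact fair-coin binomial tail
    prob = sum(comb(n, k) for k in range(int(n * a), n + 1)) / 2**n
    return -log(prob) / n

# The finite-n exponent approaches I(0.7) = 0.0823... as n grows
for n in (50, 100, 200):
    print(n, round(tail_exponent(n, 0.7), 4))
print("I(0.7) =", round(rate(0.7), 4))
```

    The gap between the finite-n exponent and I(a) shrinks at the O((log n)/n) rate typical of such exponential-order estimates.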

  10. The large deviation approach to statistical mechanics

    Science.gov (United States)

    Touchette, Hugo

    2009-07-01

    The theory of large deviations is concerned with the exponential decay of probabilities of large fluctuations in random systems. These probabilities are important in many fields of study, including statistics, finance, and engineering, as they often yield valuable information about the large fluctuations of a random system around its most probable state or trajectory. In the context of equilibrium statistical mechanics, the theory of large deviations provides exponential-order estimates of probabilities that refine and generalize Einstein’s theory of fluctuations. This review explores this and other connections between large deviation theory and statistical mechanics, in an effort to show that the mathematical language of statistical mechanics is the language of large deviation theory. The first part of the review presents the basics of large deviation theory, and works out many of its classical applications related to sums of random variables and Markov processes. The second part goes through many problems and results of statistical mechanics, and shows how these can be formulated and derived within the context of large deviation theory. The problems and results treated cover a wide range of physical systems, including equilibrium many-particle systems, noise-perturbed dynamics, nonequilibrium systems, as well as multifractals, disordered systems, and chaotic systems. This review also covers many fundamental aspects of statistical mechanics, such as the derivation of variational principles characterizing equilibrium and nonequilibrium states, the breaking of the Legendre transform for nonconcave entropies, and the characterization of nonequilibrium fluctuations through fluctuation relations.

  11. HEALTH CARE SYSTEM AS AN OBJECT OF STATISTICAL RESEARCH

    Directory of Open Access Journals (Sweden)

    Pavel A. Smelov

    2015-01-01

    The article describes the health care system of the Russian Federation as an object of statistical analysis and outlines the features of statistical accounting in the Russian health system. The article highlights the key aspects of the health system that characterize it as fully as possible as an object of study.

  12. Quantum mechanics from classical statistics

    International Nuclear Information System (INIS)

    Wetterich, C.

    2010-01-01

    Quantum mechanics can emerge from classical statistics. A typical quantum system describes an isolated subsystem of a classical statistical ensemble with infinitely many classical states. The state of this subsystem can be characterized by only a few probabilistic observables. Their expectation values define a density matrix if they obey a 'purity constraint'. Then all the usual laws of quantum mechanics follow, including Heisenberg's uncertainty relation, entanglement and a violation of Bell's inequalities. No concepts beyond classical statistics are needed for quantum physics - the differences are only apparent and result from the particularities of those classical statistical systems which admit a quantum mechanical description. Born's rule for quantum mechanical probabilities follows from the probability concept for a classical statistical ensemble. In particular, we show how the non-commuting properties of quantum operators are associated to the use of conditional probabilities within the classical system, and how a unitary time evolution reflects the isolation of the subsystem. As an illustration, we discuss a classical statistical implementation of a quantum computer.

  13. The System of Indicators for the Statistical Evaluation of Market Conjuncture

    Directory of Open Access Journals (Sweden)

    Chernenko Daryna I.

    2017-04-01

    The article is aimed at systematizing and improving the system of statistical indicators for the market of laboratory health services (LHS) and developing methods for their calculation. In forming the system of statistical indicators for the LHS market, the allocation of nine blocks has been proposed: market size; market proportionality; market demand; market supply; level and dynamics of prices; variation of the LHS; dynamics, development trends, and cycles of the market; market structure; and level of competition and monopolization. The proposed system of statistical indicators, together with the methods for their calculation, should make it possible to study the trends and regularities in the formation of the market for laboratory health services in Ukraine.

  14. Using Relative Statistics and Approximate Disease Prevalence to Compare Screening Tests.

    Science.gov (United States)

    Samuelson, Frank; Abbey, Craig

    2016-11-01

    Schatzkin et al. and other authors demonstrated that the ratios of some conditional statistics such as the true positive fraction are equal to the ratios of unconditional statistics, such as disease detection rates, and therefore we can calculate these ratios between two screening tests on the same population even if negative test patients are not followed with a reference procedure and the true and false negative rates are unknown. We demonstrate that this same property applies to an expected utility metric. We also demonstrate that while simple estimates of relative specificities and relative areas under ROC curves (AUC) do depend on the unknown negative rates, we can write these ratios in terms of disease prevalence, and the dependence of these ratios on a posited prevalence is often weak particularly if that prevalence is small or the performance of the two screening tests is similar. Therefore we can estimate relative specificity or AUC with little loss of accuracy, if we use an approximate value of disease prevalence.
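
    The central identity is easy to verify numerically. With hypothetical sensitivities and an assumed prevalence (none of these numbers come from the paper), the ratio of observable disease detection rates equals the ratio of the unobservable true positive fractions, because the prevalence cancels:

```python
# Hypothetical screening comparison; sensitivities and prevalence are invented.
prevalence = 0.01                  # disease prevalence in the screened population
sens_A, sens_B = 0.85, 0.70        # true positive fractions (require follow-up to measure)

det_A = prevalence * sens_A        # detection rates: observable without a reference test
det_B = prevalence * sens_B

# prevalence cancels in the ratio, so relative sensitivity is estimable directly
print(round(det_A / det_B, 4), round(sens_A / sens_B, 4))   # both 1.2143
```

    This is why relative conditional statistics can be compared between two screening tests on the same population even when negative test patients receive no reference procedure.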

  15. Problems of a Statistical Ensemble Theory for Systems Far from Equilibrium

    Science.gov (United States)

    Ebeling, Werner

    The development of a general statistical physics of nonequilibrium systems was one of the main unfinished tasks of statistical physics in the 20th century. The aim of this work is the study of a special class of nonequilibrium systems for which the formulation of an ensemble theory of some generality is possible. These are the so-called canonical-dissipative systems, in which the driving terms are determined by invariants of motion. We construct canonical-dissipative systems which are ergodic on certain surfaces on the phase plane. These systems may be described by a nonequilibrium microcanonical ensemble, corresponding to an equal distribution on the target surface. Next we construct and solve Fokker-Planck equations; this leads to a kind of canonical-dissipative ensemble. In the last part we discuss the theoretical problem of how to define bifurcations in the framework of nonequilibrium statistics, and several possible applications.

  16. Introduction to the basic concepts of modern physics special relativity, quantum and statistical physics

    CERN Document Server

    Becchi, Carlo Maria

    2007-01-01

    These notes are designed as a textbook for a course on modern physics theory for undergraduate students. The purpose is to provide a rigorous and self-contained presentation of the simplest theoretical framework using elementary mathematical tools. A number of examples of relevant applications and an appropriate list of exercises and answered questions are also given. The first part is devoted to special relativity, concerning in particular space-time relativity and relativistic kinematics. The second part deals with Schroedinger's formulation of quantum mechanics. The presentation concerns mainly one-dimensional problems, in particular the tunnel effect, discrete energy levels, and band spectra. The third part concerns the application of Gibbs statistical methods to quantum systems, in particular to Bose and Fermi gases.

  17. Statistical Mechanics of Disordered Systems - Series: Cambridge Series in Statistical and Probabilistic Mathematics (No. 18)

    Science.gov (United States)

    Bovier, Anton

    2006-06-01

    Our mathematical understanding of the statistical mechanics of disordered systems is going through a period of stunning progress. This self-contained book is a graduate-level introduction for mathematicians and for physicists interested in the mathematical foundations of the field, and can be used as a textbook for a two-semester course on mathematical statistical mechanics. It assumes only basic knowledge of classical physics and, on the mathematics side, a good working knowledge of graduate-level probability theory. The book starts with a concise introduction to statistical mechanics, proceeds to disordered lattice spin systems, and concludes with a presentation of the latest developments in the mathematical understanding of mean-field spin glass models. In particular, recent progress towards a rigorous understanding of the replica symmetry-breaking solutions of the Sherrington-Kirkpatrick spin glass models, due to Guerra, Aizenman-Sims-Starr and Talagrand, is reviewed in some detail. The book offers a comprehensive introduction to an active and fascinating area of research, with a clear exposition that builds to the state of the art in the mathematics of spin glasses, written by a well-known and active researcher in the field.

  18. Statistical mechanics of driven diffusive systems

    CERN Document Server

    Schmittmann, B

    1995-01-01

    Far-from-equilibrium phenomena, while abundant in nature, are not nearly as well understood as their equilibrium counterparts. On the theoretical side, progress is slowed by the lack of a simple framework, such as the Boltzmann-Gibbs paradigm in the case of equilibrium thermodynamics. On the experimental side, the enormous structural complexity of real systems poses serious obstacles to comprehension. Similar difficulties have been overcome in equilibrium statistical mechanics by focusing on model systems. Even if they seem too simplistic for known physical systems, models give us considerable insight, provided they capture the essential physics. They serve as important theoretical testing grounds where the relationship between the generic physical behavior and the key ingredients of a successful theory can be identified and understood in detail. Within the vast realm of non-equilibrium physics, driven diffusive systems form a subset with particularly interesting properties. As a prototype model for these syst...

  19. Statistical Analysis of Hypercalcaemia Data related to Transferability

    DEFF Research Database (Denmark)

    Frølich, Anne; Nielsen, Bo Friis

    2005-01-01

    In this report we describe statistical analysis related to a study of hypercalcaemia carried out in the Copenhagen area in the ten year period from 1984 to 1994. Results from the study have previously been publised in a number of papers [3, 4, 5, 6, 7, 8, 9] and in various abstracts and posters...... at conferences during the late eighties and early nineties. In this report we give a more detailed description of many of the analysis and provide some new results primarily by simultaneous studies of several databases....

  20. Theoretical Aspects of the Patterns Recognition Statistical Theory Used for Developing the Diagnosis Algorithms for Complicated Technical Systems

    Science.gov (United States)

    Obozov, A. A.; Serpik, I. N.; Mihalchenko, G. S.; Fedyaeva, G. A.

    2017-01-01

    In the article, the problem of applying pattern recognition (a relatively young area of engineering cybernetics) to the analysis of complicated technical systems is examined. It is shown that the application of a statistical approach to hard-to-distinguish situations can be the most effective. The various recognition algorithms are based on the Bayes approach, which estimates the posterior probabilities of a certain event and an assumed error. Application of the statistical approach to pattern recognition is possible for solving the problem of technical diagnosis of complicated systems, in particular large high-powered marine diesel engines.

  1. Inferential, non-parametric statistics to assess the quality of probabilistic forecast systems

    NARCIS (Netherlands)

    Maia, A.H.N.; Meinke, H.B.; Lennox, S.; Stone, R.C.

    2007-01-01

    Many statistical forecast systems are available to interested users. To be useful for decision making, these systems must be based on evidence of underlying mechanisms. Once causal connections between the mechanism and its statistical manifestation have been firmly established, the forecasts must

  2. Addressing the statistical mechanics of planet orbits in the solar system

    Science.gov (United States)

    Mogavero, Federico

    2017-10-01

    The chaotic nature of planet dynamics in the solar system suggests the relevance of a statistical approach to planetary orbits. In such a statistical description, the time-dependent position and velocity of the planets are replaced by the probability density function (PDF) of their orbital elements. It is natural to set up this kind of approach in the framework of statistical mechanics. In the present paper, I focus on the collisionless excitation of eccentricities and inclinations via gravitational interactions in a planetary system. The future planet trajectories in the solar system constitute the prototype of this kind of dynamics. I thus address the statistical mechanics of the solar system planet orbits and try to reproduce the PDFs numerically constructed by Laskar (2008, Icarus, 196, 1). I show that the microcanonical ensemble of the Laplace-Lagrange theory accurately reproduces the statistics of the giant planet orbits. To model the inner planets I then investigate the ansatz of equiprobability in the phase space constrained by the secular integrals of motion. The eccentricity and inclination PDFs of Earth and Venus are reproduced with no free parameters. Within the limitations of a stationary model, the predictions also show a reasonable agreement with Mars PDFs and that of Mercury inclination. The eccentricity of Mercury demands in contrast a deeper analysis. I finally revisit the random walk approach of Laskar to the time dependence of the inner planet PDFs. Such a statistical theory could be combined with direct numerical simulations of planet trajectories in the context of planet formation, which is likely to be a chaotic process.

  3. Statistically accurate low-order models for uncertainty quantification in turbulent dynamical systems.

    Science.gov (United States)

    Sapsis, Themistoklis P; Majda, Andrew J

    2013-08-20

    A framework for low-order predictive statistical modeling and uncertainty quantification in turbulent dynamical systems is developed here. These reduced-order, modified quasilinear Gaussian (ROMQG) algorithms apply to turbulent dynamical systems in which there is significant linear instability or linear nonnormal dynamics in the unperturbed system and energy-conserving nonlinear interactions that transfer energy from the unstable modes to the stable modes where dissipation occurs, resulting in a statistical steady state; such turbulent dynamical systems are ubiquitous in geophysical and engineering turbulence. The ROMQG method involves constructing a low-order, nonlinear, dynamical system for the mean and covariance statistics in the reduced subspace that has the unperturbed statistics as a stable fixed point and optimally incorporates the indirect effect of non-Gaussian third-order statistics for the unperturbed system in a systematic calibration stage. This calibration procedure is achieved through information involving only the mean and covariance statistics for the unperturbed equilibrium. The performance of the ROMQG algorithm is assessed on two stringent test cases: the 40-mode Lorenz 96 model mimicking midlatitude atmospheric turbulence and two-layer baroclinic models for high-latitude ocean turbulence with over 125,000 degrees of freedom. In the Lorenz 96 model, the ROMQG algorithm with just a single mode captures the transient response to random or deterministic forcing. For the baroclinic ocean turbulence models, the inexpensive ROMQG algorithm with 252 modes, less than 0.2% of the total, captures the nonlinear response of the energy, the heat flux, and even the one-dimensional energy and heat flux spectra.
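
    The 40-mode Lorenz 96 test model mentioned above is easy to reproduce. The sketch below is a plain time integration only, not the ROMQG reduced-order algorithm, with the standard forcing F = 8 assumed; it shows the model settling into the statistical steady state whose mean and covariance statistics the ROMQG method targets.

```python
import numpy as np

def l96_rhs(x, F=8.0):
    # Lorenz 96: dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F  (cyclic indices)
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def rk4_step(x, dt):
    # classical fourth-order Runge-Kutta step
    k1 = l96_rhs(x)
    k2 = l96_rhs(x + 0.5 * dt * k1)
    k3 = l96_rhs(x + 0.5 * dt * k2)
    k4 = l96_rhs(x + dt * k3)
    return x + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6

rng = np.random.default_rng(1)
x = 8.0 + 0.01 * rng.normal(size=40)   # 40 modes, perturbed unstable fixed point
dt = 0.05
for _ in range(2000):                   # spin-up onto the attractor
    x = rk4_step(x, dt)
samples = []
for _ in range(20000):                  # sample the statistical steady state
    x = rk4_step(x, dt)
    samples.append(x.copy())
samples = np.asarray(samples)
print("mean", samples.mean().round(2), "std", samples.std().round(2))
```

    The long-time mean and standard deviation are the simplest of the equilibrium statistics that a calibrated low-order model must reproduce before its transient response can be trusted.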

  4. Quantum statistics of many-particle systems

    International Nuclear Information System (INIS)

    Kraeft, W.D.; Ebeling, W.; Kremp, D.; Ropke, G.

    1986-01-01

    This paper presents the elements of quantum statistics and discusses the quantum mechanics of many-particle systems. The method of second quantization is discussed and the Bogolyubov hierarchy is examined. The general properties of the correlation function and one-particle Green's function are examined. The paper presents dynamical and thermodynamical information contained in the spectral function. An equation of motion is given for the one-particle Green's function. T-matrix and thermodynamic properties in binary collision approximation are discussed

  5. Applied systems ecology: models, data, and statistical methods

    Energy Technology Data Exchange (ETDEWEB)

    Eberhardt, L L

    1976-01-01

    In this report, systems ecology is largely equated to mathematical or computer simulation modelling. The need for models in ecology stems from the necessity to have an integrative device for the diversity of ecological data, much of which is observational, rather than experimental, as well as from the present lack of a theoretical structure for ecology. Different objectives in applied studies require specialized methods. The best predictive devices may be regression equations, often non-linear in form, extracted from much more detailed models. A variety of statistical aspects of modelling, including sampling, are discussed. Several aspects of population dynamics and food-chain kinetics are described, and it is suggested that the two presently separated approaches should be combined into a single theoretical framework. It is concluded that future efforts in systems ecology should emphasize actual data and statistical methods, as well as modelling.

  6. The system for statistical analysis of logistic information

    Directory of Open Access Journals (Sweden)

    Khayrullin Rustam Zinnatullovich

    2015-05-01

    The current problem for managers in logistics and trading companies is improving operational business performance and developing the logistics support of sales. The development of logistics sales support presupposes a set of works for the development of the existing warehouse facilities, including both a detailed description of the work performed and the timing of its implementation. Logistics engineering of a warehouse complex includes such tasks as: determining the number and types of technological zones; calculating the required number of loading-unloading places; developing storage structures; developing pre-sales preparation zones; developing specifications of storage types; selecting loading-unloading equipment; detailed planning of the warehouse logistics system; creating architectural-planning decisions; selecting information-processing equipment; etc. The currently used ERP and WMS systems do not solve the full list of logistics engineering problems. In this regard, the development of specialized software products that take into account the specifics of warehouse logistics, and the subsequent integration of this software with ERP and WMS systems, is a current task. In this paper we propose a system for the statistical analysis of logistics information, designed to meet the challenges of logistics engineering and planning. The proposed specialized software is designed to improve the efficiency of the operating business and the development of logistics support of sales. The system is based on methods of statistical data processing, methods of assessment and prediction of logistics performance, methods for determining and calculating the data required for the registration, storage and processing of metal products, and methods for planning the reconstruction and development

  7. The relation between statistical power and inference in fMRI.

    Directory of Open Access Journals (Sweden)

    Henk R Cremers

    Statistically underpowered studies can result in experimental failure even when all other experimental considerations have been addressed impeccably. In fMRI, the combination of a large number of dependent variables, a relatively small number of observations (subjects), and the need to correct for multiple comparisons can decrease statistical power dramatically. This problem has been clearly addressed yet remains controversial, especially with regard to the expected effect sizes in fMRI, and especially for between-subjects effects such as group comparisons and brain-behavior correlations. We aimed to clarify the power problem by considering and contrasting two simulated scenarios of such possible brain-behavior correlations: weak diffuse effects and strong localized effects. Sampling from these scenarios shows that, particularly in the weak diffuse scenario, common sample sizes (n = 20-30) display extremely low statistical power, poorly represent the actual effects in the full sample, and show large variation on subsequent replications. Empirical data from the Human Connectome Project resemble the weak diffuse scenario much more than the strong localized scenario, which underscores the extent of the power problem for many studies. Possible solutions to the power problem include increasing the sample size, using less stringent thresholds, or focusing on a region of interest. However, these approaches are not always feasible, and some have major drawbacks. The most prominent solutions that may help address the power problem include model-based (multivariate) prediction methods and meta-analyses with related synthesis-oriented approaches.
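
    The severity of the weak diffuse scenario can be illustrated with a back-of-the-envelope power calculation using Fisher's z approximation. The effect size r = 0.15 and the voxelwise threshold z ≈ 3.29 (p < .001 two-sided) are assumptions for illustration, not the paper's simulation settings:

```python
from math import atanh, erf, sqrt

def phi(x):
    # standard normal CDF
    return 0.5 * (1 + erf(x / sqrt(2)))

def corr_power(rho, n, z_crit):
    # approximate power of a two-sided test of a correlation rho with n subjects,
    # via Fisher's z-transform: atanh(r) ~ N(atanh(rho), 1/(n-3))
    delta = atanh(rho) * sqrt(n - 3)
    return 1 - phi(z_crit - delta) + phi(-z_crit - delta)

# weak diffuse effect at a typical voxelwise threshold
for n in (20, 30, 100, 300):
    print(n, round(corr_power(0.15, n, 3.29), 3))
```

    Even n = 300 leaves power well below conventional targets at this threshold, consistent with the weak diffuse scenario described in the text.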

  8. Statistical trend analysis methodology for rare failures in changing technical systems

    International Nuclear Information System (INIS)

    Ott, K.O.; Hoffmann, H.J.

    1983-07-01

    A methodology for a statistical trend analysis (STA) in failure rates is presented. It applies primarily to relatively rare events in changing technologies or components. The formulation is more general and the assumptions are less restrictive than in a previously published version. Relations of the statistical analysis and probabilistic assessment (PRA) are discussed in terms of categorization of decisions for action following particular failure events. The significance of tentatively identified trends is explored. In addition to statistical tests for trend significance, a combination of STA and PRA results quantifying the trend complement is proposed. The STA approach is compared with other concepts for trend characterization. (orig.)
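
    The flavor of a trend test for rare failure events can be conveyed with the classical Laplace statistic for a point process observed on [0, T]. This is a standard textbook statistic, not necessarily the exact STA formulation of the paper, and the failure times below are invented:

```python
from math import sqrt

def laplace_trend(times, T):
    # Laplace trend statistic: under a constant failure rate the event times are
    # uniform on [0, T] and u is approximately standard normal; u > 0 suggests
    # an increasing failure intensity, u < 0 a decreasing one.
    n = len(times)
    return (sum(times) / n - T / 2) / (T / sqrt(12 * n))

# invented failure history over a 10-year observation window, clustered early
early_failures = [0.5, 1.1, 1.8, 2.4, 3.0, 4.2]
print(round(laplace_trend(early_failures, 10.0), 2))   # -2.4: significant decreasing trend
```

    A value below -1.96 would reject a constant failure rate at roughly the 5% level, the kind of tentatively identified trend whose significance the STA methodology then explores further.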

  9. Statistical mechanics for a class of quantum statistics

    International Nuclear Information System (INIS)

    Isakov, S.B.

    1994-01-01

    Generalized statistical distributions for identical particles are introduced for the case where filling a single-particle quantum state by particles depends on filling states of different momenta. The system of one-dimensional bosons with a two-body potential that can be solved by means of the thermodynamic Bethe ansatz is shown to be equivalent thermodynamically to a system of free particles obeying statistical distributions of the above class. The quantum statistics arising in this way are completely determined by the two-particle scattering phases of the corresponding interacting systems. An equation determining the statistical distributions for these statistics is derived

  10. GREY STATISTICS METHOD OF TECHNOLOGY SELECTION FOR ADVANCED PUBLIC TRANSPORTATION SYSTEMS

    Directory of Open Access Journals (Sweden)

    Chien Hung WEI

    2003-01-01

    Taiwan is engaged in intelligent transportation systems planning and is now selecting its priority focus areas for investment and development. The high social and economic impact associated with which intelligent transportation systems technologies are chosen explains the efforts of various electronics and transportation corporations to develop intelligent transportation systems technology and expand their business opportunities. However, no detailed research has been conducted on selecting technology for advanced public transportation systems in Taiwan. Thus, the present paper demonstrates a grey statistics method integrated with a scenario method for selecting advanced public transportation systems technology for Taiwan. A comprehensive questionnaire survey was conducted to demonstrate the effectiveness of the grey statistics method. The proposed approach indicated that contactless smart card technology is the appropriate technology for Taiwan to develop in the near future. Our results imply that the grey statistics method is an effective method for selecting advanced public transportation systems technologies. We feel this information will be beneficial to the private sector for developing an appropriate intelligent transportation systems technology strategy.
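
    To give a flavor of grey-system scoring for technology selection, the sketch below uses grey relational analysis, a related grey-system technique rather than the paper's exact grey statistics procedure; the candidate technologies echo the application area, but all criteria scores are invented:

```python
# Grey relational analysis sketch: rank candidate technologies against an
# ideal (best-per-criterion) alternative. Scores are invented, in [0, 1].
scores = {
    "contactless smart card": [0.9, 0.8, 0.7],
    "on-board GPS units":     [0.6, 0.9, 0.5],
    "electronic signboards":  [0.5, 0.6, 0.9],
}
rho = 0.5                                   # distinguishing coefficient
ideal = [max(col) for col in zip(*scores.values())]
deltas = {k: [abs(v - i) for v, i in zip(row, ideal)] for k, row in scores.items()}
dmin = min(min(d) for d in deltas.values())
dmax = max(max(d) for d in deltas.values())
grade = {
    k: sum((dmin + rho * dmax) / (d + rho * dmax) for d in row) / len(row)
    for k, row in deltas.items()
}
best = max(grade, key=grade.get)
print(best, {k: round(g, 3) for k, g in grade.items()})
```

    With these invented scores the highest grey relational grade goes to contactless smart card technology, which happens to match the technology the paper's survey identified.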

  11. Nuclear multifragmentation, its relation to general physics. A rich test ground of the fundamentals of statistical mechanics

    International Nuclear Information System (INIS)

    Gross, D.H.E.

    2006-01-01

    Heat can flow from cold to hot at any phase separation, even in macroscopic systems. Therefore Lynden-Bell's famous gravo-thermal catastrophe must also be reconsidered. In contrast to traditional canonical Boltzmann-Gibbs statistics, this is correctly described only by microcanonical statistics. Systems studied in chemical thermodynamics (ChTh) by means of canonical statistics consist of several homogeneous macroscopic phases. Evidently, macroscopic statistics as used in chemistry cannot and should not be applied to non-extensive or inhomogeneous systems like nuclei or galaxies. Nuclei are small and inhomogeneous. Multifragmented nuclei are even more inhomogeneous, and the fragments even smaller. First-order phase transitions, and especially phase separations, therefore cannot be described by a (homogeneous) canonical ensemble. Taking this seriously, fascinating perspectives open up for statistical nuclear fragmentation as a test ground for the basic principles of statistical mechanics, especially of phase transitions, without use of the thermodynamic limit. Moreover, there is also much similarity between the accessible phase space of fragmenting nuclei and inhomogeneous multistellar systems. This underlines the fundamental significance for statistical physics in general. (orig.)

  12. Interdisciplinary applications of statistical physics to complex systems: Seismic physics, econophysics, and sociophysics

    Science.gov (United States)

    Tenenbaum, Joel

    This thesis applies statistical physics concepts and methods to quantitatively analyze complex systems. The thesis is separated into four parts: (i) characteristics of earthquake systems, (ii) memory and volatility in data time series, (iii) the application of part (ii) to world financial markets, and (iv) statistical observations on the evolution of word usage. In Part I, we observe statistical patterns in the occurrence of earthquakes. We select a 14-year earthquake catalog covering the archipelago of Japan. We find that regions traditionally thought of as being too distant from one another for causal contact display remarkably high correlations, and the networks that result have a tendency to link highly connected areas with other highly connected areas. In Part II, we introduce and apply the concept of "volatility asymmetry", the primary use of which is in financial data. We explain the relation between memory and "volatility asymmetry" in terms of an asymmetry parameter lambda. We define a litmus test for determining whether lambda is statistically significant, propose a stochastic model based on this parameter, and use the model to further explain empirical data. In Part III, we expand on volatility asymmetry. Importing the concepts of time dependence and universality from physics, we explore the aspects of emerging (or "transition") economies in Eastern Europe as they relate to asymmetry. We find that these emerging markets in some instances behave like developed markets and in other instances do not, and that the distinction is a matter both of country and of time period, with crisis periods showing different asymmetry characteristics than "healthy" periods. In Part IV, we take note of a series of findings in econophysics showing statistical growth similarities between a variety of different areas that all have in common the fact of taking place in areas that are both (i) competing and (ii) dynamic. We show that this same growth distribution can be

  13. Statistical evaluation of major human errors during the development of new technological systems

    International Nuclear Information System (INIS)

    Campbell, G; Ott, K.O.

    1979-01-01

    Statistical procedures are presented to evaluate major human errors during the development of a new system, errors that have led or can lead to accidents or major failures. The first procedure aims at estimating the average residual occurrence rate for accidents or major failures after several have occurred. The procedure is based solely on the historical record. Certain idealizations are introduced that allow the application of a sound statistical evaluation procedure. These idealizations are realized in practice to a sufficient degree such that the proposed estimation procedure yields meaningful results, even for situations with a sparse database represented by very few accidents. Under the assumption that the possible human-error-related failure times have exponential distributions, the statistical technique of isotonic regression is proposed to estimate the failure rates due to human design error at the failure times of the system. The last value in the sequence of estimates gives the residual accident chance. In addition, the actual situation is tested against the hypothesis that the failure rate of the system remains constant over time. This test determines the chance for a decreasing failure rate being incidental, rather than an indication of an actual learning process. Both techniques can be applied not merely to a single system but to an entire series of similar systems that a technology would generate, enabling the assessment of technological improvement. For the purpose of illustration, the nuclear decay of isotopes was chosen as an example, since the assumptions of the model are rigorously satisfied in this case. This application shows satisfactory agreement of the estimated and actual failure rates (which are exactly known in this example), although the estimation was deliberately based on a sparse historical record.
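
The isotonic-regression step described in this abstract can be illustrated with the pool-adjacent-violators algorithm (PAVA). The sketch below fits a nonincreasing failure-rate sequence to naive per-interval rate estimates; the input numbers are hypothetical, not the paper's data.

```python
def pava_nonincreasing(y):
    """Least-squares fit of a nonincreasing sequence to y (pool-adjacent-violators)."""
    blocks = [[v, 1] for v in y]  # each block: [pooled value, weight]
    i = 0
    while i < len(blocks) - 1:
        if blocks[i][0] < blocks[i + 1][0]:  # violates nonincreasing order: pool
            v1, w1 = blocks[i]
            v2, w2 = blocks[i + 1]
            blocks[i] = [(v1 * w1 + v2 * w2) / (w1 + w2), w1 + w2]
            del blocks[i + 1]
            i = max(i - 1, 0)  # the pooled block may now violate on its left
        else:
            i += 1
    fitted = []
    for v, w in blocks:
        fitted.extend([v] * w)
    return fitted

# Hypothetical raw rate estimates between successive failures (per unit time)
rates = pava_nonincreasing([5.0, 3.0, 4.0, 2.0])
print(rates)  # → [5.0, 3.5, 3.5, 2.0]
```

The last fitted value plays the role of the residual failure rate; testing it against a constant-rate hypothesis is a separate step not shown here.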

  14. Statistics of resonances in one-dimensional continuous systems

    Indian Academy of Sciences (India)

    Journal of physics, Vol. 73, No. 3, September 2009, pp. 565–572. Statistics of resonances in one-dimensional continuous systems. JOSHUA FEINBERG, Physics Department, University of Haifa at Oranim, Tivon 36006, Israel. ... relativistic quantum mechanics (Israel Program for Scientific Translations, Jerusalem, 1969).

  15. Spectral statistics in chiral-orthogonal disordered systems

    International Nuclear Information System (INIS)

    Evangelou, S N; Katsanos, D E

    2003-01-01

    We describe the singularities in the averaged density of states and the corresponding statistics of the energy levels in two- (2D) and three-dimensional (3D) chiral symmetric and time-reversal invariant disordered systems, realized in bipartite lattices with real off-diagonal disorder. For off-diagonal disorder of zero mean, we obtain a singular density of states in 2D which becomes much less pronounced in 3D, while the level statistics can be described by a semi-Poisson distribution with mostly critical fractal states in 2D and a Wigner surmise with mostly delocalized states in 3D. For logarithmic off-diagonal disorder of large strength, we find behaviour indistinguishable from ordinary disorder with strong localization in any dimension, but in addition one-dimensional 1/|E| Dyson-like asymptotic spectral singularities. The off-diagonal disorder is also shown to enhance the propagation of two interacting particles, similarly to systems with diagonal disorder. Although disordered models with chiral symmetry differ from non-chiral ones due to the presence of spectral singularities, both share the same qualitative localization properties except at the chiral symmetry point E=0, which is critical.

  16. Fractional statistics of the vortex in two-dimensional superfluids

    International Nuclear Information System (INIS)

    Chiao, R.Y.; Hansen, A.; Moulthrop, A.A.

    1985-01-01

    The quantum behavior of two identical point vortices (e.g., in a superfluid 4He thin film) is studied. It is argued that this system obeys neither Bose nor Fermi statistics, but intermediate or theta statistics: we find that a single vortex in this system possesses quarter-fractional statistics (i.e., theta = π/2 or 3π/2). The source of the theta statistics is identified in the relative zero-point motion of the vortices.
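
For context (standard background, not stated in the abstract): under exchange of two identical particles in two dimensions the wavefunction may pick up an arbitrary phase, and the quoted values sit midway between the Bose and Fermi cases:

```latex
\psi(\mathbf{r}_2,\mathbf{r}_1) = e^{i\theta}\,\psi(\mathbf{r}_1,\mathbf{r}_2),
\qquad
\theta = 0 \ (\text{bosons}), \qquad
\theta = \pi \ (\text{fermions}), \qquad
\theta = \tfrac{\pi}{2},\ \tfrac{3\pi}{2} \ (\text{vortices, per this abstract}).
```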

  17. Age related neuromuscular changes in sEMG of m. Tibialis Anterior using higher order statistics (Gaussianity & linearity test).

    Science.gov (United States)

    Siddiqi, Ariba; Arjunan, Sridhar P; Kumar, Dinesh K

    2016-08-01

    Age-associated changes in the surface electromyogram (sEMG) of the Tibialis Anterior (TA) muscle can be attributed to neuromuscular alterations that precede strength loss. We have used our sEMG model of the Tibialis Anterior to interpret the age-related changes and compared it with the experimental sEMG. Eighteen young (20-30 years) and 18 older (60-85 years) participants performed isometric dorsiflexion at 6 different percentage levels of maximum voluntary contraction (MVC), and their sEMG from the TA muscle was recorded. Six different age-related changes in the neuromuscular system were simulated using the sEMG model at the same MVCs as the experiment. The maximal power of the spectrum, and the Gaussianity and linearity test statistics, were computed from the simulated and experimental sEMG. A correlation analysis at α=0.05 was performed between the simulated and experimental age-related changes in the sEMG features. The results show that the loss of motor units was distinguished by the Gaussianity and linearity test statistics, while the maximal power of the PSD distinguished between the muscular factors. The simulated condition of 40% loss of motor units with the number of fast fibers halved best correlated with the age-related change observed in the experimental sEMG higher-order statistical features. The simulated aging condition found by this study corresponds with the moderate motor unit remodelling and negligible strength loss reported in the literature for cohorts aged 60-70 years.
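
The paper's Gaussianity and linearity tests are higher-order-statistics tests (in the bispectrum sense); as a much simpler stand-in, the sketch below shows how a fourth-order statistic (excess kurtosis) separates Gaussian from non-Gaussian signals. The surrogate signals are synthetic, not sEMG.

```python
import random

def excess_kurtosis(x):
    """Sample excess kurtosis; zero in expectation for Gaussian data."""
    n = len(x)
    m = sum(x) / n
    m2 = sum((v - m) ** 2 for v in x) / n
    m4 = sum((v - m) ** 4 for v in x) / n
    return m4 / m2 ** 2 - 3.0

rng = random.Random(0)
gaussian_like = [rng.gauss(0.0, 1.0) for _ in range(20000)]
heavy_tailed = [rng.gauss(0.0, 1.0) ** 3 for _ in range(20000)]  # cubed → non-Gaussian

print(abs(excess_kurtosis(gaussian_like)) < 0.2)  # near zero for Gaussian input
print(excess_kurtosis(heavy_tailed) > 5.0)        # clearly non-Gaussian
```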

  18. MAI statistics estimation and analysis in a DS-CDMA system

    Science.gov (United States)

    Alami Hassani, A.; Zouak, M.; Mrabti, M.; Abdi, F.

    2018-05-01

    A primary limitation of Direct Sequence Code Division Multiple Access DS-CDMA link performance and system capacity is multiple access interference (MAI). To examine the performance of CDMA systems in the presence of MAI, i.e., in a multiuser environment, several works assumed that the interference can be approximated by a Gaussian random variable. In this paper, we first develop a new and simple approach to characterize the MAI in a multiuser system. In addition to statistically quantifying the MAI power, the paper also proposes a statistical model for both variance and mean of the MAI for synchronous and asynchronous CDMA transmission. We show that the MAI probability density function (PDF) is Gaussian for the equal-received-energy case and validate it by computer simulations.
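
A minimal Monte-Carlo sketch of the Gaussian approximation discussed above, for a toy synchronous model with random ±1 spreading chips (user count and spreading gain are hypothetical, not taken from the paper): each of the K-1 interferers contributes a normalized code cross-correlation, so the MAI has zero mean and variance ≈ (K-1)/N.

```python
import random
import statistics

def simulate_mai(num_users=16, gain=31, trials=5000, seed=1):
    """MAI samples for a toy synchronous DS-CDMA model with random +/-1 chips."""
    rng = random.Random(seed)
    samples = []
    for _ in range(trials):
        mai = 0.0
        for _ in range(num_users - 1):  # each interfering user
            corr = sum(rng.choice((-1, 1)) for _ in range(gain))
            mai += corr / gain          # normalized code cross-correlation
        samples.append(mai)
    return samples

samples = simulate_mai()
print(abs(statistics.fmean(samples)) < 0.05)                # zero mean
print(abs(statistics.pvariance(samples) - 15 / 31) < 0.05)  # ≈ (K-1)/N
```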

  19. Statistical Indicators for Religious Studies: Indicators of Level and Structure

    Science.gov (United States)

    Herteliu, Claudiu; Isaic-Maniu, Alexandru

    2009-01-01

    Using statistic indicators as vectors of information relative to the operational status of a phenomenon, including a religious one, is unanimously accepted. By introducing a system of statistic indicators we can also analyze the interfacing areas of a phenomenon. In this context, we have elaborated a system of statistic indicators specific to the…

  20. Discriminatory power of water polo game-related statistics at the 2008 Olympic Games.

    Science.gov (United States)

    Escalante, Yolanda; Saavedra, Jose M; Mansilla, Mirella; Tella, Victor

    2011-02-01

    The aims of this study were (1) to compare water polo game-related statistics by context (winning and losing teams) and sex (men and women), and (2) to identify characteristics discriminating the performances for each sex. The game-related statistics of the 64 matches (44 men's and 20 women's) played in the final phase of the Olympic Games held in Beijing in 2008 were analysed. Unpaired t-tests compared winners and losers and men and women, and confidence intervals and effect sizes of the differences were calculated. The results were subjected to a discriminant analysis to identify the differentiating game-related statistics of the winning and losing teams. The results showed the differences between winning and losing men's teams to be in both defence and offence, whereas in women's teams they were only in offence. In men's games, passing (assists), aggressive play (exclusions), centre position effectiveness (centre shots), and goalkeeper defence (goalkeeper-blocked 5-m shots) predominated, whereas in women's games the play was more dynamic (possessions). The variable that most discriminated performance in men was goalkeeper-blocked shots, and in women shooting effectiveness (shots). These results should help coaches when planning training and competition.
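
As a sketch of the discriminant-analysis idea (Fisher's two-group linear discriminant on two game statistics), with entirely hypothetical per-game numbers, not the Olympic data:

```python
def fisher_direction(group_a, group_b):
    """Fisher's two-group discriminant direction w = Sw^-1 (ma - mb), 2-D case."""
    def mean(g):
        return (sum(p[0] for p in g) / len(g), sum(p[1] for p in g) / len(g))

    def scatter(g, m):
        sxx = sum((p[0] - m[0]) ** 2 for p in g)
        syy = sum((p[1] - m[1]) ** 2 for p in g)
        sxy = sum((p[0] - m[0]) * (p[1] - m[1]) for p in g)
        return sxx, syy, sxy

    ma, mb = mean(group_a), mean(group_b)
    axx, ayy, axy = scatter(group_a, ma)
    bxx, byy, bxy = scatter(group_b, mb)
    sxx, syy, sxy = axx + bxx, ayy + byy, axy + bxy  # pooled within-group scatter
    det = sxx * syy - sxy * sxy
    dx, dy = ma[0] - mb[0], ma[1] - mb[1]
    # closed-form 2x2 inverse applied to the mean difference
    return [(syy * dx - sxy * dy) / det, (sxx * dy - sxy * dx) / det]

# Hypothetical per-game (total shots, goalkeeper-blocked shots): winners vs. losers
winners = [(32, 9), (30, 8), (34, 10), (31, 9)]
losers = [(27, 5), (25, 6), (26, 4), (28, 5)]
w = fisher_direction(winners, losers)
score = lambda p: w[0] * p[0] + w[1] * p[1]
print(score((33, 9)) > score((26, 5)))  # winner-like game scores higher
```

The structure coefficients reported in the abstract are a further correlation step between each variable and this discriminant score, not computed here.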

  1. Designing a Course in Statistics for a Learning Health Systems Training Program

    Science.gov (United States)

    Samsa, Gregory P.; LeBlanc, Thomas W.; Zaas, Aimee; Howie, Lynn; Abernethy, Amy P.

    2014-01-01

    The core pedagogic problem considered here is how to effectively teach statistics to physicians who are engaged in a "learning health system" (LHS). This is a special case of a broader issue--namely, how to effectively teach statistics to academic physicians for whom research--and thus statistics--is a requirement for professional…

  2. Statistical mechanics

    CERN Document Server

    Jana, Madhusudan

    2015-01-01

    This book on statistical mechanics is self-sufficient and written in a lucid manner, keeping in mind the examination system of universities. The need to study this subject and its relation to thermodynamics is discussed in detail. Starting from the Liouville theorem, statistical mechanics is developed thoroughly. All three types of statistical distribution functions are derived separately, with their range of applications and limitations. Non-interacting ideal Bose and Fermi gases are discussed thoroughly. Properties of liquid He-II and the corresponding models are depicted. White dwarfs and condensed matter physics, and transport phenomena - thermal and electrical conductivity, the Hall effect, magnetoresistance, viscosity, diffusion, etc. - are discussed. A basic understanding of the Ising model is given to explain phase transitions. The book ends with detailed coverage of the method of ensembles (namely microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems ar...

  3. Statistical optics

    Science.gov (United States)

    Goodman, J. W.

    This book is based on the thesis that some training in the area of statistical optics should be included as a standard part of any advanced optics curriculum. Random variables are discussed, taking into account definitions of probability and random variables, distribution functions and density functions, an extension to two or more random variables, statistical averages, transformations of random variables, sums of real random variables, Gaussian random variables, complex-valued random variables, and random phasor sums. Other subjects examined are related to random processes, some first-order properties of light waves, the coherence of optical waves, some problems involving high-order coherence, effects of partial coherence on imaging systems, imaging in the presence of randomly inhomogeneous media, and fundamental limits in photoelectric detection of light. Attention is given to deterministic versus statistical phenomena and models, the Fourier transform, and the fourth-order moment of the spectrum of a detected speckle image.

  4. Quantum statistics and squeezing for a microwave-driven interacting magnon system.

    Science.gov (United States)

    Haghshenasfard, Zahra; Cottam, Michael G

    2017-02-01

    Theoretical studies are reported for the statistical properties of a microwave-driven interacting magnon system. Both the magnetic dipole-dipole and the exchange interactions are included and the theory is developed for the case of parallel pumping allowing for the inclusion of the nonlinear processes due to the four-magnon interactions. The method of second quantization is used to transform the total Hamiltonian from spin operators to boson creation and annihilation operators. By using the coherent magnon state representation we have studied the magnon occupation number and the statistical behavior of the system. In particular, it is shown that the nonlinearities introduced by the parallel pumping field and the four-magnon interactions lead to non-classical quantum statistical properties of the system, such as magnon squeezing. Also control of the collapse-and-revival phenomena for the time evolution of the average magnon number is demonstrated by varying the parallel pumping amplitude and the four-magnon coupling.

  5. A system for learning statistical motion patterns.

    Science.gov (United States)

    Hu, Weiming; Xiao, Xuejuan; Fu, Zhouyu; Xie, Dan; Tan, Tieniu; Maybank, Steve

    2006-09-01

    Analysis of motion patterns is an effective approach for anomaly detection and behavior prediction. Current approaches for the analysis of motion patterns depend on known scenes, where objects move in predefined ways. It is highly desirable to automatically construct object motion patterns which reflect the knowledge of the scene. In this paper, we present a system for automatically learning motion patterns for anomaly detection and behavior prediction based on a proposed algorithm for robustly tracking multiple objects. In the tracking algorithm, foreground pixels are clustered using a fast accurate fuzzy K-means algorithm. Growing and prediction of the cluster centroids of foreground pixels ensure that each cluster centroid is associated with a moving object in the scene. In the algorithm for learning motion patterns, trajectories are clustered hierarchically using spatial and temporal information and then each motion pattern is represented with a chain of Gaussian distributions. Based on the learned statistical motion patterns, statistical methods are used to detect anomalies and predict behaviors. Our system is tested using image sequences acquired, respectively, from a crowded real traffic scene and a model traffic scene. Experimental results show the robustness of the tracking algorithm, the efficiency of the algorithm for learning motion patterns, and the encouraging performance of algorithms for anomaly detection and behavior prediction.
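
The clustering step can be sketched with a plain fuzzy c-means update (the paper uses a fast, accurate variant; this is the textbook iteration on made-up 2-D points, not foreground pixels from real video):

```python
def fuzzy_2means(points, m=2.0, iters=50):
    """Textbook fuzzy c-means with k=2; naive deterministic seeding."""
    centers = [points[0], points[-1]]
    for _ in range(iters):
        # membership u[i][j] from inverse relative distances (fuzzifier m)
        u = []
        for p in points:
            d = [max(((p[0] - c[0]) ** 2 + (p[1] - c[1]) ** 2) ** 0.5, 1e-12)
                 for c in centers]
            u.append([1.0 / sum((d[j] / d[l]) ** (2.0 / (m - 1.0)) for l in range(2))
                      for j in range(2)])
        # centers move to the membership-weighted means of all points
        new_centers = []
        for j in range(2):
            w = [u[i][j] ** m for i in range(len(points))]
            tw = sum(w)
            new_centers.append((sum(wi * p[0] for wi, p in zip(w, points)) / tw,
                                sum(wi * p[1] for wi, p in zip(w, points)) / tw))
        centers = new_centers
    return centers

pts = [(0, 0), (0, 1), (1, 0), (9, 9), (9, 10), (10, 9)]
a, b = sorted(fuzzy_2means(pts))
print(a[0] < 1.5 and b[0] > 8.5)  # one centre settles on each blob
```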

  6. Subject-enabled analytics model on measurement statistics in health risk expert system for public health informatics.

    Science.gov (United States)

    Chung, Chi-Jung; Kuo, Yu-Chen; Hsieh, Yun-Yu; Li, Tsai-Chung; Lin, Cheng-Chieh; Liang, Wen-Miin; Liao, Li-Na; Li, Chia-Ing; Lin, Hsueh-Chun

    2017-11-01

    This study applied open source technology to establish a subject-enabled analytics model that can enhance measurement statistics of case studies with public health data in cloud computing. The infrastructure of the proposed model comprises three domains: 1) the health measurement data warehouse (HMDW) for the case study repository, 2) the self-developed modules of online health risk information statistics (HRIStat) for cloud computing, and 3) the prototype of a Web-based process automation system in statistics (PASIS) for the health risk assessment of case studies with subject-enabled evaluation. The system design employed freeware including Java applications, MySQL, and R packages to drive a health risk expert system (HRES). In the design, the HRIStat modules enforce the typical analytics methods for biomedical statistics, and the PASIS interfaces enable process automation of the HRES for cloud computing. The Web-based model supports both modes, step-by-step analysis and an auto-computing process, for preliminary evaluation and real-time computation, respectively. The proposed model was evaluated by recomputing prior research on the epidemiological measurement of diseases caused by either heavy metal exposure in the environment or clinical complications in hospital. The simulation validity was verified with commercial statistics software. The model was installed in a stand-alone computer and in a cloud-server workstation to verify computing performance for a data amount of more than 230K sets. Both setups reached an efficiency of about 10^5 sets per second. The Web-based PASIS interface can be used for cloud computing, and the HRIStat module can be flexibly expanded with advanced subjects for measurement statistics. The analytics procedure of the HRES prototype is capable of providing assessment criteria prior to estimating the potential risk to public health. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Game Related Statistics Which Discriminate Between Winning and Losing Under-16 Male Basketball Games

    Science.gov (United States)

    Lorenzo, Alberto; Gómez, Miguel Ángel; Ortega, Enrique; Ibáñez, Sergio José; Sampaio, Jaime

    2010-01-01

    The aim of the present study was to identify the game-related statistics which discriminate between winning and losing teams in under-16 male basketball games. The sample gathered all 122 games in the 2004 and 2005 Under-16 European Championships. The game-related statistics analysed were free-throws (both successful and unsuccessful), 2- and 3-point field-goals (both successful and unsuccessful), offensive and defensive rebounds, blocks, assists, fouls, turnovers and steals. The winning teams exhibited fewer ball possessions per game and better offensive and defensive efficacy coefficients than the losing teams. Results from discriminant analysis were statistically significant and highlighted several structure coefficients (SC). In close games (final score differences below 9 points), the discriminant variables were turnovers (SC = -0.47) and assists (SC = 0.33). In balanced games (final score differences between 10 and 29 points), the variables that discriminated between the groups were successful 2-point field-goals (SC = -0.34) and defensive rebounds (SC = -0.36); and in unbalanced games (final score differences above 30 points) the variable that best discriminated both groups was successful 2-point field-goals (SC = 0.37). These results allowed understanding that these players' specific characteristics result in a different game-related statistical profile and helped to point out the importance of the perceptive and decision-making process in practice and in competition. Key points: The players' game-related statistical profile varied according to game type, game outcome and formative category in basketball. The results of this work help to point out the different player performance described in U-16 men's basketball teams compared with senior and professional men's basketball teams. The results obtained enhance the importance of the perceptive and decision-making process in practice and in competition.

  8. A flexible statistics web processing service--added value for information systems for experiment data.

    Science.gov (United States)

    Heimann, Dennis; Nieschulze, Jens; König-Ries, Birgitta

    2010-04-20

    Data management in the life sciences has evolved from simple storage of data to complex information systems providing additional functionality such as analysis and visualization capabilities, demanding the integration of statistical tools. In many cases the statistical tools used are hard-coded within the system. That leads to expensive integration, substitution, or extension of tools, because all changes have to be made in program code. Other systems use generic solutions for tool integration, but adapting them to another system mostly requires rather extensive work. This paper shows a way to provide statistical functionality through a statistics web service, which can be easily integrated into any information system and set up using XML configuration files. The statistical functionality is extendable by simply adding the description of a new application to a configuration file. The service architecture, the data exchange process between client and service, and the addition of analysis applications to the underlying service provider are described. Furthermore, a practical example demonstrates the functionality of the service.

  9. Call for civil registration and vital statistics systems experts | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2017-06-30

    Jun 30, 2017 ... This is a call for experts in civil registration, information technology, public health, statistics, law, ... digitization (including IT systems design, and system integration and ... socio-cultural and anthropological research); and public health. ... IDRC and key partners will showcase critical work on adaptation and ...

  10. Analysis of neutron flux measurement systems using statistical functions

    International Nuclear Information System (INIS)

    Pontes, Eduardo Winston

    1997-01-01

    This work develops an integrated analysis for neutron flux measurement systems using the concepts of cumulants and spectra. Its major contribution is the generalization of Campbell's theorem in the form of spectra in the frequency domain, and its application to the analysis of neutron flux measurement systems. Campbell's theorem, in its generalized form, constitutes an important tool, not only to find the nth-order frequency spectra of the radiation detector, but also in the system analysis. The radiation detector, an ionization chamber for neutrons, is modeled for cylindrical, plane and spherical geometries. The detector current pulses are characterized by a vector of random parameters, and the associated charges, statistical moments and frequency spectra of the resulting current are calculated. A computer program is developed for application of the proposed methodology. In order for the analysis to integrate the associated electronics, the signal processor is studied, considering analog and digital configurations. The analysis is unified by developing the concept of equivalent systems that can be used to describe the cumulants and spectra in analog or digital systems. The noise in the signal processor input stage is analysed in terms of second order spectrum. Mathematical expressions are presented for cumulants and spectra up to fourth order, for important cases of filter positioning relative to detector spectra. Unbiased conventional estimators for cumulants are used, and, to evaluate systems precision and response time, expressions are developed for their variances. Finally, some possibilities for obtaining neutron radiation flux as a function of cumulants are discussed. In summary, this work proposes some analysis tools which make possible important decisions in the design of better neutron flux measurement systems. (author)
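
Campbell's theorem in its elementary (first- and second-order) form says that for a filtered Poisson process of rate λ with pulse shape h(t), the mean output is λ∫h dt and its variance is λ∫h² dt. The Monte-Carlo sketch below checks this for a unit-height exponential pulse (all parameters are illustrative; this is not the detector model of the thesis):

```python
import math
import random

def shot_noise_trace(rate=5.0, tau=0.2, dt=0.01, total=2000.0, seed=2):
    """Filtered Poisson process with unit-height exponential pulses exp(-t/tau)."""
    rng = random.Random(seed)
    decay = math.exp(-dt / tau)
    y, trace = 0.0, []
    for _ in range(int(total / dt)):
        y *= decay
        if rng.random() < rate * dt:  # Poisson arrival in this time slot
            y += 1.0
        trace.append(y)
    return trace

trace = shot_noise_trace()
mean = sum(trace) / len(trace)
var = sum((v - mean) ** 2 for v in trace) / len(trace)
# Campbell: mean = rate*tau = 1.0 and var = rate*tau/2 = 0.5 for this pulse
print(abs(mean - 1.0) < 0.1, abs(var - 0.5) < 0.1)
```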

  11. On nonequilibrium many-body systems. 1: The nonequilibrium statistical operator method

    International Nuclear Information System (INIS)

    Algarte, A.C.S.; Vasconcellos, A.R.; Luzzi, R.; Sampaio, A.J.C.

    1985-01-01

    The theoretical aspects involved in the treatment of many-body systems far from equilibrium are discussed. The nonequilibrium statistical operator (NSO) method is considered in detail. Using Jaynes' maximum entropy formalism, complemented with an ad hoc hypothesis, a nonequilibrium statistical operator is obtained. This approach introduces irreversibility from the outset, and we recover statistical operators like those of Green-Mori and Zubarev as particular cases. The connection with generalized thermodynamics and the construction of nonlinear transport equations are briefly described. (Author) [pt
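
Schematically (the standard Jaynes/Zubarev maximum-entropy form, supplied here for orientation rather than taken from the paper), the statistical operator is built from a set of relevant observables P_n with Lagrange multipliers F_n(t):

```latex
\varrho(t) \;=\; \exp\!\Big[-\Phi(t) \;-\; \sum_n F_n(t)\, P_n\Big],
\qquad
\Phi(t) \;=\; \ln \operatorname{Tr} \exp\!\Big[-\sum_n F_n(t)\, P_n\Big],
```

where Φ(t) ensures normalization.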

  12. Statistical Physics of Complex Substitutive Systems

    Science.gov (United States)

    Jin, Qing

    Diffusion processes are central to human interactions. Despite extensive studies that span multiple disciplines, our knowledge is limited to spreading processes in non-substitutive systems. Yet, a considerable number of ideas, products, and behaviors spread by substitution; to adopt a new one, agents must give up an existing one. This captures the spread of scientific constructs--forcing scientists to choose, for example, a deterministic or probabilistic worldview, as well as the adoption of durable items, such as mobile phones, cars, or homes. In this dissertation, I develop a statistical physics framework to describe, quantify, and understand substitutive systems. By empirically exploring three collected high-resolution datasets pertaining to such systems, I build a mechanistic model describing substitutions, which not only analytically predicts the universal macroscopic phenomenon discovered in the collected datasets, but also accurately captures the trajectories of individual items in a complex substitutive system, demonstrating a high degree of regularity and universality in substitutive systems. I also discuss the origins and insights of the parameters in the substitution model and possible generalization form of the mathematical framework. The systematical study of substitutive systems presented in this dissertation could potentially guide the understanding and prediction of all spreading phenomena driven by substitutions, from electric cars to scientific paradigms, and from renewable energy to new healthy habits.

  13. Renyi statistics in equilibrium statistical mechanics

    International Nuclear Information System (INIS)

    Parvan, A.S.; Biro, T.S.

    2010-01-01

    The Renyi statistics in the canonical and microcanonical ensembles is examined, both in general and in particular for the ideal gas. In the microcanonical ensemble, the Renyi statistics is equivalent to the Boltzmann-Gibbs statistics. Using exact analytical results for the ideal gas, it is shown that in the canonical ensemble, taking the thermodynamic limit, the Renyi statistics is also equivalent to the Boltzmann-Gibbs statistics. Furthermore, it satisfies the requirements of equilibrium thermodynamics, i.e. the thermodynamic potential of the statistical ensemble is a homogeneous function of first degree in its extensive variables of state. We conclude that the Renyi statistics arrives at the same thermodynamic relations as those stemming from the Boltzmann-Gibbs statistics in this limit.
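
For reference, the entropy behind Renyi statistics and its Boltzmann-Gibbs limit (standard definitions, consistent with the equivalence claimed in the abstract):

```latex
S_q^{\mathrm{R}} \;=\; \frac{1}{1-q}\,\ln \sum_i p_i^{\,q},
\qquad
\lim_{q\to 1} S_q^{\mathrm{R}} \;=\; -\sum_i p_i \ln p_i \;=\; S^{\mathrm{BG}}.
```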

  14. Nonequilibrium work relation in a macroscopic system

    International Nuclear Information System (INIS)

    Sughiyama, Yuki; Ohzeki, Masayuki

    2013-01-01

    We reconsider a well-known relationship between the fluctuation theorem and the second law of thermodynamics by evaluating stochastic evolution of the density field (probability measure valued process). In order to establish a bridge between microscopic and macroscopic behaviors, we must take the thermodynamic limit of a stochastic dynamical system following the standard procedure in statistical mechanics. The thermodynamic path characterizing a dynamical behavior in the macroscopic scale can be formulated as an infimum of the action functional for the stochastic evolution of the density field. In our formulation, the second law of thermodynamics can be derived only by symmetry of the action functional without recourse to the Jarzynski equality. Our formulation leads to a nontrivial nonequilibrium work relation for metastable (quasi-stationary) states, which are peculiar in the macroscopic system. We propose a prescription for computing the free energy for metastable states based on the resultant work relation. (paper)

  15. Statistical fluctuations and correlations in hadronic equilibrium systems

    International Nuclear Information System (INIS)

    Hauer, Michael

    2010-01-01

    This thesis is dedicated to the study of fluctuation and correlation observables of hadronic equilibrium systems. The statistical hadronization model of high energy physics, in its ideal, i.e. non-interacting, gas approximation is investigated in different ensemble formulations. The hypothesis of thermal and chemical equilibrium in high energy interaction is tested against qualitative and quantitative predictions. (orig.)

  16. Statistical fluctuations and correlations in hadronic equilibrium systems

    Energy Technology Data Exchange (ETDEWEB)

    Hauer, Michael

    2010-06-17

    This thesis is dedicated to the study of fluctuation and correlation observables of hadronic equilibrium systems. The statistical hadronization model of high energy physics, in its ideal, i.e. non-interacting, gas approximation is investigated in different ensemble formulations. The hypothesis of thermal and chemical equilibrium in high energy interaction is tested against qualitative and quantitative predictions. (orig.)

  17. Photon statistics of a single-atom intracavity system involving electromagnetically induced transparency

    International Nuclear Information System (INIS)

    Rebic, S.; Parkins, A.S.; Tan, S.M.

    2002-01-01

    We explore the photon statistics of light emitted from a system comprising a single four-level atom strongly coupled to a high-finesse optical cavity mode that is driven by a coherent laser field. In the weak driving regime this system is found to exhibit a photon blockade effect. For intermediate driving strengths we find a sudden change in the photon statistics of the light emitted from the cavity. Photon antibunching switches to photon bunching over a very narrow range of intracavity photon number. It is proven that this sudden change in photon statistics occurs due to the existence of robust quantum interference of transitions between the dressed states of the atom-cavity system. Furthermore, it is shown that the strong photon bunching is a nonclassical effect for certain values of driving field strength, violating classical inequalities for field correlations
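
The antibunching/bunching language used above is conventionally quantified by the equal-time second-order correlation function (standard definition, not spelled out in the abstract):

```latex
g^{(2)}(0) \;=\; \frac{\langle a^\dagger a^\dagger a\, a\rangle}{\langle a^\dagger a\rangle^{2}},
\qquad
g^{(2)}(0) < 1 \ \text{(antibunching / blockade)},
\qquad
g^{(2)}(0) > 1 \ \text{(bunching)}.
```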

  18. Register-based statistics statistical methods for administrative data

    CERN Document Server

    Wallgren, Anders

    2014-01-01

    This book provides a comprehensive and up-to-date treatment of theory and practical implementation in register-based statistics. It begins by defining the area, before explaining how to structure such systems, as well as detailing alternative approaches. It explains how to create statistical registers, how to implement quality assurance, and the use of IT systems for register-based statistics. Further to this, clear details are given about the practicalities of implementing such statistical methods, such as protection of privacy and the coordination and coherence of such an undertaking. Thi

  19. A Review of Modeling Bioelectrochemical Systems: Engineering and Statistical Aspects

    Directory of Open Access Journals (Sweden)

    Shuai Luo

    2016-02-01

    Full Text Available Bioelectrochemical systems (BES) are promising technologies to convert organic compounds in wastewater to electrical energy through a series of complex physical-chemical, biological and electrochemical processes. Representative BES such as microbial fuel cells (MFCs) have been studied and advanced for energy recovery. Substantial experimental and modeling efforts have been made for investigating the processes involved in electricity generation toward the improvement of the BES performance for practical applications. However, there are many parameters that will potentially affect these processes, thereby making the optimization of system performance difficult to achieve. Mathematical models, including engineering models and statistical models, are powerful tools to help understand the interactions among the parameters in BES and perform optimization of BES configuration/operation. This review paper aims to introduce and discuss the recent developments of BES modeling from engineering and statistical aspects, including analysis of the model structure, description of application cases and sensitivity analysis of various parameters. It is expected to serve as a compass for integrating the engineering and statistical modeling strategies to improve model accuracy for BES development.

  20. Statistical reliability assessment of software-based systems

    International Nuclear Information System (INIS)

    Korhonen, J.; Pulkkinen, U.; Haapanen, P.

    1997-01-01

    Plant vendors nowadays propose software-based systems even for the most critical safety functions. The reliability estimation of safety-critical software-based systems is difficult since the conventional modeling techniques do not necessarily apply to the analysis of these systems, and the quantification seems to be impossible. Due to the lack of operational experience and the nature of software faults, the conventional reliability estimation methods cannot be applied. New methods are therefore needed for the safety assessment of software-based systems. In the research project Programmable automation systems in nuclear power plants (OHA), financed jointly by the Finnish Centre for Radiation and Nuclear Safety (STUK), the Ministry of Trade and Industry and the Technical Research Centre of Finland (VTT), various safety assessment methods and tools for software-based systems are developed and evaluated. This volume in the OHA-report series deals with the statistical reliability assessment of software-based systems on the basis of dynamic test results and qualitative evidence from the system design process. Other reports to be published later in the OHA-report series will handle the diversity requirements in safety-critical software-based systems, generation of test data from operational profiles and handling of programmable automation in plant PSA studies. (orig.) (25 refs.)

  1. Nonequilibrium statistical mechanics of systems with long-range interactions

    Energy Technology Data Exchange (ETDEWEB)

    Levin, Yan, E-mail: levin@if.ufrgs.br; Pakter, Renato, E-mail: pakter@if.ufrgs.br; Rizzato, Felipe B., E-mail: rizzato@if.ufrgs.br; Teles, Tarcísio N., E-mail: tarcisio.teles@fi.infn.it; Benetti, Fernanda P.C., E-mail: fbenetti@if.ufrgs.br

    2014-02-01

    Systems with long-range (LR) forces, for which the interaction potential decays with the interparticle distance with an exponent smaller than the dimensionality of the embedding space, remain an outstanding challenge to statistical physics. The internal energy of such systems lacks extensivity and additivity. Although the extensivity can be restored by scaling the interaction potential with the number of particles, the non-additivity still remains. Lack of additivity leads to inequivalence of statistical ensembles. Before relaxing to thermodynamic equilibrium, isolated systems with LR forces become trapped in out-of-equilibrium quasi-stationary states (qSSs), the lifetime of which diverges with the number of particles. Therefore, in the thermodynamic limit LR systems will not relax to equilibrium. The qSSs are attained through the process of collisionless relaxation. Density oscillations lead to particle–wave interactions and excitation of parametric resonances. The resonant particles escape from the main cluster to form a tenuous halo. Simultaneously, this cools down the core of the distribution and dampens out the oscillations. When all the oscillations die out the ergodicity is broken and a qSS is born. In this report, we will review a theory which allows us to quantitatively predict the particle distribution in the qSS. The theory is applied to various LR interacting systems, ranging from plasmas to self-gravitating clusters and kinetic spin models.

  2. Bayesian Statistics and Uncertainty Quantification for Safety Boundary Analysis in Complex Systems

    Science.gov (United States)

    He, Yuning; Davies, Misty Dawn

    2014-01-01

    The analysis of a safety-critical system often requires detailed knowledge of safe regions and their high-dimensional non-linear boundaries. We present a statistical approach to iteratively detect and characterize the boundaries, which are provided as parameterized shape candidates. Using methods from uncertainty quantification and active learning, we incrementally construct a statistical model from only a few simulation runs and obtain statistically sound estimates of the shape parameters for safety boundaries.

  3. Automated Material Accounting Statistics System at Rockwell Hanford Operations

    International Nuclear Information System (INIS)

    Eggers, R.F.; Giese, E.W.; Kodman, G.P.

    1986-01-01

    The Automated Material Accounting Statistics System (AMASS) was developed under the sponsorship of the U.S. Nuclear Regulatory Commission. The AMASS was developed when it was realized that classical methods of error propagation, based only on measured quantities, did not properly control false alarm rate and that errors other than measurement errors affect inventory differences. The classical assumptions that (1) the mean value of the inventory difference (ID) for a particular nuclear material processing facility is zero, and (2) the variance of the inventory difference is due only to errors in measured quantities are overly simplistic. The AMASS provides a valuable statistical tool for estimating the true mean value and variance of the ID data produced by a particular material balance area. In addition, it provides statistical methods of testing both individual and cumulative sums of IDs, taking into account the estimated mean value and total observed variance of the ID.
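The cumulative-sum test mentioned above can be sketched as follows; the standardization parameters and the reference/decision values k and h are illustrative placeholders, not AMASS's actual estimates:

```python
import numpy as np

def cusum_test(ids, mu0=0.0, sigma=1.0, k=0.5, h=5.0):
    """One-sided CUSUM over a sequence of inventory differences (IDs),
    standardized by an estimated mean mu0 and standard deviation sigma.
    Returns the CUSUM path and the index of the first alarm (or None)."""
    z = (np.asarray(ids, dtype=float) - mu0) / sigma
    s, path, alarm = 0.0, [], None
    for i, zi in enumerate(z):
        s = max(0.0, s + zi - k)  # accumulate only upward drift beyond k
        path.append(s)
        if alarm is None and s > h:
            alarm = i
    return path, alarm

# A sustained shift of +2 sigma starting at index 20 triggers an alarm
# a few observations later (here at index 23).
path, alarm = cusum_test([0.0] * 20 + [2.0] * 10)
print(alarm)  # 23
```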

  4. Study of energy fluctuation effect on the statistical mechanics of equilibrium systems

    International Nuclear Information System (INIS)

    Lysogorskiy, Yu V; Wang, Q A; Tayurskii, D A

    2012-01-01

    This work is devoted to the modeling of the effect of energy fluctuations on the behavior of small classical thermodynamic systems. It is known that when an equilibrium system gets smaller and smaller, one of the major quantities that becomes more and more uncertain is its internal energy. These increasing fluctuations can considerably modify the original statistics. The present model considers the effect of such energy fluctuations and is based on an overlap between the Boltzmann-Gibbs statistics and the statistics of the fluctuation. Within this 'overlap statistics', we studied the effects of several types of energy fluctuations on the probability distribution, internal energy and heat capacity. It was shown that the fluctuations can considerably change the temperature dependence of internal energy and heat capacity in the low energy range and at low temperatures. In particular, it was found that, due to the lower energy limit of the systems, the fluctuations reduce the probability of the low energy states close to the lowest energy and increase the total average energy. This energy increase is larger at lower temperatures, making a negative heat capacity possible in this case.
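A toy version of this 'overlap' construction: Boltzmann weights are averaged over Gaussian energy fluctuations, with the fluctuated energy clipped at the system's lower energy limit E = 0. The level spacing, fluctuation width, and units (k_B = 1) are our illustrative assumptions, not the paper's exact model:

```python
import numpy as np

rng = np.random.default_rng(1)

def avg_energy(T, sigma, n_levels=200, samples=4000):
    """Mean energy of levels E_n = n whose Boltzmann weights are averaged
    over Gaussian energy fluctuations of width sigma, respecting the
    lower energy limit E >= 0 (toy 'overlap statistics', k_B = 1)."""
    levels = np.arange(n_levels)
    E = levels[:, None] + sigma * rng.normal(size=(n_levels, samples))
    E = np.clip(E, 0.0, None)               # enforce the lower energy limit
    w = np.exp(-E / T).mean(axis=1)         # fluctuation-averaged weights
    p = w / w.sum()
    return float(levels @ p)

# The clipped low-energy states lose relative weight, so fluctuations
# raise the mean energy, most visibly at low temperature.
print(avg_energy(T=1.0, sigma=0.0), avg_energy(T=1.0, sigma=2.0))
```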

  5. Correcting the Count: Improving Vital Statistics Data Regarding Deaths Related to Obesity.

    Science.gov (United States)

    McCleskey, Brandi C; Davis, Gregory G; Dye, Daniel W

    2017-11-15

    Obesity can involve any organ system and compromise the overall health of an individual, including premature death. Despite the increased risk of death associated with being obese, obesity itself is infrequently indicated on the death certificate. We performed an audit of our records to identify how often "obesity" was listed on the death certificate to determine how our practices affected national mortality data collection regarding obesity-related mortality. During the span of nearly 25 years, 0.2% of deaths were attributed to or contributed by obesity. Over the course of 5 years, 96% of selected natural deaths were likely underreported as being associated with obesity. We present an algorithm for certifiers to use to determine whether obesity should be listed on the death certificate and guidelines for certifying cases in which this is appropriate. Use of this algorithm will improve vital statistics concerning the role of obesity in causing or contributing to death. © 2017 American Academy of Forensic Sciences.

  6. Actual problems of accession in relation with library statistics

    Directory of Open Access Journals (Sweden)

    Tereza Poličnik-Čermelj

    2010-01-01

    Full Text Available Accession is the process of recording bibliographic units in an accession register. Typically, library materials are acquired by purchase, exchange, gift or legal deposit. However, the COBISS (Cooperative Online Bibliographic System and Services) Holdings software module includes some additional methods of acquisition, which cause problems in gathering and presenting statistical data on local holdings. The article explains how to record holdings of different types of library materials and how to record retrospective collections. It describes the procedures necessary in case the codes that define the publication pattern of the holdings are changed, with special attention to integrating resources. Procedures of accession and circulation of bound materials, supplementary materials, teaching sets, multi-parts, multimedia and collection-level catalogue records are described. Attention is given to errors in recording lost-item replacements and to the problems of circulation of certain types of library materials. The author also suggests how to record remote electronic resources. It is recommended to verify holdings data before the accession register is generated. Relevant and credible statistical data on collection development can only be created by librarians with sufficient acquisition and cataloguing skills.

  7. Operation of the radiation dose registration system for decontamination and related works

    International Nuclear Information System (INIS)

    Ogawa, Tsubasa; Yasutake, Tsuneo; Itoh, Atsuo; Miyabe, Kenjiro

    2017-01-01

    The radiation dose registration system for decontamination and related works was established on 15 November 2013. The radiation dose registration center and the primary contractors of decontamination and related works manage the decontamination registration and management system. As of 31 March 2017, 384 primary contractors had joined the radiation dose registration system for decontamination and related works, and 383,087 quarterly exposure dose records for decontamination and related works had been registered. Based on the registered data provided by the primary contractors, the radiation dose registration center has released statistical data that represent the radiation control status for workers engaged in radiation work at the work areas of decontamination and related works, etc. The statistical data show that there were 40,377 workers engaged in decontamination and related works in 2015. The average exposure dose for workers was 0.6 mSv in 2015, and the maximum was 7.8 mSv. Dose distribution by age shows that workers aged 60 to 64 were the most numerous among those engaged in decontamination and related works in 2015. Dose distribution by gender shows that 97% of workers were male in 2015. From 2012 to 2015, about 95% of workers were exposed to less than 3 mSv, and about 80% to less than 1 mSv. The average exposure dose per year ranged from 0.5 to 0.7 mSv. (author)

  8. National Vital Statistics System (NVSS) - National Cardiovascular Disease Surveillance Data

    Data.gov (United States)

    U.S. Department of Health & Human Services — 2000 forward. NVSS is a secure, web-based data management system that collects and disseminates the Nation's official vital statistics. Indicators from this data...

  9. A proposal for the measurement of graphical statistics effectiveness: Does it enhance or interfere with statistical reasoning?

    International Nuclear Information System (INIS)

    Agus, M; Penna, M P; Peró-Cebollero, M; Guàrdia-Olmos, J

    2015-01-01

    Numerous studies have examined students' difficulties in understanding some notions related to statistical problems. Some authors observed that the presentation of distinct visual representations could increase statistical reasoning, supporting the principle of graphical facilitation. But other researchers disagree with this viewpoint, emphasising the impediments related to the use of illustrations that could overload the cognitive system with insignificant data. In this work we aim at comparing probabilistic statistical reasoning across two different formats of problem presentation: graphical and verbal-numerical. We conceived and presented five pairs of homologous simple problems in the verbal-numerical and graphical formats to 311 undergraduate Psychology students (n=156 in Italy and n=155 in Spain) without statistical expertise. The purpose of our work was to evaluate the effect of graphical facilitation in probabilistic statistical reasoning. Each undergraduate solved every pair of problems in the two formats, in different problem presentation orders and sequences. Data analyses highlighted that the effect of graphical facilitation is infrequent in psychology undergraduates. This effect is related to many factors (such as knowledge, abilities, attitudes, and anxiety); moreover, it might be considered the result of interaction between individual and task characteristics.

  10. Statistical Anxiety and Attitudes Towards Statistics: Development of a Comprehensive Danish Instrument

    DEFF Research Database (Denmark)

    Nielsen, Tine; Kreiner, Svend

    Short abstract: Motivated by experience with students' psychological barriers to learning statistics, we modified and extended the Statistical Anxiety Rating Scale (STARS) to develop a contemporary Danish measure of attitudes and relationship to statistics for use with higher education students...... with evidence of DIF in all cases: One TCA-item functioned differentially relative to age, one WS-item functioned differentially relative to statistics course (first or second), and two IA-items functioned differentially relative to statistics course and academic discipline (sociology, public health...

  11. Development of nuclear power plant online monitoring system using statistical quality control

    International Nuclear Information System (INIS)

    An, Sang Ha

    2006-02-01

    Statistical Quality Control techniques have been applied to many aspects of industrial engineering. An application to nuclear power plant maintenance and control is also presented that can greatly improve plant safety. As a demonstration of such an approach, a specific system is analyzed: the reactor coolant pumps (RCP) and the fouling resistance of the heat exchanger. This research uses Shewhart X-bar and R charts, Cumulative Sum charts (CUSUM), and the Sequential Probability Ratio Test (SPRT) to analyze the process for the state of statistical control. A Control Chart Analyzer (CCA) was developed to support these analyses and to decide whether the process is in error. The analysis shows that statistical process control methods can be applied as an early warning system capable of identifying significant equipment problems well in advance of traditional control room alarm indicators. Such a system would provide operators with enough time to respond to possible emergency situations and thus improve plant safety and reliability.
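A Shewhart X-bar/R computation of the sort described can be sketched in a few lines. The control-chart factors A2 and D4 are the standard tabulated values for subgroups of size 5; the data layout is our assumption, not the system's actual interface:

```python
import numpy as np

# Standard control-chart factors for subgroup size n = 5.
A2, D4 = 0.577, 2.114

def xbar_r_limits(subgroups):
    """Shewhart X-bar and R chart limits from rational subgroups of 5
    measurements each (e.g. repeated readings of an RCP parameter)."""
    sub = np.asarray(subgroups, dtype=float)         # shape (m, 5)
    xbarbar = sub.mean(axis=1).mean()                # grand mean
    rbar = np.ptp(sub, axis=1).mean()                # mean subgroup range
    return {"xbar": (xbarbar - A2 * rbar, xbarbar + A2 * rbar),
            "R": (0.0, D4 * rbar)}

limits = xbar_r_limits([[1, 2, 3, 4, 5], [2, 3, 4, 5, 6]])
print(limits)
```

Points falling outside these limits flag the process as out of statistical control, the early-warning condition the abstract describes.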

  12. THE ORGANIZATIONAL PRINCIPLES OF CONSTRUCTION THE SYSTEM OF STATISTICAL INDICATORS OF SECONDARY EDUCATION

    Directory of Open Access Journals (Sweden)

    А. Haniukova

    2015-04-01

    Full Text Available The article deals with the structure and content of the statistical analysis of secondary education. Particular attention is paid to the organizational principles of constructing a system of statistical indicators of the status and trends of the phenomenon. A system of indicators is presented, containing both existing indicators and those proposed by the author, the analysis of which will increase the efficiency of public administration in this area.

  13. Effects of Consecutive Basketball Games on the Game-Related Statistics that Discriminate Winning and Losing Teams

    Science.gov (United States)

    Ibáñez, Sergio J.; García, Javier; Feu, Sebastian; Lorenzo, Alberto; Sampaio, Jaime

    2009-01-01

    The aim of the present study was to identify the game-related statistics that discriminated between basketball winning and losing teams in each of the three consecutive games played in a condensed tournament format. The data were obtained from the Spanish Basketball Federation and included game-related statistics from the Under-20 league (2005-2006 and 2006-2007 seasons). A total of 223 games were analyzed with the following game-related statistics: two- and three-point field goals (made and missed), free-throws (made and missed), offensive and defensive rebounds, assists, steals, turnovers, blocks (made and received), fouls committed, ball possessions and offensive rating. Results showed that winning teams in this competition had better values in all game-related statistics, with the exception of three-point field goals made, free-throws missed and turnovers (p ≥ 0.05). A main effect of game number was identified only in turnovers, with a statistically significant decrease between the second and third game. No interaction was found in the analysed variables. A discriminant analysis identified the two-point field goals made, the defensive rebounds and the assists as discriminators between winning and losing teams in all three games. In addition to these, only the three-point field goals made contributed to discriminating teams in game three, suggesting a moderate effect of fatigue. Coaches may benefit from being aware of this variation in game-determinant statistics and, also, from using offensive and defensive strategies in the third game that allow them to explore or hide three-point field-goal performance. Key points: Overall team performances along the three consecutive games were very similar, not confirming an accumulated fatigue effect. The results from the three-point field goals in the third game suggested that winning teams were able to shoot better from longer distances and this could be the result of exhibiting higher conditioning status and
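The discriminant-analysis step can be illustrated with a plain two-class Fisher discriminant; the toy data and column meanings below are our illustration, not the Spanish Federation dataset:

```python
import numpy as np

rng = np.random.default_rng(3)

def fisher_weights(X_win, X_lose):
    """Fisher discriminant direction w ~ Sw^-1 (mu_win - mu_lose); a large
    |w_j| flags game statistics that best separate winners from losers."""
    Sw = np.cov(X_win.T) + np.cov(X_lose.T)      # pooled within-class scatter
    return np.linalg.solve(Sw, X_win.mean(0) - X_lose.mean(0))

# Toy data: column 0 (say, two-point field goals made) separates the
# classes, column 1 does not.
X_win = rng.normal([8.0, 5.0], 1.0, size=(60, 2))
X_lose = rng.normal([5.0, 5.0], 1.0, size=(60, 2))
w = fisher_weights(X_win, X_lose)
print(w)  # weight on column 0 dominates
```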

  14. Nonequilibrium statistical mechanics and stochastic thermodynamics of small systems

    International Nuclear Information System (INIS)

    Tu Zhanchun

    2014-01-01

    Thermodynamics is an old subject. The research objects in conventional thermodynamics are macroscopic systems with a huge number of particles. In the past 30 years, the thermodynamics of small systems has become a frontier topic in physics. Here we introduce nonequilibrium statistical mechanics and stochastic thermodynamics of small systems. As a case study, we construct a Carnot-like cycle of a stochastic heat engine with a single particle controlled by a time-dependent harmonic potential. We find that the efficiency at maximum power is 1 - √(Tc/Th), where Tc and Th are the temperatures of the cold bath and hot bath, respectively. (author)
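The quoted efficiency at maximum power is the Curzon-Ahlborn form; a one-line numerical comparison with the Carnot bound (the temperatures are arbitrary examples):

```python
import math

def eta_carnot(Tc, Th):
    """Carnot efficiency, the reversible upper bound."""
    return 1.0 - Tc / Th

def eta_max_power(Tc, Th):
    """Efficiency at maximum power quoted in the abstract,
    1 - sqrt(Tc/Th) (the Curzon-Ahlborn form)."""
    return 1.0 - math.sqrt(Tc / Th)

Tc, Th = 300.0, 600.0
print(eta_max_power(Tc, Th), eta_carnot(Tc, Th))  # ~0.293 vs 0.5
```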

  15. Statistical models and methods for reliability and survival analysis

    CERN Document Server

    Couallier, Vincent; Huber-Carol, Catherine; Mesbah, Mounir; Huber -Carol, Catherine; Limnios, Nikolaos; Gerville-Reache, Leo

    2013-01-01

    Statistical Models and Methods for Reliability and Survival Analysis brings together contributions by specialists in statistical theory as they discuss their applications providing up-to-date developments in methods used in survival analysis, statistical goodness of fit, stochastic processes for system reliability, amongst others. Many of these are related to the work of Professor M. Nikulin in statistics over the past 30 years. The authors gather together various contributions with a broad array of techniques and results, divided into three parts - Statistical Models and Methods, Statistical

  16. The Role of Discrete Global Grid Systems in the Global Statistical Geospatial Framework

    Science.gov (United States)

    Purss, M. B. J.; Peterson, P.; Minchin, S. A.; Bermudez, L. E.

    2016-12-01

    The United Nations Committee of Experts on Global Geospatial Information Management (UN-GGIM) has proposed the development of a Global Statistical Geospatial Framework (GSGF) as a mechanism for the establishment of common analytical systems that enable the integration of statistical and geospatial information. Conventional coordinate reference systems address the globe with a continuous field of points suitable for repeatable navigation and analytical geometry. While this continuous field is represented on a computer in a digitized and discrete fashion by tuples of fixed-precision floating point values, it is a non-trivial exercise to relate point observations spatially referenced in this way to areal coverages on the surface of the Earth. The GSGF states the need to move to gridded data delivery and the importance of using common geographies and geocoding. The challenges associated with meeting these goals are not new and there has been a significant effort within the geospatial community to develop nested gridding standards to tackle these issues over many years. These efforts have recently culminated in the development of a Discrete Global Grid Systems (DGGS) standard which has been developed under the auspices of Open Geospatial Consortium (OGC). DGGS provide a fixed areal based geospatial reference frame for the persistent location of measured Earth observations, feature interpretations, and modelled predictions. DGGS address the entire planet by partitioning it into a discrete hierarchical tessellation of progressively finer resolution cells, which are referenced by a unique index that facilitates rapid computation, query and analysis. The geometry and location of the cell is the principle aspect of a DGGS. Data integration, decomposition, and aggregation is optimised in the DGGS hierarchical structure and can be exploited for efficient multi-source data processing, storage, discovery, transmission, visualization, computation, analysis, and modelling. 
During
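The hierarchical cell indexing described can be illustrated with a toy lon/lat quadtree; real DGGS implementations use equal-area cells rather than this lat/lon split, and the coordinates below are arbitrary examples:

```python
def cell_index(lon, lat, depth=10):
    """Toy hierarchical grid index: refine a lon/lat bounding box
    quadtree-style, emitting one base-4 digit per level. Nearby points
    share an index prefix, so aggregation to a coarser resolution is
    just prefix truncation."""
    lo_x, hi_x, lo_y, hi_y = -180.0, 180.0, -90.0, 90.0
    digits = []
    for _ in range(depth):
        mx, my = (lo_x + hi_x) / 2.0, (lo_y + hi_y) / 2.0
        qx, qy = int(lon >= mx), int(lat >= my)
        digits.append(str(2 * qy + qx))
        lo_x, hi_x = (mx, hi_x) if qx else (lo_x, mx)
        lo_y, hi_y = (my, hi_y) if qy else (lo_y, my)
    return "".join(digits)

a, b = cell_index(151.2, -33.9), cell_index(151.3, -33.8)
print(a, b, a[:3] == b[:3])  # nearby points share a coarse-cell prefix
```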

  17. A new Markov-chain-related statistical approach for modelling synthetic wind power time series

    International Nuclear Information System (INIS)

    Pesch, T; Hake, J F; Schröders, S; Allelein, H J

    2015-01-01

    The integration of rising shares of volatile wind power in the generation mix is a major challenge for the future energy system. To address the uncertainties involved in wind power generation, models analysing and simulating the stochastic nature of this energy source are becoming increasingly important. One statistical approach that has been frequently used in the literature is the Markov chain approach. Recently, the method was identified as being of limited use for generating wind time series with time steps shorter than 15–40 min as it is not capable of reproducing the autocorrelation characteristics accurately. This paper presents a new Markov-chain-related statistical approach that is capable of solving this problem by introducing a variable second lag. Furthermore, additional features are presented that allow for the further adjustment of the generated synthetic time series. The influences of the model parameter settings are examined by meaningful parameter variations. The suitability of the approach is demonstrated by an application analysis with the example of the wind feed-in in Germany. It shows that—in contrast to conventional Markov chain approaches—the generated synthetic time series do not systematically underestimate the required storage capacity to balance wind power fluctuation. (paper)
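The variable-second-lag idea can be sketched as a chain whose transition probabilities condition on the states at t-1 and t-lag2; the discretization, the Laplace smoothing, and the stand-in input signal are our illustrative choices, not the paper's calibration:

```python
import numpy as np

rng = np.random.default_rng(2)

def fit_and_sample(series, n_states=8, lag2=4, length=500):
    """Fit transition probabilities P[s(t) | s(t-1), s(t-lag2)] from a
    signal in [0, 1], then generate a synthetic series of the same kind
    (Laplace smoothing keeps unseen transitions possible)."""
    s = np.minimum((np.asarray(series) * n_states).astype(int), n_states - 1)
    counts = np.ones((n_states, n_states, n_states))
    for t in range(lag2, len(s)):
        counts[s[t - 1], s[t - lag2], s[t]] += 1
    P = counts / counts.sum(axis=2, keepdims=True)
    out = list(s[:lag2])
    for _ in range(lag2, length):
        out.append(rng.choice(n_states, p=P[out[-1], out[-lag2]]))
    return np.array(out) / (n_states - 1)

wind = np.abs(np.sin(np.linspace(0.0, 20.0, 1000)))  # stand-in feed-in signal
synthetic = fit_and_sample(wind)
```

Conditioning on the extra lag is what restores the short-time autocorrelation structure that a first-order chain misses.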

  18. A statistical view of uncertainty in expert systems

    International Nuclear Information System (INIS)

    Spiegelhalter, D.J.

    1986-01-01

    The constructors of expert systems interpret "uncertainty" in a wide sense and have suggested a variety of qualitative and quantitative techniques for handling the concept, such as the theory of "endorsements," fuzzy reasoning, and belief functions. After a brief selective review of procedures that do not adhere to the laws of probability, it is argued that a subjectivist Bayesian view of uncertainty, if flexibly applied, can provide many of the features demanded by expert systems. This claim is illustrated with a number of examples of probabilistic reasoning, and a connection is drawn with statistical work on the graphical representation of multivariate distributions. Possible areas of future research are outlined.

  19. Bell Correlations in a Many-Body System with Finite Statistics

    Science.gov (United States)

    Wagner, Sebastian; Schmied, Roman; Fadel, Matteo; Treutlein, Philipp; Sangouard, Nicolas; Bancal, Jean-Daniel

    2017-10-01

    A recent experiment reported the first violation of a Bell correlation witness in a many-body system [Science 352, 441 (2016)]. Following discussions in this Letter, we address here the question of the statistics required to witness Bell correlated states, i.e., states violating a Bell inequality, in such experiments. We start by deriving multipartite Bell inequalities involving an arbitrary number of measurement settings, two outcomes per party and one- and two-body correlators only. Based on these inequalities, we then build up improved witnesses able to detect Bell correlated states in many-body systems using two collective measurements only. These witnesses can potentially detect Bell correlations in states with an arbitrarily low amount of spin squeezing. We then establish an upper bound on the statistics needed to convincingly conclude that a measured state is Bell correlated.

  20. Statistics in a Trilinear Interacting Stokes-Antistokes Boson System

    Science.gov (United States)

    Tänzler, W.; Schütte, F.-J.

    The statistics of a system of four boson modes is treated with simultaneous Stokes-Antistokes interaction taking place. The time evolution is calculated in a fully quantum manner but in a short-time approximation. Mean photon numbers and correlations of second order are calculated. Antibunching can be found in the laser mode and in the system of Stokes and Antistokes modes.

  1. Method of statistical estimation of temperature minimums in binary systems

    International Nuclear Information System (INIS)

    Mireev, V.A.; Safonov, V.V.

    1985-01-01

    On the basis of statistical processing of literature data, a technique is developed for evaluating temperature minima on liquidus curves in binary systems, taking common-ion chloride systems as an example. The systems are formed by 48 chlorides of 45 chemical elements, including alkali, alkaline earth, rare earth and transition metals as well as Cd, In, Th. It is shown that the calculation error in determining minimum melting points depends on the topology of the phase diagram. A comparison of calculated and experimental data for several previously unstudied systems is given.

  2. Statistical modeling of nitrogen-dependent modulation of root system architecture in Arabidopsis thaliana.

    Science.gov (United States)

    Araya, Takao; Kubo, Takuya; von Wirén, Nicolaus; Takahashi, Hideki

    2016-03-01

    Plant root development is strongly affected by nutrient availability. Despite the importance of the structure and function of roots in nutrient acquisition, statistical modeling approaches to evaluate dynamic and temporal modulations of root system architecture in response to nutrient availability have remained widely open and exploratory areas in root biology. In this study, we developed a statistical modeling approach to investigate modulations of root system architecture in response to nitrogen availability. Mathematical models were designed for quantitative assessment of root growth and root branching phenotypes and their dynamic relationships, based on the hierarchical configuration of primary and lateral roots that formulates the fishbone-shaped root system architecture in Arabidopsis thaliana. Time-series datasets reporting dynamic changes in root developmental traits on different nitrate or ammonium concentrations were generated for statistical analyses. Regression analyses unraveled key parameters associated with: (i) inhibition of primary root growth under nitrogen limitation or on ammonium; (ii) rapid progression of lateral root emergence in response to ammonium; and (iii) inhibition of lateral root elongation in the presence of excess nitrate or ammonium. This study provides a statistical framework for interpreting dynamic modulation of root system architecture, supported by meta-analysis of datasets displaying morphological responses of roots to diverse nitrogen supplies. © 2015 Institute of Botany, Chinese Academy of Sciences.
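One elementary instance of the regression step, recovering a growth-rate parameter from a root-length time series via a log-linear fit; the data and the assumed exponential phase are a toy illustration, not the paper's models:

```python
import numpy as np

def fit_exponential_phase(t, length):
    """Least-squares fit of log L(t) = log L0 + r*t for an (assumed)
    exponential root-growth phase; the rate r is the kind of parameter
    compared across nitrogen treatments."""
    r, log_l0 = np.polyfit(t, np.log(length), 1)
    return r, np.exp(log_l0)

t = np.arange(6.0)                 # days
lengths = 0.5 * np.exp(0.8 * t)    # synthetic primary-root lengths (cm)
r, l0 = fit_exponential_phase(t, lengths)
print(r, l0)  # recovers r = 0.8, L0 = 0.5
```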

  3. Quantum statistical theory of solid plasma (Com.1)

    International Nuclear Information System (INIS)

    Kim Yon Il

    1986-01-01

    In order to obtain the Hamiltonian of the electron system in solid plasma, the self-consistent electromagnetic field formed by the electron system is quantized. In this process the longitudinal vector potential is introduced through the relation. The obtained Hamiltonian is expressed by the collective coordinates, consistent with D. Pines' result. Various quantum statistical expressions, the dispersion relation and sum rules of the transverse dielectric function are derived using the fact that the collective coordinates are connected with the electromagnetic field in the method of this paper. In addition, various quantum statistical expressions for the longitudinal dielectric function convenient for practical calculations are obtained besides the Nozieres-Pines' expression. (author)

  4. Low-cost data acquisition systems for photovoltaic system monitoring and usage statistics

    Science.gov (United States)

    Fanourakis, S.; Wang, K.; McCarthy, P.; Jiao, L.

    2017-11-01

    This paper presents the design of a low-cost data acquisition system for monitoring a photovoltaic system’s electrical quantities, battery temperatures, and state of charge of the battery. The electrical quantities are the voltages and currents of the solar panels, the battery, and the system loads. The system uses an Atmega328p microcontroller to acquire data from the photovoltaic system’s charge controller. It also records individual load information using current sensing resistors along with a voltage amplification circuit and an analog to digital converter. The system is used in conjunction with a wall power data acquisition system for the recording of regional power outages. Both data acquisition systems record data in micro SD cards. The data has been successfully acquired from both systems and has been used to monitor the status of the PV system and the local power grid. As more data is gathered it can be used for the maintenance and improvement of the photovoltaic system through analysis of the photovoltaic system’s parameters and usage statistics.
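The current-sensing chain described above (shunt resistor, voltage amplifier, ADC) amounts to a simple unit conversion. A minimal sketch, where all component values (reference voltage, shunt resistance, amplifier gain) are assumptions for illustration, not values from the paper:

```python
# Sketch: converting ADC readings to load current for a current-sense
# resistor plus amplifier chain. All component values are hypothetical.

ADC_BITS = 10          # the Atmega328p has a 10-bit ADC
V_REF = 5.0            # assumed ADC reference voltage (V)
R_SENSE = 0.1          # assumed shunt resistance (ohms)
GAIN = 20.0            # assumed amplifier gain

def adc_to_current(counts):
    """Return the load current (A) implied by a raw ADC reading."""
    v_adc = counts * V_REF / (2 ** ADC_BITS - 1)  # voltage at ADC input
    v_shunt = v_adc / GAIN                         # undo amplification
    return v_shunt / R_SENSE                       # Ohm's law on the shunt

print(f"{adc_to_current(512):.3f} A")
```

The same conversion would run on the microcontroller itself; here it is shown host-side for clarity.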

  5. Quantum statistical Monte Carlo methods and applications to spin systems

    International Nuclear Information System (INIS)

    Suzuki, M.

    1986-01-01

    A short review is given concerning the quantum statistical Monte Carlo method based on the equivalence theorem that d-dimensional quantum systems are mapped onto (d+1)-dimensional classical systems. The convergence property of this approximate transformation is discussed in detail. Some applications of this general approach to quantum spin systems are reviewed. A new Monte Carlo method, ''thermo field Monte Carlo method,'' is presented, which is an extension of the projection Monte Carlo method at zero temperature to that at finite temperatures
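The equivalence theorem above is commonly realized through the Suzuki-Trotter decomposition. As a concrete sketch (not taken from the record itself), a transverse-field Ising chain maps onto a 2D classical Ising model with an extra "Trotter" dimension; `J`, `Gamma`, `beta`, and `M` below are illustrative parameters:

```python
import math

# Sketch of the Suzuki-Trotter mapping for a transverse-field Ising chain:
# the d=1 quantum model maps onto a d=2 classical Ising model. The
# parameter values are illustrative only.

def classical_couplings(J, Gamma, beta, M):
    """Return (K_space, K_trotter) of the equivalent classical model.

    M is the number of Trotter slices; dtau = beta / M.
    K_trotter = 0.5 * ln(coth(dtau * Gamma)) is the standard result.
    """
    dtau = beta / M
    K_space = dtau * J                                 # coupling along the chain
    K_trotter = 0.5 * math.log(1.0 / math.tanh(dtau * Gamma))
    return K_space, K_trotter

Ks, Kt = classical_couplings(J=1.0, Gamma=1.0, beta=2.0, M=16)
print(Ks, Kt)
```

As the number of Trotter slices M grows, the approximate transformation converges and the coupling along the Trotter direction stiffens.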

  6. High order statistical signatures from source-driven measurements of subcritical fissile systems

    International Nuclear Information System (INIS)

    Mattingly, J.K.

    1998-01-01

    This research focuses on the development and application of high order statistical analyses applied to measurements performed with subcritical fissile systems driven by an introduced neutron source. The signatures presented are derived from counting statistics of the introduced source and radiation detectors that observe the response of the fissile system. It is demonstrated that successively higher order counting statistics possess progressively higher sensitivity to reactivity. Consequently, these signatures are more sensitive to changes in the composition, fissile mass, and configuration of the fissile assembly. Furthermore, it is shown that these techniques are capable of distinguishing the response of the fissile system to the introduced source from its response to any internal or inherent sources. This ability combined with the enhanced sensitivity of higher order signatures indicates that these techniques will be of significant utility in a variety of applications. Potential applications include enhanced radiation signature identification of weapons components for nuclear disarmament and safeguards applications and augmented nondestructive analysis of spent nuclear fuel. In general, these techniques expand present capabilities in the analysis of subcritical measurements
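The successively higher-order counting statistics referred to above can be sketched with reduced factorial moments of the count distribution; the excess of the second factorial moment over the Poisson expectation (a Feynman-Y-type quantity) is one such signature. The count data below are hypothetical:

```python
# Sketch: higher-order counting statistics from a list of detector counts
# per counting gate. The count data are hypothetical, for illustration.

def factorial_moment(counts, r):
    """r-th factorial moment <n(n-1)...(n-r+1)> of the count distribution."""
    total = 0
    for n in counts:
        term = 1
        for k in range(r):
            term *= (n - k)
        total += term
    return total / len(counts)

counts = [3, 5, 2, 4, 6, 3, 4, 5, 2, 6]   # hypothetical counts per gate

m1 = factorial_moment(counts, 1)           # mean count
m2 = factorial_moment(counts, 2)
feynman_y = m2 / m1 - m1                   # excess over Poisson (Y = 0 for Poisson)
print(m1, feynman_y)
```

Higher r probes progressively higher-order correlations, which is why such signatures gain sensitivity to reactivity.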

  7. Advances in Statistical Control, Algebraic Systems Theory, and Dynamic Systems Characteristics A Tribute to Michael K Sain

    CERN Document Server

    Won, Chang-Hee; Michel, Anthony N

    2008-01-01

    This volume - dedicated to Michael K. Sain on the occasion of his seventieth birthday - is a collection of chapters covering recent advances in stochastic optimal control theory and algebraic systems theory. Written by experts in their respective fields, the chapters are thematically organized into four parts: Part I focuses on statistical control theory, where the cost function is viewed as a random variable and performance is shaped through cost cumulants. In this respect, statistical control generalizes linear-quadratic-Gaussian and H-infinity control. Part II addresses algebraic systems th

  8. 7 CFR 2.68 - Administrator, National Agricultural Statistics Service.

    Science.gov (United States)

    2010-01-01

    ....S.C. 3318). (6) Enter cost-reimbursable agreements relating to agricultural research and statistical... promote and support the development of a viable and sustainable global agricultural system. Such work may... 7 Agriculture 1 2010-01-01 2010-01-01 false Administrator, National Agricultural Statistics...

  9. Sensitivity analysis and optimization of system dynamics models : Regression analysis and statistical design of experiments

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1995-01-01

    This tutorial discusses what-if analysis and optimization of System Dynamics models. These problems are solved, using the statistical techniques of regression analysis and design of experiments (DOE). These issues are illustrated by applying the statistical techniques to a System Dynamics model for

  10. FRAMES Software System: Linking to the Statistical Package R

    Energy Technology Data Exchange (ETDEWEB)

    Castleton, Karl J.; Whelan, Gene; Hoopes, Bonnie L.

    2006-12-11

    This document provides requirements, design, data-file specifications, test plan, and Quality Assurance/Quality Control protocol for the linkage between the statistical package R and the Framework for Risk Analysis in Multimedia Environmental Systems (FRAMES) Versions 1.x and 2.0. The requirements identify the attributes of the system. The design describes how the system will be structured to meet those requirements. The specification presents the specific modifications to FRAMES to meet the requirements and design. The test plan confirms that the basic functionality listed in the requirements (black box testing) actually functions as designed, and QA/QC confirms that the software meets the client’s needs.

  11. Game-Related Statistics Discriminating Between Starters and Nonstarter Players in Women’s National Basketball Association League (WNBA)

    Science.gov (United States)

    Gómez, Miguel-Ángel; Lorenzo, Alberto; Ortega, Enrique; Sampaio, Jaime; Ibáñez, Sergio-José

    2009-01-01

    The aim of the present study was to identify the game-related statistics that allow discriminating between starters and nonstarter players in women’s basketball when related to winning or losing games and best or worst teams. The sample comprised all 216 regular season games from the 2005 Women’s National Basketball Association League (WNBA). The game-related statistics included were 2- and 3-point field-goals (both successful and unsuccessful), free-throws (both successful and unsuccessful), defensive and offensive rebounds, assists, blocks, fouls, steals, turnovers and minutes played. Results from multivariate analysis showed that when best teams won, the discriminant game-related statistics were successful 2-point field-goals (SC = 0.47), successful free-throws (SC = 0.44), fouls (SC = -0.41), assists (SC = 0.37), and defensive rebounds (SC = 0.37). When the worst teams won, the discriminant game-related statistics were successful 2-point field-goals (SC = 0.37), successful free-throws (SC = 0.45), assists (SC = 0.58), and steals (SC = 0.35). The results showed that the successful 2-point field-goals, successful free-throws and the assists were the most powerful variables discriminating between starters and nonstarters. These specific characteristics helped to point out the importance of starters’ shooting and passing ability during competitions. Key points The players’ game-related statistical profile varied according to team status, game outcome and team quality in women’s basketball. The results of this work help to point out differences in players’ performance in women’s basketball compared with men’s basketball. The results obtained underline the importance of starters’ and nonstarters’ contributions to team performance in different game contexts. Results showed the power of successful 2-point field-goals, successful free-throws and assists discriminating between starters and nonstarters in all the analyses. PMID:24149538
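The study above reports structure coefficients from a multivariate discriminant analysis. As a simplified stand-in (not the study's actual method), a single game-related statistic can be ranked by its standardized mean difference between the two groups. All numbers below are hypothetical, not WNBA data:

```python
import math

# Sketch: ranking a game-related statistic by how well it separates two
# groups (e.g. starters vs. nonstarters), using a standardized mean
# difference (Cohen's d). The data are hypothetical, not WNBA data.

def cohens_d(a, b):
    """Standardized mean difference between samples a and b."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    pooled = math.sqrt(((len(a) - 1) * va + (len(b) - 1) * vb)
                       / (len(a) + len(b) - 2))
    return (ma - mb) / pooled

starters_fg2 = [6, 7, 5, 8, 6]   # hypothetical made 2-pt field goals per game
bench_fg2 = [3, 4, 2, 3, 4]

print(f"2-pt FG separation: d = {cohens_d(starters_fg2, bench_fg2):.2f}")
```

Computing d for each statistic and sorting gives a rough analogue of the variable ranking the discriminant analysis provides.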

  12. Statistical Decision Support Tools for System-Oriented Runway Management, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The feasibility of developing a statistical decision support system for traffic flow management in the terminal area and runway load balancing was demonstrated in...

  13. 2015 QuickCompass of Sexual Assault-Related Responders: Statistical Methodology Report

    Science.gov (United States)

    2016-02-01

    2015 QuickCompass of Sexual Assault Prevention and Response-Related Responders: Statistical Methodology Report. DMDC Report No. 2015-039, February 2016. Additional copies of this report may be obtained from http://www.dtic.mil/ (ask for report by ADA630235).

  14. 77 FR 66662 - Generalized System of Preferences (GSP): Import Statistics Relating to Competitive Need Limitations

    Science.gov (United States)

    2012-11-06

    ... into effect. Exclusions for exceeding a CNL will be based on full 2012 calendar-year import statistics...--Ferrosilicon containing between 55% and 80% of silicon (Russia) 2106.90.99--Miscellaneous food preparations not canned or frozen (Thailand) 9506.70.40--Ice skates w/footwear permanently attached (Thailand) The list...

  15. Development of modelling algorithm of technological systems by statistical tests

    Science.gov (United States)

    Shemshura, E. A.; Otrokov, A. V.; Chernyh, V. G.

    2018-03-01

    The paper tackles the problem of economic assessment of design efficiency regarding various technological systems at the stage of their operation. The modelling algorithm of a technological system, performed using statistical tests and taking account of the reliability index, allows estimating the level of machinery technical excellence and defining the efficiency of design reliability against its performance. Economic feasibility of its application shall be determined on the basis of service quality of a technological system with further forecasting of volumes and the range of spare parts supply.
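One common form of modelling by statistical tests is Monte Carlo estimation of system reliability from component failure probabilities. A minimal sketch, assuming a hypothetical series-structured system (the paper's actual model and reliability index are not specified in the record):

```python
import random

# Sketch: Monte Carlo estimate of system reliability from component
# failure probabilities. The series topology and the probabilities
# below are hypothetical, for illustration only.

def system_up(p_fail, rng):
    """Series chain of components: the system works iff every component works."""
    return all(rng.random() > p for p in p_fail)

def reliability(p_fail, trials=100_000, seed=1):
    rng = random.Random(seed)                     # fixed seed for repeatability
    ok = sum(system_up(p_fail, rng) for _ in range(trials))
    return ok / trials

est = reliability([0.01, 0.02, 0.05])
exact = 0.99 * 0.98 * 0.95                        # analytic value for comparison
print(est, exact)
```

For a series system the analytic answer is the product of component reliabilities; the Monte Carlo estimate should land close to it.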

  16. 30 CFR 250.192 - What reports and statistics must I submit relating to a hurricane, earthquake, or other natural...

    Science.gov (United States)

    2010-07-01

    ... relating to a hurricane, earthquake, or other natural occurrence? 250.192 Section 250.192 Mineral Resources... statistics must I submit relating to a hurricane, earthquake, or other natural occurrence? (a) You must... tropical storm, or an earthquake. Statistics include facilities and rigs evacuated and the amount of...

  17. Fundamental concepts and relations for reliability analysis of multi-state systems

    International Nuclear Information System (INIS)

    Murchland, J.D.

    1975-01-01

    The fundamental concepts and relations that should be used in the reliability analysis of systems with numerous components are discussed, with an emphasis on calculable quantities. These are: (1) the average probability of being in a state, (2) the average transition rates between states, in the long run or as time functions, and (3) the integrals of the transition rates, which are the expected numbers of transitions. These quantities are related by the net transition relations, and the calculationally vital transition rate relation when the inputs of an item are statistically independent. Assumptions necessary for the existence of these quantities and for the relations are listed, and proofs given. The importance of exploiting the closeness to ''simple'' structure which systems may possess, and the versatility for different problems of a computational technique of ''reduction'' and ''expansion'' are discussed. The key relations for the latter are formally derived. Applications are made to fault trees, structure networks, undirected and directed communication networks

  18. Prototyping a Distributed Information Retrieval System That Uses Statistical Ranking.

    Science.gov (United States)

    Harman, Donna; And Others

    1991-01-01

    Built using a distributed architecture, this prototype distributed information retrieval system uses statistical ranking techniques to provide better service to the end user. Distributed architecture was shown to be a feasible alternative to centralized or CD-ROM information retrieval, and user testing of the ranking methodology showed both…

  19. Fractional exclusion and braid statistics in one dimension: a study via dimensional reduction of Chern–Simons theory

    International Nuclear Information System (INIS)

    Ye, Fei; Marchetti, P A; Su, Z B; Yu, L

    2017-01-01

    The relation between braid and exclusion statistics is examined in one-dimensional systems, within the framework of Chern–Simons statistical transmutation in gauge invariant form with an appropriate dimensional reduction. If the matter action is anomalous, as for chiral fermions, a relation between braid and exclusion statistics can be established explicitly for both mutual and nonmutual cases. However, if it is not anomalous, the exclusion statistics of emergent low energy excitations is not necessarily connected to the braid statistics of the physical charged fields of the system. Finally, we also discuss the bosonization of one-dimensional anyonic systems through T-duality. (paper)

  20. Identification of Crew-Systems Interactions and Decision Related Trends

    Science.gov (United States)

    Jones, Sharon Monica; Evans, Joni K.; Reveley, Mary S.; Withrow, Colleen A.; Ancel, Ersin; Barr, Lawrence

    2013-01-01

    NASA Vehicle System Safety Technology (VSST) project management uses systems analysis to identify key issues and maintain a portfolio of research leading to potential solutions to its three identified technical challenges. Statistical data and published safety priority lists from academic, industry and other government agencies were reviewed and analyzed by NASA Aviation Safety Program (AvSP) systems analysis personnel to identify issues and future research needs related to one of VSST's technical challenges, Crew Decision Making (CDM). The data examined in the study were obtained from the National Transportation Safety Board (NTSB) Aviation Accident and Incident Data System, Federal Aviation Administration (FAA) Accident/Incident Data System and the NASA Aviation Safety Reporting System (ASRS). In addition, this report contains the results of a review of safety priority lists, information databases and other documented references pertaining to aviation crew systems issues and future research needs. The specific sources examined were: Commercial Aviation Safety Team (CAST) Safety Enhancements Reserved for Future Implementation (SERFIs), Flight Deck Automation Issues (FDAI) and NTSB Most Wanted List and Open Recommendations. Various automation issues taxonomies and priority lists pertaining to human factors, automation and flight design were combined to create a list of automation issues related to CDM.

  1. Statistical properties of earthquakes clustering

    Directory of Open Access Journals (Sweden)

    A. Vecchio

    2008-04-01

    Full Text Available Often in nature the temporal distribution of inhomogeneous stochastic point processes can be modeled as a realization of renewal Poisson processes with a variable rate. Here we investigate one of the classical examples, namely, the temporal distribution of earthquakes. We show that this process strongly departs from Poisson statistics for both catalogue and sequence data sets. This indicates the presence of correlations in the system, probably related to the stressing perturbation characterizing the seismicity in the area under analysis. As shown by this analysis, the catalogues, at variance with sequences, show common statistical properties.
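A simple way to quantify departure from Poisson statistics, as discussed above, is the coefficient of variation (CV) of inter-event times: CV = 1 for a Poisson process, CV > 1 indicates temporal clustering. A sketch with hypothetical event times:

```python
import math

# Sketch: testing departure from Poisson statistics via the coefficient
# of variation (CV) of inter-event times. Event times are hypothetical.

def inter_event_cv(times):
    """CV of waiting times; 1 for Poisson, > 1 for clustered events."""
    waits = [b - a for a, b in zip(times, times[1:])]
    mean = sum(waits) / len(waits)
    var = sum((w - mean) ** 2 for w in waits) / len(waits)
    return math.sqrt(var) / mean

# Hypothetical clustered catalog: bursts of events separated by quiet spells.
clustered = [0, 1, 2, 3, 50, 51, 52, 100, 101, 102]
print(f"CV = {inter_event_cv(clustered):.2f}")
```

More elaborate tests (e.g. on the full waiting-time distribution) follow the same idea: compare the observed statistic against the Poisson benchmark.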

  2. System of National Accounts as an Information Base for Tax Statistics

    Directory of Open Access Journals (Sweden)

    A. E. Lyapin

    2017-01-01

    Full Text Available The article is devoted to those aspects of the system of national accounts which together perform the role of an information base for tax statistics. In our time, the tax system is one of the main subjects of discussion about the methods and directions of its reform. Taxes are one of the main factors of regulation of the economy and act as an incentive for its development. Analysis of tax revenues to the budgets of different levels makes it possible to assess tax collection and the tax burden for various industries. From the amount of tax revenue it is possible to judge the scale of reproductive processes in the country. It should be noted that taxes in the SNA are special. As mentioned earlier, in the SNA, taxes on products are treated in the form of income. At the same time, most economists prefer their consideration in the form of consumption taxes, and taxes on various financial transactions (for example, taxes on the purchase/sale of securities) are treated as taxes on production, including in cases when there are no services. It would be rational to revise and amend the SNA provisions associated with the interpretation of all taxes and subsidies, to ensure better understanding and compliance with user needs. Taxes are an integral part of any state and an indispensable element of economic relations of any society. In turn, taxes and the budget are inextricably linked, as these relations have a clearly expressed, objective bilateral character. Taxes are the main group of budget revenues, which makes it possible to finance all the government agencies and expenditure items, as well as the implementation of subsidies to institutional units that make up the SNA sector “non-financial corporations”. The other side is that taxes are a part of the money that is taken from producers and households. The total mass of taxes depends on the composition of taxes, tax rates, the tax base and the scope of benefits. The bulk of tax revenues also depends on possible changes in

  3. Statistical dynamics of ultradiffusion in hierarchical systems

    International Nuclear Information System (INIS)

    Gardner, S.

    1987-01-01

    In many types of disordered systems which exhibit frustration and competition, an ultrametric topology is found to exist in the space of allowable states. This ultrametric topology of states is associated with a hierarchical relaxation process called ultradiffusion. Ultradiffusion occurs in hierarchical non-linear (HNL) dynamical systems when constraints cause large scale, slow modes of motion to be subordinated to small scale, fast modes. Examples of ultradiffusion are found throughout condensed matter physics and critical phenomena (e.g. the states of spin glasses), in biophysics (e.g. the states of Hopfield networks) and in many other fields including layered computing based upon nonlinear dynamics. The statistical dynamics of ultradiffusion can be treated as a random walk on an ultrametric space. For reversible bifurcating ultrametric spaces the evolution equation governing the probability of a particle being found at site i at time t has a highly degenerate transition matrix. This transition matrix has a fractal geometry similar to the replica form proposed for spin glasses. The authors invert this fractal matrix using a recursive quad-tree (QT) method. Possible applications of hierarchical systems to communications and symbolic computing are discussed briefly

  4. Study of film data processing systems by means of a statistical simulation

    International Nuclear Information System (INIS)

    Deart, A.F.; Gromov, A.I.; Kapustinskaya, V.I.; Okorochenko, G.E.; Sychev, A.Yu.; Tatsij, L.I.

    1974-01-01

    A statistical model of the film information processing system is considered. The given time diagrams illustrate the model operation algorithm. The program realizing this model of the system is described in detail. The elaborated program model has been tested at the film information processing system, which consists of a group of measuring devices operating on-line with a BESM computer. The obtained quantitative characteristics of the functioning of the system under test permit estimation of the system operation efficiency

  5. Statistical analysis of complex systems with nonclassical invariant measures

    KAUST Repository

    Fratalocchi, Andrea

    2011-02-28

    I investigate the problem of finding a statistical description of a complex many-body system whose invariant measure cannot be constructed stemming from classical thermodynamics ensembles. By taking solitons as a reference system and by employing a general formalism based on the Ablowitz-Kaup-Newell-Segur scheme, I demonstrate how to build an invariant measure and, within a one-dimensional phase space, how to develop a suitable thermodynamics. A detailed example is provided with a universal model of wave propagation, with reference to a transparent potential sustaining gray solitons. The system shows a rich thermodynamic scenario, with a free-energy landscape supporting phase transitions and controllable emergent properties. I finally discuss the origin of such behavior, trying to identify common denominators in the area of complex dynamics.

  6. Reducing statistics anxiety and enhancing statistics learning achievement: effectiveness of a one-minute strategy.

    Science.gov (United States)

    Chiou, Chei-Chang; Wang, Yu-Min; Lee, Li-Tze

    2014-08-01

    Statistical knowledge is widely used in academia; however, statistics teachers struggle with the issue of how to reduce students' statistics anxiety and enhance students' statistics learning. This study assesses the effectiveness of a "one-minute paper strategy" in reducing students' statistics-related anxiety and in improving students' statistics-related achievement. Participants were 77 undergraduates from two classes enrolled in applied statistics courses. An experiment was implemented according to a pretest/posttest comparison group design. The quasi-experimental design showed that the one-minute paper strategy significantly reduced students' statistics anxiety and improved students' statistics learning achievement. The strategy was a better instructional tool than the textbook exercise for reducing students' statistics anxiety and improving students' statistics achievement.
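The pretest/posttest comparison described above is typically evaluated with a paired t statistic. A minimal sketch on hypothetical anxiety scores (not the study's data):

```python
import math

# Sketch of a pretest/posttest comparison: a paired t statistic on
# hypothetical anxiety scores before and after an intervention.

def paired_t(pre, post):
    """Paired-samples t statistic for post - pre differences."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)

pre = [30, 28, 35, 32, 31, 29, 33, 34]    # hypothetical anxiety scores
post = [25, 24, 30, 27, 28, 25, 29, 30]   # lower = less anxiety

print(f"t = {paired_t(pre, post):.2f}")
```

A large negative t here would correspond to a significant drop in anxiety; the study's quasi-experimental design additionally uses a comparison group, which this sketch omits.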

  7. Second-Order Statistics for Wave Propagation through Complex Optical Systems

    DEFF Research Database (Denmark)

    Yura, H.T.; Hanson, Steen Grüner

    1989-01-01

    Closed-form expressions are derived for various statistical functions that arise in optical propagation through arbitrary optical systems that can be characterized by a complex ABCD matrix in the presence of distributed random inhomogeneities along the optical path. Specifically, within the second-order Rytov approximation, explicit general expressions are presented for the mutual coherence function, the log-amplitude and phase correlation functions, and the mean-square irradiance that are obtained in propagation through an arbitrary paraxial ABCD optical system containing Gaussian-shaped limiting

  8. Statistical analysis of natural disasters and related losses

    CERN Document Server

    Pisarenko, VF

    2014-01-01

    The study of disaster statistics and disaster occurrence is a complicated interdisciplinary field involving the interplay of new theoretical findings from several scientific fields like mathematics, physics, and computer science. Statistical studies on the mode of occurrence of natural disasters largely rely on fundamental findings in the statistics of rare events, which were derived in the 20th century. With regard to natural disasters, it is not so much the fact that the importance of this problem for mankind was recognized during the last third of the 20th century - the myths one encounters in ancient civilizations show that the problem of disasters has always been recognized - rather, it is the fact that mankind now possesses the necessary theoretical and practical tools to effectively study natural disasters, which in turn supports effective, major practical measures to minimize their impact. All the above factors have resulted in considerable progress in natural disaster research. Substantial accrued ma...

  9. Statistical physics of human beings in games: Controlled experiments

    International Nuclear Information System (INIS)

    Liang Yuan; Huang Ji-Ping

    2014-01-01

    It is important to know whether the laws or phenomena in statistical physics for natural systems with non-adaptive agents still hold for social human systems with adaptive agents, because this implies whether it is possible to study or understand social human systems by using statistical physics originating from natural systems. For this purpose, we review the role of human adaptability in four kinds of specific human behaviors, namely, normal behavior, herd behavior, contrarian behavior, and hedge behavior. The approach is based on controlled experiments in the framework of market-directed resource-allocation games. The role of the controlled experiments could be at least two-fold: adopting the real human decision-making process so that the system under consideration could reflect the performance of genuine human beings; making it possible to obtain macroscopic physical properties of a human system by tuning a particular factor of the system, thus directly revealing cause and effect. As a result, both computer simulations and theoretical analyses help to show a few counterparts of some laws or phenomena in statistical physics for social human systems: two-phase phenomena or phase transitions, entropy-related phenomena, and a non-equilibrium steady state. This review highlights the role of human adaptability in these counterparts, and makes it possible to study or understand some particular social human systems by means of statistical physics coming from natural systems. (topical review - statistical physics and complex systems)

  10. Fractional exclusion and braid statistics in one dimension: a study via dimensional reduction of Chern-Simons theory

    Science.gov (United States)

    Ye, Fei; Marchetti, P. A.; Su, Z. B.; Yu, L.

    2017-09-01

    The relation between braid and exclusion statistics is examined in one-dimensional systems, within the framework of Chern-Simons statistical transmutation in gauge invariant form with an appropriate dimensional reduction. If the matter action is anomalous, as for chiral fermions, a relation between braid and exclusion statistics can be established explicitly for both mutual and nonmutual cases. However, if it is not anomalous, the exclusion statistics of emergent low energy excitations is not necessarily connected to the braid statistics of the physical charged fields of the system. Finally, we also discuss the bosonization of one-dimensional anyonic systems through T-duality. Dedicated to the memory of Mario Tonin.

  11. Reversible Statistics

    DEFF Research Database (Denmark)

    Tryggestad, Kjell

    2004-01-01

    The study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political arithmetic and more recent work within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit in different ways. The presence and absence of diverse materials, both natural and political, is what distinguishes them from each other. Arguments are presented for a more symmetric relation between the scientific statistical text and the reader. I will argue that a more symmetric relation can be achieved

  12. Statistical similarities of pre-earthquake electromagnetic emissions to biological and economic extreme events

    Science.gov (United States)

    Potirakis, Stelios M.; Contoyiannis, Yiannis; Kopanas, John; Kalimeris, Anastasios; Antonopoulos, George; Peratzakis, Athanasios; Eftaxias, Konstantinos; Nomicos, Costantinos

    2014-05-01

    When one considers a phenomenon to be "complex", this means that the phenomenological laws describing the global behavior of the system are not necessarily directly related to the "microscopic" laws that regulate the evolution of its elementary parts. The field of study of complex systems considers that the dynamics of complex systems are founded on universal principles that may be used to describe disparate problems ranging from particle physics to the economies of societies. Several authors have suggested that earthquake (EQ) dynamics can be analyzed within mathematical frameworks similar to those of economic dynamics and neurodynamics. A central property of the EQ preparation process is the occurrence of coherent large-scale collective behavior with a very rich structure, resulting from repeated nonlinear interactions among the constituents of the system. As a result, nonextensive statistics is an appropriate, physically meaningful tool for the study of EQ dynamics. Since fracture-induced electromagnetic (EM) precursors are observable manifestations of the underlying EQ preparation process, the analysis of a fracture-induced EM precursor observed prior to the occurrence of a large EQ can also be conducted within the nonextensive statistics framework. Within the frame of the investigation for universal principles that may hold for different dynamical systems related to the genesis of extreme events, we present here statistical similarities of the pre-earthquake EM emissions related to an EQ with the pre-ictal electrical brain activity related to an epileptic seizure, and with the pre-crisis economic observables related to the collapse of a share. It is demonstrated that the observables of all three dynamical systems can be analyzed in the frame of nonextensive statistical mechanics, while the frequency-size relations of appropriately defined "events" that precede the extreme event related to each of these different systems present striking quantitative

  13. Statistical characterization of discrete conservative systems: The web map

    Science.gov (United States)

    Ruiz, Guiomar; Tirnakli, Ugur; Borges, Ernesto P.; Tsallis, Constantino

    2017-10-01

    We numerically study the two-dimensional, area-preserving web map. When the map is governed by ergodic behavior, it is, as expected, correctly described by Boltzmann-Gibbs statistics, based on the additive entropic functional S_BG[p(x)] = -k ∫ dx p(x) ln p(x). In contrast, possible ergodicity breakdown and transitory sticky dynamical behavior drag the map into the realm of generalized q-statistics, based on the nonadditive entropic functional S_q[p(x)] = k (1 - ∫ dx [p(x)]^q)/(q - 1) (q ∈ R; S_1 = S_BG). We statistically describe the system (probability distribution of the sum of successive iterates, sensitivity to the initial condition, and entropy production per unit time) for typical values of the parameter that controls the ergodicity of the map. For small (large) values of the external parameter K, we observe q-Gaussian distributions with q = 1.935... (Gaussian distributions), like for the standard map. In contrast, for intermediate values of K, we observe a different scenario, due to the fractal structure of the trajectories embedded in the chaotic sea. Long-standing non-Gaussian distributions are characterized in terms of the kurtosis and the box-counting dimension of the chaotic sea.
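The web map above can be iterated directly, and a crude diagnostic for Gaussian versus non-Gaussian regimes is the kurtosis of the iterates (3 for a Gaussian). The map form used below is the standard area-preserving one, but the parameter value and settings are illustrative assumptions, not the paper's:

```python
import math

# Sketch: iterating the area-preserving web map
#   u_{n+1} = v_n,  v_{n+1} = -u_n - K*sin(v_n)
# and computing the kurtosis of the iterates as a rough Gaussianity
# diagnostic. K, the initial condition, and n are illustrative.

def web_map_orbit(u0, v0, K, n):
    """Return the first n values of u along the orbit from (u0, v0)."""
    u, v = u0, v0
    orbit = []
    for _ in range(n):
        u, v = v, -u - K * math.sin(v)
        orbit.append(u)
    return orbit

def kurtosis(xs):
    """Fourth standardized moment; equals 3 for a Gaussian sample."""
    m = sum(xs) / len(xs)
    m2 = sum((x - m) ** 2 for x in xs) / len(xs)
    m4 = sum((x - m) ** 4 for x in xs) / len(xs)
    return m4 / m2 ** 2

orbit = web_map_orbit(0.1, 0.2, K=5.0, n=10_000)
print(f"kurtosis of iterates: {kurtosis(orbit):.2f}")
```

The paper's actual analysis uses distributions of sums of many iterates over many initial conditions; this sketch only shows the mechanics of the map and the moment computation.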

  14. Higher order capacity statistics of multi-hop transmission systems over Rayleigh fading channels

    KAUST Repository

    Yilmaz, Ferkan

    2012-03-01

    In this paper, we present an exact analytical expression to evaluate the higher order statistics of the channel capacity for amplify-and-forward (AF) multihop transmission systems operating over Rayleigh fading channels. Furthermore, we present a simple and efficient closed-form expression for the higher order moments of the channel capacity of dual-hop transmission systems with Rayleigh fading channels. In order to analyze the behavior of the higher order capacity statistics and investigate the usefulness of the mathematical analysis, some selected numerical and simulation results are presented. Our results are found to be in perfect agreement. © 2012 IEEE.
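
The simulation side of such a study can be sketched by Monte Carlo. The end-to-end SNR model gamma_eq = g1*g2/(g1 + g2 + 1) is a standard dual-hop AF relaying expression; the average SNRs and sample size below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def capacity_moments(avg_snr1, avg_snr2, max_order=4, n_samples=200_000, seed=1):
    """Monte Carlo estimates of E[C^n], n = 1..max_order, for a dual-hop
    amplify-and-forward link over Rayleigh fading."""
    rng = np.random.default_rng(seed)
    # Rayleigh fading -> exponentially distributed instantaneous SNR per hop
    g1 = rng.exponential(avg_snr1, n_samples)
    g2 = rng.exponential(avg_snr2, n_samples)
    gamma_eq = g1 * g2 / (g1 + g2 + 1.0)       # end-to-end SNR (AF relaying)
    capacity = np.log2(1.0 + gamma_eq)          # bits/s/Hz
    return [float(np.mean(capacity**n)) for n in range(1, max_order + 1)]

m1, m2, m3, m4 = capacity_moments(avg_snr1=10.0, avg_snr2=10.0)
print(m1, m2 - m1**2)  # mean capacity and its variance
```

Closed-form moment expressions, when available, would be validated by checking that such simulated moments converge to them as the sample size grows.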

  15. The κ parameter and κ-distribution in κ-deformed statistics for the systems in an external field

    International Nuclear Information System (INIS)

    Guo, Lina; Du, Jiulin

    2007-01-01

    It is a naturally important question to ask under what physical situations the κ-deformed statistics should be suitable for the statistical description of a system, and what the κ parameter stands for. In this Letter, a formula for the κ parameter is derived on the basis of the κ-H theorem, the κ-velocity distribution and the generalized Boltzmann equation in the framework of κ-deformed statistics. We thus obtain a physical interpretation for the parameter κ ≠ 0 with regard to the temperature gradient and the external force field. We show that, like the q-statistics based on Tsallis entropy, the κ-deformed statistics may also be a candidate suitable for the statistical description of systems in external fields in a nonequilibrium stationary state, but with different physical characteristics. Namely, the κ-distribution is found to describe the nonequilibrium stationary state of a system where the external force is perpendicular to the temperature gradient

  16. Equilibrium statistical mechanics

    CERN Document Server

    Jackson, E Atlee

    2000-01-01

    Ideal as an elementary introduction to equilibrium statistical mechanics, this volume covers both classical and quantum methodology for open and closed systems. Introductory chapters familiarize readers with probability and microscopic models of systems, while additional chapters describe the general derivation of the fundamental statistical mechanics relationships. The final chapter contains 16 sections, each dealing with a different application, ordered according to complexity, from classical through degenerate quantum statistical mechanics. Key features include an elementary introduction t

  17. A DoS/DDoS Attack Detection System Using Chi-Square Statistic Approach

    Directory of Open Access Journals (Sweden)

    Fang-Yie Leu

    2010-04-01

    Full Text Available Nowadays, users can easily access and download network attack tools, which often provide friendly interfaces and easily operated features, from the Internet. Therefore, even a naive hacker can launch a large scale DoS or DDoS attack to prevent a system, i.e., the victim, from providing Internet services. In this paper, we propose an agent-based intrusion detection architecture, which is a distributed detection system, to detect DoS/DDoS attacks by invoking a statistical approach that compares source IP addresses' normal and current packet statistics to discriminate whether there is a DoS/DDoS attack. It first collects all source IPs' packet statistics so as to create their normal packet distribution. Once some IPs' current packet distribution suddenly changes, very often it is an attack. Experimental results show that this approach can effectively detect DoS/DDoS attacks.
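
The core comparison of a baseline against current per-source packet counts can be sketched with a Pearson chi-square statistic. The traffic samples and the detection threshold below are illustrative assumptions, not the paper's chi-square variant or tuned values:

```python
from collections import Counter

def chi_square(baseline, current):
    """Pearson chi-square statistic between two packet-count profiles.

    `baseline` holds expected counts per source IP from normal traffic;
    `current` holds the counts observed in the latest monitoring window.
    """
    total_base = sum(baseline.values())
    total_cur = sum(current.values())
    stat = 0.0
    for ip, base_count in baseline.items():
        expected = base_count / total_base * total_cur  # scale to current volume
        observed = current.get(ip, 0)
        if expected > 0:
            stat += (observed - expected) ** 2 / expected
    return stat

baseline = Counter({"10.0.0.1": 500, "10.0.0.2": 480, "10.0.0.3": 520})
normal   = Counter({"10.0.0.1": 510, "10.0.0.2": 470, "10.0.0.3": 520})
attack   = Counter({"10.0.0.1": 5000, "10.0.0.2": 470, "10.0.0.3": 520})

THRESHOLD = 50.0  # would be tuned from training traffic in practice
print(chi_square(baseline, normal) > THRESHOLD)   # False: normal traffic
print(chi_square(baseline, attack) > THRESHOLD)   # True: flagged as attack
```

A sudden concentration of packets on one source IP blows up the statistic, which is exactly the "distribution suddenly changes" signal the abstract describes.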

  18. A concept of customer–provider relation monitoring system solution

    Directory of Open Access Journals (Sweden)

    Naděžda Chalupová

    2008-01-01

    Full Text Available The contribution deals with the design of a customer–provider relationship monitoring system with regard to the needs of business managers and analysts, and to the possibilities of contemporary information and communication technologies. Attention is given to targeted modelling, which offers a better overview of what is taking place in the relationship. It then describes the functionality of the analytical systems producing these strategically valuable models, the so-called business intelligence tools. It further deals with modern technologies conducive to the implementation of such a system: the Ajax concept and several XML applications, namely PMML for the manipulation of analytical models, XSLT for transforming XML data to various formats, SVG for representing statistical graphs, and MathML for describing the mathematical formulas created in analytical systems. On this basis it suggests a technological solution for some parts of a client–provider relationship monitoring and evaluation system, and discusses its potential advantages and the problems that can occur.

  19. A Statistical Graphical Model of the California Reservoir System

    Science.gov (United States)

    Taeb, A.; Reager, J. T.; Turmon, M.; Chandrasekaran, V.

    2017-11-01

    The recent California drought has highlighted the potential vulnerability of the state's water management infrastructure to multiyear dry intervals. Due to the high complexity of the network, dynamic storage changes in California reservoirs on a state-wide scale have previously been difficult to model using either traditional statistical or physical approaches. Indeed, although there is a significant line of research on exploring models for single (or a small number of) reservoirs, these approaches are not amenable to a system-wide modeling of the California reservoir network due to the spatial and hydrological heterogeneities of the system. In this work, we develop a state-wide statistical graphical model to characterize the dependencies among a collection of 55 major California reservoirs across the state; this model is defined with respect to a graph in which the nodes index reservoirs and the edges specify the relationships or dependencies between reservoirs. We obtain and validate this model in a data-driven manner based on reservoir volumes over the period 2003-2016. A key feature of our framework is a quantification of the effects of external phenomena that influence the entire reservoir network. We further characterize the degree to which physical factors (e.g., state-wide Palmer Drought Severity Index (PDSI), average temperature, snow pack) and economic factors (e.g., consumer price index, number of agricultural workers) explain these external influences. As a consequence of this analysis, we obtain a system-wide health diagnosis of the reservoir network as a function of PDSI.
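
The graph-building step described above can be sketched with partial correlations: estimate the precision (inverse covariance) matrix of reservoir volumes and place an edge where the partial correlation is large. The synthetic data and the 0.2 threshold are illustrative assumptions; the paper's actual model additionally accounts for latent state-wide influences such as drought indices:

```python
import numpy as np

def partial_correlation_graph(volumes, threshold=0.2):
    """volumes: (n_months, n_reservoirs) array -> boolean adjacency matrix."""
    cov = np.cov(volumes, rowvar=False)
    precision = np.linalg.inv(cov)
    d = np.sqrt(np.diag(precision))
    # standard identity: partial corr = -p_ij / sqrt(p_ii * p_jj)
    partial_corr = -precision / np.outer(d, d)
    np.fill_diagonal(partial_corr, 1.0)
    adjacency = np.abs(partial_corr) > threshold
    np.fill_diagonal(adjacency, False)
    return adjacency

rng = np.random.default_rng(0)
common = rng.normal(size=(160, 1))               # shared "wet/dry year" driver
volumes = common + 0.5 * rng.normal(size=(160, 3))
print(partial_correlation_graph(volumes).astype(int))
```

Because the precision matrix conditions each pair on all other reservoirs, edges capture direct dependencies rather than correlations induced solely through third reservoirs.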

  20. Accuracy and reliability of China's energy statistics

    Energy Technology Data Exchange (ETDEWEB)

    Sinton, Jonathan E.

    2001-09-14

    Many observers have raised doubts about the accuracy and reliability of China's energy statistics, which show an unprecedented decline in recent years, while reported economic growth has remained strong. This paper explores the internal consistency of China's energy statistics from 1990 to 2000, coverage and reporting issues, and the state of the statistical reporting system. Available information suggests that, while energy statistics were probably relatively good in the early 1990s, their quality has declined since the mid-1990s. China's energy statistics should be treated as a starting point for analysis, and explicit judgments regarding ranges of uncertainty should accompany any conclusions.

  1. High-throughput automated system for statistical biosensing employing microcantilevers arrays

    DEFF Research Database (Denmark)

    Bosco, Filippo; Chen, Ching H.; Hwu, En T.

    2011-01-01

    In this paper we present a completely new and fully automated system for parallel microcantilever-based biosensing. Our platform is able to monitor simultaneously the change of resonance frequency (dynamic mode), of deflection (static mode), and of surface roughness of hundreds of cantilevers...... in a very short time over multiple biochemical reactions. We have proven that our system is capable of measuring 900 independent microsensors in less than a second. Here, we report statistical biosensing results performed over a hapten-antibody assay, where complete characterization of the biochemical...

  2. Elementary methods for statistical systems, mean field, large-n, and duality

    International Nuclear Information System (INIS)

    Itzykson, C.

    1983-01-01

    Renormalizable field theories are singled out by such precise constraints that regularization schemes must be used to break these invariances. Statistical methods can be adapted to those problems where asymptotically free models fail. This lecture surveys approximation schemes developed in the context of statistical mechanics. The confluence point of statistical mechanics and field theory is the use of discretized path integrals, where continuous space-time has been replaced by a regular lattice. Dynamic variables, a Boltzmann weight factor, and boundary conditions are the ingredients. Mean field approximations -- field equations, the random field transform, and gauge invariant systems -- are surveyed. In the large-N limit, vector models are found to simplify tremendously. The reasons why matrix models drawn from SU(n) gauge theories do not simplify are discussed. In the epilogue, random curves versus random surfaces are offered as an example where global and local symmetries are not alike

  3. Information transport in classical statistical systems

    Science.gov (United States)

    Wetterich, C.

    2018-02-01

    For "static memory materials" the bulk properties depend on boundary conditions. Such materials can be realized by classical statistical systems which admit no unique equilibrium state. We describe the propagation of information from the boundary to the bulk by classical wave functions. The dependence of wave functions on the location of hypersurfaces in the bulk is governed by a linear evolution equation that can be viewed as a generalized Schrödinger equation. Classical wave functions obey the superposition principle, with local probabilities realized as bilinears of wave functions. For static memory materials the evolution within a subsector is unitary, as characteristic for the time evolution in quantum mechanics. The space-dependence in static memory materials can be used as an analogue representation of the time evolution in quantum mechanics - such materials are "quantum simulators". For example, an asymmetric Ising model on a Euclidean two-dimensional lattice represents the time evolution of free relativistic fermions in two-dimensional Minkowski space.

  4. Design and implementation of a modular program system for the carrying-through of statistical analyses

    International Nuclear Information System (INIS)

    Beck, W.

    1984-01-01

    The complexity of computer programs for the solution of scientific and technical problems raises a number of questions. Typical questions concern the strengths and weaknesses of computer programs, the propagation of uncertainties among the input data, the sensitivity of output data to input data, and the substitution of complex models by simpler ones that provide equivalent results in certain ranges. Such questions have a general practical meaning, and principled answers may be found by statistical methods based on the Monte Carlo method. In this report statistical methods are chosen, described and evaluated. They are implemented in the modular program system STAR, which is a component of the program system RSYST. The design of STAR takes into account: users with different knowledge of data processing and statistics; the variety of statistical methods and of generating and evaluating procedures; the processing of large data sets in complex structures; the coupling to other components of RSYST and to programs outside RSYST; and the requirement that the system can be easily modified and enlarged. Four examples are given which demonstrate the application of STAR. (orig.) [de

  5. A statistical-based approach for fault detection and diagnosis in a photovoltaic system

    KAUST Repository

    Garoudja, Elyes; Harrou, Fouzi; Sun, Ying; Kara, Kamel; Chouder, Aissa; Silvestre, Santiago

    2017-01-01

    This paper reports the development of a statistical approach for fault detection and diagnosis in a PV system. Specifically, the overarching goal of this work is the early detection and identification of faults on the DC side of a PV system (e.g., short

  6. Intelligent system for statistically significant expertise knowledge on the basis of the model of self-organizing nonequilibrium dissipative system

    Directory of Open Access Journals (Sweden)

    E. A. Tatokchin

    2017-01-01

    Full Text Available The development of modern educational technologies, driven by the broad introduction of computer testing and of distance forms of education, makes a revision of the methods for examining pupils necessary. The work shows the need for a transition to mathematical examination criteria that are free of subjectivity. The article reviews the problems arising in this task and offers approaches to their solution. The greatest attention is paid to the problem of objectively transforming the expert's rating estimates onto the student's assessment scale. The discussion concludes that the solution to this problem lies in the creation of specialized intelligent systems. The basis for constructing the intelligent system is a mathematical model of a self-organizing nonequilibrium dissipative system, which here is a group of students. The article assumes that the dissipative character of the system is provided by the constant influx of new test items from the expert, and its nonequilibrium character by the individual psychological characteristics of the students in the group. As a result, the system must self-organize into stable patterns. These patterns make it possible, relying on large amounts of data, to obtain a statistically significant assessment of the student. To justify the proposed approach, the work presents a statistical analysis of the testing results of a large sample of students (>90. The conclusions from this statistical analysis allowed the development of an intelligent system for the statistically significant examination of student performance. It is based on a data clustering algorithm (k-means for three key parameters. It is shown that this approach allows the creation of an objective and dynamic expert evaluation.
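
The clustering step can be sketched with a plain k-means implementation over three features per student. The feature names, the synthetic score data, and k = 3 are illustrative assumptions; the abstract does not specify which three parameters the system uses:

```python
import numpy as np

def kmeans(points, k, n_iter=100, seed=0):
    """Minimal k-means: returns cluster labels and centres."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(n_iter):
        # assign each point to its nearest centre (squared Euclidean distance)
        labels = np.argmin(((points[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        # move each centre to the mean of its cluster
        for j in range(k):
            if np.any(labels == j):
                centers[j] = points[labels == j].mean(axis=0)
    return labels, centers

rng = np.random.default_rng(1)
# hypothetical columns: mean score, score variance, mean answer time (seconds)
students = np.vstack([
    rng.normal([0.9, 0.05, 30], 0.05, size=(30, 3)),   # synthetic strong group
    rng.normal([0.6, 0.15, 60], 0.05, size=(30, 3)),   # synthetic middle group
    rng.normal([0.3, 0.10, 90], 0.05, size=(30, 3)),   # synthetic weak group
])
labels, centers = kmeans(students, k=3)
print(np.bincount(labels))  # cluster sizes
```

In the described system, stable clusters over many test sessions would play the role of the self-organized "patterns" from which statistically significant student assessments are read off.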

  7. Principles of classical statistical mechanics: A perspective from the notion of complementarity

    International Nuclear Information System (INIS)

    Velazquez Abad, Luisberis

    2012-01-01

    Quantum mechanics and classical statistical mechanics are two physical theories that share several analogies in their mathematical apparatus and physical foundations. In particular, classical statistical mechanics is hallmarked by the complementarity between two descriptions that are unified in thermodynamics: (i) the parametrization of the system macrostate in terms of mechanical macroscopic observables I = (I_i), and (ii) the dynamical description that explains the evolution of a system towards thermodynamic equilibrium. As expected, such a complementarity is related to the uncertainty relations of classical statistical mechanics, ΔI_i Δη_i ≥ k. Here, k is the Boltzmann constant and η_i = ∂S(I|θ)/∂I_i are the restituting generalized forces derived from the entropy S(I|θ) of a closed system, which is found in an equilibrium situation driven by certain control parameters θ = (θ_α). These arguments constitute the central ingredients of a reformulation of classical statistical mechanics from the notion of complementarity. In this new framework, the Einstein postulate of classical fluctuation theory, dp(I|θ) ∼ exp[S(I|θ)/k] dI, appears as the correspondence principle between classical statistical mechanics and thermodynamics in the limit k→0, while the existence of uncertainty relations can be associated with the non-commuting character of certain operators. - Highlights: ► There exists a direct analogy between quantum and classical statistical mechanics. ► The statistical form of the Le Chatelier principle leads to the uncertainty principle. ► The Einstein postulate is simply the correspondence principle. ► Complementary quantities are associated with non-commuting operators.

  8. Statistical physics of human beings in games: Controlled experiments

    Science.gov (United States)

    Liang, Yuan; Huang, Ji-Ping

    2014-07-01

    It is important to know whether the laws or phenomena in statistical physics for natural systems with non-adaptive agents still hold for social human systems with adaptive agents, because this implies whether it is possible to study or understand social human systems by using statistical physics originating from natural systems. For this purpose, we review the role of human adaptability in four kinds of specific human behaviors, namely, normal behavior, herd behavior, contrarian behavior, and hedge behavior. The approach is based on controlled experiments in the framework of market-directed resource-allocation games. The role of the controlled experiments could be at least two-fold: adopting the real human decision-making process so that the system under consideration could reflect the performance of genuine human beings; making it possible to obtain macroscopic physical properties of a human system by tuning a particular factor of the system, thus directly revealing cause and effect. As a result, both computer simulations and theoretical analyses help to show a few counterparts of some laws or phenomena in statistical physics for social human systems: two-phase phenomena or phase transitions, entropy-related phenomena, and a non-equilibrium steady state. This review highlights the role of human adaptability in these counterparts, and makes it possible to study or understand some particular social human systems by means of statistical physics coming from natural systems.

  9. INCREASE OF QUEUING SYSTEM EFFECTIVENESS OF TRADING ENTERPRISE BY MEANS OF NUMERICAL STATISTICAL SIMULATION

    Directory of Open Access Journals (Sweden)

    Knyazheva Yu. V.

    2014-06-01

    Full Text Available The market economy creates a need for the development of economic analysis first of all at the microlevel, that is, at the level of individual enterprises, as enterprises are the basis of a market economy. Improvement of the queuing system of a trading enterprise is therefore an important economic problem. The analytical solutions of mass-service problems described in the theory do not correspond to the real operating conditions of queuing systems. Therefore, in this article, the optimization of the customer service process and the improvement of the settlement and cash service system of a trading enterprise are carried out by means of numerical statistical simulation of the enterprise's queuing system. The article describes an integrated statistical numerical simulation model of the queuing system of a trading enterprise working in nonstationary conditions, with reference to different distribution laws of the customer input stream. This model takes account of various behaviors of the customer output stream and includes a checkout service model which takes account of the cashier's rate of working; it also includes a staff motivation model and profit-earning and profit-optimization models that take into account possible revenue and costs. The created model, when realized in a suitable software environment, allows the most important parameters of the system to be optimized. With a convenient user interface, this model can become a component of a decision-support system for the rationalization of the organizational structure and for the optimization of the management of a trading enterprise.
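
The simplest version of such a checkout simulation can be sketched as a single-cashier queue with Poisson arrivals and exponential service. The arrival and service rates are illustrative assumptions; the paper's model additionally covers nonstationary arrivals, staff motivation, and profit terms:

```python
import random

def simulate_checkout(arrival_rate, service_rate, n_customers=50_000, seed=42):
    """Mean customer waiting time at a single checkout (Lindley recursion)."""
    rng = random.Random(seed)
    clock = 0.0
    cashier_free_at = 0.0
    total_wait = 0.0
    for _ in range(n_customers):
        clock += rng.expovariate(arrival_rate)   # next customer arrives
        start = max(clock, cashier_free_at)      # wait if the cashier is busy
        total_wait += start - clock
        cashier_free_at = start + rng.expovariate(service_rate)
    return total_wait / n_customers

# For this stationary M/M/1 special case, theory predicts a mean wait of
# rho / (mu - lambda) with rho = lambda / mu.
lam, mu = 0.8, 1.0
print(simulate_checkout(lam, mu), (lam / mu) / (mu - lam))
```

Matching the simulation against the known M/M/1 result is a useful sanity check before adding the nonstationary and behavioral features the article describes, where no closed form exists.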

  10. Statistical mechanics of low-density parity-check codes

    Energy Technology Data Exchange (ETDEWEB)

    Kabashima, Yoshiyuki [Department of Computational Intelligence and Systems Science, Tokyo Institute of Technology, Yokohama 2268502 (Japan); Saad, David [Neural Computing Research Group, Aston University, Birmingham B4 7ET (United Kingdom)

    2004-02-13

    We review recent theoretical progress on the statistical mechanics of error correcting codes, focusing on low-density parity-check (LDPC) codes in general, and on Gallager and MacKay-Neal codes in particular. By exploiting the relation between LDPC codes and Ising spin systems with multi-spin interactions, one can carry out a statistical mechanics based analysis that determines the practical and theoretical limitations of various code constructions, corresponding to dynamical and thermodynamical transitions, respectively, as well as the behaviour of error-exponents averaged over the corresponding code ensemble as a function of channel noise. We also contrast the results obtained using methods of statistical mechanics with those derived in the information theory literature, and show how these methods can be generalized to include other channel types and related communication problems. (topical review)

  11. Statistical mechanics of low-density parity-check codes

    International Nuclear Information System (INIS)

    Kabashima, Yoshiyuki; Saad, David

    2004-01-01

    We review recent theoretical progress on the statistical mechanics of error correcting codes, focusing on low-density parity-check (LDPC) codes in general, and on Gallager and MacKay-Neal codes in particular. By exploiting the relation between LDPC codes and Ising spin systems with multi-spin interactions, one can carry out a statistical mechanics based analysis that determines the practical and theoretical limitations of various code constructions, corresponding to dynamical and thermodynamical transitions, respectively, as well as the behaviour of error-exponents averaged over the corresponding code ensemble as a function of channel noise. We also contrast the results obtained using methods of statistical mechanics with those derived in the information theory literature, and show how these methods can be generalized to include other channel types and related communication problems. (topical review)

  12. Nonlinear Fluctuation Behavior of Financial Time Series Model by Statistical Physics System

    Directory of Open Access Journals (Sweden)

    Wuyang Cheng

    2014-01-01

    Full Text Available We develop a random financial time series model of the stock market based on one of the statistical physics systems, the stochastic contact interacting system. The contact process is a continuous-time Markov process; one interpretation of this model is as a model for the spread of an infection, where the epidemic spreading mimics the interplay of local infections and recovery of individuals. From this financial model, we study the statistical behaviors of return time series, and the corresponding behaviors of returns for the Shanghai Stock Exchange Composite Index (SSECI) and the Hang Seng Index (HSI) are also comparatively studied. Further, we investigate the Zipf distribution and the multifractal phenomenon of returns and price changes. Zipf analysis and MF-DFA analysis are applied to investigate the nature of the fluctuations of the stock market.

  13. Measuring the Success of an Academic Development Programme: A Statistical Analysis

    Science.gov (United States)

    Smith, L. C.

    2009-01-01

    This study uses statistical analysis to estimate the impact of first-year academic development courses in microeconomics, statistics, accountancy, and information systems, offered by the University of Cape Town's Commerce Academic Development Programme, on students' graduation performance relative to that achieved by mainstream students. The data…

  14. Drivers and seasonal predictability of extreme wind speeds in the ECMWF System 4 and a statistical model

    Science.gov (United States)

    Walz, M. A.; Donat, M.; Leckebusch, G. C.

    2017-12-01

    As extreme wind speeds are responsible for large socio-economic losses in Europe, a skillful prediction would be of great benefit for disaster prevention as well as for the actuarial community. Here we evaluate patterns of large-scale atmospheric variability and the seasonal predictability of extreme wind speeds (e.g. >95th percentile) in the European domain in the dynamical seasonal forecast system ECMWF System 4, and compare to the predictability based on a statistical prediction model. The dominant patterns of atmospheric variability show distinct differences between reanalysis and ECMWF System 4, with most patterns in System 4 extended downstream in comparison to ERA-Interim. The dissimilar manifestations of the patterns within the two models lead to substantially different drivers associated with the occurrence of extreme winds in the respective model. While the ECMWF System 4 is shown to provide some predictive power over Scandinavia and the eastern Atlantic, only very few grid cells in the European domain have significant correlations for extreme wind speeds in System 4 compared to ERA-Interim. In contrast, a statistical model predicts extreme wind speeds during boreal winter in better agreement with the observations. Our results suggest that System 4 does not seem to capture the potential predictability of extreme winds that exists in the real world, and therefore fails to provide reliable seasonal predictions for lead months 2-4. This is likely related to the unrealistic representation of large-scale patterns of atmospheric variability. Hence our study points to potential improvements of dynamical prediction skill by improving the simulation of large-scale atmospheric dynamics.

  15. Humans make efficient use of natural image statistics when performing spatial interpolation.

    Science.gov (United States)

    D'Antona, Anthony D; Perry, Jeffrey S; Geisler, Wilson S

    2013-12-16

    Visual systems learn through evolution and experience over the lifespan to exploit the statistical structure of natural images when performing visual tasks. Understanding which aspects of this statistical structure are incorporated into the human nervous system is a fundamental goal in vision science. To address this goal, we measured human ability to estimate the intensity of missing image pixels in natural images. Human estimation accuracy is compared with various simple heuristics (e.g., local mean) and with optimal observers that have nearly complete knowledge of the local statistical structure of natural images. Human estimates are more accurate than those of simple heuristics, and they match the performance of an optimal observer that knows the local statistical structure of relative intensities (contrasts). This optimal observer predicts the detailed pattern of human estimation errors and hence the results place strong constraints on the underlying neural mechanisms. However, humans do not reach the performance of an optimal observer that knows the local statistical structure of the absolute intensities, which reflect both local relative intensities and local mean intensity. As predicted from a statistical analysis of natural images, human estimation accuracy is negligibly improved by expanding the context from a local patch to the whole image. Our results demonstrate that the human visual system exploits efficiently the statistical structure of natural images.
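
The simplest of the heuristics mentioned above, the local mean, can be sketched directly: estimate a missing pixel as the average of its surrounding patch. The tiny image and the patch radius are illustrative; the study's optimal observers instead exploit the local statistical structure of natural-image contrasts:

```python
import numpy as np

def local_mean_estimate(image, row, col, radius=1):
    """Estimate image[row, col] from the surrounding patch, excluding it."""
    patch = image[max(row - radius, 0):row + radius + 1,
                  max(col - radius, 0):col + radius + 1].astype(float)
    total = patch.sum() - float(image[row, col])  # drop the centre pixel
    count = patch.size - 1
    return total / count

image = np.array([[10, 12, 11],
                  [13, 99, 12],   # 99 plays the "missing" centre pixel
                  [11, 12, 10]])
print(local_mean_estimate(image, 1, 1))  # 11.375, mean of the 8 neighbours
```

Human observers beat this heuristic because they also weight neighbours by the local contrast structure, which is exactly what the contrast-aware optimal observer in the study formalizes.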

  16. Simulation of statistical systems with not necessarily real and positive probabilities

    International Nuclear Information System (INIS)

    Kalkreuter, T.

    1991-01-01

    A new method to determine expectation values of observables in statistical systems with not necessarily real and positive probabilities is proposed. It is tested in a numerical study of the two-dimensional O(3)-symmetric nonlinear σ-model with Symanzik's one-loop improved lattice action. This model is simulated as a polymer system with field-dependent activities which can be made positive definite or indefinite by adjusting additive constants of the action. For a system with indefinite activities the new proposal is found to work. It is also verified that local observables are not affected by far-away polymers with indefinite activities when the system has no long-range order. (orig.)

  17. Statistical distribution for generalized ideal gas of fractional-statistics particles

    International Nuclear Information System (INIS)

    Wu, Y.

    1994-01-01

    We derive the occupation-number distribution in a generalized ideal gas of particles obeying fractional statistics, including mutual statistics, by adopting a state-counting definition. When there is no mutual statistics, the statistical distribution interpolates between bosons and fermions, and respects a fractional exclusion principle (except for bosons). Anyons in a strong magnetic field at low temperatures constitute such a physical system. Applications to the thermodynamic properties of quasiparticle excitations in the Laughlin quantum Hall fluid are discussed
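
The interpolating distribution can be evaluated numerically. The form used below, n = 1/(w + g) with w solving w^g (1 + w)^(1-g) = exp((ε - μ)/kT), is Wu's statement of the fractional exclusion statistics distribution; treat the exact normalization conventions here as an assumption of this sketch:

```python
from math import exp

def occupation(x, g, tol=1e-12):
    """Mean occupation number for reduced energy x = (eps - mu)/kT.

    g = 0 gives Bose-Einstein, g = 1 gives Fermi-Dirac; intermediate g
    interpolates between them.  (For g = 0 the result is only physical
    for x > 0, as usual for bosons.)
    """
    target = exp(x)
    lo, hi = 1e-15, max(target, 1.0) + 1.0   # bracket the root w > 0
    while hi - lo > tol * max(1.0, hi):      # bisection on a monotone function
        mid = 0.5 * (lo + hi)
        if mid**g * (1.0 + mid) ** (1.0 - g) < target:
            lo = mid
        else:
            hi = mid
    w = 0.5 * (lo + hi)
    return 1.0 / (w + g)

x = 1.0
print(occupation(x, 0.0))  # Bose-Einstein: 1/(e^x - 1) ≈ 0.582
print(occupation(x, 1.0))  # Fermi-Dirac:   1/(e^x + 1) ≈ 0.269
print(occupation(x, 0.5))  # an intermediate, "semion"-like case
```

The g = 0 and g = 1 limits reduce algebraically to the Bose and Fermi distributions, which makes them convenient checks on the root-finder.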

  18. A new expression of the probability distribution in Incomplete Statistics and fundamental thermodynamic relations

    International Nuclear Information System (INIS)

    Huang Zhifu; Lin Bihong; Chen Jincan

    2009-01-01

    In order to overcome the limitations of the original expression of the probability distribution appearing in literature of Incomplete Statistics, a new expression of the probability distribution is derived, where the Lagrange multiplier β introduced here is proved to be identical with that introduced in the second and third choices for the internal energy constraint in Tsallis' statistics and to be just equal to the physical inverse temperature. It is expounded that the probability distribution described by the new expression is invariant through uniform translation of the energy spectrum. Moreover, several fundamental thermodynamic relations are given and the relationship between the new and the original expressions of the probability distribution is discussed.

  19. Program system for inclusion, settlement of account and statistical evaluation of on-line recherches

    International Nuclear Information System (INIS)

    Helmreich, F.; Nevyjel, A.

    1981-03-01

    The described program system is used to automate the administration of an information retrieval department. The data of the users and of every online session are stored in two files and can be evaluated in different statistics. Data acquisition is done interactively; the statistics programs run both in dialog and in batch mode. (author)

  20. Study of developing a database of energy statistics

    Energy Technology Data Exchange (ETDEWEB)

    Park, T.S. [Korea Energy Economics Institute, Euiwang (Korea, Republic of)

    1997-08-01

    An integrated energy database should be prepared in advance for managing energy statistics comprehensively. However, since much manpower and budget are required for developing an integrated energy database, it is difficult to establish such a database within a short period of time. Therefore, as a first stage of work towards the energy database, this study aims to draw up methods for analyzing existing statistical data lists and consolidating insufficient data, and at the same time to analyze the general concepts and the data structure of the database. I also studied the data content and items of the energy databases operated by international energy-related organizations such as the IEA and APEC, and in Japan and the USA, as overseas cases, as well as the domestic state of energy databases and the hardware operating systems of the Japanese databases. I analyzed the production system of Korean energy databases, discussed the KEDB system, which is representative of total energy databases, and present design concepts for new energy databases. In addition, I present the directions for establishing future Korean energy databases and their contents, the data that should be collected as supply and demand statistics, and the establishment of a data collection organization, etc., by analyzing the Korean energy statistical data and comparing them with the system of the OECD/IEA. 26 refs., 15 figs., 11 tabs.

  1. Information systems development of analysis company financial state based on the expert-statistical approach

    Directory of Open Access Journals (Sweden)

    M. N. Ivliev

    2016-01-01

Full Text Available The work is devoted to methods for analyzing a company's financial condition, including aggregated ratings. It is proposed to use a generalized solvency and liquidity indicator and a capital structure composite index. Mathematically, the generalized index is a sum of characteristic variables with weighting factors characterizing the relative importance of the individual characteristics. It is proposed to select the significant features from a set of standard financial ratios calculated from enterprise balance sheets. To obtain the values of the weighting factors, one of the expert-statistical approaches, the analytic hierarchy process, is proposed. The method is as follows: the most important characteristic is chosen, and then experts determine the degree of preference relative to this main feature on a linguistic scale. Next, a matrix of pairwise comparisons is compiled from the assigned ranks, characterizing the relative importance of the attributes. The required coefficients are determined as the components of the priority vector, the principal eigenvector of the matrix of paired comparisons. The paper proposes a mechanism for finding the ranges for the analysis of rating numbers. In addition, it proposes a method for the statistical evaluation of the balance sheets of various companies by calculating mutual correlation matrices. Based on the considered mathematical methods for determining quantitative characteristics of the financial and economic activities of objects, algorithms, information support and software were developed that make it possible to realize different systems of economic analysis.
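The AHP computation described in this abstract, a pairwise comparison matrix whose principal eigenvector gives the weighting factors, can be sketched in a few lines. The comparison values below are hypothetical, not taken from the paper:

```python
import numpy as np

def ahp_weights(pairwise, iters=100):
    """Approximate the AHP priority vector as the principal
    eigenvector of a pairwise-comparison matrix (power iteration)."""
    A = np.asarray(pairwise, dtype=float)
    w = np.ones(A.shape[0]) / A.shape[0]
    for _ in range(iters):
        w = A @ w          # power-iteration step
        w /= w.sum()       # normalize so the weights sum to 1
    return w

# Hypothetical comparison of three financial ratios on Saaty's 1-9 scale:
# ratio 1 is judged 3x as important as ratio 2, 5x as important as ratio 3.
A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
w = ahp_weights(A)
print(w)  # weights sum to 1, largest for the first ratio
```

The power iteration converges for any positive reciprocal matrix (Perron-Frobenius), which is why it suffices here instead of a full eigendecomposition.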

  2. Some statistical considerations related to the estimation of cancer risk following exposure to ionizing radiation

    International Nuclear Information System (INIS)

    Land, C.E.; Pierce, D.A.

    1983-01-01

    Statistical theory and methodology provide the logical structure for scientific inference about the cancer risk associated with exposure to ionizing radiation. Although much is known about radiation carcinogenesis, the risk associated with low-level exposures is difficult to assess because it is too small to measure directly. Estimation must therefore depend upon mathematical models which relate observed risks at high exposure levels to risks at lower exposure levels. Extrapolated risk estimates obtained using such models are heavily dependent upon assumptions about the shape of the dose-response relationship, the temporal distribution of risk following exposure, and variation of risk according to variables such as age at exposure, sex, and underlying population cancer rates. Expanded statistical models, which make explicit certain assumed relationships between different data sets, can be used to strengthen inferences by incorporating relevant information from diverse sources. They also allow the uncertainties inherent in information from related data sets to be expressed in estimates which partially depend upon that information. To the extent that informed opinion is based upon a valid assessment of scientific data, the larger context of decision theory, which includes statistical theory, provides a logical framework for the incorporation into public policy decisions of the informational content of expert opinion
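The abstract's point that extrapolated risk estimates depend heavily on the assumed dose-response shape can be made concrete with a toy fit. All numbers below are synthetic illustrations, not epidemiological estimates:

```python
import numpy as np

# Synthetic excess-risk observations at high doses (Gy); illustrative only.
dose = np.array([0.5, 1.0, 1.5, 2.0])
risk = np.array([0.010, 0.022, 0.036, 0.052])

# Linear (no-threshold) model: risk = b * dose
b = (dose @ risk) / (dose @ dose)

# Linear-quadratic model: risk = a*dose + c*dose**2
X = np.column_stack([dose, dose**2])
a, c = np.linalg.lstsq(X, risk, rcond=None)[0]

low = 0.05  # extrapolate both fits to a low dose
print(b * low, a * low + c * low**2)  # the two models disagree at low dose
```

Both models fit the high-dose points closely, yet the linear model predicts a higher risk at 0.05 Gy than the linear-quadratic one, which is exactly the model-dependence the abstract warns about.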

  3. DYNAMIC STABILITY OF THE SOLAR SYSTEM: STATISTICALLY INCONCLUSIVE RESULTS FROM ENSEMBLE INTEGRATIONS

    Energy Technology Data Exchange (ETDEWEB)

    Zeebe, Richard E., E-mail: zeebe@soest.hawaii.edu [School of Ocean and Earth Science and Technology, University of Hawaii at Manoa, 1000 Pope Road, MSB 629, Honolulu, HI 96822 (United States)

    2015-01-01

Due to the chaotic nature of the solar system, the question of its long-term stability can only be answered in a statistical sense, for instance, based on numerical ensemble integrations of nearby orbits. Destabilization of the inner planets, leading to close encounters and/or collisions, can be initiated through a large increase in Mercury's eccentricity, with a currently assumed likelihood of ∼1%. However, little is known at present about the robustness of this number. Here I report ensemble integrations of the full equations of motion of the eight planets and Pluto over 5 Gyr, including contributions from general relativity. The results show that different numerical algorithms lead to statistically different results for the evolution of Mercury's eccentricity (e_M). For instance, starting at present initial conditions (e_M ≃ 0.21), Mercury's maximum eccentricity achieved over 5 Gyr is, on average, significantly higher in symplectic ensemble integrations using heliocentric rather than Jacobi coordinates and stricter error control. In contrast, starting at a possible future configuration (e_M ≃ 0.53), Mercury's maximum eccentricity achieved over the subsequent 500 Myr is, on average, significantly lower using heliocentric rather than Jacobi coordinates. For example, the probability for e_M to increase beyond 0.53 over 500 Myr is >90% (Jacobi) versus only 40%-55% (heliocentric). This poses a dilemma because the physical evolution of the real system—and its probabilistic behavior—cannot depend on the coordinate system or the numerical algorithm chosen to describe it. Some tests of the numerical algorithms suggest that symplectic integrators using heliocentric coordinates underestimate the odds for destabilization of Mercury's orbit at high initial e_M.
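The "statistically different" claim reduces to comparing binomial proportions estimated from finite ensembles. As a sketch (the ensemble sizes and run counts below are hypothetical, not the paper's), a Wilson score interval shows when two destabilization fractions are clearly distinguishable:

```python
from math import sqrt

def wilson(k, n, z=1.96):
    """95% Wilson score interval for a binomial proportion k/n."""
    p = k / n
    denom = 1 + z*z/n
    center = (p + z*z/(2*n)) / denom
    half = z * sqrt(p*(1-p)/n + z*z/(4*n*n)) / denom
    return center - half, center + half

# Hypothetical ensembles of 40 runs per integrator:
jacobi = wilson(37, 40)   # ~92% of runs destabilize (Jacobi coordinates)
helio  = wilson(19, 40)   # ~48% of runs destabilize (heliocentric)
print(jacobi, helio)      # non-overlapping intervals -> statistically different
```

Even with modest ensembles, a >90% versus 40-55% split is far outside each other's confidence intervals, which is what makes the coordinate-system dependence a genuine dilemma rather than sampling noise.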

  4. Differences in game-related statistics of basketball performance by game location for men's winning and losing teams.

    Science.gov (United States)

    Gómez, Miguel A; Lorenzo, Alberto; Barakat, Rubén; Ortega, Enrique; Palao, José M

    2008-02-01

The aim of the present study was to identify game-related statistics that differentiate winning and losing teams according to game location. The sample included 306 games of the 2004-2005 regular season of the Spanish professional men's league (ACB League). The independent variables were game location (home or away) and game result (win or loss). The game-related statistics registered were free throws (successful and unsuccessful), 2- and 3-point field goals (successful and unsuccessful), offensive and defensive rebounds, blocks, assists, fouls, steals, and turnovers. Descriptive and inferential analyses were done (one-way analysis of variance and discriminant analysis). The multivariate analysis showed that winning teams differ from losing teams in defensive rebounds (SC = .42) and in assists (SC = .38). Similarly, winning teams differ from losing teams when they play at home in defensive rebounds (SC = .40) and in assists (SC = .41). On the other hand, winning teams differ from losing teams when they play away in defensive rebounds (SC = .44), assists (SC = .30), successful 2-point field goals (SC = .31), and unsuccessful 3-point field goals (SC = -.35). Defensive rebounds and assists were the only game-related statistics common to all three analyses.
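The SC values quoted above are structure coefficients: correlations between each game statistic and the discriminant scores. A minimal two-group Fisher discriminant on synthetic data (means and spreads invented for illustration, not the ACB figures) shows how they arise:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic game stats: columns are [defensive rebounds, assists];
# winners drawn with higher means than losers (values invented).
win  = rng.normal([28, 16], [4, 3], size=(50, 2))
lose = rng.normal([24, 13], [4, 3], size=(50, 2))
X = np.vstack([win, lose])

# Fisher discriminant direction: w = Sw^-1 (mu_win - mu_lose)
Sw = np.cov(win.T) + np.cov(lose.T)       # pooled within-group scatter
w = np.linalg.solve(Sw, win.mean(0) - lose.mean(0))

# Structure coefficients: correlation of each variable with the scores.
scores = X @ w
sc = np.array([np.corrcoef(X[:, j], scores)[0, 1] for j in range(2)])
print(sc)  # both positive: rebounds and assists load on the winning side
```

Variables with large |SC| (like defensive rebounds and assists in the study) are the ones most aligned with the axis separating winners from losers.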

  5. Statistical and dynamical remastering of classic exoplanet systems

    Science.gov (United States)

    Nelson, Benjamin Earl

The most powerful constraints on planet formation will come from characterizing the dynamical state of complex multi-planet systems. Unfortunately, with that complexity comes a number of factors that make analyzing these systems a computationally challenging endeavor: the sheer number of model parameters, a wonky-shaped posterior distribution, and hundreds to thousands of time series measurements. In this dissertation, I will review our efforts to improve the statistical analyses of radial velocity (RV) data and their applications to some renowned, dynamically complex exoplanet systems. In the first project (Chapters 2 and 4), we develop a differential evolution Markov chain Monte Carlo (RUN DMC) algorithm to tackle the aforementioned difficult aspects of data analysis. We test the robustness of the algorithm with regard to the number of modeled planets (model dimensionality) and increasing dynamical strength. We apply RUN DMC to a couple of classic multi-planet systems and one highly debated system from radial velocity surveys. In the second project (Chapter 5), we analyze RV data of 55 Cancri, a wide binary system known to harbor five planets orbiting the primary. We find the inner-most planet "e" must be coplanar to within 40 degrees of the outer planets, otherwise Kozai-like perturbations will cause the planet to enter the stellar photosphere through its periastron passage. We find the orbits of planets "b" and "c" are apsidally aligned and librating with low to median amplitude (50+/-6 10 degrees), but they are not orbiting in a mean-motion resonance. In the third project (Chapters 3, 4, 6), we analyze RV data of Gliese 876, a four-planet system with three planets participating in a multi-body resonance, i.e. a Laplace resonance. From a combined observational and statistical analysis computing Bayes factors, we find a four-planet model is favored over one with three planets. Conditioned on this preferred model, we meaningfully constrain the three-dimensional orbital

  6. Statistics

    Science.gov (United States)

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  7. Nonequilibrium statistical mechanics in the general theory of relativity. I. A general formalism

    International Nuclear Information System (INIS)

    Israel, W.; Kandrup, H.E.

    1984-01-01

This is the first in a series of papers, the overall objective of which is the formulation of a new covariant approach to nonequilibrium statistical mechanics in classical general relativity. The object here is the development of a tractable theory for self-gravitating systems. It is argued that the ''state'' of an N-particle system may be characterized by an N-particle distribution function, defined in an 8N-dimensional phase space, which satisfies a collection of N conservation equations. By mapping the true physics onto a fictitious ''background'' spacetime, which may be chosen to satisfy some ''average'' field equations, one then obtains a useful covariant notion of ''evolution'' in response to a fluctuating ''gravitational force.'' For many cases of practical interest, one may suppose (i) that these fluctuating forces satisfy linear field equations and (ii) that they may be modeled by a direct interaction. In this case, one can use a relativistic projection operator formalism to derive exact closed equations for the evolution of such objects as an appropriately defined reduced one-particle distribution function. By capturing, in a natural way, the notion of a dilute gas, or impulse, approximation, one is then led to a comparatively simple equation for the one-particle distribution. If, furthermore, one treats the effects of the fluctuating forces as ''localized'' in space and time, one obtains a tractable kinetic equation which reduces, in the Newtonian limit, to the standard Landau equation

  8. Uncertainty analysis of reactor safety systems with statistically correlated failure data

    International Nuclear Information System (INIS)

    Dezfuli, H.; Modarres, M.

    1985-01-01

The probability of occurrence of the top event of a fault tree is estimated from failure probability of components that constitute the fault tree. Component failure probabilities are subject to statistical uncertainties. In addition, there are cases where the failure data are statistically correlated. Most fault tree evaluations have so far been based on uncorrelated component failure data. The subject of this paper is the description of a method of assessing the probability intervals for the top event failure probability of fault trees when component failure data are statistically correlated. To estimate the mean and variance of the top event, a second-order system moment method is presented through Taylor series expansion, which provides an alternative to the normally used Monte-Carlo method. For cases where component failure probabilities are statistically correlated, the Taylor expansion terms are treated properly. A moment matching technique is used to obtain the probability distribution function of the top event through fitting a Johnson S_B distribution. The computer program (CORRELATE) was developed to perform the calculations necessary for the implementation of the method developed. The CORRELATE code is very efficient and consumes minimal computer time. This is primarily because it does not employ the time-consuming Monte-Carlo method. (author)
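The moment-propagation idea can be sketched for a two-component OR gate. This is a first-order illustration with made-up numbers; the paper's method carries the Taylor expansion to second order and then fits a Johnson distribution to the resulting moments:

```python
import numpy as np

# Top event of a two-component OR gate: P = 1 - (1-p1)(1-p2).
mean = np.array([0.01, 0.02])           # component failure-probability means
sd   = np.array([0.002, 0.004])         # their standard deviations
rho  = 0.6                              # assumed correlation between the two
cov = np.array([[sd[0]**2,         rho*sd[0]*sd[1]],
                [rho*sd[0]*sd[1],  sd[1]**2]])

p1, p2 = mean
P = 1 - (1 - p1)*(1 - p2)               # top-event mean (first order)
g = np.array([1 - p2, 1 - p1])          # gradient dP/dp_i at the mean
varP       = g @ cov @ g                # Taylor variance, covariance included
var_uncorr = g @ np.diag(sd**2) @ g     # what ignoring correlation would give
print(P, varP, var_uncorr)  # positive correlation inflates the variance
```

The comparison of `varP` with `var_uncorr` shows why treating the covariance terms properly matters: dropping them here understates the top-event uncertainty.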

  9. Applying incomplete statistics to nonextensive systems with different q indices

    International Nuclear Information System (INIS)

    Nivanen, L.; Pezeril, M.; Wang, Q.A.; Mehaute, A. Le

    2005-01-01

The nonextensive statistics based on the q-entropy S_q = -Σ_{i=1}^{v} (p_i - p_i^q)/(1 - q) has so far been applied to systems in which the q value is uniformly distributed. For systems containing different q's, the applicability of the theory is still a matter of investigation. The difficulty is that the class of systems to which the theory can be applied is actually limited by the usual nonadditivity rule of entropy, which is no longer valid when the systems contain a non-uniform distribution of q values. In this paper, within the framework of the so-called incomplete information theory, we propose a more general nonadditivity rule of entropy prescribed by the zeroth law of thermodynamics. This new nonadditivity generalizes the usual one in a simple way and can be proved to lead uniquely to the q-entropy
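The Gentile distribution that appears throughout this section (and in record 1 above) interpolates between the familiar quantum statistics. A small sketch using the standard mean occupation number, with x = β(ε − μ) and maximum occupation m, checks the two limiting cases:

```python
from math import exp

def gentile(x, m):
    """Mean occupation number of the Gentile distribution with
    maximum occupation m, where x = beta*(eps - mu)."""
    return 1/(exp(x) - 1) - (m + 1)/(exp((m + 1)*x) - 1)

x = 0.7
fermi = 1/(exp(x) + 1)            # Fermi-Dirac occupation
bose  = 1/(exp(x) - 1)            # Bose-Einstein occupation
print(gentile(x, 1), fermi)       # m = 1 recovers Fermi-Dirac
print(gentile(x, 1000), bose)     # large m approaches Bose-Einstein
```

Intermediate m values give the intermediate statistics the abstract refers to, with m fixed by the deformation parameter q in the q-deformation schemes.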

  10. Research and Development on Food Nutrition Statistical Analysis Software System

    OpenAIRE

    Du Li; Ke Yun

    2013-01-01

Designing and developing a set of food nutrition component statistical analysis software can realize the automation of nutrition calculation, improve nutrition professionals' working efficiency and achieve the informatization of nutrition propaganda and education. In the software development process, software engineering methods and database technology are used to calculate the human daily nutritional intake, and an intelligent system is used to evaluate the user's hea...

  11. Ontologies and tag-statistics

    Science.gov (United States)

    Tibély, Gergely; Pollner, Péter; Vicsek, Tamás; Palla, Gergely

    2012-05-01

    Due to the increasing popularity of collaborative tagging systems, the research on tagged networks, hypergraphs, ontologies, folksonomies and other related concepts is becoming an important interdisciplinary area with great potential and relevance for practical applications. In most collaborative tagging systems the tagging by the users is completely ‘flat’, while in some cases they are allowed to define a shallow hierarchy for their own tags. However, usually no overall hierarchical organization of the tags is given, and one of the interesting challenges of this area is to provide an algorithm generating the ontology of the tags from the available data. In contrast, there are also other types of tagged networks available for research, where the tags are already organized into a directed acyclic graph (DAG), encapsulating the ‘is a sub-category of’ type of hierarchy between each other. In this paper, we study how this DAG affects the statistical distribution of tags on the nodes marked by the tags in various real networks. The motivation for this research was the fact that understanding the tagging based on a known hierarchy can help in revealing the hidden hierarchy of tags in collaborative tagging systems. We analyse the relation between the tag-frequency and the position of the tag in the DAG in two large sub-networks of the English Wikipedia and a protein-protein interaction network. We also study the tag co-occurrence statistics by introducing a two-dimensional (2D) tag-distance distribution preserving both the difference in the levels and the absolute distance in the DAG for the co-occurring pairs of tags. Our most interesting finding is that the local relevance of tags in the DAG (i.e. their rank or significance as characterized by, e.g., the length of the branches starting from them) is much more important than their global distance from the root. Furthermore, we also introduce a simple tagging model based on random walks on the DAG, capable of
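The two tag measures contrasted in this abstract, global distance from the root versus local relevance (the length of the branch starting from a tag), are easy to compute on a category DAG. The toy hierarchy below uses hypothetical category names:

```python
# Toy 'is-a-subcategory-of' hierarchy (hypothetical names).
children = {
    "science":  ["physics", "biology"],
    "physics":  ["optics"],
    "biology":  ["genetics", "ecology"],
    "optics":   [],
    "genetics": ["genomics"],
    "genomics": [],
    "ecology":  [],
}

def depths(root):
    """Breadth-first distance of every tag from the root (global measure)."""
    dist, frontier = {root: 0}, [root]
    while frontier:
        nxt = []
        for u in frontier:
            for v in children[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    nxt.append(v)
        frontier = nxt
    return dist

def branch_length(tag):
    """Longest path descending from the tag (local relevance measure)."""
    kids = children[tag]
    return 0 if not kids else 1 + max(branch_length(c) for c in kids)

d = depths("science")
for tag in ("biology", "optics"):
    print(tag, d[tag], branch_length(tag))
```

Here "biology" sits near the root yet heads a long branch, while "optics" is deeper but terminal; the local measure separates them even though a root-distance ranking would not, which mirrors the paper's main finding.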

  12. Ontologies and tag-statistics

    International Nuclear Information System (INIS)

    Tibély, Gergely; Vicsek, Tamás; Pollner, Péter; Palla, Gergely

    2012-01-01

    Due to the increasing popularity of collaborative tagging systems, the research on tagged networks, hypergraphs, ontologies, folksonomies and other related concepts is becoming an important interdisciplinary area with great potential and relevance for practical applications. In most collaborative tagging systems the tagging by the users is completely ‘flat’, while in some cases they are allowed to define a shallow hierarchy for their own tags. However, usually no overall hierarchical organization of the tags is given, and one of the interesting challenges of this area is to provide an algorithm generating the ontology of the tags from the available data. In contrast, there are also other types of tagged networks available for research, where the tags are already organized into a directed acyclic graph (DAG), encapsulating the ‘is a sub-category of’ type of hierarchy between each other. In this paper, we study how this DAG affects the statistical distribution of tags on the nodes marked by the tags in various real networks. The motivation for this research was the fact that understanding the tagging based on a known hierarchy can help in revealing the hidden hierarchy of tags in collaborative tagging systems. We analyse the relation between the tag-frequency and the position of the tag in the DAG in two large sub-networks of the English Wikipedia and a protein-protein interaction network. We also study the tag co-occurrence statistics by introducing a two-dimensional (2D) tag-distance distribution preserving both the difference in the levels and the absolute distance in the DAG for the co-occurring pairs of tags. Our most interesting finding is that the local relevance of tags in the DAG (i.e. their rank or significance as characterized by, e.g., the length of the branches starting from them) is much more important than their global distance from the root. Furthermore, we also introduce a simple tagging model based on random walks on the DAG, capable of

  13. Technical issues relating to the statistical parametric mapping of brain SPECT studies

    International Nuclear Information System (INIS)

    Hatton, R.L.; Cordato, N.; Hutton, B.F.; Lau, Y.H.; Evans, S.G.

    2000-01-01

Full text: Statistical Parametric Mapping (SPM) is a software tool designed for the statistical analysis of functional neuroimages, specifically Positron Emission Tomography and functional Magnetic Resonance Imaging, and more recently SPECT. This review examines some problems associated with the analysis of SPECT. A comparison of a patient group with normal studies revealed factors that could influence results, some that commonly occur, others that require further exploration. To optimise the differences between two groups of subjects, both spatial variability and differences in global activity must be minimised. The choice and effectiveness of the co-registration method and the approach to normalisation of activity concentration can affect the optimisation. A small number of subject scans were identified as possessing truncated data, resulting in edge effects that could adversely influence the analysis. Other problems included unusual areas of significance, possibly related to reconstruction methods and the geometry associated with nonparallel collimators. Areas of extracerebral significance are a point of concern and may result from scatter effects or mis-registration. Difficulties in patient positioning, due to postural limitations, can lead to resolution differences. SPM has been used to assess areas of statistical significance arising from these technical factors, as opposed to areas of true clinical significance, when comparing subject groups. This contributes to a better understanding of the effects of technical factors so that these may be eliminated, minimised, or incorporated in the study design. Copyright (2000) The Australian and New Zealand Society of Nuclear Medicine Inc

  14. Principle of maximum Fisher information from Hardy's axioms applied to statistical systems.

    Science.gov (United States)

    Frieden, B Roy; Gatenby, Robert A

    2013-10-01

    Consider a finite-sized, multidimensional system in parameter state a. The system is either at statistical equilibrium or general nonequilibrium, and may obey either classical or quantum physics. L. Hardy's mathematical axioms provide a basis for the physics obeyed by any such system. One axiom is that the number N of distinguishable states a in the system obeys N=max. This assumes that N is known as deterministic prior knowledge. However, most observed systems suffer statistical fluctuations, for which N is therefore only known approximately. Then what happens if the scope of the axiom N=max is extended to include such observed systems? It is found that the state a of the system must obey a principle of maximum Fisher information, I=I(max). This is important because many physical laws have been derived, assuming as a working hypothesis that I=I(max). These derivations include uses of the principle of extreme physical information (EPI). Examples of such derivations were of the De Broglie wave hypothesis, quantum wave equations, Maxwell's equations, new laws of biology (e.g., of Coulomb force-directed cell development and of in situ cancer growth), and new laws of economic fluctuation and investment. That the principle I=I(max) itself derives from suitably extended Hardy axioms thereby eliminates its need to be assumed in these derivations. Thus, uses of I=I(max) and EPI express physics at its most fundamental level, its axiomatic basis in math.
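The Fisher information I that the principle extremizes can be checked numerically in a simple case. For a Gaussian density p(x) with standard deviation σ, the location Fisher information ∫ p′(x)²/p(x) dx equals 1/σ², a standard identity (this sketch only verifies that identity, not the EPI derivations themselves):

```python
import numpy as np

sigma = 2.0
x = np.linspace(-20.0, 20.0, 40001)
p = np.exp(-x**2 / (2*sigma**2)) / (sigma*np.sqrt(2*np.pi))

dp = np.gradient(p, x)                  # central-difference derivative
I = np.sum(dp**2 / p) * (x[1] - x[0])   # rectangle-rule integral
print(I)  # close to 1/sigma^2 = 0.25
```

Tighter distributions (smaller σ) carry more Fisher information, which is the sense in which I measures how sharply the data constrain the parameter a.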

  15. Infant Statistical-Learning Ability Is Related to Real-Time Language Processing

    Science.gov (United States)

    Lany, Jill; Shoaib, Amber; Thompson, Abbie; Estes, Katharine Graf

    2018-01-01

    Infants are adept at learning statistical regularities in artificial language materials, suggesting that the ability to learn statistical structure may support language development. Indeed, infants who perform better on statistical learning tasks tend to be more advanced in parental reports of infants' language skills. Work with adults suggests…

  16. Quantum fields on manifolds: an interplay between quantum theory, statistical thermodynamics and general relativity

    International Nuclear Information System (INIS)

    Sewell, G.L.

    1986-01-01

The author shows how the basic axioms of quantum field theory, general relativity and statistical thermodynamics lead, in a model-independent way, to a generalized Hawking-Unruh effect, whereby the gravitational fields carried by a class of space-time manifolds with event horizons thermalize ambient quantum fields. The author is concerned with a quantum field on a space-time X containing a submanifold X' bounded by event horizons. The objective is to show that, for a wide class of space-times, the global vacuum state of the field reduces, in X', to a thermal state whose temperature depends on the geometry. The statistical thermodynamical, geometrical, and quantum field theoretical ingredients essential for the reduction of the vacuum state are discussed

  17. Quantum entanglement and teleportation using statistical correlations

    Indian Academy of Sciences (India)

    Administrator

    Abstract. A study of quantum teleportation using two and three-particle correlated density matrix is presented. A criterion based on standard quantum statistical correlations employed in the many-body virial expansion is used to determine the extent of entanglement for a 2N-particle system. A relation between the probability ...

  18. Stability and equilibrium in quantum statistical mechanics

    International Nuclear Information System (INIS)

    Kastler, Daniel.

    1975-01-01

A derivation of the Gibbs Ansatz, the basis of equilibrium statistical mechanics, is provided from a stability requirement, in technical connection with the harmonic analysis of non-commutative dynamical systems. By the same token a relation is established between stability and the positivity of the Hamiltonian in the zero temperature case [fr

  19. Statistical ensembles for money and debt

    Science.gov (United States)

    Viaggiu, Stefano; Lionetto, Andrea; Bargigli, Leonardo; Longo, Michele

    2012-10-01

    We build a statistical ensemble representation of two economic models describing respectively, in simplified terms, a payment system and a credit market. To this purpose we adopt the Boltzmann-Gibbs distribution where the role of the Hamiltonian is taken by the total money supply (i.e. including money created from debt) of a set of interacting economic agents. As a result, we can read the main thermodynamic quantities in terms of monetary ones. In particular, we define for the credit market model a work term which is related to the impact of monetary policy on credit creation. Furthermore, with our formalism we recover and extend some results concerning the temperature of an economic system, previously presented in the literature by considering only the monetary base as a conserved quantity. Finally, we study the statistical ensemble for the Pareto distribution.
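The Boltzmann-Gibbs picture of a payment system can be illustrated with a toy conservative-exchange simulation; the agent count, initial endowment and exchange rule below are illustrative choices in the spirit of such models, not the paper's own specification:

```python
import random

random.seed(1)

# N agents with equal initial money; random pairwise exchanges conserve
# the total, driving the distribution toward Boltzmann-Gibbs with
# "temperature" equal to the average money per agent.
N, m0, steps = 1000, 100.0, 200000
money = [m0] * N
for _ in range(steps):
    i, j = random.randrange(N), random.randrange(N)
    if i == j:
        continue
    pot = money[i] + money[j]
    share = random.random() * pot        # repartition the pair's money
    money[i], money[j] = share, pot - share

T = sum(money) / N                       # effective temperature = mean money
frac_below_T = sum(m < T for m in money) / N
print(T, frac_below_T)  # exponential distribution predicts 1 - 1/e ~ 0.63
```

After equilibration the money distribution is close to exponential, so the fraction of agents below the mean approaches 1 − 1/e, the signature of the Boltzmann-Gibbs form in monetary variables.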

  20. Optimal Design and Related Areas in Optimization and Statistics

    CERN Document Server

    Pronzato, Luc

    2009-01-01

    This edited volume, dedicated to Henry P. Wynn, reflects his broad range of research interests, focusing in particular on the applications of optimal design theory in optimization and statistics. It covers algorithms for constructing optimal experimental designs, general gradient-type algorithms for convex optimization, majorization and stochastic ordering, algebraic statistics, Bayesian networks and nonlinear regression. Written by leading specialists in the field, each chapter contains a survey of the existing literature along with substantial new material. This work will appeal to both the

  1. INSTITUTIONAL MANAGEMENT OF EUROPEAN STATISTICS AND OF THEIR QUALITY - CURRENT CONCERNS AT EUROPEAN LEVEL

    Directory of Open Access Journals (Sweden)

    Daniela ŞTEFĂNESCU

    2011-08-01

Full Text Available The issues referring to official statistics quality and reliability became the main topics of debate as far as statistical governance in Europe is concerned. The Council welcomed the Commission Communication to the European Parliament and to the Council «Towards robust quality management for European Statistics» (COM 211, appreciating that the approach and the objective of the strategy would confer on the European Statistical System (ESS) the quality management framework for the coordination of consolidated economic policies. The Council pointed out that European Statistical System management was improved during recent years and that progress was noticed in relation to high-quality statistics production and dissemination within the European Union, but it also noticed that, in the context of the recent financial crisis, certain weaknesses were identified, particularly related to the general framework of quality management. The „Greece Case” proved that progress was not enough to guarantee the complete independence of national statistical institutes and entailed the need for further consolidating ESS governance. Several undertakings are now in the preparatory stage, in accordance with the Commission Communication; these actions are welcomed, but the question arises: are they sufficient for definitively solving the problem? The paper aims to go further in the attempt to identify a different, innovative (courageous!) way, in the long run, towards an advanced institutional structure of the ESS, by setting up a European System of Statistical Institutes, similar to the European System of Central Banks, which would require a change in the Treaty.

  2. Performance evaluation of a hybrid-passive landfill leachate treatment system using multivariate statistical techniques

    Energy Technology Data Exchange (ETDEWEB)

    Wallace, Jack, E-mail: jack.wallace@ce.queensu.ca [Department of Civil Engineering, Queen’s University, Ellis Hall, 58 University Avenue, Kingston, Ontario K7L 3N6 (Canada); Champagne, Pascale, E-mail: champagne@civil.queensu.ca [Department of Civil Engineering, Queen’s University, Ellis Hall, 58 University Avenue, Kingston, Ontario K7L 3N6 (Canada); Monnier, Anne-Charlotte, E-mail: anne-charlotte.monnier@insa-lyon.fr [National Institute for Applied Sciences – Lyon, 20 Avenue Albert Einstein, 69621 Villeurbanne Cedex (France)

    2015-01-15

Highlights: • Performance of a hybrid-passive landfill leachate treatment system was evaluated. • 33 water chemistry parameters were sampled for 21 months and statistically analyzed. • Parameters were strongly linked and explained most (>40%) of the variation in data. • Alkalinity, ammonia, COD, heavy metals, and iron were criteria for performance. • Eight other parameters were key in modeling system dynamics and criteria. - Abstract: A pilot-scale hybrid-passive treatment system operated at the Merrick Landfill in North Bay, Ontario, Canada, treats municipal landfill leachate and provides for subsequent natural attenuation. Collected leachate is directed to a hybrid-passive treatment system, followed by controlled release to a natural attenuation zone before entering the nearby Little Sturgeon River. The study presents a comprehensive evaluation of the performance of the system using multivariate statistical techniques to determine the interactions between parameters, major pollutants in the leachate, and the biological and chemical processes occurring in the system. Five parameters (ammonia, alkalinity, chemical oxygen demand (COD), “heavy” metals of interest, with atomic weights above calcium, and iron) were set as criteria for the evaluation of system performance based on their toxicity to aquatic ecosystems and importance in treatment with respect to discharge regulations. System data for a full range of water quality parameters over a 21-month period were analyzed using principal components analysis (PCA), as well as principal components (PC) and partial least squares (PLS) regressions. PCA indicated a high degree of association for most parameters with the first PC, which explained a high percentage (>40%) of the variation in the data, suggesting strong statistical relationships among most of the parameters in the system. Regression analyses identified 8 parameters (set as independent variables) that were most frequently retained for modeling
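The PCA step can be sketched on synthetic data with one dominant latent factor, mimicking the situation where PC1 explains well over 40% of the variance. The parameter counts, loadings and noise level below are invented for illustration, not the study's measurements:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in for correlated water-chemistry parameters:
# one latent factor (think overall leachate strength) drives
# several measured variables, plus measurement noise.
n, p = 84, 6
latent = rng.normal(size=(n, 1))
X = latent @ rng.uniform(0.7, 1.3, size=(1, p)) + 0.5 * rng.normal(size=(n, p))

# PCA via SVD of the standardized data matrix.
Z = (X - X.mean(0)) / X.std(0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s**2 / (s**2).sum()   # variance fraction per component
loadings = Vt[0]                  # association of each parameter with PC1
print(explained[0])               # first PC dominates
```

When most parameters load heavily on PC1, as in the study, the first component acts as a summary index of overall system state, which is what makes the subsequent PC and PLS regressions on a handful of parameters feasible.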

  3. Morphology of Laplacian growth processes and statistics of equivalent many-body systems

    International Nuclear Information System (INIS)

    Blumenfeld, R.

    1994-01-01

The author proposes a theory for the nonlinear evolution of two-dimensional interfaces in Laplacian fields. The growing region is conformally mapped onto the unit disk, generating an equivalent many-body system whose dynamics and statistics are studied. The process is shown to be Hamiltonian, with the Hamiltonian being the imaginary part of the complex electrostatic potential. Surface effects are introduced through the Hamiltonian as an external field. An extension to a continuous density of particles is presented. The results are used to study the morphology of the interface using statistical mechanics for the many-body system. The distribution of the curvature and the moments of the growth probability along the interface are calculated exactly from the distribution of the particles. In the dilute limit, the distribution of the curvature is shown to develop algebraic tails, which may, for the first time, explain the origin of fractality in diffusion controlled processes

  4. Industrial commodity statistics yearbook 2001. Production statistics (1992-2001)

    International Nuclear Information System (INIS)

    2003-01-01

    This is the thirty-fifth in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to Industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics, by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics, while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title The Growth of World Industry and the next eight editions under the title Yearbook of Industrial Statistics. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. The statistics refer to the ten-year period 1992-2001 for about 200 countries and areas

  5. Industrial commodity statistics yearbook 2002. Production statistics (1993-2002)

    International Nuclear Information System (INIS)

    2004-01-01

    This is the thirty-sixth in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to Industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics, by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics, while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title 'The Growth of World Industry' and the next eight editions under the title 'Yearbook of Industrial Statistics'. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. The statistics refer to the ten-year period 1993-2002 for about 200 countries and areas

  6. Industrial commodity statistics yearbook 2000. Production statistics (1991-2000)

    International Nuclear Information System (INIS)

    2002-01-01

    This is the thirty-third in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to Industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics, by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics, while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title The Growth of World Industry and the next eight editions under the title Yearbook of Industrial Statistics. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. Most of the statistics refer to the ten-year period 1991-2000 for about 200 countries and areas

  7. Statistical methods for including two-body forces in large system calculations

    International Nuclear Information System (INIS)

    Grimes, S.M.

    1980-07-01

    Large systems of interacting particles are often treated by assuming that the effect on any one particle of the remaining N-1 may be approximated by an average potential. This approach reduces the problem to that of finding the bound-state solutions for a particle in a potential; statistical mechanics is then used to obtain the properties of the many-body system. In some physical systems this approach may not be acceptable, because the two-body force component cannot be treated in this one-body limit. A technique for incorporating two-body forces in such calculations in a more realistic fashion is described. 1 figure

  8. ASYMPTOTIC COMPARISONS OF U-STATISTICS, V-STATISTICS AND LIMITS OF BAYES ESTIMATES BY DEFICIENCIES

    OpenAIRE

    Toshifumi, Nomachi; Hajime, Yamato; Graduate School of Science and Engineering, Kagoshima University:Miyakonojo College of Technology; Faculty of Science, Kagoshima University

    2001-01-01

    As estimators of estimable parameters, we consider three statistics: the U-statistic, the V-statistic, and the limit of the Bayes estimate. This limit of the Bayes estimate, called the LB-statistic in this paper, is obtained from the Bayes estimate of an estimable parameter based on the Dirichlet process, by letting its parameter tend to zero. For an estimable parameter with a non-degenerate kernel, the asymptotic relative efficiencies of the LB-statistic with respect to the U-statistic and V-statistic and that of the V-statistic w...

  9. GAME-RELATED STATISTICS THAT DISCRIMINATED WINNING, DRAWING AND LOSING TEAMS FROM THE SPANISH SOCCER LEAGUE

    Directory of Open Access Journals (Sweden)

    Carlos Lago-Peñas

    2010-06-01

    Full Text Available The aim of the present study was to analyze men's football competitions in order to identify which game-related statistics discriminate between winning, drawing and losing teams. The sample corresponded to 380 games from the 2008-2009 season of the Spanish Men's Professional League. The game-related statistics gathered were: total shots, shots on goal, effectiveness, assists, crosses, offsides committed and received, corners, ball possession, crosses against, fouls committed and received, corners against, yellow and red cards, and venue. A univariate (t-test) and multivariate (discriminant) analysis of the data was done. The results showed that winning teams had averages that were significantly higher for the following game statistics: total shots (p < 0.001), shots on goal (p < 0.01), effectiveness (p < 0.01), assists (p < 0.01), offsides committed (p < 0.01) and crosses against (p < 0.01). Losing teams had significantly higher averages for crosses (p < 0.01), offsides received (p < 0.01) and red cards (p < 0.01). The discriminant analysis led to the following conclusion: the variables that discriminate between winning, drawing and losing teams were total shots, shots on goal, crosses, crosses against, ball possession and venue. Coaches and players should be aware of these different profiles in order to increase knowledge about the game's cognitive and motor demands and, therefore, to evaluate specificity at the time of practice and game planning

  10. Relationship between physical fitness and game-related statistics in elite professional basketball players: Regular season vs. playoffs

    Directory of Open Access Journals (Sweden)

    João Henrique Gomes

    2017-05-01

    Full Text Available Abstract AIMS This study aimed to verify the relationship between anthropometric and physical performance variables and game-related statistics in professional elite basketball players during a competition. METHODS Eleven male basketball players were evaluated over 10 weeks at two distinct moments (regular season and playoffs). Overall, 11 variables of physical fitness and 13 variables of game-related statistics were analysed. RESULTS The following significant Pearson's correlations were found in the regular season: percentage of fat mass with assists (r = -0.62) and steals (r = -0.63); height (r = 0.68), lean mass (r = 0.64), and maximum strength (r = 0.67) with blocks; squat jump with steals (r = 0.63); and time in the T-test with successful two-point field-goals (r = -0.65), successful free-throws (r = -0.61), and steals (r = -0.62). However, in the playoffs, only stature and lean mass maintained these correlations (p ≤ 0.05). CONCLUSIONS The anthropometric and physical characteristics of the players showed few correlations with the game-related statistics in the regular season, and these correlations were even lower in the playoff games of a professional elite championship; they are therefore not good predictors of technical performance.
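    Correlations such as the r = -0.62 reported between fat mass and assists are plain Pearson product-moment coefficients. A minimal sketch, with invented numbers for eleven hypothetical players (not the study's data), shows the computation:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float(xc @ yc / np.sqrt((xc @ xc) * (yc @ yc)))

# Hypothetical values for 11 players: percentage body fat vs. assists per game.
fat_mass = [9.5, 11.2, 8.7, 14.1, 10.3, 12.8, 9.9, 13.5, 8.2, 11.9, 10.6]
assists  = [4.1, 3.2, 4.8, 1.9, 3.6, 2.4, 4.0, 2.1, 5.2, 2.9, 3.3]

r = pearson_r(fat_mass, assists)
print(f"r = {r:.2f}")  # strongly negative for these invented values
```

    With only 11 players, as in the study, such coefficients carry wide confidence intervals, which is one reason the regular-season correlations could vanish in the playoffs.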

  11. Statistical models for the analysis of water distribution system pipe break data

    International Nuclear Information System (INIS)

    Yamijala, Shridhar; Guikema, Seth D.; Brumbelow, Kelly

    2009-01-01

    The deterioration of pipes leading to pipe breaks and leaks in urban water distribution systems is of concern to water utilities throughout the world. Pipe breaks and leaks may result in reduction in the water-carrying capacity of the pipes and contamination of water in the distribution systems. Water utilities incur large expenses in the replacement and rehabilitation of water mains, making it critical to evaluate the current and future condition of the system for maintenance decision-making. This paper compares different statistical regression models proposed in the literature for estimating the reliability of pipes in a water distribution system on the basis of short time histories. The goals of these models are to estimate the likelihood of pipe breaks in the future and determine the parameters that most affect the likelihood of pipe breaks. The data set used for the analysis comes from a major US city, and these data include approximately 85,000 pipe segments with nearly 2500 breaks from 2000 through 2005. The results show that the set of statistical models previously proposed for this problem do not provide good estimates with the test data set. However, logistic generalized linear models do provide good estimates of pipe reliability and can be useful for water utilities in planning pipe inspection and maintenance
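    The logistic generalized linear model that the abstract singles out as performing well models the break probability of a pipe segment as a sigmoid of a linear predictor. The sketch below fits such a model by Newton-Raphson (IRLS) on synthetic data with a single covariate, pipe age; the data and coefficients are illustrative, not the city's.

```python
import numpy as np

def fit_logistic(X, y, n_iter=25):
    """Fit P(y=1) = sigmoid(X @ beta) by Newton-Raphson (IRLS)."""
    X = np.column_stack([np.ones(len(X)), X])  # prepend intercept column
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-np.clip(X @ beta, -30, 30)))
        W = p * (1 - p)
        # Newton step: beta += (X' W X)^{-1} X' (y - p)
        beta += np.linalg.solve(X.T @ (X * W[:, None]), X.T @ (y - p))
    return beta

# Synthetic pipe segments: age in years; y = 1 if the segment broke.
rng = np.random.default_rng(1)
age = rng.uniform(0, 60, 500)
true_logit = -4.0 + 0.08 * age  # assumed: older pipes break more often
y = (rng.random(500) < 1 / (1 + np.exp(-true_logit))).astype(float)

beta = fit_logistic(age[:, None], y)
print(f"intercept {beta[0]:.2f}, age coefficient {beta[1]:.3f}")
```

    The fitted age coefficient recovers the assumed positive effect, which is the kind of parameter a utility would read off to rank covariates driving break likelihood.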

  12. Statistical relation between particle contaminations in ultra pure water and defects generated by process tools

    NARCIS (Netherlands)

    Wali, F.; Knotter, D. Martin; Wortelboer, Ronald; Mud, Auke

    2007-01-01

    Ultra pure water supplied inside the fab is used in different tools at different stages of processing. Data on the particles measured in ultra pure water were compared with the defect density on wafers processed in these tools, and a statistical relation was found. Keywords: yield, defect density,

  13. Solvability of a class of systems of infinite-dimensional integral equations and their application in statistical mechanics

    International Nuclear Information System (INIS)

    Gonchar, N.S.

    1986-01-01

    This paper presents a mathematical method developed for investigating a class of systems of infinite-dimensional integral equations which have application in statistical mechanics. Necessary and sufficient conditions are obtained for the uniqueness and bifurcation of the solution of this class of systems of equations. Problems of equilibrium statistical mechanics are considered on the basis of this method

  14. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    Science.gov (United States)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to conduct a study that assesses the process improvement, quality management and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs at U.S. colleges and universities, techniques that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost and predict future outcomes. The research study identifies and provides a detailed discussion of the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis which identifies the gaps between the SEI's 'healthy ingredients' of a process performance model and courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization

  15. Safety-related control air systems

    International Nuclear Information System (INIS)

    Anon.

    1977-01-01

    This Standard applies to those portions of the control air system that furnish air required to support, control, or operate systems or portions of systems that are safety related in nuclear power plants. This Standard relates only to the air supply system(s) for safety-related air operated devices and does not apply to the safety-related air operated device or to air operated actuators for such devices. The objectives of this Standard are to provide (1) minimum system design requirements for equipment, piping, instruments, controls, and wiring that constitute the air supply system; and (2) the system and component testing and maintenance requirements

  16. [Design and implementation of online statistical analysis function in information system of air pollution and health impact monitoring].

    Science.gov (United States)

    Lü, Yiran; Hao, Shuxin; Zhang, Guoqing; Liu, Jie; Liu, Yue; Xu, Dongqun

    2018-01-01

    To implement an online statistical analysis function in the information system for air pollution and health impact monitoring, and to obtain data analysis results in real time. Descriptive statistics, time-series analysis and multivariate regression analysis were implemented online using SQL and visualization tools on top of the database software. The system generates basic statistical tables and summary tables of air pollution exposure and health impact data online; generates trend charts for each data module online, with interactive queries against the database; and generates export tables that can be loaded directly into R, SAS and SPSS. The information system for air pollution and health impact monitoring thus implements the statistical analysis function online and can provide real-time analysis results to its users.

  17. Fractional exclusion statistics: the method for describing interacting particle systems as ideal gases

    International Nuclear Information System (INIS)

    Anghel, Dragoş-Victor

    2012-01-01

    I show that if the total energy of a system of interacting particles may be written as a sum of quasiparticle energies, then the system of quasiparticles can be viewed, in general, as an ideal gas with fractional exclusion statistics (FES). The general method for calculating the FES parameters is also provided. The interacting particle system cannot be described as an ideal gas of Bose and Fermi quasiparticles except in trivial situations.

  18. Networking—a statistical physics perspective

    Science.gov (United States)

    Yeung, Chi Ho; Saad, David

    2013-03-01

    Networking encompasses a variety of tasks related to the communication of information on networks; it has a substantial economic and societal impact on a broad range of areas including transportation systems, wired and wireless communications and a range of Internet applications. As transportation and communication networks become increasingly more complex, the ever increasing demand for congestion control, higher traffic capacity, quality of service, robustness and reduced energy consumption requires new tools and methods to meet these conflicting requirements. The new methodology should serve for gaining better understanding of the properties of networking systems at the macroscopic level, as well as for the development of new principled optimization and management algorithms at the microscopic level. Methods of statistical physics seem best placed to provide new approaches as they have been developed specifically to deal with nonlinear large-scale systems. This review aims at presenting an overview of tools and methods that have been developed within the statistical physics community and that can be readily applied to address the emerging problems in networking. These include diffusion processes, methods from disordered systems and polymer physics, probabilistic inference, which have direct relevance to network routing, file and frequency distribution, the exploration of network structures and vulnerability, and various other practical networking applications.

  19. Networking—a statistical physics perspective

    International Nuclear Information System (INIS)

    Yeung, Chi Ho; Saad, David

    2013-01-01

    Networking encompasses a variety of tasks related to the communication of information on networks; it has a substantial economic and societal impact on a broad range of areas including transportation systems, wired and wireless communications and a range of Internet applications. As transportation and communication networks become increasingly more complex, the ever increasing demand for congestion control, higher traffic capacity, quality of service, robustness and reduced energy consumption requires new tools and methods to meet these conflicting requirements. The new methodology should serve for gaining better understanding of the properties of networking systems at the macroscopic level, as well as for the development of new principled optimization and management algorithms at the microscopic level. Methods of statistical physics seem best placed to provide new approaches as they have been developed specifically to deal with nonlinear large-scale systems. This review aims at presenting an overview of tools and methods that have been developed within the statistical physics community and that can be readily applied to address the emerging problems in networking. These include diffusion processes, methods from disordered systems and polymer physics, probabilistic inference, which have direct relevance to network routing, file and frequency distribution, the exploration of network structures and vulnerability, and various other practical networking applications. (topical review)

  20. Statistical Requirements For Pass-Fail Testing Of Contraband Detection Systems

    International Nuclear Information System (INIS)

    Gilliam, David M.

    2011-01-01

    Contraband detection systems for homeland security applications are typically tested for probability of detection (PD) and probability of false alarm (PFA) using pass-fail testing protocols. Test protocols usually require specified values for PD and PFA to be demonstrated at a specified level of statistical confidence CL. Based on a recent more theoretical treatment of this subject [1], this summary reviews the definition of CL and provides formulas and spreadsheet functions for constructing tables of general test requirements and for determining the minimum number of tests required. The formulas and tables in this article may be generally applied to many other applications of pass-fail testing, in addition to testing of contraband detection systems.
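    The minimum number of trials described above follows from the binomial distribution: demonstrating PD ≥ p at confidence CL, while tolerating at most k missed detections, requires the smallest n for which the chance of passing with a true PD equal to the specification is at most 1 - CL. A small sketch (my own implementation of this standard calculation, not code from the cited work):

```python
from math import ceil, comb, log

def min_tests(pd_spec, cl, max_failures=0):
    """Smallest n such that observing at most `max_failures` misses in n
    trials demonstrates PD >= pd_spec at confidence level cl, i.e.
    P(<= max_failures misses | PD == pd_spec) <= 1 - cl."""
    n = max_failures + 1
    while True:
        p_pass = sum(comb(n, i) * (1 - pd_spec)**i * pd_spec**(n - i)
                     for i in range(max_failures + 1))
        if p_pass <= 1 - cl:
            return n
        n += 1

# The zero-failure case has the closed form n = ceil(ln(1-CL) / ln(PD)).
print(min_tests(0.95, 0.95))             # zero-failure requirement
print(ceil(log(1 - 0.95) / log(0.95)))   # same answer from the closed form
```

    Allowing even one failure raises the required trial count substantially, which is why pass-fail protocols fix the tolerated failure count in advance.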

  1. Statistical characterization of the standard map

    Science.gov (United States)

    Ruiz, Guiomar; Tirnakli, Ugur; Borges, Ernesto P.; Tsallis, Constantino

    2017-06-01

    The standard map, a paradigmatic conservative system in the (x, p) phase space, has recently been shown (Tirnakli and Borges (2016 Sci. Rep. 6 23644)) to exhibit interesting statistical behaviors directly related to the value of the standard map external parameter K. A comprehensive statistical numerical description is achieved in the present paper. More precisely, for large values of K (e.g. K = 10), where the Lyapunov exponents are neatly positive over virtually the entire phase space, consistently with Boltzmann-Gibbs (BG) statistics, we verify that the q-generalized indices related to the entropy production q_ent, the sensitivity to initial conditions q_sen, the distribution of a time-averaged (over successive iterations) phase-space coordinate q_stat, and the relaxation to the equilibrium final state q_rel collapse onto a fixed point, i.e. q_ent = q_sen = q_stat = q_rel = 1. In remarkable contrast, for small values of K (e.g. K = 0.2), where the Lyapunov exponents are virtually zero over the entire phase space, we verify q_ent = q_sen = 0, q_stat ≃ 1.935, and q_rel ≃ 1.4. The situation corresponding to intermediate values of K, where both stable orbits and a chaotic sea are present, is discussed as well. The present results transparently illustrate when BG behavior and/or q-statistical behavior are observed.
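    The two regimes contrasted in the abstract are easy to reproduce: the Chirikov standard map is a two-line iteration on the unit torus. The sketch below (an independent illustration, not the paper's code; the initial condition is arbitrary) iterates the map at K = 10 and K = 0.2 and uses the spread of the momentum as a crude chaos diagnostic:

```python
import numpy as np

def standard_map_p(x, p, K, n_steps):
    """Iterate the Chirikov standard map on the unit torus, recording p:
         p_{n+1} = p_n + (K / (2*pi)) * sin(2*pi*x_n)   (mod 1)
         x_{n+1} = x_n + p_{n+1}                        (mod 1)
    """
    ps = np.empty(n_steps)
    for i in range(n_steps):
        p = (p + K / (2 * np.pi) * np.sin(2 * np.pi * x)) % 1.0
        x = (x + p) % 1.0
        ps[i] = p
    return ps

# Strongly chaotic (K = 10) vs. near-integrable (K = 0.2) regimes.
ps_chaotic = standard_map_p(0.2, 0.35, 10.0, 5000)
ps_regular = standard_map_p(0.2, 0.35, 0.2, 5000)

# A chaotic orbit wanders over the whole momentum interval, while a
# regular orbit stays confined to a narrow band around its KAM torus.
print(ps_chaotic.std(), ps_regular.std())
```

    The near-uniform momentum spread at K = 10 versus the narrow band at K = 0.2 is the numerical face of the BG versus q-statistical regimes the paper quantifies.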

  2. Long-Term Propagation Statistics and Availability Performance Assessment for Simulated Terrestrial Hybrid FSO/RF System

    Directory of Open Access Journals (Sweden)

    Fiser Ondrej

    2011-01-01

    Full Text Available Long-term monthly and annual statistics of the attenuation of electromagnetic waves that have been obtained from 6 years of measurements on a free space optical path, 853 meters long, with a wavelength of 850 nm and on a precisely parallel radio path with a frequency of 58 GHz are presented. All the attenuation events observed are systematically classified according to the hydrometeor type causing the particular event. Monthly and yearly propagation statistics on the free space optical path and radio path are obtained. The influence of individual hydrometeors on attenuation is analysed. The obtained propagation statistics are compared to the calculated statistics using ITU-R models. The calculated attenuation statistics both at 850 nm and 58 GHz underestimate the measured statistics for higher attenuation levels. The availability performance of a simulated hybrid FSO/RF system is analysed based on the measured data.

  3. Principles of statistics

    CERN Document Server

    Bulmer, M G

    1979-01-01

    There are many textbooks which describe current methods of statistical analysis, while neglecting related theory. There are equally many advanced textbooks which delve into the far reaches of statistical theory, while bypassing practical applications. But between these two approaches is an unfilled gap, in which theory and practice merge at an intermediate level. Professor M. G. Bulmer's Principles of Statistics, originally published in 1965, was created to fill that need. The new, corrected Dover edition of Principles of Statistics makes this invaluable mid-level text available once again fo

  4. Higher order capacity statistics of multi-hop transmission systems over Rayleigh fading channels

    KAUST Repository

    Yilmaz, Ferkan; Tabassum, Hina; Alouini, Mohamed-Slim

    2012-01-01

    In this paper, we present an exact analytical expression to evaluate the higher order statistics of the channel capacity for amplify and forward (AF) multihop transmission systems operating over Rayleigh fading channels. Furthermore, we present

  5. Sub-Poissonian statistics in order-to-chaos transition

    International Nuclear Information System (INIS)

    Kryuchkyan, Gagik Yu.; Manvelyan, Suren B.

    2003-01-01

    We study the phenomena at the overlap of quantum chaos and nonclassical statistics for a time-dependent model of a nonlinear oscillator. It is shown, in the framework of the Mandel Q parameter and the Wigner function, that the statistics of oscillatory excitation numbers is drastically changed in the order-to-chaos transition. An essential improvement of sub-Poissonian statistics, in comparison with that of the standard model of the driven anharmonic oscillator, is observed for the regular operational regime. It is shown that in the chaotic regime the system exhibits ranges of sub-Poissonian and super-Poissonian statistics which alternate with each other depending on the time interval. An unusual dependence of the variance of the oscillatory number on the external noise level is observed for the chaotic dynamics. The scaling invariance of the quantum statistics is demonstrated and its relation to dissipation and decoherence is studied

  6. STATLIB, Interactive Statistics Program Library of Tutorial System

    International Nuclear Information System (INIS)

    Anderson, H.E.

    1986-01-01

    1 - Description of program or function: STATLIB is a conversational statistical program library developed in conjunction with a Sandia National Laboratories applied statistics course intended for practicing engineers and scientists. STATLIB is a group of 15 interactive, argument-free, statistical routines. Included are analysis of sensitivity tests; sample statistics for the normal, exponential, hypergeometric, Weibull, and extreme value distributions; three models of multiple regression analysis; x-y data plots; exact probabilities for RxC tables; n sets of m permuted integers in the range 1 to m; simple linear regression and correlation; K different random integers in the range m to n; and Fisher's exact test of independence for a 2 by 2 contingency table. Forty-five other subroutines in the library support the basic 15

  7. Level and width statistics for a decaying chaotic system

    International Nuclear Information System (INIS)

    Mizutori, S.; Zelevinsky, V.G.

    1993-01-01

    The random matrix ensemble of discretized effective non-hermitian hamiltonians is used for studying local correlations and fluctuations of energies and widths in a quantum system where intrinsic levels are coupled to the continuum via a common decay channel. With the use of analytical estimates and numerical simulations, generic properties of statistical observables are obtained for the regimes of weak and strong continuum coupling as well as for the transitional region. Typical signals of the transition (width collectivization, disappearance of level repulsion at small spacings and violation of uniformity along the energy axis) are discussed quantitatively. (orig.)

  8. A multivariate statistical study on a diversified data gathering system for nuclear power plants

    International Nuclear Information System (INIS)

    Samanta, P.K.; Teichmann, T.; Levine, M.M.; Kato, W.Y.

    1989-02-01

    In this report, multivariate statistical methods are presented and applied to demonstrate their use in analyzing nuclear power plant operational data. For analyses of nuclear power plant events, approaches are presented for detecting malfunctions and degradations within the course of the event. At the system level, approaches are investigated as a means of diagnosis of system level performance. This involves the detection of deviations from normal performance of the system. The input data analyzed are the measurable physical parameters, such as steam generator level, pressurizer water level, auxiliary feedwater flow, etc. The study provides the methodology and illustrative examples based on data gathered from simulation of nuclear power plant transients and computer simulation of a plant system performance (due to lack of easily accessible operational data). Such an approach, once fully developed, can be used to explore statistically the detection of failure trends and patterns and prevention of conditions with serious safety implications. 33 refs., 18 figs., 9 tabs
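    One standard multivariate way to flag the kind of deviation from normal system performance described above is the Hotelling T² (squared Mahalanobis) statistic, which scores an observation against the mean and covariance of normal-operation data. The sketch below uses simulated data with illustrative variable choices (the signal names echo the report's examples but the numbers are invented):

```python
import numpy as np

# Simulated "normal operation" training data for three correlated plant
# signals (e.g. steam generator level, pressurizer level, feedwater flow;
# the means, units and covariance here are illustrative only).
rng = np.random.default_rng(7)
cov = np.array([[1.0, 0.6, 0.3],
                [0.6, 1.0, 0.5],
                [0.3, 0.5, 1.0]])
normal = rng.multivariate_normal(mean=[50.0, 30.0, 120.0], cov=cov, size=400)

mu = normal.mean(axis=0)
S_inv = np.linalg.inv(np.cov(normal, rowvar=False))

def t_squared(x):
    """Hotelling T^2 distance of one observation from normal operation."""
    d = np.asarray(x) - mu
    return float(d @ S_inv @ d)

t2_normal = t_squared([50.2, 29.9, 119.8])  # close to the training mean
t2_fault  = t_squared([50.2, 29.9, 112.0])  # third signal deviates
print(t2_normal, t2_fault)
```

    Comparing T² against a threshold from its reference distribution is what turns "deviation from normal performance" into a concrete alarm rule.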

  9. Classicality condition on a system observable in a quantum measurement and a relative-entropy conservation law

    Science.gov (United States)

    Kuramochi, Yui; Ueda, Masahito

    2015-03-01

    We consider the information flow on a system observable X corresponding to a positive-operator-valued measure under a quantum measurement process Y described by a completely positive instrument from the viewpoint of the relative entropy. We establish a sufficient condition for the relative-entropy conservation law which states that the average decrease in the relative entropy of the system observable X equals the relative entropy of the measurement outcome of Y , i.e., the information gain due to measurement. This sufficient condition is interpreted as an assumption of classicality in the sense that there exists a sufficient statistic in a joint successive measurement of Y followed by X such that the probability distribution of the statistic coincides with that of a single measurement of X for the premeasurement state. We show that in the case when X is a discrete projection-valued measure and Y is discrete, the classicality condition is equivalent to the relative-entropy conservation for arbitrary states. The general theory on the relative-entropy conservation is applied to typical quantum measurement models, namely, quantum nondemolition measurement, destructive sharp measurements on two-level systems, a photon counting, a quantum counting, homodyne and heterodyne measurements. These examples except for the nondemolition and photon-counting measurements do not satisfy the known Shannon-entropy conservation law proposed by Ban [M. Ban, J. Phys. A: Math. Gen. 32, 1643 (1999), 10.1088/0305-4470/32/9/012], implying that our approach based on the relative entropy is applicable to a wider class of quantum measurements.

  10. Perception in statistical graphics

    Science.gov (United States)

    VanderPlas, Susan Ruth

    There has been quite a bit of research on statistical graphics and visualization, generally focused on new types of graphics, new software to create graphics, interactivity, and usability studies. Our ability to interpret and use statistical graphics hinges on the interface between the graph itself and the brain that perceives and interprets it, and there is substantially less research on the interplay between graph, eye, brain, and mind than is sufficient to understand the nature of these relationships. The goal of the work presented here is to further explore the interplay between a static graph, the translation of that graph from paper to mental representation (the journey from eye to brain), and the mental processes that operate on that graph once it is transferred into memory (mind). Understanding the perception of statistical graphics should allow researchers to create more effective graphs which produce fewer distortions and viewer errors while reducing the cognitive load necessary to understand the information presented in the graph. Taken together, these experiments should lay a foundation for exploring the perception of statistical graphics. There has been considerable research into the accuracy of numerical judgments viewers make from graphs, and these studies are useful, but it is more effective to understand how errors in these judgments occur so that the root cause of the error can be addressed directly. Understanding how visual reasoning relates to the ability to make judgments from graphs allows us to tailor graphics to particular target audiences. In addition, understanding the hierarchy of salient features in statistical graphics allows us to clearly communicate the important message from data or statistical models by constructing graphics which are designed specifically for the perceptual system.

  11. Reliable fault detection and diagnosis of photovoltaic systems based on statistical monitoring approaches

    KAUST Repository

    Harrou, Fouzi; Sun, Ying; Taghezouit, Bilal; Saidi, Ahmed; Hamlati, Mohamed-Elkarim

    2017-01-01

    This study reports the development of an innovative fault detection and diagnosis scheme to monitor the direct current (DC) side of photovoltaic (PV) systems. Towards this end, we propose a statistical approach that exploits the advantages of one

  12. Methods of statistical physics

    CERN Document Server

    Akhiezer, Aleksandr I

    1981-01-01

    Methods of Statistical Physics is an exposition of the tools of statistical mechanics, which evaluates the kinetic equations of classical and quantized systems. The book also analyzes the equations of macroscopic physics, such as the equations of hydrodynamics for normal and superfluid liquids and macroscopic electrodynamics. The text gives particular attention to the study of quantum systems. This study begins with a discussion of problems of quantum statistics with a detailed description of the basics of quantum mechanics along with the theory of measurement. An analysis of the asymptotic be

  13. Energy statistics: A manual for developing countries

    International Nuclear Information System (INIS)

    1991-01-01

    Considerable advances have been made by developing countries during the last 20 years in the collection and compilation of energy statistics. The present Manual is a guide which, it is hoped, will be used in countries whose system of statistics is less advanced to identify the main areas that should be developed and how this might be achieved. The generally accepted aim is for countries to be able to compile statistics annually on the main characteristics shown for each fuel, and for energy in total. These characteristics are mainly concerned with production, supply and consumption, but others relating to the size and capabilities of the different energy industries may also be of considerable importance. The initial task of collecting data from the energy industries (mines, oil producers, refineries and distributors, electrical power stations, etc.) may well fall to a number of organizations. "Energy" from a statistical point of view is the sum of the component fuels, and good energy statistics are therefore dependent on good fuel statistics. For this reason a considerable part of this Manual is devoted to the production of regular, comprehensive and reliable statistics relating to individual fuels. Chapters V to IX of this Manual are concerned with identifying the flows of energy, from production to final consumption, for each individual fuel, and how data on these flows might be expected to be obtained. The very different problems concerned with the collection of data on the flows for biomass fuels are covered in chapter X. The data needed to complete the picture of the national scene for each individual fuel, more concerned with describing the size, capabilities and efficiency of the industries related to that fuel, are discussed in chapter XI. Annex I sets out the relationships between the classifications of the various types of fuels. The compilation of energy balances from the data obtained for individual fuels is covered in chapter XIII. Finally, chapter

  14. Side effect of acting on the world: Acquisition of action-outcome statistic relation alters visual interpretation of action outcome

    Directory of Open Access Journals (Sweden)

    Takahiro Kawabe

    2013-09-01

    Humans can acquire the statistical features of the external world and employ them to control behaviors. Some external events occur in harmony with an agent's action, and thus humans should also be able to acquire the statistical features relating an action to its external outcome. We report that the acquired action-outcome statistical features alter the visual appearance of the action outcome. Pressing either of two assigned keys triggered visual motion whose direction was statistically biased either upward or downward, and observers judged the stimulus motion direction. Points of subjective equality (PSE) for judging motion direction were shifted repulsively from the mean of the distribution associated with each key. Our Bayesian model accounted for the PSE shifts, indicating the optimal acquisition of the action-effect statistical relation. The PSE shifts were moderately attenuated when the action-outcome contingency was reduced, and the Bayesian model again accounted for the attenuated shifts. On the other hand, when the action-outcome contiguity was greatly reduced, the PSE shifts were greatly attenuated, and the Bayesian model could not account for them. The results indicate that visual appearance can be modified by prediction based on the optimal acquisition of the action-effect causal relation.
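
    The Bayesian account can be sketched with a toy Gaussian cue-combination model (an illustrative assumption, not the authors' exact model): a percept attracted toward the learned action-outcome prior yields a PSE that is shifted repulsively away from the prior mean.

```python
# Toy model: the perceived motion is the posterior mean of a Gaussian prior
# (the learned action-outcome statistics) combined with a Gaussian
# likelihood (the noisy sensory measurement of the physical stimulus).
def posterior_mean(x, sigma_like, mu_prior, sigma_prior):
    w = sigma_prior**2 / (sigma_prior**2 + sigma_like**2)  # weight on data
    return w * x + (1 - w) * mu_prior

mu_prior = 2.0      # hypothetical: key associated with upward-biased motion
sigma_prior = 1.0
sigma_like = 1.0

# PSE: the physical velocity x* at which the percept is balanced
# (posterior mean = 0). Solving w*x + (1-w)*mu_prior = 0 gives
# x* = -((1 - w)/w) * mu_prior, i.e., opposite in sign to the prior mean.
w = sigma_prior**2 / (sigma_prior**2 + sigma_like**2)
pse = -((1 - w) / w) * mu_prior
print(pse)  # negative: shifted away from (repulsive to) the prior mean
```

    In this sketch, weakening the contingency can be modeled as broadening the prior (larger sigma_prior), which shrinks the predicted shift, in line with the attenuation the abstract reports.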

  15. Path integral molecular dynamics for exact quantum statistics of multi-electronic-state systems.

    Science.gov (United States)

    Liu, Xinzijian; Liu, Jian

    2018-03-14

    An exact approach to compute physical properties for general multi-electronic-state (MES) systems in thermal equilibrium is presented. The approach is extended from our recent progress on path integral molecular dynamics (PIMD), Liu et al. [J. Chem. Phys. 145, 024103 (2016)] and Zhang et al. [J. Chem. Phys. 147, 034109 (2017)], for quantum statistical mechanics when a single potential energy surface is involved. We first define an effective potential function that is numerically favorable for MES-PIMD and then derive corresponding estimators in MES-PIMD for evaluating various physical properties. Its application to several representative one-dimensional and multi-dimensional models demonstrates that MES-PIMD in principle offers a practical tool in either of the diabatic and adiabatic representations for studying exact quantum statistics of complex/large MES systems when the Born-Oppenheimer approximation, Condon approximation, and harmonic bath approximation are broken.

  16. AP statistics

    CERN Document Server

    Levine-Wissing, Robin

    2012-01-01

    All Access for the AP® Statistics Exam Book + Web + Mobile Everything you need to prepare for the Advanced Placement® exam, in a study system built around you! There are many different ways to prepare for an Advanced Placement® exam. What's best for you depends on how much time you have to study and how comfortable you are with the subject matter. To score your highest, you need a system that can be customized to fit you: your schedule, your learning style, and your current level of knowledge. This book, and the online tools that come with it, will help you personalize your AP® Statistics prep

  17. Statistical modeling of the mother-baby system in newborn infants with cerebral ischemia

    Directory of Open Access Journals (Sweden)

    A. V. Filonenko

    2014-01-01

    The statistical model could consider the influence of specific maternal psychoemotional and personality factors on a newborn with cerebral ischemia and help develop a procedure to prevent negative consequences of postpartum depression in the mother-baby system.

  18. Worry, Intolerance of Uncertainty, and Statistics Anxiety

    Science.gov (United States)

    Williams, Amanda S.

    2013-01-01

    Statistics anxiety is a problem for most graduate students. This study investigates the relationship between intolerance of uncertainty, worry, and statistics anxiety. Intolerance of uncertainty was significantly related to worry, and worry was significantly related to three types of statistics anxiety. Six types of statistics anxiety were…

  19. Equilibration, thermalisation, and the emergence of statistical mechanics in closed quantum systems.

    Science.gov (United States)

    Gogolin, Christian; Eisert, Jens

    2016-05-01

    We review selected advances in the theoretical understanding of complex quantum many-body systems with regard to emergent notions of quantum statistical mechanics. We cover topics such as equilibration and thermalisation in pure state statistical mechanics, the eigenstate thermalisation hypothesis, the equivalence of ensembles, non-equilibration dynamics following global and local quenches as well as ramps. We also address initial state independence, absence of thermalisation, and many-body localisation. We elucidate the role played by key concepts for these phenomena, such as Lieb-Robinson bounds, entanglement growth, typicality arguments, quantum maximum entropy principles and the generalised Gibbs ensembles, and quantum (non-)integrability. We put emphasis on rigorous approaches and present the most important results in a unified language.

  20. Statistical characterization of speckle noise in coherent imaging systems

    Science.gov (United States)

    Yaroslavsky, Leonid; Shefler, A.

    2003-05-01

    Speckle noise imposes a fundamental limitation on image quality in coherent-radiation-based imaging and optical metrology systems. Speckle noise phenomena are associated with the property of objects to diffusely scatter irradiation and with the fact that, in recording the wave field, a number of signal distortions inevitably occur due to technical limitations inherent to hologram sensors. The statistical theory of speckle noise was developed with regard to only the limited resolving power of coherent imaging devices. It is valid only asymptotically, as far as the central limit theorem of probability theory can be applied. In applications this assumption is not always applicable. Moreover, in treating the speckle noise problem one should also consider other sources of hologram deterioration. In the paper, statistical properties of speckle due to the limitation of hologram size, dynamic range and hologram signal quantization are studied by Monte-Carlo simulation for holograms recorded in near and far diffraction zones. The simulation experiments have shown that, for limited resolving power of the imaging system, the widely accepted opinion that speckle contrast is equal to one holds only for a rather severe level of hologram size limitation. For moderate limitations, speckle contrast changes gradually from zero for no limitation to one for limitation to less than about 20% of hologram size. The results obtained for the limitation of the hologram sensor's dynamic range and hologram signal quantization reveal that speckle noise due to these hologram signal distortions is not multiplicative and is directly associated with the severity of the limitation and quantization. On the basis of the simulation results, analytical models are suggested.
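
    The asymptotic, fully developed case the abstract refers to, where the central limit theorem applies and speckle contrast equals one, can be reproduced with a few lines of Monte-Carlo simulation. This is an illustrative sketch, not the authors' hologram-limitation model:

```python
import numpy as np

rng = np.random.default_rng(0)

def speckle_contrast(n_phasors, n_samples=20000):
    """Monte-Carlo estimate of speckle contrast (std/mean of intensity)
    for fully developed speckle: each observation is the intensity of a
    sum of n_phasors unit phasors with independent uniform phases."""
    phases = rng.uniform(0, 2 * np.pi, size=(n_samples, n_phasors))
    field = np.exp(1j * phases).sum(axis=1)
    intensity = np.abs(field) ** 2
    return intensity.std() / intensity.mean()

print(speckle_contrast(100))  # close to 1 for fully developed speckle
```

    The paper's point is precisely that once hologram size, dynamic range, or quantization limits are imposed on the recorded field, this unit-contrast result no longer holds.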

  1. Beyond quantum microcanonical statistics

    International Nuclear Information System (INIS)

    Fresch, Barbara; Moro, Giorgio J.

    2011-01-01

    Descriptions of molecular systems usually refer to two distinct theoretical frameworks. On the one hand the quantum pure state, i.e., the wavefunction, of an isolated system is determined to calculate molecular properties and their time evolution according to the unitary Schroedinger equation. On the other hand a mixed state, i.e., a statistical density matrix, is the standard formalism to account for thermal equilibrium, as postulated in the microcanonical quantum statistics. In the present paper an alternative treatment relying on a statistical analysis of the possible wavefunctions of an isolated system is presented. In analogy with the classical ergodic theory, the time evolution of the wavefunction determines the probability distribution in the phase space pertaining to an isolated system. However, this alone cannot account for a well defined thermodynamical description of the system in the macroscopic limit, unless a suitable probability distribution for the quantum constants of motion is introduced. We present a workable formalism assuring the emergence of typical values of thermodynamic functions, such as the internal energy and the entropy, in the large size limit of the system. This allows the identification of macroscopic properties independently of the specific realization of the quantum state. A description of material systems in agreement with equilibrium thermodynamics is then derived without constraints on the physical constituents and interactions of the system. Furthermore, the canonical statistics is recovered in all generality for the reduced density matrix of a subsystem.
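
    The closing claim, that canonical statistics emerges for the reduced density matrix of a subsystem, can be illustrated numerically. This is a generic canonical-typicality sketch, not the authors' formalism; the dimensions are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)

# Random pure state of a small system (dimension 2) coupled to a large
# "bath" (dimension 200), drawn uniformly from the unit sphere of the
# composite Hilbert space.
d_sys, d_bath = 2, 200
psi = rng.normal(size=(d_sys, d_bath)) + 1j * rng.normal(size=(d_sys, d_bath))
psi /= np.linalg.norm(psi)

# Reduced density matrix of the subsystem: rho = Tr_bath |psi><psi|.
rho = psi @ psi.conj().T

# Canonical typicality: for a typical pure state of a large composite
# system, rho is close to the maximally mixed state I/d_sys.
dist = np.linalg.norm(rho - np.eye(d_sys) / d_sys)
print(dist)  # small, and shrinking as d_bath grows
```

    With a nontrivial Hamiltonian and an energy-shell constraint, the same construction yields a thermal rather than maximally mixed reduced state; the flat case above is the simplest instance.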

  2. Integrating community-based verbal autopsy into civil registration and vital statistics (CRVS): system-level considerations

    Science.gov (United States)

    de Savigny, Don; Riley, Ian; Chandramohan, Daniel; Odhiambo, Frank; Nichols, Erin; Notzon, Sam; AbouZahr, Carla; Mitra, Raj; Cobos Muñoz, Daniel; Firth, Sonja; Maire, Nicolas; Sankoh, Osman; Bronson, Gay; Setel, Philip; Byass, Peter; Jakob, Robert; Boerma, Ties; Lopez, Alan D.

    2017-01-01

    ABSTRACT Background: Reliable and representative cause of death (COD) statistics are essential to inform public health policy, respond to emerging health needs, and document progress towards Sustainable Development Goals. However, less than one-third of deaths worldwide are assigned a cause. Civil registration and vital statistics (CRVS) systems in low- and lower-middle-income countries are failing to provide timely, complete and accurate vital statistics, and it will still be some time before they can provide physician-certified COD for every death. Proposals: Verbal autopsy (VA) is a method to ascertain the probable COD and, although imperfect, it is the best alternative in the absence of medical certification. There is extensive experience with VA in research settings but only a few examples of its use on a large scale. Data collection using electronic questionnaires on mobile devices and computer algorithms to analyse responses and estimate probable COD have increased the potential for VA to be routinely applied in CRVS systems. However, a number of CRVS and health system integration issues should be considered in planning, piloting and implementing a system-wide intervention such as VA. These include addressing the multiplicity of stakeholders and sub-systems involved, integration with existing CRVS work processes and information flows, linking VA results to civil registration records, information technology requirements and data quality assurance. Conclusions: Integrating VA within CRVS systems is not simply a technical undertaking. It will have profound system-wide effects that should be carefully considered when planning for an effective implementation. This paper identifies and discusses the major system-level issues and emerging practices, provides a planning checklist of system-level considerations and proposes an overview for how VA can be integrated into routine CRVS systems. PMID:28137194

  3. Finding the Root Causes of Statistical Inconsistency in Community Earth System Model Output

    Science.gov (United States)

    Milroy, D.; Hammerling, D.; Baker, A. H.

    2017-12-01

    Baker et al (2015) developed the Community Earth System Model Ensemble Consistency Test (CESM-ECT) to provide a metric for software quality assurance by determining statistical consistency between an ensemble of CESM outputs and new test runs. The test has proved useful for detecting statistical difference caused by compiler bugs and errors in physical modules. However, detection is only the necessary first step in finding the causes of statistical difference. The CESM is a vastly complex model comprised of millions of lines of code which is developed and maintained by a large community of software engineers and scientists. Any root cause analysis is correspondingly challenging. We propose a new capability for CESM-ECT: identifying the sections of code that cause statistical distinguishability. The first step is to discover CESM variables that cause CESM-ECT to classify new runs as statistically distinct, which we achieve via Randomized Logistic Regression. Next we use a tool developed to identify CESM components that define or compute the variables found in the first step. Finally, we employ the application Kernel GENerator (KGEN) created in Kim et al (2016) to detect fine-grained floating point differences. We demonstrate an example of the procedure and advance a plan to automate this process in our future work.
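
    The first step described here, discovering which variables drive the classification via Randomized Logistic Regression, can be sketched as stability selection: refit an L1-penalized logistic model on bootstrap resamples and count how often each variable survives. The synthetic data, penalty, and step sizes below are illustrative assumptions, not the CESM-ECT implementation:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic stand-in for ensemble output: 200 runs, 6 variables; only
# variable 0 actually separates "consistent" from "distinguishable" runs.
n, p = 200, 6
X = rng.normal(size=(n, p))
y = (X[:, 0] + 0.3 * rng.normal(size=n) > 0).astype(float)

def fit_l1_logistic(X, y, l1=0.1, lr=0.1, steps=500):
    """L1-penalized logistic regression via proximal gradient descent."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        pred = 1.0 / (1.0 + np.exp(-X @ w))
        w -= lr * (X.T @ (pred - y)) / len(y)     # gradient step
        w = np.sign(w) * np.maximum(np.abs(w) - lr * l1, 0.0)  # soft-threshold
    return w

# Stability selection: count survival frequency over bootstrap resamples.
counts = np.zeros(p)
for _ in range(30):
    idx = rng.integers(0, n, size=n)
    counts += fit_l1_logistic(X[idx], y[idx]) != 0
print(counts / 30)  # variable 0 survives in (nearly) every resample
```

    Variables with high selection frequency are then handed to the component-tracing and KGEN steps the abstract describes.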

  4. Local Finite Density Theory, Statistical Blocking and Color Superconductivity

    OpenAIRE

    Ying, S.

    2000-01-01

    The motivation for the development of a local finite density theory is discussed. One of the problems related to an instability in the baryon number fluctuation of the chiral symmetry breaking phase of the quark system in the local theory is shown to exist. Such an instability problem is removed by taking into account the statistical blocking effects for the quark propagator, which depend on a macroscopic statistical blocking parameter ε. This new framework is then applied to...

  5. Implementing the “Big Data” Concept in Official Statistics

    OpenAIRE

    О. V.

    2017-01-01

    Big data is a huge resource that needs to be used at all levels of economic planning. The article is devoted to the study of the development of the concept of “Big Data” in the world and its impact on the transformation of statistical simulation of economic processes. Statistics at the current stage should take into account the complex system of international economic relations, which functions in the conditions of globalization and brings new forms of economic development in small open ec...

  6. A system for classifying wood-using industries and recording statistics for automatic data processing.

    Science.gov (United States)

    E.W. Fobes; R.W. Rowe

    1968-01-01

    A system for classifying wood-using industries and recording pertinent statistics for automatic data processing is described. Forms and coding instructions for recording data of primary processing plants are included.

  7. National Center for Health Statistics

    Science.gov (United States)

    CDC's National Center for Health Statistics provides national health data resources, including the National Survey of Family Growth, Vital Records, the National Vital Statistics System, the National Death Index, and Vital Statistics Rapid Release.

  8. Statistical Outlier Detection for Jury Based Grading Systems

    DEFF Research Database (Denmark)

    Thompson, Mary Kathryn; Clemmensen, Line Katrine Harder; Rosas, Harvey

    2013-01-01

    This paper presents an algorithm that was developed to identify statistical outliers from the scores of grading jury members in a large project-based first year design course. The background and requirements for the outlier detection system are presented. The outlier detection algorithm and the follow-up procedures for score validation and appeals are described in detail. Finally, the impact of various elements of the outlier detection algorithm, their interactions, and the sensitivity of their numerical values are investigated. It is shown that the difference in the mean score produced by a grading jury before and after a suspected outlier is removed from the mean is the single most effective criterion for identifying potential outliers, but that all of the criteria included in the algorithm have an effect on the outlier detection process.
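
    The winning criterion, the change in a jury's mean score when a suspected outlier is removed, is simple to state in code. The threshold value and score scale below are illustrative assumptions, not the course's actual parameters:

```python
def flag_outliers(scores, threshold=1.0):
    """Flag jury scores whose removal changes the jury mean by more than
    `threshold` points: the leave-one-out mean-difference criterion."""
    n = len(scores)
    mean_all = sum(scores) / n
    flagged = []
    for i, s in enumerate(scores):
        mean_without = (sum(scores) - s) / (n - 1)
        if abs(mean_all - mean_without) > threshold:
            flagged.append(i)
    return flagged

jury = [7.5, 8.0, 7.0, 8.5, 2.0]   # one juror scores far below the rest
print(flag_outliers(jury))         # flags the last juror's score
```

    In the paper's full algorithm this criterion is combined with others, and flagged scores go through validation and appeal rather than automatic removal.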

  9. Asymptotic expansion and statistical description of turbulent systems

    International Nuclear Information System (INIS)

    Hagan, W.K. III.

    1986-01-01

    A new approach to studying turbulent systems is presented in which an asymptotic expansion of the general dynamical equations is performed prior to the application of statistical methods for describing the evolution of the system. This approach has been applied to two specific systems: anomalous drift wave turbulence in plasmas and homogeneous, isotropic turbulence in fluids. For the plasma case, the time and length scales of the turbulent state result in the asymptotic expansion of the Vlasov/Poisson equations taking the form of nonlinear gyrokinetic theory. Questions regarding this theory and modern Hamiltonian perturbation methods are discussed and resolved. A new alternative Hamiltonian method is described. The Eulerian Direct Interaction Approximation (EDIA) is slightly reformulated and applied to the equations of nonlinear gyrokinetic theory. Using a similarity transformation technique, expressions for the thermal diffusivity are derived from the EDIA equations for various geometries, including a tokamak. In particular, the unique result for generalized geometry may be of use in evaluating fusion reactor designs and theories of anomalous thermal transport in tokamaks. Finally, a new and useful property of the EDIA is pointed out. For the fluid case, an asymptotic expansion is applied to the Navier-Stokes equation and the results lead to the speculation that such an approach may resolve the problem of predicting the Kolmogorov inertial range energy spectrum for homogeneous, isotropic turbulence. 45 refs., 3 figs

  10. National transportation statistics 2011

    Science.gov (United States)

    2011-04-01

    Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics : (BTS), National Transportation Statistics presents information on the U.S. transportation system, including : its physical components, safety reco...

  11. National Transportation Statistics 2008

    Science.gov (United States)

    2009-01-08

    Compiled and published by the U.S. Department of Transportations Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record...

  12. National Transportation Statistics 2009

    Science.gov (United States)

    2010-01-21

    Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record, ...

  13. Practical statistics in pain research.

    Science.gov (United States)

    Kim, Tae Kyun

    2017-10-01

    Pain is subjective, while statistics related to pain research are objective. This review was written to help researchers involved in pain research make sound statistical decisions. The main issues concern the levels of measurement scales often used in pain research, the choice between parametric and nonparametric statistical methods, and problems arising from repeated measurements. In the field of pain research, parametric statistics have often been applied erroneously, which is closely related to the scales of the data and to repeated measurements. The levels of scales include nominal, ordinal, interval, and ratio scales, and the level of a scale affects the choice between parametric and nonparametric methods. In pain research, the most frequently used pain assessment scale is ordinal, which includes the visual analogue scale (VAS). Another view, however, considers the VAS an interval or ratio scale, so that the use of parametric statistics is accepted in practice in some cases. Repeated measurements of the same subjects always complicate statistics: the measurements inevitably correlate with each other, which precludes one-way ANOVA, for which independence between measurements is required. Repeated-measures ANOVA (RM-ANOVA), however, permits comparison between correlated measurements as long as the sphericity assumption is satisfied. In conclusion, parametric statistical methods should be used only when their assumptions, such as normality and sphericity, are established.
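
    As a minimal worked example of the parametric route discussed above (assuming, as some authors do, that VAS differences can be treated as interval data and are roughly normal), the paired t statistic can be computed directly. The scores below are invented for illustration:

```python
import math

def paired_t(before, after):
    """Paired t statistic for pre/post pain scores on the same subjects.
    Valid only under the parametric assumption that the pairwise
    differences are approximately normally distributed."""
    d = [b - a for b, a in zip(before, after)]
    n = len(d)
    mean_d = sum(d) / n
    var_d = sum((x - mean_d) ** 2 for x in d) / (n - 1)  # sample variance
    return mean_d / math.sqrt(var_d / n)

# Hypothetical VAS scores (0-10) before and after an intervention,
# measured on the same 6 patients.
before = [8.0, 7.5, 9.0, 6.5, 8.5, 7.0]
after  = [5.0, 6.0, 6.5, 5.5, 6.0, 5.5]
print(round(paired_t(before, after), 3))
```

    If the normality assumption for the differences is doubtful, the Wilcoxon signed-rank test is the usual nonparametric alternative; for more than two repeated measurements, RM-ANOVA with a sphericity check applies.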

  14. Statistics for economics

    CERN Document Server

    Naghshpour, Shahdad

    2012-01-01

    Statistics is the branch of mathematics that deals with real-life problems. As such, it is an essential tool for economists. Unfortunately, the way you and many other economists learn the concept of statistics is not compatible with the way economists think and learn. The problem is worsened by the use of mathematical jargon and complex derivations. Here's a book that proves none of this is necessary. All the examples and exercises in this book are constructed within the field of economics, thus eliminating the difficulty of learning statistics with examples from fields that have no relation to business, politics, or policy. Statistics is, in fact, not more difficult than economics. Anyone who can comprehend economics can understand and use statistics successfully within this field, including you! This book utilizes Microsoft Excel to obtain statistical results, as well as to perform additional necessary computations. Microsoft Excel is not the software of choice for performing sophisticated statistical analy...

  15. Statistics of the relative velocity of particles in bidisperse turbulent suspensions

    Science.gov (United States)

    Bhatnagar, Akshay; Gustavsson, Kristian; Mehlig, Bernhard; Mitra, Dhrubaditya

    2017-11-01

    We calculate the joint probability distribution function (JPDF) of relative distances (R) and velocities (V, with longitudinal component VR) of a pair of bidisperse heavy inertial particles in homogeneous and isotropic turbulent flows using direct numerical simulations (DNS). A recent paper (J. Meibohm et al., 2017), using statistical-model simulations and mathematical analysis of a one-dimensional white-noise model, has shown that the JPDF, P(R, VR), for two particles with Stokes numbers St1 and St2 can be interpreted in terms of StM, the harmonic mean of St1 and St2, and θ ≡ |St1 - St2|/(St1 + St2). For small θ there emerge a small-scale cutoff Rc and a small-velocity cutoff Vc. Supported by the Knut and Alice Wallenberg Foundation, Dnr. KAW 2014.0048.

  16. Rule-based statistical data mining agents for an e-commerce application

    Science.gov (United States)

    Qin, Yi; Zhang, Yan-Qing; King, K. N.; Sunderraman, Rajshekhar

    2003-03-01

    Intelligent data mining techniques have useful e-Business applications. Because an e-Commerce application is related to multiple domains such as statistical analysis, market competition, price comparison, profit improvement and personal preferences, this paper presents a hybrid knowledge-based e-Commerce system fusing intelligent techniques, statistical data mining, and personal information to enhance QoS (Quality of Service) of e-Commerce. A Web-based e-Commerce application software system, eDVD Web Shopping Center, is successfully implemented using Java servlets and an Oracle8i database server. Simulation results have shown that the hybrid intelligent e-Commerce system is able to make smart decisions for different customers.

  17. National transportation statistics 2010

    Science.gov (United States)

    2010-01-01

    National Transportation Statistics presents statistics on the U.S. transportation system, including its physical components, safety record, economic performance, the human and natural environment, and national security. This is a large online documen...

  18. Statistical Thermodynamics and Microscale Thermophysics

    Science.gov (United States)

    Carey, Van P.

    1999-08-01

    Many exciting new developments in microscale engineering are based on the application of traditional principles of statistical thermodynamics. In this text Van Carey offers a modern view of thermodynamics, interweaving classical and statistical thermodynamic principles and applying them to current engineering systems. He begins with coverage of microscale energy storage mechanisms from a quantum mechanics perspective and then develops the fundamental elements of classical and statistical thermodynamics. Subsequent chapters discuss applications of equilibrium statistical thermodynamics to solid, liquid, and gas phase systems. The remainder of the book is devoted to nonequilibrium thermodynamics of transport phenomena and to nonequilibrium effects and noncontinuum behavior at the microscale. Although the text emphasizes mathematical development, Carey includes many examples and exercises to illustrate how the theoretical concepts are applied to systems of scientific and engineering interest. In the process he offers a fresh view of statistical thermodynamics for advanced undergraduate and graduate students, as well as practitioners, in mechanical, chemical, and materials engineering.

  19. Are systemizing and autistic traits related to talent and interest in mathematics and engineering? Testing some of the central claims of the empathizing-systemizing theory.

    Science.gov (United States)

    Morsanyi, Kinga; Primi, Caterina; Handley, Simon J; Chiesi, Francesca; Galli, Silvia

    2012-11-01

    In two experiments, we tested some of the central claims of the empathizing-systemizing (E-S) theory. Experiment 1 showed that the systemizing quotient (SQ) was unrelated to performance on a mathematics test, although it was correlated with statistics-related attitudes, self-efficacy, and anxiety. In Experiment 2, systemizing skills, and gender differences in these skills, were more strongly related to spatial thinking styles than to SQ. In fact, when we partialled the effect of spatial thinking styles, SQ was no longer related to systemizing skills. Additionally, there was no relationship between the Autism Spectrum Quotient (AQ) and the SQ, or skills and interest in mathematics and mechanical reasoning. We discuss the implications of our findings for the E-S theory, and for understanding the autistic cognitive profile. ©2011 The British Psychological Society.
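
    The "partialled the effect" analysis reported above is a partial correlation. A sketch on synthetic data (not the study's data) shows how a strong raw correlation can vanish once a common factor is controlled for:

```python
import math
import random

random.seed(0)

def pearson_r(x, y):
    """Sample Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def partial_r(x, y, z):
    """Correlation of x and y with the linear effect of z partialled out."""
    rxy, rxz, ryz = pearson_r(x, y), pearson_r(x, z), pearson_r(y, z)
    return (rxy - rxz * ryz) / math.sqrt((1 - rxz**2) * (1 - ryz**2))

# Synthetic scores: systemizing skill (x) and SQ (y) both driven by a
# common spatial-thinking factor (z) plus independent noise.
z = [random.gauss(0, 1) for _ in range(200)]
x = [zi + 0.3 * random.gauss(0, 1) for zi in z]
y = [zi + 0.3 * random.gauss(0, 1) for zi in z]

r_xy = pearson_r(x, y)          # strong raw correlation
r_partial = partial_r(x, y, z)  # near zero once z is partialled out
print(round(r_xy, 2), round(r_partial, 2))
```

    This is the statistical pattern behind the study's conclusion that spatial thinking style, not SQ, accounts for systemizing performance.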

  20. HPV-Associated Cancers Statistics

    Science.gov (United States)

    CDC publishes statistics for HPV-associated cancers, including cervical, vaginal, and vulvar cancers, alongside statistics for other cancer types such as breast and colorectal cancer.

  1. a Statistical Dynamic Approach to Structural Evolution of Complex Capital Market Systems

    Science.gov (United States)

    Shao, Xiao; Chai, Li H.

    As an important part of modern financial systems, the capital market has played a crucial role in diverse social resource allocations and economic exchanges. Going beyond traditional models and theories based on neoclassical economics, and treating capital markets as typical complex open systems, this paper attempts to develop a new approach that overcomes some shortcomings of the available research. By defining the generalized entropy of capital market systems, a theoretical model and a nonlinear dynamic equation for the operation of capital markets are proposed from a statistical dynamic perspective. The US security market from 1995 to 2001 is then simulated and analyzed as a typical case. Some instructive results are discussed and summarized.

  2. Foundations of Complex Systems Nonlinear Dynamics, Statistical Physics, and Prediction

    CERN Document Server

    Nicolis, Gregoire

    2007-01-01

    Complexity is emerging as a post-Newtonian paradigm for approaching a large body of phenomena of concern at the crossroads of physical, engineering, environmental, life and human sciences from a unifying point of view. This book outlines the foundations of modern complexity research as it arose from the cross-fertilization of ideas and tools from nonlinear science, statistical physics and numerical simulation. It is shown how these developments lead to an understanding, both qualitative and quantitative, of the complex systems encountered in nature and in everyday experience and, conversely, h

  3. Quantum formalism for classical statistics

    Science.gov (United States)

    Wetterich, C.

    2018-06-01

    In static classical statistical systems the problem of information transport from a boundary to the bulk finds a simple description in terms of wave functions or density matrices. While the transfer matrix formalism is a type of Heisenberg picture for this problem, we develop here the associated Schrödinger picture that keeps track of the local probabilistic information. The transport of the probabilistic information between neighboring hypersurfaces obeys a linear evolution equation, and therefore the superposition principle for the possible solutions. Operators are associated to local observables, with rules for the computation of expectation values similar to quantum mechanics. We discuss how non-commutativity naturally arises in this setting. Also other features characteristic of quantum mechanics, such as complex structure, change of basis or symmetry transformations, can be found in classical statistics once formulated in terms of wave functions or density matrices. We construct for every quantum system an equivalent classical statistical system, such that time in quantum mechanics corresponds to the location of hypersurfaces in the classical probabilistic ensemble. For suitable choices of local observables in the classical statistical system one can, in principle, compute all expectation values and correlations of observables in the quantum system from the local probabilistic information of the associated classical statistical system. Realizing a static memory material as a quantum simulator for a given quantum system is not a matter of principle, but rather of practical simplicity.

  4. Statistics associated with an elemental analysis system of particles induced by X-ray emission

    International Nuclear Information System (INIS)

    Romo K, C.M.

    1987-01-01

    In quantitative elemental analysis by X-ray techniques one has to use data spectra which present fluctuations of a statistical nature, both in energy and in the number of counts accumulated. When processing the results to obtain a quantitative result, detailed knowledge of the associated statistical distributions is needed. In this work, 1) the statistics associated with the system's photon counting and 2) the distribution of the results as a function of energy are analyzed. The first is important for the definition of expected values and uncertainties and for spectrum simulation (Mukoyama, 1975). The second is fundamental for determining the contribution of each spectral line. (M.R.) [es

  5. Statistical Pattern Recognition

    CERN Document Server

    Webb, Andrew R

    2011-01-01

    Statistical pattern recognition relates to the use of statistical techniques for analysing data measurements in order to extract information and make justified decisions.  It is a very active area of study and research, which has seen many advances in recent years. Applications such as data mining, web searching, multimedia data retrieval, face recognition, and cursive handwriting recognition, all require robust and efficient pattern recognition techniques. This third edition provides an introduction to statistical pattern theory and techniques, with material drawn from a wide range of fields,

  6. Statistical methods in nonlinear dynamics

    Indian Academy of Sciences (India)

    Sensitivity to initial conditions in nonlinear dynamical systems leads to exponential divergence of trajectories that are initially arbitrarily close, and hence to unpredictability. Statistical methods have been found to be helpful in extracting useful information about such systems. In this paper, we review briefly some statistical ...

  7. Statistical physics of hard optimization problems

    International Nuclear Information System (INIS)

    Zdeborova, L.

    2009-01-01

    Optimization is fundamental in many areas of science, from computer science and information theory to engineering and statistical physics, as well as to biology or social sciences. It typically involves a large number of variables and a cost function depending on these variables. Optimization problems in the non-deterministic polynomial (NP)-complete class are particularly difficult; it is believed that the number of operations required to minimize the cost function is in the most difficult cases exponential in the system size. However, even in an NP-complete problem the practically arising instances might, in fact, be easy to solve. The principal question we address in this article is: How to recognize if an NP-complete constraint satisfaction problem is typically hard and what are the main reasons for this? We adopt approaches from the statistical physics of disordered systems, in particular the cavity method developed originally to describe glassy systems. We describe new properties of the space of solutions in two of the most studied constraint satisfaction problems - random satisfiability and random graph coloring. We suggest a relation between the existence of the so-called frozen variables and the algorithmic hardness of a problem. Based on these insights, we introduce a new class of problems which we named "locked" constraint satisfaction, where the statistical description is easily solvable, but from the algorithmic point of view they are even more challenging than canonical satisfiability.

  8. Statistical physics of hard optimization problems

    International Nuclear Information System (INIS)

    Zdeborova, L.

    2009-01-01

    Optimization is fundamental in many areas of science, from computer science and information theory to engineering and statistical physics, as well as to biology or social sciences. It typically involves a large number of variables and a cost function depending on these variables. Optimization problems in the non-deterministic polynomial-complete class are particularly difficult; it is believed that the number of operations required to minimize the cost function is in the most difficult cases exponential in the system size. However, even in a non-deterministic polynomial-complete problem the practically arising instances might, in fact, be easy to solve. The principal question we address in this article is: How to recognize if a non-deterministic polynomial-complete constraint satisfaction problem is typically hard and what are the main reasons for this? We adopt approaches from the statistical physics of disordered systems, in particular the cavity method developed originally to describe glassy systems. We describe new properties of the space of solutions in two of the most studied constraint satisfaction problems - random satisfiability and random graph coloring. We suggest a relation between the existence of the so-called frozen variables and the algorithmic hardness of a problem. Based on these insights, we introduce a new class of problems which we named 'locked' constraint satisfaction, where the statistical description is easily solvable, but from the algorithmic point of view they are even more challenging than the canonical satisfiability (Authors)

  9. Statistical physics of hard optimization problems

    Science.gov (United States)

    Zdeborová, Lenka

    2009-06-01

    Optimization is fundamental in many areas of science, from computer science and information theory to engineering and statistical physics, as well as to biology or social sciences. It typically involves a large number of variables and a cost function depending on these variables. Optimization problems in the non-deterministic polynomial (NP)-complete class are particularly difficult; it is believed that the number of operations required to minimize the cost function is in the most difficult cases exponential in the system size. However, even in an NP-complete problem the practically arising instances might, in fact, be easy to solve. The principal question we address in this article is: How to recognize if an NP-complete constraint satisfaction problem is typically hard and what are the main reasons for this? We adopt approaches from the statistical physics of disordered systems, in particular the cavity method developed originally to describe glassy systems. We describe new properties of the space of solutions in two of the most studied constraint satisfaction problems - random satisfiability and random graph coloring. We suggest a relation between the existence of the so-called frozen variables and the algorithmic hardness of a problem. Based on these insights, we introduce a new class of problems which we named "locked" constraint satisfaction, where the statistical description is easily solvable, but from the algorithmic point of view they are even more challenging than the canonical satisfiability.

  10. Statistical distributions of earthquakes and related non-linear features in seismic waves

    International Nuclear Information System (INIS)

    Apostol, B.-F.

    2006-01-01

    A few basic facts in the science of earthquakes are briefly reviewed. An accumulation, or growth, model is put forward for the focal mechanisms and the critical focal zone of earthquakes, which relates the average earthquake recurrence time to the released seismic energy. The temporal statistical distribution for the average recurrence time is introduced for earthquakes and, on this basis, the Omori-type distribution in energy is derived, as well as the distribution in magnitude, by making use of the semi-empirical Gutenberg-Richter law relating seismic energy to earthquake magnitude. On geometric grounds, the accumulation model suggests the value r = 1/3 for the Omori parameter in the power-law energy distribution, which leads to β = 1.17 for the coefficient in the Gutenberg-Richter recurrence law, in fair agreement with the statistical analysis of the empirical data. Making use of this value, the empirical Båth law for the average magnitude of the aftershocks (which is 1.2 less than the magnitude of the main seismic shock) is discussed, by assuming that the aftershocks are relaxation events of the seismic zone. The time distribution of earthquakes with a fixed average recurrence time is also derived; earthquake occurrence prediction is discussed by means of the average recurrence time and the seismicity rate, and the application of this discussion to the seismic region Vrancea, Romania, is outlined. Finally, a special effect of the non-linear behaviour of seismic waves is discussed, by describing an exact solution derived recently for the elastic wave equation with cubic anharmonicities, its relevance, and its connection to the approximate quasi-plane-wave picture. The properties of the seismic activity accompanying a main seismic shock, both as foreshocks and as aftershocks, are relegated to forthcoming publications. (author)
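The record above leans on the Gutenberg-Richter recurrence law relating seismic energy to magnitude. As a hedged illustration only (a standard estimator applied to a synthetic catalogue, not the author's accumulation model), the b-value of a magnitude catalogue can be estimated with Aki's maximum-likelihood formula b = log10(e) / (mean(M) - Mmin):

```python
import math
import random

def aki_b_value(magnitudes, m_min):
    """Aki maximum-likelihood estimate of the Gutenberg-Richter b-value
    from magnitudes at or above the completeness threshold m_min."""
    mags = [m for m in magnitudes if m >= m_min]
    if not mags:
        raise ValueError("no magnitudes above threshold")
    mean_excess = sum(mags) / len(mags) - m_min
    return math.log10(math.e) / mean_excess

# Synthetic catalogue: magnitudes exponentially distributed above m_min,
# which is exactly the Gutenberg-Richter form with b = beta / ln(10).
random.seed(0)
b_true = 1.0
beta = b_true * math.log(10)
catalogue = [4.0 + random.expovariate(beta) for _ in range(20000)]
b_hat = aki_b_value(catalogue, 4.0)
```

For the exponential catalogue above, the estimate recovers b close to the chosen b_true = 1.0.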

  11. Statistical Feature Extraction for Fault Locations in Nonintrusive Fault Detection of Low Voltage Distribution Systems

    Directory of Open Access Journals (Sweden)

    Hsueh-Hsien Chang

    2017-04-01

    Full Text Available This paper proposes statistical feature extraction methods combined with artificial intelligence (AI) approaches for locating faults in non-intrusive single-line-to-ground fault (SLGF) detection of low voltage distribution systems. The input features of the AI algorithms are extracted using a statistical moment transformation that reduces the dimensionality of the power signature inputs measured by non-intrusive fault monitoring (NIFM) techniques. The data required to develop the network are generated by simulating SLGF with the Electromagnetic Transient Program (EMTP) in a test system. To enhance identification accuracy, these features are normalized and then given to the AI algorithms, which are presented and evaluated in this paper. Different AI techniques are then compared to determine which identification algorithms are suitable for diagnosing the SLGF for various power signatures in a NIFM system. The simulation results show that the proposed method is effective and can identify fault locations by using non-intrusive monitoring techniques for low voltage distribution systems.
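The record above reduces power-signature dimensionality with a statistical moment transformation. A minimal sketch of that general idea (function name and test waveform are invented here, not taken from the paper's EMTP pipeline): compute the first four standardized moments of a sampled waveform as a compact feature vector for a downstream classifier.

```python
import math

def moment_features(signal):
    """Return [mean, std, skewness, excess kurtosis] of a sampled
    waveform - a four-number feature vector summarizing its shape."""
    n = len(signal)
    mean = sum(signal) / n
    central = [x - mean for x in signal]
    var = sum(c * c for c in central) / n
    std = math.sqrt(var)
    if std == 0:
        return [mean, 0.0, 0.0, 0.0]
    skew = sum(c ** 3 for c in central) / (n * std ** 3)
    kurt = sum(c ** 4 for c in central) / (n * var ** 2) - 3.0
    return [mean, std, skew, kurt]

# A full period of a sine: zero mean, std sqrt(1/2), zero skewness,
# excess kurtosis -1.5 (flatter than a Gaussian).
wave = [math.sin(2 * math.pi * k / 100) for k in range(100)]
feats = moment_features(wave)
```

The point of the transformation is that any length of record collapses to the same small, fixed-size vector, which is what makes it usable as classifier input.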

  12. The statistical mechanics of learning a rule

    International Nuclear Information System (INIS)

    Watkin, T.L.H.; Rau, A.; Biehl, M.

    1993-01-01

    A summary is presented of the statistical mechanical theory of learning a rule with a neural network, a rapidly advancing area which is closely related to other inverse problems frequently encountered by physicists. By emphasizing the relationship between neural networks and strongly interacting physical systems, such as spin glasses, the authors show how learning theory has provided a workshop in which to develop new, exact analytical techniques

  13. Electron-positron momentum distribution measurements of high-Tc superconductors and related systems

    International Nuclear Information System (INIS)

    Wachs, A.L.; Turchi, P.E.A.; Howell, R.H.; Jean, Y.C.; Fluss, M.J.; West, R.N.; Kaiser, J.H.; Rayner, S.; Haghighi, H.; Merkle, K.L.; Revcolevshi, A.; Wang, Z.Z.

    1989-01-01

    The authors discuss measurements of the 2D-angular correlation of positron annihilation radiation (ACAR) in La 2 CuO 4 , YBa 2 Cu 3 O 7 (YBCO), and NiO. The measurements for NiO are the first such 2D-ACAR measurements; the YBCO results are of a higher statistical quality than previously reported. The data are compared with complementary theoretical calculations and with each other. The authors discuss the implication of this analysis for ACAR studies of similar and related systems

  14. Statistical precision of delayed-neutron nondestructive assay techniques

    International Nuclear Information System (INIS)

    Bayne, C.K.; McNeany, S.R.

    1979-02-01

    A theoretical analysis of the statistical precision of delayed-neutron nondestructive assay instruments is presented. Such instruments measure the fissile content of nuclear fuel samples by neutron irradiation and delayed-neutron detection. The precision of these techniques is limited by the statistical nature of the nuclear decay process, but the precision can be optimized by proper selection of system operating parameters. Our method is a three-part analysis. We first present differential--difference equations describing the fundamental physics of the measurements. We then derive and present complete analytical solutions to these equations. Final equations governing the expected number and variance of delayed-neutron counts were computer programmed to calculate the relative statistical precision of specific system operating parameters. Our results show that Poisson statistics do not govern the number of counts accumulated in multiple irradiation-count cycles and that, in general, maximum count precision does not correspond with maximum count as first expected. Covariance between the counts of individual cycles must be considered in determining the optimum number of irradiation-count cycles and the optimum irradiation-to-count time ratio. For the assay system in use at ORNL, covariance effects are small, but for systems with short irradiation-to-count transition times, covariance effects force the optimum number of irradiation-count cycles to be half those giving maximum count. We conclude that the equations governing the expected value and variance of delayed-neutron counts have been derived in closed form. These have been computerized and can be used to select optimum operating parameters for delayed-neutron assay devices
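The record above stresses that Poisson statistics do not govern counts accumulated over multiple irradiation-count cycles, because successive cycle counts covary. A toy Monte Carlo (not the report's closed-form equations; the carry-over model, parameter values, and the Knuth Poisson sampler are all assumptions made for illustration) shows how a simple precursor carry-over between cycles makes the accumulated count over-dispersed relative to Poisson:

```python
import math
import random

random.seed(1)

def poisson(mu):
    """Knuth's algorithm for a Poisson variate (adequate for moderate mu)."""
    limit = math.exp(-mu)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

def simulate_totals(cycles, mu_new, carry_frac, runs):
    """Toy multi-cycle delayed-neutron count: each cycle records fresh
    precursor decays (Poisson, mean mu_new) plus a binomial carry-over
    from the previous cycle, which correlates successive cycle counts."""
    totals = []
    for _ in range(runs):
        total, previous_fresh = 0, 0
        for _ in range(cycles):
            fresh = poisson(mu_new)
            carried = sum(1 for _ in range(previous_fresh)
                          if random.random() < carry_frac)
            total += fresh + carried
            previous_fresh = fresh
        totals.append(total)
    return totals

totals = simulate_totals(cycles=5, mu_new=50.0, carry_frac=0.3, runs=4000)
n = len(totals)
mean_hat = sum(totals) / n
var_hat = sum((t - mean_hat) ** 2 for t in totals) / (n - 1)
fano = var_hat / mean_hat  # > 1: accumulated counts are over-dispersed
```

For a pure Poisson total the Fano factor var/mean would be 1; here the positive covariance between cycles pushes it well above 1, which is the qualitative effect the report quantifies.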

  15. Digital immunohistochemistry platform for the staining variation monitoring based on integration of image and statistical analyses with laboratory information system.

    Science.gov (United States)

    Laurinaviciene, Aida; Plancoulaine, Benoit; Baltrusaityte, Indra; Meskauskas, Raimundas; Besusparis, Justinas; Lesciute-Krilaviciene, Daiva; Raudeliunas, Darius; Iqbal, Yasir; Herlin, Paulette; Laurinavicius, Arvydas

    2014-01-01

    Digital immunohistochemistry (IHC) is one of the most promising applications brought by new generation image analysis (IA). While conventional IHC staining quality is monitored by semi-quantitative visual evaluation of tissue controls, IA may require more sensitive measurement. We designed an automated system to digitally monitor IHC multi-tissue controls, based on SQL-level integration of laboratory information system with image and statistical analysis tools. Consecutive sections of TMA containing 10 cores of breast cancer tissue were used as tissue controls in routine Ki67 IHC testing. Ventana slide label barcode ID was sent to the LIS to register the serial section sequence. The slides were stained and scanned (Aperio ScanScope XT), IA was performed by the Aperio/Leica Colocalization and Genie Classifier/Nuclear algorithms. SQL-based integration ensured automated statistical analysis of the IA data by the SAS Enterprise Guide project. Factor analysis and plot visualizations were performed to explore slide-to-slide variation of the Ki67 IHC staining results in the control tissue. Slide-to-slide intra-core IHC staining analysis revealed rather significant variation of the variables reflecting the sample size, while Brown and Blue Intensity were relatively stable. To further investigate this variation, the IA results from the 10 cores were aggregated to minimize tissue-related variance. Factor analysis revealed association between the variables reflecting the sample size detected by IA and Blue Intensity. Since the main feature to be extracted from the tissue controls was staining intensity, we further explored the variation of the intensity variables in the individual cores. MeanBrownBlue Intensity ((Brown+Blue)/2) and DiffBrownBlue Intensity (Brown-Blue) were introduced to better contrast the absolute intensity and the colour balance variation in each core; relevant factor scores were extracted. Finally, tissue-related factors of IHC staining variance were

  16. Statistics of multi-tube detecting systems; Estadistica de sistemas de deteccion multitubo

    Energy Technology Data Exchange (ETDEWEB)

    Grau Carles, P.; Grau Malonda, A.

    1994-07-01

    In this paper three new statistical theorems are demonstrated and applied. These theorems greatly simplify the derivation of the formulae used to compute the counting efficiency when the detection system is formed by several photomultipliers combined in coincidence and sum. The theorems are applied to several photomultiplier arrangements in order to show their potential and how they are applied. (Author) 6 refs.
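The record above concerns counting efficiency for photomultipliers combined in coincidence. A hedged sketch of the simplest such formula (elementary binomial counting, not the paper's theorems): for n independent tubes each firing with probability p, the k-fold coincidence efficiency is the probability that at least k tubes fire.

```python
from math import comb

def coincidence_efficiency(n, k, p):
    """Probability that at least k of n independent photomultipliers,
    each detecting with probability p, fire together (k-fold coincidence)."""
    return sum(comb(n, m) * p ** m * (1 - p) ** (n - m)
               for m in range(k, n + 1))

# Three tubes in double coincidence: at least 2 of the 3 must fire.
eff = coincidence_efficiency(3, 2, 0.8)  # 3*0.8^2*0.2 + 0.8^3 = 0.896
```

Real systems add correlations between tubes (shared scintillation photons), which is exactly where closed-form theorems like the paper's become valuable.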

  17. Statistical theory of heat

    CERN Document Server

    Scheck, Florian

    2016-01-01

    Scheck’s textbook starts with a concise introduction to classical thermodynamics, including geometrical aspects. A short introduction to probabilities and statistics then lays the basis for the statistical interpretation of thermodynamics. Phase transitions, discrete models and the stability of matter are explained in great detail. Thermodynamics has a special role in theoretical physics: due to its general approach, the field bridges several areas such as the theory of condensed matter, elementary particle physics, astrophysics and cosmology. Classical thermodynamics describes predominantly averaged properties of matter, reaching from few-particle systems and states of matter to stellar objects. Statistical thermodynamics covers the same fields, but explores them in greater depth and unifies classical statistical mechanics with the quantum theory of many-particle systems. The content is presented as two tracks: the fast track for master students, providing the essen...

  18. New results related to QGP-like effects in small systems with ALICE

    CERN Document Server

    Vislavicius, Vytautas

    2016-09-15

    Results on the production of $\pi^{\pm}$, $\textrm{K}^{\pm}$, $\textrm{p}(\bar{\textrm{p}})$, $\Lambda(\bar{\Lambda})$, $\Xi^{-} \left(\bar{\Xi}^{+}\right)$ and $\Omega^{-} \left(\bar{\Omega}^{+}\right)$ at midrapidity (${|y|<0.5}$) as a function of multiplicity in $\sqrt{s}~=~7~\textrm{TeV}$ pp collisions are reported. Transverse momentum distributions and integrated yields are compared to expectations from statistical hadronization models along with results from different colliding systems and center-of-mass energies. The evolution of spectral shapes with multiplicity shows similar patterns to those seen in p-Pb and Pb-Pb collisions. The $p_{\textrm{T}}$-integrated baryon yields relative to pions exhibit a significant strangeness-related enhancement in both pp and p-Pb collisions.

  19. Implementing the “Big Data” Concept in Official Statistics

    Directory of Open Access Journals (Sweden)

    О. V.

    2017-02-01

    Full Text Available Big data is a huge resource that needs to be used at all levels of economic planning. The article is devoted to the study of the development of the “Big Data” concept in the world and its impact on the transformation of statistical modelling of economic processes. Statistics at the current stage should take into account the complex system of international economic relations, which functions under conditions of globalization and brings new forms of economic development to small open economies. Statistical science should take into account such phenomena as the gig economy, the sharing economy, institutional factors, etc. The concepts of “Big Data” and open data are analyzed, and problems of implementing “Big Data” in official statistics are shown. Ways of implementing “Big Data” in the official statistics of Ukraine through active use of the technological capabilities of mobile operators, navigation systems, surveillance cameras, social networks, etc. are presented. The possibilities of using “Big Data” in different sectors of the economy, including at the level of individual companies, are shown. The problems of storing large volumes of data are highlighted. The study shows that “Big Data” is a huge resource that should be used across the Ukrainian economy.

  20. Hamiltonian formulation and statistics of an attracting system of nonlinear oscillators

    International Nuclear Information System (INIS)

    Tasso, H.

    1987-10-01

    An attracting system of r nonlinear oscillators of an extended van der Pol type was investigated with respect to Hamiltonian formulation. The case of r=2 is rather simple, though nontrivial. For r>2 the tests with Jacobi's identity and Fréchet derivatives are negative if Hamiltonians in the natural variables are sought. Independently, a Liouville theorem is proved and equilibrium statistics is made possible, which leads to a Gaussian distribution in the natural variables. (orig.)

  1. Statistical Modelling of Temperature and Moisture Uptake of Biochars Exposed to Selected Relative Humidity of Air

    Directory of Open Access Journals (Sweden)

    Luciane Bastistella

    2018-02-01

    Full Text Available New experimental techniques, as well as modern variants on known methods, have recently been employed to investigate the fundamental reactions underlying the oxidation of biochar. The purpose of this paper was to study, experimentally and statistically, how the relative humidity of air, mass, and particle size of four biochars influenced the adsorption of water and the increase in temperature. A random factorial design was employed using the intuitive statistical software Xlstat. A simple linear regression model and an analysis of variance with a pairwise comparison were performed. The experimental study was carried out on the wood of Quercus pubescens, Cyclobalanopsis glauca, Trigonostemon huangmosun, and Bambusa vulgaris, and involved five relative humidity conditions (22, 43, 75, 84, and 90%), two sample masses (0.1 and 1 g), and two particle sizes (powder and piece). Two response variables, water adsorption and temperature increase, were analyzed and discussed. The temperature did not increase linearly with the adsorption of water. Temperature was modeled by nine explanatory variables, while water adsorption was modeled by eight. Five variables, including factors and their interactions, were found to be common to the two models. Sample mass and relative humidity influenced both response variables, while particle size and biochar type only influenced the temperature.

  2. Low-level contrast statistics of natural images can modulate the frequency of event-related potentials (ERP) in humans

    Directory of Open Access Journals (Sweden)

    Masoud Ghodrati

    2016-12-01

    Full Text Available Humans are fast and accurate in categorizing complex natural images. It is, however, unclear which features of visual information are exploited by the brain to perceive the images with such speed and accuracy. It has been shown that low-level contrast statistics of natural scenes can explain the variance of the amplitude of event-related potentials (ERP) in response to rapidly presented images. In this study, we investigated the effect of these statistics on the frequency content of ERPs. We recorded ERPs from human subjects while they viewed natural images, each presented for 70 ms. Our results showed that Weibull contrast statistics, as a biologically plausible model, explained the variance of ERPs best, compared to the other image statistics that we assessed. Our time-frequency analysis revealed a significant correlation between these statistics and ERP power within the theta frequency band (~3-7 Hz). This is interesting, as the theta band is believed to be involved in context updating and semantic encoding. This correlation became significant at ~110 ms after stimulus onset and peaked at 138 ms. Our results show that not only the amplitude but also the frequency of neural responses can be modulated by low-level contrast statistics of natural images, and they highlight their potential role in scene perception.

  3. MORTICIA, a statistical analysis software package for determining optical surveillance system effectiveness.

    Science.gov (United States)

    Ramkilowan, A.; Griffith, D. J.

    2017-10-01

    Surveillance modelling in terms of the standard Detect, Recognise and Identify (DRI) thresholds remains a key requirement for determining the effectiveness of surveillance sensors. With readily available computational resources it has become feasible to perform statistically representative evaluations of the effectiveness of these sensors. A new capability for performing this Monte-Carlo type analysis is demonstrated in the MORTICIA (Monte-Carlo Optical Rendering for Theatre Investigations of Capability under the Influence of the Atmosphere) software package developed at the Council for Scientific and Industrial Research (CSIR). This first-generation, Python-based open-source integrated software package, currently in the alpha stage of development, aims to provide all the functionality required to perform statistical investigations of the effectiveness of optical surveillance systems in specific or generic deployment theatres. This includes modelling of the mathematical and physical processes that govern, amongst other components of a surveillance system, a sensor's detector and optical components, a target and its background, as well as the intervening atmospheric influences. In this paper we discuss integral aspects of the bespoke framework that are critical to the longevity of all subsequent modelling efforts. Additionally, some preliminary results are presented.

  4. An experimental test of the fluctuation relation in an active camphor boat system

    Science.gov (United States)

    Paroor, H. M.; Nambiar, N.; Bandi, M. M.

    The Gallavotti-Cohen fluctuation relation (FR) posits a specific symmetry between positive and negative fluctuations in entropy production, or a related quantity (e.g. power), for systems in a non-equilibrium stationary state. Successful tests in a variety of systems suggest the FR may be more generally applicable than the conditions under which it was originally derived. Systems where the FR fails are therefore valuable for the insight they provide into the FR's general success. It has recently been suggested that ``active matter'' should not satisfy the fluctuation-dissipation theorem or the FR. We experimentally test this possibility in a system of active camphor boats, self-propelled by surface tension gradients at air-water interfaces. The boats interact via short-range capillary attraction, which competes with long-range surface-tension-mediated repulsion. Tuning the interaction strength with number density, we test the FR through the statistics of power as one goes from a free, non-interacting camphor boat, through a few weakly interacting boats, to several strongly interacting boats. We present preliminary results of our experiments and data analysis.

  5. A new universality class in corpus of texts; A statistical physics study

    Science.gov (United States)

    Najafi, Elham; Darooneh, Amir H.

    2018-05-01

    Text can be regarded as a complex system, and there are methods in statistical physics which can be used to study such systems. In this work, by means of statistical physics methods, we reveal new universal behaviors of texts associated with the fractality values of words in a text. The fractality measure indicates the importance of words in a text by considering the distribution pattern of words throughout the text. We observed a power-law relation between the fractality of a text and its vocabulary size for texts and corpora. We also observed this behavior in studying biological data.

  6. An approach to build knowledge base for reactor accident diagnostic system using statistical method

    International Nuclear Information System (INIS)

    Kohsaka, Atsuo; Yokobayashi, Masao; Matsumoto, Kiyoshi; Fujii, Minoru

    1988-01-01

    In the development of a rule-based expert system, one of the key issues is how to build a knowledge base (KB). A systematic approach has been attempted for building an objective KB efficiently. The approach is based on the concept that a prototype KB should first be generated in a systematic way and then modified and/or improved by experts for practical use. The statistical method of factor analysis was applied to build a prototype KB for the JAERI expert system DISKET, using source information obtained from a PWR simulator. The prototype KB was obtained and inference with this KB was performed for several types of transients. In each diagnosis, the transient type was correctly identified. From this study, it is concluded that the statistical method used is useful for building a prototype knowledge base. (author)

  7. Reliability assessment for safety critical systems by statistical random testing

    International Nuclear Information System (INIS)

    Mills, S.E.

    1995-11-01

    In this report we present an overview of reliability assessment for software and focus on some basic aspects of assessing the reliability of safety critical systems by statistical random testing. We also discuss possible deviations from some essential assumptions on which the general methodology is based. These deviations appear quite likely in practical applications. We present and discuss possible remedies and adjustments and then undertake applying this methodology to a portion of the SDS1 software. We also indicate shortcomings of the methodology and possible avenues to follow to address these problems. (author). 128 refs., 11 tabs., 31 figs

  8. Reliability assessment for safety critical systems by statistical random testing

    Energy Technology Data Exchange (ETDEWEB)

    Mills, S E [Carleton Univ., Ottawa, ON (Canada). Statistical Consulting Centre

    1995-11-01

    In this report we present an overview of reliability assessment for software and focus on some basic aspects of assessing the reliability of safety critical systems by statistical random testing. We also discuss possible deviations from some essential assumptions on which the general methodology is based. These deviations appear quite likely in practical applications. We present and discuss possible remedies and adjustments and then undertake applying this methodology to a portion of the SDS1 software. We also indicate shortcomings of the methodology and possible avenues to follow to address these problems. (author). 128 refs., 11 tabs., 31 figs.

  9. Analysis of radiation monitoring data by distribution-free statistical methods (a case of river system Techa-Iset'-Tobol-Irtysh contamination)

    International Nuclear Information System (INIS)

    Luneva, K.V.; Kryshev, A.I.; Nikitin, A.I.; Kryshev, I.I.

    2010-01-01

    The article presents the results of a statistical analysis of radiation monitoring data on contamination of the river system Techa-Iset'-Tobol-Irtysh. A short description of the analyzed data and the territory under consideration is given. The distribution-free statistical methods used for the comparative analysis are described, together with the reasons for their selection and the features of their application. A comparative analysis against traditional statistical methods is presented. A reliable decrease of 90Sr specific activity from object to object along the river system was determined, which is evidence of radionuclide transport in the river system Techa-Iset'-Tobol-Irtysh

  10. A Statistic-Based Calibration Method for TIADC System

    Directory of Open Access Journals (Sweden)

    Kuojun Yang

    2015-01-01

    Full Text Available Time-interleaved technique is widely used to increase the sampling rate of an analog-to-digital converter (ADC). However, channel mismatches degrade the performance of a time-interleaved ADC (TIADC). Therefore, a statistic-based calibration method for TIADC is proposed in this paper. The average value of sampling points is utilized to calculate the offset error, and the summation of sampling points is used to calculate the gain error. After the offset and gain errors are obtained, they are calibrated by offset and gain adjustment elements in the ADC. Timing skew is calibrated by an iterative method. The product of sampling points of two adjacent subchannels is used as a metric for calibration. The proposed method is employed to calibrate mismatches in a four-channel 5 GS/s TIADC system. Simulation results show that the proposed method can estimate mismatches accurately over a wide frequency range. It is also shown that an accurate estimation can be obtained even if the signal-to-noise ratio (SNR) of the input signal is 20 dB. Furthermore, the results obtained from a real four-channel 5 GS/s TIADC system demonstrate the effectiveness of the proposed method. The spectral spurs due to mismatches are effectively eliminated after calibration.
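
    The offset and gain statistics described in the abstract can be sketched in a toy simulation; the channel count, error values, and normalization to channel 0 below are invented for illustration, not taken from the paper:

```python
import math

# Hypothetical sketch of statistic-based offset/gain estimation for a TIADC;
# channel count, error values, and the channel-0 normalization are invented.
M = 4                    # interleaved channels
N = 4096 * M             # total samples
true_offset = [0.05, -0.02, 0.01, -0.04]
true_gain = [1.00, 1.03, 0.97, 1.01]

# Interleaved sampling of a sine input; channel n % M applies its own error
samples = [true_gain[n % M] * math.sin(2 * math.pi * 0.01234 * n) + true_offset[n % M]
           for n in range(N)]

# Offset estimate: per-channel average (the zero-mean input averages out)
offset_est = [sum(samples[m::M]) * M / N for m in range(M)]

# Gain estimate: per-channel mean absolute value after offset removal,
# normalized to channel 0 as the reference
abs_mean = [sum(abs(s - offset_est[m]) for s in samples[m::M]) * M / N for m in range(M)]
gain_est = [a / abs_mean[0] for a in abs_mean]

for m in range(M):
    print(f"ch{m}: offset {offset_est[m]:+.3f}, relative gain {gain_est[m]:.3f}")
```

    Because the sine input is zero-mean, each channel's sample average converges to its offset, and the mean absolute value scales with its gain; the paper's timing-skew step (iterative, using adjacent-channel products) is omitted here.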

  11. Natural disaster risk analysis for critical infrastructure systems: An approach based on statistical learning theory

    International Nuclear Information System (INIS)

    Guikema, Seth D.

    2009-01-01

    Probabilistic risk analysis has historically been developed for situations in which measured data about the overall reliability of a system are limited and expert knowledge is the best source of information available. There continue to be a number of important problem areas characterized by a lack of hard data. However, in other important problem areas the emergence of information technology has transformed the situation from one characterized by little data to one characterized by data overabundance. Natural disaster risk assessments for events impacting large-scale, critical infrastructure systems such as electric power distribution systems, transportation systems, water supply systems, and natural gas supply systems are important examples of problems characterized by data overabundance. There are often substantial amounts of information collected and archived about the behavior of these systems over time. Yet it can be difficult to effectively utilize these large data sets for risk assessment. Using this information for estimating the probability or consequences of system failure requires a different approach and analysis paradigm than risk analysis for data-poor systems does. Statistical learning theory, a diverse set of methods designed to draw inferences from large, complex data sets, can provide a basis for risk analysis for data-rich systems. This paper provides an overview of statistical learning theory methods and discusses their potential for greater use in risk analysis
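
    The data-rich paradigm the abstract contrasts with expert elicitation can be illustrated with a deliberately simple statistical learner; the feature, the label model, and the threshold rule below are all invented for this sketch:

```python
import random
random.seed(7)

# Toy illustration of learning an outage-risk rule from (synthetic) historical
# storm records rather than expert judgment. Feature: peak gust speed (m/s);
# label: whether a feeder outage occurred. All numbers are invented.
def outage(gust):
    return 1 if random.random() < min(1.0, max(0.0, (gust - 20) / 25)) else 0

data = [(g, outage(g)) for g in (random.uniform(5, 45) for _ in range(5000))]
train, test = data[:4000], data[4000:]

# Simplest possible statistical learner: a one-dimensional decision stump,
# choosing the gust threshold that minimizes training error
def error(thresh, rows):
    return sum((g > thresh) != bool(y) for g, y in rows) / len(rows)

thresh = min(range(5, 46), key=lambda t: error(t, train))
test_acc = 1 - error(thresh, test)
print(f"learned threshold ~{thresh} m/s, held-out accuracy {test_acc:.2f}")
```

    The held-out split is the point: with abundant records, the rule and its accuracy are estimated from data alone, which is exactly the shift in analysis paradigm the paper argues for.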

  12. Equipment Maintenance management support system based on statistical analysis of maintenance history data

    International Nuclear Information System (INIS)

    Shimizu, S.; Ando, Y.; Morioka, T.

    1990-01-01

    Plant maintenance is becoming increasingly important with the growth in the number of nuclear power stations and in plant operating time. Various requirements for plant maintenance have been proposed, such as countermeasures against equipment degradation and reduction of maintenance costs while maintaining plant reliability and productivity. For this purpose, plant maintenance programs should be improved based on equipment reliability estimated from field data. In order to meet these requirements, it is planned to develop an equipment maintenance management support system for nuclear power plants based on statistical analysis of equipment maintenance history data. The key difference between this proposed method and current similar methods is that it evaluates not only failure data but also maintenance data, which includes normal termination data and data on partial degradation or functional disorder of equipment and parts. It is therefore possible to utilize these field data for improving maintenance schedules and to evaluate actual equipment and parts reliability under the current maintenance schedule. In the present paper, the authors show the objectives of this system, an outline of the system and its functions, and the basic technique for collecting and managing maintenance history data for statistical analysis. The results of feasibility tests using simulated maintenance history data show that this system can provide useful information for maintenance and design enhancement

  13. Traffic and related self-driven many-particle systems

    Science.gov (United States)

    Helbing, Dirk

    2001-10-01

    Since the subject of traffic dynamics has captured the interest of physicists, many surprising effects have been revealed and explained. Some of the questions now understood are the following: Why are vehicles sometimes stopped by ``phantom traffic jams'' even though drivers all like to drive fast? What are the mechanisms behind stop-and-go traffic? Why are there several different kinds of congestion, and how are they related? Why do most traffic jams occur considerably before the road capacity is reached? Can a temporary reduction in the volume of traffic cause a lasting traffic jam? Under which conditions can speed limits speed up traffic? Why do pedestrians moving in opposite directions normally organize into lanes, while similar systems ``freeze by heating''? All of these questions have been answered by applying and extending methods from statistical physics and nonlinear dynamics to self-driven many-particle systems. This article considers the empirical data and then reviews the main approaches to modeling pedestrian and vehicle traffic. These include microscopic (particle-based), mesoscopic (gas-kinetic), and macroscopic (fluid-dynamic) models. Attention is also paid to the formulation of a micro-macro link, to aspects of universality, and to other unifying concepts, such as a general modeling framework for self-driven many-particle systems, including spin systems. While the primary focus is upon vehicle and pedestrian traffic, applications to biological or socio-economic systems such as bacterial colonies, flocks of birds, panics, and stock market dynamics are touched upon as well.

  14. Statistically validated network of portfolio overlaps and systemic risk.

    Science.gov (United States)

    Gualdi, Stanislao; Cimini, Giulio; Primicerio, Kevin; Di Clemente, Riccardo; Challet, Damien

    2016-12-21

    Common asset holding by financial institutions (portfolio overlap) is nowadays regarded as an important channel for financial contagion with the potential to trigger fire sales and severe losses at the systemic level. We propose a method to assess the statistical significance of the overlap between heterogeneously diversified portfolios, which we use to build a validated network of financial institutions where links indicate potential contagion channels. The method is implemented on a historical database of institutional holdings ranging from 1999 to the end of 2013, but can be applied to any bipartite network. We find that the proportion of validated links (i.e. of significant overlaps) increased steadily before the 2007-2008 financial crisis and reached a maximum when the crisis occurred. We argue that the nature of this measure implies that systemic risk from fire sales liquidation was maximal at that time. After a sharp drop in 2008, systemic risk resumed its growth in 2009, with a notable acceleration in 2013. We finally show that market trends tend to be amplified in the portfolios identified by the algorithm, such that it is possible to have an informative signal about institutions that are about to suffer (enjoy) the most significant losses (gains).
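
    The core step, deciding whether the overlap between two portfolios is statistically significant, can be sketched with a plain hypergeometric null; the numbers below are invented, and the paper's null model for heterogeneously diversified portfolios is more refined than this baseline:

```python
from math import comb

# Sketch of an overlap-significance test in the spirit of the validated-network
# construction. Universe size, portfolio sizes, and overlap are invented.
U = 500          # assets in the investable universe
n1, n2 = 40, 60  # sizes of the two portfolios
k = 15           # assets held in common

# P(overlap >= k) if portfolio 2 were a uniform random draw from the universe
p_value = sum(comb(n1, j) * comb(U - n1, n2 - j)
              for j in range(k, min(n1, n2) + 1)) / comb(U, n2)

validated = p_value < 0.01   # keep the link in the validated network
print(f"overlap of {k} assets: p = {p_value:.2e}, validated link: {validated}")
```

    Links whose overlap clears the significance threshold are retained, and the fraction of such validated links over time is the systemic-risk indicator the paper tracks.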

  15. GIA Model Statistics for GRACE Hydrology, Cryosphere, and Ocean Science

    Science.gov (United States)

    Caron, L.; Ivins, E. R.; Larour, E.; Adhikari, S.; Nilsson, J.; Blewitt, G.

    2018-03-01

    We provide a new analysis of glacial isostatic adjustment (GIA) with the goal of assembling the model uncertainty statistics required for rigorously extracting trends in surface mass from the Gravity Recovery and Climate Experiment (GRACE) mission. Such statistics are essential for deciphering sea level, ocean mass, and hydrological changes because the latter signals can be relatively small (≤2 mm/yr water height equivalent) over very large regions, such as major ocean basins and watersheds. With abundant new >7 year continuous measurements of vertical land motion (VLM) reported by Global Positioning System stations on bedrock and new relative sea level records, our new statistical evaluation of GIA uncertainties incorporates Bayesian methodologies. A unique aspect of the method is that both the ice history and 1-D Earth structure vary through a total of 128,000 forward models. We find that best fit models poorly capture the statistical inferences needed to correctly invert for lower mantle viscosity and that GIA uncertainty exceeds the uncertainty ascribed to trends from 14 years of GRACE data in polar regions.

  16. Common pitfalls in statistical analysis: Absolute risk reduction, relative risk reduction, and number needed to treat

    Science.gov (United States)

    Ranganathan, Priya; Pramesh, C. S.; Aggarwal, Rakesh

    2016-01-01

    In the previous article in this series on common pitfalls in statistical analysis, we looked at the difference between risk and odds. Risk, which refers to the probability of occurrence of an event or outcome, can be defined in absolute or relative terms. Understanding what these measures represent is essential for the accurate interpretation of study results. PMID:26952180
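
    The absolute and relative measures the article discusses reduce to a few lines of arithmetic; the trial numbers below are hypothetical:

```python
# Worked numeric example (hypothetical two-arm trial) of absolute risk
# reduction, relative risk reduction, and number needed to treat.
control_events, control_n = 30, 200   # 15% event rate in the control arm
treated_events, treated_n = 20, 200   # 10% event rate in the treatment arm

cer = control_events / control_n      # control event risk
eer = treated_events / treated_n      # experimental event risk

arr = cer - eer                       # absolute risk reduction
rrr = arr / cer                       # relative risk reduction
nnt = 1 / arr                         # number needed to treat

print(f"ARR = {arr:.3f}, RRR = {rrr:.1%}, NNT = {nnt:.0f}")
# → ARR = 0.050, RRR = 33.3%, NNT = 20
```

    The example shows why the distinction matters: a 33% relative reduction sounds large, but the absolute reduction is 5 percentage points, i.e. 20 patients must be treated to prevent one event.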

  17. Acceleration transforms and statistical kinetic models

    International Nuclear Information System (INIS)

    LuValle, M.J.; Welsher, T.L.; Svoboda, K.

    1988-01-01

    For a restricted class of problems a mathematical model of microscopic degradation processes, statistical kinetics, is developed and linked through acceleration transforms to the information which can be obtained from a system in which the only observable sign of degradation is sudden and catastrophic failure. The acceleration transforms were developed in accelerated life testing applications as a tool for extrapolating from the observable results of an accelerated life test to the dynamics of the underlying degradation processes. A particular concern of a physicist attempting to interpret the results of an analysis based on acceleration transforms is determining the physical species involved in the degradation process. These species may be (a) relatively abundant or (b) relatively rare. The main results of this paper are a theorem showing that for an important subclass of statistical kinetic models, acceleration transforms cannot be used to distinguish between cases (a) and (b), and an example showing that in some cases falling outside the restrictions of the theorem, cases (a) and (b) can be distinguished by their acceleration transforms

  18. Statistically qualified neuro-analytic failure detection method and system

    Science.gov (United States)

    Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.

    2002-03-02

    An apparatus and method for monitoring a process involve development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two stages: deterministic model adaptation, followed by stochastic modification of the adapted deterministic model. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation error minimization technique. Stochastic model modification involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system. Illustrative of the method and apparatus, the method is applied to a peristaltic pump system.

  19. Lectures on algebraic statistics

    CERN Document Server

    Drton, Mathias; Sullivant, Seth

    2009-01-01

    How does an algebraic geometer studying secant varieties further the understanding of hypothesis tests in statistics? Why would a statistician working on factor analysis raise open problems about determinantal varieties? Connections of this type are at the heart of the new field of "algebraic statistics". In this field, mathematicians and statisticians come together to solve statistical inference problems using concepts from algebraic geometry as well as related computational and combinatorial techniques. The goal of these lectures is to introduce newcomers from the different camps to algebraic statistics. The introduction will be centered around the following three observations: many important statistical models correspond to algebraic or semi-algebraic sets of parameters; the geometry of these parameter spaces determines the behaviour of widely used statistical inference procedures; computational algebraic geometry can be used to study parameter spaces and other features of statistical models.

  20. Statistical Physics An Introduction

    CERN Document Server

    Yoshioka, Daijiro

    2007-01-01

    This book provides a comprehensive presentation of the basics of statistical physics. The first part explains the essence of statistical physics and how it provides a bridge between microscopic and macroscopic phenomena, allowing one to derive quantities such as entropy. Here the author avoids going into details such as Liouville’s theorem or the ergodic theorem, which are difficult for beginners and unnecessary for the actual application of statistical mechanics. In the second part, statistical mechanics is applied to various systems which, although they look different, share the same mathematical structure. In this way, readers can deepen their understanding of statistical physics. The book also features applications to quantum dynamics, thermodynamics, the Ising model and the statistical dynamics of free spins.

  1. Intelligent tutorial system for teaching of probability and statistics at high school in Mexico

    Directory of Open Access Journals (Sweden)

    Fernando Gudino Penaloza, Miguel Gonzalez Mendoza, Neil Hernandez Gress, Jaime Mora Vargas

    2009-12-01

    Full Text Available This paper describes the implementation of an intelligent tutoring system dedicated to teaching probability and statistics at the preparatory school (or high school) in Mexico. The system was originally implemented as a desktop application and adapted to a mobile environment for the implementation of mobile learning, or m-learning. The system complies with the idea of being adaptable to the needs of each student and is able to adapt to three different teaching models that meet the criteria of three student profiles.

  2. Statistical ultrasonics: the influence of Robert F. Wagner

    Science.gov (United States)

    Insana, Michael F.

    2009-02-01

    An important ongoing question for higher education is how to successfully mentor the next generation of scientists and engineers. It has been my privilege to have been mentored by one of the best, Dr Robert F. Wagner and his colleagues at the CDRH/FDA during the mid 1980s. Bob introduced many of us in medical ultrasonics to statistical imaging techniques. These ideas continue to broadly influence studies on adaptive aperture management (beamforming, speckle suppression, compounding), tissue characterization (texture features, Rayleigh/Rician statistics, scatterer size and number density estimators), and fundamental questions about how limitations of the human eye-brain system for extracting information from textured images can motivate image processing. He adapted the classical techniques of signal detection theory to coherent imaging systems that, for the first time in ultrasonics, related common engineering metrics for image quality to task-based clinical performance. This talk summarizes my wonderfully-exciting three years with Bob as I watched him explore topics in statistical image analysis that formed a rational basis for many of the signal processing techniques used in commercial systems today. It is a story of an exciting time in medical ultrasonics, and of how a sparkling personality guided and motivated the development of junior scientists who flocked around him in admiration and amazement.

  3. In vivo evaluation of the effect of stimulus distribution on FIR statistical efficiency in event-related fMRI.

    Science.gov (United States)

    Jansma, J Martijn; de Zwart, Jacco A; van Gelderen, Peter; Duyn, Jeff H; Drevets, Wayne C; Furey, Maura L

    2013-05-15

    Technical developments in MRI have improved signal-to-noise, allowing the use of analysis methods such as finite impulse response (FIR) analysis of rapid event-related functional MRI (er-fMRI). FIR is one of the most informative analysis methods, as it determines the onset and full shape of the hemodynamic response function (HRF) without any a priori assumptions. FIR is, however, vulnerable to multicollinearity, which is directly related to the distribution of stimuli over time. Efficiency can be optimized by simplifying a design and restricting the stimulus distribution to specific sequences, while more design flexibility necessarily reduces efficiency. However, the actual effect of efficiency on fMRI results has never been tested in vivo. Thus, it is currently difficult to make an informed choice between protocol flexibility and statistical efficiency. The main goal of this study was to assign concrete fMRI signal-to-noise values to the abstract scale of FIR statistical efficiency. Ten subjects repeated a perception task with five random and m-sequence-based protocols, with varying but, according to the literature, acceptable levels of multicollinearity. Results indicated substantial differences in signal standard deviation, whose level was a function of multicollinearity. Experiment protocols varied up to 55.4% in standard deviation. The results confirm that the quality of fMRI in an FIR analysis can vary significantly and substantially with statistical efficiency. Our in vivo measurements can be used to aid in making an informed decision between freedom in protocol design and statistical efficiency. Published by Elsevier B.V.
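
    The link between stimulus distribution and FIR efficiency can be demonstrated with the standard design metric 1 / trace((X'X)^-1); the scan count, tap count, and the two onset schemes below are invented and are not the study's actual protocols:

```python
import random
random.seed(1)

# Sketch comparing FIR estimation efficiency for two hypothetical stimulus
# distributions: jittered random onsets vs strictly periodic onsets, whose
# overlapping responses create multicollinearity in the design matrix.
N, K = 240, 10   # scans, FIR taps

def design_matrix(onsets):
    X = [[0.0] * K for _ in range(N)]
    for t in onsets:
        for k in range(K):
            X[t + k][k] = 1.0
    return X

def efficiency(X):
    # A = X'X, inverted by Gauss-Jordan elimination with partial pivoting
    A = [[sum(row[i] * row[j] for row in X) for j in range(K)] for i in range(K)]
    Inv = [[float(i == j) for j in range(K)] for i in range(K)]
    for col in range(K):
        piv = max(range(col, K), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        Inv[col], Inv[piv] = Inv[piv], Inv[col]
        p = A[col][col]
        A[col] = [a / p for a in A[col]]
        Inv[col] = [a / p for a in Inv[col]]
        for r in range(K):
            if r != col:
                f = A[r][col]
                A[r] = [a - f * b for a, b in zip(A[r], A[col])]
                Inv[r] = [a - f * b for a, b in zip(Inv[r], Inv[col])]
    return 1.0 / sum(Inv[i][i] for i in range(K))

jittered = sorted(random.sample(range(N - K), 40))   # 40 random onsets
periodic = list(range(0, N - K, 5))[:40]             # 40 onsets, fixed 5-scan gap

eff_jit = efficiency(design_matrix(jittered))
eff_per = efficiency(design_matrix(periodic))
print(f"jittered design efficiency {eff_jit:.3f} vs periodic {eff_per:.3f}")
```

    With a fixed 5-scan gap and 10 FIR taps, shifted regressor columns are nearly collinear and the efficiency collapses, which is the multicollinearity penalty the abstract measures in vivo.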

  4. Nonequilibrium statistical mechanics of shear flow: invariant quantities and current relations

    International Nuclear Information System (INIS)

    Baule, A; Evans, R M L

    2010-01-01

    In modeling nonequilibrium systems one usually starts with a definition of the microscopic dynamics, e.g., in terms of transition rates, and then derives the resulting macroscopic behavior. We address the inverse question for a class of steady state systems, namely complex fluids under continuous shear flow: how does an externally imposed shear current affect the microscopic dynamics of the fluid? The answer can be formulated in the form of invariant quantities, exact relations for the transition rates in the nonequilibrium steady state, as discussed in a recent letter (Baule and Evans, 2008 Phys. Rev. Lett. 101 240601). Here, we present a more pedagogical account of the invariant quantities and the theory underlying them, known as the nonequilibrium counterpart to detailed balance (NCDB). Furthermore, we investigate the relationship between the transition rates and the shear current in the steady state. We show that a fluctuation relation of the Gallavotti–Cohen type holds for systems satisfying NCDB

  5. Statistics and Analysis of the Relations between Rainstorm Floods and Earthquakes

    Directory of Open Access Journals (Sweden)

    Baodeng Hou

    2016-01-01

    Full Text Available The frequent occurrence of geophysical disasters under climate change has drawn Chinese scholars to pay attention to disaster relations. If the occurrence sequence of disasters could be identified, long-term disaster forecasting could be realized. Based on the Earth Degassing Effect (EDE), which is valid, this paper took the magnitude, epicenter, and occurrence time of earthquakes, as well as the epicenter and occurrence time of rainstorm floods, as basic factors to establish an integrated model to study the correlation between rainstorm floods and earthquakes. The 2461 severe earthquakes that occurred in China or within 3000 km of China and the 169 heavy rainstorm floods that occurred in China over the past 200+ years were used as the input data of the model. The computational results showed that although most of the rainstorm floods have nothing to do with severe earthquakes from a statistical perspective, some floods might be related to earthquakes. This is especially true when the earthquakes happen in the vapor transmission zone where rainstorms lead to abundant water vapor. In this regard, earthquakes are more likely to cause big rainstorm floods. However, many cases of rainstorm floods could be found after severe earthquakes, with a large extent of uncertainty.

  6. Statistical mechanics and field theory

    International Nuclear Information System (INIS)

    Samuel, S.A.

    1979-05-01

    Field theory methods are applied to statistical mechanics. Statistical systems are related to fermionic-like field theories through a path integral representation. Considered are the Ising model, the free-fermion model, and close-packed dimer problems on various lattices. Graphical calculational techniques are developed. They are powerful and yield a simple procedure to compute the vacuum expectation value of an arbitrary product of Ising spin variables. From a field theorist's point of view, this is the simplest, most logical derivation of the Ising model partition function and correlation functions. This work promises to open a new area of physics research when the methods are used to approximate unsolved problems. By the above methods a new model named the 128 pseudo-free vertex model is solved. Statistical mechanics intuition is applied to field theories. It is shown that certain relativistic field theories are equivalent to classical interacting gases. Using this analogy many results are obtained, particularly for the Sine-Gordon field theory. Quark confinement is considered. Although not a proof of confinement, a logical, esthetic, and simple picture is presented of how confinement works. A key ingredient is the insight gained by using an analog statistical system consisting of a gas of macromolecules. This analogy allows the computation of Wilson loops in the presence of topological vortices and when symmetry breakdown occurs in the topological quantum number. Topological symmetry breakdown calculations are placed on approximately the same level of rigor as instanton calculations. The picture of confinement that emerges is similar to the dual Meissner type advocated by Mandelstam. Before topological symmetry breakdown, QCD has monopoles bound linearly together by three topological strings. Topological symmetry breakdown corresponds to a new phase where these monopoles are liberated. It is these liberated monopoles that confine quarks. 64 references
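
    The fermionic path-integral derivation of the Ising partition function is far beyond a snippet, but the object being derived can be sanity-checked numerically; here is a one-dimensional cross-check of brute-force enumeration against the transfer-matrix closed form (chain length and coupling are arbitrary choices, and this is not the paper's 2D machinery):

```python
import math
from itertools import product

# Cross-check: 1D Ising partition function by brute-force enumeration vs the
# transfer-matrix closed form Z = (2 cosh K)^N + (2 sinh K)^N for a periodic
# chain, with K = beta * J.
N, K = 10, 0.7

Z_enum = 0.0
for spins in product((-1, 1), repeat=N):
    # Nearest-neighbor energy with periodic boundary conditions
    E = sum(spins[i] * spins[(i + 1) % N] for i in range(N))
    Z_enum += math.exp(K * E)

Z_exact = (2 * math.cosh(K)) ** N + (2 * math.sinh(K)) ** N
print(f"enumeration {Z_enum:.6f}  transfer matrix {Z_exact:.6f}")
```

    The closed form is the trace of the N-th power of the 2x2 transfer matrix, whose eigenvalues are 2 cosh K and 2 sinh K; the enumeration over all 2^N configurations agrees to machine precision.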

  7. Analysis of statistical misconception in terms of statistical reasoning

    Science.gov (United States)

    Maryati, I.; Priatna, N.

    2018-05-01

    Reasoning skill is needed by everyone in the globalization era, because every person has to be able to manage and use information from all over the world, which can be obtained easily. Statistical reasoning skill is the ability to collect, group, process, interpret, and draw conclusions from information. Developing this skill can be done through various levels of education. However, the skill remains low because many people, students included, assume that statistics is just counting and using formulas. Students still have a negative attitude toward courses related to research. The purpose of this research is to analyze students’ misconceptions in a descriptive statistics course in relation to statistical reasoning skill. The observation was done by analyzing the results of the misconception test and the statistical reasoning skill test, and by observing the effect of students’ misconceptions on statistical reasoning skill. The sample of this research was 32 students of the mathematics education department who had taken the descriptive statistics course. The mean value of the misconception test was 49.7 with standard deviation 10.6, whereas the mean value of the statistical reasoning skill test was 51.8 with standard deviation 8.5. Taking 65 as the minimum value for standard achievement of course competence, the students’ mean values are below the standard. The results of the misconception study emphasize which subtopics should be considered. Based on the assessment results, it was found that students’ misconceptions occur in: 1) writing mathematical sentences and symbols well, 2) understanding basic definitions, 3) determining the concept to be used in solving a problem. In statistical reasoning skill, the assessment measured reasoning about: 1) data, 2) representation, 3) statistical format, 4) probability, 5) samples, and 6) association.

  8. Statistical Physics and Light-Front Quantization

    Energy Technology Data Exchange (ETDEWEB)

    Raufeisen, J

    2004-08-12

    Light-front quantization has important advantages for describing relativistic statistical systems, particularly systems for which boost invariance is essential, such as the fireball created in heavy-ion collisions. In this paper the authors develop light-front field theory at finite temperature and density with special attention to quantum chromodynamics. They construct the most general form of the statistical operator allowed by the Poincare algebra and show that there are no zero-mode-related problems when describing phase transitions. They then demonstrate a direct connection between densities in light-front thermal field theory and the parton distributions measured in hard scattering experiments. The approach thus generalizes the concept of a parton distribution to finite temperature. In light-front quantization, the gauge-invariant Green's functions of a quark in a medium can be defined in terms of just 2-component spinors and have a much simpler spinor structure than the equal-time fermion propagator. From the Green's function, the authors introduce the new concept of a light-front density matrix, whose matrix elements are related to forward and to off-diagonal parton distributions. Furthermore, they explain how thermodynamic quantities can be calculated in discretized light-cone quantization, which is applicable at high chemical potential and is not plagued by the fermion-doubling problems.

  9. Fluctuation Relations for Currents

    Science.gov (United States)

    Sinitsyn, Nikolai; Akimov, Alexei; Chernyak, Vladimir; Chertkov, Michael

    2011-03-01

    We consider a non-equilibrium statistical system on a graph or a network. Identical particles are injected, interact with each other, traverse, and leave the graph in a stochastic manner described in terms of Poisson rates, possibly strongly dependent on time and instantaneous occupation numbers at the nodes of the graph. We show that the system demonstrates a profound statistical symmetry, leading to new Fluctuation Relations that originate from the supersymmetry and the principle of the geometric universality of currents rather than from the relations between probabilities of forward and reverse trajectories. NSF/ECCS-0925618, NSF/CHE-0808910 and DOE at LANL under Contract No. DE-AC52-06NA25396.

  10. On the statistical-mechanical meaning of the Bousso bound

    International Nuclear Information System (INIS)

    Pesci, Alessandro

    2008-01-01

    The Bousso entropy bound, in its generalized form, is investigated for the case of perfect fluids at local thermodynamic equilibrium and evidence is found that the bound is satisfied if and only if a certain local thermodynamic property holds, emerging when the attempt is made to apply the bound to thin layers of matter. This property consists of the existence of an ultimate lower limit l* to the thickness of the slices for which a statistical-mechanical description is viable, with l* depending on the thermodynamic variables which define the state of the system locally. This limiting scale, found to be in general much larger than the Planck scale (so that no Planck-scale physics need be invoked to justify it), appears unrelated to gravity and this suggests that the generalized entropy bound is likely to be rooted in conventional flat-spacetime statistical mechanics, with the maximum admitted entropy being however actually determined also by gravity. Some examples of ideal fluids are considered in order to identify the mechanisms which can set a lower limit to the statistical-mechanical description and these systems are found to respect the lower limiting scale l*. The photon gas, in particular, appears to saturate this limiting scale and the consequence is drawn that for systems consisting of a single slice of a photon gas with thickness l*, the generalized Bousso bound is saturated. It is argued that this seems to open the way to a peculiar understanding of black hole entropy: if an entropy can meaningfully (i.e. with a second law) be assigned to a black hole, the value A/4 for it (where A is the area of the black hole) is required simply by (conventional) statistical mechanics coupled to general relativity

  11. An 'electronic' extramural course in epidemiology and medical statistics.

    Science.gov (United States)

    Ostbye, T

    1989-03-01

    This article describes an extramural university course in epidemiology and medical statistics taught using a computer conferencing system, microcomputers and data communications. Computer conferencing was shown to be a powerful, yet quite easily mastered, vehicle for distance education. It allows health personnel unable to attend regular classes due to geographical or time constraints, to take part in an interactive learning environment at low cost. This overcomes part of the intellectual and social isolation associated with traditional correspondence courses. Teaching of epidemiology and medical statistics is well suited to computer conferencing, even if the asynchronicity of the medium makes discussion of the most complex statistical concepts a little cumbersome. Computer conferencing may also prove to be a useful tool for teaching other medical and health related subjects.

  12. Statistical moments of the Strehl ratio

    Science.gov (United States)

    Yaitskova, Natalia; Esselborn, Michael; Gladysz, Szymon

    2012-07-01

    Knowledge of the statistical characteristics of the Strehl ratio is essential for the performance assessment of existing and future adaptive optics systems. For a full assessment, not only the mean value of the Strehl ratio but also its higher statistical moments are important. Variance is related to the stability of an image, and skewness reflects the chance that a set of short-exposure images contains more or fewer images whose quality exceeds the mean. Skewness is a central parameter in the domain of lucky imaging. We present a rigorous theory for the calculation of the mean value, the variance and the skewness of the Strehl ratio. In our approach we represent the residual wavefront as being formed by independent cells. The level of the adaptive optics correction defines the number of cells and the variance of the cells, which are the two main parameters of our theory. The deliverables are the values of the three moments as functions of the correction level. We make no further assumptions except for the statistical independence of the cells.
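    As a rough numerical companion to this abstract (not the authors' derivation), the independent-cell picture can be sketched by Monte Carlo: each cell contributes a random phase, the instantaneous Strehl ratio is the squared modulus of the mean complex phasor, and the three moments are estimated from samples. The cell count, phase variance and sample size below are illustrative assumptions.

    ```python
    import cmath
    import random

    def strehl_sample(n_cells, sigma, rng):
        # Residual wavefront modelled as n_cells independent Gaussian phase cells
        # (std dev `sigma`, in radians); the instantaneous Strehl ratio is the
        # squared modulus of the mean complex phasor over the pupil.
        phasor = sum(cmath.exp(1j * rng.gauss(0.0, sigma)) for _ in range(n_cells)) / n_cells
        return abs(phasor) ** 2

    def moments(xs):
        # Sample mean, variance and skewness.
        n = len(xs)
        mean = sum(xs) / n
        var = sum((x - mean) ** 2 for x in xs) / n
        skew = sum((x - mean) ** 3 for x in xs) / n / var ** 1.5
        return mean, var, skew

    rng = random.Random(1)
    samples = [strehl_sample(n_cells=100, sigma=0.5, rng=rng) for _ in range(2000)]
    mean_s, var_s, skew_s = moments(samples)
    ```

    With a good correction level (small sigma) the sample mean approaches the Maréchal-like value exp(-sigma**2), while variance and skewness shrink as the number of cells grows.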

  13. Mathematical and Statistical Techniques for Systems Medicine: The Wnt Signaling Pathway as a Case Study

    KAUST Repository

    MacLean, Adam L.

    2015-12-16

    The last decade has seen an explosion in models that describe phenomena in systems medicine. Such models are especially useful for studying signaling pathways, such as the Wnt pathway. In this chapter we use the Wnt pathway to showcase current mathematical and statistical techniques that enable modelers to gain insight into (models of) gene regulation and generate testable predictions. We introduce a range of modeling frameworks, but focus on ordinary differential equation (ODE) models since they remain the most widely used approach in systems biology and medicine and continue to offer great potential. We present methods for the analysis of a single model, comprising applications of standard dynamical systems approaches such as nondimensionalization, steady state, asymptotic and sensitivity analysis, and more recent statistical and algebraic approaches to compare models with data. We present parameter estimation and model comparison techniques, focusing on Bayesian analysis and coplanarity via algebraic geometry. Our intention is that this (non-exhaustive) review may serve as a useful starting point for the analysis of models in systems medicine.
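    A minimal sketch of the ODE-modeling approach the chapter surveys, using a toy production-degradation equation rather than an actual Wnt pathway model; the rate constants are arbitrary and the integrator is a plain classical Runge-Kutta step.

    ```python
    def rk4_step(f, t, x, dt):
        # One classical Runge-Kutta (RK4) step for dx/dt = f(t, x)
        k1 = f(t, x)
        k2 = f(t + dt / 2, x + dt / 2 * k1)
        k3 = f(t + dt / 2, x + dt / 2 * k2)
        k4 = f(t + dt, x + dt * k3)
        return x + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

    def simulate(production, degradation, x0=0.0, t_end=50.0, dt=0.01):
        # Toy activation-degradation model: dx/dt = production - degradation * x,
        # with analytic steady state x* = production / degradation.
        f = lambda t, x: production - degradation * x
        t, x = 0.0, x0
        while t < t_end - 1e-12:
            x = rk4_step(f, t, x, dt)
            t += dt
        return x

    steady = simulate(production=2.0, degradation=0.5)
    ```

    Steady-state and sensitivity analyses of the kind described in the chapter start from exactly this sort of simulation: here the trajectory relaxes to the analytic steady state x* = 2.0 / 0.5 = 4.0.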

  14. Statistical mechanics

    CERN Document Server

    Davidson, Norman

    2003-01-01

    Clear and readable, this fine text assists students in achieving a grasp of the techniques and limitations of statistical mechanics. The treatment follows a logical progression from elementary to advanced theories, with careful attention to detail and mathematical development, and is sufficiently rigorous for introductory or intermediate graduate courses.Beginning with a study of the statistical mechanics of ideal gases and other systems of non-interacting particles, the text develops the theory in detail and applies it to the study of chemical equilibrium and the calculation of the thermody

  15. Non-equilibrium statistical theory about microscopic fatigue cracks of metal in magnetic field

    International Nuclear Information System (INIS)

    Zhao-Long, Liu; Hai-Yun, Hu; Tian-You, Fan; Xiu-San, Xing

    2010-01-01

    This paper develops a non-equilibrium statistical fatigue damage theory to study the statistical behaviour of micro-cracks in metals in a magnetic field. A one-dimensional homogeneous crack system is chosen for study. To investigate the effect of the magnetic field on the statistical distribution of micro-cracks in the system, a theoretical analysis of the micro-crack evolution equation, the average micro-crack length, the micro-crack density distribution function and the fatigue fracture probability has been performed. The derived results relate the changes of these quantities to the applied magnetic field and to the magnetic and mechanical properties of the metal. The theory gives a theoretical explanation of the change in fatigue damage due to magnetic fields observed in experiments, and presents an analytic approach to studying the fatigue damage of metals in a magnetic field. (cross-disciplinary physics and related areas of science and technology)

  16. Record Statistics and Dynamics

    DEFF Research Database (Denmark)

    Sibani, Paolo; Jensen, Henrik J.

    2009-01-01

    with independent random increments. The term record dynamics covers the rather new idea that records may, in special situations, have measurable dynamical consequences. The approach applies to the aging dynamics of glasses and other systems with multiple metastable states. The basic idea is that record-sized fluctuations of e.g. the energy are able to push the system past some sort of 'edge of stability', inducing irreversible configurational changes whose statistics then closely follow the statistics of record fluctuations.

  17. Counting statistics in radioactivity measurements

    International Nuclear Information System (INIS)

    Martin, J.

    1975-01-01

    The application of statistical methods to radioactivity measurement problems is analyzed in several chapters devoted successively to: the statistical nature of radioactivity counts; the application to radioactive counting of two theoretical probability distributions, Poisson's distribution law and the Laplace-Gauss law; true counting laws; corrections related to the nature of the apparatus; statistical techniques in gamma spectrometry [fr
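    The Poisson character of radioactive counts described here can be illustrated with the standard background-subtraction error propagation; the counts and counting times below are invented for illustration.

    ```python
    import math

    def net_rate_and_sigma(gross_counts, background_counts, t_sample, t_background):
        # Poisson counting statistics: the standard deviation of N recorded counts
        # is sqrt(N); the net count-rate uncertainty follows by error propagation
        # of the gross and background measurements.
        rate = gross_counts / t_sample - background_counts / t_background
        sigma = math.sqrt(gross_counts / t_sample ** 2 + background_counts / t_background ** 2)
        return rate, sigma

    rate, sigma = net_rate_and_sigma(10_000, 400, t_sample=100.0, t_background=100.0)
    ```

    For 10,000 gross counts and 400 background counts in 100 s each, the net rate is 96 counts/s with an uncertainty of about 1.02 counts/s.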

  18. Statistical analogues of thermodynamic extremum principles

    Science.gov (United States)

    Ramshaw, John D.

    2018-05-01

    As shown by Jaynes, the canonical and grand canonical probability distributions of equilibrium statistical mechanics can be simply derived from the principle of maximum entropy, in which the statistical entropy $S = -k_{\mathrm{B}} \sum_i p_i \log p_i$ is maximised subject to constraints on the mean values of the energy E and/or number of particles N in a system of fixed volume V. The Lagrange multipliers associated with those constraints are then found to be simply related to the temperature T and chemical potential μ. Here we show that the constrained maximisation of S is equivalent to, and can therefore be replaced by, the essentially unconstrained minimisation of the obvious statistical analogues of the Helmholtz free energy F = E − TS and the grand potential J = F − μN. Those minimisations are more easily performed than the maximisation of S because they formally eliminate the constraints on the mean values of E and N and their associated Lagrange multipliers. This procedure significantly simplifies the derivation of the canonical and grand canonical probability distributions, and shows that the well known extremum principles for the various thermodynamic potentials possess natural statistical analogues which are equivalent to the constrained maximisation of S.
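    The equivalence described here can be checked numerically for a toy three-level system (energies and temperature chosen arbitrarily, with k_B = 1): the canonical distribution minimises the statistical analogue F = ⟨E⟩ − TS, and the minimum value equals −T log Z.

    ```python
    import math

    def canonical(energies, T):
        # Canonical distribution p_i ∝ exp(-E_i / T)  (k_B = 1)
        weights = [math.exp(-e / T) for e in energies]
        Z = sum(weights)
        return [w / Z for w in weights]

    def free_energy(p, energies, T):
        # Statistical analogue F = <E> - T*S with S = -sum_i p_i log p_i
        E_mean = sum(pi * e for pi, e in zip(p, energies))
        S = -sum(pi * math.log(pi) for pi in p if pi > 0)
        return E_mean - T * S

    E = [0.0, 1.0, 2.0]   # arbitrary toy spectrum
    T = 1.0
    p_star = canonical(E, T)
    F_star = free_energy(p_star, E, T)
    ```

    Any other normalized distribution, e.g. [0.5, 0.3, 0.2], yields a strictly larger F, which is the unconstrained-minimisation statement of the abstract in miniature.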

  19. Trajectory and Relative Dispersion Case Studies and Statistics from the Green River Mesoscale Deformation, Dispersion, and Dissipation Program

    Science.gov (United States)

    Niemann, Brand Lee

    A major field program to study beta-mesoscale transport and dispersion over complex mountainous terrain was conducted during 1969 with the cooperation of three government agencies at the White Sands Missile Range in central Utah. The purpose of the program was to measure simultaneously on a large number of days the synoptic and mesoscale wind fields, the relative dispersion between pairs of particle trajectories and the rate of small scale turbulence dissipation. The field program included measurements during more than 60 days in the months of March, June, and November. The large quantity of data generated from this program has been processed and analyzed to provide case studies and statistics to evaluate and refine Lagrangian variable trajectory models. The case studies selected to illustrate the complexities of mesoscale transport and dispersion over complex terrain include those with terrain blocking, lee waves, and stagnation, as well as those with large vertical wind shears and horizontal wind field deformation. The statistics of relative particle dispersion were computed and compared to the classical theories of Richardson and Batchelor and the more recent theories of Lin and Kao among others. The relative particle dispersion was generally found to increase with travel time in the alongwind and crosswind directions, but in a more oscillatory than sustained or even accelerated manner as predicted by most theories, unless substantial wind shears or finite vertical separations between particles were present. The relative particle dispersion in the vertical was generally found to be small and bounded even when substantial vertical motions due to lee waves were present, because of the limiting effect of stable temperature stratification. The data show that velocity shears have a more significant effect than turbulence on relative particle dispersion and that sufficient turbulence may not always be present above the planetary boundary layer for "wind direction shear

  20. Emergence of quantum mechanics from classical statistics

    International Nuclear Information System (INIS)

    Wetterich, C

    2009-01-01

    The conceptual setting of quantum mechanics has been subject to an ongoing debate from its beginnings until now. The consequences of the apparent differences between quantum statistics and classical statistics range from philosophical interpretations to practical issues such as quantum computing. In this note we demonstrate how quantum mechanics can emerge from classical statistical systems, and we discuss the conditions and circumstances for this to happen. Quantum systems describe isolated subsystems of classical statistical systems with infinitely many states. While infinitely many classical observables 'measure' properties of the subsystem and its environment, the state of the subsystem can be characterized by the expectation values of only a few probabilistic observables. They define a density matrix, and all the usual laws of quantum mechanics follow. No concepts beyond classical statistics are needed for quantum physics - the differences are only apparent and result from the particularities of those classical statistical systems which admit a quantum mechanical description. In particular, we show how the non-commuting properties of quantum operators are associated with the use of conditional probabilities within the classical system, and how a unitary time evolution reflects the isolation of the subsystem.
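    The statement that a few expectation values define a density matrix can be made concrete for a two-level subsystem: three "probabilistic observables" (the Pauli expectation values) fully fix the state. This is only an illustrative sketch of that textbook construction, not the specific construction used in the paper.

    ```python
    def density_matrix(rx, ry, rz):
        # rho = (I + r·sigma) / 2 for a Bloch vector r = (rx, ry, rz), |r| <= 1;
        # the three expectation values fully determine the 2x2 density matrix.
        return [
            [0.5 * (1 + rz), 0.5 * (rx - 1j * ry)],
            [0.5 * (rx + 1j * ry), 0.5 * (1 - rz)],
        ]

    def expectation(rho, obs):
        # Tr(rho · obs) for 2x2 matrices stored as nested lists
        return sum(rho[i][k] * obs[k][i] for i in range(2) for k in range(2))

    identity = [[1, 0], [0, 1]]
    sigma_x = [[0, 1], [1, 0]]
    sigma_z = [[1, 0], [0, -1]]
    rho = density_matrix(0.3, 0.0, 0.4)
    ```

    Reading the expectation values back out of rho recovers exactly the numbers that defined it, which is the sense in which a few observables characterize the subsystem state.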

  1. Tennessee StreamStats: A Web-Enabled Geographic Information System Application for Automating the Retrieval and Calculation of Streamflow Statistics

    Science.gov (United States)

    Ladd, David E.; Law, George S.

    2007-01-01

    The U.S. Geological Survey (USGS) provides streamflow and other stream-related information needed to protect people and property from floods, to plan and manage water resources, and to protect water quality in the streams. Streamflow statistics provided by the USGS, such as the 100-year flood and the 7-day 10-year low flow, frequently are used by engineers, land managers, biologists, and many others to help guide decisions in their everyday work. In addition to streamflow statistics, resource managers often need to know the physical and climatic characteristics (basin characteristics) of the drainage basins for locations of interest to help them understand the mechanisms that control water availability and water quality at these locations. StreamStats is a Web-enabled geographic information system (GIS) application that makes it easy for users to obtain streamflow statistics, basin characteristics, and other information for USGS data-collection stations and for ungaged sites of interest. If a user selects the location of a data-collection station, StreamStats will provide previously published information for the station from a database. If a user selects a location where no data are available (an ungaged site), StreamStats will run a GIS program to delineate a drainage basin boundary, measure basin characteristics, and estimate streamflow statistics based on USGS streamflow prediction methods. A user can download a GIS feature class of the drainage basin boundary with attributes including the measured basin characteristics and streamflow estimates.

  2. Statistical core design

    International Nuclear Information System (INIS)

    Oelkers, E.; Heller, A.S.; Farnsworth, D.A.; Kearfott, K.J.

    1978-01-01

    The report describes the statistical analysis of the DNBR thermal-hydraulic margin of a 3800 MWt, 205-FA core under design overpower conditions. The analysis used LYNX-generated data at predetermined values of the input variables whose uncertainties were to be statistically combined. LYNX data were used to construct an efficient response surface model in the region of interest; the statistical analysis was then accomplished through the evaluation of core reliability, utilizing propagation of the uncertainty distributions of the inputs. The response surface model was implemented in both analytical error propagation and Monte Carlo techniques. The basic structural units relating to the acceptance criteria are fuel pins; therefore, the statistical population of pins with minimum DNBR values smaller than specified values is determined. The specified values are designated relative to the most probable and maximum design DNBR values on the power-limiting pin used in present design analysis, so that gains over the present design criteria can be assessed for specified probabilistic acceptance criteria. The results are equivalent to gains ranging from 1.2 to 4.8 percent of rated power, depending on the acceptance criterion. The corresponding acceptance criteria range from 95 percent confidence that no pin will be in DNB to 99.9 percent of the pins being expected to avoid DNB.
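    A schematic version of the Monte Carlo branch of such an analysis, with an invented quadratic response surface and invented input uncertainties standing in for the LYNX-based model: inputs are sampled from their uncertainty distributions, propagated through the surrogate, and the fraction of samples below a DNBR limit is tallied.

    ```python
    import random

    def dnbr_surface(flow, power):
        # Invented quadratic response surface standing in for the fitted model:
        # DNBR rises with normalized coolant flow and falls with normalized pin power.
        return 1.3 + 0.8 * (flow - 1.0) - 1.1 * (power - 1.0) + 0.5 * (flow - 1.0) ** 2

    def mc_failure_probability(n, limit, seed=7):
        # Propagate assumed input uncertainties through the surface by Monte Carlo
        # and estimate the probability that DNBR falls below `limit`.
        rng = random.Random(seed)
        failures = 0
        for _ in range(n):
            flow = rng.gauss(1.0, 0.03)   # assumed flow uncertainty
            power = rng.gauss(1.0, 0.05)  # assumed power uncertainty
            if dnbr_surface(flow, power) < limit:
                failures += 1
        return failures / n

    p_fail = mc_failure_probability(100_000, limit=1.2)
    ```

    In a real analysis the surrogate would be fitted to thermal-hydraulic code runs and the estimated probability compared against the probabilistic acceptance criterion.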

  3. The Norwegian research and innovation system - statistics and indicators 2003

    International Nuclear Information System (INIS)

    2003-01-01

    This is the fourth report in a series from the Research Council of Norway. The report shows the extent of resource use in research, development and innovation and presents results of these activities. It is based on the R and D and innovation statistics for 2001 as well as other statistics and analyses, and contains time series and international comparisons. The aim of the report is to present an overall survey of the state and development of activities in Norway within research, innovation, science and technology. This includes data regarding costs and financing of R and D work, human resources, cooperation relations and results of R and D and innovation activities, including publishing and citations, patenting and trade balances. The report opens with a research policy article about research as a basis for new business. Furthermore, several ''focus boxes'' are included that indicate the development of science and technology indicators within various themes. In the report for 2003, the EU's central benchmarking indicators are included for the first time, and a survey is made of public investigations, white papers and parliamentary proposals within research, higher education and innovation. For the second time a short English version is included.

  4. [The main directions of reforming the service of medical statistics in Ukraine].

    Science.gov (United States)

    Golubchykov, Mykhailo V; Orlova, Nataliia M; Bielikova, Inna V

    2018-01-01

    accepted methods for collecting, processing, analyzing and disseminating medical and statistical information; the creation of a medical statistics service that is adapted to the specifics of market relations in health care, and is flexible and sensitive to changes in international methodologies and standards. Conclusions: The data of medical statistics are the basis for managerial decision-making at all levels of health care. Reform of the Medical Statistics Service of Ukraine should be considered only in the context of the reform of the entire health system. The main directions of the reform of the medical statistics service in Ukraine are: the introduction of information technologies, improved training of personnel for the service, better material and technical equipment, and maximum reuse of the data obtained, which requires the unification of primary data and of the system of indicators. The most difficult area is the formation of information funds and the introduction of modern information technologies.

  5. Statistical analysis of the potassium concentration obtained through

    International Nuclear Information System (INIS)

    Pereira, Joao Eduardo da Silva; Silva, Jose Luiz Silverio da; Pires, Carlos Alberto da Fonseca; Strieder, Adelir Jose

    2007-01-01

    The present work was developed on outcrops of the Santa Maria region, Rio Grande do Sul State, southern Brazil. Statistical evaluations were applied to different rock types. The possibility of distinguishing different geologic units, sedimentary and volcanic (acid and basic types), by means of statistical analyses of airborne gamma-ray spectrometry data, integrating potassium radiation emissions with geological and geochemical data, is discussed. The survey was carried out in 1973 by the Geological Survey of Brazil/Companhia de Pesquisas de Recursos Minerais. The Camaqua Project evaluated the behavior of potassium concentrations, generating XYZ Geosoft 1997 format files, a grid, and thematic and digital thematic map files for the total area. Using this database, the integration of statistical analyses was tested for sedimentary formations belonging to the Depressao Central do Rio Grande do Sul and for volcanic rocks from the Planalto da Serra Geral at the border of the Parana Basin. A univariate statistical model was used: the mean, the standard error of the mean, and the confidence limits were estimated. Tukey's test was used to compare mean values. The results allowed criteria to be created to distinguish geological formations based on their potassium content. The back-calibration technique was employed to transform K radiation into percentages. In this context it was possible to define characteristic values of radioactive potassium emissions and their confidence ranges in relation to the geologic formations. The potassium variable, when evaluated in relation to the Universal Transverse Mercator geographic coordinate system, showed a spatial relation following a second-order polynomial model, with one determination coefficient. The Statistica 7.1 software (Generalized Linear Models), produced by the Statistics Department of the Federal University of Santa Maria, Brazil, was used. (author)
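    The univariate summaries mentioned here (mean, standard error of the mean, confidence limits) can be sketched as follows; the %K readings are invented, and non-overlapping confidence intervals are used as a simple screen in place of a formal Tukey comparison.

    ```python
    import math

    def summarize(values, z=1.96):
        # Mean, standard error of the mean, and approximate 95% confidence limits.
        n = len(values)
        mean = sum(values) / n
        var = sum((v - mean) ** 2 for v in values) / (n - 1)
        se = math.sqrt(var / n)
        return mean, se, (mean - z * se, mean + z * se)

    def intervals_distinct(ci_a, ci_b):
        # Non-overlapping confidence intervals: a conservative screen for
        # distinguishing two map units (a stand-in for Tukey's test).
        return ci_a[1] < ci_b[0] or ci_b[1] < ci_a[0]

    # Invented %K readings for two hypothetical map units
    sedimentary = [1.1, 1.3, 1.2, 1.4, 1.2, 1.3]
    acid_volcanic = [3.0, 3.4, 3.2, 3.1, 3.3, 3.2]
    mean_sed, se_sed, ci_sed = summarize(sedimentary)
    mean_vol, se_vol, ci_vol = summarize(acid_volcanic)
    ```

    When the confidence ranges of two units do not overlap, their potassium contents provide a usable discrimination criterion of the kind the study derives.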

  6. Thermal and statistical properties of nuclei and nuclear systems

    International Nuclear Information System (INIS)

    Moretto, L.G.; Wozniak, G.J.

    1989-07-01

    The term statistical decay, statistical or thermodynamic equilibrium, thermalization, temperature, etc., have been used in nuclear physics since the introduction of the compound nucleus (CN) concept, and they are still used, perhaps even more frequently, in the context of intermediate- and high-energy heavy-ion reactions. Unfortunately, the increased popularity of these terms has not made them any clearer, and more often than not one encounters sweeping statements about the alleged statisticity of a nuclear process where the ''statistical'' connotation is a more apt description of the state of the speaker's mind than of the nuclear reaction. It is our goal, in this short set of lectures, to set at least some ideas straight on this broad and beautiful subject, on the one hand by clarifying some fundamental concepts, on the other by presenting some interesting applications to actual physical cases. 74 refs., 38 figs

  7. Statistical distribution of the local purity in a large quantum system

    International Nuclear Information System (INIS)

    De Pasquale, A; Pascazio, S; Facchi, P; Giovannetti, V; Parisi, G; Scardicchio, A

    2012-01-01

    The local purity of large many-body quantum systems can be studied by following a statistical mechanical approach based on a random matrix model. Restricting the analysis to the case of global pure states, this method proved to be successful, and a full characterization of the statistical properties of the local purity was obtained by computing the partition function of the problem. Here we generalize these techniques to the case of global mixed states. In this context, by uniformly sampling the phase space of states with assigned global mixedness, we determine the exact expression of the first two moments of the local purity and a general expression for the moments of higher order. This generalizes previous results obtained for globally pure configurations. Furthermore, through the introduction of a partition function for a suitable canonical ensemble, we compute the approximate expression of the first moment of the marginal purity in the high-temperature regime. In the process, we establish a formal connection with the theory of quantum twirling maps that provides an alternative, possibly fruitful, way of performing the calculation. (paper)
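    For a globally pure state, the local purity studied here is Tr(ρ_A²) of the reduced state of subsystem A. A small stdlib-only sketch (the Gaussian sampling below is a convenient stand-in, not necessarily the ensemble used in the paper):

    ```python
    import random

    def random_pure_state(dA, dB, rng):
        # Random global pure state on H_A ⊗ H_B, stored as a dA x dB amplitude matrix
        amps = [[complex(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(dB)]
                for _ in range(dA)]
        norm = sum(abs(a) ** 2 for row in amps for a in row) ** 0.5
        return [[a / norm for a in row] for row in amps]

    def local_purity(amps):
        # rho_A = M M^dagger (partial trace over B); since rho_A is Hermitian,
        # purity = Tr(rho_A^2) = sum_{i,j} |(rho_A)_{ij}|^2
        dA, dB = len(amps), len(amps[0])
        rho = [[sum(amps[i][k] * amps[j][k].conjugate() for k in range(dB))
                for j in range(dA)] for i in range(dA)]
        return sum(abs(rho[i][j]) ** 2 for i in range(dA) for j in range(dA))

    rng = random.Random(0)
    purity = local_purity(random_pure_state(4, 16, rng))
    ```

    The purity is bounded between 1/dA (maximally mixed reduced state) and 1 (product state); sampling many states would give the statistical distribution the paper characterizes.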

  8. Exploring Factors Related to Completion of an Online Undergraduate-Level Introductory Statistics Course

    Science.gov (United States)

    Zimmerman, Whitney Alicia; Johnson, Glenn

    2017-01-01

    Data were collected from 353 online undergraduate introductory statistics students at the beginning of a semester using the Goals and Outcomes Associated with Learning Statistics (GOALS) instrument and an abbreviated form of the Statistics Anxiety Rating Scale (STARS). Data included a survey of expected grade, expected time commitment, and the…

  9. Identical particles, exotic statistics and braid groups

    International Nuclear Information System (INIS)

    Imbo, T.D.; Sudarshan, E.C.G.; Shah Imbo, C.

    1990-01-01

    The inequivalent quantizations of a system of n identical particles on a manifold M, dim M ≥ 2, are in 1-1 correspondence with irreducible unitary representations of the braid group B_n(M). The notion of the statistics of the particles is made precise. We give various examples where all the possible statistics for the system are determined, and find instances where the particles obey statistics different from the well-studied Bose, Fermi, para- and θ-statistics. (orig.)

  10. Designing Solutions by a Student Centred Approach: Integration of Chemical Process Simulation with Statistical Tools to Improve Distillation Systems

    Directory of Open Access Journals (Sweden)

    Isabel M. Joao

    2017-09-01

    Full Text Available Projects thematically focused on simulation and statistical techniques for designing and optimizing chemical processes can be helpful in chemical engineering education in order to meet the needs of engineers. We argue for the relevance of such projects in improving a student-centred approach and boosting higher-order thinking skills. This paper addresses the use of Aspen HYSYS by Portuguese chemical engineering master students to model distillation systems, together with statistical experimental design techniques, in order to optimize the systems, highlighting the value of applying problem-specific knowledge, simulation tools and sound statistical techniques. The paper summarizes the work developed by the students in order to model steady-state processes, model dynamic processes and optimize the distillation systems, emphasizing the benefits of the simulation tools and statistical techniques in helping the students learn how to learn. Students strengthened their domain-specific knowledge and became motivated to rethink and improve chemical processes in their future chemical engineering profession. We discuss the main advantages of the methodology from the students' and teachers' perspectives.

  11. Improving the Statistical Modeling of the TRMM Extreme Precipitation Monitoring System

    Science.gov (United States)

    Demirdjian, L.; Zhou, Y.; Huffman, G. J.

    2016-12-01

    This project improves upon an existing extreme precipitation monitoring system based on the Tropical Rainfall Measuring Mission (TRMM) daily product (3B42) using new statistical models. The proposed system utilizes a regional modeling approach, where data from similar grid locations are pooled to increase the quality and stability of the resulting model parameter estimates to compensate for the short data record. The regional frequency analysis is divided into two stages. In the first stage, the region defined by the TRMM measurements is partitioned into approximately 27,000 non-overlapping clusters using a recursive k-means clustering scheme. In the second stage, a statistical model is used to characterize the extreme precipitation events occurring in each cluster. Instead of utilizing the block-maxima approach used in the existing system, where annual maxima are fit to the Generalized Extreme Value (GEV) probability distribution at each cluster separately, the present work adopts the peak-over-threshold (POT) method of classifying points as extreme if they exceed a pre-specified threshold. Theoretical considerations motivate the use of the Generalized Pareto (GP) distribution for fitting threshold exceedances. The fitted parameters can be used to construct simple and intuitive average recurrence interval (ARI) maps which reveal how rare a particular precipitation event is given its spatial location. The new methodology eliminates much of the random noise that was produced by the existing models due to a short data record, producing more reasonable ARI maps when compared with NOAA's long-term Climate Prediction Center (CPC) ground-based observations. The resulting ARI maps can be useful for disaster preparation, warning, and management, as well as increased public awareness of the severity of precipitation events. Furthermore, the proposed methodology can be applied to various other extreme climate records.
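    A sketch of the peak-over-threshold step: fit a Generalized Pareto distribution to threshold excesses and convert the fit into a return level, from which an ARI map entry follows. Method-of-moments estimators are used here for simplicity; a production system would likely use a likelihood fit, and all numbers below are invented.

    ```python
    import math

    def gpd_fit_moments(excesses):
        # Method-of-moments estimators for the Generalized Pareto distribution
        # fitted to excesses over a threshold: shape xi and scale sigma.
        n = len(excesses)
        m = sum(excesses) / n
        v = sum((x - m) ** 2 for x in excesses) / (n - 1)
        xi = 0.5 * (1.0 - m * m / v)
        sigma = 0.5 * m * (m * m / v + 1.0)
        return xi, sigma

    def return_level(xi, sigma, threshold, exceed_rate, T):
        # Precipitation amount exceeded on average once every T days, where
        # `exceed_rate` is the fraction of days exceeding `threshold`.
        if abs(xi) < 1e-9:
            return threshold + sigma * math.log(exceed_rate * T)
        return threshold + (sigma / xi) * ((exceed_rate * T) ** xi - 1.0)

    # Invented daily excesses (mm above the threshold) at one cluster
    xi, sigma = gpd_fit_moments([5.0, 10.0, 15.0, 20.0])
    ```

    Evaluating `return_level` over a grid of T values yields the ARI curve for a cluster; repeating per cluster yields the map.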

  12. Classical model of intermediate statistics

    International Nuclear Information System (INIS)

    Kaniadakis, G.

    1994-01-01

    In this work we present a classical kinetic model of intermediate statistics. In the case of Brownian particles we show that the Fermi-Dirac (FD) and Bose-Einstein (BE) distributions can be obtained, just as the Maxwell-Boltzmann (MB) distribution, as steady states of a classical kinetic equation that intrinsically takes into account an exclusion-inclusion principle. In our model the intermediate statistics are obtained as steady states of a system of coupled nonlinear kinetic equations, where the coupling constants are the transmutational potentials η_κκ'. We show that, besides the FD-BE intermediate statistics extensively studied from the quantum point of view, we can also study the MB-FD and MB-BE ones. Moreover, our model allows us to treat the three-state mixing FD-MB-BE intermediate statistics. For boson and fermion mixing in a D-dimensional space, we obtain a family of FD-BE intermediate statistics by varying the transmutational potential η_BF. This family contains, as a particular case when η_BF = 0, the quantum statistics recently proposed by L. Wu, Z. Wu, and J. Sun [Phys. Lett. A 170, 280 (1992)]. When we consider the two-dimensional FD-BE statistics, we derive an analytic expression for the fraction of fermions. When the temperature T→∞, the system is composed of an equal number of bosons and fermions, regardless of the value of η_BF. On the contrary, when T = 0, η_BF becomes important and, according to its value, the system can be completely bosonic or fermionic, or composed of both bosons and fermions.
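    The three classical steady states the kinetic model reproduces are the familiar occupation-number laws (k_B = 1 here); the intermediate FD-BE family itself depends on the transmutational potentials and is not reproduced in this sketch.

    ```python
    import math

    def occupation(e, mu, T, kind):
        # Mean occupation number of a single-particle level of energy e (k_B = 1):
        # MB (Maxwell-Boltzmann), FD (Fermi-Dirac) and BE (Bose-Einstein).
        x = math.exp((e - mu) / T)
        if kind == "MB":
            return 1.0 / x
        if kind == "FD":
            return 1.0 / (x + 1.0)
        if kind == "BE":
            return 1.0 / (x - 1.0)   # requires e > mu
        raise ValueError(kind)

    n_mb = occupation(1.0, 0.0, 1.0, "MB")
    n_fd = occupation(1.0, 0.0, 1.0, "FD")
    n_be = occupation(1.0, 0.0, 1.0, "BE")
    ```

    The exclusion-inclusion ordering FD < MB < BE holds at every level, which is the ordering an intermediate statistics interpolates across.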

  13. Quantum Statistical Operator and Classically Chaotic Hamiltonian ...

    African Journals Online (AJOL)

    Quantum Statistical Operator and Classically Chaotic Hamiltonian System. ... Journal of the Nigerian Association of Mathematical Physics ... In a Hamiltonian system von Neumann Statistical Operator is used to tease out the quantum consequence of (classical) chaos engendered by the nonlinear coupling of system to its ...

  14. ANALYSIS OF STATISTICAL DATA FROM NETWORK INFRASTRUCTURE MONITORING TO DETECT ABNORMAL BEHAVIOR OF SYSTEM LOCAL SEGMENTS

    Directory of Open Access Journals (Sweden)

    N. A. Bazhayev

    2017-01-01

    Full Text Available We propose a method of information security monitoring for wireless network segments of low-power devices ('smart house', Internet of Things). We analyze the characteristics of systems based on wireless technologies, as obtained from passive surveillance and active polling of the devices that make up the network infrastructure, and consider a number of external signs of unauthorized access to a wireless network by a potential attacker. The model for analysis of the information security state is based on identity, quantity, frequency, and time characteristics. Due to the main features of the devices providing the network infrastructure, estimation of the information security state is directed at analysis of the system's normal operation, rather than the search for signatures and anomalies during various kinds of information attacks. We describe an experiment that provides statistical information on remote wireless devices, in which the data for decision-making are accumulated by comparing statistical information from end-node service messages in passive and active modes. We present experimental results of an information attack on a typical system. The proposed approach to the analysis of network infrastructure statistical data, based on a naive Bayesian classifier, can be used to determine the state of information security.
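    A minimal Gaussian naive Bayes over per-device traffic statistics, as a stand-in for the classifier the authors describe; the feature set (service-message rate, response time) and the training data are invented for illustration.

    ```python
    import math

    class TrafficNaiveBayes:
        # Minimal Gaussian naive Bayes: one Gaussian per feature per class,
        # features independent given the class (the "naive" assumption).
        def fit(self, X, y):
            self.prior, self.stats = {}, {}
            for label in set(y):
                rows = [x for x, lab in zip(X, y) if lab == label]
                self.prior[label] = len(rows) / len(X)
                per_feature = []
                for col in zip(*rows):
                    m = sum(col) / len(col)
                    var = max(sum((v - m) ** 2 for v in col) / len(col), 1e-9)
                    per_feature.append((m, var))
                self.stats[label] = per_feature
            return self

        def predict(self, x):
            def log_posterior(label):
                lp = math.log(self.prior[label])
                for v, (m, var) in zip(x, self.stats[label]):
                    lp += -0.5 * math.log(2 * math.pi * var) - (v - m) ** 2 / (2 * var)
                return lp
            return max(self.stats, key=log_posterior)

    # Invented windows: [service messages per minute, mean response time in ms]
    X = [[10, 50], [12, 55], [11, 52], [40, 200], [42, 210], [38, 190]]
    y = ["normal", "normal", "normal", "anomalous", "anomalous", "anomalous"]
    clf = TrafficNaiveBayes().fit(X, y)
    ```

    New observation windows are then classified as normal or anomalous from the accumulated passive/active statistics.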

  15. Bandwidth Reservation Using Velocity and Handoff Statistics for Cellular Networks

    Institute of Scientific and Technical Information of China (English)

    Chuan-Lin Zhang; Kam Yiu Lam; Wei-Jia Jia

    2006-01-01

    Blocking and forced-termination rates are presented as parameters representing quality-of-service (QoS) requirements. The relation between the connection statistics of mobile users in a cell and the numbers of handoffs and new calls in each cell during the next period is explored. Based on this relation, statistical reservation tactics are proposed: the amount of bandwidth reserved for new calls and handoffs in each cell for the next period is determined using this strategy. The method lets the communication system adapt to the dynamics of mobile connection requests, so the QoS parameters, forced-termination rate and blocking rate, can be kept stable even as the offered load changes. Numerical experiments demonstrate that this is a practical method with affordable overhead.

  16. Organ Donation and Transplantation Statistics

    Science.gov (United States)

    There are currently 121,678 people waiting for ... Facts and statistics provided by the United States Renal Data System.

  17. Automated material accounting statistics system (AMASS)

    International Nuclear Information System (INIS)

    Messinger, M.; Lumb, R.F.; Tingey, F.H.

    1981-01-01

    In this paper the modeling and statistical analysis of measurement and process data for nuclear material accountability is readdressed under a more general framework than that provided in the literature. The result of this effort is a computer program (AMASS) which uses the algorithms and equations of this paper to accomplish the analyses indicated. The actual application of the method to process data is emphasized

  18. Statistical approach to bistable behaviour of a nonlinear system in a stationary field

    International Nuclear Information System (INIS)

    Luks, A.; Perina, J.; Perinova, V.; Bertolotti, M.; Sibilia, C.

    1984-01-01

    The quantum statistical properties of an elastic scattering process are investigated, comprising crossed light beams in interaction with a particle (electron) beam treated as a ''two-step'' system. Using the master equation and generalized Fokker-Planck equation techniques, the integrated intensities are characterized by their probability distributions, and it is demonstrated that single modes exhibit two-peak bistable behaviour. (author)

  19. Statistical data processing with automatic system for environmental radiation monitoring

    International Nuclear Information System (INIS)

    Zarkh, V.G.; Ostroglyadov, S.V.

    1986-01-01

    Practice of statistical data processing for radiation monitoring is exemplified, and some results obtained are presented. Experience in the practical application of mathematical statistics methods to radiation monitoring data processing allowed the development of a concrete statistical processing algorithm implemented on an M-6000 minicomputer. The suggested algorithm is divided into three parts: parametric data processing and hypothesis testing, pair correlation analysis, and multiple correlation analysis. The statistical processing programs operate in a dialogue mode. The above algorithm was used to process data observed over a radioactive waste disposal control region. Results of processing the surface water monitoring data are presented.

  20. Differences and discriminatory power of water polo game-related statistics in men in international championships and their relationship with the phase of the competition.

    Science.gov (United States)

    Escalante, Yolanda; Saavedra, Jose M; Tella, Victor; Mansilla, Mirella; García-Hermoso, Antonio; Domínguez, Ana M

    2013-04-01

    The aims of this study were (a) to compare water polo game-related statistics by context (winning and losing teams) and phase (preliminary, classification, and semifinal/bronze medal/gold medal), and (b) identify characteristics that discriminate performances for each phase. The game-related statistics of the 230 men's matches played in World Championships (2007, 2009, and 2011) and European Championships (2008 and 2010) were analyzed. Differences between contexts (winning or losing teams) in each phase (preliminary, classification, and semifinal/bronze medal/gold medal) were determined using the chi-squared statistic, also calculating the effect sizes of the differences. A discriminant analysis was then performed after the sample-splitting method according to context (winning and losing teams) in each of the 3 phases. It was found that the game-related statistics differentiate the winning from the losing teams in each phase of an international championship. The differentiating variables are both offensive and defensive, including action shots, sprints, goalkeeper-blocked shots, and goalkeeper-blocked action shots. However, the number of discriminatory variables decreases as the phase becomes more demanding and the teams become more equally matched. The discriminant analysis showed the game-related statistics to discriminate performance in all phases (preliminary, classificatory, and semifinal/bronze medal/gold medal phase) with high percentages (91, 90, and 73%, respectively). Again, the model selected both defensive and offensive variables.
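    The chi-squared comparison of winning and losing teams can be sketched on a 2x2 contingency table; the counts below are illustrative inventions, not the championship data.

```python
def chi_squared(table):
    """Pearson chi-squared statistic for a contingency table (list of rows)."""
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    total = sum(rows)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = rows[i] * cols[j] / total
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical counts of action shots scored (yes/no) for winners vs. losers.
table = [[120, 80],   # winning teams
         [70, 130]]   # losing teams
stat = chi_squared(table)
df = (len(table) - 1) * (len(table[0]) - 1)
print(round(stat, 2), "df =", df)  # compare against the 3.84 critical value (df=1, alpha=0.05)
```

A statistic above the critical value would mark that game action as one of the variables differentiating winners from losers in a given phase.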

  1. Level-statistics in Disordered Systems: A single parametric scaling and Connection to Brownian Ensembles

    OpenAIRE

    Shukla, Pragya

    2004-01-01

    We find that the statistics of levels undergoing metal-insulator transition in systems with multi-parametric Gaussian disorders and non-interacting electrons behaves in a way similar to that of the single parametric Brownian ensembles \\cite{dy}. The latter appear during a Poisson $\\to$ Wigner-Dyson transition, driven by a random perturbation. The analogy provides the analytical evidence for the single parameter scaling of the level-correlations in disordered systems as well as a tool to obtai...

  2. Symmetries and statistical behavior in fermion systems

    International Nuclear Information System (INIS)

    French, J.B.; Draayer, J.P.

    1978-01-01

    The interplay between statistical behavior and symmetries in nuclei, as revealed, for example, by spectra and by distributions for various kinds of excitations is considered. Methods and general results, rather than specific applications, are given. 16 references

  3. Thermodynamics and statistical physics. 2. rev. ed.

    International Nuclear Information System (INIS)

    Schnakenberg, J.

    2002-01-01

    This textbook covers the following topics: thermodynamic systems and equilibrium, irreversible thermodynamics, thermodynamic potentials, stability, thermodynamic processes, ideal systems, real gases and phase transformations, magnetic systems and the Landau model, low-temperature thermodynamics, canonical ensembles, statistical theory, quantum statistics, fermions and bosons, kinetic theory, Bose-Einstein condensation, and the photon gas.

  4. Two decades of change in transportation reflections from transportation statistics annual reports 1994–2014.

    Science.gov (United States)

    2015-01-01

    The Bureau of Transportation Statistics (BTS) provides information to support understanding and decision-making related to the transportation system, including the size and extent of the system, how it is used, how well it works, and its contribution...

  5. Computing physical properties with quantum Monte Carlo methods with statistical fluctuations independent of system size.

    Science.gov (United States)

    Assaraf, Roland

    2014-12-01

    We show that the recently proposed correlated sampling without reweighting procedure extends the locality (asymptotic independence of the system size) of a physical property to the statistical fluctuations of its estimator. This makes the approach potentially vastly more efficient for computing space-localized properties in large systems compared with standard correlated methods. A proof is given for a large collection of noninteracting fragments. Calculations on hydrogen chains suggest that this behavior holds not only for systems displaying short-range correlations, but also for systems with long-range correlations.

  6. EDI Performance Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — This section contains statistical information and reports related to the percentage of electronic transactions being sent to Medicare contractors in the formats...

  7. Decision Support Systems: Applications in Statistics and Hypothesis Testing.

    Science.gov (United States)

    Olsen, Christopher R.; Bozeman, William C.

    1988-01-01

    Discussion of the selection of appropriate statistical procedures by educators highlights a study conducted to investigate the effectiveness of decision aids in facilitating the use of appropriate statistics. Experimental groups and a control group using a printed flow chart, a computer-based decision aid, and a standard text are described. (11…

  8. Symmetries and statistical behavior in fermion systems

    Energy Technology Data Exchange (ETDEWEB)

    French, J.B.; Draayer, J.P.

    1978-01-01

    The interplay between statistical behavior and symmetries in nuclei, as revealed, for example, by spectra and by distributions for various kinds of excitations is considered. Methods and general results, rather than specific applications, are given. 16 references. (JFP)

  9. Fluctuation-Response Relation and modeling in systems with fast and slow dynamics

    Directory of Open Access Journals (Sweden)

    G. Lacorata

    2007-10-01

    Full Text Available We show how a general formulation of the Fluctuation-Response Relation is able to describe in detail the connection between response properties to external perturbations and spontaneous fluctuations in systems with fast and slow variables. The method is tested using the 360-variable Lorenz-96 model, in which slow and fast variables are coupled to one another with reciprocal feedback, and a simplified low-dimensional system. In the Fluctuation-Response context, the influence of the fast dynamics on the slow dynamics lies in the nontrivial behavior of a suitable quadratic response function. This has important consequences for the modeling of the slow dynamics in terms of a Langevin equation: beyond a certain intrinsic time interval, even the optimal model can give only statistical predictions.

  10. Statistical methods of estimating mining costs

    Science.gov (United States)

    Long, K.R.

    2011-01-01

    Until it was defunded in 1995, the U.S. Bureau of Mines maintained a Cost Estimating System (CES) for prefeasibility-type economic evaluations of mineral deposits and estimating costs at producing and non-producing mines. This system had a significant role in mineral resource assessments to estimate costs of developing and operating known mineral deposits and predicted undiscovered deposits. For legal reasons, the U.S. Geological Survey cannot update and maintain CES. Instead, statistical tools are under development to estimate mining costs from basic properties of mineral deposits such as tonnage, grade, mineralogy, depth, strip ratio, distance from infrastructure, rock strength, and work index. The first step was to reestimate "Taylor's Rule" which relates operating rate to available ore tonnage. The second step was to estimate statistical models of capital and operating costs for open pit porphyry copper mines with flotation concentrators. For a sample of 27 proposed porphyry copper projects, capital costs can be estimated from three variables: mineral processing rate, strip ratio, and distance from nearest railroad before mine construction began. Of all the variables tested, operating costs were found to be significantly correlated only with strip ratio.
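    Taylor's Rule is conventionally expressed as a power law, capacity ∝ tonnage^b with b near 3/4, so reestimating it amounts to a least-squares fit in log space. A sketch on synthetic data follows; the coefficient 0.014 and exponent 0.75 are illustrative choices, not the study's estimates.

```python
import math

def fit_power_law(tonnage, capacity):
    """Fit capacity = a * tonnage**b by ordinary least squares on logs."""
    xs = [math.log(t) for t in tonnage]
    ys = [math.log(c) for c in capacity]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = math.exp(my - b * mx)
    return a, b

# Synthetic deposits following capacity = 0.014 * tonnage**0.75 exactly
# (a Taylor-like 3/4 exponent, used here only for illustration).
tons = [1e6, 5e6, 2e7, 1e8]
caps = [0.014 * t ** 0.75 for t in tons]
a, b = fit_power_law(tons, caps)
print(round(b, 3))  # recovers the exponent 0.75
```

With real mine data the residual scatter around the fitted line, rather than an exact recovery, is what the capital- and operating-cost models would have to absorb.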

  11. Beginning R The Statistical Programming Language

    CERN Document Server

    Gardener, Mark

    2012-01-01

    Conquer the complexities of this open source statistical language R is fast becoming the de facto standard for statistical computing and analysis in science, business, engineering, and related fields. This book examines this complex language using simple statistical examples, showing how R operates in a user-friendly context. Both students and workers in fields that require extensive statistical analysis will find this book helpful as they learn to use R for simple summary statistics, hypothesis testing, creating graphs, regression, and much more. It covers formula notation, complex statistics

  12. Statistical Relations for Yield Degradation in Inertial Confinement Fusion

    Science.gov (United States)

    Woo, K. M.; Betti, R.; Patel, D.; Gopalaswamy, V.

    2017-10-01

    In inertial confinement fusion (ICF), the yield-over-clean (YOC) is a quantity commonly used to assess the performance of an implosion with respect to the degradation caused by asymmetries. The YOC also determines the Lawson parameter used to identify the onset of ignition and the level of alpha heating in ICF implosions. In this work, we show that the YOC is a unique function of the residual kinetic energy in the compressed shell (with respect to the 1-D case) regardless of the asymmetry spectrum. This result is derived using a simple model of the deceleration phase as well as through an extensive set of 3-D radiation-hydrodynamics simulations using the code DEC3D. The latter has been recently upgraded to include a 3-D spherical moving mesh, the HYPRE solver for 3-D radiation transport and piecewise-parabolic method for robust shock-capturing hydrodynamic simulations. DEC3D is used to build a synthetic single-mode database to study the behavior of yield degradation caused by Rayleigh-Taylor instabilities in the deceleration phase. The relation between YOC and residual kinetic energy is compared with the result in an adiabatic implosion model. The statistical expression of YOC is also applied to the ignition criterion in the presence of multidimensional nonuniformities. This material is based upon work supported by the Department of Energy National Nuclear Security Administration under Award Number DE-NA0001944.

  13. Reproducible statistical analysis with multiple languages

    DEFF Research Database (Denmark)

    Lenth, Russell; Højsgaard, Søren

    2011-01-01

    This paper describes the system for making reproducible statistical analyses. It differs from other systems for reproducible analysis in several ways. The two main differences are: (1) several statistics programs can be used in the same document; (2) documents can be prepared using OpenOffice or \LaTeX. The main part of this paper is an example showing how to use and together in an OpenOffice text document. The paper also contains some practical considerations on the use of literate programming in statistics.

  14. Quantum mechanics as applied mathematical statistics

    International Nuclear Information System (INIS)

    Skala, L.; Cizek, J.; Kapsa, V.

    2011-01-01

    Basic mathematical apparatus of quantum mechanics like the wave function, probability density, probability density current, coordinate and momentum operators, corresponding commutation relation, Schroedinger equation, kinetic energy, uncertainty relations and continuity equation is discussed from the point of view of mathematical statistics. It is shown that the basic structure of quantum mechanics can be understood as generalization of classical mechanics in which the statistical character of results of measurement of the coordinate and momentum is taken into account and the most important general properties of statistical theories are correctly respected.

  15. Wind energy statistics 2012; Vindkraftsstatistik 2012

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-04-15

    The publication 'Wind Energy Statistics' is an annual publication. Since 2010, statistics on installed capacity, number of plants, and regional distribution have also been reported semi-annually, in tabular form, on the Agency's website. The publication is produced in a new way this year, which causes some data to differ from previous publications. Thanks to the electricity certificate system, this publication presents essentially complete statistics on wind energy, broken down in several ways. The regional distribution is presented, i.e., how the number of turbines and the installed capacity are allocated to counties and municipalities. Electricity production by county is also reported where confidentiality allows. Wind power is becoming increasingly important in the Swedish energy system, which creates increased demand for statistics and for other breakdowns than those presented in the official statistics. This publication, which is not official statistics, has therefore been developed.

  16. Two statistical mechanics aspects of complex networks

    Science.gov (United States)

    Thurner, Stefan; Biely, Christoly

    2006-12-01

    By adopting an ensemble interpretation of non-growing rewiring networks, network theory can be reduced to a counting problem of possible network states and an identification of their associated probabilities. We present two scenarios of how different rewiring schemes can be used to control the state probabilities of the system. In particular, we review how, by generalizing the linking rules of random graphs in combination with superstatistics and quantum mechanical concepts, one can establish an exact relation between the degree distribution of any given network and the nodes’ linking probability distributions. In a second approach, we control state probabilities by a network Hamiltonian whose characteristics are motivated by biological and socio-economical statistical systems. We demonstrate that a thermodynamics of networks becomes a fully consistent concept, allowing one to study, e.g., ‘phase transitions’ and to compute entropies through thermodynamic relations.

  17. "The Two Brothers": Reconciling Perceptual-Cognitive and Statistical Models of Musical Evolution.

    Science.gov (United States)

    Jan, Steven

    2018-01-01

    While the "units, events and dynamics" of memetic evolution have been abstractly theorized (Lynch, 1998), they have not been applied systematically to real corpora in music. Some researchers, convinced of the validity of cultural evolution in more than the metaphorical sense adopted by much musicology, but perhaps skeptical of some or all of the claims of memetics, have attempted statistically based corpus-analysis techniques of music drawn from molecular biology, and these have offered strong evidence in favor of system-level change over time (Savage, 2017). This article argues that such statistical approaches, while illuminating, ignore the psychological realities of music-information grouping, the transmission of such groups with varying degrees of fidelity, their selection according to relative perceptual-cognitive salience, and the power of this Darwinian process to drive the systemic changes (such as the development over time of systems of tonal organization in music) that statistical methodologies measure. It asserts that a synthesis between such statistical approaches to the study of music-cultural change and the theory of memetics as applied to music (Jan, 2007), in particular the latter's perceptual-cognitive elements, would harness the strengths of each approach and deepen understanding of cultural evolution in music.

  18. Gyrokinetic Statistical Absolute Equilibrium and Turbulence

    International Nuclear Information System (INIS)

    Zhu, Jian-Zhou; Hammett, Gregory W.

    2011-01-01

    A paradigm based on the absolute equilibrium of Galerkin-truncated inviscid systems to aid in understanding turbulence (T.-D. Lee, 'On some statistical properties of hydrodynamical and magnetohydrodynamical fields,' Q. Appl. Math. 10, 69 (1952)) is taken to study gyrokinetic plasma turbulence: A finite set of Fourier modes of the collisionless gyrokinetic equations are kept and the statistical equilibria are calculated; possible implications for plasma turbulence in various situations are discussed. For the case of two spatial and one velocity dimension, in the calculation with discretization also of velocity v with N grid points (where N + 1 quantities are conserved, corresponding to an energy invariant and N entropy-related invariants), the negative temperature states, corresponding to the condensation of the generalized energy into the lowest modes, are found. This indicates a generic feature of inverse energy cascade. Comparisons are made with some classical results, such as those of Charney-Hasegawa-Mima in the cold-ion limit. There is a universal shape for statistical equilibrium of gyrokinetics in three spatial and two velocity dimensions with just one conserved quantity. Possible physical relevance to turbulence, such as ITG zonal flows, and to a critical balance hypothesis are also discussed.

  19. Assessment of Literature Related to Combustion Appliance Venting Systems

    Energy Technology Data Exchange (ETDEWEB)

    Rapp, Vi H. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Singer, Brett C. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Stratton, Chris [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wray, Craig P. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-06-01

    In many residential building retrofit programs, air tightening to increase energy efficiency is constrained by concerns about related impacts on the safety of naturally vented combustion appliances. Tighter housing units more readily depressurize when exhaust equipment is operated, making combustion appliances more prone to backdraft or spillage. Several test methods purportedly assess the potential for depressurization-induced backdrafting and spillage, but these tests are not robustly reliable and repeatable predictors of venting performance, in part because they do not fully capture weather effects on venting performance. The purpose of this literature review is to investigate combustion safety diagnostics in existing codes, standards, and guidelines related to combustion appliances. This review summarizes existing combustion safety test methods and evaluations of these test methods, and also discusses research related to wind effects and the simulation of vent system performance. Current codes and standards related to combustion appliance installation provide little information on assessing backdrafting or spillage potential. A substantial amount of research has been conducted to assess combustion appliance backdrafting and spillage test methods, but it primarily focuses on comparing short-term (stress) induced tests with monitoring results. Monitoring, typically performed over one week, indicated that combinations of environmental and house-operation characteristics most conducive to combustion spillage were rare. Research has, to an extent, assessed existing combustion safety diagnostics for house depressurization, but the objectives of the diagnostics, both stress and monitoring, are not clearly defined. More research is also needed to quantify the frequency of test “failure” occurrence throughout the building stock and to assess the statistical effects of weather (especially wind) on house depressurization and, in turn, on combustion appliance venting.

  20. Students’ perception of frequent assessments and its relation to motivation and grades in a statistics course: a pilot study

    NARCIS (Netherlands)

    Vaessen, B.E.; van den Beemt, A.A.J.; van de Watering, G.A.; van Meeuwen, L.W.; Lemmens, A.M.C.; den Brok, P.J.

    2017-01-01

    This pilot study measures university students’ perceptions of graded frequent assessments in an obligatory statistics course using a novel questionnaire. Relations between perceptions of frequent assessments, intrinsic motivation and grades were also investigated. A factor analysis of the

  1. Statistical mechanics of program systems

    International Nuclear Information System (INIS)

    Neirotti, Juan P; Caticha, Nestor

    2006-01-01

    We discuss the collective behaviour of a set of operators and variables that constitute a program and the emergence of meaningful computational properties in the language of statistical mechanics. This is done by appropriately modifying available Monte Carlo methods to deal with hierarchical structures. The study suggests, in analogy with simulated annealing, a method to automatically design programs. Reasonable solutions can be found, at low temperatures, when the method is applied to simple toy problems such as finding an algorithm that determines the roots of a function or one that makes a nonlinear regression. Peaks in the specific heat are interpreted as signalling phase transitions which separate regions where different algorithmic strategies are used to solve the problem
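    The simulated-annealing analogy can be illustrated with a generic annealing loop. Here it is applied to the abstract's toy example of finding a root of a function, recast as minimizing the squared residual of y² = 2; the cooling schedule and step parameters are arbitrary choices, not the authors' settings.

```python
import math, random

def simulated_anneal(cost, x0, step=0.5, t0=1.0, cooling=0.995, iters=5000, seed=1):
    """Generic simulated annealing: accept a worse move with prob exp(-dE/T)."""
    rng = random.Random(seed)
    x, e = x0, cost(x0)
    best_x, best_e = x, e
    t = t0
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)
        ce = cost(cand)
        # Always accept improvements; accept worsenings with Boltzmann probability.
        if ce < e or rng.random() < math.exp(-(ce - e) / t):
            x, e = cand, ce
            if e < best_e:
                best_x, best_e = x, e
        t *= cooling  # geometric cooling schedule
    return best_x, best_e

# Toy objective: find a root of f(y) = y**2 - 2 by minimizing f(y)**2.
root, err = simulated_anneal(lambda y: (y * y - 2) ** 2, x0=0.0)
print(f"root ~ {root:.3f}")  # the minima sit at +/- sqrt(2), about +/- 1.414
```

In the paper's setting, the candidate move would mutate a program tree rather than perturb a real number, and the specific-heat peaks mark where the accepted-program ensemble reorganizes.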

  2. Safety-related control air systems - approved 1977

    International Nuclear Information System (INIS)

    Anon.

    1978-01-01

    This standard applies to those portions of the control air system that furnish air required to support, control, or operate systems or portions of systems that are safety related in nuclear power plants. This standard relates only to the air supply system(s) for safety-related air operated devices and does not apply to the safety-related air operated device or to air operated actuators for such devices. The objectives of this standard are to provide (1) minimum system design requirements for equipment, piping, instruments, controls, and wiring that constitute the air supply system; and (2) the system and component testing and maintenance requirements

  3. Optimum design of automobile seat using statistical design support system; Tokeiteki sekkei shien system no jidoshayo seat eno tekiyo

    Energy Technology Data Exchange (ETDEWEB)

    Kashiwamura, T [NHK Spring Co. Ltd., Yokohama (Japan); Shiratori, M; Yu, Q; Koda, I [Yokohama National University, Yokohama (Japan)

    1997-10-01

    The authors proposed a new practical optimum design method called the statistical design support system, which consists of five steps: effectivity analysis, reanalysis, evaluation of dispersion, optimization, and evaluation of structural reliability. In this study, the authors applied the system to the analysis and optimum design of an automobile seat frame subjected to crushing. The study showed that the present method can be applied to complex nonlinear problems involving large deformation and material nonlinearity, as well as impact problems. It was shown that the optimum design of the seat frame is solved easily using the present system. 6 refs., 5 figs., 5 tabs.

  4. Two viewpoints for software failures and their relation in probabilistic safety assessment of digital instrumentation and control systems

    International Nuclear Information System (INIS)

    Kim, Man Cheol

    2015-01-01

    As the use of digital systems in nuclear power plants increases, the reliability of the software becomes one of the important issues in probabilistic safety assessment. In this paper, two viewpoints for a software failure during the operation of a digital system or a statistical software test are identified, and the relation between them is provided. In conventional software reliability analysis, a failure is mainly viewed with respect to the system operation. A new viewpoint with respect to the system input is suggested. The failure probability density functions for the two viewpoints are defined, and the relation between the two failure probability density functions is derived. Each failure probability density function can be derived from the other failure probability density function by applying the derived relation between the two failure probability density functions. The usefulness of the derived relation is demonstrated by applying it to the failure data obtained from the software testing of a real system. The two viewpoints and their relation, as identified in this paper, are expected to help us extend our understanding of the reliability of safety-critical software. (author)

  5. Statistics for Finance

    DEFF Research Database (Denmark)

    Lindström, Erik; Madsen, Henrik; Nielsen, Jan Nygaard

    Statistics for Finance develops students’ professional skills in statistics with applications in finance. Developed from the authors’ courses at the Technical University of Denmark and Lund University, the text bridges the gap between classical, rigorous treatments of financial mathematics...... that rarely connect concepts to data and books on econometrics and time series analysis that do not cover specific problems related to option valuation. The book discusses applications of financial derivatives pertaining to risk assessment and elimination. The authors cover various statistical...... and mathematical techniques, including linear and nonlinear time series analysis, stochastic calculus models, stochastic differential equations, Itō’s formula, the Black–Scholes model, the generalized method-of-moments, and the Kalman filter. They explain how these tools are used to price financial derivatives...
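    Among the tools the book covers is the Black–Scholes model; the standard closed-form price of a European call can be sketched in a few lines (the parameter values below are just a textbook-style example).

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def black_scholes_call(s, k, t, r, sigma):
    """Black-Scholes price of a European call.

    s: spot price, k: strike, t: time to maturity in years,
    r: risk-free rate, sigma: volatility (both annualized).
    """
    d1 = (math.log(s / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    return s * norm_cdf(d1) - k * math.exp(-r * t) * norm_cdf(d2)

# At-the-money one-year call: S = K = 100, r = 5%, sigma = 20%.
print(round(black_scholes_call(100, 100, 1.0, 0.05, 0.20), 2))  # about 10.45
```

The corresponding put follows from put-call parity, P = C - S + K·exp(-rT), without a second formula.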

  6. Statistical analysis and dimensioning of a wind farm energy storage system

    Directory of Open Access Journals (Sweden)

    Waśkowicz Bartosz

    2017-06-01

    Full Text Available The growth in renewable power generation and stricter local regulations regarding power quality indices will make it necessary to use energy storage systems with renewable power plants in the near future. The capacity of storage systems can be determined using different methods, most of which can be classified as either deterministic or stochastic. Deterministic methods are often complicated, with numerous parameters and complex models for long-term prediction, often incorporating meteorological data. Stochastic methods use statistics for ESS (Energy Storage System) sizing, which is somewhat intuitive for dealing with the random element of wind speed variation. The method proposed in this paper stabilizes output power over one-minute intervals to reduce the negative influence of the wind farm on the power grid, in order to meet local regulations. The paper shows the process of sizing the ESS for two selected wind farms, based on their levels of variation in generated power, and also shows, for each, how the negative influences on the power grid, in the form of voltage variation and the short-term flicker factor, are decreased.
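    A minimal version of the sizing idea can be sketched: let the storage absorb the difference between the raw one-minute output and a target (stabilized) profile, and size the energy capacity from the peak-to-trough swing of the accumulated difference. The sample data and the flat target are illustrative assumptions, not the two farms' measurements.

```python
def size_storage(power_kw, target_kw, dt_h=1 / 60):
    """Energy capacity (kWh) needed for storage to absorb/supply the
    difference between raw output and a target profile at dt_h-hour steps."""
    soc = 0.0            # running relative state of charge (kWh)
    lo, hi = 0.0, 0.0
    for p, tgt in zip(power_kw, target_kw):
        soc += (p - tgt) * dt_h   # surplus charges, deficit discharges
        lo, hi = min(lo, soc), max(hi, soc)
    return hi - lo                # span of the swing = required capacity

# One-minute wind power samples (kW) against a flat 1000 kW target (illustrative).
raw = [1200, 900, 1100, 800, 1000, 1300, 700, 1000]
need = size_storage(raw, [1000] * len(raw))
print(round(need, 2))  # peak-to-trough energy swing, in kWh
```

A stochastic refinement would run this over many sampled wind sequences and take a chosen percentile of the resulting capacities rather than a single worst case.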

  7. Transport statistics 1996

    CSIR Research Space (South Africa)

    Shepperson, L

    1997-12-01

    Full Text Available This publication contains transport and related statistics on roads, vehicles, infrastructure, passengers, freight, rail, air, maritime and road traffic, and international comparisons. The information compiled in this publication has been gathered...

  8. Frontiers in statistical quality control

    CERN Document Server

    Wilrich, Peter-Theodor

    2004-01-01

    This volume treats the four main categories of Statistical Quality Control: General SQC Methodology, On-line Control including Sampling Inspection and Statistical Process Control, Off-line Control with Data Analysis and Experimental Design, and, fields related to Reliability. Experts with international reputation present their newest contributions.

  9. On quantum statistical inference

    NARCIS (Netherlands)

    Barndorff-Nielsen, O.E.; Gill, R.D.; Jupp, P.E.

    2003-01-01

    Interest in problems of statistical inference connected to measurements of quantum systems has recently increased substantially, in step with dramatic new developments in experimental techniques for studying small quantum systems. Furthermore, developments in the theory of quantum measurements have

  10. Implementation of International Standards in Russia's Foreign Trade Statistics

    Directory of Open Access Journals (Sweden)

    Natalia E. Grigoruk

    2015-01-01

    Full Text Available The article analyzes the basic documents of international organizations in recent years, which have become the global standard for the development and improvement of statistics on the foreign economic relations of most countries, including the Russian Federation. The article describes the key features of the theory and practice of modern foreign trade statistics in Russia and abroad, with an emphasis on the methodological problems of its main part - external trade statistics. It shows their interpretation in the most recent recommendations of the UN statistical apparatus and other international organizations; it considers a range of problems associated with implementing in the national statistical practice of countries, including Russia and the countries of the Customs Union, the main international standard of foreign trade statistics - the UN document "International Merchandise Trade Statistics". The main attention is paid to methodological issues such as: the criteria for selecting the objects of statistical accounting in accordance with international standards, quantitative and cost parameters of foreign trade statistics, statistical methods and estimates of commodity exports and imports, and the problems of comparability of data; and to a comparison of the 2010 international standards with key precursor documents on the methodology of foreign trade statistics, characterizing the practice of introducing these standards into the foreign trade statistics of Russia and the countries of the Customs Union. The article analyzes the content of the official statistical manuals on foreign trade of Russia and foreign countries, covers the main methodological problems of world trade statistics in conjunction with the major current international statistical standards - the System of National Accounts, the Manual on Statistics of International Trade in Services, and other documents; and provides specific data describing the current structure of Russian foreign trade and especially its

  11. Statistical mechanics of lattice systems a concrete mathematical introduction

    CERN Document Server

    Friedli, Sacha

    2017-01-01

    This motivating textbook gives a friendly, rigorous introduction to fundamental concepts in equilibrium statistical mechanics, covering a selection of specific models, including the Curie–Weiss and Ising models, the Gaussian free field, O(n) models, and models with Kac interactions. Using classical concepts such as Gibbs measures, pressure, free energy, and entropy, the book exposes the main features of the classical description of large systems in equilibrium, in particular the central problem of phase transitions. It treats such important topics as the Peierls argument, the Dobrushin uniqueness theorem, and the Mermin–Wagner and Lee–Yang theorems, and develops from scratch such workhorses as correlation inequalities, the cluster expansion, Pirogov–Sinai theory, and reflection positivity. Written as a self-contained course for advanced undergraduate or beginning graduate students, the detailed explanations, large collection of exercises (with solutions), and appendix of mathematical results and concepts also make i...

  12. Nature and statistical properties of quasar associated absorption systems in the XQ-100 Legacy Survey

    DEFF Research Database (Denmark)

    Perrotta, Serena; D'Odorico, Valentina; Prochaska, J. Xavier

    2016-01-01

    We statistically study the physical properties of a sample of narrow absorption line (NAL) systems, looking for empirical evidence to distinguish between intrinsic and intervening NALs without assuming any a priori definition or velocity cut-off. We analyze the spectra of 100 quasars...

  13. Statistical validation of earthquake related observations

    Science.gov (United States)

    Kossobokov, V. G.

    2011-12-01

    The confirmed fractal nature of earthquakes and their distribution in space and time implies that many traditional estimations of seismic hazard (from term-less to short-term ones) are usually based on erroneous assumptions of easily tractable or, conversely, delicately designed models. The widespread practice of deceptive modeling, considered a "reasonable proxy" of the natural seismic process, leads to seismic hazard assessment of unknown quality, whose errors propagate non-linearly into the resulting estimates of risk and, eventually, into unexpected societal losses of unacceptable level. Studies aimed at the forecast/prediction of earthquakes must include validation in retrospective (at least) and, eventually, in prospective tests. In the absence of such control a suggested "precursor/signal" remains a "candidate", whose link to the target seismic event is a model assumption. Predicting in advance is the only decisive test of forecasts/predictions, and therefore the score-card of any "established precursor/signal", represented by the empirical probabilities of alarms and failures-to-predict achieved in prospective testing, must prove statistical significance, rejecting the null hypothesis of random coincidental occurrence in advance of target earthquakes. We reiterate the suggestion of the so-called "Seismic Roulette" null hypothesis as the most adequate undisturbed random alternative accounting for the empirical spatial distribution of earthquakes: (i) consider a roulette wheel with as many sectors as the number of earthquake locations from a sample catalog representing the seismic locus, one sector per location; (ii) make your bet according to the prediction (i.e., determine which locations are inside the area of alarm, and put one chip in each of the corresponding sectors); (iii) Nature turns the wheel; (iv) accumulate statistics of wins and losses along with the number of chips spent. If a precursor in charge of prediction exposes an imperfection of Seismic Roulette then, having in mind
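
    The "Seismic Roulette" procedure in steps (i)–(iv) above amounts to a Monte Carlo significance test, which can be sketched as follows. The catalog size, alarm set, and event counts here are hypothetical illustrations, not data from the study:

```python
import random

def seismic_roulette_p_value(n_locations, alarm_locations, observed_hits,
                             n_events, n_trials=10000, seed=0):
    """Estimate the chance of >= observed_hits under the null hypothesis
    that each target earthquake falls on one of n_locations catalog sites
    uniformly at random; a 'hit' is an event inside the alarm set."""
    rng = random.Random(seed)
    alarm = set(alarm_locations)
    exceed = 0
    for _ in range(n_trials):
        hits = sum(1 for _ in range(n_events)
                   if rng.randrange(n_locations) in alarm)
        if hits >= observed_hits:
            exceed += 1
    return exceed / n_trials

# Hypothetical score-card: the alarm covers 10 of 100 sites, yet 8 of
# 10 target events fall inside it.
p = seismic_roulette_p_value(100, range(10), observed_hits=8, n_events=10)
print(p)  # a small p rejects random coincidental occurrence
```

    A genuinely predictive alarm yields a small estimated p-value, while a chance-level score-card (e.g., 1 hit out of 10) does not reject the roulette null hypothesis.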

  14. The estimation of differential counting measurements of positive quantities with relatively large statistical errors

    International Nuclear Information System (INIS)

    Vincent, C.H.

    1982-01-01

    Bayes' principle is applied to the differential counting measurement of a positive quantity in which the statistical errors are not necessarily small in relation to the true value of the quantity. The methods of estimation derived are found to give consistent results and to avoid the anomalous negative estimates sometimes obtained by conventional methods. One of the methods given provides a simple means of deriving the required estimates from conventionally presented results and appears to have wide potential applications. Both methods provide the actual posterior probability distribution of the quantity to be measured. A particularly important potential application is the correction of counts on low radioactivity samples for background. (orig.)
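
    A minimal numerical sketch of the general idea (not the paper's exact estimator): gross counts are modeled as Poisson with mean s + b, the background is measured separately, flat priors are placed on s >= 0 and b >= 0, and the posterior for the positive quantity s never goes negative even when the raw difference is noisy:

```python
import math

def positive_rate_posterior(gross, background, s_grid):
    """Normalized posterior over s_grid for a positive source strength s,
    given gross ~ Poisson(s + b) and background ~ Poisson(b), with flat
    priors; b is marginalized numerically on a coarse grid.
    Illustrative sketch only."""
    b_grid = [0.1 * k for k in range(1, 400)]
    post = []
    for s in s_grid:
        total = 0.0
        for b in b_grid:
            # Poisson likelihoods up to s-independent constants.
            total += (math.exp(-(s + b)) * (s + b) ** gross
                      * math.exp(-b) * b ** background)
        post.append(total)
    norm = sum(post)
    return [p / norm for p in post]

s_grid = [0.25 * k for k in range(0, 120)]
post = positive_rate_posterior(gross=12, background=9, s_grid=s_grid)
mean_s = sum(s * p for s, p in zip(s_grid, post))
print(mean_s)  # the posterior mean is positive even though 12 - 9 is noisy
```

    A conventional point estimate (gross minus background) can come out negative for weak samples; the posterior above concentrates on s >= 0 by construction.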

  15. Application of Statistical Increase in Industrial Quality

    International Nuclear Information System (INIS)

    Akhmad-Fauzy

    2000-01-01

    Application of statistical methods in the industrial field is relatively new compared with agriculture and biology. Statistical methods applied in the industrial field focus mainly on industrial system control and are useful for maintaining economical control of the quality of products produced on a big scale. Application of statistical methods in the industrial field has increased rapidly. This fact is supported by the release of the ISO 9000 quality system in 1987 as an international quality standard, which has been adopted by more than 100 countries. (author)

  16. Machine learning Z2 quantum spin liquids with quasiparticle statistics

    Science.gov (United States)

    Zhang, Yi; Melko, Roger G.; Kim, Eun-Ah

    2017-12-01

    After decades of progress and effort, obtaining a phase diagram for a strongly correlated topological system still remains a challenge. Although in principle one could turn to Wilson loops and long-range entanglement, evaluating these nonlocal observables at many points in phase space can be prohibitively costly. With growing excitement over topological quantum computation comes the need for an efficient approach for obtaining topological phase diagrams. Here we turn to machine learning using quantum loop topography (QLT), a notion we have recently introduced. Specifically, we propose a construction of QLT that is sensitive to quasiparticle statistics. We then use mutual statistics between the spinons and visons to detect a Z2 quantum spin liquid in a multiparameter phase space. We successfully obtain the quantum phase boundary between the topological and trivial phases using a simple feed-forward neural network. Furthermore, we demonstrate advantages of our approach for the evaluation of phase diagrams relating to speed and storage. Such statistics-based machine learning of topological phases opens new efficient routes to studying topological phase diagrams in strongly correlated systems.

  17. Generalized statistical criterion for distinguishing random optical groupings from physical multiple systems

    International Nuclear Information System (INIS)

    Anosova, Z.P.

    1988-01-01

    A statistical criterion is proposed for distinguishing between random and physical groupings of stars and galaxies. The criterion is applied to nearby wide multiple stars, triplets of galaxies in the list of Karachentsev, Karachentseva, and Shcherbanovskii, and double galaxies in the list of Dahari, in which the principal components are Seyfert galaxies. Systems that are almost certainly physical, probably physical, probably optical, and almost certainly optical are identified. The limiting difference between the radial velocities of the components of physical multiple galaxies is estimated

  18. Assessment of Field Experience Related to Pressurized Water Reactor Primary System Leaks

    International Nuclear Information System (INIS)

    Ware, A.G.; Hsu, C.; Atwood, C.L.; Sattison, M.B.; Hartley, R.S.; Shah, V.N.

    1999-01-01

    This paper presents our assessment of field experience related to pressurized water reactor (PWR) primary system leaks in terms of their number and rates, how aging affects the frequency of leak events, the safety significance of such leaks, industry efforts to reduce leaks, and the effectiveness of current leak detection systems. We have reviewed the licensee event reports to identify the events that took place during 1985 to the third quarter of 1996, and reviewed related technical literature and visited PWR plants to analyze these events. Our assessment shows that USNRC licensees have taken effective actions to reduce the number of leak events. One main reason for this decreasing trend was the elimination of reportable leakages from valve stem packing after 1991. Our review of leak events related to vibratory fatigue reveals a statistically significant decreasing trend with age (years of operation), but not in calendar time. Our assessment of worldwide data on leakage caused by thermal fatigue cracking is that the fatigue of aging piping is a safety-significant issue. Our review of leak events has identified several susceptible sites in piping having high safety significance, but the inspection of some of these sites is not required by the ASME Code. These sites may be included in the risk-informed inspection programs

  19. Assessment of Field Experience Related to Pressurized Water Reactor Primary System Leaks

    International Nuclear Information System (INIS)

    Shah, Vikram Naginbhai; Ware, Arthur Gates; Atwood, Corwin Lee; Sattison, Martin Blaine; Hartley, Robert Scott; Hsu, C.

    1999-01-01

    This paper presents our assessment of field experience related to pressurized water reactor (PWR) primary system leaks in terms of their number and rates, how aging affects the frequency of leak events, the safety significance of such leaks, industry efforts to reduce leaks, and the effectiveness of current leak detection systems. We have reviewed the licensee event reports to identify the events that took place during 1985 to the third quarter of 1996, and reviewed related technical literature and visited PWR plants to analyze these events. Our assessment shows that USNRC licensees have taken effective actions to reduce the number of leak events. One main reason for this decreasing trend was the elimination of reportable leakages from valve stem packing after 1991. Our review of leak events related to vibratory fatigue reveals a statistically significant decreasing trend with age (years of operation), but not in calendar time. Our assessment of worldwide data on leakage caused by thermal fatigue cracking is that the fatigue of aging piping is a safety-significant issue. Our review of leak events has identified several susceptible sites in piping having high safety significance, but the inspection of some of these sites is not required by the ASME Code. These sites may be included in the risk-informed inspection programs

  20. Augmented Automated Material Accounting Statistics System (AMASS)

    International Nuclear Information System (INIS)

    Lumb, R.F.; Messinger, M.; Tingey, F.H.

    1983-01-01

    This paper describes an extension of the AMASS methodology which was previously presented at the 1981 INMM annual meeting. The main thrust of the current effort is to develop procedures and a computer program for estimating the variance of an Inventory Difference when many sources of variability, other than measurement error, are admitted in the model. Procedures also are included for the estimation of the variances associated with measurement error estimates and their effect on the estimated limit of error of the inventory difference (LEID). The algorithm for the LEID measurement component uncertainty involves the propagated component measurement variance estimates as well as their associated degrees of freedom. The methodology and supporting computer software is referred to as the augmented Automated Material Accounting Statistics System (AMASS). Specifically, AMASS accommodates five source effects. These are: (1) measurement errors (2) known but unmeasured effects (3) measurement adjustment effects (4) unmeasured process hold-up effects (5) residual process variation A major result of this effort is a procedure for determining the effect of bias correction on LEID, properly taking into account all the covariances that exist. This paper briefly describes the basic models that are assumed; some of the estimation procedures consistent with the model; data requirements, emphasizing availability and other practical considerations; discusses implications for bias corrections; and concludes by briefly describing the supporting computer program
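
    As a sketch of the variance propagation such a methodology builds on: the limit of error of an inventory difference (LEID) combines independent variance components by summation before taking the square root. The component values and the 95% factor below are illustrative assumptions; the actual AMASS methodology additionally handles covariances, bias corrections, and degrees of freedom:

```python
import math

def inventory_difference_leid(variance_components):
    """Limit of error of an inventory difference from independent
    variance components (simple sketch: no covariances). The 1.96
    factor gives an approximate 95% limit."""
    total_var = sum(variance_components.values())
    return 1.96 * math.sqrt(total_var)

components = {                      # illustrative values, kg**2
    "measurement_error": 0.40,
    "unmeasured_effects": 0.10,
    "adjustment_effects": 0.05,
    "holdup": 0.15,
    "residual_process": 0.30,
}
leid = inventory_difference_leid(components)
print(leid)  # 1.96 kg at the 95% level for these inputs
```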

  1. Ragu: a free tool for the analysis of EEG and MEG event-related scalp field data using global randomization statistics.

    Science.gov (United States)

    Koenig, Thomas; Kottlow, Mara; Stein, Maria; Melie-García, Lester

    2011-01-01

    We present a program (Ragu; Randomization Graphical User interface) for statistical analyses of multichannel event-related EEG and MEG experiments. Based on measures of scalp field differences including all sensors, and using powerful, assumption-free randomization statistics, the program yields robust, physiologically meaningful conclusions based on the entire, untransformed, and unbiased set of measurements. Ragu accommodates up to two within-subject factors and one between-subject factor with multiple levels each. Significance is computed as a function of time and can be controlled for multiple testing with overall analyses. Results are displayed in an intuitive visual interface that allows further exploration of the findings. A sample analysis of an ERP experiment illustrates the different possibilities offered by Ragu. The aim of Ragu is to maximize statistical power while minimizing the need for a-priori choices of models and parameters (like inverse models or sensors of interest) that interact with and bias statistics.
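
    The assumption-free randomization statistics Ragu relies on can be illustrated, much simplified, by a within-subject permutation test on a single scalar measure per subject (Ragu itself works on full multichannel scalp field differences; the data values here are invented):

```python
import random

def randomization_test(cond_a, cond_b, n_perm=2000, seed=1):
    """Two-sided randomization test on the mean condition difference:
    condition labels are randomly swapped within each subject, and the
    p-value is the fraction of permutations whose absolute mean
    difference reaches the observed one."""
    rng = random.Random(seed)
    n = len(cond_a)
    observed = abs(sum(a - b for a, b in zip(cond_a, cond_b)) / n)
    count = 0
    for _ in range(n_perm):
        diff = 0.0
        for a, b in zip(cond_a, cond_b):
            if rng.random() < 0.5:   # swap labels within this subject
                a, b = b, a
            diff += a - b
        if abs(diff / n) >= observed:
            count += 1
    return count / n_perm

a = [2.1, 2.4, 2.0, 2.6, 2.3, 2.5, 2.2, 2.4]   # e.g. GFP, condition A
b = [1.6, 1.9, 1.7, 2.0, 1.8, 1.9, 1.6, 1.8]   # e.g. GFP, condition B
print(randomization_test(a, b))  # small p: the conditions differ
```

    No distributional assumptions enter; the null distribution is generated from the data themselves, which is what makes the approach "assumption-free".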

  2. Statistical mechanics of a time-homogeneous system of money and antimoney

    Science.gov (United States)

    Schmitt, Matthias; Schacker, Andreas; Braun, Dieter

    2014-03-01

    Financial crises appear throughout human history. While there are many schools of thought on what the actual causes of such crises are, it has been suggested that the creation of credit money might be a source of financial instability. We discuss how the credit mechanism in a system of fractional reserve banking leads to non-local transfers of purchasing power that also affect non-involved agents. To overcome this issue, we impose the local symmetry of time homogeneity on the monetary system. A bi-currency system of non-bank assets (money) and bank assets (antimoney) is considered. A payment is either made by passing on money or by receiving antimoney. As a result, a free floating exchange rate between non-bank assets and bank assets is established. Credit creation is replaced by the simultaneous transfer of money and antimoney at a negotiated exchange rate. This is in contrast to traditional discussions of full reserve banking, which stalls creditary lending. With money and antimoney, the problem of credit crunches is mitigated while a full time symmetry of the monetary system is maintained. As a test environment for such a monetary system, we discuss an economy of random transfers. Random transfers are a strong criterion to probe the stability of monetary systems. The analysis using statistical physics provides analytical solutions and confirms that a money-antimoney system could be functional. Equally important to the probing of the stability of such a monetary system is the question of how to implement the credit default dynamics. This issue remains open.
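
    The "economy of random transfers" used above as a test environment can be sketched in its simplest single-currency form (no antimoney, fixed transfer unit, non-negative balances; all parameters are illustrative, not the authors' specification):

```python
import random

def random_transfer_economy(n_agents=1000, steps=200000, m0=100.0, seed=2):
    """Toy conserved-money economy: repeated random pairwise transfers
    of one unit from a randomly chosen payer to a randomly chosen payee,
    with balances never allowed to go negative."""
    rng = random.Random(seed)
    money = [m0] * n_agents
    for _ in range(steps):
        i = rng.randrange(n_agents)   # payer
        j = rng.randrange(n_agents)   # payee
        if i != j and money[i] >= 1.0:
            money[i] -= 1.0
            money[j] += 1.0
    return money

money = random_transfer_economy()
mean = sum(money) / len(money)
print(mean)  # total money is conserved, so the mean stays at m0 = 100.0
```

    Random transfers of this kind are the "strong criterion" the abstract mentions: despite conservation of the total, the balances spread out, which is what a statistical-physics analysis of such monetary systems studies.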

  3. Statistical mechanics of a time-homogeneous system of money and antimoney

    International Nuclear Information System (INIS)

    Schmitt, Matthias; Schacker, Andreas; Braun, Dieter

    2014-01-01

    Financial crises appear throughout human history. While there are many schools of thought on what the actual causes of such crises are, it has been suggested that the creation of credit money might be a source of financial instability. We discuss how the credit mechanism in a system of fractional reserve banking leads to non-local transfers of purchasing power that also affect non-involved agents. To overcome this issue, we impose the local symmetry of time homogeneity on the monetary system. A bi-currency system of non-bank assets (money) and bank assets (antimoney) is considered. A payment is either made by passing on money or by receiving antimoney. As a result, a free floating exchange rate between non-bank assets and bank assets is established. Credit creation is replaced by the simultaneous transfer of money and antimoney at a negotiated exchange rate. This is in contrast to traditional discussions of full reserve banking, which stalls creditary lending. With money and antimoney, the problem of credit crunches is mitigated while a full time symmetry of the monetary system is maintained. As a test environment for such a monetary system, we discuss an economy of random transfers. Random transfers are a strong criterion to probe the stability of monetary systems. The analysis using statistical physics provides analytical solutions and confirms that a money–antimoney system could be functional. Equally important to the probing of the stability of such a monetary system is the question of how to implement the credit default dynamics. This issue remains open

  4. Introduction to the basic concepts of modern physics special relativity, quantum and statistical physics

    CERN Document Server

    Becchi, Carlo Maria

    2016-01-01

    This is the third edition of a well-received textbook on modern physics theory. This book provides an elementary but rigorous and self-contained presentation of the simplest theoretical framework that will meet the needs of undergraduate students. In addition, a number of examples of relevant applications and an appropriate list of solved problems are provided.Apart from a substantial extension of the proposed problems, the new edition provides more detailed discussion on Lorentz transformations and their group properties, a deeper treatment of quantum mechanics in a central potential, and a closer comparison of statistical mechanics in classical and in quantum physics. The first part of the book is devoted to special relativity, with a particular focus on space-time relativity and relativistic kinematics. The second part deals with Schrödinger's formulation of quantum mechanics. The presentation concerns mainly one-dimensional problems, but some three-dimensional examples are discussed in detail. The third...

  5. Visual statistical learning is related to natural language ability in adults: An ERP study.

    Science.gov (United States)

    Daltrozzo, Jerome; Emerson, Samantha N; Deocampo, Joanne; Singh, Sonia; Freggens, Marjorie; Branum-Martin, Lee; Conway, Christopher M

    2017-03-01

    Statistical learning (SL) is believed to enable language acquisition by allowing individuals to learn regularities within linguistic input. However, neural evidence supporting a direct relationship between SL and language ability is scarce. We investigated whether there are associations between event-related potential (ERP) correlates of SL and language abilities while controlling for the general level of selective attention. Seventeen adults completed tests of visual SL, receptive vocabulary, grammatical ability, and sentence completion. Response times and ERPs showed that SL is related to receptive vocabulary and grammatical ability. ERPs indicated that the relationship between SL and grammatical ability was independent of attention while the association between SL and receptive vocabulary depended on attention. The implications of these dissociative relationships in terms of underlying mechanisms of SL and language are discussed. These results further elucidate the cognitive nature of the links between SL mechanisms and language abilities. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. Toward Global Comparability of Sexual Orientation Data in Official Statistics: A Conceptual Framework of Sexual Orientation for Health Data Collection in New Zealand's Official Statistics System

    Science.gov (United States)

    Gray, Alistair; Veale, Jaimie F.; Binson, Diane; Sell, Randell L.

    2013-01-01

    Objective. Effectively addressing health disparities experienced by sexual minority populations requires high-quality official data on sexual orientation. We developed a conceptual framework of sexual orientation to improve the quality of sexual orientation data in New Zealand's Official Statistics System. Methods. We reviewed conceptual and methodological literature, culminating in a draft framework. To improve the framework, we held focus groups and key-informant interviews with sexual minority stakeholders and producers and consumers of official statistics. An advisory board of experts provided additional guidance. Results. The framework proposes working definitions of the sexual orientation topic and measurement concepts, describes dimensions of the measurement concepts, discusses variables framing the measurement concepts, and outlines conceptual grey areas. Conclusion. The framework proposes standard definitions and concepts for the collection of official sexual orientation data in New Zealand. It presents a model for producers of official statistics in other countries, who wish to improve the quality of health data on their citizens. PMID:23840231

  7. Statistical evaluation of failures and repairs of the V-1 measuring and control system

    International Nuclear Information System (INIS)

    Laurinec, R.; Korec, J.; Mitosinka, J.; Zarnovican, V.

    1984-01-01

    A failure record card system was introduced for evaluating the reliability of the measurement and control equipment of the V-1 nuclear power plant. The SPU-800 microcomputer system is used for recording data on magnetic tape and their transmission to the central data processing department. The data are used for evaluating the reliability of components and circuits and a selection is made of the most failure-prone components, and the causes of failures are evaluated as are failure identification, repair and causes of outages. The system provides monthly, annual and total assessment data since the system was commissioned. The results of the statistical evaluation of failures are used for planning preventive maintenance and for determining optimal repair intervals. (E.S.)

  8. Environmental accounting and statistics

    International Nuclear Information System (INIS)

    Bartelmus, P.L.P.

    1992-01-01

    The objective of sustainable development is to integrate environmental concerns with mainstream socio-economic policies. Integrated policies need to be supported by integrated data. Environmental accounting achieves this integration by incorporating environmental costs and benefits into conventional national accounts. Modified accounting aggregates can thus be used in defining and measuring environmentally sound and sustainable economic growth. Further development objectives need to be assessed by more comprehensive, though necessarily less integrative, systems of environmental statistics and indicators. Integrative frameworks for the different statistical systems in the fields of economy, environment and population would facilitate the provision of comparable data for the analysis of integrated development. (author). 19 refs, 2 figs, 2 tabs

  9. The statistics of multi-step direct reactions

    International Nuclear Information System (INIS)

    Koning, A.J.; Akkermans, J.M.

    1991-01-01

    We propose a quantum-statistical framework that provides an integrated perspective on the differences and similarities between the many current models for multi-step direct reactions in the continuum. It is argued that to obtain a statistical theory two physically different approaches are conceivable to postulate randomness, respectively called leading-particle statistics and residual-system statistics. We present a new leading-particle statistics theory for multi-step direct reactions. It is shown that the model of Feshbach et al. can be derived as a simplification of this theory and thus can be founded solely upon leading-particle statistics. The models developed by Tamura et al. and Nishioka et al. are based upon residual-system statistics and hence fall into a physically different class of multi-step direct theories, although the resulting cross-section formulae for the important first step are shown to be the same. The widely used semi-classical models such as the generalized exciton model can be interpreted as further phenomenological simplifications of the leading-particle statistics theory. A more comprehensive exposition will appear before long. (author). 32 refs, 4 figs

  10. Statistical projection effects in a hydrodynamic pilot-wave system

    Science.gov (United States)

    Sáenz, Pedro J.; Cristea-Platon, Tudor; Bush, John W. M.

    2018-03-01

    Millimetric liquid droplets can walk across the surface of a vibrating fluid bath, self-propelled through a resonant interaction with their own guiding or `pilot' wave fields. These walking droplets, or `walkers', exhibit several features previously thought to be peculiar to the microscopic, quantum realm. In particular, walkers confined to circular corrals manifest a wave-like statistical behaviour reminiscent of that of electrons in quantum corrals. Here we demonstrate that localized topological inhomogeneities in an elliptical corral may lead to resonant projection effects in the walker's statistics similar to those reported in quantum corrals. Specifically, we show that a submerged circular well may drive the walker to excite specific eigenmodes in the bath that result in drastic changes in the particle's statistical behaviour. The well tends to attract the walker, leading to a local peak in the walker's position histogram. By placing the well at one of the foci, a mode with maxima near the foci is preferentially excited, leading to a projection effect in the walker's position histogram towards the empty focus, an effect strongly reminiscent of the quantum mirage. Finally, we demonstrate that the mean pilot-wave field has the same form as the histogram describing the walker's statistics.

  11. Improvement of statistical methods for detecting anomalies in climate and environmental monitoring systems

    Science.gov (United States)

    Yakunin, A. G.; Hussein, H. M.

    2018-01-01

    The article shows how known statistical methods, widely used in solving financial problems and in a number of other fields of science and technology, can, after minor modification, be effectively applied in climate and environmental monitoring systems to problems such as the detection of anomalies in the form of abrupt changes in signal levels, the occurrence of positive and negative outliers, and the violation of the cycle form in periodic processes.
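
    One simple detector of the kind referred to, a one-sided CUSUM test for an abrupt upward level shift, can be sketched as follows (the threshold, drift, and synthetic signal are illustrative assumptions, not the article's parameters):

```python
def cusum_shift_detector(signal, threshold, drift=0.0):
    """One-sided CUSUM detector for upward level shifts. The baseline is
    estimated from the first 20 samples; the cumulative sum of excesses
    over baseline (minus an allowed drift) triggers an alarm when it
    crosses the threshold. Returns the first alarm index, or None."""
    mean0 = sum(signal[:20]) / 20
    s = 0.0
    for k, x in enumerate(signal):
        s = max(0.0, s + (x - mean0) - drift)
        if s > threshold:
            return k
    return None

# Synthetic signal: the level jumps from about 0 to about 3 at sample 50.
sig = ([0.1 * ((-1) ** k) for k in range(50)]
       + [3.0 + 0.1 * ((-1) ** k) for k in range(50)])
print(cusum_shift_detector(sig, threshold=5.0, drift=0.5))  # alarm shortly after 50
```

    The drift term makes the statistic insensitive to small fluctuations around the baseline, so only a sustained level change accumulates toward an alarm.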

  12. Radiation counting statistics

    Energy Technology Data Exchange (ETDEWEB)

    Suh, M. Y.; Jee, K. Y.; Park, K. K.; Park, Y. J.; Kim, W. H

    1999-08-01

    This report is intended to describe the statistical methods necessary to design and conduct radiation counting experiments and evaluate the data from the experiment. The methods are described for the evaluation of the stability of a counting system and the estimation of the precision of counting data by application of probability distribution models. The methods for the determination of the uncertainty of the results calculated from the number of counts, as well as various statistical methods for the reduction of counting error are also described. (Author). 11 refs., 8 tabs., 8 figs.
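
    The standard Poisson counting-statistics propagation such reports describe, here the uncertainty of a background-corrected count rate, can be sketched as follows (the count numbers and times are illustrative):

```python
import math

def net_rate_and_uncertainty(gross_counts, t_gross, bkg_counts, t_bkg):
    """Net count rate and its 1-sigma uncertainty from Poisson statistics:
    rate = G/t_g - B/t_b, sigma = sqrt(G/t_g**2 + B/t_b**2), since the
    variance of a Poisson count equals the count itself."""
    rate = gross_counts / t_gross - bkg_counts / t_bkg
    sigma = math.sqrt(gross_counts / t_gross**2 + bkg_counts / t_bkg**2)
    return rate, sigma

# 3600 gross counts and 900 background counts, each in 600 s.
rate, sigma = net_rate_and_uncertainty(3600, 600.0, 900, 600.0)
print(rate, sigma)  # 4.5 cps net, ~0.11 cps uncertainty
```

    Longer counting times reduce the statistical uncertainty (it scales as one over the square root of the counts), which is the basic lever for counting-error reduction discussed in such reports.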

  13. Radiation counting statistics

    Energy Technology Data Exchange (ETDEWEB)

    Suh, M. Y.; Jee, K. Y.; Park, K. K. [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-08-01

    This report is intended to describe the statistical methods necessary to design and conduct radiation counting experiments and evaluate the data from the experiments. The methods are described for the evaluation of the stability of a counting system and the estimation of the precision of counting data by application of probability distribution models. The methods for the determination of the uncertainty of the results calculated from the number of counts, as well as various statistical methods for the reduction of counting error are also described. 11 refs., 6 figs., 8 tabs. (Author)

  14. Radiation counting statistics

    International Nuclear Information System (INIS)

    Suh, M. Y.; Jee, K. Y.; Park, K. K.; Park, Y. J.; Kim, W. H.

    1999-08-01

    This report is intended to describe the statistical methods necessary to design and conduct radiation counting experiments and evaluate the data from the experiment. The methods are described for the evaluation of the stability of a counting system and the estimation of the precision of counting data by application of probability distribution models. The methods for the determination of the uncertainty of the results calculated from the number of counts, as well as various statistical methods for the reduction of counting error are also described. (Author). 11 refs., 8 tabs., 8 figs

  15. Relative mass resolution technique for optimum design of a gamma nondestructive assay system

    International Nuclear Information System (INIS)

    Koh, Duck Joon

    1995-02-01

    Nondestructive assay (NDA) is a widely used nuclear technology for quantitative elemental and isotopic assay. Nondestructive assay is performed by detecting an identifying radiation emerging from the sample, which can be unambiguously related to the element or isotope of interest. In every assay we can identify two distinct factors that lead to measurement uncertainty. We refer to these as statistical and spatial uncertainties. If the spatial distributions of the analyte and the matrix material in the sample are known and fairly constant from sample to sample, then the major source of measurement uncertainty is the statistical uncertainty resulting from randomness in the counting process. The spatial uncertainty is independent of the measurement time and therefore sets a lower limit to the measurement uncertainty, which is inherent in the assay system in conjunction with the population of samples to be measured. The only way to minimize the spatial uncertainty is an optimized design of the assay system. Therefore we have to decide on the type and number of detectors to be used, their deployment around the sample, the type of radiation to be measured, the duration of each measurement, and the size and shape of the sample drum. The design procedure leading to the optimal assay system should be based on a quantitative (RMR: Relative Mass Resolution) comparison of the performance of each proposed design. For the NDA system design for low-level radwaste, a special-purpose Monte Carlo code has been developed to simulate point-source responses for sources within an assayed radwaste drum and to analyze the effect of gammas scattered from higher-energy gammas on the spectrum of a low-energy gamma ray. A well-known Monte Carlo code such as MCNP could be used for the simulation of NDA in the case of low-level radwaste, but MCNP is a multi-purpose Monte Carlo transport code for several geometries which requires large memory and long CPU time. For some cases in nuclear

  16. A New Approach for the Statistical Thermodynamic Theory of the Nonextensive Systems Confined in Different Finite Traps

    Science.gov (United States)

    Tang, Hui-Yi; Wang, Jian-Hui; Ma, Yong-Li

    2014-06-01

    For a small system at a low temperature, thermal fluctuation and quantum effects play important roles in quantum thermodynamics. Starting from the micro-canonical ensemble, we generalize the Boltzmann-Gibbs statistical factor from infinite to finite systems, whether or not the interactions between particles are considered. This generalized factor, similar to Tsallis's q-form as a power-law distribution, has the restriction of a finite energy spectrum and includes the nonextensivities of small systems. We derive the exact expression for the distribution of average particle numbers in interacting classical and quantum nonextensive systems within a generalized canonical ensemble. This expression in almost independent or elementary-excitation quantum finite systems is similar to the corresponding ones obtained from the conventional grand-canonical ensemble. In the reconstruction of the statistical theory of small systems, we present the entropy of equilibrium systems and the equation of total thermal energy. When we investigate the thermodynamics of interacting nonextensive systems, we obtain the system-bath heat exchange and "uncompensated heat", which are at the thermodynamical level and independent of the details of the system-bath coupling. For ideal finite systems, with different traps and boundary conditions, we calculate some thermodynamic quantities, such as the specific heat, entropy, and equation of state. Particularly at low temperatures for small systems, we predict some novel behaviors in quantum thermodynamics, including internal entropy production, heat exchanges between the system and its surroundings, and finite-size effects on the free energy.

  17. Statistical nuclear reactions

    International Nuclear Information System (INIS)

    Hilaire, S.

    2001-01-01

    A review of the statistical model of nuclear reactions is presented. The main relations are described, together with the ingredients necessary to perform practical calculations. In addition, a substantial overview of the width fluctuation correction factor is given. (author)

  18. Statistical mechanics rigorous results

    CERN Document Server

    Ruelle, David

    1999-01-01

    This classic book marks the beginning of an era of vigorous mathematical progress in equilibrium statistical mechanics. Its treatment of the infinite system limit has not been superseded, and the discussion of thermodynamic functions and states remains basic for more recent work. The conceptual foundation provided by the Rigorous Results remains invaluable for the study of the spectacular developments of statistical mechanics in the second half of the 20th century.

  19. Rigorous force field optimization principles based on statistical distance minimization

    Energy Technology Data Exchange (ETDEWEB)

    Vlcek, Lukas, E-mail: vlcekl1@ornl.gov [Chemical Sciences Division, Geochemistry & Interfacial Sciences Group, Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831-6110 (United States); Joint Institute for Computational Sciences, University of Tennessee, Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831-6173 (United States); Chialvo, Ariel A. [Chemical Sciences Division, Geochemistry & Interfacial Sciences Group, Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831-6110 (United States)

    2015-10-14

    We use the concept of statistical distance to define a measure of distinguishability between a pair of statistical mechanical systems, i.e., a model and its target, and show that its minimization leads to general convergence of the model’s static measurable properties to those of the target. We exploit this feature to define a rigorous basis for the development of accurate and robust effective molecular force fields that are inherently compatible with coarse-grained experimental data. The new model optimization principles and their efficient implementation are illustrated through selected examples, whose outcome demonstrates the higher robustness and predictive accuracy of the approach compared to other currently used methods, such as force matching and relative entropy minimization. We also discuss relations between the newly developed principles and established thermodynamic concepts, which include the Gibbs-Bogoliubov inequality and the thermodynamic length.
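
    As a simplified illustration of a distinguishability measure between two statistical states, the Bhattacharyya angle between discrete distributions is one standard realization of "statistical distance"; the paper's working definition may differ in detail:

```python
import math

def statistical_distance(p, q):
    """Bhattacharyya angle between two discrete probability distributions:
    arccos of the overlap sum_i sqrt(p_i * q_i). Zero for identical
    distributions, pi/2 for distributions with disjoint support."""
    overlap = sum(math.sqrt(pi * qi) for pi, qi in zip(p, q))
    return math.acos(min(1.0, overlap))

d_same = statistical_distance([0.5, 0.5], [0.5, 0.5])
d_near = statistical_distance([0.9, 0.1], [0.5, 0.5])
d_far = statistical_distance([1.0, 0.0], [0.0, 1.0])
```

    Minimizing such a distance between a model's and a target's state distributions drives the model's measurable statistics toward those of the target, which is the intuition behind the optimization principle described above.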

  20. Strong Statistical Convergence in Probabilistic Metric Spaces

    OpenAIRE

    Şençimen, Celaleddin; Pehlivan, Serpil

    2008-01-01

    In this article, we introduce the concepts of strongly statistically convergent sequence and strong statistically Cauchy sequence in a probabilistic metric (PM) space endowed with the strong topology, and establish some basic facts. Next, we define the strong statistical limit points and the strong statistical cluster points of a sequence in this space and investigate the relations between these concepts.

  1. Fundamental statistical features and self-similar properties of tagged networks

    International Nuclear Information System (INIS)

    Palla, Gergely; Farkas, Illes J; Pollner, Peter; Vicsek, Tamas; Derenyi, Imre

    2008-01-01

    We investigate the fundamental statistical features of tagged (or annotated) networks having a rich variety of attributes associated with their nodes. Tags (attributes, annotations, properties, features, etc.) provide essential information about the entity represented by a given node, so taking them into account represents a significant step towards a more complete description of the structure of large complex systems. Our main goal here is to uncover the relations between the statistical properties of the node tags and those of the graph topology. In order to better characterize networks with tagged nodes, we introduce a number of new notions, including tag-assortativity (relating link probability to node similarity), and new quantities, such as node uniqueness (measuring how rarely the tags of a node occur in the network) and the tag-assortativity exponent. We apply our approach to three large networks representing very different domains of complex systems. A number of the tag-related quantities display analogous behaviour (e.g. the networks we studied are tag-assortative, indicating possible universal aspects of tags versus topology), while some other features, such as the distribution of node uniqueness, show variability from network to network, allowing us to pinpoint large-scale features specific to individual real-world complex networks. We also find that for each network the topology and the tag distribution are scale invariant, and this self-similar property of the networks can be well characterized by the tag-assortativity exponent, which is specific to each system.

  2. Statistical physics of pairwise probability models

    Directory of Open Access Journals (Sweden)

    Yasser Roudi

    2009-11-01

    Full Text Available Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data: knowledge of the means and correlations between pairs of elements in the system is sufficient. Not surprisingly, then, using pairwise models for studying neural data has been the focus of many studies in recent years. In this paper, we describe how tools from statistical physics can be employed for studying and using pairwise models. We build on our previous work on the subject and study the relation between different methods for fitting these models and evaluating their quality. In particular, using data from simulated cortical networks we study how the quality of various approximate methods for inferring the parameters in a pairwise model depends on the time bin chosen for binning the data. We also study the effect of the size of the time bin on the model quality itself, again using simulated data. We show that using finer time bins increases the quality of the pairwise model. We offer new ways of deriving the expressions reported in our previous work for assessing the quality of pairwise models.
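
    A minimal sketch of the pairwise-model idea: fit biases and couplings so that an exponential-family model reproduces given means and pairwise correlations. The target statistics below are invented for illustration, and the tiny system is solved by exact enumeration rather than the approximate inference methods the paper studies:

```python
import itertools
import math

# hypothetical target statistics for 3 binary units
target_mean = [0.2, 0.5, 0.7]
target_corr = {(0, 1): 0.15, (0, 2): 0.10, (1, 2): 0.40}

states = list(itertools.product([0, 1], repeat=3))
h = [0.0] * 3                        # biases
J = dict.fromkeys(target_corr, 0.0)  # pairwise couplings

def model_stats(h, J):
    """Exact means/correlations of the pairwise model (small systems only)."""
    weights = []
    for s in states:
        e = sum(h[i] * s[i] for i in range(3))
        e += sum(J[i, j] * s[i] * s[j] for (i, j) in J)
        weights.append(math.exp(e))
    Z = sum(weights)
    p = [w / Z for w in weights]
    mean = [sum(p[k] * s[i] for k, s in enumerate(states)) for i in range(3)]
    corr = {(i, j): sum(p[k] * s[i] * s[j] for k, s in enumerate(states))
            for (i, j) in J}
    return mean, corr

# gradient ascent on the log-likelihood = iterative moment matching
for _ in range(6000):
    mean, corr = model_stats(h, J)
    for i in range(3):
        h[i] += 0.5 * (target_mean[i] - mean[i])
    for k in J:
        J[k] += 0.5 * (target_corr[k] - corr[k])

mean, corr = model_stats(h, J)
```

    For real neural data the partition function cannot be enumerated, which is exactly why the approximate fitting methods compared in the paper matter.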

  3. Statistical modeling for degradation data

    CERN Document Server

    Lio, Yuhlong; Ng, Hon; Tsai, Tzong-Ru

    2017-01-01

    This book focuses on the statistical aspects of the analysis of degradation data. In recent years, degradation data analysis has come to play an increasingly important role in different disciplines such as reliability, public health sciences, and finance. For example, information on products’ reliability can be obtained by analyzing degradation data. In addition, statistical modeling and inference techniques have been developed on the basis of different degradation measures. The book brings together experts engaged in statistical modeling and inference, presenting and discussing important recent advances in degradation data analysis and related applications. The topics covered are timely and have considerable potential to impact both statistics and reliability engineering.

  4. Cerebral blood flow and related factors in hyperthyroidism patients by SPECT imaging and statistical parametric mapping analysis

    International Nuclear Information System (INIS)

    Xiu Yan; Shi Hongcheng; Liu Wenguan; Chen Xuefen; Gu Yushen; Chen Shuguang; Yu Haojun; Yu Yiping

    2010-01-01

    Objective: To investigate the cerebral blood flow (CBF) perfusion patterns and related factors in hyperthyroidism patients. Methods: Twenty-five patients with hyperthyroidism and twenty-two healthy controls matched for age, sex and education were enrolled. 99Tcm-ethylene cysteinate dimer (ECD) SPECT CBF perfusion imaging was performed at rest. Statistical parametric mapping 5.0 software (SPM5) was used with a statistical threshold of P<0.05. Regional CBF (rCBF) was correlated with serum thyroid hormones (FT3, FT4), thyroid autoimmune antibodies: sensitive thyroid stimulating hormone (sTSH), thyroid peroxidase antibody (TPOAb) and TSH receptor antibody (TRAb) by Pearson analysis, and with disease duration by Spearman analysis. Results: rCBF was decreased significantly in the limbic system and frontal lobe, including parahippocampal gyrus, uncus (posterior entorhinal cortex, posterior parolfactory cortex, parahippocampal cortex, anterior cingulate, right inferior temporal gyrus), left hypothalamus and caudate nucleus (P<0.05). rCBF in several regions was negatively correlated with FT3 (r=-0.468, -0.417, both P<0.05) and with FT4 (r=-0.4M, -0.418, -0.415, -0.459, all P<0.05), while rCBF in other regions was positively correlated with FT4 (r=0.419, 0.412, both P<0.05). rCBF in left insula was negatively correlated with concentration of sTSH, and right auditory associated cortex was positively correlated with concentration of sTSH (r=-0.504, 0.429, both P<0.05). rCBF in left middle temporal gyrus and left angular gyrus was positively correlated with concentration of TRAb, while that in right thalamus, right hypothalamus, left anterior nucleus and left ventralis nucleus was negatively correlated with concentration of TRAb (r=0.750, 0.862, -0.691, -0.835, -0.713, -0.759, all P<0.05). rCBF in right anterior cingulate, right cuneus, right rectus gyrus and right superior marginal gyrus was positively correlated with concentration of TPOAb (r=0.696, 0.581, 0.779, 0.683, all P<0.05). rCBF in postcentral gyrus, temporal gyrus, left superior marginal gyrus and auditory associated cortex was positively correlated with disease duration (r=0.502, 0.457, 0.524, 0.440, all P<0.05). Conclusion: Hypoperfusions in

  5. Thermodynamics, Gibbs Method and Statistical Physics of Electron Gases Gibbs Method and Statistical Physics of Electron Gases

    CERN Document Server

    Askerov, Bahram M

    2010-01-01

    This book deals with theoretical thermodynamics and the statistical physics of electron and particle gases. While treating the laws of thermodynamics from both classical and quantum theoretical viewpoints, it posits that the basis of the statistical theory of macroscopic properties of a system is the microcanonical distribution of isolated systems, from which all canonical distributions stem. To calculate the free energy, the Gibbs method is applied to ideal and non-ideal gases, and also to a crystalline solid. Considerable attention is paid to the Fermi-Dirac and Bose-Einstein quantum statistics and its application to different quantum gases, and electron gas in both metals and semiconductors is considered in a nonequilibrium state. A separate chapter treats the statistical theory of thermodynamic properties of an electron gas in a quantizing magnetic field.

  6. A Statistical Programme Assignment Model

    DEFF Research Database (Denmark)

    Rosholm, Michael; Staghøj, Jonas; Svarer, Michael

    When treatment effects of active labour market programmes are heterogeneous in an observable way across the population, the allocation of the unemployed into different programmes becomes a particularly important issue. In this paper, we present a statistical model designed to improve the present...... duration of unemployment spells may result if a statistical programme assignment model is introduced. We discuss several issues regarding the implementation of such a system, especially the interplay between the statistical model and case workers....

  7. STATISTICAL INVESTIGATION OF THE GROUNDWATER SYSTEM IN DARB EL-ARBAEIN, SOUTHWESTERN DESERT, EGYPT

    Directory of Open Access Journals (Sweden)

    Kashouty Mohamed El

    2009-12-01

    Full Text Available In Darb El Arbaein, groundwater is the only water resource. The aquifer system runs from Paleozoic-Mesozoic to Upper Cretaceous sandstone rocks; these overlie the basement rocks and the aquifer is confined. In the present research, the performance of statistical analyses in classifying groundwater samples by their chemical characteristics has been tested. The hydrogeological and hydrogeochemical data of 92 groundwater samples were obtained from the GARPAD authority in northern, central, and southern Darb El Arbaein. A robust classification scheme for partitioning groundwater chemistry into homogeneous groups was an important tool for the characterization of the Nubian sandstone aquifer. We tested the performance of many available graphical and statistical methodologies used to classify water samples: R-mode, Q-mode, correlation analysis, and principal component analysis were investigated. All the methods were discussed and compared as to their ability to cluster, ease of use, and ease of interpretation. The correlation investigation clarifies the relationship among lithology, hydrogeology, and anthropogenic influences. Factor investigation revealed three factors, namely: the evaporation process and agricultural impact with lithogenic dissolution, the hydrogeological characteristics of the aquifer system, and the surface meteoric water that recharges the aquifer system. Two main clusters, subdivided into four sub-clusters, were identified in the groundwater system based on hydrogeological and hydrogeochemical data. They reflect the impact of geomedia, hydrogeology, geographic position, and agricultural wastewater. The groundwater is undersaturated with respect to most selected minerals. The groundwater was supersaturated with respect to iron minerals in northern and southern Darb El Arbaein. The partial pressure of CO2 of the groundwater versus the saturation index of calcite shows the gradual change in PCO2 from atmospheric to the present aquifer
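
    The R-mode correlation step can be sketched in a few lines. The ion concentrations below are fabricated solely to show the mechanics; they are not data from the Darb El Arbaein study:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

# hypothetical hydrochemical data for six wells (mg/L)
tds  = [1200, 980, 1500, 1340, 760, 1100]   # total dissolved solids
cl   = [310, 250, 420, 380, 180, 290]       # chloride
hco3 = [140, 190, 120, 130, 220, 160]       # bicarbonate

# R-mode analysis correlates variables across samples
r_tds_cl = pearson(tds, cl)     # evaporation/dissolution association
r_tds_hco3 = pearson(tds, hco3) # fresher recharge dilutes TDS
```

    A strongly positive TDS-chloride correlation alongside a negative TDS-bicarbonate correlation is the kind of pattern that the factor analysis then groups into evaporation/dissolution versus recharge factors.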

  8. Remarks about the thermodynamics of astrophysical systems in mutual interaction and related notions

    International Nuclear Information System (INIS)

    Velazquez, L

    2016-01-01

    Aspects concerning the thermodynamics of astrophysical systems are discussed, generally, and also more specifically those relating to astrophysical systems in mutual interaction (or the so-called open astrophysical systems). A special interest is devoted in this paper to clarifying several misconceptions that are still common in the recent literature, such as the direct application to the astrophysical scenario of notions and theoretical frameworks that were originally conceived to deal with extensive systems of everyday practice (large systems with short-range interactions). This discussion starts by reviewing the current understanding of the notion of negative heat capacity. Beyond this, to clarify its physical relevance, the conciliation of this notion with classical fluctuation theory is discussed, as well as equilibrium conditions concerning systems with negative heat capacities. These results prompt a revision of our understanding about critical phenomena, phase transitions and the so-called zeroth law of thermodynamics. Afterwards, general features about the thermodynamics of astrophysical systems are presented through the consideration of simple models available in the literature. Particular attention is devoted to the influence of evaporation on the macroscopic behavior of these systems. These antecedents are then applied to a critical approach towards the thermodynamics of astrophysical systems in mutual interaction. It is discussed that the long-range character of gravitation leads to the incidence of long-range correlations. This peculiarity imposes a series of important consequences, such as the non-separability of a single astrophysical structure into independent subsystems, the breakdown of additivity and conventional thermodynamic limit, a great sensibility of the macroscopic behavior to the external conditions, the restricted applicability of the so-called thermal contact in astrophysics, and hence, the non-relevance of conventional statistical

  9. Comparative Statistical Mechanics of Muscle and Non-Muscle Contractile Systems: Stationary States of Near-Equilibrium Systems in A Linear Regime

    Directory of Open Access Journals (Sweden)

    Yves Lecarpentier

    2017-10-01

    Full Text Available A. Huxley’s equations were used to determine the mechanical properties of muscle myosin II (MII at the molecular level, as well as the probability of the occurrence of the different stages in the actin–myosin cycle. It was then possible to use the formalism of statistical mechanics with the grand canonical ensemble to calculate numerous thermodynamic parameters such as entropy, internal energy, affinity, thermodynamic flow, thermodynamic force, and entropy production rate. This allows us to compare the thermodynamic parameters of a non-muscle contractile system, such as the normal human placenta, with those of different striated skeletal muscles (soleus and extensor digitalis longus as well as the heart muscle and smooth muscles (trachea and uterus in the rat. In the human placental tissues, it was observed that the kinetics of the actin–myosin crossbridges were considerably slow compared with those of smooth and striated muscular systems. The entropy production rate was also particularly low in the human placental tissues, as compared with that observed in smooth and striated muscular systems. This is partly due to the low thermodynamic flow found in the human placental tissues. However, the unitary force of non-muscle myosin (NMII generated by each crossbridge cycle in the myofibroblasts of the human placental tissues was similar in magnitude to that of MII in the myocytes of both smooth and striated muscle cells. Statistical mechanics represents a powerful tool for studying the thermodynamics of all contractile muscle and non-muscle systems.

  10. Leak detection and localization in a pipeline system by application of statistical analysis techniques

    International Nuclear Information System (INIS)

    Fukuda, Toshio; Mitsuoka, Toyokazu.

    1985-01-01

    The detection of leaks in a piping system is an important diagnostic technique for facilities, both to prevent accidents and to plan maintenance, since the occurrence of a leak lowers productivity and causes environmental damage. As the first step, it is necessary to detect the occurrence of a leak without delay; as the second step, if the location of the leak in the piping system can be estimated, accident countermeasures become easier. Detection by pressure is usually used for large leaks. Because the pressure-based method is simple and advantageous, this study examined extending the pressure gradient method to the detection of smaller leaks, using statistical analysis techniques, for a pipeline in steady operation. Since the flow in a pipe varies irregularly during pumping, statistical means are required to detect a small leak from pressure data. The index proposed in this paper for detecting a leak is the difference between the pressure gradients at the two ends of the pipeline. Experimental results on water and air in nylon tubes are reported. (Kako, I.)
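
    The proposed index, the difference between pressure gradients at the two ends of the line, can be sketched as a z-like statistic on simulated readings. The gradient values and noise levels here are invented for illustration:

```python
import random
from statistics import mean, stdev

random.seed(1)

def leak_index(inlet_grad, outlet_grad):
    """Difference of mean pressure gradients at the two pipeline ends,
    scaled by its standard error (a z-like detection statistic)."""
    diff = mean(inlet_grad) - mean(outlet_grad)
    se = (stdev(inlet_grad) ** 2 / len(inlet_grad)
          + stdev(outlet_grad) ** 2 / len(outlet_grad)) ** 0.5
    return diff / se

# no leak: both ends see the same mean gradient (hypothetical units)
quiet_in = [random.gauss(-0.50, 0.05) for _ in range(200)]
quiet_out = [random.gauss(-0.50, 0.05) for _ in range(200)]

# small leak mid-line: steeper gradient upstream, flatter downstream
leak_in = [random.gauss(-0.56, 0.05) for _ in range(200)]
leak_out = [random.gauss(-0.44, 0.05) for _ in range(200)]

z_quiet = leak_index(quiet_in, quiet_out)
z_leak = leak_index(leak_in, leak_out)
```

    Averaging over many readings suppresses the irregular pumping noise, so even a gradient difference much smaller than the instantaneous fluctuations becomes statistically detectable.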

  11. Statistical aspects of nuclear structure

    International Nuclear Information System (INIS)

    Parikh, J.C.

    1977-01-01

    The statistical properties of energy levels and a statistical approach to transition strengths are discussed in relation to nuclear structure studies at high excitation energies. It is shown that the calculations can be extended to the ground state domain also. The discussion is based on the study of random matrix theory of level density and level spacings, using the Gaussian Orthogonal Ensemble (GOE) concept. The short range and long range correlations are also studied statistically. The polynomial expansion method is used to obtain excitation strengths. (A.K.)

  12. Measurement and statistics for teachers

    CERN Document Server

    Van Blerkom, Malcolm

    2008-01-01

    Written in a student-friendly style, Measurement and Statistics for Teachers shows teachers how to use measurement and statistics wisely in their classes. Although there is some discussion of theory, emphasis is given to the practical, everyday uses of measurement and statistics. The second part of the text provides more complete coverage of basic descriptive statistics and their use in the classroom than in any text now available.Comprehensive and accessible, Measurement and Statistics for Teachers includes:Short vignettes showing concepts in action Numerous classroom examples Highlighted vocabulary Boxes summarizing related concepts End-of-chapter exercises and problems Six full chapters devoted to the essential topic of Classroom Tests Instruction on how to carry out informal assessments, performance assessments, and portfolio assessments, and how to use and interpret standardized tests A five-chapter section on Descriptive Statistics, giving instructors the option of more thoroughly teaching basic measur...

  13. Basics of statistical physics

    CERN Document Server

    Müller-Kirsten, Harald J W

    2013-01-01

    Statistics links microscopic and macroscopic phenomena, and requires for this reason a large number of microscopic elements like atoms. The results are values of maximum probability or of averaging. This introduction to statistical physics concentrates on the basic principles, and attempts to explain these in simple terms supplemented by numerous examples. These basic principles include the difference between classical and quantum statistics, a priori probabilities as related to degeneracies, the vital aspect of indistinguishability as compared with distinguishability in classical physics, the differences between conserved and non-conserved elements, the different ways of counting arrangements in the three statistics (Maxwell-Boltzmann, Fermi-Dirac, Bose-Einstein), the difference between maximization of the number of arrangements of elements, and averaging in the Darwin-Fowler method. Significant applications to solids, radiation and electrons in metals are treated in separate chapters, as well as Bose-Eins...

  14. Explanation of the methods employed in the statistical evaluation of SALE program data

    International Nuclear Information System (INIS)

    Bracey, J.T.; Soriano, M.

    1981-01-01

    The analysis of Safeguards Analytical Laboratory Evaluation (SALE) bimonthly data is described. Statistical procedures are discussed in Section A, followed by descriptions of tabular and graphic values in Section B. Calculation formulae for the various statistics in the reports are presented in Section C. SALE data reported to New Brunswick Laboratory (NBL) are entered into a computerized system through routine data processing procedures. Bimonthly and annual reports are generated from this data system. In the bimonthly data analysis, data from the six most recent reporting periods of each laboratory-material-analytical method combination are utilized. Analysis results in the bimonthly reports are only presented for those participants who have reported data at least once during the last 12-month period. Reported values are transformed to relative percent difference values calculated by [(reported value - reference value)/reference value] x 100, and the analysis is performed on these transformed values. Accordingly, the results given in the bimonthly report are (relative) percent differences (% DIFF). Suspect large variations are verified with individual participants to eliminate errors in the transcription process. Statistical extreme values are not excluded from the bimonthly analysis; all data are used
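
    The transformation above is straightforward to state in code; the masses below are made-up example values:

```python
def percent_diff(reported, reference):
    """SALE transformation: (reported - reference) / reference * 100."""
    return (reported - reference) / reference * 100.0

# e.g., a laboratory reports 4.95 g against a 5.00 g reference value,
# giving a -1.0% relative difference (% DIFF)
d = percent_diff(4.95, 5.00)
```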

  15. A statistical characterization method for damping material properties and its application to structural-acoustic system design

    International Nuclear Information System (INIS)

    Jung, Byung C.; Lee, Doo Ho; Youn, Byeng D.; Lee, Soo Bum

    2011-01-01

    The performance of surface damping treatments may vary once the surface is exposed to a wide range of temperatures, because the performance of viscoelastic damping material is highly dependent on operational temperature. In addition, experimental data for dynamic responses of viscoelastic material are inherently random, which makes it difficult to design a robust damping layout. In this paper a statistical modeling procedure with a statistical calibration method is suggested for the variability characterization of viscoelastic damping material in constrained-layer damping structures. First, the viscoelastic material property is decomposed into two sources: (I) a random complex modulus due to operational temperature variability, and (II) experimental/model errors in the complex modulus. Next, the variability in the damping material property is obtained using the statistical calibration method by solving an unconstrained optimization problem with a likelihood function metric. Two case studies are considered to show the influence of the material variability on the acoustic performances in the structural-acoustic systems. It is shown that the variability of the damping material is propagated to that of the acoustic performances in the systems. Finally, robust and reliable damping layout designs of the two case studies are obtained through the reliability-based design optimization (RBDO) amidst severe variability in operational temperature and the damping material
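
    The calibration step, estimating the unknown spread of the experimental error by optimizing a likelihood-function metric, can be sketched with a Gaussian error model and a crude grid search. The loss-factor numbers are synthetic, not measured damping data:

```python
import math
import random

random.seed(2)

# synthetic observations: a 'true' loss factor of 0.30 plus
# experimental error whose spread (0.02) is treated as unknown
observations = [0.30 + random.gauss(0.0, 0.02) for _ in range(500)]

def neg_log_likelihood(mu, sigma, data):
    """Gaussian negative log-likelihood, the metric to be minimized."""
    return sum(0.5 * math.log(2 * math.pi * sigma ** 2)
               + (x - mu) ** 2 / (2 * sigma ** 2) for x in data)

mu_hat = sum(observations) / len(observations)

# a transparent grid search stands in for the unconstrained optimizer
grid = [s / 1000.0 for s in range(5, 100)]
sigma_hat = min(grid, key=lambda s: neg_log_likelihood(mu_hat, s, observations))
```

    The calibrated spread then feeds directly into the reliability-based design optimization, where the damping layout must remain robust across that variability.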

  16. Nonmaterialized Relations and the Support of Information Retrieval Applications by Relational Database Systems.

    Science.gov (United States)

    Lynch, Clifford A.

    1991-01-01

    Describes several aspects of the problem of supporting information retrieval system query requirements in the relational database management system (RDBMS) environment and proposes an extension to query processing called nonmaterialized relations. User interactions with information retrieval systems are discussed, and nonmaterialized relations are…

  17. To what extent does variability of historical rainfall series influence extreme event statistics of sewer system surcharge and overflows?

    DEFF Research Database (Denmark)

    Schaarup-Jensen, Kjeld; Rasmussen, Michael R.; Thorndahl, Søren

    2008-01-01

    In urban drainage modeling long term extreme statistics has become an important basis for decision-making e.g. in connection with renovation projects. Therefore it is of great importance to minimize the uncertainties concerning long term prediction of maximum water levels and combined sewer...... overflow (CSO) in drainage systems. These uncertainties originate from large uncertainties regarding rainfall inputs, parameters, and assessment of return periods. This paper investigates how the choice of rainfall time series influences the extreme events statistics of max water levels in manholes and CSO...... gauges are located at a distance of max 20 kilometers from the catchment. All gauges are included in the Danish national rain gauge system which was launched in 1976. The paper describes to what extent the extreme events statistics based on these 9 series diverge from each other and how this diversity...

  18. To what extent does variability of historical rainfall series influence extreme event statistics of sewer system surcharge and overflows?

    DEFF Research Database (Denmark)

    Schaarup-Jensen, Kjeld; Rasmussen, Michael R.; Thorndahl, Søren

    2009-01-01

    In urban drainage modelling long term extreme statistics has become an important basis for decision-making e.g. in connection with renovation projects. Therefore it is of great importance to minimize the uncertainties concerning long term prediction of maximum water levels and combined sewer...... overflow (CSO) in drainage systems. These uncertainties originate from large uncertainties regarding rainfall inputs, parameters, and assessment of return periods. This paper investigates how the choice of rainfall time series influences the extreme events statistics of max water levels in manholes and CSO...... gauges are located at a distance of max 20 kilometers from the catchment. All gauges are included in the Danish national rain gauge system which was launched in 1976. The paper describes to what extent the extreme events statistics based on these 9 series diverge from each other and how this diversity...

  19. Proceedings of the 1980 DOE statistical symposium

    International Nuclear Information System (INIS)

    Truett, T.; Margolies, D.; Mensing, R.W.

    1981-04-01

    Separate abstracts were prepared for 8 of the 16 papers presented at the DOE Statistical Symposium in California in October 1980. The topics of those papers not included cover the relative detection efficiency on sets of irradiated fuel elements, estimating failure rates for pumps in nuclear reactors, estimating fragility functions, application of bounded-influence regression, the influence function method applied to energy time series data, reliability problems in power generation systems and uncertainty analysis associated with radioactive waste disposal. The other 8 papers have previously been added to the data base

  20. Phase transition for the system of finite volume in the ϕ4 theory in the Tsallis nonextensive statistics

    Science.gov (United States)

    Ishihara, Masamichi

    2018-04-01

    We studied the effects of nonextensivity on the phase transition for a system of finite volume V in the ϕ4 theory in the Tsallis nonextensive statistics of entropic parameter q and temperature T, when the deviation from the Boltzmann-Gibbs (BG) statistics, |q - 1|, is small. We calculated the condensate and the effective mass to the order q - 1 with the normalized q-expectation value under the free particle approximation with zero bare mass. The following facts were found. The condensate Φ divided by v, Φ/v, at q (v is the value of the condensate at T = 0) is smaller than that at q′ for q > q′ as a function of Tph/v, which is the physical temperature Tph divided by v. The physical temperature Tph is related to the variation of the Tsallis entropy and the variation of the internal energies, and Tph at q = 1 coincides with T. The effective mass decreases, reaches a minimum, and increases after that, as Tph increases. The effective mass at q > 1 is lighter than the effective mass at q = 1 at low physical temperature and heavier than the effective mass at q = 1 at high physical temperature. The effects of the nonextensivity on the physical quantities as functions of Tph become strong as |q - 1| increases. The results indicate the significance of the definition of the expectation value, the definition of the physical temperature, and the constraints on the density operator, when terms including the volume of the system are not negligible.

  1. A statistical-based approach for fault detection and diagnosis in a photovoltaic system

    KAUST Repository

    Garoudja, Elyes

    2017-07-10

    This paper reports the development of a statistical approach for fault detection and diagnosis in a PV system. Specifically, the overarching goal of this work is to detect and identify, at an early stage, faults on the DC side of a PV system (e.g., short-circuit faults, open-circuit faults, and partial shading faults). Towards this end, we apply an exponentially-weighted moving average (EWMA) control chart to the residuals obtained from the one-diode model. Such a choice is motivated by the greater sensitivity of the EWMA chart to incipient faults and its low computational cost, making it easy to implement in real time. Practical data from a 3.2 kWp photovoltaic plant located within an Algerian research center is used to validate the proposed approach. Results clearly show the efficiency of the developed method in monitoring PV system status.
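
    An EWMA chart on model residuals can be sketched as follows. The residual stream and fault size are simulated, and the control-limit convention (steady-state EWMA variance with L = 3) is a common textbook choice rather than the paper's exact tuning:

```python
import random

random.seed(3)

def ewma_alarms(residuals, lam=0.2, L=3.0, n_baseline=100):
    """Indices where the EWMA of the residuals exits its control limits.
    Limits use the steady-state EWMA std: sigma * sqrt(lam / (2 - lam))."""
    base = residuals[:n_baseline]          # fault-free training segment
    mu0 = sum(base) / len(base)
    sigma = (sum((x - mu0) ** 2 for x in base) / (len(base) - 1)) ** 0.5
    limit = L * sigma * (lam / (2.0 - lam)) ** 0.5
    z, alarms = mu0, []
    for k, r in enumerate(residuals):
        z = lam * r + (1.0 - lam) * z      # exponentially weighted average
        if abs(z - mu0) > limit:
            alarms.append(k)
    return alarms

# simulated one-diode-model residuals: in control, then a DC-side fault
res = [random.gauss(0.0, 1.0) for _ in range(200)]
res += [random.gauss(1.5, 1.0) for _ in range(50)]   # mean shift = fault
alarms = ewma_alarms(res)
```

    Because the EWMA accumulates evidence over successive residuals, it flags the sustained shift after sample 200 even though individual residuals stay within ordinary noise.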

  2. Toward Global Comparability of Sexual Orientation Data in Official Statistics: A Conceptual Framework of Sexual Orientation for Health Data Collection in New Zealand’s Official Statistics System

    Directory of Open Access Journals (Sweden)

    Frank Pega

    2013-01-01

    Full Text Available Objective. Effectively addressing health disparities experienced by sexual minority populations requires high-quality official data on sexual orientation. We developed a conceptual framework of sexual orientation to improve the quality of sexual orientation data in New Zealand’s Official Statistics System. Methods. We reviewed conceptual and methodological literature, culminating in a draft framework. To improve the framework, we held focus groups and key-informant interviews with sexual minority stakeholders and producers and consumers of official statistics. An advisory board of experts provided additional guidance. Results. The framework proposes working definitions of the sexual orientation topic and measurement concepts, describes dimensions of the measurement concepts, discusses variables framing the measurement concepts, and outlines conceptual grey areas. Conclusion. The framework proposes standard definitions and concepts for the collection of official sexual orientation data in New Zealand. It presents a model for producers of official statistics in other countries, who wish to improve the quality of health data on their citizens.

  3. The neurobiology of uncertainty: implications for statistical learning.

    Science.gov (United States)

    Hasson, Uri

    2017-01-05

    The capacity for assessing the degree of uncertainty in the environment relies on estimating statistics of temporally unfolding inputs. This, in turn, allows calibration of predictive and bottom-up processing, and signalling changes in temporally unfolding environmental features. In the last decade, several studies have examined how the brain codes for and responds to input uncertainty. Initial neurobiological experiments implicated frontoparietal and hippocampal systems, based largely on paradigms that manipulated distributional features of visual stimuli. However, later work in the auditory domain pointed to different systems, whose activation profiles have interesting implications for computational and neurobiological models of statistical learning (SL). This review begins by briefly recapping the historical development of ideas pertaining to the sensitivity to uncertainty in temporally unfolding inputs. It then discusses several issues at the interface of studies of uncertainty and SL. Following, it presents several current treatments of the neurobiology of uncertainty and reviews recent findings that point to principles that serve as important constraints on future neurobiological theories of uncertainty, and relatedly, SL. This review suggests it may be useful to establish closer links between neurobiological research on uncertainty and SL, considering particularly mechanisms sensitive to local and global structure in inputs, the degree of input uncertainty, the complexity of the system generating the input, learning mechanisms that operate on different temporal scales and the use of learnt information for online prediction. This article is part of the themed issue 'New frontiers for statistical learning in the cognitive sciences'. © 2016 The Author(s).
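
    "Estimating statistics of temporally unfolding inputs" can be made concrete with a toy estimator; the windowed Shannon entropy below is a generic sketch, not a model taken from the reviewed studies.

```python
from collections import Counter
import math

def window_entropy(seq, window=50):
    """Shannon entropy (bits) of symbol frequencies in a sliding window,
    a simple proxy for the input uncertainty a learner might track online."""
    out = []
    for i in range(window, len(seq) + 1):
        counts = Counter(seq[i - window:i])
        n = sum(counts.values())
        out.append(-sum(c / n * math.log2(c / n) for c in counts.values()))
    return out

# A fully predictable stream has zero entropy; a stream cycling uniformly
# over four symbols carries close to two bits per symbol.
low = window_entropy("a" * 100)
high = window_entropy("abcd" * 25)
```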

  4. Transportation statistics annual report, 2015

    Science.gov (United States)

    2016-01-01

    The Transportation Statistics Annual Report describes the Nation's transportation system, the system's performance, its contributions to the economy, and its effects on people and the environment. This 20th edition of the report is base...

  5. Transportation statistics annual report, 2013

    Science.gov (United States)

    2014-01-01

    The Transportation Statistics Annual Report describes the Nation's transportation system, the system's performance, its contributions to the economy, and its effects on people and the environment. This 18th edition of the report is base...

  6. A statistical analysis on failure-to open/close probability of pneumatic valve in sodium cooling systems

    International Nuclear Information System (INIS)

    Kurisaka, Kenichi

    1999-11-01

    The objective of this study is to develop fundamental data for examining the efficiency of preventive maintenance and surveillance tests from the standpoint of failure probability. In this study, a pneumatic valve in sodium cooling systems was selected as a major standby component. A statistical analysis was made of the trend of the valve failure-to-open/close (FTOC) probability depending on the number of demands ('n'), the time since installation ('t') and the standby time since the last open/close action ('T'). The analysis is based on the field data of operating and failure experiences stored in the Component Reliability Database and Statistical Analysis System for LMFBR's (CORDS). In the analysis, the FTOC probability ('P') was expressed as follows: P = 1 - exp{-C - En - F/n - λT - aT(t - T/2) - AT²/2}. The functional parameters 'C', 'E', 'F', 'λ', 'a' and 'A' were estimated with the maximum likelihood estimation method. As a result, the FTOC probability is well approximated by the failure probability derived from the failure rate under the assumption of a Poisson distribution only when the valve cycle (i.e. the open-close-open cycle) exceeds about 100 days. When the valve cycle is shorter than about 100 days, the FTOC probability can be adequately estimated with the parametric model proposed in this study. The results obtained from this study may make it possible to derive an adequate frequency of surveillance tests for a given target FTOC probability. (author)
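
    The quoted parametric model can be written directly as a function; the parameter values below are placeholders for illustration (the fitted CORDS estimates are not reproduced here), and n is assumed to be at least 1.

```python
import math

def ftoc_probability(n, t, T, C=0.0, E=0.0, F=0.0, lam=0.0, a=0.0, A=0.0):
    """FTOC probability from the parametric model quoted in the abstract:
    P = 1 - exp{-C - E*n - F/n - lam*T - a*T*(t - T/2) - A*T**2/2}."""
    expo = C + E * n + F / n + lam * T + a * T * (t - T / 2.0) + A * T * T / 2.0
    return 1.0 - math.exp(-expo)

# With only lam nonzero the model collapses to the Poisson-type form
# 1 - exp(-lam * T), the long-valve-cycle regime described above.
print(ftoc_probability(10, 200.0, 30.0, lam=1e-3))  # ~0.0296
```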

  7. The derivation and application of a risk related value for saving a statistical life

    International Nuclear Information System (INIS)

    Jackson, D.; Stone, D.; Butler, G.G.; Mcglynn, G.

    2004-01-01

    A risk-related value of spend for saving a statistical life (VSSSL) is proposed for cost-benefit studies across the power generation sector, and the nuclear industry in particular. An upper bound on the VSSSL is set based on the UK government standard of around £1M or, in particular circumstances, £2M, and the observation that excessive spend (probably of the order of more than £5M per statistical life) will actually cost lives. Above a risk of 10⁻³ a⁻¹ it is assumed that the VSSSL approaches its maximum sustainable value of around £2M, whereas below a risk of 10⁻⁹ a⁻¹ the value of further risk reduction approaches zero. At risks around 10⁻⁶ a⁻¹ it is proposed that an appropriate VSSSL lies in the range £0.25M to £1M. With respect to radiological protection, it is suggested that where collective doses are dominated by average individual doses of no more than a few μSv, the detriment arising from a man-Sv can be valued at about £15k to £60k. It is further suggested that for individual dose contributions below 0.01 μSv (representing a risk equivalent of less than 10⁻⁹) a low residual VSSSL should be applied in cost-benefit analyses based on collective dose exposures. (author)
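
    The quoted man-Sv valuation range can be checked with one line of arithmetic. The fatal risk coefficient of ~6% per Sv used below is an assumed nominal figure of the order of ICRP-style coefficients, not a value taken from the paper.

```python
# Worked check: VSSSL times an assumed fatal risk coefficient per sievert
# reproduces the quoted £15k-£60k per man-Sv range.
RISK_PER_SV = 0.06   # assumed probability of a statistical death per Sv

for vsssl in (0.25e6, 1.0e6):   # proposed VSSSL range at ~1e-6 /a risk, in GBP
    value_per_man_sv = vsssl * RISK_PER_SV
    print(f"VSSSL {vsssl:,.0f} -> {value_per_man_sv:,.0f} per man-Sv")
```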

  8. Bayesian analysis of systems with random chemical composition: renormalization-group approach to Dirichlet distributions and the statistical theory of dilution.

    Science.gov (United States)

    Vlad, Marcel Ovidiu; Tsuchiya, Masa; Oefner, Peter; Ross, John

    2002-01-01

    We investigate the statistical properties of systems with random chemical composition and try to obtain a theoretical derivation of the self-similar Dirichlet distribution, which is used empirically in molecular biology, environmental chemistry, and geochemistry. We consider a system made up of many chemical species and assume that the statistical distribution of the abundance of each chemical species in the system is the result of a succession of a variable number of random dilution events, which can be described by using the renormalization-group theory. A Bayesian approach is used for evaluating the probability density of the chemical composition of the system in terms of the probability densities of the abundances of the different chemical species. We show that for large cascades of dilution events, the probability density of the composition vector of the system is given by a self-similar probability density of the Dirichlet type. We also give an alternative formal derivation for the Dirichlet law based on the maximum entropy approach, by assuming that the average values of the chemical potentials of different species, expressed in terms of molar fractions, are constant. Although the maximum entropy approach leads formally to the Dirichlet distribution, it does not clarify the physical origin of the Dirichlet statistics and has serious limitations. The random theory of dilution provides a physical picture for the emergence of Dirichlet statistics and makes it possible to investigate its validity range. We discuss the implications of our theory in molecular biology, geochemistry, and environmental science.
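
    The defining property of the Dirichlet composition law — non-negative species abundances that sum to one — is easy to demonstrate by sampling; the concentration parameters below are illustrative, not fitted to any dataset from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

# A composition vector drawn from a Dirichlet distribution: the abundances
# of k "chemical species" are non-negative and sum to one, the self-similar
# form the paper derives from cascades of random dilution events.
alpha = np.array([2.0, 5.0, 3.0])          # illustrative concentration parameters
samples = rng.dirichlet(alpha, size=10_000)

print(samples.sum(axis=1)[:3])             # each composition sums to 1
print(samples.mean(axis=0))                # ~ alpha / alpha.sum()
```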

  9. Statistical mechanics of few-particle systems: exact results for two useful models

    Science.gov (United States)

    Miranda, Enrique N.

    2017-11-01

    The statistical mechanics of small clusters (n ~ 10-50 elements) of harmonic oscillators and two-level systems is studied exactly, following the microcanonical, canonical and grand canonical formalisms. For clusters with several hundred particles, the results from the three formalisms coincide with those found in the thermodynamic limit. However, for clusters formed by a few tens of elements, the three ensembles yield different results. For a cluster with a few tens of harmonic oscillators, when the heat capacity per oscillator is evaluated within the canonical formalism, it reaches a limit value equal to k_B, as in the thermodynamic case, while within the microcanonical formalism the limit value is k_B(1 - 1/n). This difference could be measured experimentally. For a cluster with a few tens of two-level systems, the heat capacity evaluated within the canonical and microcanonical ensembles also presents differences that could be detected experimentally. Both the microcanonical and grand canonical formalism show that the entropy is non-additive for systems this small, while the canonical ensemble reaches the opposite conclusion. These results suggest that the microcanonical ensemble is the most appropriate for dealing with systems with tens of particles.
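
    The canonical side of the comparison is a textbook computation and can be sketched directly; the contrast drawn below with the microcanonical limit k_B(1 - 1/n) simply restates the abstract's claim, it is not rederived here.

```python
import math

def canonical_c_per_oscillator(x):
    """Canonical heat capacity per harmonic oscillator in units of k_B,
    with x = hbar*omega / (k_B * T); note it is independent of cluster size n."""
    ex = math.exp(x)
    return x * x * ex / (ex - 1.0) ** 2

# High-temperature (x -> 0) limit is 1 (i.e. k_B), as in the thermodynamic
# case; the microcanonical result quoted above instead approaches
# k_B * (1 - 1/n), a measurable difference for n of a few tens.
for n in (10, 30, 50):
    print(n, canonical_c_per_oscillator(0.01), 1 - 1 / n)
```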

  10. Willingness to share research data is related to the strength of the evidence and the quality of reporting of statistical results.

    Directory of Open Access Journals (Sweden)

    Jelte M Wicherts

    Full Text Available BACKGROUND: The widespread reluctance to share published research data is often hypothesized to be due to the authors' fear that reanalysis may expose errors in their work or may produce conclusions that contradict their own. However, these hypotheses have not previously been studied systematically. METHODS AND FINDINGS: We related the reluctance to share research data for reanalysis to 1148 statistically significant results reported in 49 papers published in two major psychology journals. We found the reluctance to share data to be associated with weaker evidence (against the null hypothesis of no effect and a higher prevalence of apparent errors in the reporting of statistical results. The unwillingness to share data was particularly clear when reporting errors had a bearing on statistical significance. CONCLUSIONS: Our findings on the basis of psychological papers suggest that statistical results are particularly hard to verify when reanalysis is more likely to lead to contrasting conclusions. This highlights the importance of establishing mandatory data archiving policies.
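
    The kind of association tested here — sharing status against reporting-error status — reduces to a contingency-table analysis; the counts below are invented for illustration and are not the paper's data.

```python
from math import sqrt

def two_prop_z(err_shared, n_shared, err_withheld, n_withheld):
    """Two-proportion z statistic for comparing the apparent-error rate in
    papers whose data were shared vs. withheld (illustrative counts only)."""
    p1, p2 = err_shared / n_shared, err_withheld / n_withheld
    p = (err_shared + err_withheld) / (n_shared + n_withheld)
    se = sqrt(p * (1 - p) * (1 / n_shared + 1 / n_withheld))
    return (p2 - p1) / se

# hypothetical counts: error rate 10% among sharers, 25% among non-sharers
z = two_prop_z(err_shared=5, n_shared=50, err_withheld=15, n_withheld=60)
print(round(z, 2))   # positive z: more apparent errors when data are withheld
```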

  11. Statistically interacting quasiparticles in Ising chains

    International Nuclear Information System (INIS)

    Lu Ping; Vanasse, Jared; Piecuch, Christopher; Karbach, Michael; Mueller, Gerhard

    2008-01-01

    The exclusion statistics of two complementary sets of quasiparticles, generated from opposite ends of the spectrum, are identified for Ising chains with spin s = 1/2, 1. In the s = 1/2 case the two sets are antiferromagnetic domain walls (solitons) and ferromagnetic domains (strings). In the s = 1 case they are soliton pairs and nested strings, respectively. The Ising model is equivalent to a system of two species of solitons for s = 1/2 and to a system of six species of soliton pairs for s = 1. Solitons exist on single bonds but soliton pairs may be spread across many bonds. The thermodynamics of a system of domains spanning up to M lattice sites is amenable to exact analysis and shown to become equivalent, in the limit M → ∞, to the thermodynamics of the s = 1/2 Ising chain. A relation is presented between the solitons in the Ising limit and the spinons in the XX limit of the s = 1/2 XXZ chain
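
    The chain thermodynamics that the soliton/string description reproduces can be cross-checked with the standard transfer-matrix solution of the s = 1/2 chain; the sketch below uses the generic textbook method, not the paper's quasiparticle formalism, and verifies it against brute-force enumeration for a short periodic chain.

```python
import itertools
import math

import numpy as np

def ising_Z_transfer(N, J, h, beta):
    """Partition function of the periodic s=1/2 Ising chain
    (H = -J sum s_i s_{i+1} - h sum s_i) via the 2x2 transfer matrix:
    Z = Tr T^N."""
    T = np.array([[math.exp(beta * (J + h)), math.exp(-beta * J)],
                  [math.exp(-beta * J), math.exp(beta * (J - h))]])
    return np.trace(np.linalg.matrix_power(T, N))

def ising_Z_brute(N, J, h, beta):
    """Direct sum over all 2^N spin configurations of the periodic chain."""
    Z = 0.0
    for spins in itertools.product((-1, 1), repeat=N):
        E = -J * sum(spins[i] * spins[(i + 1) % N] for i in range(N))
        E -= h * sum(spins)
        Z += math.exp(-beta * E)
    return Z

N, J, h, beta = 8, 1.0, 0.3, 0.5
Z_tm = ising_Z_transfer(N, J, h, beta)
Z_bf = ising_Z_brute(N, J, h, beta)
```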

  12. Statistical laws in urban mobility from microscopic GPS data in the area of Florence

    International Nuclear Information System (INIS)

    Bazzani, Armando; Giorgini, Bruno; Rambaldi, Sandro; Gallotti, Riccardo; Giovannini, Luca

    2010-01-01

    The application of Statistical Physics to social systems is mainly related to the search for macroscopic laws that can be derived from experimental data averaged in time or space, assuming the system is in a steady state. One of the major goals would be to find a connection between the statistical laws and the microscopic properties: for example, to understand the nature of the microscopic interactions or to point out the existence of interaction networks. Probability theory suggests the existence of a few classes of stationary distributions in the thermodynamic limit, so that the question is whether a statistical physics approach is able to capture the complex nature of social systems. We have analyzed a large GPS database for single-vehicle mobility in the Florence urban area, obtaining statistical laws for path lengths, for activity downtimes and for activity degrees. We also show that simple generic assumptions on the microscopic behavior can explain the existence of stationary macroscopic laws, with a universal function describing the distribution. Our conclusion is that understanding the system's complexity requires a dynamical database for the microscopic evolution, which resolves both small space and time scales in order to study the transients.

  13. Statistical limitations in functional neuroimaging. I. Non-inferential methods and statistical models.

    Science.gov (United States)

    Petersson, K M; Nichols, T E; Poline, J B; Holmes, A P

    1999-01-01

    Functional neuroimaging (FNI) provides experimental access to the intact living brain making it possible to study higher cognitive functions in humans. In this review and in a companion paper in this issue, we discuss some common methods used to analyse FNI data. The emphasis in both papers is on assumptions and limitations of the methods reviewed. There are several methods available to analyse FNI data, indicating that none is optimal for all purposes. In order to make optimal use of the methods available it is important to know the limits of applicability. For the interpretation of FNI results it is also important to take into account the assumptions, approximations and inherent limitations of the methods used. This paper gives a brief overview of some non-inferential descriptive methods and common statistical models used in FNI. Issues relating to the complex problem of model selection are discussed. In general, proper model selection is a necessary prerequisite for the validity of the subsequent statistical inference. The non-inferential section describes methods that, combined with inspection of parameter estimates and other simple measures, can aid in the process of model selection and verification of assumptions. The section on statistical models covers approaches to global normalization and some aspects of univariate, multivariate, and Bayesian models. Finally, approaches to functional connectivity and effective connectivity are discussed. In the companion paper we review issues related to signal detection and statistical inference. PMID:10466149

  14. Determination of dominant biogeochemical processes in a contaminated aquifer-wetland system using multivariate statistical analysis

    Science.gov (United States)

    Baez-Cazull, S. E.; McGuire, J.T.; Cozzarelli, I.M.; Voytek, M.A.

    2008-01-01

    Determining the processes governing aqueous biogeochemistry in a wetland hydrologically linked to an underlying contaminated aquifer is challenging due to the complex exchange between the systems and their distinct responses to changes in precipitation, recharge, and biological activities. To evaluate temporal and spatial processes in the wetland-aquifer system, water samples were collected using cm-scale multichambered passive diffusion samplers (peepers) to span the wetland-aquifer interface over a period of 3 yr. Samples were analyzed for major cations and anions, methane, and a suite of organic acids resulting in a large dataset of over 8000 points, which was evaluated using multivariate statistics. Principal component analysis (PCA) was chosen with the purpose of exploring the sources of variation in the dataset to expose related variables and provide insight into the biogeochemical processes that control the water chemistry of the system. Factor scores computed from PCA were mapped by date and depth. Patterns observed suggest that (i) fermentation is the process controlling the greatest variability in the dataset and it peaks in May; (ii) iron and sulfate reduction were the dominant terminal electron-accepting processes in the system and were associated with fermentation but had more complex seasonal variability than fermentation; (iii) methanogenesis was also important and associated with bacterial utilization of minerals as a source of electron acceptors (e.g., barite BaSO4); and (iv) seasonal hydrological patterns (wet and dry periods) control the availability of electron acceptors through the reoxidation of reduced iron-sulfur species enhancing iron and sulfate reduction. Copyright © 2008 by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America. All rights reserved.
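
    The PCA step — standardize, decompose, and compute per-sample factor scores like those the study maps by date and depth — can be sketched in a few lines; the toy matrix below stands in for the peeper dataset and is not the study's data.

```python
import numpy as np

def pca_scores(X, n_components=2):
    """PCA via SVD of the standardized data matrix; returns component
    loadings, per-sample factor scores, and explained-variance fractions."""
    Xs = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
    loadings = Vt[:n_components].T                 # variables x components
    scores = Xs @ loadings                         # samples x components
    explained = (S ** 2 / (len(X) - 1)) / Xs.var(axis=0, ddof=1).sum()
    return loadings, scores, explained[:n_components]

rng = np.random.default_rng(1)
# toy stand-in for the dataset: 200 samples x 6 solute concentrations
X = rng.normal(size=(200, 6))
X[:, 1] = 0.8 * X[:, 0] + 0.2 * rng.normal(size=200)   # one correlated pair
loadings, scores, explained = pca_scores(X)
```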

  15. Beyond δ : Tailoring marked statistics to reveal modified gravity

    Science.gov (United States)

    Valogiannis, Georgios; Bean, Rachel

    2018-01-01

    Models that seek to explain cosmic acceleration through modifications to general relativity (GR) evade stringent Solar System constraints through a restoring, screening mechanism. Down-weighting the high-density, screened regions in favor of the low-density, unscreened ones offers the potential to enhance the amount of information carried in such modified gravity models. In this work, we assess the performance of a new "marked" transformation and perform a systematic comparison with the clipping and logarithmic transformations, in the context of ΛCDM and the symmetron and f(R) modified gravity models. Performance is measured in terms of the fractional boost in the Fisher information and the signal-to-noise ratio (SNR) for these models relative to the statistics derived from the standard density distribution. We find that all three statistics provide improved Fisher boosts over the basic density statistics. The model parameters for the marked and clipped transformation that best enhance signals and the Fisher boosts are determined. We also show that the mark is useful both as a Fourier and real-space transformation; a marked correlation function also enhances the SNR relative to the standard correlation function, and can on mildly nonlinear scales show a significant difference between the ΛCDM and the modified gravity models. Our results demonstrate how a series of simple analytical transformations could dramatically increase the predicted information extracted on deviations from GR, from large-scale surveys, and give the prospect for a much more feasible potential detection.
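
    The down-weighting idea can be illustrated with a common functional form for a density mark; the parameter values below are illustrative, not those tuned in the paper.

```python
import numpy as np

def marked_field(delta, delta_s=0.6, p=1.0):
    """Up-weight low-density (unscreened) regions with the mark
    m(delta) = ((1 + delta_s) / (1 + delta_s + delta))**p,
    a common functional form for marked density statistics."""
    return ((1.0 + delta_s) / (1.0 + delta_s + np.asarray(delta))) ** p

# screened, high-density regions get small marks; unscreened voids large ones
print(marked_field([5.0, 0.0, -0.8]))
```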

  16. Statistical yearbook 2002-2004. Data available as of February 2005. 49 ed

    International Nuclear Information System (INIS)

    2005-09-01

    This is the forty-ninth issue of the United Nations Statistical Yearbook, prepared by the Statistics Division, Department of Economic and Social Affairs of the United Nations Secretariat. The data included generally cover the years between 1993 and 2003 and are, for the most part, those statistics which were available to the Statistics Division as of February 2005. The 81 tables of the Yearbook are based on data compiled by the Statistics Division from over 35 international and national sources. These sources include the United Nations Statistics Division in the fields of national accounts, industry, energy, transport and international trade, the United Nations Statistics Division and Population Division in the field of demographic statistics, and over 20 offices of the United Nations system and international organizations in other specialized fields. The Yearbook is organized in four parts. The first part, World and Region Summary, presents key world and regional aggregates and totals. In the other three parts, the subject matter is generally presented by countries or areas, with world and regional aggregates shown in some cases only. Parts two, three and four cover, respectively, population and social topics, national economic activity, and international economic relations. Each chapter ends with brief technical notes on statistical sources and methods for the tables it includes. References to sources and related methodological publications are provided at the end of the Yearbook in the section 'Statistical sources and references'. Annex I provides complete information on country and area nomenclature, and regional and other groupings used in the Yearbook. Annex II lists conversion coefficients and factors used in various tables. A list of tables added to or omitted from the last issue of the Yearbook is given in annex III. Symbols and conventions used in the Yearbook are shown in the section 'Explanatory notes', preceding the Introduction.

  17. Transportation Statistics Annual Report, 2017

    Science.gov (United States)

    2018-01-01

    The Transportation Statistics Annual Report describes the Nation's transportation system, the system's performance, its contributions to the economy, and its effects on people and the environment. This 22nd edition of the report is based on infor...

  18. A Statistical Methodology for Determination of Safety Systems Actuation Setpoints Based on Extreme Value Statistics

    Directory of Open Access Journals (Sweden)

    D. R. Novog

    2008-01-01

    Full Text Available This paper provides a novel and robust methodology for the determination of nuclear reactor trip setpoints which accounts for uncertainties in input parameters and models, as well as for the variations in operating states that periodically occur. Further, it demonstrates that, in performing best-estimate and uncertainty calculations, it is critical to consider the impact of all fuel channels and instrumentation in the integration of these uncertainties into setpoint determination. This methodology is based on the concept of a true trip setpoint, which is the reactor setpoint that would be required in an ideal situation where all key inputs and plant responses were known, such that during the accident sequence a reactor shutdown will occur which just prevents the acceptance criteria from being exceeded. Since this true value cannot be established, the uncertainties in plant simulations and plant measurements, as well as operational variations which lead to changes in time of the true value of initial conditions, must be considered. This paper presents the general concept used to determine the actuation setpoints considering the uncertainties and changes in initial conditions, and allowing for safety-system instrumentation redundancy. The results demonstrate unique statistical behavior with respect to both fuel and instrumentation uncertainties, which has not previously been investigated.
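
    A heavily simplified sketch of the underlying idea: propagate uncertainties through many simulated accident runs, each yielding a "true" setpoint, then place the actuation setpoint at a conservative quantile of that distribution. The distribution and coverage level below are invented for illustration, not the paper's methodology in detail.

```python
import numpy as np

rng = np.random.default_rng(7)

def conservative_setpoint(true_setpoints, coverage=0.95):
    """Choose the actuation setpoint as a one-sided quantile of the Monte
    Carlo distribution of 'true' setpoints; here the trip must occur at or
    below the true value, so a low quantile is conservative."""
    return np.quantile(true_setpoints, 1.0 - coverage)

# stand-in for uncertain plant simulations: each run yields the setpoint
# that would just prevent the acceptance criterion from being exceeded
samples = rng.normal(loc=100.0, scale=3.0, size=5_000)
sp = conservative_setpoint(samples)
```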

  19. A fuzzy expert system based on relations

    International Nuclear Information System (INIS)

    Hall, L.O.; Kandel, A.

    1986-01-01

    The Fuzzy Expert System (FESS) is an expert system which makes use of the theory of fuzzy relations to perform inference. Relations are very general and can be used for any application; only different types of relations need be implemented and used. The incorporation of fuzzy reasoning techniques enables the expert system to deal with imprecision in a well-founded manner. The knowledge is represented in relational frames. FESS may operate in either a forward-chaining or backward-chaining manner. It uses primarily implication and factual relations. A unique methodology for the combination of evidence has been developed. It makes use of a blackboard for communication between the various knowledge sources, which may operate in parallel. The expert system has been designed in such a manner that it may be used for diverse applications
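
    Inference over fuzzy implication relations is classically done by max-min composition; the sketch below shows that generic mechanism with an invented toy relation, and is not FESS's actual rule base or combination-of-evidence scheme.

```python
import numpy as np

def max_min_compose(A, R):
    """Fuzzy inference by max-min composition: given a fuzzy fact A over the
    antecedent universe and an implication relation R, the conclusion is
    B(y) = max_x min(A(x), R(x, y))."""
    A = np.asarray(A, dtype=float)
    R = np.asarray(R, dtype=float)
    return np.max(np.minimum(A[:, None], R), axis=0)

# toy implication relation between 3 symptoms and 2 hypotheses (illustrative)
R = np.array([[0.9, 0.1],
              [0.4, 0.7],
              [0.2, 0.8]])
A = np.array([0.8, 0.5, 0.3])      # observed degrees of the three symptoms
B = max_min_compose(A, R)          # membership degrees of each hypothesis
```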

  20. Statistical properties of highly excited quantum eigenstates of a strongly chaotic system

    International Nuclear Information System (INIS)

    Aurich, R.; Steiner, F.

    1992-06-01

    Statistical properties of highly excited quantal eigenstates are studied for the free motion (geodesic flow) on a compact surface of constant negative curvature (hyperbolic octagon) which represents a strongly chaotic system (K-system). The eigenstates are expanded in a circular-wave basis, and it turns out that the expansion coefficients behave as Gaussian pseudo-random numbers. It is shown that this property leads to a Gaussian amplitude distribution P(ψ) in the semiclassical limit, i.e. the wavefunctions behave as Gaussian random functions. This behaviour, which should hold for chaotic systems in general, is nicely confirmed for eigenstates lying 10000 states above the ground state, thus probing the semiclassical limit. In addition, the autocorrelation function and the path-correlation function are calculated and compared with a crude semiclassical Bessel-function approximation. Agreement with the semiclassical prediction is only found if a local averaging is performed over roughly 1000 de Broglie wavelengths. On smaller scales, the eigenstates show much more structure than predicted by the first semiclassical approximation. (orig.)
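
    The link between Gaussian expansion coefficients and a Gaussian amplitude distribution can be demonstrated with a toy random-wave model; this is a generic illustration, not the octagon computation itself.

```python
import numpy as np

rng = np.random.default_rng(3)

# Superposition of many circular-wave components with Gaussian pseudo-random
# coefficients and phases, sampled at many angles: by the central limit
# theorem the amplitude distribution P(psi) approaches a Gaussian.
M = 400                                    # number of basis components
coeffs = rng.normal(size=M) / np.sqrt(M)   # Gaussian expansion coefficients
phases = rng.uniform(0.0, 2.0 * np.pi, size=M)
theta = rng.uniform(0.0, 2.0 * np.pi, size=5_000)
psi = (coeffs[:, None] * np.cos(np.outer(np.arange(1, M + 1), theta)
                                + phases[:, None])).sum(axis=0)

# near-zero excess kurtosis is a simple signature of Gaussian amplitudes
excess_kurtosis = np.mean(psi ** 4) / np.mean(psi ** 2) ** 2 - 3.0
```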